Cameras
An Intro To Cameras
In Unity, Cameras are used to display the game world to the player. You will always have at least one Camera in the scene, but it is possible to have more than one. Using multiple Cameras, you can pull off some pretty cool effects, like multiplayer split screen, render to texture, etc. You can animate Cameras, control them with physics, or bind them to animated rigs and watch them take off. Practically anything you can imagine is possible with Cameras, and you can use typical or unique Cameras to fit your game's style.
Anatomy of a Camera
The Camera Component found on a Camera GameObject has some remarkably powerful properties. Over the next set of slides, we'll take a look at some of these properties and how you can use them to your advantage.
Clear Flags
Each Camera stores color and depth information when it renders its view. The portions of the screen that are not drawn in are empty, and will display the skybox by default. When you are using multiple Cameras, each one stores its own color and depth information in buffers, accumulating more data as each Camera renders. As any particular Camera in your scene renders its view, you can set the Clear Flags to clear different collections of the buffer information. This is done by choosing one of four options:
- Skybox: This is the default setting. Any empty portions of the screen will display the current Camera's skybox. If the current Camera has no skybox set, it will default to the skybox chosen in the Render Settings (found in Edit->Render Settings). It will then fall back to the Background Color.
- Solid Color: Any empty portions of the screen will display the current Camera's Background Color.
- Depth only: Keeps the color buffer from previously rendered Cameras and clears only the depth buffer, so this Camera's view draws on top of the others.
- Don't Clear: Neither the color nor the depth buffer is cleared; each frame is drawn over the last.
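The same Clear Flags settings can also be applied from a script rather than the Inspector. A minimal sketch (the class name and color choice are illustrative, not from the slides):

```csharp
using UnityEngine;

public class CameraClearSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        // Fill empty screen areas with a flat color instead of the skybox.
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
    }
}
```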
Clip Planes
The Near and Far Clip Plane properties determine where the Camera's view begins and ends. The planes are laid out perpendicular to the Camera's direction and are measured from its position. The Near plane is the closest location that will be rendered, and the Far plane is the furthest. The clipping planes also determine how depth buffer precision is distributed over the scene. In general, to get better precision you should move the Near plane as far away from the Camera as possible.
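The clip planes can also be adjusted from code; a short sketch (the exact distances here are arbitrary example values):

```csharp
using UnityEngine;

public class ClipPlaneSetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.5f;  // moving Near outward improves depth precision
        cam.farClipPlane = 500f;   // geometry beyond 500 units is clipped away
    }
}
```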
Culling Mask
The Culling Mask is used for selectively rendering groups of objects using Layers. Commonly, it is good practice to put your User Interface on a separate layer, then render it with a second Camera whose Culling Mask is set to the UI layer only. In order for the UI to display on top of the other Camera views, you'll also need to set the Clear Flags to Depth only and make sure that the UI Camera's Depth is higher than the other Cameras'.
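The UI-camera setup described above could be scripted roughly like this sketch (it assumes a user-defined layer named "UI" exists in the project):

```csharp
using UnityEngine;

public class UICameraSetup : MonoBehaviour
{
    void Start()
    {
        Camera uiCam = GetComponent<Camera>();

        uiCam.cullingMask = 1 << LayerMask.NameToLayer("UI"); // render only the UI layer
        uiCam.clearFlags = CameraClearFlags.Depth;            // keep the other cameras' color output
        uiCam.depth = 1f;                                     // draw after lower-depth cameras
    }
}
```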
Orthographic Camera
Marking a Camera as Orthographic removes all perspective from the Camera's view. This is mostly useful for making isometric or 2D games.
Render Texture
This feature is only available for Unity Pro licenses (not applicable to any version of Unity iPhone). It will place the camera's view onto a Texture that can then be applied to another object. This makes it easy to create sports arena video monitors, surveillance cameras, reflections, etc.
DAY 2
Lights
Lights
Lights are an essential part of every scene. While meshes and textures define the shape and look of a scene, lights define the color and mood of your 3D environment. You'll likely work with more than one light in each scene. Making them work together requires a little practice, but the results can be quite amazing.
Adding Lights
Lights can be added to your scene from the GameObject > Create Other menu option, or attached to an existing GameObject using Component > Rendering > Light. There are many different options within the Light Component in the Inspector. By simply changing the color of a light, you can give a whole different mood to the scene.
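Lights can also be created and tinted from a script; a minimal sketch (the object name and color values are illustrative, not from the slides):

```csharp
using UnityEngine;

public class MoodLight : MonoBehaviour
{
    void Start()
    {
        GameObject lightObj = new GameObject("Mood Light");
        Light moodLight = lightObj.AddComponent<Light>();

        moodLight.type = LightType.Point;
        moodLight.color = new Color(0.9f, 0.5f, 0.2f); // a warm tint changes the scene's mood
        moodLight.range = 10f;
        moodLight.intensity = 1.5f;
    }
}
```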
Adding Lights
This is the same scene with different lighting effects
Types of Lights
There are three different types of lights in Unity:
- Point Lights shine from a location equally in all directions, like a light bulb.
- Directional Lights are placed infinitely far away and affect everything in the scene, much like the sun.
- Spot Lights shine from a point in a direction and only illuminate objects within a cone, like the headlights of a car.
Lights can also cast Shadows. Shadows are a Pro-only feature. Shadow properties can be adjusted on a per-light basis.
Light Properties
Lights have some pretty unique Properties, including: Type, Color, Attenuate, Intensity, Range, Spot Angle, Shadows, Resolution, Strength, Projection, Constant, Bias, Object Size, Cookie, Draw Halo, Flare, Render Mode, Auto, Force Pixel, Force Vertex, and Culling Mask. Needless to say, we won't cover all of these.
Directional Lights
Rendering Lights
Lights have a big impact on render speed, so a tradeoff has to be made between lighting quality and game speed. Since pixel lights are much more expensive than vertex lights, Unity will only render the brightest lights at per-pixel quality. The maximum number of pixel lights can be set in the Quality Settings. You can explicitly control whether a light should be rendered as a vertex or pixel light using the Render Mode property. By default, Unity classifies the light automatically based on how much the object is affected by it. The actual lights that are rendered as pixel lights are determined on an object-by-object basis.
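The Render Mode property mentioned above can also be set in code; a short sketch:

```csharp
using UnityEngine;

public class LightRenderModeSetup : MonoBehaviour
{
    void Start()
    {
        Light l = GetComponent<Light>();

        // Force this light to always render per-pixel (highest quality, most expensive).
        l.renderMode = LightRenderMode.ForcePixel;
        // LightRenderMode.ForceVertex is cheaper; LightRenderMode.Auto lets Unity decide.
    }
}
```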
Meshes
Mesh Types
Most of the objects in a game are going to be Meshes. Meshes are not built in Unity; they must be built in another 3D modeling program and exported to a Unity-compatible format. The accepted formats are: .fbx, .dae, .obj, .3ds, and .dxf. The most common Unity-compatible 3D art programs are: Maya, Cinema 4D, 3ds Max, Cheetah3D, Modo, Lightwave, and Blender. However, there are many more programs, including 3D art exporters, that allow for successful incorporation of mesh assets.
Mesh Animation
Animation Properties: Generation controls how animations are imported. It offers several options in a drop-down:
- Don't Import: does not import animations.
- Store in Original Roots: maintains placement of animations in the root object of the animation package.
- Store in Nodes: stores animations in the objects they animate.
- Store in Root: stores the animation in the scene's transform root object. This should be used when animating things that have a hierarchy.
Audio
Audio Sources
An Audio Source takes a clip and plays it from the source location in the world. This is very useful for utilizing 3D sound. Audio Sources will not do anything without an Audio Clip; the source works as the controller for the clip, allowing stop and playback. To create an Audio Source, first create a GameObject by going to GameObject > Create Empty. With the new object selected, go to Component > Audio > Audio Source. Then assign a Clip.
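The setup steps above can be sketched in script form (the clip is assumed to be assigned in the Inspector; the class name is illustrative):

```csharp
using UnityEngine;

public class SoundEmitter : MonoBehaviour
{
    public AudioClip clip; // assign an imported Audio Clip in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = clip;
        source.Play(); // the sound plays from this GameObject's position in 3D space
    }
}
```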
Audio Clips
Audio Clips are assets used by an Audio Source. Supported formats are: .aif, .wav, .mp3, and .ogg, in both mono and stereo. Properties:
- Format: Compressed or Native. Native has a larger file size and higher quality, and is usually used for short clips.
- 3D Sound: plays the clip in 3D space.
- Force to Mono: plays a stereo clip as a single channel.
- Decompress on Load: if enabled, the clip is decompressed on scene load; otherwise it is decompressed in real time.
Audio Listener
The Audio Listener acts as a microphone. It receives input from the Audio Sources around it in the scene and plays sounds through the speakers. By default it is attached to the Main Camera. The Audio Listener has no properties; it simply must be in the scene in order for audio clips to work. Audio Listeners allow for creating an aural experience in the game. When an Audio Listener is attached to a GameObject, any Sources nearby will be picked up. Sources that are in a mono format will automatically be played through the stereo field from the appropriate direction. In order for the audio to work properly in the scene, there should be only one Audio Listener.
Particle Systems
Particle Systems
Particle Systems in Unity are used to make clouds of smoke, steam, fire, and other atmospheric effects. Particle Systems work by using one or two Textures and drawing them many times, creating a chaotic effect.
Min Emitter Range
The Min Emitter Range determines the depth within the ellipsoid that particles can be spawned. Setting it to 0 will allow particles to spawn anywhere from the center core of the ellipsoid to the outer-most range. Setting it to 1 will restrict spawn locations to the outer-most range of the ellipsoid.
Interpolate Triangles
Enabling your emitter to Interpolate Triangles will allow particles to be spawned between the mesh's vertices. Interpolate Triangles is off by default, so particles will only be spawned at the vertices.
Particle Animators
Particle Animators move your particles over time; you use them to apply wind, drag, and color cycling to your particle systems. Particle Animators allow your particle systems to be dynamic: they let you change the color of your particles, apply forces and rotation, and choose to destroy particles when they are finished emitting.
Animating Color
If you would like your particles to change colors or fade in/out, enable Animate Color and specify the colors for the cycle. Any particle system that animates color will cycle through the 5 colors. The speed at which they cycle is determined by the Emitter's Energy value. If you want your particles to fade in rather than instantly appear, set your first or last color to have a low Alpha value.
Particle Colliders
The World Particle Collider is used to collide particles against other Colliders in the scene. To create a Particle System with Particle Collider: 1.Create a Particle System using GameObject->Create Other->Particle System 2.Add the Particle Collider using Component->Particles->World Particle Collider
Particle Renderer
The Particle Renderer renders the Particle System on screen. Particle Renderers are required for any Particle Systems to be displayed on the screen. Using this Component, you can greatly change the visual appearance of the particles you are generating.
Animated Textures
Particle Systems can be rendered with an animated tile texture. To use this feature, make the texture out of a grid of images. As the particles go through their life cycle, they will cycle through the images. This is good for adding more life to your particles, or making small rotating debris pieces.
Physics
Rigid Bodies
Unity has the next-generation Ageia PhysX physics engine built in. To put an object under physics control, simply add a Rigidbody to it. GameObjects with the Rigidbody component will be affected by gravity, and can collide with other objects in the world. You use Rigidbodies for anything that needs to simulate physics. Rigidbodies are most often used in combination with primitive colliders.
Kinematic Rigidbodies
Kinematic Rigidbodies are not affected by forces, gravity, or collisions. They are driven explicitly by setting the position and rotation of the transform, or by animating them. Kinematic Rigidbodies can still interact with other, non-Kinematic Rigidbodies. Kinematic Rigidbodies are used for three purposes:
1. Sometimes you want an object to be under physics control, but in another situation to be controlled explicitly from a script or animation.
2. Kinematic Rigidbodies play better with other Rigidbodies. For example, if you have an animated platform and you want to place some Rigidbody boxes on top, you should make the platform a Kinematic Rigidbody instead of just a Collider without a Rigidbody.
3. You might want to have a Kinematic Rigidbody that is animated and have a real Rigidbody follow it using one of the available Joints.
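Purpose 1 above, switching an object between explicit control and physics control, might look like this sketch (the method names are illustrative):

```csharp
using UnityEngine;

public class KinematicSwitch : MonoBehaviour
{
    // While kinematic, the object is moved by animation or script, not by forces.
    public void EnterAnimatedMode()
    {
        GetComponent<Rigidbody>().isKinematic = true;
    }

    // Hand the object back to the physics engine (gravity, collisions, forces).
    public void EnterPhysicsMode()
    {
        GetComponent<Rigidbody>().isKinematic = false;
    }
}
```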
Static Colliders
Static Colliders are used for level geometry which does not move around much. You add a Mesh Collider to your already existing graphical meshes. There are two reasons why you might want to make a moving Collider into a Kinematic Rigidbody instead of leaving it a Static Collider:
1. Kinematic Rigidbodies wake up other Rigidbodies when they collide with them.
2. Kinematic Rigidbodies apply friction to Rigidbodies placed on top of them.
Character Controllers
You use a Character Controller if you want to make a humanoid character. These Controllers don't follow the rules of physics, since that would not feel right (in Doom you run 90 miles per hour, come to a halt in one frame, and turn on a dime). Instead, a Character Controller performs collision detection to make sure your characters can slide along walls, walk up and down stairs, etc. Character Controllers are not affected by forces, but they can push Rigidbodies by applying forces to them from a script. Character Controllers are inherently unphysical, so if you want to apply real physics to your character (swing on ropes, get pushed by big rocks), you have to use a Rigidbody; this will let you use joints and forces on your character. But be aware that tuning a Rigidbody to feel right for a character is hard, due to the unphysical way in which game characters are expected to behave.
Constant Force
Constant Force is a quick utility for adding constant forces to a Rigidbody. Constant Force works great for one-shot objects like rockets, if you don't want them to start with a large velocity but instead accelerate. To make a rocket that accelerates forward, set the Relative Force to be along the positive z-axis. Then use the Rigidbody's Drag property to keep it from exceeding some maximum velocity (the higher the drag, the lower the maximum velocity will be). In the Rigidbody, also make sure to turn off gravity so that the rocket will always stay on its path.
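The rocket recipe above could be set up from code as a sketch (the thrust and drag values are arbitrary examples):

```csharp
using UnityEngine;

public class RocketSetup : MonoBehaviour
{
    void Start()
    {
        Rigidbody rb = GetComponent<Rigidbody>();
        ConstantForce thrust = gameObject.AddComponent<ConstantForce>();

        thrust.relativeForce = Vector3.forward * 10f; // accelerate along the rocket's own +z axis
        rb.drag = 2f;          // higher drag means a lower maximum velocity
        rb.useGravity = false; // keep the rocket on its path
    }
}
```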
Sphere Colliders
The Sphere Collider is a basic sphere-shaped collision primitive. The Sphere Collider can be resized to a uniform scale, but not along individual axes. It works great for falling boulders, ping pong balls, marbles, etc.
Properties of a Collider
Colliders work with Rigidbodies to bring physics in Unity to life. Whereas Rigidbodies allow objects to be controlled by physics, Colliders allow objects to collide with each other. Colliders must be added to objects independently of Rigidbodies.
Triggers
An alternative way of using Colliders is to mark them as a Trigger. Triggers are effectively ignored by the physics engine, and have a unique set of messages that are sent out when a collision with a Trigger occurs. Triggers are useful for triggering other events in your game, like cutscenes, automatic door opening, displaying tutorial messages, etc.
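A typical trigger script looks something like this sketch (the "Player" tag check and the log message are illustrative assumptions):

```csharp
using UnityEngine;

public class DoorTrigger : MonoBehaviour
{
    // Called by Unity when another Collider enters this Collider,
    // provided this Collider is marked as a Trigger.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            Debug.Log("Player entered: open the door or start the cutscene here");
        }
    }
}
```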
Box Colliders
The Box Collider is a basic cube-shaped collision primitive. The Box Collider can be resized into different shapes of rectangular prisms. It works great for doors, walls, platforms, etc.
Mesh Colliders
The Mesh Collider takes a Mesh Asset and builds its Collider based on that mesh. It is far more accurate for collision detection than using primitives for complicated meshes. Collision meshes use backface culling: if an object collides with a mesh face that will be backface-culled graphically, it will also not collide with it physically. There are some limitations when using the Mesh Collider:
- Usually, two Mesh Colliders cannot collide with each other.
- All Mesh Colliders can collide with any primitive Collider.
- If your mesh is marked as Convex, then it can collide with other Mesh Colliders.
Physics Materials
The Physic Material is used to adjust friction and bouncing effects of colliding objects. To create a Physic Material select Assets->Create->Physic Material from the menu bar. Then drag the Physic Material from the Project View onto a Collider in the scene.
Friction is the quantity which prevents surfaces from sliding off each other. This value is critical when trying to stack objects. There are two forms of friction:
- Static Friction is used when the object is lying still; it prevents the object from starting to move. If a large enough force is applied, the object will start moving, and Dynamic Friction comes into play.
- Dynamic Friction attempts to slow down the object while it is in contact with another.
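A Physic Material can also be built in code; a sketch of a low-friction "ice" material (all values are illustrative):

```csharp
using UnityEngine;

public class IceSurface : MonoBehaviour
{
    void Start()
    {
        PhysicMaterial ice = new PhysicMaterial("Ice");
        ice.staticFriction = 0.05f;  // barely resists starting to slide
        ice.dynamicFriction = 0.02f; // barely slows an object already sliding
        ice.bounciness = 0f;

        GetComponent<Collider>().material = ice;
    }
}
```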
Hinges
The Hinge Joint groups together two Rigidbodies, constraining them to move as if they are connected by a hinge.
- A single Hinge Joint should be applied to a GameObject. The hinge will rotate at the point specified by the Anchor property, moving around the specified Axis property.
- You do not need to assign a GameObject to the joint's Connected Body property.
- You should only assign a GameObject to the Connected Body property if you want the joint's Transform to be dependent on the attached object's Transform.
Chains
Multiple Hinge Joints can also be strung together to create a chain. Add a joint to each link in the chain, and attach the next link as the Connected Body.
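The chain recipe above might be scripted like this sketch (the "links" array is an assumed, Inspector-assigned list of chain segments):

```csharp
using UnityEngine;

public class ChainBuilder : MonoBehaviour
{
    public Rigidbody[] links; // chain segments, ordered from anchor to tip

    void Start()
    {
        for (int i = 1; i < links.Length; i++)
        {
            HingeJoint joint = links[i].gameObject.AddComponent<HingeJoint>();
            joint.connectedBody = links[i - 1]; // attach each link to the previous one
            joint.axis = Vector3.right;         // hinge rotates around the x-axis
        }
    }
}
```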
Terrain Engine
Today We Covered:
Cameras, Lights, Particle Systems, Meshes, Audio, Physics, and the Terrain Engine
THANK YOU