Game Audio Flashcards
Producer
Has the final say on all matters concerning the game and answers only to company executives.
Audio Production
Responsible for sound effects, dialogue, environmental ambience, music, and the integration of sounds into the game. Handles field, dialogue, and Foley recording.
Level Design
Designs the game's core gameplay system, then the multiple stages or levels built on it. Ex. IQ, Crash Bandicoot, Super Mario Bros
Art Team
Concept artists do paintings/sketches of characters and the world. Modelers build objects, people, and buildings in 3D. Texture artists add colour and texture.
Quality Assurance (QA)
Testers who play through the game over and over looking for bugs/glitches: anything that might crash the game or take away from smooth gameplay, including audio-related problems.
Marketing
Department that decides when a game can be announced and how to present it to the public.
6 Production Phases
- Concept
- Design
- Alpha
- Beta
- QA
- Gold
Concept
Game in the idea phase, before any real production work begins. A clear vision for game type, storyline, visuals, characters, and gameplay is created, forming the basis of the Game Design Document.
Game Design Document
Essential blueprint for a video game that includes production schedules and technical guidelines for memory and processor use.
Design
Characters and game components/levels are created with software, following the game design document and concept art. The audio team is also working at this time, following the game design document.
Alpha
Game is somewhat playable and designers should be able to perform basic movements in-game. The process of integrating sounds into the game through computer programming (e.g. Wwise) happens in this stage.
Beta
Phase where the game really comes together; the designers now see how the game will actually look, play, and sound.
Quality Assurance PHASE
Where testers do most of their work; it is the last chance to find bugs that might crash the game. All sounds should be in the game, and the sound designer is just mixing, tweaking effects, and fixing bugs.
Gold
The final build of the game is referred to as Gold Master. At this phase, the game is locked in and can’t be changed
Middleware
Third-party pre-made engines that can be mixed and matched. Middleware from different companies will require custom software coding in order to communicate.
Game Engine
Keeps track of everything happening in the game, but isn't directly responsible for generating visuals/sound. Elements in this virtual world are called game objects.
Game objects
Could be a creature, wall, vehicle, weapon, projectile, or anything else represented in the world. Each game object has a specific location in the world that is referenced on an x/y (2D) or x/y/z (3D) axis.
Positional location of game objects affects audio team - explain
Positional information of game objects is used to determine how sounds will be heard
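As a rough illustration, a 2D engine might derive volume and pan from the emitter and listener positions. This is a hypothetical sketch: audio engines such as Wwise do this internally, and the function name and linear falloff are assumptions.

```python
import math

def spatialise(emitter, listener, max_distance=50.0):
    """Derive volume and pan from emitter/listener positions (2D sketch)."""
    dx = emitter[0] - listener[0]
    dy = emitter[1] - listener[1]
    distance = math.hypot(dx, dy)
    # Volume falls off linearly and is silent beyond max_distance.
    volume = max(0.0, 1.0 - distance / max_distance)
    # Pan: -1.0 = hard left, +1.0 = hard right.
    pan = max(-1.0, min(1.0, dx / max_distance))
    return volume, pan

# An emitter 10 units to the listener's right: quieter and panned right.
vol, pan = spatialise(emitter=(10.0, 0.0), listener=(0.0, 0.0))
```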
Audio Engine
Handles when and how sounds will be played in a game. You can control volume, panning, and EQ, but specialised controls include the ability to randomise playback parameters and to help manage resources (memory & CPU).
Graphics Engine
Paints the picture shown to the player based upon where all the game objects are. Uses a lot of processing power to render the visuals fast enough that gameplay remains fluid.
Graphics Engine effect on Audio team
If a lot of CPU power is being used by the graphics system, then the amount of sounds that can be played at any one time may need to be limited
Physics Engine
Used to determine how objects in the game interact with other objects within a 3D environment - these factors can be communicated to audio engine to match
AI Engine
Artificial intelligence systems are used to determine how NPCs within the game react to what the player does. Audio example: if a player is detected by an enemy, a different type of music can be played, letting the player know they've been detected.
Behaviour
When a particular game call is received by the audio engine, a predetermined sound is played; not to be confused with the term 'trigger'.
How can behaviour differ from a midi message triggering a sound
Behaviour can include randomisation. For example, footsteps can have an advanced behaviour so that when a footstep game call is received, one of several footstep recordings is played, so that a single sound isn't repeated. This can go further and vary by the type of ground the footsteps are occurring on.
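The randomised-footstep behaviour above can be sketched in a few lines. The surface names and filenames are hypothetical; the anti-repeat rule is one common refinement, not a fixed engine feature.

```python
import random

# Hypothetical footstep banks keyed by surface type (filenames are assumptions).
FOOTSTEPS = {
    "grass": ["grass_01.wav", "grass_02.wav", "grass_03.wav"],
    "metal": ["metal_01.wav", "metal_02.wav"],
}

def on_footstep(surface, last_played=None):
    """Pick one of several recordings, avoiding an immediate repeat."""
    choices = [f for f in FOOTSTEPS[surface] if f != last_played]
    return random.choice(choices)

clip = on_footstep("grass", last_played="grass_02.wav")
```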
Spatialisation
Ability to localise the source position of a sound - must be decided if sound will be rendered as 2D or 3D
2D sound
When the audio engineer precisely controls the characteristics of how the sound will be heard by using volume, pan control, and filters to sonically match the onscreen image; there is no technical link between the image and the sound.
3D sound
When the audio engineer allows the audio engine to automatically make decisions about how the sound will be heard - changes to volume, pan position, etc will happen automatically based on geometric relationship between the emitter and listener
Emitter
Game object making the sound
Listener
Typically the player’s position
Attenuation curves
How sound changes depending on the emitter/listener relationship can be tailored by creating these. You can set max/min values for volume, etc
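An attenuation curve can be thought of as a set of (distance, volume) points interpolated between clamped max/min values. This is a minimal sketch of the idea; real tools let you draw these curves graphically, and the shapes here are assumptions.

```python
def attenuation(distance, curve):
    """Evaluate a piecewise-linear attenuation curve of (distance, volume) points."""
    points = sorted(curve)
    if distance <= points[0][0]:
        return points[0][1]       # clamp to the max value
    if distance >= points[-1][0]:
        return points[-1][1]      # clamp to the min value
    # Linearly interpolate between the surrounding control points.
    for (d0, v0), (d1, v1) in zip(points, points[1:]):
        if d0 <= distance <= d1:
            t = (distance - d0) / (d1 - d0)
            return v0 + t * (v1 - v0)

# Full volume at the emitter, half volume at 20 units, silent past 50.
curve = [(0.0, 1.0), (20.0, 0.5), (50.0, 0.0)]
vol = attenuation(30.0, curve)
```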
Cone attenuation
How sound is affected when the listener is not facing the emitter
Obstacles
Affect how the listener will hear sound
3 ways the game engine will deal with presence or absence of obstacles
Direct Path
Obstruction
Occlusion
Direct Path
Unblocked path of sound - no obstacle
Obstruction
When the direct sound is partially blocked, but the reflections are unaltered
e.g. Being behind a pillar
Occlusion
When both the direct path and reflected paths are blocked
e.g. being behind a wall
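The three path states above might map onto processing roughly like this. The volume scalars and filter cutoffs are illustrative assumptions, not engine defaults; reflections would be handled by a separate send.

```python
def apply_path_state(volume, state):
    """Sketch of how each path state could affect the direct sound."""
    if state == "direct":
        # Unblocked path: the dry sound is unaltered.
        return {"volume": volume, "lowpass_hz": None}
    if state == "obstructed":
        # Direct path partially blocked (e.g. a pillar): duck and darken
        # the dry sound; reflections stay unaltered.
        return {"volume": volume * 0.6, "lowpass_hz": 4000}
    if state == "occluded":
        # Both direct and reflected paths blocked (e.g. a wall):
        # heavy filtering on everything.
        return {"volume": volume * 0.3, "lowpass_hz": 800}
    raise ValueError(f"unknown path state: {state}")

out = apply_path_state(1.0, "occluded")
```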
Primary reason for sound limitations in game audio
Memory and processing power will mostly go to visuals and gameplay
Areas in games that use most of the RAM
Graphics and regular gameplay
Ways to use allotted memory space (get the most bang for your buck)
Lower sample rates & bit depths
Data compressed audio files
Active (how and when sounds are loaded into RAM)
Audio that is held in RAM and can be played instantly
Streaming (how and when sounds are loaded into RAM)
Audio played back directly from the disk. There may be some hesitation between when the sound is requested and when the sound is played. Commonly used for long ambient sounds or music
Pre-fetching (how and when sounds are loaded into RAM)
The leading edge of the audio file is loaded into RAM with the remainder streaming from the disk when necessary
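Pre-fetching can be sketched with plain file-like I/O: only the leading chunk is held in RAM, and the rest is read on demand. The chunk size and function names are assumptions for illustration.

```python
import io

CHUNK = 8  # tiny chunk size for illustration; real engines use far larger

def prefetch(f):
    """Load only the leading edge of the audio into RAM."""
    return f.read(CHUNK)

def stream_rest(f):
    """Stream the remainder from disk in chunks, as needed."""
    while chunk := f.read(CHUNK):
        yield chunk

audio = io.BytesIO(b"HEADERXXrest-of-the-audio-data")
head = prefetch(audio)                # instantly available from RAM
tail = b"".join(stream_rest(audio))   # streamed on demand
```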
Game voices
Just like in a DAW, every sound generated requires a voice, and there is a limit on how many ‘voices’ can be produced at one time due to processing power
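When the voice limit is hit, engines typically "steal" a voice according to some priority rule. A minimal sketch, assuming the quietest voice is stolen first (class and method names are hypothetical):

```python
import heapq

class VoicePool:
    """Hypothetical voice limiter: when full, steal the quietest voice."""

    def __init__(self, max_voices):
        self.max_voices = max_voices
        self.voices = []  # min-heap of (volume, name)

    def play(self, name, volume):
        if len(self.voices) >= self.max_voices:
            heapq.heappop(self.voices)  # drop the quietest sound
        heapq.heappush(self.voices, (volume, name))

pool = VoicePool(max_voices=2)
pool.play("music", 0.9)
pool.play("ambience", 0.4)
pool.play("gunshot", 1.0)   # steals "ambience", the quietest voice
```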
Multi-platform development
Making a game for different platforms
i.e. making a game playable on PS5, XBOX, PC
Things to consider when making a game multi-platform, audio-wise
Different consoles may have different processing power, therefore a game on the Switch may be able to handle a collection of 50 footstep audio files, while the Wii may only be able to handle 10
Designer Layout
The main layout for doing work: build your project, import files, adjust parameters, insert effects, link events, and create event actions.
Profiler Layout
Analyse CPU usage, RAM, streaming voices, as well as troubleshoot events and actions during realtime gameplay
SoundBank Layout
Used to organise all triggered events into categories, which can be used to generate the necessary code and assets to be used in the final build of the game
Mixer Layout
Create custom mixing desks and sound casters to audition and mix groups of audio
Schematic Layout
Gives the integrator a signal flow type view
Interactive Music Layout
Controls the playback of different musical scores according to game parameters such as ‘power up’ or ‘stress’
Dynamic Dialogue Layout
Allows the dialogue played in the game to change according to game states or parameters
Game Object Profiler Layout
Realtime 3D dynamic visual representation of the watched game objects and listeners
3 Hierarchies
Master Mixer
Actor Mixer
Interactive Music
Parent Child concept
Within a hierarchy - there are folders/objects that are parents e.g. Weapons, that would contain children e.g. shotgun, rifle. Any parameters changed for the parent will be applied to all its children. And if you make an adjustment to just a child, it will only affect that one child
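Parent-child inheritance can be modelled as each object's parameter being an offset on top of its ancestors'. A sketch under that assumption (class and attribute names are hypothetical, not Wwise API):

```python
class AudioObject:
    """Sketch of parent/child parameter inheritance in a hierarchy."""

    def __init__(self, name, volume_db=0.0, parent=None):
        self.name = name
        self.volume_db = volume_db  # offset relative to the parent
        self.parent = parent

    def effective_volume_db(self):
        # A child's final volume is its own offset plus all ancestors'.
        parent_db = self.parent.effective_volume_db() if self.parent else 0.0
        return self.volume_db + parent_db

weapons = AudioObject("Weapons", volume_db=-6.0)
shotgun = AudioObject("Shotgun", volume_db=-2.0, parent=weapons)

# Lowering the parent affects every child; editing a child affects only it.
weapons.volume_db = -12.0
```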
Work Unit
A type of folder that can be moved between projects. It holds individual work and settings (not audio). This allows different parts of a game to be worked on by different people at different times.
Actor-Mixer container
A container found in the Actor Mixer Hierarchy, and is used to hold and organise different audio elements. It is a parent container to children sound objects. It is like a VCA control
Sound Object
Similar to an audio track and is parent to an actual .wav file. Controls all parameters for a single sound
Events
A type of object that receives game calls from the game engine. When an event receives a game call, it can perform actions like ‘play,’ and play can be linked to a sound effect
SoundBank container files
Contains audio/motion data and media for a defined section of the game, and are loaded into the game platform’s memory at different points in the game. This allows us to organise audio content and optimise amount of memory being used
Random & Sequence Containers
Used to organise sound objects in a way similar to the actor-mixer. A random container can be configured to randomly play one of its child sound objects, while a sequence container plays its children in a pre-determined order using a 'playlist'.
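The difference between the two container types can be sketched in a few lines (class names are illustrative, not Wwise API):

```python
import random

class SequenceContainer:
    """Plays its children in a fixed playlist order, looping when exhausted."""
    def __init__(self, children):
        self.children = list(children)
        self.index = 0
    def next_sound(self):
        sound = self.children[self.index % len(self.children)]
        self.index += 1
        return sound

class RandomContainer:
    """Plays one of its children at random each time it is triggered."""
    def __init__(self, children):
        self.children = list(children)
    def next_sound(self):
        return random.choice(self.children)

seq = SequenceContainer(["intro.wav", "loop.wav", "outro.wav"])
first, second = seq.next_sound(), seq.next_sound()
```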