Sound and Action
Introduction to Sound and Action - W1
Audio Assets
We were tasked with creating three types of audio assets.
Some original assets were provided so we could edit them in Reaper to make them sound different.
Gunshot - Edited
Button click - Edited
Ambient loop - Edited
Listening Exercise
This week's activity was a listening exercise at three different locations.
Location 1 - My Living Room DAYTIME
​
- Desktop fans
- Monitor buzz sound
- Phone call hold music whilst on speaker mode
- Sister assembling shoe rack
  - metal pieces
  - cloth pieces
  - footsteps whilst walking around
  - footsteps rustling
- Exhaust fans above the stove
- Backdoor opening and closing as someone walked in from outside
- Microwave beeping
- Potato peeling/cutting
Location 2 - Backyard Gazebo
​
- Airplanes
- Dry leaves rustling as wind carries them through the floor tiles
- Gazebo creaking noises from wind, wood and plastic materials
- Birds as they fly by
- Birds chirping
- Cars/motorbikes from the main road across from the house
- Water falling from the drains
Location 3 - Park by main road
​
- People running/walking past
- Wind
- Rustling leaves from the trees caused by the wind
- Airplanes flying by
- Cars/motorbikes passing on the nearby road
- Birds chirping
- Ambulance/police siren
- Dogs barking and running through
Introduction to Audio Middleware - W2
Workshop FMOD Events
The workshop task this week was to create Events in FMOD using the audio assets created last week. This was to make us more familiar with the FMOD interface.
FMOD Footsteps Event
An additional task was to create a footsteps event. This event would require a parameter with labelled values which could be manually set to trigger a different set of sounds.
​
I downloaded four different footsteps tracks from Pixabay for four different surface types: Concrete, Dirt, Stone and Wood.
Using Reaper, I separated each track into small individual sounds and exported them.
​
On FMOD under assets I created separate folders for each variation and imported the tracks.
I created a new footsteps event with multi instruments on the timeline, then created a new parameter that takes labelled values, named after each surface type.
I then dragged the appropriate audio files from the assets onto their relevant labels in the timeline.
Reaper footsteps tracks edited
FMOD Footsteps Event
Middleware Integration - W3
Workshop FMOD Unity Events
I integrated FMOD into a Unity project, disabled Unity's built-in audio system, and used the previous week's FMOD project's audio for basic audio events.
​
I added two empty objects: one for the ambient loop and another to play either the gunshot or the button click sound depending on which key is pressed (E or F).
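As a rough sketch, the trigger object's script can call FMOD one-shots directly. The event paths below are placeholders; they would need to match whatever the events are actually named in the FMOD Studio project:

```csharp
using UnityEngine;
using FMODUnity;

// Plays one of two FMOD one-shot events depending on the key pressed.
// The event paths are assumptions -- rename them to match your project.
public class KeyTriggeredSounds : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.E))
            RuntimeManager.PlayOneShot("event:/SFX/Gunshot", transform.position);
        else if (Input.GetKeyDown(KeyCode.F))
            RuntimeManager.PlayOneShot("event:/UI/ButtonClick", transform.position);
    }
}
```

`PlayOneShot` is convenient for fire-and-forget sounds like these; the looping ambience is better served by a Studio Event Emitter component set to play on start.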
FMOD/Unity Footsteps Event (Using parameters)
I also used the footsteps audio and set up four platforms, each with a different tag, to test the Terrain parameter of the footsteps event.
I set it to trigger a sound on pressing the Q key: on press it casts a ray to detect the tag (I wanted to check by layer, but most of the layers were already taken), sets the current terrain according to the tag, and then plays a footstep sound corresponding to that terrain type.
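A minimal sketch of that raycast-and-parameter flow, assuming an FMOD event at `event:/SFX/Footsteps` with a labelled "Terrain" parameter and platform tags matching the label names (all of these names are placeholders):

```csharp
using UnityEngine;
using FMODUnity;

// On Q: raycast down, read the platform's tag, set the labelled
// "Terrain" parameter to that tag, and play one footstep sound.
public class FootstepTester : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetKeyDown(KeyCode.Q)) return;

        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, 2f))
        {
            var footstep = RuntimeManager.CreateInstance("event:/SFX/Footsteps");
            // hit.collider.tag is assumed to be one of the labelled values
            // set up in FMOD (Concrete, Dirt, Stone, Wood).
            footstep.setParameterByNameWithLabel("Terrain", hit.collider.tag);
            footstep.start();
            footstep.release(); // let the instance clean itself up after playing
        }
    }
}
```

`setParameterByNameWithLabel` accepts the label string directly, so the tag-to-terrain mapping needs no lookup table as long as tags and labels share names.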
Advanced Parameter & Logic - W4
Workshop
For the workshop task we divided into groups to discuss events and what parameters could be applied to them to cause a change in the sound.
We came up with the following events.
Event: Bullet Impact
Parameters: Surface, Location, Occlusion, Caliber, Speed
​
Event: Weapon Reload
Parameters: Magazine size, Full mag, Partial empty, Empty mag
​
Event: Door Opening
Parameters: Door Type, Door Material, Door Size, Location
​
Event: Engine running
Parameters: Idle, Accelerating, Gear shift, Top speed
Event: Playing a guitar
Parameters: Guitar Type, Strings, Location, Speed
Event: Punching Impact
Parameters: Stamina, Body part, Impact sound, Damage
Event: Impact feedback
Parameters: Damage, Enemy type
Event: Explosion
Parameters: Distance, Elevation, Type, Size, Building, Nature, Occlusion
Weekly Activity
Using parameters set in FMOD and applying them in-engine; refer to the previous week's footsteps event.
Analysis & Industry Conventions - W5
Workshop task
For this week's workshop task we were to analyse the audio of one minute of gameplay footage from a Warhammer 40,000 video.
I have identified the following sounds:
SFX
- Pistol shot
- Pistol bullet travelling
- Pistol bullet hitting enemy shield
- Shotgun being shot
- Shotgun bullet travelling
- Shotgun bullet hitting enemy shield
- Rifle shots
- Plasma rifle shots
- Player footsteps
- Player shotgun shot
- Player shotgun bullet case coming off the shotgun
- Player shotgun opening
- Player inserting new bullet in the shotgun
- Player shotgun closing
- Player shotgun bullet hitting enemy
- Enemy "ghoul" growl when dropping
- Enemy "ghoul" when dropping on the floor
- Enemy "ghoul" growl when attacking
- Enemy "ghoul" swish when attacking
- Enemy "ghoul" being hit, fleshy sound
- Player shield being hit
- Player shield break
- Player melee weapon swing
- Player melee weapon hitting enemy
- Player melee defend being hit
- Grenade explosion sound
- Boss enemy being hit by melee weapon
AMB
UI
- Radio end transmission sound
MX
- Initial calmer battle song
STATE
- Radio transmission: lower sounds
- Enemy number: lower, higher, horde
- Player shield: down reduces frequencies, up restores them
- Boss dialogue: dialogue stopped, sidechaining
- Player ability: scoops frequencies
- Player dialogue
- Music change
VOICE
- VX NPC morrow 0001
Presentation - W6
For the presentation I found games with audio implementation that were relevant to what I intended to do for the A2 Artefact.
I wanted to take an approach that immerses the player in the environment. As examples of games that do this, I chose Red Dead Redemption 2 (Rockstar Games, 2018) and The Last of Us series (Naughty Dog, 2013; Naughty Dog, 2020).
​
Red Dead Redemption 2:
- Environmental Sounds: The game features a vast open-world setting with diverse environments such as forests, deserts, and mountains. The ambient sounds like wildlife, weather, and other environmental elements are meticulously designed to create a realistic and immersive atmosphere.
- Dynamic Music: The game's music dynamically adapts to the player's actions and the in-game events. For example, the music may intensify during action-packed moments or become more subdued during exploration. This dynamic music system enhances the emotional impact of the gameplay.
- Dialogue and Voice Acting: The game boasts a rich narrative with a vast cast of characters. The voice acting is of high quality, and the dialogues are spatially implemented, meaning you can hear characters from different directions based on their positions in the game world.
- Horse Sounds: The sounds of horses, including their neighs, galloping, and whinnies, contribute to the authenticity of the game's Wild West setting. Horses play a significant role in transportation and immersion.
​
The Last of Us:
- Spatial Audio: The Last of Us series uses advanced audio techniques to create a sense of spatial awareness. This is particularly crucial in stealth sections where players must rely on sound cues to navigate and avoid enemies.
- Emotional Impact: The game's score, composed by Gustavo Santaolalla, is minimalist but emotionally resonant. The music complements the somber and tense atmosphere, enhancing the emotional impact of key moments in the story.
- Environmental Detail: Similar to Red Dead Redemption 2, The Last of Us pays attention to environmental sounds. Whether it's the creaking of floorboards, the rustling of leaves, or distant sounds of infected creatures, these details contribute to the immersive nature of the game.
- Character Voice Acting: The voice acting in The Last of Us series is highly praised. The performances of the main characters, Joel and Ellie, convey a wide range of emotions, adding depth to the storytelling.
​
Both games showcase the importance of audio in creating a truly immersive gaming experience. The careful attention to detail in the sound design enhances the sense of presence and emotional engagement for players.
References
​
Naughty Dog. (2013). The Last of Us [Video Game]. Sony Computer Entertainment.
Naughty Dog. (2020). The Last of Us Part II [Video Game]. Sony Interactive Entertainment.
Rockstar Games. (2018). Red Dead Redemption 2 [Video Game]. Rockstar Games.
Mixing for Real-Time Media - W7
Workshop task - Bus hierarchy
For this week we had to create a bus hierarchy appropriate to the project. I created the following hierarchy, separating the various sound effects into groups so I could adjust their output volumes relative to one another when testing.
I added a multiband EQ to the Master Bus with a highpass and a lowpass filter, then created an override snapshot that reduces the volume of the Voice, Ambience and SFX groups. This creates a muffled, lower-volume mix for a paused-game state. I then created an event that uses the snapshot and passed it a parameter for the game status.
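Since FMOD exposes snapshots as events, the pause state can be driven from a small script like the sketch below. The snapshot path `snapshot:/Paused` is a placeholder for whatever the override snapshot is named:

```csharp
using UnityEngine;
using FMOD.Studio;
using FMODUnity;

// Starts/stops the pause override snapshot from game state.
public class PauseAudio : MonoBehaviour
{
    private EventInstance pauseSnapshot;

    void Start()
    {
        pauseSnapshot = RuntimeManager.CreateInstance("snapshot:/Paused");
    }

    public void SetPaused(bool paused)
    {
        if (paused)
            pauseSnapshot.start(); // snapshot takes over, muffling the mix
        else
            pauseSnapshot.stop(STOP_MODE.ALLOWFADEOUT); // fade back to normal
    }
}
```

Stopping with `ALLOWFADEOUT` lets the snapshot's AHDSR release ramp the filters back smoothly instead of snapping.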
Weekly Task - VCAs
For the weekly task we were to create a VCA structure. Since my project only has three volume sliders, I chose to make VCAs for the Master Bus, Ambience and SFX, organised as shown in the image below.
Then I made a simple script that takes the slider's float value and adjusts the VCA volume based on its position.
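A sketch of that slider script, assuming a 0-1 UI slider mapped linearly onto the VCA's volume (the VCA path is a placeholder for whatever the VCA is named in FMOD Studio):

```csharp
using UnityEngine;
using UnityEngine.UI;
using FMODUnity;

// Binds a UI slider (0..1) to an FMOD VCA's output volume.
public class VcaSlider : MonoBehaviour
{
    [SerializeField] private string vcaPath = "vca:/SFX"; // placeholder path
    [SerializeField] private Slider slider;

    private FMOD.Studio.VCA vca;

    void Start()
    {
        vca = RuntimeManager.GetVCA(vcaPath);
        vca.setVolume(slider.value); // apply the saved slider position
        slider.onValueChanged.AddListener(v => vca.setVolume(v));
    }
}
```

One script instance per slider covers the Master, Ambience and SFX VCAs by changing the serialized path in the inspector.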
Advanced Games Audio 1 - W8
Workshop task - Field Recording
This week we borrowed field recorders from the university and went outside to record some sounds first-hand.
I tried recording footsteps on different surfaces (stone, grass/leaves, wood) and also tried to record the surrounding environment, but it was relatively windy so many of the recordings didn't turn out well.
Recording 1 - footsteps stone 1
Recording 2 - footsteps stone 2
Recording 3 - footsteps stone 3
Recording 4 - footsteps
Recording 5 - footsteps leaves 1
Recording 6 - footsteps
Recording 7 - ambience walking
Recording 8 - footsteps wood 1
Recording 9 - ambience walking (too windy)
Recording 10 - metal chain 1
Recording 11 - metal chain 2
Recording 12 - hits on metal rubbish bin
Weekly task - Recordings
For the weekly task we were to record various Foley SFX. I tried recording sounds from different materials such as pieces of clothing, zippers and keychains, among others. However, I set the recording sensitivity too low, so the recordings are very hard to hear.
​
I made the recordings available on the following link:
​
https://drive.google.com/drive/folders/1DvBkiu8ZUa8nONjerFSoLreIPdTKr8pq?usp=sharing
Advanced Games Audio 2 - W9
Workshop task - Discussion
Discussion recording
We split into groups to discuss a few topics.
​
- Why we think silence can be important in audio.
- What function dialogue, and the lack of it, serves.
- How music can be used as a narrative device.
We discussed that the lack of sound draws the viewer into what is happening in the scene and can make it more intense. We used the example scenes from the film "No Country for Old Men" shown during the lecture: the silence puts the viewer on edge, uneasy and focused on the actions shown.
​
The function of dialogue is to let the viewer understand what is actually happening and to put emotion behind a scene. By contrast, in the film "A Quiet Place" there is an almost complete absence of dialogue, with the characters communicating through sign language to fit the narrative, as making noise would mean imminent danger for them.
​
We discussed how music can be used to dramatise a scene, an example being the use of music in Steven Universe. Another example, from games, is a character's suicide scene in Cyberpunk 2077, where the track, together with the voiced lines, heightens the emotional charge of the scene.
Another way music is used in narrative is in death scenes, particularly when a character giving an emotional speech backed by music is suddenly taken out and the music stops. To quote Najeh, "that character's story just ends there, you will never hear their song again".
Lastly, we discussed how musicals tell their story through song, making the music the narrative itself.
Weekly task
For this task we were to pick a piece of media and analyse the audio, since I am very passionate about Cyberpunk 2077 I chose to write about its sound design.
Diving into Cyberpunk 2077's Sound: What Works and What Doesn't
Picture this - you're wandering the neon-lit streets of Night City, the rain pouring down, and the futuristic sounds of Cyberpunk 2077 all around you. Developed by CD Projekt Red, the game aimed to give players an unforgettable audio experience. In this breakdown, we'll talk about what makes the game's sound cool and where it kinda falls short.
​
The Good:
Feeling the City Vibes:
One thing Cyberpunk 2077 nails is making you feel like you're smack in the middle of a busy, crazy city. The distant chit-chat, the whir of flying cars, and the buzz of high-tech gadgets - it all adds up to make Night City feel alive.
​
Action-Packed Sounds:
When the bullets start flying, the game kicks into high gear with intense sound effects. Guns blaze, explosions pop, and high-tech weapons whirr, making the action scenes super immersive. The audio reacts to what you're doing, making everything feel more real.
​
Jamming Soundtrack:
The music in Cyberpunk 2077 is like the cherry on top. Composed by Marcin Przybyłowicz and his team, the soundtrack fits the vibe perfectly. It even changes based on what's happening in the game, adding extra emotion to key moments. The mix of different music styles fits the futuristic theme and adds a ton of depth to the game.
​
Paying Attention to the Little Things:
In a world full of people with high-tech upgrades, the game pays attention to the details. The hum of bionic limbs, the beeping of implants - these little touches make the game world feel more authentic. It's the kind of stuff that makes you believe you're in a city on the edge of a tech revolution.
​
The Bad:
Tech Hiccups and Glitches:
Despite all the good stuff, Cyberpunk 2077 got a bad rap for its technical problems. Some of these bugs messed with the audio, causing sounds to not sync up with what was happening on screen. These glitches took away from the immersion and, in some cases, messed with the story, which is a bummer considering how cool the audio could have been.
​
Boring Radio Stations:
Night City is supposed to be this crazy mix of cultures, but the radio stations in the game don't fully capture that. The options are kinda limited, and it feels like they could've added more variety to give a better sense of the city's diversity. A bit more music variety would have made the game feel more authentic.
​
Talking Troubles:
Some players had issues with how the game mixed dialogue, especially in busy areas. In a city full of noise, sometimes it's hard to hear what characters are saying. This makes it tough to follow the story, and you end up straining to catch the important details in conversations.
​
Cyberpunk 2077's audio is like a roller-coaster ride - thrilling highs and a few disappointing lows. It successfully immerses you in a futuristic world, but technical hiccups and some design choices hold it back. As the game gets updates and fixes, let's hope the sound matches the visual spectacle, so players can fully dive into the heart of Cyberpunk 2077.
​
Advanced Games Audio 3 - W10
Workshop task - Side chain compression
For this week's workshop task we were to create a side chain compression in FMOD, I used the assets created in week one.
​
Side chain screenshots
Side chain video example
Weekly activity task
The weekly task was to create a sound-only game design document that does not have any visuals.
​
Before doing this task I checked a game called The Vale: Shadow of the Crown for inspiration.
​
Harmony's Embrace
​
Concept:
"Harmony's Embrace" is a groundbreaking audio-only game that transcends the need for visuals, delivering an immersive and accessible experience. Players embark on an enchanting journey through a world shaped entirely by sound, offering a unique adventure for everyone.
​
Genre: Adventure, Music, Puzzle
​
Platform: PC, Console
​
Game Overview:
In "Harmony's Embrace," players explore a fantastical realm where sound is the essence of existence. The absence of visuals enhances the reliance on audio cues, providing an innovative and inclusive gaming experience. Through strategic gameplay and a musical narrative, players uncover the secrets of a magical world.
​
Key Features:
- Symphonic Soundscapes:
  - Craft a mesmerizing symphony of audio elements, using spatial audio technology to create a vivid and dynamic environment.
  - Differentiate environments through unique musical compositions, immersing players in a world where each area has its own distinctive sound signature.
- Musical Storytelling:
  - Convey the narrative through a musical score, blending voice acting with orchestrated melodies that evolve based on player decisions.
  - Player choices influence the musical themes, creating a personalized soundtrack that adapts to their journey.
- Auditory Exploration:
  - Encourage players to explore by following the harmonies and rhythms within the environment.
  - Implement interactive musical landmarks that trigger changes in the audio landscape, providing both guidance and challenges.
- Puzzles in Tune:
  - Design puzzles that revolve around rhythm, melody, and harmony, requiring players to use their auditory skills for problem-solving.
  - Introduce instruments as puzzle elements, allowing players to orchestrate solutions through strategic manipulation of sound.
- Sonorous Characters:
  - Populate the world with characters represented by distinct musical instruments and tones.
  - Develop relationships through musical interactions, with characters having unique musical motifs that convey their personalities.
- Ensemble-Based Gameplay:
  - Collaborative gameplay mechanics that allow players to join forces with in-game companions, creating harmonies that unlock new abilities and solve challenges.
​
Gameplay Mechanics:
- Harmonic Resonance Ability:
  - Provide players with a Harmonic Resonance ability, enabling them to manipulate and control sound elements for solving puzzles and overcoming obstacles.
- Dynamic Audio Events:
  - Introduce dynamic events triggered by player actions, creating seamless transitions between different musical compositions and altering the game world's ambiance.
- Melodic Crafting System:
  - Implement a musical crafting system, allowing players to create harmonious items and tools by combining various musical elements.
​
"Harmony's Embrace" redefines the possibilities of audio-only gaming, offering a musical adventure accessible to all players. By leveraging innovative audio design and weaving a musical narrative, the game aims to provide an inclusive and enchanting experience that transcends traditional gaming boundaries.
A1 - Report
Audio and Media Mastery in The Last of Us Part II and Practical Applications for Game Development
The Last of Us Part II: A Symphony of Soundscapes
The Last of Us Part II, an iconic creation by Naughty Dog released in 2020, stands at the forefront of modern gaming experiences, demonstrating the pivotal role that audio and sound design play in shaping player engagement. Moving beyond the conventional role of background music, the game employs sophisticated audio techniques that not only enhance gameplay but also contribute significantly to the narrative, atmosphere, and emotional impact. This report meticulously explores the audio and sound design aspects of The Last of Us Part II, delving into the intricacies that make it a standout example of how audio can elevate a gaming experience.
Environmental Sounds and World-building:
A distinctive feature of The Last of Us Part II is its meticulous attention to environmental sounds, which play a crucial role in constructing a believable and immersive game world. The auditory landscape of the game is carefully crafted to reflect the desolation of a post-apocalyptic world. The eerie creaking of dilapidated buildings, the distant echoes of wildlife in a forgotten forest, and the ominous groans of infected creatures all contribute to the rich tapestry of the game's setting. These environmental sounds not only ground players in the fictional world but also serve as narrative devices, foreshadowing dangers and providing essential cues for strategic gameplay.
In exploring the environmental soundscape, Naughty Dog showcases a commitment to authenticity. The development team conducted extensive field recordings to capture the genuine sounds of various environments, ensuring that every footstep, weapon reload, or gust of wind feels true to life. This attention to detail not only enhances the immersive quality of the game but also adds layers of depth to the player's experience, reinforcing the notion that every sound tells a story in the desolate landscapes of The Last of Us Part II.
Dynamic Audio Design:
The Last of Us Part II's dynamic nature demands a flexible and adaptive audio design that responds to the player's actions and the unfolding narrative. Unlike static soundtracks, the game's audio elements dynamically adjust based on the player's choices, creating an emotionally resonant experience. Moments of tension are accentuated by an intensifying heartbeat, while the transition between indoor and outdoor environments prompts a subtle change in ambiance. This dynamic audio design not only heightens the player's emotional engagement but also contributes to the seamless integration of sound into the overall gameplay experience.
The dynamic audio elements become particularly evident during intense combat sequences, where the soundscape evolves in response to the ebb and flow of battle. The clinking of weapons, the shouts of enemies, and the visceral sounds of hand-to-hand combat create an immersive auditory experience that mirrors the chaos and urgency of the on-screen action. This adaptability in audio design serves not only to enhance the gaming experience but also to reinforce the narrative impact, making each encounter in The Last of Us Part II a visceral and unforgettable auditory journey.
Realism and Authenticity:
The commitment to realism in The Last of Us Part II extends beyond environmental sounds to encompass all aspects of the game's audio design. The authenticity of the soundscape is a result of Naughty Dog's dedication to capturing real-world sounds through extensive field recording sessions. The team painstakingly recorded the sounds of nature, urban decay, and everyday life to infuse the game with a level of authenticity that sets it apart.
The realistic audio elements contribute to the player's suspension of disbelief, creating an illusion where the virtual world of The Last of Us Part II feels tangible and real. Whether it's the crunch of gravel underfoot, the distant howls of infected creatures, or the subtle echoes in abandoned buildings, each sound serves as a testament to the developers' commitment to delivering an unparalleled audio experience. This dedication to authenticity not only adds layers of believability to the game world but also enhances the emotional impact of key narrative moments.
Emotional Impact through Music:
Beyond the realm of environmental and interactive sounds, The Last of Us Part II employs a powerful musical score to evoke emotions and underscore pivotal moments in the narrative. Renowned composer Gustavo Santaolalla's contribution to the game cannot be overstated, as his haunting and evocative score becomes an integral part of the storytelling process.
The music in The Last of Us Part II is more than a background accompaniment; it is a narrative tool that guides the player through the emotional peaks and valleys of the story. Melancholic guitar tunes during quiet moments create a sense of introspection, while intense orchestral arrangements heighten the emotional impact of action sequences. The seamless integration of music into the overall audio design enhances the player's emotional connection with the characters and events, making The Last of Us Part II a masterclass in using music as a storytelling device in gaming.
Spatial Audio and Immersive Experience:
In the realm of technological innovation, The Last of Us Part II leverages advanced spatial audio technologies to elevate the player's sense of immersion. The implementation of 3D audio is a game-changer, allowing for a more accurate representation of sound directionality. This technology enables players to pinpoint the source of a sound with unprecedented precision, adding a layer of realism that enhances situational awareness.
Spatial audio becomes particularly crucial in stealth and combat situations, where the ability to discern the direction of approaching enemies or the distant sounds of danger can mean the difference between success and failure. The integration of 3D audio not only serves a practical purpose in gameplay mechanics but also contributes to the overall sense of presence, making the post-apocalyptic world of The Last of Us Part II feel more tangible and immediate.
​
In conclusion, The Last of Us Part II sets an exceptional standard for audio and sound design in gaming. The intricate interplay of environmental sounds, dynamic audio design, realism, emotional musical scoring, and spatial audio technologies collectively contribute to an unparalleled level of immersion. The auditory elements of the game are not mere embellishments but active participants in shaping the player's emotional journey through the desolate landscapes of a post-apocalyptic world. The commitment to authenticity and innovation in audio design positions The Last of Us Part II as a benchmark for how sound can be harnessed to create an unforgettable gaming experience.
Crafting Immersive Auditory Landscapes: Lessons from The Last of Us Part II
Having examined the exemplary audio and sound design of The Last of Us Part II in the previous section, this part of the report shifts the focus to practical applications. How can the sophisticated audio techniques from the game be implemented in a new project to enhance player immersion and storytelling? By drawing lessons from The Last of Us Part II, game developers can gain insights into crafting immersive auditory landscapes that elevate the overall gaming experience.
Environmental Sounds and World-building:
Lesson Learned: In The Last of Us Part II, environmental sounds play a pivotal role in building a believable game world. To implement this concept in a new project, developers should conduct extensive field recordings to capture authentic sounds from various environments. This attention to detail not only grounds players in the game world but also provides narrative cues and foreshadows events.
Application: For a new game project, invest time and resources in capturing real-world sounds. From urban environments to natural settings, each sound should contribute to the authenticity of the game world. Consider implementing a dynamic environmental sound system that reacts to the player's actions, creating a living, breathing world that responds to their presence.
Dynamic Audio Design:
Lesson Learned: The Last of Us Part II excels in dynamic audio design, adjusting the soundscape based on player actions and narrative developments. This adaptability enhances emotional engagement and creates a seamless integration of sound into gameplay.
Application: In a new project, prioritize dynamic audio design to create a responsive and immersive experience. Develop a system that adjusts the audio based on the player's choices and the evolving narrative. Whether it's changing the intensity of the background music during tense moments or dynamically altering ambient sounds, the goal is to make the player's actions feel directly connected to the auditory experience.
Realism and Authenticity:
Lesson Learned: Authenticity is a cornerstone of The Last of Us Part II's audio design. The commitment to realism, achieved through extensive field recordings, enhances player immersion and adds layers of believability to the game world.
Application: For a new game project, follow Naughty Dog's lead by investing in authentic sound sources. Conduct field recordings to capture the sounds of the intended environments. Use these recordings as a foundation to build a sound library that reflects the nuances of the game world. This commitment to authenticity will contribute to a more immersive and convincing player experience.
Emotional Impact through Music:
Lesson Learned: The Last of Us Part II demonstrates how a powerful musical score can elevate emotions and underscore narrative moments. Gustavo Santaolalla's score is not merely background music; it actively contributes to the storytelling process.
Application: In a new project, collaborate with a skilled composer to create a musical score that enhances the narrative. Consider how different musical themes can evoke specific emotions in different parts of the game. Integrate the music dynamically, allowing it to respond to the player's actions and the unfolding story. A well-crafted musical score can significantly enhance the emotional impact of key moments.
Spatial Audio and Immersive Experience:
Lesson Learned: The Last of Us Part II leverages advanced spatial audio technologies to heighten the player's sense of immersion. The implementation of 3D audio enhances situational awareness and contributes to a more realistic and immediate game world.
Application: In a new project, prioritize spatial audio technologies to create a more immersive experience. Implement 3D audio to accurately represent sound directionality, enabling players to pinpoint the source of a sound. This is particularly crucial in stealth and combat situations, where spatial awareness can significantly impact gameplay. Investing in these technologies will contribute to a more immersive and engaging player experience.
​
In conclusion, The Last of Us Part II provides valuable lessons for game developers seeking to enhance their audio and sound design. By incorporating concepts such as environmental sounds, dynamic audio design, realism, emotional impact through music, and spatial audio, developers can create games that offer a heightened level of immersion and storytelling. These lessons serve as a roadmap for crafting auditory landscapes that captivate players and elevate the overall gaming experience.
As game development continues to advance, the integration of sophisticated audio techniques will play an increasingly critical role in shaping the future of interactive entertainment. By building on the lessons learned from The Last of Us Part II, developers can contribute to the evolution of gaming experiences, delivering not just visually stunning worlds but also rich, immersive soundscapes that leave a lasting impression on players.
References:
Naughty Dog. (2020). The Last of Us Part II [Video Game]. Sony Interactive Entertainment.
A2 Artefact
UNITY 3D Game Kit
This assessment's task consists of implementing adaptive audio into a scene.
For this project I chose to use the 3D Game Kit in Unity; I will be replacing the audio and implementing it with FMOD rather than Unity's built-in audio system.
To start, I set up the project and had a run-through to assess how the audio in the project worked, then made an audio asset list based on the current project and the changes I want to make.
Just removing the background music already gave the project a different feel.
Audio Considered Asset List
SFX
​
Player Source
Footsteps
Walking (Earth, Grass, Puddle, Stone)
Running (Earth, Grass, Puddle, Stone)
Landing (Earth, Grass, Puddle, Stone)
Emote – Jump
Emote – Landing
Emote – Death
Emote – Attack
Emote – Hurt/Take Damage
​
Weapon Pick-up
Weapon Attack Swing
Weapon Impact Hit – Foliage, Rubble
​
​
Environment
​
Dropship (Idle)
Wind
Water
Birds
Fireflies
​
Interactable
Destructive Boxes
Doors (small, big)
Chests (Glow, Open)
Portal
Music
Possibly none
​
UI
Menu – Start Game Button
Menu – Button Confirm
Menu – Button Return
Menu - Ambient
Voice (optional)
“Ellen” Tutorial Dialogue lines
INFO1 - Where am I? Can I even move my legs? I'll try W, A, S and D. Maybe they'll start to feel normal again.
INFO2 - Can I jump over this if I press SPACE?
INFO3 - This looks fun to smash stuff with! I'll LEFT CLICK on my MOUSE and see what happens...
INFO4 - Oooh what a weird stepping stone... I want to tread on it and see what it does.
INFO4b - This must open somehow. I'll explore and see what I can find.
INFO5 - I wonder what those massive door crystals are for? I should look around for some more switches.
​
Keep in mind
Reverb Areas (refer to original project)
​
​
Project Ready Assets
Made, recorded, edited
​
Ambience (Game Synth tool)
Wind
Strong Wind
​
​
UI (Game Synth tool)
Menu – Button Hover
Menu – Button Confirm
Menu – Button Return
​
SFX (Recorded Zoom H1n)
Emote – Landing (7 assets)
Emote - Weapon Swing (7 assets)
Emote - Death (2 assets)
Emote - Jump (6 assets)
​
Online sourced and edited
​
Environment
Dropship Idle (Pixabay)
Water running (Pixabay)
​
SFX
Chests (Glow)
​
​
Original Project Used Assets
​
SFX
Footsteps
Chest Open
Weapon Pick-up
Weapon Swing
Weapon Impact Hit
Fireflies
Doors
Portal
Destructive boxes
​
Voice Lines (Recorded - Zoom H1n)
INFO1 - Where am I? Can I even move my legs? I'll try W, A, S and D. Maybe they'll start to feel normal again.
INFO2 - Can I jump over this if I press SPACE?
INFO3 - This looks fun to smash stuff with! I'll LEFT CLICK on my MOUSE and see what happens...
INFO4 - Oooh what a weird stepping stone... I want to tread on it and see what it does.
INFO4b - This must open somehow. I'll explore and see what I can find.
INFO5 - I wonder what those massive door crystals are for? I should look around for some more switches.
FMOD
In FMOD I set up and organised the events by groups, made a bus hierarchy based on what I intended to add, and set up the VCAs, also testing them in-engine to make sure they worked as intended.
​
With the new audio made, recorded and procured, I created the necessary events in FMOD to be added to the Unity project.
Some of the audio assets used were from the original project which were previously implemented using Unity's built-in audio system.
​
Some of the audio files were edited into seamless loops and set as such within FMOD.
Unity
I started by adding the UI button sounds: I made three new game objects, each with an FMOD Studio Event Emitter containing one of the three sounds, then in the inspector dragged them into the Event Trigger and On Click boxes so the audio is triggered from each object's emitter.
​
This project already had a great number of scripts that made it easy to add events. Instead of using Audio Sources I used references to the Studio Event Emitter.
​
Some scripts were altered to reference the Studio Event Emitter on other game objects directly and trigger the audio that way.
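The pattern looks roughly like the sketch below. The class and field names are illustrative, not the kit's actual script names; the point is that the altered script holds a serialized reference to another object's emitter instead of an AudioSource:

```csharp
using UnityEngine;
using FMODUnity;

// Illustrative example of the emitter-reference pattern: a script on one
// object triggers the FMOD event assigned to an emitter on another object.
public class DoorOpener : MonoBehaviour
{
    // Assigned in the inspector to the emitter on the door object itself.
    [SerializeField] private StudioEventEmitter doorEmitter;

    public void Open()
    {
        doorEmitter.Play(); // plays whatever event the door's emitter holds
    }
}
```

Keeping the emitter on the sounding object (the door, the chest) means the event stays correctly positioned for 3D spatialisation, while the triggering logic can live anywhere.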