
Binaural hearing allows us to pinpoint sound sources in space using both ears. It relies on subtle differences in timing, level, and phase between our ears to determine a sound's direction and distance. This ability is crucial for navigating our acoustic environment.

Sound localization cues help us determine a sound's position in 3D space. Interaural time and level differences are key for horizontal localization, while spectral cues from our outer ears aid vertical localization. Understanding these cues is essential for creating realistic spatial audio experiences.

Binaural hearing basics

  • Binaural hearing involves the use of both ears to localize sound sources in space
  • Enables humans and animals to determine the direction and distance of sound sources
  • Plays a crucial role in spatial awareness and navigating complex acoustic environments

Differences between ears

  • Sound waves reach the two ears at slightly different times and levels due to the physical separation of the ears
  • These differences provide cues for the brain to determine the location of the sound source
  • Anatomical differences between the left and right ear (ear canal, pinna shape) can also contribute to binaural cues

Interaural time differences

  • Interaural time differences (ITDs) refer to the difference in arrival time of a sound wave at the two ears
  • ITDs are the primary cue for localizing low-frequency sounds (below ~1.5 kHz)
  • The brain processes ITDs to determine the azimuth (horizontal angle) of the sound source
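
The relationship between azimuth and ITD can be sketched with Woodworth's classic spherical-head model, ITD = (r/c)(sin θ + θ). The head radius of 0.0875 m and speed of sound of 343 m/s below are typical assumed values, not measurements from the text:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Estimate the interaural time difference (seconds) with Woodworth's
    spherical-head model: ITD = (r/c) * (sin(theta) + theta)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (math.sin(theta) + theta)

# A source at 90 degrees (directly to one side) gives the maximum ITD,
# roughly 0.66 ms for an average adult head; a frontal source gives 0.
print(round(woodworth_itd(90) * 1000, 3))  # ITD in milliseconds
```

Note that the maximum ITD (well under a millisecond) is far shorter than one period of most audible sounds, which is why ITDs are mainly useful below about 1.5 kHz.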

Interaural level differences

  • Interaural level differences (ILDs) are the differences in sound pressure level between the two ears
  • ILDs are caused by the acoustic shadow cast by the head, which attenuates high-frequency sounds (above ~1.5 kHz)
  • The brain uses ILDs to localize high-frequency sounds in the horizontal plane
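
An ILD can be estimated directly from the per-ear signal levels as a ratio of RMS amplitudes in decibels. This is a minimal sketch with synthetic signals, not a model of the head shadow itself:

```python
import math

def ild_db(left, right):
    """Interaural level difference in dB from per-ear sample sequences,
    computed as 20*log10 of the left-to-right RMS ratio."""
    rms = lambda x: math.sqrt(sum(s * s for s in x) / len(x))
    return 20 * math.log10(rms(left) / rms(right))

# Toy example: a 3 kHz tone whose right-ear copy is at half the
# amplitude of the left, i.e. an ILD of about +6 dB.
left = [math.sin(2 * math.pi * 3000 * n / 48000) for n in range(480)]
right = [0.5 * s for s in left]
print(round(ild_db(left, right), 1))  # ~6.0 dB
```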

Interaural phase differences

  • Interaural phase differences (IPDs) occur when the phase of a sound wave differs between the two ears
  • IPDs are most effective for localizing sounds with wavelengths comparable to the size of the head
  • The brain processes IPDs in conjunction with ITDs to improve localization accuracy
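
The IPD implied by a given ITD is 2πf·ITD, wrapped into (−π, π]. A short sketch shows why IPDs become ambiguous at high frequencies: once the true phase lag exceeds a full cycle, the wrapped value no longer identifies the direction uniquely (the 0.5 ms ITD below is an illustrative value):

```python
import math

def ipd_radians(itd_s, freq_hz):
    """Interaural phase difference implied by an ITD at a given frequency,
    wrapped into (-pi, pi]. Above ~1.5 kHz the wrapping makes it ambiguous."""
    raw = 2 * math.pi * freq_hz * itd_s
    return (raw + math.pi) % (2 * math.pi) - math.pi

itd = 0.0005  # 0.5 ms: a source well off to one side
print(round(ipd_radians(itd, 500), 2))   # low frequency: unambiguous
print(round(ipd_radians(itd, 1800), 2))  # wrapped: looks like the wrong side
```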

Sound localization cues

Localization in horizontal plane

  • Localization in the horizontal plane (azimuth) is primarily based on ITDs and ILDs
  • The combination of these cues allows the brain to determine the left-right position of a sound source
  • The resolution of horizontal localization is best for sources directly in front of the listener (near the midline) and degrades for sources off to the sides

Localization in vertical plane

  • Localization in the vertical plane (elevation) relies on spectral cues provided by the outer ear (pinna)
  • The pinna's complex shape causes frequency-dependent reflections and resonances that vary with the elevation of the sound source
  • The brain learns to associate these spectral patterns with specific elevations
  • Head-related transfer functions (HRTFs) describe how the head, torso, and outer ears alter the frequency and time characteristics of sound waves
  • HRTFs are unique to each individual and depend on the size and shape of their head and ears
  • HRTFs can be measured or simulated to create realistic 3D audio experiences (virtual reality, gaming)

Cone of confusion

  • The cone of confusion refers to a region in space where ITDs and ILDs are ambiguous, leading to localization errors
  • It occurs when a sound source is equidistant from both ears (e.g., directly in front, behind, or above the listener)
  • Resolving front-back confusions often requires head movements to introduce dynamic binaural cues
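
The ambiguity can be seen in a simplified far-field model where ITD = (d/c)·sin θ: because the cue depends only on sin θ, a source in front at 30° and its mirror position behind at 150° produce identical ITDs. The 0.18 m inter-ear distance is an assumed typical value:

```python
import math

def simple_itd(azimuth_deg, ear_distance_m=0.18, c=343.0):
    """Far-field path-length model: ITD = (d/c) * sin(azimuth).
    Since it depends only on sin(azimuth), mirrored front/back
    positions are indistinguishable from the ITD alone."""
    return (ear_distance_m / c) * math.sin(math.radians(azimuth_deg))

front = simple_itd(30)   # 30 degrees, in front of the listener
back = simple_itd(150)   # mirror position behind the listener
print(math.isclose(front, back))  # True: a front-back confusion
```

Turning the head breaks this symmetry: the ITD for the frontal source and the rear source change in opposite directions, which is why head movements resolve front-back confusions.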

Binaural recording techniques

Dummy head recording

  • Dummy head recording involves using a mannequin head with microphones placed in the ear canals
  • The mannequin head simulates the acoustic properties of a human head, capturing binaural cues
  • Recordings made with a dummy head can create a realistic 3D audio experience when played back over headphones

In-ear microphones

  • In-ear microphones are small microphones placed inside the ear canals of a human or mannequin head
  • They capture the sound pressure at the eardrum, including all the binaural cues introduced by the head and outer ear
  • In-ear recordings provide a highly realistic and individualized binaural audio experience

Binaural synthesis

  • Binaural synthesis involves creating binaural audio from mono or stereo recordings using HRTFs
  • The original audio is convolved with HRTFs to simulate the spatial cues that would be present in a natural listening environment
  • Binaural synthesis allows for the creation of immersive 3D audio without the need for specialized recording techniques
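
The convolution step can be sketched in a few lines. The two HRIRs (the time-domain form of an HRTF pair) below are toy stand-ins for measured responses, encoding only a crude ITD and ILD rather than real pinna filtering:

```python
import numpy as np

def binaural_synthesize(mono, hrir_left, hrir_right):
    """Render a mono signal for two ears by convolving it with a
    head-related impulse response (HRIR) pair for the target direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])

# Toy HRIRs: the right ear hears the source 5 samples later (an ITD)
# and at half amplitude, about 6 dB quieter (an ILD).
hrir_l = np.zeros(16); hrir_l[0] = 1.0
hrir_r = np.zeros(16); hrir_r[5] = 0.5
mono = np.random.default_rng(0).standard_normal(1024)
stereo = binaural_synthesize(mono, hrir_l, hrir_r)
print(stereo.shape)  # (2, 1039): two ears, length N + HRIR length - 1
```

In practice the HRIRs would come from a measured or simulated HRTF set, and long signals would be convolved block-wise with FFTs for efficiency.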

Limitations of binaural recording

  • Binaural recordings are most effective when played back over headphones, as loudspeakers can introduce cross-talk between channels
  • Individual differences in HRTFs can lead to variations in the perceived spatial quality of binaural recordings
  • Head movements during playback can disrupt the binaural illusion, as the spatial cues remain fixed relative to the head

Spatial hearing and architecture

Room acoustics impact on localization

  • Room acoustics can significantly influence the ability to localize sound sources in space
  • Reflections from walls, ceiling, and floor can interfere with direct sound, affecting binaural cues
  • The reverberation time and early reflection pattern of a room can enhance or degrade localization accuracy

Reverberation effects on localization

  • Reverberation can make it more difficult to localize sound sources, especially in highly reverberant spaces (churches, concert halls)
  • The direction and timing of early reflections can provide additional cues for localization
  • Excessive reverberation can mask binaural cues and lead to a diffuse, enveloping sound field

Precedence effect in rooms

  • The precedence effect (Haas effect) refers to the dominance of the first-arriving sound in determining localization
  • In rooms, the direct sound from a source is followed by early reflections and reverberation
  • The brain gives more weight to the localization cues provided by the direct sound and early reflections, suppressing the effect of later reflections
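
A minimal simulation of the direct-plus-reflection situation: mixing in a delayed, attenuated copy of a click, with the 15 ms delay chosen to fall inside the roughly 1-40 ms window over which the precedence effect is usually described (the specific delay and gain are illustrative assumptions):

```python
import numpy as np

def add_reflection(direct, delay_samples, gain):
    """Mix a delayed, attenuated copy of the direct sound into it, as an
    early wall reflection would. Under the precedence effect, listeners
    localize to the first arrival despite the later copy."""
    out = np.copy(direct)
    out[delay_samples:] += gain * direct[:len(direct) - delay_samples]
    return out

fs = 48000
direct = np.zeros(fs // 10); direct[0] = 1.0   # a single click
delay = int(0.015 * fs)                        # 15 ms early reflection
wet = add_reflection(direct, delay, 0.7)
print(np.flatnonzero(wet)[:2])  # first arrival at sample 0, echo at 720
```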

Designing spaces for optimal localization

  • Architectural design can be optimized to enhance sound localization and spatial awareness
  • Controlling the reverberation time and early reflection pattern can improve localization accuracy
  • The use of sound-absorbing materials and diffusers can help reduce the negative effects of excessive reverberation on localization
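
A standard first-pass tool for controlling reverberation time is Sabine's formula, RT60 = 0.161·V/A, where V is the room volume in m³ and A is the total absorption in sabins. The room dimensions and absorption coefficients below are hypothetical illustrative values:

```python
def sabine_rt60(volume_m3, absorption_areas):
    """Sabine's formula: RT60 = 0.161 * V / A, where A is the total
    absorption (sum of surface area * absorption coefficient)."""
    total_absorption = sum(area * coeff for area, coeff in absorption_areas)
    return 0.161 * volume_m3 / total_absorption

# Hypothetical 10 x 8 x 3 m room with hard walls and a carpeted floor.
surfaces = [
    (2 * (10 * 3) + 2 * (8 * 3), 0.05),  # walls: lightly absorptive
    (10 * 8, 0.05),                      # ceiling
    (10 * 8, 0.30),                      # carpeted floor
]
print(round(sabine_rt60(10 * 8 * 3, surfaces), 2))  # RT60 in seconds
```

Adding more absorptive area (raising A) shortens the reverberation time, which is one lever a designer can use to keep late reflections from masking binaural cues.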

Binaural technology applications

Virtual reality audio

  • Binaural audio is a key component of immersive virtual reality experiences
  • Head-tracked binaural rendering allows for dynamic, real-time updating of spatial cues based on the user's head movements
  • Binaural audio enhances the sense of presence and realism in virtual environments (gaming, simulations, virtual concerts)
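
The core of head-tracked rendering is recomputing the source direction relative to the head each frame, so the rendered cues stay fixed in the world as the listener turns. A minimal sketch of that coordinate update (function name and wrap-to-±180° convention are my own):

```python
def relative_azimuth(source_deg, head_yaw_deg):
    """Source azimuth relative to the listener's current head orientation,
    wrapped into (-180, 180]. Feeding this into HRTF selection keeps the
    virtual source world-stable as the head turns."""
    return (source_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# A source fixed at 30 degrees: as the listener turns toward it, the
# rendered azimuth goes to 0 and the ITD/ILD cues update accordingly.
print(relative_azimuth(30.0, 0.0))   # 30.0
print(relative_azimuth(30.0, 30.0))  # 0.0
```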

Gaming and immersive audio

  • Binaural audio is increasingly used in gaming to create more realistic and engaging sound experiences
  • Game engines can simulate the acoustic properties of virtual environments, providing dynamic binaural cues based on the player's actions
  • Immersive audio in gaming can improve situational awareness, spatial orientation, and overall gameplay experience

Telepresence and remote collaboration

  • Binaural audio can enhance telepresence and remote collaboration by providing a sense of spatial presence
  • Capturing and reproducing binaural cues can create the illusion of being in the same physical space as remote participants
  • Binaural audio can improve communication and understanding in virtual meetings, conferences, and remote training sessions

Assistive listening devices

  • Binaural hearing aids and assistive listening devices can help individuals with hearing impairments better localize sounds
  • These devices can preserve and enhance binaural cues, improving spatial awareness and speech understanding in noisy environments
  • Binaural noise reduction algorithms can selectively attenuate background noise while preserving the spatial cues of the desired signal

Psychoacoustics of spatial hearing

Minimum audible angle

  • The minimum audible angle (MAA) is the smallest angular separation between two sound sources that can be reliably discriminated
  • MAA varies with the frequency of the sound and the location of the sources relative to the listener
  • The human auditory system is most sensitive to changes in the horizontal plane, with MAAs as small as 1-2 degrees for sources near the midline
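
It is worth working out what a 1-2 degree MAA demands of the auditory system. Using the same far-field model ITD = (d/c)·sin θ (with an assumed 0.18 m inter-ear distance), a 1° shift near the midline changes the ITD by only about 9 microseconds:

```python
import math

def itd_change_for_maa(maa_deg, azimuth_deg=0.0, ear_distance_m=0.18, c=343.0):
    """ITD difference produced by moving a source through the minimum
    audible angle, using the far-field model ITD = (d/c) * sin(azimuth)."""
    itd = lambda a: (ear_distance_m / c) * math.sin(math.radians(a))
    return itd(azimuth_deg + maa_deg) - itd(azimuth_deg)

# A 1-degree MAA at the midline corresponds to an ITD change of only
# about 9 microseconds, near the limit of binaural timing sensitivity.
print(round(itd_change_for_maa(1.0) * 1e6, 1))  # microseconds
```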

Localization blur

  • Localization blur refers to the inherent uncertainty in the perceived location of a sound source
  • It is influenced by factors such as the frequency content of the sound, the presence of background noise, and the listener's familiarity with the sound
  • Localization blur is typically larger for high-frequency sounds and in the vertical plane compared to the horizontal plane

Localization in noise

  • The presence of background noise can degrade the ability to localize sound sources
  • Noise can mask binaural cues, particularly ITDs and ILDs, making it more difficult to determine the direction of a sound
  • The effect of noise on localization depends on the signal-to-noise ratio, the spectral characteristics of the noise, and the listener's age and hearing status

Localization for hearing impaired

  • Hearing impairments can significantly affect the ability to localize sounds in space
  • Individuals with hearing loss may have reduced sensitivity to binaural cues, particularly ITDs and ILDs
  • Asymmetric hearing loss can lead to an imbalance in binaural cues, causing localization errors and difficulty understanding speech in noisy environments

Binaural hearing disorders

Unilateral hearing loss

  • Unilateral hearing loss (UHL) refers to hearing impairment in one ear, while the other ear has normal hearing
  • UHL can cause difficulties in sound localization, as binaural cues are disrupted
  • Individuals with UHL may struggle to understand speech in noisy environments and have reduced spatial awareness

Central auditory processing disorder

  • Central auditory processing disorder (CAPD) is a condition where the brain has difficulty processing auditory information, despite normal hearing sensitivity
  • CAPD can affect the ability to localize sounds, as the brain may not effectively integrate binaural cues
  • Individuals with CAPD may have trouble understanding speech in noisy environments and following complex auditory instructions

Auditory neglect and extinction

  • Auditory neglect is a condition where an individual with brain damage (often due to a stroke) fails to respond to sounds on the side opposite the brain lesion
  • Auditory extinction occurs when an individual can detect sounds on either side alone but fails to respond to sounds on one side when presented with stimuli on both sides simultaneously
  • These conditions can severely impact sound localization and spatial awareness

Evaluation and treatment approaches

  • Evaluation of binaural hearing disorders involves a combination of audiological tests, spatial hearing assessments, and neuropsychological evaluations
  • Treatment approaches may include the use of hearing aids, assistive listening devices, and auditory training programs
  • Auditory training can help individuals with binaural hearing disorders better utilize the available cues for sound localization and speech understanding in challenging acoustic environments
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.