£100,000 grant puts all-new 3D sound ‘Listening Room’ to the test
Fri, 25 Nov 2016 12:05:00 GMT
One of the main outcomes is a technique that upmixes recordings made for 2D surround sound into 3D, giving the listener both a horizontal and a vertical listening experience
At the University of Huddersfield, music technology lecturer Dr Hyunkook Lee has built the ultimate listening room.
Designed for 3D sound research, it supports all of the currently available 3D loudspeaker formats from Auro-3D 9.1 to NHK 22.2 and Dolby Atmos. It is also playing a vital role in research that will help accelerate the spread of 3D sound systems in the home, the cinema and in the burgeoning field of virtual reality.
Dr Lee, who leads the University’s Applied Psychoacoustics Lab (APL) within the School of Computing and Engineering, has now completed a two-year research project, “Perceptual rendering of vertical image width for 3D multichannel audio”, which received £100,000 of funding from the Engineering and Physical Sciences Research Council (EPSRC). Central to the project was the construction of a special listening room that complies with International Telecommunication Union recommendations on room acoustics. Dr Lee is now building on the research.
One of the main outcomes of the project is a technique that he has termed Perceptual Band Allocation (PBA), which upmixes recordings made for 2D surround sound into 3D, giving the listener both a horizontal and a vertical listening experience. The technique is based on Dr Lee’s research into how the human brain perceives the vertical location of different sound frequencies.
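The core idea, routing different frequency bands to different loudspeaker layers, can be caricatured in a few lines of Python. The sketch below is a hypothetical single-crossover illustration, not Dr Lee’s published method: PBA proper uses several perceptually derived bands, and the 1 kHz split here is an arbitrary assumption.

```python
import numpy as np

def band_allocate(channel, fs, crossover_hz=1000.0):
    """Split one 2D surround channel into a horizontal-layer band and a
    height-layer band. Illustrative only: real PBA allocates multiple
    perceptually derived bands, not one arbitrary crossover."""
    spec = np.fft.rfft(channel)
    freqs = np.fft.rfftfreq(len(channel), 1.0 / fs)
    low = spec * (freqs < crossover_hz)    # stays in the horizontal layer
    high = spec * (freqs >= crossover_hz)  # routed to a height speaker
    return np.fft.irfft(low, len(channel)), np.fft.irfft(high, len(channel))

fs = 48_000
t = np.arange(fs) / fs
# Test tone: 200 Hz (should stay in the horizontal layer) plus
# 4 kHz (should be routed upward).
x = np.sin(2 * np.pi * 200 * t) + np.sin(2 * np.pi * 4000 * t)
main_layer, height_layer = band_allocate(x, fs)
```

Because the two bands are complementary, summing the layers reconstructs the original channel exactly: the upmix redistributes energy between layers rather than altering the mix.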
For this research, Dr Lee developed a new system in which listeners were given a rotation knob linked to a vertical strip of LED bulbs, which they used to indicate the perceived height of frequencies. This was an improvement on previous techniques, in which people used a numbered scale or a laser pointer, because it is more intuitive and introduces less visual bias, said Dr Lee.
The knowledge gained from the research will also help determine the optimal placement of loudspeakers to enhance the 3D experience, in which the listener hears sound from above as well as all around.
During the project, he also investigated the effect of vertical microphone spacing on spatial impression and, based on the findings, proposed a new design concept for a 3D microphone array. The concept was adopted by Schoeps in its new 3D microphone array, “ORTF 3D”.
Virtual elevation perception
Another current strand of his research is an exploration of “virtual elevation perception”, an intriguing psychoacoustic phenomenon whereby 3D sound can be simulated when certain frequency bands are reproduced from two loudspeakers placed side by side.
“This phenomenon was discovered in the 1940s, but received little attention until I revisited it,” said Dr Lee. “I carried out an in-depth study of the phenomenon and established a new theory of why it occurs. The virtual elevation effect has many practical implications for 3D sound applications.
“For example, using the effect we can reproduce overhead sound without having height-channel speakers. It is even possible to create this effect over headphones using binaural technology,” explained Dr Lee, who has combined his Perceptual Band Allocation with a novel method called virtual elevation panning to develop an upmixing technique with enormous potential.
“A lot of people don’t have height-channel speakers at home, so if you want to provide the same impression of vertical, overhead sound, this method could be very useful,” he said.
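One well-documented ingredient of elevation perception is that certain frequency bands bias perceived height (in Blauert’s “directional bands”, a band around 8 kHz is typically heard as coming from above). The fragment below is only a hedged caricature of that general idea, not Dr Lee’s virtual elevation panning method; the band limits and gain are assumptions chosen purely for illustration.

```python
import numpy as np

def elevation_band_boost(channel, fs, band=(7000.0, 9000.0), gain_db=6.0):
    """Boost a band loosely associated with 'above' localisation (cf.
    Blauert's directional bands). A crude illustrative stand-in, not the
    perceptually derived processing described in the article."""
    spec = np.fft.rfft(channel)
    freqs = np.fft.rfftfreq(len(channel), 1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    spec[mask] *= 10.0 ** (gain_db / 20.0)  # +6 dB inside the band
    return np.fft.irfft(spec, len(channel))

fs = 48_000
t = np.arange(fs) / fs
in_band = np.sin(2 * np.pi * 8000 * t)   # falls inside the boosted band
out_band = np.sin(2 * np.pi * 500 * t)   # untouched by the boost
boosted_in = elevation_band_boost(in_band, fs)
boosted_out = elevation_band_boost(out_band, fs)
```

A tone inside the band gains about 6 dB while content outside it passes through unchanged, so the processing only nudges the spectral cue, leaving the rest of the mix intact.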
His research is also taking him into the increasingly important field of virtual reality, where the realism of the sound must match the visual imagery experienced via head-mounted displays.
“Again, the first important thing is to understand how we perceive sound in various auditory environments through fundamental research. Then the research findings can be exploited to develop new techniques to convincingly capture and render different types of sound field,” said Dr Lee.
- Dr Hyunkook Lee is a Senior Lecturer in Music Technology and Production and the leader of the Applied Psychoacoustics Lab (APL) at the University of Huddersfield’s School of Computing and Engineering. He is also an experienced freelance sound engineer working in a wide range of musical genres and has undertaken consultancy work for clients such as Samsung Electronics and Manchester’s Bridgewater Hall. Before joining Huddersfield, he was a Senior Research Engineer at LG Electronics in South Korea.