Be it sports, TV series, movies, 360-degree videos or VR, audiences have become accustomed to high-quality video content, irrespective of the device used. But when it comes to audio content, broadcasters and content producers are only beginning to exploit new opportunities to provide a flexible and equally immersive experience with object-based audio.
An upcoming Object-Based Audio Seminar (17-18 May 2017 at EBU headquarters in Geneva) will deliver the cutting-edge intelligence required to better understand this nascent technology.
Everything is possible
Object-based audio promises to address a multitude of use cases. Not only can object-based content be rendered on the client side for any playback scenario, from mono to multi-speaker configurations to binaural headphone playback; it also increases the scope for personalization and could help deliver subtitling, sign language and even abridged audio playback.
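To make the idea more concrete, the sketch below is a minimal, purely illustrative Python example of how a client-side renderer might treat audio objects: each object is a mono signal plus position metadata, and the same objects can be rendered to mono or stereo (or, in real systems, to binaural or multi-speaker layouts). The function and parameter names are hypothetical and the code does not follow any specific object-based audio standard.

```python
import numpy as np

# Illustrative only: each audio "object" is a mono signal plus metadata
# (here just an azimuth angle). The receiver renders the mix for whatever
# playback layout it actually has.

def render(objects, layout):
    """Mix a list of (signal, azimuth_deg) objects for a given layout."""
    length = max(len(sig) for sig, _ in objects)
    if layout == "mono":
        out = np.zeros(length)
        for sig, _ in objects:
            out[:len(sig)] += sig            # positions are ignored for mono
        return out
    if layout == "stereo":
        out = np.zeros((length, 2))
        for sig, az in objects:
            # Constant-power pan: map azimuth (-90..+90 degrees) to left/right gains.
            pan = np.clip(az, -90, 90) / 90.0          # -1 = hard left, +1 = hard right
            theta = (pan + 1) * np.pi / 4              # 0 .. pi/2
            out[:len(sig), 0] += np.cos(theta) * sig   # left channel
            out[:len(sig), 1] += np.sin(theta) * sig   # right channel
        return out
    raise ValueError(f"unknown layout: {layout}")

# Two objects: dialogue in the centre, ambience off to the right.
fs = 48000
t = np.linspace(0, 1, fs)
dialogue = (0.5 * np.sin(2 * np.pi * 220 * t), 0)    # azimuth 0 degrees
ambience = (0.2 * np.sin(2 * np.pi * 440 * t), 60)   # azimuth +60 degrees

mono_mix = render([dialogue, ambience], "mono")
stereo_mix = render([dialogue, ambience], "stereo")

# Personalization is just a change to the object list before rendering,
# e.g. boosting dialogue for intelligibility:
louder_dialogue_mix = render([(dialogue[0] * 2.0, dialogue[1]), ambience], "stereo")
```

The key point the sketch tries to capture is that the objects and their metadata travel to the client, so decisions such as speaker layout or dialogue level can be made at playback time rather than fixed in a single broadcast mix.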
Broadcasters could derive and deliver huge benefits from authoring object-based audio content, with regard to multiplatform accessibility, personalization and access services. But they are also opening a Pandora’s box of workflow-related questions. Much in this domain remains poorly understood, as the many use cases and business cases this technology enables have yet to be quantified.
One-stop OBA shop
At the EBU’s OBA Seminar, a stellar cast of audio and multimedia experts, from producers and sound engineers to developers and researchers, will present and debate the knowns and unknowns of this exciting technology.
A string of demos will run alongside the paper sessions, and the refreshment, catering and demo breaks will offer ample opportunity to mingle and network with fellow audio and multimedia professionals.
Download the programme and register now!