Frans de Jong (EBU)
There are five ways to improve current HDTV: add more pixels (UHD, see glossary below), more luminance levels (HDR), more colour (WCG), more frames per second (HFR) or better audio (NGA). To test what this means in practice, the EBU organized a live trial production during the inaugural edition of the European Championships. The goal was to test ‘Beyond HD’ technology under real-world conditions.
Outside the Olympic Stadium in Berlin five containers were transformed into an audio control room, a video control room and a technical area with seven racks of the latest production equipment. About 40 representatives from five Members and 20 industry partners worked together to create content in 2160p/100 HDR HLG, 1080p/100 HDR HLG and 1080p/50 SDR. Audio was produced in AC-4 (Dolby Atmos) and several flavours of MPEG-H.
The signals were sent over fibre to the Broadcast Operations Centre in Glasgow and via satellite to the Rai 5G test bed in Italy. The material was also recorded locally, both uncompressed and in XAVC Class 480 (~1,600 Mbps!).
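To put the quoted ~1,600 Mbps XAVC recording rate in perspective, a quick back-of-the-envelope calculation (the figures below are approximations, not from the article) shows what it implies for storage:

```python
# Rough storage estimate for a recording at a given bitrate.
# 1,600 Mbps is the approximate XAVC Class 480 rate quoted above.

def storage_per_hour_gb(bitrate_mbps: float) -> float:
    """Gigabytes of storage needed per hour of recording."""
    bytes_per_second = bitrate_mbps * 1e6 / 8
    return bytes_per_second * 3600 / 1e9

print(round(storage_per_hour_gb(1600)))  # roughly 720 GB per hour
```

In other words, every hour of the compressed 2160p/100 recording alone consumed on the order of 720 GB, before counting the uncompressed captures.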
HFR not (yet) practical
Athletics was chosen because it provides fast-moving action, making it ideal for testing HFR. To capture at 100 Hz, four Sony HDC-4300 cameras were positioned in the stadium and connected via SMPTE 311M cable. In the technical area the video arrived as two separate 2160p/50 signals, each carrying either the ‘odd’ or the ‘even’ frames of the full 100 Hz signal. Using 3G-SDI interfaces, this meant eight cables per source.
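The cable count follows directly from the interface arithmetic described above; a small illustrative tally (assuming the common quad-link 3G-SDI carriage of a 2160p/50 signal):

```python
# Tally of the 3G-SDI cabling described above: each 2160p/50 phase is
# carried as a quad-link of four 3G-SDI signals, and the 100 Hz feed is
# split into two such phases ('odd' and 'even' frames).

LINKS_PER_2160P50 = 4   # quad-link 3G-SDI for one 2160p/50 signal
PHASES_PER_100HZ = 2    # 'odd' and 'even' frame phases

cables_per_source = LINKS_PER_2160P50 * PHASES_PER_100HZ
print(cables_per_source)  # 8 cables per camera source
```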
The trial proved it is possible to achieve high-quality images this way, but also that it is not (yet) practical. Besides the heavy SDI interfacing, there is no phase signalling and no p/100 timecode, and the p/100 signals cannot currently be monitored properly, as there are no p/100 reference monitors yet. The only 100 Hz capable displays available are consumer devices, and even those are very rare. In Berlin two 55” LG televisions with prototype software were used to receive the signals via a DVB input. Production monitoring was done using one phase only, on Sony BVM-X300 reference monitors.
HDR: easier for the shader
The event in Berlin was ideal for testing HDR. The combination of bright sunlight and deep shadows, daytime and night-time events, and the use of artificial lighting in the stadium provided a wide variety of challenging conditions. The clear overall impression is that HDR can add quality to the picture, especially in scenes where bright sunlight or artificial light comes into play. The shader quickly learned how to use the wider luminance range. He was impressed by the natural look of the images and by the fact that fewer corrections were needed.
The shader Kevin Joubay (France Télévisions) quickly learned to use the wider luminance range.
One important aspect for broadcasters is being able to produce HDR and SDR in parallel with a single team. The typical approach today is to focus on SDR and let the HDR follow automatically. This makes sense, as the SDR output is of course still the most important, and shaders are familiar with working in SDR. It is also relatively easy to let the HDR follow the SDR. Doing the reverse is more complex: because the latitude of SDR is more limited, the ‘down-converted’ SDR may turn out to be compromised in quality.
In the Berlin trial the focus was on HDR, because there was no strong SDR constraint and because the team wanted to exploit the full HDR capabilities. Overall the HDR production worked well, although the set-up of the converters turned out to be complex and needed careful attention.
NGA: complex to monitor
The audio production on site was challenging, not only because it involved producing three different audio signals in a single control room, but also because each signal itself consisted of multiple elements: an immersive audio bed and four objects whose volume and position the end user can change.
The main mix was created using a variety of mics, including a 3D ORTF array, an Eigenmike and many spot mic signals that were provided by the host broadcaster. The objects consisted of commentary and audio description, each in two languages (French and English). The mixing team used the available height dimension for creative effect; for example, by mixing in mics that were placed near the bar of the high jump, so viewers could hear it being touched above their heads.
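The programme structure described above (one immersive bed plus four user-adjustable objects) can be sketched as a simple data model. This is an illustrative assumption only, not a real NGA or ADM API; all names and fields are hypothetical:

```python
# Minimal sketch of the Berlin programme's audio structure: an immersive
# bed plus four objects the end user can re-balance and re-position.
from dataclasses import dataclass

@dataclass
class AudioObject:
    name: str
    gain_db: float          # the end user can change the volume...
    azimuth_deg: float      # ...and the position of each object
    elevation_deg: float

bed = "3D immersive bed (ORTF array, Eigenmike, host-broadcaster spot mics)"
objects = [
    AudioObject("Commentary FR", 0.0, 0.0, 0.0),
    AudioObject("Commentary EN", 0.0, 0.0, 0.0),
    AudioObject("Audio description FR", 0.0, 0.0, 0.0),
    AudioObject("Audio description EN", 0.0, 0.0, 0.0),
]
print(len(objects))  # 4 personalizable objects alongside the bed
```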
The team gathered several ideas for improved visual metering to help future immersive audio production. Another desired improvement is a serialized form of the metadata (S-ADM) suitable for streaming and live production.
The Berlin exercise has provided a wealth of experience for both the participating EBU Members and the industry partners. A selection of 2160p/100 HLG test sequences is available from the EBU for technical testing. Some of those tests are already planned for early next year, in particular to subjectively evaluate the quality improvements HFR can provide. This is important, as broadcasters have several options to improve their current HD offerings. They can choose which of the new features (UHD, HDR-WCG, HFR, NGA) they want to introduce. There is no need to add all at once.
So instead of jumping to 2160p/100, a broadcaster may for example decide to start distributing in 1080p/50 HDR, as this saves bandwidth costs. When produced using UHD cameras, the image quality can be very high.
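The bandwidth argument for starting with 1080p/50 HDR can be made concrete with a raw pixel-rate comparison (an illustrative calculation; compressed bitrates will differ, but the ratio shows the scale of the saving):

```python
# Raw pixel-rate comparison: 2160p/100 versus 1080p/50.

def pixel_rate(width: int, height: int, fps: int) -> int:
    """Uncompressed pixels transported per second."""
    return width * height * fps

ratio = pixel_rate(3840, 2160, 100) / pixel_rate(1920, 1080, 50)
print(ratio)  # 8.0 -- eight times the raw pixel rate
```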
In terms of the production workflow, the EBU is planning to provide guidelines for its Members to help achieve high-quality HDR results, for example concerning equipment set-up and conversions. On the audio side, the EBU is promoting the use of ADM and ADM-based tools to achieve an open, codec-independent workflow for NGA production. In this context, EBU Tech 3392 provides a constrained production subset of the ADM to simplify implementations and prevent interoperability problems.
The EAC 2018 project will be presented at the EBU Production Technology Seminar 2019, and 2160p/100 HLG (uncompressed and XAVC) test sequences recorded at the event can now be ordered by Members and non-members.
This article was first published in issue 38 of tech-i magazine.
Glossary of abbreviations
UHD: Ultra High Definition. Collective name for the television generation beyond HDTV. Currently typically means ‘4K’ television resolution (3840 x 2160 pixels).
HDR: High Dynamic Range. Enables images to be shown with more ‘luminance steps’, including more detail in the dark areas and in the highlights.
HLG: Hybrid Log Gamma. One of the two HDR standards; well suited to live television.
PQ: Perceptual Quantizer. The other HDR technique; uses metadata and assumes a controlled home viewing environment, as in cinemas.
HFR: High Frame Rate. More images per second, allowing for smoother motion. For television in Europe, HFR currently means 100 fps.
WCG: Wide Colour Gamut. Extends the available range of colours with more saturated colours (“redder reds”, “greener greens”, etc.). Deployed together with HDR.
NGA: Next Generation Audio. A new way of broadcasting audio to provide users with an immersive and personalized sound experience.
ADM: Audio Definition Model. Defined in ITU-R BS.2076 as a set of metadata and parameters for next generation audio technology.
(Photos: Claude Pfeifer, Sony)