‘Game of Thrones’ Exposes HDR, Video Compression Issues

[ALERT: SPOILERS AHEAD!]

“The night is dark and full of terrors” was never truer than in the most recent Game of Thrones installment (Season 8, Episode 3), “The Long Night” (aka “The Battle of Winterfell”). While viewers rapturously recapped and reviewed the episode’s heroes and villains, action scenes, thrilling climax and consequences, one complaint was constant: because the battle action took place during Westeros’ pre-dawn hours, with an added magical obscuring blizzard, it was often difficult, if not impossible, to tell what was going on.

CNN was among multiple outlets detailing the episode’s visual murkiness. Several late-night comics poked fun at “The Long Night” being dark and full of incomprehension. In her recap in the Washington Post, Alyssa Rosenberg encapsulated many viewer complaints:

The Battle of Winterfell, much-ballyhooed as the biggest and longest battle sequence ever, was for the vast majority of its run largely incomprehensible, thanks to the episode’s incredibly muddy lighting; smeary images that made it appear that my television was malfunctioning during certain crucial moments.*

*A number of other people mentioned this to me on Twitter, so I’m pretty sure it’s not my flat screen.

With all due respect, however, and not knowing what type of flat screen TV Ms. Rosenberg owns, the fault lies not only with her viewing device but with the production and delivery technologies as well, and together they caused the widespread video quality quibbles.

The good news is, the brouhaha over the visual clarity of “The Long Night” (the most-watched episode of the highest-rated series ever on the premier pay cable channel) could be a tipping point for producers, content providers and consumers to address the shortcomings of the varying display, encoding and delivery technologies currently deployed. Broadcast and streaming content alike are becoming ever more visually ambitious, leaving content creators with an untenable choice: lower their artistic visions to appease the lowest-common-denominator distribution and display technologies, or aim high, push the envelope on production value and artistic storytelling, and push providers/distributors, hardware makers and consumers toward the highest-quality reproduction and viewing choices.

In practice, of course, this is no choice at all. Content creators properly choose the technologies best suited to their creative vision and production purposes, rightly paying no mind to the downstream ramifications. It is the responsibility of those who control the downstream to present this creative vision to the audience in the highest-quality form possible.

Mastering, HDR, Compression Issues Exposed

“The Long Night” hit the digital video pitfall trifecta: darkness, smoke/fog and fire. All are traditionally difficult to shoot, master, encode and decode accurately. Anyone with an older, low bit-rate 2K LCD TV, even one equipped with full array/local dimming (FALD), is sure to be exposed to all manner of digital artifacts, pixelation, color banding and blocking. And anyone streaming the episode via HBO Go or HBO Now, through thinner and more fragile online pipes producing wildly variable bit rates, is likely even worse off experientially.
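
To see why darkness is so punishing, consider how quantization behaves in the shadows: a dim, smooth gradient occupies only a thin slice of the signal range, so very few distinct code values remain to describe it, and smooth ramps turn into visible steps (banding). Here is a minimal, purely illustrative Python sketch (not anyone’s actual pipeline) counting the code values available to a moonlit ramp at 8 and 10 bits:

```python
import numpy as np

# A smooth luminance ramp confined to the darkest 5% of the signal
# range, typical of a moonlit, smoke-filled frame.
dark_ramp = np.linspace(0.0, 0.05, 1920)

# Quantize the ramp to 8-bit and 10-bit integer code values and
# count how many distinct steps survive.
levels_8bit = np.unique(np.round(dark_ramp * 255)).size
levels_10bit = np.unique(np.round(dark_ramp * 1023)).size

print(f"Distinct 8-bit code values across the ramp:  {levels_8bit}")   # 14
print(f"Distinct 10-bit code values across the ramp: {levels_10bit}")  # 52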

One big downstream issue is HBO’s surprising failure to implement High Dynamic Range (HDR) technology or HEVC compression, along with what appeared to be an absence of scene-adaptive encoding to compensate for the constantly shifting bluish moonlight and reddish flame-lit color and lighting palettes. Applying any or all of these technologies, especially HDR, would have significantly improved the clarity of the fire-lit trench battles, the dim terror in the crypt, the dragon aerial dogfights, the climactic fire-and-ice confrontation at the Godswood, and other challenging mise-en-scène, not only in GoT 8:3 but in any TV program with demanding visuals.
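
The HDR point in particular is quantifiable. The SMPTE ST 2084 (PQ) transfer function used by HDR10 deliberately concentrates a large share of its 10-bit code values in the shadows, exactly where “The Long Night” lived. A short illustrative sketch of the published PQ formula, compared against an assumed 8-bit SDR display (simple gamma 2.4, 100-nit peak), shows the gap:

```python
import numpy as np

# PQ constants from SMPTE ST 2084.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code):
    """Map normalized PQ code values [0, 1] to absolute luminance in nits."""
    e = code ** (1 / m2)
    y = np.maximum(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

pq_codes = np.arange(1024) / 1023.0   # every 10-bit code value
pq_nits = pq_eotf(pq_codes)

# Assumed SDR comparison: 8-bit, gamma-2.4 display with a 100-nit peak.
sdr_codes = np.arange(256) / 255.0
sdr_nits = 100.0 * sdr_codes ** 2.4

# Count code values devoted to deep shadows (below 1 nit).
print(f"10-bit PQ codes below 1 nit: {(pq_nits < 1.0).sum()}")   # ~154
print(f"8-bit SDR codes below 1 nit: {(sdr_nits < 1.0).sum()}")  # ~38
```

Roughly four times as many code values below 1 nit means shadow gradients that band in SDR can stay smooth in HDR.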

Fortunately, we have learned that HBO is preparing to implement HDR; when this will happen is unknown, though certainly not in time to benefit the final three GoT installments.

Establishing Baseline Standards

There is an ironic path forward for TV content creators and content providers to make sure their vision is accurately rendered and fully enjoyed by the consumer, regardless of reception and display technologies: create and distribute programs as if they will be seen on an 8K TV.

Not that resolution itself is the issue here; it is widely accepted that 1080p HDR presents a higher-quality, and more compact, result than 4K SDR.
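
The “more compact” half of that claim survives a back-of-the-envelope check on raw frame sizes. The sketch below assumes 4:2:0 chroma subsampling; compressed bit rates will of course differ, but the raw-data gap is instructive:

```python
def raw_bits_per_frame(width, height, bit_depth):
    """Uncompressed bits per frame, assuming 4:2:0 chroma subsampling."""
    luma = width * height                      # one luma sample per pixel
    chroma = 2 * (width // 2) * (height // 2)  # two quarter-resolution planes
    return (luma + chroma) * bit_depth

hd_hdr = raw_bits_per_frame(1920, 1080, 10)   # 1080p, 10-bit (HDR)
uhd_sdr = raw_bits_per_frame(3840, 2160, 8)   # 4K, 8-bit (SDR)

print(f"1080p 10-bit frame: {hd_hdr / 8e6:.1f} MB")    # ~3.9 MB
print(f"4K 8-bit frame:     {uhd_sdr / 8e6:.1f} MB")   # ~12.4 MB
print(f"Raw-data ratio:     {uhd_sdr / hd_hdr:.1f}x")  # 3.2x
```

Even after granting the extra two bits per sample, the 1080p HDR frame carries less than a third of the raw data of the 4K SDR frame.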

Although little 8K content exists, and few 8K TVs are available to play it back, 8K sets present the largest collection of cutting-edge video reproduction technologies available. If everyone along the content production and delivery chain aspired to meet the high-quality demands of an 8K program (wide color gamut, scene-adaptive encoding, high bit-rate HDR, high bit-rate compression, et al.), the lack of proper video rendering would never become more controversial than the actual program content, and the night could be full of terror and not incomprehension.