Connected Magazine


HDMI 2.2 doubles down

By David Meyer
10/06/2025

CES saw the introduction of a new HDMI specification. David Meyer looks at how it works, why it was created and what it means for integrators.

Every year, the Consumer Electronics Show (CES) ushers in the new year by showcasing tech innovations and announcements from around the world to set up the year ahead.

This year, among the cacophony of the show and the latest in AI, transparent displays and electric vehicles (EVs) – to name but a few headline grabbers – was the announcement of HDMI 2.2. This appears to have implications in the CEDIA channel… or does it?

Key upgrades in the impending new HDMI specification include a 96Gbps uncompressed data rate and a new, improved lip-sync mechanism called “LIP”. So, does this mean you need to start upgrading AV components and cables again? Short answer: no.

The serial pragmatist in me can’t see the extra bandwidth requiring any action by integrators for the time being. Furthermore, the lip-sync feature could well be supported through a firmware update, or let’s hope so. But a much bigger pipe could, in time, unleash some incredible benefits to video, but maybe not in the ways you might think. Bear with me as I explain…

Why 96Gbps specifically?

There’s a recurring theme of successive HDMI specifications doubling the maximum usable bandwidth. Way back when I became a signatory to the HDMI adopter agreement in 2005, HDMI version 1.2 was current, with a maximum nominal data rate of 4.95Gbps. However, it’s important to note that the maximum usable rate was actually just under 4.5Gbps: 1.5Gbps on each of the three data lanes. Then these releases followed:

  • HDMI 1.3 (2006) doubled the data rate to 3Gbps/lane (9Gbps usable total), marketed as 10.2Gbps “High Speed”, though the 1.2Gbps headroom was superfluous and unusable. It also introduced first-gen auto lip-sync, which I’ll come back to later.
  • HDMI 1.4 (2009) added features such as ARC and 4K support but left the maximum AV bandwidth unchanged
  • HDMI 2.0 (2013) doubled it again to 6Gbps per lane (18Gbps total), connected via Premium High-Speed HDMI Cable
  • HDMI 2.1 (2017) doubled it yet again to 12Gbps per lane but added another lane to make 48Gbps total with Ultra High-Speed HDMI Cable
  • And now HDMI 2.2 doubles it again to 24Gbps per lane (96Gbps total) together with a new Ultra96 HDMI Cable

I won’t get into how they achieve this huge new data haul; suffice it to say they’ve beefed up the Fixed Rate Link (FRL) encoding and equalisation technology. Furthermore, the option for Display Stream Compression (DSC) is carried over from HDMI 2.1. With up to a 3.5:1 compression ratio, this means HDMI 2.2 will in effect be able to handle formats equivalent to more than 300Gbps. Yikes!
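A quick back-of-the-envelope check of those figures (a sketch in Python, using only the numbers stated above):

```python
# HDMI 2.2 ceiling: 4 data lanes at 24Gbps each, plus optional DSC.
LANES = 4
GBPS_PER_LANE = 24
DSC_MAX_RATIO = 3.5  # Display Stream Compression, carried over from HDMI 2.1

uncompressed = LANES * GBPS_PER_LANE       # 96Gbps on the wire
effective = uncompressed * DSC_MAX_RATIO   # formats equivalent to 336Gbps

print(f"Uncompressed: {uncompressed}Gbps, with DSC: up to {effective:g}Gbps")
```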

What does that mean in practice?

In their CES Press Release, HDMI Forum, Inc. stated that HDMI 2.2 is for “immersive and virtual applications such as AR/VR/MR, spatial reality and light field [holographic] displays as well as various commercial applications such as large-scale digital signage, medical imaging and machine vision.” So, little relevance so far in the CEDIA space other than perhaps VR.

Just an observation: I can’t help but feel that VR headset manufacturers would continue to favour not tethering their products with a cable, instead using onboard processing (as with Meta Quest and Apple Vision Pro) or wireless transfer from an outboard processor. Or, if tethered, using USB Type-C to also carry power and controls (like Sony PlayStation VR2). So why the need for HDMI connectivity? That’s a rhetorical question that we’ll eventually see answered by manufacturers. Remember, the HDMI Forum provisions capabilities within the specification; it’s then up to manufacturers to decide what they wish to implement.

But there is far more to HDMI 2.2 than what was included in the CES Press Release. I’ve seen proposals for 96Gbps enabling 12K and 16K video. That’s up to sixteen times the resolution of 4K! Do we need that? Absolutely not. Well, not on TV screens anyway. What I think is far more relevant, and I was delighted to see it, was the inclusion of 4K up to a whopping 480fps and 8K up to 240fps.

I’ve often lamented that the main beneficiaries of 8K are little kids in their habit of sitting sprawled a metre or less from the screen. But everybody can benefit from higher frame rates, regardless of resolution, screen size or viewing distance — especially gamers (back to this shortly). Of course, combining high resolution and high frame rate is best, but in a bandwidth-limited world, there’s typically a trade-off between them. For instance, 8K at 30fps makes use of the same bandwidth as 4K at 120fps.
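That equivalence is simple arithmetic: raw pixel throughput is width × height × frame rate, so four times the pixels at a quarter of the frame rate costs exactly the same. A quick sketch (pixel payload only; bit depth and blanking apply equally to both formats and cancel out of the comparison):

```python
def pixel_rate(width, height, fps):
    """Raw pixels per second for a given resolution and frame rate."""
    return width * height * fps

rate_8k30 = pixel_rate(7680, 4320, 30)    # 8K at 30fps
rate_4k120 = pixel_rate(3840, 2160, 120)  # 4K at 120fps

# Same bandwidth, very different motion performance.
assert rate_8k30 == rate_4k120
print(f"Both formats: {rate_8k30:,} pixels per second")
```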

I advocate temporal (time-based) resolution as more important in video than spatial resolution (the number of pixels in each frame), for the following reasons:

  • Motion blur: Any motion that occurs in the time interval of one frame is represented as motion blur between an object’s start and finishing points. The lower the frame rate, the longer the time interval for each frame and the higher the relative motion blur, especially with fast-moving objects. Conversely, the higher the frame rate, the lower the motion blur. Motion blur does not occur in the real world, so while it’s necessary to create the illusion of smooth motion in video, it does also inhibit realism.

Back to the comparison of 8K/30 and 4K/120, the latter will exhibit one-quarter of the motion blur and may actually look better with fast-moving content. But when comparing the same low frame rate, blur is still blur whether it’s in 4K or 8K!
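To put rough numbers on the blur argument: everything an object moves within one frame interval smears into blur, so blur extent scales inversely with frame rate. A minimal sketch, with the object speed chosen purely for illustration:

```python
def blur_extent_px(speed_px_per_s, fps):
    """Pixels of motion captured within a single frame interval,
    an upper bound on motion blur (assumes a full-frame shutter)."""
    return speed_px_per_s / fps

# Illustrative: an object crossing half a 4K screen width every second.
speed = 1920  # pixels per second
for fps in (30, 120):
    print(f"{fps}fps: up to {blur_extent_px(speed, fps):.0f}px of blur per frame")
```

At 120fps the per-frame blur is exactly one-quarter of the 30fps figure, whatever speed you pick.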

  • Temporal aliasing: This refers to time-based artifacts that appear in the image but shouldn’t be there. A classic example is the wagon-wheel effect, whereby a spinning wheel looks stationary or appears to rotate backwards because of a conflict between its actual rotation speed and the camera’s frame rate and shutter speed.

Back in 2008, BBC R&D released a research white paper titled High Frame-Rate Television (document WHP169). The authors explored at what frame rate temporal aliasing would no longer be visibly present. Long story short, they recommended mastering video at 300fps. For the record, motion blur would also be eliminated at 300fps in all but the very fastest-moving objects, like an F1 car.
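The wagon-wheel effect is straightforward sampling aliasing, and easy to model: each frame the wheel advances some fraction of a turn, and the eye perceives the nearest equivalent within plus or minus half a turn. A small sketch (the rotation and frame-rate figures are illustrative, not from the BBC paper):

```python
def apparent_rotation(rev_per_s, fps):
    """Apparent wheel speed (rev/s) when sampled at a given frame rate.
    Rotation aliases to the nearest equivalent within +/- half a turn
    per frame: the classic wagon-wheel effect."""
    per_frame = (rev_per_s / fps) % 1.0  # turns advanced each frame
    if per_frame > 0.5:
        per_frame -= 1.0                 # perceived as rotating backwards
    return per_frame * fps

print(apparent_rotation(24, 25))   # about -1: appears to creep backwards
print(apparent_rotation(24, 300))  # about 24: high frame rate resolves it
```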

Back then, my mind boggled at the notion of 300fps. Even by 2020 that was still the case. Never did I imagine that the developers of HDMI would so soon nominate frame rates even beyond that lofty threshold, but now they have with HDMI 2.2!

The CEDIA white paper Video Principles: Resolution (2024 review) expands on the abovementioned BBC R&D paper and its discussion of “dynamic resolution”. This states that the higher the spatial resolution of motion blur in an image, the more pronounced it will be against the unblurred high-resolution background. In other words, motion blur may be more evident at 8K than it is at 4K with the same frame rate. That’s yet another reason to favour higher frame rate over resolution if given a choice. But choose both if you can.

Use cases

Two things: Gaming and VR.

It’s been widely reported over the last few years by the likes of the Entertainment Software Association (ESA) in the US and Entertainment Retailers Association (ERA) in the UK that the global video gaming market is bigger than the global music industry and Hollywood global box office combined. In fact, it’s more than double. If it isn’t already, gaming should be a core consideration for any integrator when designing video systems for clients.

HDMI Specification 2.1 introduced variable refresh rate (VRR), whereby each frame can be resolved onscreen in the same time it took the graphics processing unit (GPU) to produce it. That can differ for every frame, depending on its rendering complexity, with the frame rate swinging anywhere from 30fps up to potentially around 120fps.

I’ve personally experienced gaming at up to 280fps (with an NVIDIA 40-series GPU via DisplayPort) and was amazed at how silky smooth the experience was. Uncannily natural motion, zero blur. Incredible. Lifting the bandwidth ceiling with HDMI 2.2 could enable VRR up to 240fps in uncompressed 4:4:4/RGB HDR, or up to 480fps if subsampled to 4:2:0.
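Those ceilings can be sanity-checked with rough payload arithmetic (this ignores blanking intervals and FRL link overhead, both of which add substantially in practice, hence the need for the full 96Gbps):

```python
def payload_gbps(width, height, fps, bits_per_channel=10, chroma="4:4:4"):
    """Approximate video payload rate in Gbps, ignoring blanking and
    link overhead (both add substantially in practice)."""
    # Channels worth of data per pixel for each chroma subsampling mode.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * channels * bits_per_channel / 1e9

print(f"4K/240 RGB 10-bit:   {payload_gbps(3840, 2160, 240):.1f}Gbps")
print(f"4K/480 4:2:0 10-bit: {payload_gbps(3840, 2160, 480, chroma='4:2:0'):.1f}Gbps")
```

Note that 4K/480 in full RGB would need around 119Gbps of payload alone, which is why the 480fps figure comes with 4:2:0 subsampling.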

As for VR, achieving its namesake goal of realism in a virtual space requires a few ingredients: resolution (already achievable), wide field of view to meet or exceed human peripheral vision (not there yet), dynamic range like the real world (far beyond current HDR specs), and naturally smooth motion. This last point again highlights the absolute need for very high frame rates.

Lip-sync

“LIP” looks like an example of coming up with a clever acronym and then devising the words to fill it: Latency Indication Protocol. But no matter, I like it! Anything to improve lip-sync must be a good thing. In fact, it’s already factored into the upcoming CEDIA/CTA-RP23 Immersive Video Design Recommended Practice.

I mentioned earlier in this piece that auto lip-sync was first introduced to HDMI with version 1.3 in 2006. It’s worked pretty well, for the most part. HDMI 2.1 later introduced a new-generation bi-directional auto lip-sync capability as part of eARC. Good, but not infallible.

We’ll have to wait for specific details about how LIP works, but my understanding is that it’s a metadata block indicating video latency in the system so that the audio can be delayed to match, whether the audio device sits before the display (an AVR) or after it (a soundbar). Latency can occur in the display itself, but may also be introduced by other devices inline: video processors, extenders and so on.
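The principle, if not the protocol, is straightforward: tally the video latency downstream of where the audio is tapped off, and delay the audio by that amount. A hypothetical sketch only; the device names, latency figures and mechanism here are illustrative assumptions, not details of LIP:

```python
# Hypothetical video chain latencies in milliseconds; the devices and
# figures are illustrative assumptions, not from the LIP specification.
video_chain_latency_ms = {
    "video processor": 12,
    "extender": 1,
    "display": 35,
}

# Delay audio by the total video latency downstream of the audio tap
# point, so sound and picture reach the viewer together.
audio_delay_ms = sum(video_chain_latency_ms.values())
print(f"Delay audio by {audio_delay_ms}ms")
```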

Fingers crossed that this feature may be enabled through a firmware update rather than needing new hardware.

Compliance

This may disgruntle some, but HDMI 2.2 will make the HDMI 2.1 specification obsolete. Technically there will be no more HDMI 2.1, just as there is no more HDMI 2.0. All compliant devices become HDMI 2.2, even if they don’t feature any of the new capabilities. But don’t let that bother you. As always, focus on what a device can actually do, or not do, rather than on the version number it claims, which would leave you making assumptions.

Conclusion

I think by now you’ve got the message that I’m a fan of frame rate over resolution, though in practice that’s usually determined by the content media.

While I am excited by what HDMI 2.2 can eventually bring to gaming and VR, it doesn’t mean any change for integrators in the interim. Just keep plugging away… literally.
