More Standards Confusion, This Time with HDMI 2.1

Originally published at: More Standards Confusion, This Time with HDMI 2.1 - TidBITS

We recently covered the dizzying confusion of USB standards, but HDMI standards for displays are even more confusing, and sometimes outright misleading. Read on to see what we mean.

Yep. HDMI has been a tangled mess for a long time. I made some small contributions to the Wikipedia page several years ago to try to clarify a few points, and got into some rather intense discussions with experts who showed me how much of a mess the standards really are.

You might find this recent Apple Insider article similarly relevant:

The root cause of the problem is that each HDMI standard is a grab-bag of features, nearly all of which are optional. So it is possible for a device to claim to be “HDMI 2.1” while not providing or supporting any advanced features.

It can get especially frustrating when shopping for cables, because there’s no such thing as a “version” for a cable, despite various companies’ marketing departments’ attempts to claim otherwise. The HDMI specs define three categories of cable (with four different brandings, just to be even more confusing), characterized entirely in terms of bandwidth (see the sketch after this list):

  • Category 1 (aka “standard”) HDMI requires cables to pass a 74.25 MHz signal. It says nothing about how many bits per second can be transmitted without error (e.g. it may pass a 74.25 MHz signal with really bad signal integrity).

    You should expect a certified category 1 cable to be able to support any combination of features from HDMI 1.0 through 1.2 (which may require up to about 5 Gbit/s of good data), but there’s nothing guaranteeing it.

  • Category 2 (aka “high speed”) HDMI requires the cable to pass a 340 MHz signal, but also has no data-rate requirement. You should expect a certified category 2 cable to be able to support all HDMI 1.0 through 1.4 features (up to 10.2 Gbit/s), but again, there is no guarantee.

  • The “Premium High Speed” branding was introduced at the same time as HDMI 2.0. Premium high speed cables are category 2 cables (with the same 340 MHz analog bandwidth), but they must pass data-throughput tests proving that they can reliably pass 18 Gbit/s (the maximum that will be used by any combination of HDMI 2.0 features).

    IMO, this is the first HDMI certification standard that means anything to normal users.

  • Category 3 (aka “ultra high speed”, “48G”) HDMI says nothing about the analog frequency that must be passed (as far as I know), but it requires the cable to pass 48 Gbit/s, which is enough to support every valid combination of HDMI 2.1 features.
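To make those tiers concrete, here’s a minimal sketch (in Python, with names of my own invention) that encodes the four brandings and picks the cheapest one for a given data rate. Note that the Gbit/s figures for the first two tiers are the expectations described above, not certified guarantees; only the Premium and Ultra certifications actually test data throughput.

```python
# Minimal sketch of the four cable brandings, using the figures above.
# Only the last two tiers certify data throughput; the first two
# figures are reasonable expectations, not guarantees.
CABLE_TIERS = [
    ("Standard (category 1)",          5.0),   # expected, not throughput-tested
    ("High Speed (category 2)",       10.2),   # expected, not throughput-tested
    ("Premium High Speed",            18.0),   # certified throughput
    ("Ultra High Speed (category 3)", 48.0),   # certified throughput
]

def cheapest_adequate_tier(required_gbps):
    """Return the lowest cable tier whose throughput covers the requirement."""
    for branding, gbps in CABLE_TIERS:
        if gbps >= required_gbps:
            return branding
    raise ValueError("no HDMI cable tier carries that much data")

print(cheapest_adequate_tier(8.0))    # High Speed (category 2)
print(cheapest_adequate_tier(17.8))   # Premium High Speed
```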

See also HDMI - Wikipedia

The critical point, which almost nobody makes, is that the amount of bandwidth you require depends on what features your devices are using (resolution, color depth, encoding, etc.). Even if your devices are HDMI 2.1 compliant, if the particular combination of features used by your devices only requires (for example) 8 Gbit/s, then you can use cheap category 2 (“high speed”) cables. You don’t need a category 3 cable unless the features you’re using require more than 18 Gbit/s.
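If you want to estimate that requirement yourself, the arithmetic for HDMI 1.x/2.0 links is roughly as follows (HDMI 2.1’s fixed-rate-link signaling works differently). This is a rough sketch, using the standard CTA-861 timing totals, which include the blanking intervals:

```python
def tmds_gbps(h_total, v_total, refresh_hz, bits_per_component=8):
    """Rough TMDS link rate: 3 data channels, 10 bits on the wire per 8 bits of data."""
    pixel_clock_hz = h_total * v_total * refresh_hz   # totals include blanking
    deep_color_factor = bits_per_component / 8        # deep color scales the clock
    return pixel_clock_hz * deep_color_factor * 3 * 10 / 1e9

# 4K60, 8-bit RGB (4400 x 2250 total): ~17.8 Gbit/s -> needs a Premium cable
print(tmds_gbps(4400, 2250, 60))

# 1080p60, 8-bit RGB (2200 x 1125 total): ~4.5 Gbit/s -> a plain high speed cable is fine
print(tmds_gbps(2200, 1125, 60))
```

That’s why 4K60 at 8-bit color just fits within HDMI 2.0’s 18 Gbit/s, while 10- or 12-bit color at the same resolution and full chroma resolution pushes past it.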

The problem this creates for the world is that if you’re not a very technical person, you probably don’t know all the details about what features your devices are using, let alone how much bandwidth they require. That leaves you with only two choices, neither of which is particularly appealing:

  • Buy an expensive cable that can support the maximum feature set of an HDMI standard (e.g. a category 3 cable for an “HDMI 2.1” device). This should definitely work (if your cable is certified, of course), but you may end up paying extra for bandwidth you don’t require, depending on your installation.

  • Buy something cheaper (e.g. a category 2 cable) and hope it is good enough. In some cases (e.g. if you’re connecting to a TV that only supports 1080p), this will be no problem. But you may also find yourself with a cable that can’t support the required bandwidth, leading to a poor-quality image or no image at all. And a cable you thought was good may “go bad” in the future when you (or a firmware update) start using a feature you weren’t previously using.

Given knowledge of what HDMI did (and failed to do), I’m starting to like Apple’s solutions (for Thunderbolt and Lightning) much more. By including an ID chip in the cables, the devices that connect to them can check the cable’s capabilities and (at least in theory) respond gracefully if it isn’t up to the task, e.g. present a warning to the user and/or operate at reduced capability.
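For illustration only, here’s a hypothetical sketch of the kind of negotiation an ID chip makes possible; none of these names or structures come from Apple’s actual Thunderbolt or Lightning implementations:

```python
from dataclasses import dataclass

@dataclass
class CableID:
    """Capabilities reported by a (hypothetical) ID chip in the cable."""
    max_gbps: float

def negotiate(cable, required_gbps):
    """Sketch of graceful fallback instead of HDMI's silent failure mode."""
    if cable.max_gbps >= required_gbps:
        return "run at full capability"
    # Rather than producing a bad or blank image, the device can warn
    # the user and drop to a mode the cable can actually carry.
    return f"warn user; fall back to a mode within {cable.max_gbps} Gbit/s"

print(negotiate(CableID(max_gbps=10.2), required_gbps=17.8))
```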


I wrote about many of these issues a few years ago. It’s a little out of date now, but the confusion remains. (My solution to these problems was buying a new TV.)

Per David above, HDMI 2.1a is now being introduced at CES with the same issues! CES 2022 will introduce HDMI 2.1a, another confusing new spec - The Verge


But on the other hand, when you do update to equipment that does need that extra bandwidth, you won’t have to buy all new cables.

What are the price differences? That’s the question.

If typical-length 3 metre Cat 3 and Premium cables can be had for similar prices, then most people can likely sidestep the issue by just buying the Cat 3 and being done with it.

Presumably, if the cable advertises 48 Gbps speeds and the brand is recognisable, one can assume everything else is fine.

The real issue is devices advertising 2.1 (or soon 2.1a) with half the expected new functionality completely missing. So average ‘non-tech display nerds’ are left to fend for themselves. Such a pointless spec. :roll_eyes: