Originally published at: More Standards Confusion, This Time with HDMI 2.1 - TidBITS
We recently covered the dizzying confusion of USB standards, but HDMI standards for displays are even more confusing, and often outright misleading. Read on to see what we mean.
Yep. HDMI has been a tangled mess for a long time. I made some small contributions to the Wikipedia page several years ago to try to clarify a few points, and got into some rather intense discussions with experts who showed me how much of a mess the standards really are.
You might find this recent Apple Insider article similarly relevant:
The root cause of the problem is that each HDMI standard is a grab-bag of features, nearly all of which are optional. So it is possible for a device to claim to be "HDMI 2.1" while not providing or supporting any advanced features.
It can get especially frustrating when shopping for cables, because there's no such thing as a "version" for a cable, despite various companies' marketing departments' attempts to claim otherwise. The HDMI specs define three categories of cable (with four different brandings, just to be even more confusing), defined entirely in terms of bandwidth:
- Category 1 (aka "standard") HDMI requires cables to pass a 74.25 MHz signal. It says nothing about how many bits per second can be transmitted without error (e.g. a cable may pass a 74.25 MHz signal with really bad signal integrity). You should expect a certified category 1 cable to support any combination of features from HDMI 1.0 through 1.3 (which may require up to about 5 Gbit/s of good data), but there's nothing guaranteeing it.

- Category 2 (aka "high speed") HDMI requires the cable to pass a 340 MHz signal, but likewise has no data-rate requirement. You should expect a certified category 2 cable to support all HDMI 1.0 through 1.4 features (up to 10.2 Gbit/s), but again, there is no guarantee.

- The "Premium High Speed" branding was introduced at the same time as HDMI 2.0. Premium high speed cables are category 2 cables (with the same 340 MHz analog bandwidth), but they must pass data-throughput tests proving that they can reliably carry 18 Gbit/s (the maximum that will be used by any combination of HDMI 2.0 features).

  IMO, this is the first HDMI certification standard that means anything to normal users.

- Category 3 (aka "ultra high speed" or "48G") HDMI says nothing about the analog frequency that must be passed (as far as I know), but it requires the cable to carry 48 Gbit/s, which is enough to support every valid combination of HDMI 2.1 features.
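The tiers above boil down to "what throughput should this branding handle, and is that throughput actually tested?" Here's a minimal sketch of that lookup, using the figures discussed above; the table layout and the `cheapest_tier` helper are my own illustration, not anything defined by the HDMI spec:

```python
# Cable brandings and the nominal throughput you should expect from each,
# per the categories described above. The boolean records whether the
# certification actually tests data throughput (vs. only an analog signal).
CABLE_TIERS = [
    # (branding,                 max Gbit/s, throughput-tested?)
    ("Standard (category 1)",     5.0,       False),  # 74.25 MHz analog test only
    ("High Speed (category 2)",   10.2,      False),  # 340 MHz analog test only
    ("Premium High Speed",        18.0,      True),   # tested at 18 Gbit/s
    ("Ultra High Speed (cat 3)",  48.0,      True),   # tested at 48 Gbit/s
]

def cheapest_tier(required_gbps, tested_only=False):
    """Return the least-capable branding that covers the required bandwidth.

    Pass tested_only=True to skip tiers whose certification doesn't
    actually verify data throughput.
    """
    for name, gbps, tested in CABLE_TIERS:
        if required_gbps <= gbps and (tested or not tested_only):
            return name
    return None
```

For example, an 8 Gbit/s signal lands on a "high speed" (category 2) cable, but if you insist on a throughput-tested certification, the cheapest option jumps to Premium High Speed.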
See also HDMI - Wikipedia
The critical point, which almost nobody makes, is that the amount of bandwidth you require depends on what features your devices are using (resolution, color depth, encoding, etc.). Even if your devices are HDMI 2.1 compliant, if the particular combination of features used by your devices only requires (for example) 8 Gbit/s, then you can use cheap category 2 ("high speed") cables. You don't need a category 3 cable unless the features you're using require more than 18 Gbit/s.
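To make that concrete, here's a rough sketch of the TMDS link-rate arithmetic used by HDMI 1.x/2.0-style signaling. The total (active + blanking) timings in the examples are the published CTA-861 figures for those modes; the function name is mine, and this is a back-of-the-envelope estimate, not a certification tool:

```python
def tmds_gbps(h_total, v_total, refresh_hz, bits_per_component=8):
    """Approximate TMDS link rate in Gbit/s for HDMI 1.x/2.0 signaling.

    h_total and v_total are the *total* timings (active pixels plus
    blanking). Each of the 3 TMDS channels carries 10 bits per pixel
    clock (8b/10b encoding), and deep color scales the pixel clock.
    """
    pixel_clock = h_total * v_total * refresh_hz * (bits_per_component / 8)
    return pixel_clock * 3 * 10 / 1e9  # 3 channels x 10 bits each

# 1080p60 (CTA-861 total timing 2200 x 1125, 148.5 MHz pixel clock):
# ~4.5 Gbit/s -- comfortably within a category 2 cable's 10.2 Gbit/s.
print(tmds_gbps(2200, 1125, 60))

# 4K60 8-bit 4:4:4 (total timing 4400 x 2250, 594 MHz pixel clock):
# ~17.8 Gbit/s -- this is why HDMI 2.0 needs the 18 Gbit/s
# "Premium High Speed" certification.
print(tmds_gbps(4400, 2250, 60))
```

Bumping the second example to 10-bit or 12-bit color pushes it past 18 Gbit/s, which is exactly the point where a category 2 cable stops being enough.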
The problem this creates for the world is that if you're not a very technical person, you probably don't know all the details about what features your devices are using, let alone how much bandwidth they require. Which means you're left with only two choices, neither of which is particularly appealing:
- Buy an expensive cable that can support the maximum feature set of an HDMI standard (e.g. a category 3 cable for an "HDMI 2.1" device). This should definitely work (if your cable is certified, of course), but you may end up paying extra for bandwidth you don't require, depending on your installation.

- Buy something cheaper (e.g. a category 2 cable) and hope it is good enough. In some cases (e.g. if you're connecting to a TV that only supports 1080p), this will be no problem. But you may also find yourself with a cable that can't support the required bandwidth, leading to a bad-quality image, or no image at all. And a cable you thought was good may "go bad" in the future when you (or a firmware update) enable a feature you weren't previously using.
Given knowledge of what HDMI did (and failed to do), I'm starting to like Apple's solutions (for Thunderbolt and Lightning) much more. By including an ID chip in the cables, the devices that connect to them can check the cable's capabilities and (at least in theory) respond gracefully if it isn't up to the task (e.g. present a warning to the user and/or operate at reduced capability).
I wrote about many of these issues a few years ago. It's a little out of date now, but the confusion remains. (My solution to these problems was buying a new TV.)
Per David above, 2.1a is now being introduced at CES with the same issues! CES 2022 will introduce HDMI 2.1a, another confusing new spec - The Verge
But on the other hand, when you do update to equipment that does need that extra bandwidth, you won't have to buy all new cables.
What's the price difference? That's the question.

If a typical-length 3-metre Cat 3 cable and a Premium one can be had for similar prices, then most people can likely sidestep the issue by just buying the Cat 3 and being done with it.
Presumably if the cable advertises 48Gbps speeds, and the brand is recognisable, one can assume everything else is fine.
The real issue is devices advertising 2.1 (or soon 2.1a) with half the expected new functionality completely missing. Average "non-tech display nerds" are left to fend for themselves. Such a pointless spec.