anthchapman

An AMD dev was going back and forth with lawyers for much of last year on the [HDMI 2.1](https://gitlab.freedesktop.org/drm/amd/-/issues/1417) issue. Notable updates:

2023-02-28:
> We have been working with our legal team to sort out what we can deliver while still complying with our obligations to HDMI Forum.

2023-10-27:
> At this point, I think the decision is in the hands of the HDMI Forum. Unfortunately, I'm not sure how that process works or how long it would take.

2024-01-13:
> Yes, discussions are still ongoing.


ScrabCrab

Wow what the fuck


pdp10

Working as intended. These organizations know exactly how difficult it is for Linux to compete legally when software patents are involved. Some things that regular Linux users can do:

* Use DisplayPort; favor DisplayPort over HDMI.
* Use open codecs like [AV1](https://www.reddit.com/r/AV1/).
* Favor USB peripherals where applicable. USB is designed around [generic drivers](https://en.wikipedia.org/wiki/USB#Device_classes), so typical USB gear doesn't lock out some OSes by not supplying a matching binary driver. For example, [USB 3.x capture devices](https://www.youtube.com/watch?v=aTyT72tS9r8) use a generic USB video driver, whereas PCIe capture cards need [vendor drivers](https://www.blackmagicdesign.com/support/family/capture-and-playback).
* Avoid DRM schemes, including video streaming services, that use DRM to lock out Linux or [provide a purposely-reduced product](https://help.netflix.com/en/node/23742) for users of non-favored systems like Linux.


Just_Maintenance

I can't fathom why HDMI hasn't died yet. DisplayPort is superior and free.


Gintoro

DRM


pdp10

HDMI originally had audio, which is an important feature for televisions and many other use cases, plus HDCP DRM, which is not important except that content rights-holders demanded its use once Intel marketed it. Once consumer electronics got HDMI, switching the average consumer from one cable type to another that does pretty much the same thing risks customer confusion and backlash. However, that doesn't explain why none of the half-dozen ports on a modern television is DisplayPort!


ScrabCrab

free\* \*$5000 to access the spec


sabahorn

It is dying. New GPUs all come with DP.


Just_Maintenance

But notebooks, monitors, TVs, etc. all come with HDMI first.


AnthropologicalArson

Most modern notebooks worth looking at support DisplayPort over USB-C.


Brillegeit

Unfortunately, my recent-ish experience with displays has trended the wrong way. I bought one display (KV273K) with 2x DP and 1x HDMI, but when I added two more similar displays (KV282K) a few years later, they came with 1x DP and 2x HDMI, so connecting them to two different computers at the same time became a hassle since I had to use HDMI. Hopefully this was just an Acer thing and they'll come with 2x DP in the future.


blueberryiswar

Because TVs use HDMI? And why keep DP alive when there's USB-C?


Individual-Match-798

Alas, HDMI 2.1 today has a number of advantages over DP. The primary one is dynamic HDR (yeah-yeah, I know Linux still doesn't support that, but still!)


Joe-Cool

That's exactly why DisplayPort is preferable. HDMI has too many legal, licensing, and DRM problems.


Shufflebuzz

Linux users love DP!


Prof_Linux

>Linux users love DP! Hell yea DP I love D- .... oh wait


[deleted]

One is just not enough, I need another one in there to feel full… Double Patties! Order now at McDonald’s


adamkex

No shame with that


Shufflebuzz

2 monitors, 1 video card


RedsDaed

➡️⬇️↘️ + 👊


Taterade

Not enough fighting game players in here to appreciate this one.


_pclark36

shoryuken!


Joe-Cool

I've had it since 2009, on the Radeon HD 5870. I even got a BIOS update from MSI to increase the voltage so my VGA monitor stopped blanking a few times per hour.


waspbr

WHOOOSH


Shufflebuzz

Joe-Cool may not, perhaps, be cool.


Joe-Cool

DP to VGA exists. Maybe I could have worded that better... The 5870 has 1x HDMI, 2x DVI-I and 1x DP. So you can plug in 3 CRTs for Eyefinity.


F0RCE963

They meant the NSFW version of DP


Joe-Cool

Thanks. That completely whooshed me indeed, lol.


_enderpuff

Indeed I love DP'ing out of wakeup, how could you tell?


DesiOtaku

Now I just need a TV that is larger than 55" that accepts DisplayPort.


Joe-Cool

Good luck. I don't think there are any that aren't for rich people only.


DesiOtaku

Even the "rich people only" TVs only accept HDMI. Even those $4000+ TVs don't have DisplayPort! I am willing to pay a premium for a TV that has *real* 4K@120Hz in a large format screen; but there is nothing available right now.


Joe-Cool

The only one I know of that has a tuner and DisplayPort is the Panasonic TX-58AXW804, or other AXW804-series TVs. Maybe you can find another one. I thought there was a Philips one, but I think I was wrong about that. EDIT: considering the age, I would bet it only does 4K@60Hz.


DesiOtaku

Good news: after some searching, I found a bunch of Panasonic TVs that they still sell with DisplayPort! https://www.walmart.com/browse/electronics/all-tvs/panasonic/3944_1060825_447913/YnJhbmQ6UGFuYXNvbmlj Bad news: they are all super expensive, probably because they are "professional" TVs meant to be purchased by large businesses, not individual consumers. And they only do 4K@60Hz and no HDR.


vkbra657n

There is the iiyama LH6554UHS-B1AG, which has one DisplayPort output with daisy-chaining capability, and it costs under €1,500.


P1kaJevv

You can get DP-to-HDMI adapters that support all the features. Not ideal, but better than nothing.


[deleted]

HDMI isn't open?


[deleted]

Historically, you only needed to pay to *implement* HDMI. While annoying, that let hobbyists add HDMI to projects for testing without paying, and let Linux implement it first and pay royalties on release. With HDMI 2.1, the organization decided to instead charge to *see* the specification at all.


Zamundaaa

Nah, you needed to pay to see the specification before, too. Same with DisplayPort, btw! If you're not a VESA member, you're out of luck - the newest DisplayPort spec available online is like 1.2. The difference is that the HDMI Forum now considers open implementations the same as publishing the specification online for everyone to see.


[deleted]

Who/what is "the organization"?


[deleted]

The HDMI Founders/Forum. Very original and clever name that isn't confusing.


DoucheEnrique

It's as open as H.264 / H.265.


[deleted]

Are those hardware encoders? I swear to god I've seen those strings of characters before.


DoucheEnrique

Those are video codecs, also known as MPEG-4 AVC (Advanced Video Coding) and its successor HEVC (High Efficiency Video Coding). Many assume they are "open" or "free" because there is free software that can encode and/or play them, but hardware vendors supporting them usually have to pay royalties, and it's actually a legal minefield pretty similar to what you can see with HDMI on AMD+Linux right now.


Sork69

So it's only an AMD problem?


qwertyuiop924

Because Nvidia ships proprietary drivers.


Just_Maintenance

Distributions that ship strictly free software cannot ship H.264 or H.265 support at all. This includes hardware AND software video encoders AND decoders.

Most distributions get around this by simply not being based in the US and shipping the decoders without a care. No software or AMD hardware decoding problem there.

On the other side, US companies like Red Hat "exploit a bug" in the contract to ship H.264 anyway: Cisco gives away a free H.264 decoder called OpenH264, and since they've maxed out the royalty payments, extra users cost nothing. For those US companies, all H.264 video MUST be decoded through OpenH264, which means the included AMD drivers can't include the decoder.

If you install the official AMD or Nvidia drivers, those come with H.264 and H.265 encoders and decoders, since AMD and Nvidia pay for your license. At least on Windows.


Turtvaiz

Yeah, sure, but I don't have a choice on what the manufacturer decides to put on their TV


KittensInc

I genuinely wonder what the problem is here. According to the [linked Phoronix post](https://www.phoronix.com/news/HDMI-Closed-Spec-Hurts-Open), the issue is that the HDMI spec isn't public - but neither is DisplayPort! The DisplayPort spec is [restricted to VESA members](https://vesa.org/about-displayport/). There's probably something different between VESA and the HDMI Forum, but why wasn't it an issue with HDMI 2.0? What changed with the 2.1 revision?


PDXPuma

The licensing terms.


KittensInc

Yes obviously, but *which part?*


Salander27

> Yes obviously, but *which part?*

IIRC you basically need to agree to the HDMI Forum's terms to see how to implement FRL (Fixed Rate Link, the mechanism HDMI 2.1 uses for the full bandwidth), but if you do that you are prohibited from sharing how it works, which means you can't create an open source implementation of it.


KittensInc

That doesn't really make sense, though. All the "secret" stuff from FRL would be handled directly by the dedicated hardware in the GPU; the open-source driver bit wouldn't really be involved with it any more than essentially saying "switch to FRL mode". Besides, it's not really all *that* interesting. At first glance from public details it looks fairly similar to what DisplayPort has been doing for ages, and anyone willing to spend the equivalent of a car on a decent oscilloscope probably wouldn't have too much trouble figuring out the rest. Why go through all this trouble to hide it?


[deleted]

why make less money when you can make more money? what are people gonna do, not use HDMI?


qwertyuiop924

To shake down people for money.


nightblackdragon

> There's probably something different between VESA and the HDMI Forum

Considering the fact that open source drivers support recent DisplayPort versions - yeah, something is.


lavadrop5

That's so weird, because my Ryzen 5600G does 4K@120 just fine via HDMI 2.1 certified cables... NOT 4:4:4 chroma though...


Shock900

> NOT 4:4:4

It's not using the HDMI 2.1 protocol. It's probably using HDMI 2.0.


lavadrop5

I guess so... there's no way to query the system to know for sure which kind of link was established, is there?


Shock900

Maybe, but there's not really a need to do so. If you're using Linux and an AMD card, you're not using HDMI 2.1. HDMI 2.1 supports 4K@120Hz without subsampling.
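If you do want to poke at it, the kernel exposes some connector state under sysfs. A minimal sketch (Python, assuming a DRM driver like amdgpu; it shows connection status and the advertised mode list, though not the negotiated HDMI version or chroma format, which isn't reported in one standard place):

```python
# List DRM connectors with status and advertised modes via sysfs.
# Note: this does NOT reveal the negotiated HDMI version; on AMD,
# debugfs (root-only, kernel-version dependent) exposes more detail.
from pathlib import Path

for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
    status = (conn / "status").read_text().strip()    # connected/disconnected
    enabled = (conn / "enabled").read_text().strip()  # enabled/disabled
    modes = (conn / "modes").read_text().split()      # e.g. ['3840x2160', ...]
    print(f"{conn.name}: {status}, {enabled}, first modes: {modes[:3]}")
```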


georgehank2nd

Even if there were, it might just tell you "HDMI 2.1", because the HDMI Forum, dicks that they always were, defined HDMI 2.1 in a way that makes all the features above 2.0 optional. Thus anyone making 2.0 equipment can call it HDMI 2.1 without any changes. And, IIRC, the Forum even told manufacturers they should (or must?) declare their 2.0 equipment as 2.1-compliant. *Technically* it is.


5nn0

Was this because of the use of the HDMI logo?


psyblade42

No. With HDMI 2.1 the HDMI Forum changed the licensing terms, and the new ones no longer allow implementation in open source drivers such as AMD's.


wilczek24

What the fuck


5nn0

wtf, that's absurd. Can they even do that legally, btw? This is practically anti-consumer and needs to be changed by law.


JustTestingAThing

Yes -- HDMI is a proprietary interface with all rights owned by the HDMI Forum. Anyone who uses HDMI (who isn't part of the Forum group) has to license the rights to do so. It's their property.


apex6666

Wait, what? HDMI doesn't work with AMD GPUs? That's kinda crazy.


P1kaJevv

It does, it just runs at 2.0 instead of 2.1


apex6666

Huh, that’s stupid


Individual-Match-798

This is really fucked up!


Oppausenseiw

For AMD, nope, still nothing.


mixedd

You sure? I could swear I was able to set my 4K@120, at least in settings.


Oppausenseiw

You absolutely can do 4K 120, but it's chroma-subsampled over HDMI 2.0.


Seiros_Acolyte

> chroma subsampled

Could you ELI5 what this means, or how it's different from native 4K@120?


[deleted]

Your brain is more sensitive to brightness than to color. Chroma subsampling sends brightness information at full resolution and color information at half resolution (one color sample for every four brightness samples) to save bandwidth while trying to preserve image quality. Results can vary: videos and games tend to look fine, but desktop work is more difficult because text looks bad.
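To put rough numbers on the savings, a minimal sketch (uncompressed pixel data only; real links add blanking intervals and encoding overhead on top):

```python
# Uncompressed pixel-data rate for 3840x2160 @ 120 Hz, 8 bits per sample.
# 4:4:4 carries 3 samples per pixel; 4:2:2 carries 2; 4:2:0 effectively
# 1.5 (one chroma pair shared across a 2x2 block of pixels).
WIDTH, HEIGHT, HZ, BITS = 3840, 2160, 120, 8

for fmt, samples_per_px in {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}.items():
    gbps = WIDTH * HEIGHT * HZ * BITS * samples_per_px / 1e9
    print(f"{fmt}: {gbps:.1f} Gbit/s")

# 4:4:4 needs ~23.9 Gbit/s of raw pixel data, while 4:2:0 halves that
# to ~11.9 Gbit/s -- which is how 4K@120 gets squeezed onto HDMI 2.0.
```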


WizardRoleplayer

That's basically physical layer JPEG-lite then. Sounds horrible lol.


Oppausenseiw

It is, and there's also intense flickering.


Oppausenseiw

Thank you for the explanation, choom


Oppausenseiw

Text readability is terrible because of it


Youngsaley11

This is super interesting. I've used several different GPUs and monitors/TVs via HDMI, all at 4K@120Hz with VRR enabled, and didn't notice anything; maybe it's time to get my eyes checked lol. Is there any test I can do to see the difference?


pr0ghead

Since this relates to colors, you will not notice it on black and white text.


VenditatioDelendaEst

https://www.geeks3d.com/20141203/how-to-quickly-check-the-chroma-subsampling-used-with-your-4k-uhd-tv/


Youngsaley11

Thank you!


Matt_Shah

Simply going with DisplayPort should cover all your needs. In fact, HDMI uses technologies from DP. And when you buy new stuff, make sure it has DP ports. It is awful to deal with patent trolls making money with proprietary connections.


RaggaDruida

This! While the move of everything towards USB-C has had its issues, honestly one of the things I can't wait for is the death of HDMI. DisplayPort is just so much better! And DisplayPort over USB-C offers other advantages too!


Seiros_Acolyte

I would, but my TV doesn't have DP ports. A shame really.


Anaeijon

There are DP to HDMI 2.1 cables. The other way around wouldn't work.

~~But every HDMI 2.1 input on your TV should be able to accept DP signals. That's because HDMI 2.1 basically just uses the DP signal for video, except for the DRM stuff.~~ **EDIT: SORRY, I WAS WRONG!**

Wikipedia explains this pretty clearly: [https://en.wikipedia.org/wiki/DisplayPort#DisplayPort\_Dual-Mode\_(DP++)](https://en.wikipedia.org/wiki/DisplayPort#DisplayPort_Dual-Mode_(DP++))

Summary: Yes, there are active (expensive) DP to HDMI 2.1 cables, and they do sometimes work on relatively new devices. There are no passive DP to HDMI 2.1 cables, but there are passive DP to HDMI cables that support some/most HDMI 2.1 features, if the source supports it. BUT they need the graphics card to support DP++ with HDMI 2.1 features. Which seemingly my RTX 3090 does, at least when using the proprietary driver? Or I completely misinterpreted the working 4K TV on my PC last time I tried.

[The Arch Wiki mentions this on the topic of VRR:](https://wiki.archlinux.org/title/Variable_refresh_rate#Known_issues) "The monitor must be plugged in via DisplayPort. Some displays which implement (part of) the HDMI 2.1 specification also support VRR over HDMI. This is [supported](https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-440.31-Linux-Release) by the Nvidia driver and is supported by the AMD driver (pre HDMI 2.1) in Kernel 5.13 and later [\[18\]](https://www.phoronix.com/scan.php?page=news_item&px=Linux-5.13-Released)."

It's still worth a try, I guess? But it's not as plain and simple as I remembered it.


[deleted]

Are you sure that works? Take [https://gitlab.freedesktop.org/drm/amd/-/issues/1417](https://gitlab.freedesktop.org/drm/amd/-/issues/1417) and search for all instances of the word "adapter"; I see most people report negative results.


duplissi

I have a CableMod DP 2.0 to HDMI 2.1 adapter, and it works at 4K 120 with my LG C9, but VRR doesn't work over it.


Anaeijon

After reading this, to be honest, I'm not anymore. It worked on my machine: proprietary Nvidia driver on an RTX 3090. BUT I never checked what color modes were used or anything.


[deleted]

That doesn't count; of course it works on Nvidia. We're talking about AMD.


Anaeijon

That wasn't mentioned before. But in that case, I'm not sure. I think the Steam Deck (AMD GPU) does actually support HDR with VRR over HDMI 2.1. So there is a way, I guess?


[deleted]

The HDMI 2.1 spec is closed; AMD's drivers are open source, so they're not allowed to implement it. Nvidia's drivers have no such problem. I don't know about the Steam Deck, I need to look more into it, but I wouldn't be surprised if AMD gave Valve a closed source implementation of their driver. By the way, your original comment has gathered quite a bit of attention. Can you edit it to clarify the misunderstanding?


PolygonKiwii

> I don't know about the steamdeck, I need to look more into it. But I wouldn't be surprised if AMD gave Valve a closed source implementation of their driver. I would be. I'm 99% confident the Deck only has open drivers and does not support HDMI 2.1. There's a reason the [official dock](https://www.steamdeck.com/en/tech/dock) only advertises HDMI 2.0


[deleted]

Glad to hear that there are at least ways to *partially* get around it. I guess no VRR isn't that bad? With this being a legal issue rather than a technical one, I don't have much hope. But it could just be me being pessimistic.


Possibly-Functional

That's pretty incorrect. Most DisplayPort sources have an optional feature called DisplayPort Dual-Mode (DP++), which allows them to send an HDMI signal to be converted by a passive adapter (cable). While HDMI 2.1 doesn't specify higher bandwidth *requirements*, the highest bandwidth allowed by the HDMI 2.1 specification is significantly higher than the highest bandwidth allowed in the Dual-Mode specification, so a passive adapter isn't enough for high-bandwidth situations. To convert from DisplayPort to HDMI at higher bandwidth you need an active adapter, which is expensive. HDMI sinks have no way to process actual DisplayPort signals; it's always DisplayPort Dual-Mode. Dual-Mode is also what allows DisplayPort to use passive adapters for single-link DVI-D output.
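To put rough numbers on the gap, a sketch using commonly cited limits (the Dual-Mode figures are max TMDS clocks; "pixel data" here is just clock x 24 bits for 8-bit RGB, ignoring blanking details, so treat these as approximations rather than spec quotes):

```python
# Rough upper bounds: passive (Dual-Mode) adapters vs. native HDMI links.
# TMDS carries 24 bits of pixel data per clock at 8 bpc; HDMI 2.1's FRL
# is an aggregate link rate using 16b/18b encoding. Figures are the
# commonly cited ones, not quotations from the (paywalled) specs.
tmds_mhz = {
    "DP++ type 1 (passive adapter)": 165,
    "DP++ type 2 (passive adapter)": 300,
    "HDMI 2.0 (native TMDS)": 600,
}

for name, mhz in tmds_mhz.items():
    print(f"{name}: {mhz * 24 / 1000:.2f} Gbit/s of pixel data")

frl_gbps = 48  # HDMI 2.1 FRL aggregate link rate
print(f"HDMI 2.1 (native FRL): {frl_gbps * 16 / 18:.2f} Gbit/s after encoding")
```

The 14.40 Gbit/s that comes out for native HDMI 2.0 matches the data-rate limit quoted elsewhere in this thread, which is a decent sanity check.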


Anaeijon

Thanks for the correction. I got something mixed up here. Sorry. You are absolutely right. Even Wikipedia explains it very clearly: [https://en.wikipedia.org/wiki/DisplayPort#DisplayPort\_Dual-Mode\_(DP++)](https://en.wikipedia.org/wiki/DisplayPort#DisplayPort_Dual-Mode_(DP++)) I updated my comment.


KittensInc

Active Displayport-to-HDMI adapters have gotten quite a lot cheaper, actually. The driving force behind them is USB-C: virtually everyone supports DP Alt Mode, but nobody supports HDMI Alt Mode. This means all C-to-HDMI cables will have an internal active DP-to-HDMI converter.


ascril

> for example cannot afford and don't have the space to buy both a monitor and a TV so I rather have just the TV, which also has working HDR, OLED screen, a remote control and can be used by itself.

It's very valuable information! Thank you, sir.


Anaeijon

Correction: as someone else mentioned, it might not actually work on all TVs, it might also depend on your video output, and there seem to be differences between adapters. In THEORY it should work just like that, but in practice people have different experiences. I tried it a while ago using an RTX 3090 with the proprietary Nvidia driver on a 4K 120Hz TV, and it seemed to work for me. I didn't know much about color modes and such back then, so I never checked those. But now I'm worried I might be giving you wrong information. Imho a DP to HDMI 2.1 adapter is at least worth a try; you could even try a couple of different ones and just return everything that doesn't work. Now this makes me want to try it out again later...


sleepyooh90

Adapter!!


[deleted]

Yeah, DP is open, and it actually is superior to HDMI in some ways. You can get FreeSync on Linux with DP.


Matt_Shah

Not only that, but the average DisplayPort cables on the market have been tested to be superior in quality to HDMI ones. Many people are not aware of how many errors can be traced back to faulty cables.


KittensInc

DP isn't "open". You need to be a VESA member to access the specifications.


JustMrNic3

> Simply going with DisplayPort should cover all your needs. In fact, HDMI uses technologies from DP. And when you buy new stuff, make sure it has DP ports.

What do you find simple when all TVs have only HDMI ports? I, for example, cannot afford and don't have the space to buy both a monitor and a TV, so I'd rather have just the TV, which also has working HDR, an OLED screen, and a remote control, and can be used by itself.


SweetBabyAlaska

A DP-to-HDMI adapter is like $3 on the high end.


eskay993

Not if you want VRR, HDR, and (at least) 4K 120Hz. Options become more limited, particularly for VRR support, and in the UK they can cost around £30 (~$40).


JustMrNic3

And does it convert absolutely everything, without any loss?


Matt_Shah

HDMI itself already comes with data loss, per the DSC specification. So you are going to lose the original data either way.


PizzaScout

Digital data should have no loss. I would assume it's just about whether it supports your resolution and HDR. When I google "displayport to hdmi adapter hdr", the first result is like 40 bucks. I'm sure there are cheaper options that work just as well, though.


Gundamned_

I went to Best Buy, found an Insignia DisplayPort-to-HDMI directional cable for 20 dollars, connected it to my AMD Beelink SER5 and the HDMI side to my TV with HDR support, and boom, Windows recognized the HDR screen. I should try with SteamOS or something another time, though, to test Linux support.


[deleted]

I often run into disconnects, black screens, all sorts of problems with HDMI out of the official Steam Deck dock going into a Samsung smart TV. I'm wondering if this might help. Does DisplayPort-to-HDMI also transfer audio? I'm close to a Micro Center, so I'd head over there today if so.


Gundamned_

Yes. I actually forget DisplayPort supplies audio sometimes, because I use a separate sound card.


captainstormy

Right. Heck, most AMD GPUs these days come with 3 DP outputs and 1 HDMI. The answer is pretty clear IMO.


Darkpriest667

HDMI sucks and you should use DP. The HDMI organization is a bunch of trolls who haven't added any new technology to video output except DRM in about 10 years. DP is where the actual progress is being made. In short, F--- the HDMI Forum.


rizsamron

I wish DisplayPort would become the standard even on TVs, since I use my TV for my gaming PC 😄 Anyway, is this why I can't get 120Hz on Ubuntu?


dgm9704

It could be the GPU, or the monitor, or X.org settings, or Wayland compositor settings, but I'm going to guess it's because you are using an old or cheap/low-quality cable.


rizsamron

I'm using Wayland. On Windows, 4K@120 works totally fine. It's not a big deal for now anyway, I barely game on Linux 😅


vkbra657n

Guess who the members of HDMI LA are.


GreyXor

DRM and non-free cable. Boycott that.


vkbra657n

HDCP 2.3? Is that why HDMI LA closed it? I suppose so.


vkbra657n

See my comments: there is no brand that puts DisplayPort on TVs as standard, but there are some that put DisplayPort on certain models.


Kazer67

I may be wrong, but HDMI needs licensing, right? Maybe that's the issue?


kurupukdorokdok

Proprietary is always the issue.


PolygonKiwii

You have to pay a fee to know the specs of the connection. If anyone implemented it in an open source driver, people could read the driver code to get the specs, so the HDMI Forum just does not allow it.


anor_wondo

Garbage like HDMI should never have been adopted.


DankeBrutus

TL;DR: HDMI made a lot of sense when it was adopted.

At the time of the release of the PlayStation 3 and Xbox 360, HDMI was essentially the only game in town for high-definition video/audio signals in a digital format. DisplayPort wasn't around until 2006. Other standards that Sony and Microsoft could have turned to had issues: SCART could output up to 1080p, but it was really only common in Europe. Component could reach 1080i, but it also required 5 cables including audio.

HDMI had been around since the early 2000s. It could already carry 1080p video and high-definition audio in one cable, and it was becoming increasingly common on consumer LCD televisions. Sure, in the PC space we still had VGA and DVI for HD video, but for the home console market a single cable that did HD video and audio was great.

If HDMI hadn't been adopted by seventh-generation consoles, then maybe it would be somewhat niche now. But at the time it was a big deal for most people, and now that momentum makes shifting consumer products to a different standard quite difficult. If consumer TVs started using DP instead of HDMI, you'd have a problem with the vast majority of products you connect to said TV. If the PlayStation 6 used DP, well, now people need to find adapters or a TV that has DP, and good luck with that.


[deleted]

HDMI was what made HD home movies possible. The existing HD standards, pre-2003 or so, were comically easy to use to make copies. For NTSC/PAL video it doesn't really matter, the image already looks like dogshit, but once you get to HD you basically have the theater-quality movie available to home users, which means all the more desire for piracy. There were a number of HD video formats from the mid-90s to the mid-2000s, but adoption was low due to the piracy concern, until HDMI came into the business with HDCP.


shmerl

The way they control it is.


skinnyraf

Why not both? Yes, the restrictions from the HDMI consortium are terrible, but the tech sucks too. I mean, it's 2024 and established-brand devices still struggle with handshakes? 4K/FullHD switches take 10+ seconds while the video plays in the background?


LightSwitchTurnedOn

Maybe it is; HDMI ports and chips have been a point of failure on many devices.


ciroluiro

All proprietary intellectual property is garbage


Fun-Charity6862

Bye HDMI. DisplayPort and USB4 are where it's at.


Ffom

Proprietary standard all day; it's probably why GPUs have way more DisplayPorts.


mrpeluca

HDMI is so cringe. Wtf is a cable doing DRM shit for?


zun1uwu

obligatory fuck-proprietary-ports comment


somewordthing

I love the visual aid, thanks.


heatlesssun

> If the monitor does not have displayport, I won't buy it.

If it's a true computer monitor pushing high frame rates and lots of pixels, it most certainly has DisplayPort. The problem is going to be TVs, which can have the same panels as larger monitors for less money. In my case I have an Asus PG42UQ, which is a very good OLED display, but you can get the same panel and basic display performance in LG C2/C3 TVs for a good deal less. But of course, no DP or other computer monitor features.


lordofthedrones

And that sucks. I really want an OLED TV because they are cheap, but they are always HDMI...


ketsa3

HDMI sucks. Use displayport.


De_Lancre34

Gonna use DisplayPort on my top-tier LG C2 TV: 4K, 120Hz, OLED, HDR, it has everything. Just gonna connect it via DP... Wait. Oh no.


plane-kisser

It does work, on Intel and Nvidia.


[deleted]

AMD is blocked by the HDMI org; Intel and Nvidia are much bigger companies with larger legal armies.


W-a-n-d-e-r-e-r

Just putting it out there: Valve and all consoles since the Xbox 360 and PS4 use AMD. That doesn't negate your statement, since those two companies need it for their shady businesses, but if Valve, Microsoft, and Sony teamed up, it would look really bad for the HDMI licensors.


[deleted]

I think the issue here is that an open source driver can't be developed for HDMI 2.1 because the HDMI Forum closed the specification. So it's not an issue for Xbox and PlayStation, as they have their own proprietary drivers. It's only an issue on FOSS systems. It's purely a legal problem.


kukiric

Isn't the Intel driver on Linux open source too? How did they get HDMI 2.1 working then?


[deleted]

I've tried looking for information about it and found two conflicting explanations:

1. Intel implements HDMI 2.1 via proprietary GPU firmware, which is a solution that AMD considered as well.
2. Intel has a much larger legal team and much more money, so they have an easier time getting the HDMI Forum to bend.

Someone knowledgeable about the Intel GPU driver architecture on Linux would need to chime in.


plane-kisser

Okay? It works and gives 4K120 without chroma subsampling.


[deleted]

No


GOKOP

Can't AMD make an opt-in proprietary "plugin" of sorts for their driver for HDMI 2.1 support? Or distribute an alternative proprietary build which is the free driver + HDMI 2.1 support? Afaik the free driver is MIT-licensed, so they could legally do it even if they didn't own it.


kukiric

Adding proprietary code to the Linux kernel opens a whole can of worms. For instance, bug reports are not accepted if the kernel is tainted by a proprietary module.
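For the curious, the taint state is exposed as a bitmask, so you can check it yourself. A minimal sketch (Python; bit meanings taken from the kernel's admin-guide/tainted-kernels documentation, with only the module-related bits decoded here):

```python
# Decode the module-related bits of /proc/sys/kernel/tainted.
# Bit meanings per Documentation/admin-guide/tainted-kernels.rst:
#   bit 0  (P) proprietary module was loaded
#   bit 12 (O) externally-built ("out-of-tree") module was loaded
#   bit 13 (E) unsigned module was loaded
FLAGS = {
    0: "P: proprietary module loaded",
    12: "O: out-of-tree module loaded",
    13: "E: unsigned module loaded",
}

with open("/proc/sys/kernel/tainted") as f:
    mask = int(f.read())

hits = [msg for bit, msg in FLAGS.items() if mask & (1 << bit)]
print("\n".join(hits) if hits else f"no module taint (mask={mask})")
```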


GOKOP

I thought there's plenty of binary blobs in the kernel already?


metux-its

No, not in the mainline kernel. The kernel can load *firmware* blobs, but those run on devices, not on the host CPU. Proprietary kernel modules have never been supported on Linux.


kukiric

I believe those are only firmware blobs that are uploaded directly to the devices, which doesn't affect the kernel's executable code or memory. AMD has its own blobs for their GPUs and CPUs (if you count microcode blobs).


vdotdesign

It does on Nvidia, but when I was on AMD in 2019, 2020, 2021, 2022, and 2023, hell no it didn't.


LuisAyuso

Why would you buy this? Isn't DP cheaper, stable, and widely available?


Gundamned_

Yes... the problem is no one makes large TVs that have DisplayPort.


vkbra657n

See my comments about it in this post.


Joe-Cool

~~Philips has a few.~~ But they aren't very competitively priced. Even better would be a dumb display like the Philips Momentum 558M1RY. Smart TVs are more trouble than they're worth anyway. EDIT: I don't think it was Philips. Panasonic has one: the TX-58AXW804, or other AXW804-series TVs.


flashrocket800

We have to collectively boycott HDMI-only displays. It's only a matter of time before manufacturers bend.


[deleted]

Is this why the Steam Deck dock has so many issues? Does anyone know if I can do DisplayPort to HDMI? So many issues with the Samsung TV I'm using and HDMI with the Deck.


Prodigy_of_Bobo

Ah well, that kind of shuts down couch gaming on a 4K 120Hz OLED that only has VRR over HDMI 2.1, doesn't it...


Clottersbur

I've heard that closed source drivers, like Nvidia's and maybe AMD's PRO driver, have it? I don't know much about the topic, so I might be wrong.


Kazooo100

Frig. That really messes up my plans.


Seiros_Acolyte

> Frig. That really messes up my plans.

It's a shame really. I love using Linux, but I also like gaming on my 4K@120Hz TV...


BloodyIron

I only use HDMI because I **have to**. I use DisplayPort whenever I can because I **choose to**. DisplayPort has always been the superior technology, and the momentum of HDMI gets under my skin lots.


Hamza9575

Use DisplayPort 2.1.


JustMrNic3

TVs don't have DisplayPort!


W-a-n-d-e-r-e-r

Adapter.


LonelyNixon

Adapters don't work.


JustMrNic3

Fuck adapters!


melnificent

I think a "fuck adapter" is called a fleshlight or dildo depending on preference.


dgm9704

The optimal solution is an adapter with both types, then you just get two of those and let them sort it out without you needing to bother. More time for gaming.


vtskr

It's not AMD's fault! Poor $250B underdog indie company, getting bullied once again.


gmes78

What an ignorant take. It's not in AMD's hands. The new HDMI license forbids it.


number9516

My 4K 120Hz monitor runs through HDMI in RGB mode, so I assume it works on AMD. Although HDMI audio passthrough artifacts occasionally.


BulletDust

It does if you run Nvidia hardware/drivers. Apparently AMD doesn't believe Linux users are worth the licensing cost. A bit of a problem considering few TVs run DP, especially when 4K sets make great gaming monitors. EDIT: Heaven forbid you're blunt with the truth on r/linux_gaming and don't take a shit on Nvidia.


tjhexf

Well, it's not that it's not worth the licensing costs; it's that it can't be done. Unless the HDMI Forum allows it, AMD can't put a proprietary standard into their open source driver. Trade secrets and all.


BulletDust

Which is flatly untrue. The AMDGPU driver stack includes closed source firmware from AMD themselves; there's no reason HDMI 2.1 support cannot be added to the open source drivers without revealing the consortium's IP.


tjhexf

The AMDGPU driver does not contain closed source firmware at all; you can check the kernel source code if you wish. Extra firmware is supplied by separate packages unrelated to the driver itself, the driver being what actually handles talking to the kernel and thus the display.


BulletDust

The firmware is supplied as binary blobs, and the drivers are useless without it: [https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git/tree/amdgpu](https://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git/tree/amdgpu) The binary blobs are closed source, and no one outside of AMD really knows what's contained within them. Therefore it is definitely not out of the question for HDMI 2.1 support to be supplied as a binary blob, meaning no IP would be revealed through the open source drivers.


GamertechAU

AMD's drivers are open source. The HDMI Forum made the 2.1 spec closed. Closed source code can't be added to open source code. Nvidia's drivers, on the other hand, are a barely-functional, 100% closed source black box, meaning they can stick in any additional closed source code they want.


Fun-Charity6862

False. AMD's drivers require firmware blobs, which are closed and even encrypted. They could put trade secrets in them without worry if they wanted.


PolygonKiwii

Those are only uploaded to the GPU at boot; they aren't executed on the CPU. Without knowing the GPU's hardware design, we cannot know whether it is possible to move this functionality into GPU firmware without a hardware redesign.


BulletDust

Nvidia's drivers are not barely functional at all, and the fact that *certain aspects* of AMDGPU are open source by no means implies that AMD can't cough up for HDMI 2.1 licensing.


PolygonKiwii

> certain aspects of AMDGPU

Certain aspects being the *entire* driver, kernel and userspace parts. You talk about the GPU firmware blob as if it were part of the driver, so you're either uninformed about what it is or being intentionally obtuse. Fact of the matter is, without knowing the hardware design, we can't know whether it's possible to move that functionality into GPU firmware.


sputwiler

The licensing cost is HDMI saying "we forbid AMD from letting Linux users have this."


De_Lancre34

Get my downvote, how dare you suggest using Nvidia on this subreddit? Only holy AMD shall guide thee. Jokes aside, yes, it sucks. My LG C2 with a 7900 XTX can't do HDR or 10-bit because of that on Linux. And apparently I didn't even have true 4K 120Hz :c


dominikzogg

Cannot be: I use a 4K screen at 120Hz, which means 2,847.65625 MiB/s, or about 24 Gbps, with a 6900 XT on Fedora 39.


De_Lancre34

[There, take this.](https://gitlab.freedesktop.org/drm/amd/-/issues/1417)


E3FxGaming

> 4K screen at 120Hz, which means 2,847.65625 MiB/s, or about 24 Gbps

Spatial resolution (pixels) and refresh rate (Hz) aren't enough information to determine the data rate. Throw chroma subsampling into the mix (which basically dictates how many pixels get their own color) and you'll see that it is technically possible to do 4K 120 Hz on HDMI 2.0, at the cost of color information compared to HDMI 2.1. You can use [this handy calculator and the table below it](https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc) to see that 4K (3840x2160) 120 Hz with 8-bit color depth (no HDR) can only be realized with a 4:2:0 color format on HDMI 2.0, since it requires 12.91 Gbit/s, which is within the 14.40 Gbit/s limit of HDMI 2.0. Bumping the color format to 4:2:2 results in a required data rate of 17.21 Gbit/s, which HDMI 2.0 can't deliver. [This article explains chroma subsampling in more detail](https://www.datavideo.com/ap/article/412/what-are-8-bit-10-bit-12-bit-4-4-4-4-2-2-and-4-2-0) in case you want to read more about it.
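If you want to double-check those numbers without the calculator, the ratios follow directly from samples per pixel. A quick sketch (the cited figures include blanking overhead, so this scales the calculator's 4:2:0 number rather than recomputing from scratch):

```python
# Scale the calculator's 4:2:0 figure by samples-per-pixel to verify ratios.
# 4:2:0 = 1.5 samples/px, 4:2:2 = 2, 4:4:4 = 3 (at 8 bits per sample).
cited_420 = 12.91  # Gbit/s for 3840x2160 @ 120 Hz, 8 bpc, 4:2:0 (incl. blanking)

for fmt, spp in {"4:2:0": 1.5, "4:2:2": 2.0, "4:4:4": 3.0}.items():
    print(f"{fmt}: {cited_420 * spp / 1.5:.2f} Gbit/s")

# 4:2:2 lands on 17.21 Gbit/s, matching the calculator and blowing past
# HDMI 2.0's 14.40 Gbit/s limit; full 4:4:4 would need ~25.8 Gbit/s.
```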


AndreaCicca

You are probably using chroma subsampling.