As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn't done anything to get it built into TVs.
Is there a reason DisplayPort has so many connection issues specifically on port replicators (docking stations), or a way to prevent them?
In corporate environments I find, so many times, that you plug everything in over and over, unplug over and over, and check the connection a million times before finally turning everything off one last time, holding the power button on everything (kind of like an SMC reset), and then booting it all back up exactly like you originally did, and only then does it come up. Is this a result of the devices trying to remember a previous setup, or is there an easy way to avoid it?
I've hooked up dozens of them and still ran into issues when a family member brought a setup home to work when they were sick last week.
We use Dell WD-19 docks. Not sure if you use similar. Updated dock firmware and laptop drivers made a difference for us with connection issues. Sometimes you gotta perform a reset on the dock to make it behave (disconnect dock power and USB-C and hold the power button for just over 15 seconds). Sometimes the laptop NVRAM needs to be reset instead (for Dell, with the laptop off, disconnect all devices and power, then hold the power button for just over 30 seconds). Overall, though, no huge issues with DP specifically if the dock and laptop firmware are up to date. Third-party docks/replicators definitely have way more issues, though.
This is really frustrating. This is the only thing holding Linux gaming back for me, as someone who games with an AMD GPU and an OLED TV. On Windows, 4k120 works fine, but on Linux I can only get 4k60. I've been trying to use an adapter, but it crashes a lot.
AMD seemed to be really trying to bring this feature to Linux, too. Really tragic that they were trying to support us, and some anti-open source goons shot them down.
I've found that the issue, in my experience, is that X11 only supports a max of 4k60, but Wayland supports 4k120 and beyond. I don't think the cable matters, as the same cable I'm using works on Windows at 4k160.
It's a matter of cable bandwidth. 4k120 4:4:4 requires more bandwidth than HDMI 2.0 can provide. You can drop down to 4:2:0, but that's a pretty bad experience and ruins the image quality.
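For anyone who wants the actual numbers behind that, here's a rough back-of-the-envelope sketch in Python. The blanking figures and coding overheads are my own assumptions (a common CTA-style 4K raster, 8 bpc), so treat it as an illustration rather than a spec quote:

```python
# Back-of-the-envelope check: does 4k120 4:4:4 fit into HDMI 2.0?
# Total raster size (active pixels + blanking) is an assumption based on the
# common CTA-861 4K timing (4400 x 2250); exact figures vary by mode.

TOTAL_W, TOTAL_H = 4400, 2250   # 3840x2160 active plus blanking (assumed)
REFRESH_HZ = 120
BITS_PER_PIXEL = 24             # 8 bpc RGB / YCbCr 4:4:4

required_gbps = TOTAL_W * TOTAL_H * REFRESH_HZ * BITS_PER_PIXEL / 1e9

# Effective payload after line coding: 8b/10b for HDMI 2.0 TMDS, 16b/18b for HDMI 2.1 FRL.
hdmi_20_payload_gbps = 18.0 * 8 / 10    # ~14.4 Gbps
hdmi_21_payload_gbps = 48.0 * 16 / 18   # ~42.7 Gbps

print(f"4k120 4:4:4 needs ~{required_gbps:.1f} Gbps")
print(f"HDMI 2.0 payload ~{hdmi_20_payload_gbps:.1f} Gbps -> fits: {required_gbps <= hdmi_20_payload_gbps}")
print(f"HDMI 2.1 payload ~{hdmi_21_payload_gbps:.1f} Gbps -> fits: {required_gbps <= hdmi_21_payload_gbps}")
```

Under those assumptions you land somewhere around 28-29 Gbps, which is comfortably over what HDMI 2.0 can carry but well within HDMI 2.1.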
I've been using an adapter cable, but it's really flaky; I don't know if it's a bad cable or what. But a normal HDMI cable just plain works on Windows, since the Windows AMD driver supports HDMI 2.1.
I'm a bit confused by your comment. I have a 120 Hz monitor and use an AMD GPU on Linux without issues, connected from the DisplayPort output on my GPU to the HDMI port on my monitor (because Samsung doesn't enable DDC on the DisplayPort input for some reason).
Just because they're still trying to use HDMI to prevent piracy? Who in fuck's name is using HDMI capture for piracy? On a 24fps movie, that's 237MB of data to process every second just for the video. A 2-hour movie would be 1.6TB, and with audio it would likely come to over 2TB.
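To put some hedged numbers on that: the exact rate depends entirely on which pixel format you assume for the raw capture, so here's a quick Python sketch rather than a single authoritative figure:

```python
# Rough numbers for what raw (uncompressed) HDMI capture produces.
# The exact rate depends on the pixel format you assume, so these are
# illustrative figures, not gospel.

def raw_rate_mb_s(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in MB/s."""
    return width * height * fps * bits_per_pixel / 8 / 1e6

formats = {
    "4K 24fps, 8-bit 4:2:0 (12 bpp)": (3840, 2160, 24, 12),
    "4K 24fps, 8-bit RGB (24 bpp)":   (3840, 2160, 24, 24),
}

for name, (w, h, fps, bpp) in formats.items():
    rate = raw_rate_mb_s(w, h, fps, bpp)
    two_hour_tb = rate * 2 * 3600 / 1e6   # MB/s * seconds -> MB, then -> TB
    print(f"{name}: ~{rate:.0f} MB/s, ~{two_hour_tb:.1f} TB for a 2-hour movie")
```

However you slice it, you're looking at hundreds of MB per second and terabytes per film before any compression, which is the point: nobody sane pirates this way.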
I've got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
The CEOs of the media companies are all fucking dinosaurs who still think VCRs should have been made illegal. You will never convince them that built-in copy protection is a dumb idea and a waste of time.
Setting aside that HDMI capture is simply an awful way of obtaining that data, it's even more pathetic that the "protection" can be defeated by a $30 capture card on Amazon...
Most people don't pirate 4K media due to file size and internet speed constraints. Most people pirate 1080p video. There's also the prospect of people pirating live television, which HDMI capture would be perfect for.
Then most people need to get a better ISP. My crappy $60/mo fixed 5G can download an entire 4K film in under 10 minutes or start streaming it within a second. Y'all should see if there are any options beyond cable and DSL in your town. You might be pleasantly surprised what's available these days.
You have to get there early to have much chance of getting a full 60GB+ 4K Blu-ray rip in a timely manner, but the ~15GB x265 rips are indistinguishable to me.
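If you want to sanity-check the "film in under 10 minutes" claim against those file sizes, here's a trivial sketch; the 10-minute window and the ~60 GB / ~15 GB sizes come from the comments above:

```python
# How fast does the connection need to be to pull a 4K rip in 10 minutes?
# File sizes taken from the comments above (~60 GB full Blu-ray rip, ~15 GB x265 rip).

def needed_mbit_s(size_gb, minutes):
    """Sustained download speed in Mbit/s to finish in the given time."""
    return size_gb * 8 * 1000 / (minutes * 60)

for size_gb in (60, 15):
    print(f"{size_gb} GB in 10 minutes needs ~{needed_mbit_s(size_gb, 10):.0f} Mbit/s sustained")
```

So roughly 800 Mbit/s sustained for a full Blu-ray rip versus about 200 Mbit/s for the x265 version, which is why the smaller rips are the realistic option on most connections.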
You can pirate media that uses that new Blu-ray DRM by plugging a capture card into the overpriced compatible Blu-ray player and recording the video. It's also a way to transfer saved content off a DVR, since their hard drives are always encrypted (do those still exist?). The video stream on all this stuff is encrypted with HDCP to prevent exactly this, but HDCP strippers exist, and it seems it's still possible to buy them, even on Amazon. Stock up before they get banned. Frankly, I'm surprised they aren't banned already.
If you were hoping to see HDMI 2.1+ on Linux with AMD + Mesa at some point, you're out of luck right now, as it's simply not going to happen.
There's been a bug report titled "4k@120hz unavailable via HDMI 2.1" open on the Mesa GitLab for a few years now, with lots of comments and chatter about the issue.
In an update on the bug report, AMD engineer Alex Deucher commented: "The HDMI Forum has rejected our proposal unfortunately."
So if you're on Linux, it's going to continue to be best to buy hardware that uses DisplayPort.
On the NVIDIA side though, it seems like it may not be an issue, as developer Karol Herbst wrote on Mastodon: "Even though AMD might not be able to add support for HDMI 2.1, nouveau certainly will as Nvidia's open source driver also supports HDMI 2.1 so there is no reason to believe that at least some drivers can't support HDMI 2.1.
It's quite backwards, but apparently having all the logic inside firmware (like Nvidia does) will probably help us implementing support for HDMI 2.1"
Piracy being easier is the only risk. Once again they're ruining the experience of legitimate customers to try to stop something they've had no success at even slowing down.
Naah, DisplayPort carries everything: audio, USB, video, etc. Version 1.2 even allows daisy-chaining displays, so you don't need a bunch of cables going to your PC. When it comes to audio, version 1.4 supports a 1536 kHz maximum sample rate at 24 bits and up to 32 individual audio channels. Scary good! Overall it's a significantly better protocol.
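To put that audio spec in perspective, here's a small sketch, assuming (as I read the spec) that the 1536 kHz figure is the aggregate sample rate across all channels rather than per channel:

```python
# DisplayPort 1.4 audio ceiling, assuming 1536 kHz is the aggregate sample
# rate across channels (e.g. 32 channels x 48 kHz, or 8 channels x 192 kHz).

AGGREGATE_RATE_HZ = 1_536_000
BITS_PER_SAMPLE = 24

payload_mbit_s = AGGREGATE_RATE_HZ * BITS_PER_SAMPLE / 1e6
print(f"32 ch x 48 kHz  = {32 * 48_000:,} Hz aggregate")
print(f" 8 ch x 192 kHz = {8 * 192_000:,} Hz aggregate")
print(f"Max uncompressed audio payload: ~{payload_mbit_s:.1f} Mbit/s")
```

That's only on the order of 35-40 Mbit/s of audio, a rounding error next to the tens of Gbit/s of video the same cable carries.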
Linux never ran on the Commodore 64 (1982). That was way before Linux was released by Linus Torvalds (1991).
I'd also like to point out that we do all rely on non-proprietary protocols. Examples you used today: TCP and HTTP.
If we didn't have free and open source protocols we'd all still be using Prodigy and AOL. "Smart" devices couldn't talk to each other, and the world of software would be 100-10,000x more expensive and we'd probably have about 1/1,000,000th of what we have available today.
Every little thing we rely on every day, from computers to the Internet to cars to planes, only works because it isn't built on exclusive, proprietary protocols. Weird shit like HDMI is the exception, not the rule.
History demonstrates that proprietary protocols and connectors like HDMI only stick around as long as they're convenient, easy, and cheap. As soon as they lose one of those properties a competitor will spring up and eventually it will replace the proprietary nonsense. It's only a matter of time. This news about HDMI being rejected is just another shove, moving the world away from that protocol.
There actually is a way for proprietary bullshit to persist even when it's the worst: When it's mandated by government.
Why? Most software wasn't proprietary before companies realized they could make more money at your expense (not all the profit is going into making a better product).
Given the choice between an uncomfortable dormitory and a comfortable jail, at least the residents of the former can improve their living areas.