It's really ironic and embarrassing. Nvidia is the most valuable chip manufacturer in the world thanks to advances in AI and AI research, which is usually done on Linux systems, and yet it still sucks hard when it comes to Linux support.
Their money is in headless systems, which TBF are much less problematic with Nvidia. Anything CUDA is first class on Linux with Windows as an afterthought.
1000% on the money here. You want encode, decode, calculation, acceleration? They got it. Rock solid, beautiful and simple. You want that shit to actually SHOW UP ON SCREEN? Get the fuck outta here. What do you think nVidia is, a GRAPHICS CARD company!?
I wouldn't call their Windows support stellar, either. There's only one error code for any and all problems and RTXes can be damn finicky if you're unlucky.
Seems to be less about the connector and more about load balancing. The German guy who had 150°C connectors at the PSU side also measured current draws. One wire was doing 22 A (so almost half of the 5090's total consumption) while the other five were just chilling.
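For a sense of scale, here's the back-of-the-envelope math, using assumed round numbers (~575 W board power for a 5090, six 12 V supply wires in the 12V-2x6 cable):

```python
# Rough sanity check on the 22 A measurement. Numbers are assumptions:
# ~575 W board power and six 12 V current-carrying wires in the cable.
BOARD_POWER_W = 575.0
RAIL_VOLTAGE_V = 12.0
WIRE_COUNT = 6

total_current = BOARD_POWER_W / RAIL_VOLTAGE_V   # ~48 A for the whole card
ideal_per_wire = total_current / WIRE_COUNT      # ~8 A each if load were balanced
measured = 22.0                                  # what the one hot wire carried

print(f"total draw        ≈ {total_current:.1f} A")
print(f"balanced per wire ≈ {ideal_per_wire:.1f} A")
print(f"measured wire     = {measured:.1f} A ({measured / total_current:.0%} of the total)")
```

So one wire carrying nearly half the card's current instead of a sixth of it, well beyond the roughly 9-10 A each pin is rated for. That's how you melt connectors.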
Except all those Linux-specific, privacy/modularity-oriented PCs are expensive as hell, with base models starting close to $1k last I checked. They're aimed at the demographic of cushy tech nerds making bank tapping at a keyboard who care about opsec or right to repair and can comfortably afford dropping $1k on a new laptop without thinking twice about whether that money has better uses.
I and many others who don't live in economic la-la land will NEVER be able to justify $1k spent on a laptop just because it has physical kill switches or modular parts and comes preloaded with a good Linux distro. These companies need to touch grass and come down a couple hundred dollars to the $400-500 range, then we can talk. Until that day comes, the guy selling Librebooted ThinkPads on eBay running Pop!_OS or Mint is the better option for those who live with the reality of not having a lot of money.
The last Nvidia GPUs I owned were water-cooled GTX 670s in SLI back when I ran Windows. Ever since then I’ve always chosen AMD or Intel, because of the in-kernel drivers.
Yeah, I had to exchange my laptop for another one with an AMD GPU because Wayland was so broken, and I was getting serious input lag in games, even on X11.
And this was a few months ago. NVK worked a bit better, but compatibility wasn't great, and performance was about 50% of the proprietary driver.
Which distro? You've perhaps lucked out so far. Anyone using Linux for multiple years can attest to how trash the Nvidia drivers are, especially once you compare them to AMD (which, outside of professional applications, usually doesn't need any driver install or setup at all).
I haven't had any problems on Linux Mint with a 3060 Ti aside from some artifacting when I try to do screen recordings (unless I disable flipping).
EDIT: I've had that GPU for about 2 years. I had a 1050 Ti for about 4 years before that.
Actually, now that I think about it, an update did break my graphics at one point, but that might've been partially my fault. I just reverted and reinstalled the same update right after, though, and that worked just fine, so it wasn't a huge deal.
Overall I would say it's been more than 10 years since I've had an actual major graphics issue (having to open xorg.conf).
I'm not that person, but I impulse-switched to Garuda (Arch-based) around 8 months ago with a 3080 and everything has just worked. The only thing I've had problems with is Flatpak, which has been the bane of my existence.
Just upgraded my EndeavourOS (Arch, btw) and saw an Nvidia driver update. Reboot, KDE came up successfully, OK, good. Play a game, stuttering right on the title screen. 😑
From my idiot troubleshooting with Nvidia in the past, I disabled "Allow screen tearing in fullscreen windows." Test, runs perfectly now. The funny thing is that I had to enable that option in the past to make the same stuttering go away. 🤷‍♂️
Someone suggested maybe that option doesn't matter and I just had to start the game multiple times because of shader cache? IDK, but I do know that my next card will be AMD.
Modern Proton versions should compile shaders beforehand; I know what you describe from when it had to be done in real time. If it happens again, try clearing the shader cache in the Steam settings or switching to a newer Proton version.
I didn't have any luck with PRIME. On my work laptop, I want to use Intel graphics when using the laptop screen, and Nvidia only when plugged into external monitors. Couldn't get it working properly at all; the external monitors only work properly when hybrid graphics is disabled in the BIOS.
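For what it's worth, per-application offload under PRIME usually comes down to two environment variables. Here's a minimal Python launcher sketch (assuming the proprietary driver with render offload support; the command names are just examples). Note that this only picks which GPU renders a given app, not which GPU drives the external outputs, which sounds like the part that actually broke for you.

```python
# Minimal PRIME render offload launcher sketch. Assumes the proprietary
# Nvidia driver with render offload support; it only affects which GPU
# renders the launched app, not which GPU the external monitors hang off.
import os
import subprocess
import sys

def run_on_nvidia(cmd):
    """Run `cmd` with the Nvidia render offload environment variables set."""
    env = os.environ.copy()
    env["__NV_PRIME_RENDER_OFFLOAD"] = "1"       # ask for the dGPU
    env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"  # use the Nvidia GLX vendor library
    return subprocess.run(cmd, env=env).returncode

if __name__ == "__main__":
    # e.g. `python3 prime_run.py glxinfo -B` should report the Nvidia GPU as the renderer
    sys.exit(run_on_nvidia(sys.argv[1:] or ["glxinfo", "-B"]))
```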
I'm wanting to switch my gaming PC from Windows to Mint. I'm new to Linux, which is why I'm going with Mint. What GPU should I buy for my use? (RTX is not important to me. I just need it to play my games well.)
I can vouch for Bazzite and always will as long as they keep up the solid work. Running it on a laptop (GNOME variant for easier fingerprint login) and a desktop (KDE, cuz I just prefer KDE day to day). It just works™️.
RX 9070 when it comes out. Mint is good, but there are so many good options. I suggest using CachyOS and trying out all the DEs so you can pick something you like, although you don't have to stick with CachyOS if you don't want to.
I swapped my gaming PC from Windows to Pop!_OS a few months ago and driver installation has been a seamless experience with an Nvidia GPU / AMD CPU.
I am this close to proposing to swap GPUs with my friend who’s coming over this weekend for help building his PC. He’s using a 6900 XT and I’m using a 3080 12GB. Technically it’ll be a downgrade, but I’ll be free of fucking Windows.
Can I ask why they're still the de facto standard? I run AMD for both CPU and GPU and don't consider purchasing otherwise when researching components (other than as a baseline for comparison and relative cost).
NVidia got there early with their CUDA API.
CUDA has been around for nearly two decades, which enabled all sorts of crazy GPU usage beyond just graphics.
Due to that, NVidia held the datacenter/professional scene exclusively for a long time.
As a result, their professional cards and related drivers have been industry standard.
I have no doubt that AMD is better, but so much (non-mainstream) software is built against NVidia drivers, CUDA, etc., that it will be slow to change until the cost of just sticking with NVidia outweighs the cost of implementing the same for AMD.
The classic "Nobody ever got fired for buying IBM"
Iktf, I had my 1060 6GB die and for a while I was gaming on a 750 Ti lol.
Recently my Crucial 1TB SATA SSD suddenly died: no errors, no SMART, no detection on any computer or via a USB adapter, only coil whine. It took with it exactly all the things I never backed up, because SSDs are supposed to be good :(
SSDs at the end of their lifespan do tend to fail more gracefully than HDDs, as even when they become fully worn and unable to take new writes, they will often still allow reads.
But, that depends on the specific type of failure.
I had an SSD fail in the same way as yours, where the controller chip or something along the path there died, and it went from fully working to toast in an instant.
Some drives are more reliable, some drives are less reliable, but the only rule is that any drive can break, at any time, old or new.
Yeah, I should know better, but I'm too lazy haha. I didn't lose anything completely irreplaceable, but my beautiful bind9 local DNS zone, written and annotated by hand, is gone.
Plus I have basically nowhere to back up to.
At least the first thing I did when reinstalling Debian was set up an rsync cron job to fetch /home, /etc, and some other select dirs (roughly the kind of thing sketched at the end of this comment). But that's backing up to a Raspberry Pi with a busted microSD slot, running off a rather dodgy USB-enclosed 120 GB mSATA SSD that already failed once before, originally transplanted from a busted MSI gaming laptop I sold for coke cash in the mid-2010s.
Not ideal. That Pi also periodically shits the bed. It's exposed to the elements a bit because it's also in use in two DIY IoT projects.
Is there a decent non-shit non-megacorp-empowering affordable way of doing off-site backups on a small scale?
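In case it helps anyone picture it, the cron job is basically a pull-style rsync run from the Pi, something like this sketch (the host alias, dirs, and destination path are made-up placeholders):

```python
#!/usr/bin/env python3
# Pull-style rsync backup sketch, meant to run from cron on the backup box.
# SOURCE_HOST, SOURCE_DIRS, and DEST_ROOT are placeholder examples.
import datetime
import subprocess

SOURCE_HOST = "desktop"            # ssh alias for the machine being backed up
SOURCE_DIRS = ["/home/", "/etc/"]  # dirs worth keeping
DEST_ROOT = "/mnt/backup"          # e.g. the USB-enclosed SSD on the Pi

def backup():
    for src in SOURCE_DIRS:
        # -a keeps permissions/times, -z compresses over the wire,
        # --delete mirrors removals so the copy tracks the source.
        subprocess.run(
            ["rsync", "-az", "--delete", f"{SOURCE_HOST}:{src}", f"{DEST_ROOT}{src}"],
            check=True,
        )
    print(f"backup finished {datetime.datetime.now().isoformat()}")

if __name__ == "__main__":
    backup()
```

Scheduling is just a plain crontab entry on the Pi, e.g. `0 3 * * * /usr/local/bin/pull_backup.py` (again, a placeholder path).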
The good thing about an Nvidia driver update is that it forces you to take a backup. And hey, I figured out how apt-file works just so I could figure out where the Nvidia driver put nvidia-settings (as it forgot to put it somewhere $PATH could find it, and no .desktop files were made).
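For anyone else hunting for a stray binary like that, this is roughly what the apt-file dance looks like; a small Python wrapper sketch, and the example output is just what I'd expect on a Debian-ish system, not verified:

```python
# Rough sketch of using apt-file to find which package ships a file and where.
# Needs `apt-file update` to have been run once; example output is a guess.
import subprocess

result = subprocess.run(
    ["apt-file", "search", "bin/nvidia-settings"],
    capture_output=True, text=True, check=True,
)
# Matches come back as "<package>: <path>", e.g.
# "nvidia-settings: /usr/bin/nvidia-settings"
print(result.stdout)
```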