"Easy fix with my massive unparalleled intellect. Just turn off the sensor"
If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might leave a GPU at death's door! Don't need that shit, just buy new ones every 2 years, you poors!
If you buy an Nvidia GPU, you are part of the problem here.
The only good thing about Nvidia this generation is that their prices on the low-end cards were lower than expected, forcing AMD to cut the lunatic pricing of their 9070 and other cards in half.
Surely if the card is damaged due to overheating, the customer won't be blamed since they can't keep track of the hottest part of the card, right? Right?
Yeah NVIDIA is a bullshit company and has been for a while.
AMD and Intel need to get their raytracing game up so they become real competitors to NVIDIA, especially now that more games require raytracing.
This is incorrect. The new Indiana Jones game requires raytracing, as does the upcoming Doom game. Whether you like it or not, traditional rasterized graphics are starting to be phased out, at least across the AAA gaming space. The theoretical workload benefits for developers make it pretty much an inevitability once workflows and optimizations are figured out. Though I doubt rasterized graphics will ever completely go away, much like pixel art games are still very much a thing decades after becoming "obsolete".
The drop in clocks in certain situations that a lot of outlets are "conveniently" attributing to CPU limitations has all the hallmarks of throttling... It's hard to criticise the incumbent monopoly holder when they have a history of blacklisting outlets that espouse consumer advocacy.
I've never bought Nvidia, but they become more like Apple every day. Why be consumer friendly for niche PC builders? The average gamer already associates Nvidia with performance, so it's time to rely on good ol' brand loyalty!
The problem is, it's not just an association. NVIDIA cards are the fastest cards hands down. I wish Intel and AMD would provide competition on the high end, but they just don't do it.
Even worse, the best next-gen AMD GPU won't even beat AMD's best last-gen GPU; they say this themselves.
To me, buying Nvidia for performance is like buying an APC as a daily driver for work because of its safety rating. The long-term cost does not seem at all worth it.
I wonder if there was some other reason for this removal, e.g. some change in this generation that made the hotspot sensor redundant.
But yeah it's far more likely to be for the reasons you outlined. Absolutely diabolical.
The hotspot temp sensors are one of the most critical diagnostic sensors an end user can have. When the thermal interface material begins to degrade (or leak out of the rubber gasket, in the case of the 5090's liquid metal), your package temp may only go up a few degrees C, but your hotspot may increase by 10-20 C or more. That delta indicates problems and is almost certainly one of the leading causes of dead and crashing GPUs; it's also the easiest to detect and fix (a quick monitoring sketch is below).
Removing this quite literally has zero engineering reason beyond:
- hiding from reviewers the fact that the 5090 pulls too much power and runs too hot for a healthy lifespan, even with liquid metal and the special cooler
- fucking over consumers so they can no longer diagnose their own hardware
- ensuring more 5090s die rapidly, through lack of critical monitoring, so that Nvidia's funny number can keep going up as people re-buy GPUs that cost more than some used cars every 2 years.
The sensors are still definitely there. They have to be for thermal management or else these things will turn into fireworks. They're just being hidden from the user at a hardware level.
This isn't even counting the fact that hotspot also usually includes sensors inside the VRMs and memory chips, which are even more sensitive to a bad TIM application and to running excessively warm for long periods of time.
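For anyone who wants to watch for that kind of drift themselves, here's a minimal sketch of the delta check described above, in Python with pynvml (pip install nvidia-ml-py). Assumptions to be clear about: public NVML only reports the plain GPU/edge temperature, so the hotspot value has to come from whatever tool still exposes it on your card (HWiNFO, GPU-Z, etc.), and the 15 C warning threshold is just a rule of thumb taken from the 10-20 C figure above, not an official spec.

```python
# Minimal sketch: compare the GPU edge/package temperature (still exposed via
# public NVML) against a hotspot reading obtained elsewhere (e.g. a GPU-Z or
# HWiNFO log), and warn when the gap suggests degrading TIM or bad mounting.
# Assumptions: pynvml installed (pip install nvidia-ml-py), an NVIDIA GPU at
# index 0, and a monitoring tool that still reports a hotspot value.
import pynvml

DELTA_WARN_C = 15  # rule-of-thumb threshold based on the 10-20 C figure above


def edge_temp_c(gpu_index: int = 0) -> int:
    """Read the plain GPU temperature that public NVML still reports."""
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
        return pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    finally:
        pynvml.nvmlShutdown()


def check_delta(hotspot_c: float, gpu_index: int = 0) -> None:
    """hotspot_c is whatever your monitoring tool labels 'Hot Spot' / junction."""
    edge = edge_temp_c(gpu_index)
    delta = hotspot_c - edge
    print(f"edge={edge} C  hotspot={hotspot_c:.0f} C  delta={delta:.0f} C")
    if delta >= DELTA_WARN_C:
        print("Warning: large edge-to-hotspot gap; check TIM / cooler mounting.")


if __name__ == "__main__":
    # Example: hotspot value copied from a card that still exposes it.
    check_delta(hotspot_c=88.0)
```

On cards where the hotspot readout has been removed entirely, the second half of this obviously can't work, which is exactly the problem being complained about here.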
It looks bad with the insane TDP these cards run at now. They could cut 33% of it and probably lose only 5-10% performance depending on the SKU, maybe even less.
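If anyone wants to try that tradeoff themselves, here's a rough sketch of capping the power limit with pynvml; the nvidia-smi equivalent is roughly `nvidia-smi -i 0 -pl <watts>`. Assumptions: you have root/admin rights, the card allows a limit that low (the value is clamped to the constraints NVML reports), and the 33% / 5-10% numbers above are an estimate, not anything measured here.

```python
# Rough sketch: cap the GPU's power limit at ~67% of its default TDP, clamped
# to the range the driver allows. Needs root/admin. NVML values are milliwatts.
# Assumptions: pynvml installed (pip install nvidia-ml-py), GPU at index 0.
import pynvml


def cap_power(fraction: float = 0.67, gpu_index: int = 0) -> None:
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
        default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = min(max(int(default_mw * fraction), min_mw), max_mw)
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
        print(f"Power limit set to {target_mw / 1000:.0f} W "
              f"(default {default_mw / 1000:.0f} W)")
    finally:
        pynvml.nvmlShutdown()


if __name__ == "__main__":
    cap_power()
```

Whether ~67% of stock TDP really only costs 5-10% depends entirely on the SKU and workload, so benchmark before and after.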
Unlikely, as the hotspot sensors and detection logic are baked into the chip's silicon and its microcode; AIBs can only change the PCB around the die. I'd almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.
Also, given the way Nvidia operates, if an AIB deviates from Nvidia's mandated process, they'll get blackballed and put out of business. So they won't. Daddy Jensen knows best!