It'll be interesting to see just how much these are basically AI accelerators in disguise, which accidentally do gaming too.
Expecting a 5090 that’s even more expensive than the 4090 and still sold out for 6 months.
As a laptop user it would be nice to go from 8GB of VRAM in my 3070 Ti to something like 16GB, but I wonder what insane power requirements even the mobile parts will have.
I’m already using a mobile workstation type thing with 1h battery at most. That’s the unfortunate reality of mobile CGI / CAD work. I’d still be interested, as I have no alternatives.
At this point, what is the point? Honestly. Why so much processing power? What is the average user going to do to require so much brawn on a graphics card?
I might switch from a 3080 to a Radeon 8xxx next year, depending on the price and performance. The main reason, despite me using Linux, isn't the drivers but the VRAM. I've had several games now where 10GB isn't sufficient at 3840x1600, and it really bugs me.
Why AMD instead of Nvidia? Well, I didn't have that many issues with Nvidia's drivers, but AMD's drivers on Linux are still better. Also, Nvidia's pricing is absurd, and I don't want to take part in that.
Nice to see the VRAM increasing, but the bigger question for me is what the VRAM of the "mainstream" cards (if those even exist in Nvidia's pricing model) is gonna be.