Reminder that Bethesda is owned by Microsoft, the company that insists it's going to end support for Windows 10 in October and wants everyone to move to Windows 11, which doesn't officially support perfectly functional but somewhat old CPUs. So of course they don't care about GPUs too old to support ray tracing.
They make gaming a more and more elitist hobby, and then act surprised when indie games with pixel graphics that run even on potato devices become huge successes.
At some point it was going to happen, this is just earlier than many thought. The real question is “when is AMD going to have an answer to Nvidia when it comes to RT performance?”
How long did they think it would take before RT was a requirement? It was introduced with the GeForce 20 series more than six years ago.
For technology, six years is vintage.
The only people this should affect are people still using GTX 10 and 16 series cards. I dunno what's happening with AMD/Radeon. Since they were purchased by AMD the naming schemes have gotten more and more nonsensical, so I always have a hard time knowing WTF generation a card is from by the model number.
In any case. Yeah, people using 5+ year old tech are going to be unable to play the latest AAA games. And?
Has there ever been a time when a 5+ year old system can reasonably play a modern AAA title without it being a slide show?
They're still perfectly functional and capable of pretty much any modern workload, spec depending... If they can run Win 11 fine (and they should be able to if they can run 10), then the cutoff is arbitrary and will cause more systems to find their way to landfills sooner than they otherwise would have.
Bethesda being owned by Microsoft means they're tainted by the influence of shareholders now. The decay is inevitable. Wall street ruins all studios with time.
2016 was a great reboot of the franchise, it felt like a modern version of the originals. It looked great and ran great.
Then Eternal added pointless wall climbing and loads of skills, and put the whole game on rails. The start of the game is like a two-hour tutorial; good game design doesn't require telling me exactly what to do every 5 minutes.
The graphics got downgraded with some shitty AA added (probably temporal, I can't remember) and the performance was down across the board. The game just didn't feel like DOOM anymore and was a poor continuation of 2016.
Now we have this forced ray tracing, probably going to require DLSS and shit. This has to be a case of Nvidia slipping them cash, right? No way they would be so stupid? It'll probably end with an update removing the requirement a few months down the line.
The first ray tracing GPU came out 7 years ago (the RTX 2080); Eternal came out in 2020. In 2013, the top card was a GTX 680. Eternal lists its minimum spec as a 1050 Ti, and from some quick googling, people trying to run it on a 680 get sub-30fps on low. Of course, just supporting ray tracing doesn't mean it will actually be playable, but Indiana Jones (the only released game that requires RT today) seems to get 50fps on low on a 2080.
Fwiw a few 2080 supers are going for sub 50 bucks on my local buy and sell groups.
They used ray tracing for the hit registration so that's presumably why.
It's a really interesting idea ... presumably that means there are some really flashy guns and there is a very intricate damage system that runs at least partially on the GPU.
"really flashy guns and there is a very intricate damage system that runs at least partially on the GPU."
Short opinion: no, CPUs can do that fine (possibly better) and it's a tiny corner of game logic.
Long opinion: intersecting projectile paths with geometry gains nothing from being moved from CPU to GPU unless you're dealing with a ridiculous number of projectiles every single frame. In most games this is less than 1% of CPU time, and moving it to the GPU will probably reduce overall performance due to the latency costs (...but a lot of modern engines already have awful frame latency, so it might fit right in).
You would only do this if you've been told by higher-ups that you have to, OR if you have a really unusual and new game design (thousands of new projectile paths every frame, i.e. hundreds of thousands of bullets per second). Even detailed multi-layer enemy models with vital components are just a few extra traces; using a GPU to calc that would make the job harder for the engine dev for no gain.
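To put a number on "a tiny corner of game logic": a hitscan trace is just a ray/triangle test repeated over whatever geometry the ray can reach. Here's a minimal CPU sketch of that, assuming made-up Vec3/Tri types and a brute-force loop (a real engine would walk a BVH, making each trace even cheaper); it's an illustration of the cost argument, not code from any actual engine:

```cpp
// Toy hitscan: intersect one ray against a triangle list on the CPU.
// Names (Vec3, Tri, raycast) are invented for this sketch.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                            a.z * b.x - a.x * b.z,
                                            a.x * b.y - a.y * b.x}; }
static float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Tri { Vec3 a, b, c; };

// Moller-Trumbore ray/triangle test: a few dozen float ops per triangle.
// Returns hit distance along the ray, or a negative value on miss.
float raycast(Vec3 origin, Vec3 dir, const Tri& t) {
    const float kEps = 1e-7f;
    Vec3 e1 = sub(t.b, t.a), e2 = sub(t.c, t.a);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return -1.0f;      // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 s = sub(origin, t.a);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return -1.0f;
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return -1.0f;
    float dist = dot(e2, q) * inv;
    return dist > kEps ? dist : -1.0f;
}

int main() {
    // Brute-force a "model" of 10k triangles for one shot.
    std::vector<Tri> mesh(10000, Tri{{0, 0, 5}, {1, 0, 5}, {0, 1, 5}});
    Vec3 origin{0.25f, 0.25f, 0.0f}, dir{0, 0, 1};

    float closest = 1e30f;
    for (const Tri& t : mesh) {
        float d = raycast(origin, dir, t);
        if (d > 0.0f && d < closest) closest = d;
    }
    std::printf("closest hit at %.2f\n", closest);
}
```

Even brute-forcing 10k triangles per shot is on the order of microseconds on a modern CPU; with a proper acceleration structure it's far less, which is why hit traces rarely show up in a profile.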
Fun answer: check out CNlohr's noeuclid. Sadly there's no Windows build (I tried cross-compiling but ended up in dependency hell), but it still compiles and runs under Linux. Physics are on the GPU and the world geometry is very non-traditional. https://github.com/cnlohr/noeuclid
Honestly, I'm not interested in debating its validity, especially with the exact details of what they've done still under wraps ... I have no idea if they are really onto something or not and the details are scarce, but I did find the article I read.
Not disputing you, but hasn't hitscan been a thing for decades? Or is what you're saying a different thing?
Also, I always thought that the CPU and GPU either couldn't communicate with each other, or that it was a very difficult problem to solve. Have they found a way to make this intercommunication work on a large scale? Admittedly I only scanned the article quickly, but it looks like they're only talking about graphics quality. I'd love to know if they're leveraging the GPU for more than just visuals!
It's a different thing. This is pixel-perfect accuracy for the entire projectile. There aren't hitboxes as I understand it; it's literally what the model is on the screen.
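For a rough idea of what "the hit is literally what's on screen" can mean, here's a sketch of the classic object-ID-buffer approach, where each pixel remembers which entity was rendered there and hit registration just reads the pixel under the crosshair. This is only an illustration of the concept with a simulated buffer; it is not necessarily how id does it (they reportedly trace rays against the actual scene geometry on the GPU):

```cpp
// Simulated object-ID buffer lookup under the crosshair.
// All names and values here are invented for the example.
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    // Pretend the renderer filled this: 0 = nothing, otherwise an entity ID.
    std::vector<uint32_t> idBuffer(width * height, 0);
    idBuffer[(height / 2) * width + (width / 2)] = 42;   // an enemy under the reticle

    int cx = width / 2, cy = height / 2;                 // crosshair position
    uint32_t hit = idBuffer[cy * width + cx];
    if (hit != 0)
        std::printf("hit entity %u -- exactly the pixels the player saw\n", hit);
    else
        std::printf("miss\n");
}
```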
W10 is OK: slow, but OK. W11 is so much jank and buggy bullshit. I moved all my games to Linux. With Proton and Vulkan all my games work, including the RTX settings.
That's true. Ray tracing libraries and engines like UE5 are a lot easier to develop on than older engines.
But I'm not sure it's such a simple comparison. 3d acceleration made games look better, and the weakest gpus didn't make your fps tank afaik. Your average gpu these days will tank your FPS in ray tracing and cause awful visual artifacts, either from bad denoising algorithms, or from the upscalers used to hide the bad FPS and bad denoising.
This move reduces development costs, but given that the consumer doesn't get any benefits from it, it's hard not to have the cynical view that this is just greedy cost cutting on Microsoft's part.
Fair, but it's been shown time and time again that most users are either on "intro level" gpus or weaker. Heck, a midrange from 2 years ago is an 8gb card. I'm not sure how they expect to sell this game at all unless it's just planned to be a bundle add-on for the 50xx/90xx series cards.
Currently the most popular GPU according to the Steam survey is a 3060. That plays the only other mandatory-RT game, Indiana Jones, at 60fps on high. A 2080 can play it on low at 50.
The amount of VRAM isn't really the issue; even an extremely good GPU like the 7900 XTX (with 24GB of VRAM) struggles with some ray tracing workloads, because RT needs specially designed hardware to run efficiently.
And I agree that's a good thing and a natural progression/evolution of the tech.
What I don't like is Nvidia's chokehold on the tech, with insane prices (for the card and the power draw) as a result. I know other cards are catching up and all that, but the difference is still huge because some methods and functions are locked to CUDA cores and Nvidia's tech.
I will not be giving Nvidia money in the next 7 years. We will see where they stand once I have to replace my (AMD) GPU.
How do RT-only titles work on consoles? Their RT hardware really isn't that powerful; aren't they supposed to be equivalent to an RTX 2070 at best? It sounds like the graphics difference will be quite large for PC vs consoles.
Consoles have been seriously struggling to run anything past 30fps blurry messes for the past several years. It’s real bad on that side of the fence.
Although PC gamers aren’t much better off, having to buy $900 GPUs every year just to run the latest AAAA blurfest at 30 FPS with AI frame gen on top of upscaling on top of interpolation frame gen.
Both current gen consoles are RT capable, so they'll just use lowered graphical settings, some amount of optimization, and upscaling. Indiana Jones ran great though, way better than you'd expect. I was getting a perfectly smooth 75 fps on a 6750 XT on 1080p high, no upscaling or framegen in use.