A lot of the implications for ray tracing are on the dev-side of things. It's a bit hard to explain without going into technical details.
Essentially, getting light to look "right" is very, very hard. To do it, devs employ a lot of different techniques. One of the older techniques is baking the light on static objects, essentially pre-rendering where light goes and how it bounces. This has been done for a long time; even in Half-Life, the lights are baked for static geometry. So in a way, we have been using ray tracing in games for a long time. However, it isn't real-time ray tracing, as the information gets stored in lightmap textures, so there is no performance impact other than storing the texture in RAM/VRAM and drawing it together with the others.
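The baking idea above can be sketched in a few lines. This is a hypothetical toy in 2D, not any engine's actual pipeline: the light position, occluder, and falloff are all made up, but the key point holds, all the ray work happens once at build time and runtime is just a texture lookup.

```python
# Toy 2D lightmap bake: one shadow ray per texel at build time,
# stored in a grid; runtime "lighting" is just an array read.
import math

LIGHT = (5.5, 5.5)            # made-up point light position
OCCLUDER = ((2.0, 2.0), 1.0)  # made-up circle (center, radius) blocking rays

def blocked(p, q, circle):
    """True if the segment p->q passes through the occluder circle."""
    (cx, cy), r = circle
    px, py = p
    qx, qy = q
    dx, dy = qx - px, qy - py
    # project circle center onto the segment, clamp to [0, 1]
    t = max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / (dx * dx + dy * dy)))
    nx, ny = px + t * dx, py + t * dy
    return math.hypot(nx - cx, ny - cy) < r

def bake_lightmap(size):
    """Offline step: trace a shadow ray per texel, apply distance falloff."""
    lightmap = []
    for y in range(size):
        row = []
        for x in range(size):
            if blocked((float(x), float(y)), LIGHT, OCCLUDER):
                row.append(0.0)                  # texel is in shadow
            else:
                d = math.hypot(LIGHT[0] - x, LIGHT[1] - y)
                row.append(1.0 / (1.0 + d * d))  # inverse-square-ish falloff
        lightmap.append(row)
    return lightmap

lm = bake_lightmap(8)
# runtime cost is only the lookup, exactly like sampling a lightmap texture:
print(lm[5][5] > lm[0][0])  # texel near the light is brighter than a shadowed one
```

The expensive part (the `blocked` ray test) never runs during gameplay, which is exactly why baked lighting was affordable even on Half-Life-era hardware.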
The inherent problem of that technique is that it only really works for static geometry. If you move your light or any objects in the scene, your lightmaps will no longer match. To solve this, there are mixed modes which use real-time lights, dynamic lightmaps, and other tricks. However, these are often subject to problems and/or the limitations of real-time lights. The problems with real-time lights: you can only have a limited number before taking a serious performance hit, especially if the lights cast shadows. Soft shadows, shadows over big areas, and very detailed shadows are also extremely hard to do without some advanced tricks. Also, ambient occlusion and global illumination are not something you can just give lights (there are screen-space GI and AO, but they don't look good in all circumstances, and you have limited control. Some engines also have other techniques for real-time GI.).
Also, there is the problem of baked light affecting dynamic objects, such as characters. This has been solved by baking so-called "light probes". These are invisible spheres that store the light data, and the closest data then gets applied to characters and other dynamic objects. This again has some problems, as it's hard to apply multiple light probes to the same object, so lighting might be off. Also, light direction is not accurate, which makes normal maps look very flat in this light, and local shadows do not work using light probes. The same is done for reflections using static reflection probes. These are essentially 360° "screenshots" storing the reflection at that point in space. This, however, costs disk space/RAM/VRAM, and it will not hold any information for moving objects (that's why sometimes you can't see yourself in the mirror in games). Also, the reflections sometimes look "out of place" or distorted when the reflection probe is too far from the reflecting surface (again, these cost VRAM and RAM, so you don't want to place one in front of every single reflective surface). It takes a lot of time to find the right balance. For the rest, screen-space reflections are usually used, as any other real-time reflection is extremely costly: you essentially render the whole scene again for each local reflection. Screen-space reflection is an advanced technique that works very well for stuff like reflective floors, but you will quickly see its downsides on strongly mirrored surfaces, as it lacks information that is not on the screen. Some games, like Hitman, mix these techniques extremely well.
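The probe lookup described above can be sketched roughly like this. It's a toy illustration with made-up probe positions and colors, blending baked probe data by inverse distance (real engines typically use tetrahedral interpolation or spherical harmonics, which is also why a character between probes can look "off"):

```python
# Toy light probes: baked ambient colors at fixed points, blended by
# inverse-distance weight and applied to a moving character.
import math

# made-up probe data: (position, baked RGB color)
PROBES = [
    ((0.0, 0.0, 0.0), (1.0, 0.9, 0.8)),   # warm probe near a lamp
    ((10.0, 0.0, 0.0), (0.2, 0.3, 0.6)),  # cool probe in shadow
]

def probe_light(pos):
    """Blend baked probe colors by inverse distance (plus epsilon)."""
    weights, total = [], 0.0
    for probe_pos, _ in PROBES:
        w = 1.0 / (math.dist(pos, probe_pos) + 1e-6)
        weights.append(w)
        total += w
    color = [0.0, 0.0, 0.0]
    for w, (_, c) in zip(weights, PROBES):
        for i in range(3):
            color[i] += (w / total) * c[i]
    return tuple(color)

# a character standing next to the first probe picks up mostly warm light;
# halfway between the probes, the result is just the average of the two
near = probe_light((0.1, 0.0, 0.0))
mid = probe_light((5.0, 0.0, 0.0))
print(near, mid)
```

Note how the blended result carries no directional information at all: that is the source of the "flat normal maps" problem mentioned above.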
Coming back to lighting, there are now better techniques, used for example by Unreal and some other engines (and now Unity, experimentally). The light gets stored in more predictable data structures, such as 3D textures. This way, you can store the direction of all light in each cell, and the light then gets applied to objects passing through those cells. This looks pretty good and the runtime cost is fairly low, but the storage cost forces a tradeoff between texture resolution and fidelity. These textures cost a lot of VRAM to store and, without advanced techniques and tricks, have their own limits (e.g. on scene size). It also takes a lot of time to regenerate them each time you change the scene, and it doesn't eliminate all the problems mentioned above, like reflections, moving lights, etc.
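A minimal sketch of the 3D-texture idea, with made-up values: light intensity baked into a coarse grid and sampled with trilinear interpolation at runtime. Real engines store direction or spherical-harmonic data per cell rather than a single scalar, but the sampling logic is the same:

```python
# Toy light volume: brightness baked into a 3D grid ("3D texture"),
# sampled at runtime with trilinear interpolation.

def make_volume(n):
    # made-up bake: brightness increases along x, as if a window sat on that wall
    return [[[x / (n - 1) for z in range(n)] for y in range(n)] for x in range(n)]

def sample(volume, x, y, z):
    """Trilinear interpolation between the 8 surrounding cells."""
    n = len(volume)

    def axis(v):
        v = min(max(v, 0.0), n - 1.000001)  # clamp inside the grid
        i = int(v)
        return i, v - i                      # cell index, fractional part

    xi, xf = axis(x)
    yi, yf = axis(y)
    zi, zf = axis(z)
    acc = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((xf if dx else 1 - xf) *
                     (yf if dy else 1 - yf) *
                     (zf if dz else 1 - zf))
                acc += w * volume[xi + dx][yi + dy][zi + dz]
    return acc

vol = make_volume(4)
print(sample(vol, 1.5, 0.0, 0.0))  # halfway between cells 1 and 2 along x
```

The storage tradeoff is visible in `make_volume`: memory grows with the cube of the resolution, so doubling fidelity costs eight times the VRAM, which is exactly the resolution-versus-fidelity tension mentioned above.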
Specifically, there is the problem of character lighting itself. Using light probes on characters usually looks pretty bad, as it removes a lot of the detail of advanced skin shaders. Even with the above-mentioned techniques, character lighting is still extremely hard to do. There are also some other problems, like ambient shadow in already-shadowed areas, and balancing character lighting against scene lighting.
For that reason, most AAA games use separate light rigs for characters. Essentially floating lights that ONLY affect the character and move with them. When the mixing with the scene lights is done right, the rig adapts to the current situation in terms of light direction, color, and intensity. If you look in most AAA games, you can often see situations where rim-light comes from a direction where there is no actual light source. However, this way, the devs and artists have full control over lighting the characters. Essentially like a real movie production would have, but without the limitation of the real world.
Now, ray tracing as you know it right now is not quite there yet, but eventually, ray tracing is the solution to a lot of the problems mentioned above. Things like polygon density, light count, global illumination, ambient occlusion, light direction, reflections, and much more are simply "there" for you to use. This doesn't mean it will automatically make everything look great, but given the overwhelming number of different tricks that current-gen games have to use to look good, it opens a whole new world of possibilities.
Also, while it will not directly influence the final game, it will eventually simplify things for devs, so that more time can be invested into other things.
At its current level of adoption, ray tracing is more like a gimmick, because devs will still focus most resources on the current ways of lighting. This is because most people don't have cards with sufficient ray-tracing capability. So for the moment, I agree that the performance hit is not worth it. However, eventually it might become the default way to render games. While we are not quite there in terms of performance, I think things might eventually become a lot more consistent and predictable for ray tracing.
YES, thank you! You saved me a lot of writing haha
This is spot on and the real advantage of ray tracing - when it becomes the norm it'll look better, provide effects that are extremely difficult or impossible and do so with minimal dev pain.
Awesome and great explanation for a layperson. Because the industry has been faking lighting for so long and lighting is quite important, the industry has become incredibly good at it. But it also takes a lot of development time that could be spent on adding more content or features. There's a reason the opinion about ray tracing is extremely positive within the game development industry. But also, nobody's expecting it to become the norm overnight, and the period with hybrid support for both ray tracing and legacy lighting is only just starting.
Worth mentioning that we're also about halfway through the average time it takes these big features to hit significant saturation, like with PhysX. It's pretty common for a GPU feature (and sometimes CPU/chipset feature) to take 3-4 generations to trickle down enough, through new products and used product sales, to have decent enough depth/usage. At this point, depending on how Apple is handling ray tracing, they might slow down the transition away from rasterization.
Remember PhysX back when it was a separate card, a Physics Processing Unit, before they shoved it on the GPU, before it even had multithreading? Yea, it evolved. But the original implementation was not ideal.
The problem with raytracing is that its real strengths are in places where traditional rendering doesn't work at all. As soon as raytraced games stop needing a rasterized option, raytracing will really become useful. Most of its advantages are around dynamic scenes where you can't just bake the lighting, or reflections, which without raytracing will break if you look at them slightly wrong.
Edit: Most of the Minecraft raytracing implementations are lacking in my opinion, but Minecraft is a game that is well suited for raytracing. Really, just anything with a dynamic world.
Spider-Man 2 has launched with ray tracing always on and looks and plays phenomenally. Super immersive to swing around the city and have proper reflections off all the skyscrapers!
Yup, I have a 3090 and even then I don't bother with RTX. It's a gimmick Jensen and Nvidia love to push as a must have feature. In reality you don't notice it if you're playing a game normally, it's a "stop and smell the roses" feature you only turn on to check out once and turn off immediately when you get frame dips.
How can you implement anything meaningful with ray tracing when, shocker, not everyone can use ray tracing? Games are unfortunately designed for the median crowd. I would argue the next console generation will be the point when ray tracing becomes the norm. We have seen this fairly recently with SSDs: they floated around the consumer market for nearly 10-14 years as a cool piece of tech, but most games were still designed for a hard disk. Now most consoles have SSDs as the base standard, so games can be designed around that specification and take advantage of it. Even though I am a PC stan, I understand consoles have a huge impact on the gaming industry.
That is why they came up with DLSS and then frame generation. But of course it's proprietary tech confined to Nvidia's newest, most expensive cards. Utterly useless
This, but unironically. Games don't even need such realistic graphics, anyway - I'd much rather play a stylized or even 2d game where the devs focused on mechanics and fun, rather than pretty lights.
Hollow Knight, Crosscode, Hades, Dead Cells, Signalis, Dusk, Outer Wilds, Underrail, and more are all great examples of relatively modern games that kill it in the graphical department without using anything fancy.
Edit, because this is fun: Boltgun, Sea of Stars, the Bloodborne PS1 demake, Tunic, and more.
Minecraft shaders look great though, especially the RTX one for Bedrock. If it were more open source, I'm sure ray tracing would be great to implement in shaders.
Celeste still has really nice lighting in places, imo, but you are right, it all just comes from good colour choice and artistic skill, not some premade graphics option that they flipped on.
Garbage take. Few things are truly needed outside of the game being enjoyable and "good" graphics can absolutely be something contributing to that. For some good is pixel graphics, for others it's near realism. You don't get to decide for anyone but you.
And if they are serious, it doesn't make sense to me: ray tracing, path tracing, and global illumination make a game leaps and bounds more enjoyable for me. Realistic lighting is everything. I cannot wait for the day they finally get the new global illumination system into Star Citizen...
I don't know man, some people unironically think the earth is flat, that if a supernova happened in our galaxy the earth would blow up, or that Volkswagen is pronounced with an English v and not an f, despite listening to a German explain the German v. You can never tell when it comes to internet strangers.
Most of the comments (at least when I opened the post) were talking about RT as if all it does is ruin performance and it shouldn't be used.
The weird hybrid solutions that game devs are coming up with to beat the old tech without doing full RTX are awesome. And for that reason I like RTX, because it's pushing the development of ideas that work better for today's hardware and today's applications.
After playing Portal RTX and Quake 2 RTX, my opinion is that what we really need are games that fully embrace RTX as their rendering. Lower poly count, use materials more, lean into the cool lighting.
Games like Cyberpunk 2077 use RTX, but it's just painted on top, so it is very expensive for what it brings to the table. Sure, it's more accurate and having reflections is neat, but it costs more than some shadow maps and doesn't beat good artistic design.
Yea, we're still in that transition period. One of the other problems is shipping games with RTX-only requirements. Eventually the GTX cards will have to die out in order for this to be achieved, though.
Yeah, we will only start seeing games that fully rely on raytracing when low-to-mid tier GPUs can deliver at least current-day RTX 3070 performance. As in, you can do better, but at least you can run stuff fully raytraced.
You know, that's fair. Most of my experience with RTX in games so far has been in first-person shooters, and they're kind of lacking in environments like those.
Mostly stuff like slightly better lighting in Cyberpunk or the flickery caustics in the recent RoboCop game. Bonus points for the games that implement RTX reflections and shadows but don't have your character reflect or cast a shadow.
That's exactly my point. Raytracing is being shoehorned into things without them being optimised specifically for it at the moment. That doesn't mean we should stop developing the tech entirely because people are implementing it poorly most of the time.
When I got my Oculus Quest I played it as often as possible. That’s the problem though, it just doesn’t make sense to play it almost ever.
If I were a teenager or someone who lived alone I could really get into it. The problem is disconnecting entirely from everyone around me for a game.
With my Steam Deck or my Switch, I can put my kid on my lap and play. I can sit it down easily and help my wife with a chore. I can walk around at work in my downtime and play.
VR is awesome. I absolutely love it. I just don’t have time to fuck with it. I would imagine that’s the case for most people.
Correct. This meme template is an unreasonable statement paired with outdated arguments, set against modern images/facts, poking fun at idiots. It also commonly conflates the reason for the thing with something easily observable.
Yes. Just like the Wii, it's fun for a month and then people realize they don't want to move around in some half-assed way to play their games.
Couple that with the fact you need an empty room dedicated to VR to move around at all, and it's destined to be reserved for pasty-faced white boys with more money than sense.
Everyone I know with a VR headset has it collecting dust.
I'm surprised they didn't go with the fact that ray tracing shoots rays out of the camera rather than having light radiate from light sources.
"That's a scientifically outdated view of how light works! Light enters your eyes, not the other way around! What is this? Emission theory? Are we back in the 1600s? They've played us for absolute fools."
Exactly! This makes the problem potentially millions of times easier, since you know with certainty that every ray fired is going to contribute to the image. Firing rays from the light source would guarantee you never see most of them; the processing power is wasted and your image never converges.
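A toy sketch of that camera-first approach, with a made-up single-sphere scene: one ray per pixel, fired from the eye, so every single ray produces a pixel and none of the work is wasted.

```python
# Minimal backward ray caster: rays start at the camera, one per pixel.
import math

# made-up scene: a single sphere in front of the camera
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0

def hit_sphere(origin, direction):
    """Solve the ray/sphere quadratic; return nearest positive t, or None."""
    oc = [origin[i] - SPHERE_C[i] for i in range(3)]
    a = sum(v * v for v in direction)
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(v * v for v in oc) - SPHERE_R ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

def render(width, height):
    """One ray per pixel through a simple pinhole camera at the origin."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            # map pixel coordinates to a direction on the image plane
            x = (px + 0.5) / width * 2.0 - 1.0
            y = (py + 0.5) / height * 2.0 - 1.0
            row.append('#' if hit_sphere((0.0, 0.0, 0.0), (x, y, 1.0)) else '.')
        image.append(''.join(row))
    return image

for line in render(16, 8):
    print(line)
```

Going the other way, firing rays out of the light, would spray most of them into directions the camera never sees, which is the wasted work described above.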
Eh, pathtracing is pretty cool, and when used correctly it can lead to really amazing results, while the artist doesn't have to care about performance as much. Baked lighting is very nice for static scenes, but it also consumes a lot of storage.
Godot's SDFGI seems like a good tradeoff, particularly as it works well on not-super-new GPUs (Juan: "but you can run them great on something old, like an gtx960 or a rx450 and get pretty real-time lighting at 1080").
SDFGI is pretty cool (I develop with Godot), but for now it's still really hard to figure out the right settings for it to not be a giant splat fest... Cuz it leaves splats of color all around the place. Outdoor scenes work a lot better though, so that's cool.
It's not really that dynamic yet though, but I looked at their presentation where they talked about future features, and they said that support for dynamic objects will be coming. I'm pretty excited about where it's going.
True, it does tank performance a lot. For that though, you are getting actual realistic lighting, which makes certain scenes look like absolute garbage, since these scenes were designed with the "fake lighting" in mind.
Raytracing produces realistic visual effects without requiring tricks like ambient occlusion, screen-space reflections, shadow maps, and so on, since those effects emerge from raytracing anyway and are much more realistic. I'm currently rendering a donut in Blender, where the difference is clearly visible in comparison.
However, due to the high amount of optimization in visually impressive real-time engines like game engines, I agree with you that ray tracing in games currently offers few benefits over contemporary alternative techniques.
Nevertheless I think that's the future. In the long run, there's nothing better, i.e. more accurate, than simulating the behavior of light when it comes to visual realism.
Also, baked lighting has another cost - nothing that is baked can be dynamic, and it has to be done during development, so it takes up dev resources.
Raytraced stuff happens immediately without tricks. All you need is the geometry and the materials to be accurate, and it should look right, no questions asked.
Once we get to a point where raytracing can be assumed even for low end systems, the problem where systems can't run certain games could become a thing of the past. I mean, if manufacturers weren't constantly bombarding us with planned & perceived obsolescence.
In the case where you have vehicles with explorable interiors (like the ships in Star Citizen), lighting has to be dynamic, because lighting conditions change just as a result of flying around normally. The position of the sun (in whichever of the two current star systems you're in) relative to your ship, the atmosphere which may or may not be present outside, and the position of cargo and objects/materials that will receive light and diffuse it onto surrounding surfaces in a cabin all require at least some kind of reference, or it just feels BAD.
But the recent CitizenCon engine presentation showed some AMAZING new shortcuts that give just enough visual fidelity without tanking the framerate that it scratches some kind of itch DEEP within the predictive modeling of the human mind... when light acts more like it's supposed to, it's fucking magical.
Counterpoint: I like pretty lights and don't mind having to play at a suboptimal framerate if it means more detail that I'm going to notice and enjoy.
Also, I keep seeing people confusing photorealism with a lack of style, when that's just not true. Pixar movies for example are photorealistic but stylized. You can have fancy lights and cool styles.
Pixar movies have accurate lighting which makes them look great. Lighting separates a good looking game from a great looking game and is more important than textures imo.
Main point aside, that isn't photorealism tho. Photorealism is depicting something almost identical to a real life photograph or simply what most consider realistic graphics.
I would consider photorealism to mainly be light-based (prefix photo meaning light), but I get what you mean. I'd consider photorealism and realism to be separate. "I can see that existing" vs "I can see myself being there"
A better example probably would've been claymation, or pokemon concierge.
To be fair, lighting is the most important part of generating photorealistic graphics. Having realistic and real-time lighting makes it look so much more realistic
photorealistic ... realistic ... real-time ... more realistic ...
We had a tool for that: it was called IMAGINATION
The graphical fidelity fetish has completely ruined gamers' ability to immerse themselves in make-believe worlds without the game doing all the work for us
My tone is /s, but despite my hypocrisy I do believe this is half true
It's not like games that tried to be realistic didn't exist before, and it's not like games that purposely go for a non-realistic style now aren't a thing. I'm pretty sure more pixel-style graphics games come out yearly now than when they were actually the norm.
"Fake it till you make it". Using various techniques, it is possible to fake ray tracing. It doesn't need to look as real as real life, just similar enough that you wouldn't notice during gameplay.
AI frame gen and AI upscaling is what I am most excited about...
They're like modern anime putting all the effort into food pics while the rest of the material sucks crap. Just so losers can post on twitter to fit in and show off to others that they didn't do anything.
See, I felt that way until I played an older game, Alan Wake, and compared it to Control, which came out 9 years later, and there has been an incredible increase in graphics.
Counter, or maybe side, argument; the problem is that nobody has actually done it well. There is a very real difference to be made using real time pathtracing, but everyone is distracted by pretty lights.
Just a side note: simulating light in a 3D environment is the stuff you could write a fucking PhD about, no joke. And another one if you can figure out a way to make the algorithm faster.
My brother and I upgraded our computers recently. I went with bang for buck and got a 6600XT, he wanted the RT experience and got a 4070TI. Playing cyberpunk on his computer was disappointing tbh. I expected more from almost a 400% more expensive setup.
Just tells you how many tricks have been developed to make rasterization look as good as it does. Fascinating, really. It's always interesting to see how people work around a limitation.
The thing with real-time raytracing and pathtracing is that instead of being a workaround, it removes the limitation entirely, which is damn cool.
Just need faster hardware still, which will take at least another decade with how Nvidia keeps milking the smallest improvements gen after gen.
This tracks with every other graphic option if the person doesn't know exactly what they're looking for, in my experience. Most options don't have such a huge impact, and are more just "yeah, that feels a bit better, I guess" once you get past the basic, obvious stuff like resolution.
I just want to know about the ONE real world use for bouncing light. Probably referring to research so groundbreaking that it shifted our entire understanding of the nature of light if not the universe in general.
*Instead of developers having to use thousands of tricks, filters, shortcuts, and post-processing algorithms very carefully arranged and stacked, ray tracing simulates light rays to arrive at the same end result the same way the real world actually works.
Ultimately, ray tracing will mean the vast simplification (and therefore cost reduction) of the way visuals in games are produced. Which I'd wager is why it's being pushed so hard.
Basically, a scene in a game has a bunch of objects in it.
It's not too hard to just light them, but it doesn't look that good. Most games want to have shadows, reflections, that sort of thing.
The traditional approach involves a bunch of extra manual work: pre-calculating a bunch of stuff ahead of time.
Ray tracing works by simulating how physical photons bounce around in real life. It's existed for a long time; they've used it in animated movies for decades.
The issue with games is that we haven't had hardware capable of doing it in real time until quite recently.
Edit:
That is to say, if you want to animate water or a mirror with ray tracing, you know where the camera is in the scene, and you know where the water/mirror is, so you know the angle the reflection would have come from. So you bounce the photon back that way til you get to the light source.
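The bounce step described above is just the mirror-reflection formula, r = d - 2(d·n)n, where d is the incoming ray direction and n is the surface normal. A tiny sketch with illustrative vectors:

```python
# Reflecting a camera ray off a mirror surface: r = d - 2(d.n)n.
def reflect(d, n):
    """Reflect direction d about unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# camera ray heading straight down onto a flat mirror (normal pointing up)
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # → (0.0, 1.0, 0.0), straight back up

# a 45-degree ray keeps its horizontal motion and flips its vertical motion
print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # → (1.0, 1.0, 0.0)
```

Knowing the camera and mirror positions, you apply this at the hit point and keep following the reflected direction until you reach a light source, exactly the water/mirror case described above.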
I turned off raytracing in 2077 and immediately had a playable experience. So what if the reflections aren't good? They fixed this on Switch with SSAO, and that's a cheap way to fix it
Are you sure? The way I understand it, ray marching is not something that can really replace ray/pathtracing, it's mainly used for rendering signed distance fields which is cool if you want to draw fractals and stuff, but not very efficient for classical geometry
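For reference, here is what ray marching (sphere tracing) a signed distance field looks like, as a minimal sketch with a made-up single-sphere scene: you step along the ray by exactly the distance the SDF reports as safe, which is great for SDFs and fractals but, as you say, not a replacement for ray/path tracing on classical geometry.

```python
# Sphere tracing a signed distance field: step by the SDF value each time.
import math

def sdf_sphere(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to the sphere surface (made-up scene)."""
    return math.dist(p, center) - radius

def march(origin, direction, max_steps=64, eps=1e-4):
    """Walk along the ray; the SDF gives the largest safe step each iteration."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf_sphere(p)
        if d < eps:
            return t     # close enough: we hit the surface at distance t
        t += d           # safe to advance this far without overshooting
        if t > 100.0:
            break        # ray escaped the scene
    return None

print(march((0, 0, 0), (0, 0, 1)))  # straight at the sphere: hits at t = 4
print(march((0, 0, 0), (0, 1, 0)))  # off to the side: no hit
```

Note that `direction` is assumed to be unit length; the whole trick relies on the scene being expressed as a distance function, which triangle meshes aren't, hence its niche.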
I've been saying raytracing was a scam ever since Nvidia didn't release a 1670.
Rather than make better cards at lower prices to compete with AMD, they take the 'premium' route of raytracing to charge more while still providing the same rasterization performance.
RT is for lazy devs to not have to deal with actual artistic implementation of "rasterized" lighting.
Everything seems to be benefiting devs more: tools are easy to use, assets are easy to obtain, etc. At the cost of performance and "sameness", especially in many UE5 games, as many use the same assets from the store (there are exceptions though).
Lighting in games just has to look convincing. I'm not analysing whether the light is the correct shade, or whether shadows are the correct shape, etc., while playing.
What I notice is pop-in, flicker, shimmer... Even in those beautiful RT games this still irks me a lot.
Yes, and ray tracing helps with that, a lot. You don't actively analyze light shades but you intuitively notice when they're off. Like when you notice flicker or shimmer when there shouldn't be any.