Is Ray Tracing a meme or is it the future?

overhyped, not worth the performance loss today, nvidia buying its way into game companies just sucks ass, boycott that shit

siege is complete dogshit, but quite literally WTF is the competitive advantage of bolting this crappy ray tracing onto a rasterized game engine that is MULTIPLAYER ONLY

Raytracing has been a thing in pre-rendered 3D since the 80s. The only reason it doesn't seem like a big deal that we can do it in real time now is that devs have become really good at faking it (examples: reflections are made from premade 360-degree images or by mirroring the entire game world, ambient occlusion is created by programming soft shadows wherever surfaces meet at certain angles). All the old "hacks" that devs used to have to do to imitate real-world lighting aren't needed with RTX, but because it's more resource intensive and has already been mimicked, people don't care anymore.
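if it helps, the core idea fits in a few lines. toy python sketch (one sphere, one light, one pixel; every name and number here is made up for illustration, it's nothing like what an actual engine or the RTX hardware path does):

    import math

    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    def sub(a, b):
        return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

    def normalize(v):
        length = math.sqrt(dot(v, v))
        return (v[0]/length, v[1]/length, v[2]/length)

    def hit_sphere(origin, direction, center, radius):
        # nearest t > 0 where |origin + t*direction - center| == radius
        oc = sub(origin, center)
        b = 2.0 * dot(oc, direction)        # direction is unit length, so a == 1
        c = dot(oc, oc) - radius*radius
        disc = b*b - 4.0*c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2.0
        return t if t > 0 else None

    camera = (0.0, 0.0, 0.0)
    sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0
    light_pos = (2.0, 2.0, 0.0)

    ray_dir = normalize((0.1, 0.1, -1.0))   # primary ray through one pixel
    t = hit_sphere(camera, ray_dir, sphere_center, sphere_radius)
    if t is None:
        print("ray missed, shade with the background")
    else:
        hit = tuple(camera[i] + t*ray_dir[i] for i in range(3))
        normal = normalize(sub(hit, sphere_center))
        to_light = normalize(sub(light_pos, hit))
        # no cubemaps, no baked shadow maps, no SSAO:
        # just ask how much the surface faces the light
        brightness = max(0.0, dot(normal, to_light))
        print("hit at t=%.2f, diffuse brightness=%.2f" % (t, brightness))

do that per pixel, recurse for bounces, and that's the whole trick. the hard part is doing it millions of times per frame.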

But DLSS is not ray tracing

This. It's a nice leap, and I'm comfortable with it being the generational leap this time around. But developers had spent decades perfecting dynamic lighting and reflections. In general the performance hit outweighs the visual gains, for now anyway. Eventually it'll just be the standard and get iterated on to the point the cost is marginal. But for now it only makes sense in games where it delivers an actual wow factor like Minecraft. Makes it look like a completely different game.
I think it'll hit its stride once it's feasible to have RTX in VR games.

If that was the case we wouldn't have games with objects glowing in the dark and disappearing reflections all over the screen.

It's the future, but ONLY when developers take advantage of it and prepare their game environments and levels for it.
Most of the time when you see ray tracing, it's being used on top of baked lighting, screen space reflections, screen space ambient occlusion, etc., which misses the point of having real-time calculated lighting in the first place.

But since RT is extremely GPU intensive, it's not worth it to pour so many resources into relying on it when more than 90% of consumers won't be able to experience it at its full power.
So in the end RT still needs a lot more years before it can be fully adopted as the default for game lighting.

It's not good lighting when the light takes seconds to start illuminating the environment, as happens in Minecraft. It looks so out of place and this hasn't been fixed in the current implementation.

Meme

No games are really made with ray tracing in mind first (let alone as the exclusive way to play). Metro EE is quite good looking since you can only play it with RT.

The only game that made any real use of it was doom eternal. Every other game just uses it as a shortcut for shit we already had but with a way bigger performance impact.

for most of us, it means fuck all, and the vast majority of gamers won't notice the difference. However it does make game development a lot easier since you don't need to do so much prebaked lighting, you can just place lights in the scene as if they were real lights.

since i got my 3070 last year i have not played a single new game. rt is a meme

Lol

It's the future but it needs another 2 or 3 GPU generations plus games built purely for ray tracing with no fallback to rasterization
Until then, most people will skip it. I'll continue to enjoy it on my RTX 3070 but only with DLSS

It's a cheat hiding the fact that modern tranoid and diversity hire developers have little to no coding knowledge and only know how to work with monkey tools. Which leads to abysmal optimization. I don't see Doom Eternal using upscaling to run at 4k120fps on a toaster while looking better than majority of modern AAA games. Fuck modern video games, if they require DLSS etc to run well they're probably not good to begin with.

>Raytracing has been a thing in pre-rendered 3d since the 80s
It was a thing, yeah, the same way going to space was a thing in 1920. No consumer-sized computer was able to calculate anything as complex as raytracing until the 2000s. And even then it used a shit ton of resources. For example, the software used to render Toy Story wasn't able to do any ray tracing at all.

atm it's just good to give old games a new coat of paint
in the same way, you could design a whole new game with low poly with rtx in mind
we're still quite a few years away from seeing it work without major performance hits but when it does it's gonna be a game changer

It is the future, and it is currently a total meme.

It's going to be probably 3 generations before a Nintendo console uses it, so it's pretty much irrelevant.

looks nice and you can get performance back with dlss so what seems to be the problem?

The denoising process is visible during ray tracing as darker-toned dots. I hate how the pursuit of photorealism makes game graphics so noisy. I just want a clean looking game with pretty colors and no blurriness.

It's a meme and it always will be, they'll bring back the GTX line and then claim RTX was always for workstations.

what if you could see the reflection of an enemy in a window or reflective surface that you wouldn't see otherwise?

I was playing Elden Ring the other day and I noticed a lot of random light sources are being reflected in my knight armor. Like not just obvious light sources but even something subtle like when you are standing next to a red invader, there is red light reflected in the armor. Didn't AMD and Nvidia make a huge deal out of this stuff when they introduced raytracing? Why is that just casually included in the framerate mode in ER? I don't think raytracing is actually enabled on consoles, definitely not in framerate mode, but it's still capable of calculating all this light bouncing around, which is what I thought was the main purpose of raytracing?

Realtime global illumination most likely using some raytracing is the future. It will take a few more years to see it become more common.

Its both. Next question.

i couldnt be fucked to play cyberpunk
doom eternal's RTX looks pretty good, but the constant fuckin qte melee attacks turn me off from finishing the game
anything else i couldnt give a fuck about for rtx, the raw performance of the 3080ti is what i needed for VR anyways

It's a meme but people would still prefer a $700 3060ti instead of a $500 6600XT despite both cards performing pretty much the same.
Nvidiots getting shafted once again.

why do pc players bitch and moan about console not hitting native 4k when they sing and dance for dlss and fsr on pc?
also please stop cheating in halo mp, it's pathetic.

Ray tracing is simply an application of vectors (directions, basically) of assets in real time. You can for example ray trace sound. Current applications are still gay, but yes, it is a part of the coming future for games.
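for example, one bounce off a surface, whether the ray is light or sound, is just the mirror formula r = d - 2(d·n)n. toy python (made-up names, not engine code):

    def reflect(d, n):
        # reflect direction d off a unit surface normal n: r = d - 2*(d.n)*n
        k = 2.0 * (d[0]*n[0] + d[1]*n[1] + d[2]*n[2])
        return (d[0] - k*n[0], d[1] - k*n[1], d[2] - k*n[2])

    # ray heading down-and-forward bounces off a flat floor (normal pointing up)
    print(reflect((1.0, -1.0, 0.0), (0.0, 1.0, 0.0)))   # -> (1.0, 1.0, 0.0)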

I don't. I play on native res.
>console not hitting native 4k
That's not the complaint though. The complaint is consoles not even maintaining a solid 60 fps at 1080p.

Anyone who thinks that raytracing is just a passing fad is retarded; obviously it is ultimately the superior lighting technology. It's just not quite there yet from a performance standpoint, but a couple of GPU gens from now, there's no reason to assume it won't be the standard.

>why do pc players bitch and moan about console not hitting native 4k
because 4k is constantly touted as a selling point for the new generation of consoles when they only have it with a huge caveat and under certain circumstances, providing you also have a 4k tv that isn't dogshit

i'm not paying $1500 for slightly better shadows or silly hyper reflective puddles/windows and i'm sure as shit not going to a buy a 4000 series housefire card that needs a fucking 600 watt connector.

Maybe in 10 years but for now it's a fad and you're retarded for opting into it this early.

>600
Try 850.

that's the thing. dlss/rt could be a very good thing, but it's in early stages. that's why i'm confused about which one to upgrade to, the 40 series or the 7000 series gpus

>4000 series housefire card that needs a fucking 600 watt connector
why is everyone shitting themselves over the energy consumption
there are only one or two cards right now that use the 3x 8-pin setup, it isn't standardized and there's no evidence that it will be
i have a thor psu that lets me monitor power usage and the 3080ti is more power efficient than the 1080ti ever was, 50w less at idle and similar numbers (around 400w) at full load

Nvidia is pushing power draw to stay a couple frames ahead of AMD.

It's the future, whether you like it or not.
Realistic, accurate lighting without the extra work of baking every single scene is too much of a time saver for devs to pass up.

The 4090 needs a 600 watt connector maybe.
The 4070 will draw 350.

>when they sing and dance for dlss and fsr on pc
ai upscaling only proves what a clown world the gpu market has become

Our energy prices are the cheapest in the eu, and i'm still looking for power efficient hardware. 400W peak load is bullshit when you can get 70% of the performance for half the power consumption.

i'll believe it when i see it unless you have some other evidence. the 3090, while having fucked placement of power delivery, still doesn't get anywhere near 600w, and the 3090ti rectified that and probably doesn't push 500w overclocked
600w would be a massive jump that would imply 1000w psus as the new standard

Yes, power draw isn't linear, genius.

NVIDIA marketing is just extremely effective.
They fucked people over with the 3.5 meme card and jewed retards hard with their overpriced goysync.
Now watch as fanboys will jump to the rescue to defend them.

if you value 30% more performance for more than double the power consumption you're retarded.

Having spent my time with an AMD card after moving from an Nvidia? I'll fucking buy an Nvidia, Fuck AMD lmao, Shit drivers all around, Absolute dogshit OGL support
>just install linux
How about AMD hires actual people for their non existent driver division. Maybe Microsoft should make it possible to run the Linux drivers, then AMD would have value for someone who isn't daily driving linux, and no I'm not going to fucking dualboot, I shouldn't have to boot into a different OS just to get drivers that are actually good, fucking hilarious, I'll extort myself any fucking day for the nvidia premium tax, at least I get drivers worthwhile

Remember to spend an extra 50 bucks for titanium efficiency, those extra bucks now will save you 11% more in power costs.
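
>will save you 11% more in power costs
back of the envelope says it's way less than that. assuming Gold is roughly 90% and Titanium roughly 94% efficient at half load (ballpark 80 Plus targets, not measured numbers):

    load_w = 400                        # DC power the PC actually draws
    gold_wall = load_w / 0.90           # ~444 W pulled from the wall
    titanium_wall = load_w / 0.94       # ~426 W pulled from the wall
    print(gold_wall - titanium_wall)    # ~18 W, roughly 4% of the wall draw

so a few percent at best, and only while the card is actually under full load.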

Your made up figures are bullshit to begin with, so I don't need to make excuses for your retarded ass.

nigger, more like 3%, nobody is using default 80plus shit, it's hard to actively go below GOLD

Werks on my machine and I've had AMD for 10 years now. Sounds like a you problem.
>hurr linux
Nah, Windows werks fine.
>muh openGL
Who the fuck still uses OGL? Most emulators use Vulkan anyway these days.

ignorance is a choice in this day and age, newfriend.

So are people who pretend to be experts on stuff they don't understand.
Either back up your numbers or fuck off.

>Asks about RTX
>Posts DLSS
Lmao retard

It's a meme. Good graphics will not sell a shit game, bad graphics will not kill a good game. If graphics mattered then PS1 games would have never taken off.

All the titaniums are like 1000+ Watts.
You don't need that

It's amazing how even after years, retards still fail to understand what this whole thing is about and just keep screeching meme meme meme talking about how graphics aren't important.

The best part of the office was when the shady old man recognized the weed strain

It is obviously the future since that's how light works in real life. Not exactly but closer than ever before. All the piled on hacks of the past like SSAO were trying to emulate how light works.

Now it's a meme, but it will be a standard option in the future with better GPUs, once the performance dip becomes insignificant

Metro Exodus: Ray tracing only edition proved that it's a real thing when it's actually used as the base technology for lighting rather than slapped on top of other shit.
It gets a bad name currently because SO MANY games use it like absolute shit. Also static image comparisons, and the entire focus on the fucking reflections, which aren't the be-all end-all in the slightest.

It literally took me trying out blender to get what raytracing really does, and why cgi cutscenes and movies had to be rendered instead of being real-time like games

that would be a good thing. it's DLSS

it's the new PhysX. just buzzword nonsense for console war tier discussion about hardware and graphics

It gets a bad name because it eats up too much performance. It's still beta-tier tech and it's going to take at least a decade to become standardized. GPUs just aren't powerful enough to handle it without breaking a sweat.

DLSS and the AMD equivalent are more impressive because they actually give you frames instead of taking them away, an FPS junkie's dream