PS5 and Scarlett Ray Tracing

How the fuck would this work?

AMD doesn't have it ready, and by now, Snoy and MS would have to send out preliminary dev kits for a 2020 launch. Not to mention you need months of testing and assured production.

Is it gonna be bespoke, non-AMD chips? MS might have an advantage there, given that they own the DXR API and have implemented parts of DX12 on the X.

Still, how? At what framerates? Cinematic 1080p30?

Attached: amd_radeon_raytracing-100798963-orig.jpg (1992x1042, 190K)

It's just marketing bullshit from Microsoft.
Sony never said anything about hardware raytracing. Just audio raytracing, which is something that already exists in Killzone.

>How the fuck would this work?
It won't, or it will be some half-assed marketing bullshit. Hardware capable of running it is way too expensive to fit into the price point consoles need to meet.

>cloud
lol

just marketing. people don't know that what Sony is talking about is nothing like what Nvidia cards can do, so they can get away with it
i'm more interested in the "120 frames per second" claim, since last gen they were touting the whole "60 fps" thing and then neither console could hold those frames for the fucking life of them until the souped-up versions came out

>1080p30
That would be an improvement to every Sony console, so no way. They're always running in the low 20s

Anyone who seriously believes or has posted the claims in the threads about 9th gen consoles these last few months is either a console fanboy or a troll.

Considering what hardware is available on PC and how badly performance has been impacted by raytracing in the three games that feature it, it's unthinkable that console manufacturers could deliver, especially since they're also promising 8K resolution and 120 fps, unless they're genuinely going to sell a $900-1200 console.

Also, those three games each only feature it in one form: raytraced shadows, GI, or reflections, but never a combination of two or three of them, because it's simply not feasible.

At worst the consoles will feature it as a forced feature, meaning games will likely run at dynamic resolutions between 1440p and 1080p at 30 fps. At best it will be toggleable and you will maybe be able to get up to 4K native rendering resolution. This assumes zero increase in other graphical departments like poly counts, texture resolutions, or other advancements that may negatively impact performance. More likely 1440p/30 will be the new goal.

So, considering it's most likely going to be a difficult/niche feature to integrate, I'm willing to predict it will be included in some flagship exclusives and then relegated to a forgotten feature that everyone quickly abandons, because it means playing games at 1080p/30 - 1440p/30.

That's just the theoretical capability of their HDMI 2.1 port, they didn't actually say the hardware will pull that off. Classic marketing.
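For what it's worth, the "theoretical capability of the port" reading checks out with back-of-the-envelope math. A rough sketch (ignoring blanking intervals, assuming uncompressed 10-bit RGB and HDMI 2.1's 48 Gbit/s FRL link with 16b/18b coding):

```python
# Rough check of which video modes fit in HDMI 2.1's bandwidth.
# Assumptions: uncompressed RGB at 10 bits per channel (30 bits/pixel);
# 48 Gbit/s raw FRL link, ~42.7 Gbit/s effective after 16b/18b coding;
# blanking intervals ignored, so real requirements are somewhat higher.

EFFECTIVE_GBPS = 48 * 16 / 18  # ~42.67 Gbit/s usable payload

def video_gbps(width, height, fps, bits_per_pixel=30):
    """Uncompressed bandwidth of a video mode in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K120: {video_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~29.9, fits
print(f"8K60:  {video_gbps(7680, 4320, 60):.1f} Gbit/s")   # ~59.7, needs DSC
print(f"8K120: {video_gbps(7680, 4320, 120):.1f} Gbit/s")  # ~119.4, way over
```

So 8K60 and 8K120 only work over HDMI 2.1 with DSC compression; the port spec says nothing about whether the GPU can actually render at those rates.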

As if any dev would actually make decent 120fps games. We regularly get unfinished piles of crap, what makes you think any of them would invest enough into optimization that their game could run at 1080p120fps?

>implying you need a special hardware to do raytracing
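Technically true: raytracing is just geometry, and any CPU can do it, just orders of magnitude too slowly for real-time games. A minimal sketch (scene and numbers made up for illustration): cast one ray per pixel against a single hardcoded sphere and print an ASCII hit map.

```python
# Minimal software raytracer sketch: no special hardware, just math.
# One ray per pixel, one sphere, ASCII output. Illustrative only.
import math

def ray_sphere(origin, direction, center, radius):
    """Return distance to the nearest hit, or None if the ray misses.
    Assumes direction is unit length (so the quadratic's a == 1)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

W, H = 16, 8
for y in range(H):
    row = ""
    for x in range(W):
        # Camera at the origin looking down -z; map pixel to a view-plane point.
        dx, dy, dz = (x - W/2) / W, (H/2 - y) / H, -1.0
        n = math.sqrt(dx*dx + dy*dy + dz*dz)
        hit = ray_sphere((0, 0, 0), (dx/n, dy/n, dz/n), (0, 0, -3), 1.0)
        row += "#" if hit else "."
    print(row)
```

The catch is scale: a real game traces millions of rays per frame against millions of triangles, which is why dedicated intersection hardware (or a very beefy GPU running compute shaders) matters in practice.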