PS5 and Scarlett Ray Tracing

How the fuck would this work?

AMD doesn't have it ready, and by now, Snoy and MS would have to send out preliminary dev kits for a 2020 launch. Not to mention you need months of testing and assured production.

Is it gonna be bespoke, non-AMD chips? MS might have an advantage there, given that they own the DXR API and have implemented parts of DX12 on the X.

Still, how? At what framerates? Cinematic 1080p30?

Attached: amd_radeon_raytracing-100798963-orig.jpg (1992x1042, 190K)

It's just marketing bullshit from Microsoft.
Sony never said anything about hardware raytracing. Just audio raytracing, which is something that already exists in Killzone.
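For what it's worth, raytraced audio is a far lighter workload than raytraced graphics: a handful of rays per sound source for occlusion/reverb instead of millions per frame for pixels. Toy sketch of the occlusion idea (purely illustrative, nothing to do with Sony's actual tech):

#include <cstdio>

struct Vec3 { float x, y, z; };

// Stand-in occlusion query. A real engine would ray cast against level
// geometry here; we fake it with a single wall on the x = 0 plane.
bool occluded(Vec3 listener, Vec3 source) {
    return (listener.x < 0) != (source.x < 0);  // segment crosses the wall
}

int main() {
    Vec3 listener{-2, 0, 0}, gunshot{5, 1, 0};
    // A blocked line of sight means the sound gets muffled, not silenced.
    float gain = occluded(listener, gunshot) ? 0.3f : 1.0f;
    std::printf("gunshot gain: %.1f\n", gain);  // 0.3: wall between us
}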

>How the fuck would this work?
It won't, or it will be some half-assed marketing bullshit. Hardware capable of running it is way too expensive to fit into the price point consoles need to meet.

>cloud
lol

just marketing, people don't know that what sony is talking about is nothing like what nvidia cards can do, and they can get away with it
I'm more interested in the "120 frames per second", since last gen they were touting the whole "60 fps" thing and then neither console could hold those frames for the fucking life of them until the souped-up versions came out

>1080p30
That would be an improvement for every Sony console, so no way. They're always running in the low 20s.

Anyone who seriously believes or has posted the claims in the threads about 9th gen consoles these last few months is either a console fanboy or a troll.

Considering what hardware is available on PC and how performance has been impacted by the use of raytracing in the three games that feature it, it's unthinkable that console manufacturers could deliver, considering they're also promising 8K resolution and 120 fps, unless they're genuinely going to be selling a $900-1200 console.

Also, those three games only feature it in one form: they either feature raytraced shadows, GI, or reflections, but never a combination of two or three of them, because it's simply not feasible.

At worst the consoles will feature it as a forced feature, meaning games will likely run at dynamic resolutions between 1440p and 1080p at 30. At best it will be toggleable and you will maybe be able to get up to 4K native rendering resolution. This is assuming zero increase in other graphical departments like poly counts, texture resolutions, or other advancements that may negatively impact performance. More likely 1440p/30 will be the new goal.

So considering it's most likely going to be a difficult/niche feature to integrate, I'm willing to predict it will be included in some flagship exclusives and then relegated to a forgotten feature that everyone quickly abandons, because it means playing games at 1080p30 - 1440p30.
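If you're wondering what "dynamic resolutions between 1440p and 1080p" looks like under the hood, it's usually just a feedback loop on GPU frame time. Minimal sketch; every threshold here is made up, not from any real console:

#include <algorithm>
#include <cstdio>

// Feedback loop: scale the render target between ~1080p and 1440p to
// chase a 33.3 ms (30 fps) frame budget. All thresholds are made up.
struct DynRes {
    float scale = 1.0f;  // 1.0 = 2560x1440, 0.75 = 1920x1080

    void update(float gpuFrameMs) {
        if (gpuFrameMs > 33.3f)      scale -= 0.05f;  // over budget: drop res
        else if (gpuFrameMs < 28.0f) scale += 0.02f;  // headroom: claw it back
        scale = std::clamp(scale, 0.75f, 1.0f);
    }
    int width()  const { return int(2560 * scale); }
    int height() const { return int(1440 * scale); }
};

int main() {
    DynRes dr;
    for (float ms : {30.f, 36.f, 38.f, 35.f, 27.f})  // fake GPU timings
        dr.update(ms);
    std::printf("rendering at %dx%d\n", dr.width(), dr.height());
}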

That's just the theoretical capability of their HDMI 2.1 port, they didn't actually say the hardware will pull that off. Classic marketing.

As if any dev would actually make decent 120fps games. We regularly get unfinished piles of crap, what makes you think any of them would invest enough into optimization that their game could run at 1080p120fps?

>implying you need special hardware to do raytracing

Probably Guerrilla Games. They seem to generally care that their engine doesn't run like complete shit.

AMD patented their own raytracing approach recently, look it up.

Behind the marketing BS, all they're probably saying is that the console can handle both 4K and 120fps output... but that's about it.

Just because the console is capable of a signal output of that range, doesn't mean it will ever reach it. Unless we are talking old ass games that get a re-release.
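The "signal output" claim is easy to sanity check yourself. Back-of-the-envelope, pixel data only (real link figures run higher once blanking and encoding overhead are added):

#include <cstdio>

// Uncompressed pixel bandwidth: width * height * fps * bits per pixel.
double gbps(long long w, long long h, long long fps, long long bpp) {
    return double(w * h * fps * bpp) / 1e9;
}

int main() {
    const long long bpp = 30;  // 10-bit RGB
    std::printf("4K120: %.1f Gbps\n", gbps(3840, 2160, 120, bpp));  // ~29.9
    std::printf("8K60:  %.1f Gbps\n", gbps(7680, 4320, 60, bpp));   // ~59.7
    // HDMI 2.1 carries 48 Gbps, so 4K120 fits uncompressed while 8K60
    // already needs DSC. Neither says anything about what the GPU renders.
}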

You need it if you want to do it in real time on consumer hardware, without killing either the CPU or GPU completely. And most of that hardware is actually a) simplifying the scene in advance so you can shoot a low number of rays which then b) get de-noised so it doesn't give you eye cancer.

That's why full-on path tracing only works for shit like Quake 2 so far, because you don't need to cull shit if the models have 20 triangles.

It's all about getting away with as little geometry and rays as possible.
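Here's the whole trick from the last few posts as a toy, runnable example: one noisy sample per pixel against a fake sphere (stand-in for a real BVH full of geometry), then a box blur standing in for a proper temporal denoiser. Purely illustrative, not any vendor's pipeline:

#include <algorithm>
#include <cstdio>
#include <cstdlib>
#include <vector>

static const int W = 64, H = 32;

// One noisy sample per pixel: a fake sphere hit plus random jitter,
// mimicking the speckle of undersampled ray tracing. A real renderer
// would trace against a BVH here instead of one hardcoded shape.
float samplePixel(int x, int y) {
    float u = (x + 0.5f) / W * 2 - 1, v = (y + 0.5f) / H * 2 - 1;
    float hit = (u * u + v * v < 0.5f) ? 1.0f : 0.1f;
    float noise = (std::rand() / float(RAND_MAX)) * 0.6f;  // 1-ray variance
    return hit * (0.7f + noise);
}

// Stand-in denoiser: 3x3 box filter. Real ones are temporal and guided
// by normals/depth, but the job is the same: hide the missing rays.
float denoise(const std::vector<float>& img, int x, int y) {
    float sum = 0; int n = 0;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int px = x + dx, py = y + dy;
            if (px >= 0 && px < W && py >= 0 && py < H) {
                sum += img[py * W + px]; ++n;
            }
        }
    return sum / n;
}

int main() {
    std::vector<float> noisy(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            noisy[y * W + x] = samplePixel(x, y);

    const char* ramp = " .:-=+*#";  // ASCII shades, dark to bright
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x)
            std::putchar(ramp[std::min(int(denoise(noisy, x, y) * 6), 7)]);
        std::putchar('\n');
    }
}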

>AMD doesn't have it ready
says who? Consoles are only coming out in 2020.

It means that PS5 will not have hardware ray tracing. Why is it so hard for you to understand it?

>selective ray tracing

LMAO

What do you expect from overly seething console peasants, still obsessively yearning for their DRM cuckbox to leapfrog PC gaming - which will never, ever happen?

Consoles never get bleeding edge GPUs. Ever.

You need mature mass production. AMD can barely supply enough of their top-end (read: mid-tier) dedicated PC GPUs *every* launch.

Battlefield V and Metro Exodus use selective ray tracing
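"Selective" meaning the normal raster pipeline stays and exactly one effect gets ray traced on top of it. Structurally something like this (hypothetical names, obviously not DICE's or 4A's actual code):

#include <cstdio>

enum class RtEffect { None, Reflections, GlobalIllumination, Shadows };

// Rasterize the frame as usual, then ray trace at most ONE effect on top.
void renderFrame(RtEffect rt) {
    std::puts("raster: G-buffer, direct lighting, post");
    switch (rt) {
        case RtEffect::Reflections:        std::puts("RT pass: reflections (BF V)"); break;
        case RtEffect::GlobalIllumination: std::puts("RT pass: GI (Metro Exodus)");  break;
        case RtEffect::Shadows:            std::puts("RT pass: shadows");            break;
        case RtEffect::None:               break;  // the likely console default
    }
}

int main() { renderFrame(RtEffect::Reflections); }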

The mustard race plays Quake 2 at 1080p50 now.

Peasant.

Depends on how powerful the GPU is.

For example, a 2080 Ti can run more indirect & direct paths. I bet those consoles will only run ONE path.

Attached: Ray Tracing - PC.jpg (1920x1040, 175K)
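Rough ray-count model of what "more paths" costs, assuming one path per pixel and one extension plus one shadow ray per bounce. Illustrative numbers, not benchmarks:

#include <cstdio>

int main() {
    const long long pixels = 3840LL * 2160;  // 4K frame
    for (int bounces = 1; bounces <= 4; ++bounces) {
        long long rays = pixels * bounces * 2;  // extension + shadow ray
        std::printf("%d bounce(s): %lld Mrays/frame, ~%lld Mrays/s at 60 fps\n",
                    bounces, rays / 1000000, rays * 60 / 1000000);
    }
    // Even this naive count hits ~4 Gigarays/s at 4 bounces and 60 fps.
    // Nvidia markets the 2080 Ti at ~10 Gigarays/s; a console APU will
    // have far less to spend, hence ONE path per pixel, if that.
}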

Damn, Quake 2? Consoles are dead and buried now!

the only games that are gonna run at 120fps are games that look like they came out 10 years ago

How does it feel that your $1,300 card can't do 1440p60 in a 22-year-old game?

Consoles use custom chips. Their production is unrelated to PC GPUs. If AMD is concocting some ray tracing hardware for consoles, that doesn't mean it's based on a currently released PC GPU.

>Consoles never get bleeding edge GPUs. Ever.
That wasn't the case for the Xbox and Xbox 360

you know the 360 only got more VRAM because of Gears, right?

Anyway, it was the most powerful GPU at the time; PC only got a more powerful one about 3 months later, something like that.

With a 2080 Ti:
-1080p: 98fps
-1440p: 54fps
-4K: 26fps
If you’re underwhelmed by 98 fps on a $1,200 GeForce RTX 2080 Ti card, remember what’s going on here. The game is implementing a fully path-traced renderer and is computationally expensive to run. It’s a technique that’s really only been used in 3D movies to date, and fairly recently, too.

It also yields some beautiful effects. Glass windows and water in Quake II RTX obviously seem like night and day from the original version. So if you’re going to snark about “only 97 fps” on a GeForce RTX 2080 Ti, know that a game session of Quake II RTX will easily render more ray-traced frames than a full-length animated movie, and in real time, too.

Attached: 4K-maxed-out[1].jpg (3840x2160, 884K)
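The "more frames than a full-length movie" line checks out with basic arithmetic, for what it's worth:

#include <cstdio>

int main() {
    const long long movieFrames = 24LL * 60 * 90;  // 90 min at 24 fps
    const long long gameFps = 98;                  // the 1080p figure above
    std::printf("movie: %lld frames\n", movieFrames);  // 129,600
    std::printf("game matches it in %lld s (~%lld min)\n",
                movieFrames / gameFps, movieFrames / gameFps / 60);
    // ~22 minutes of play out-renders the whole film. The difference:
    // the film got hours per frame, the game gets ~10 ms and a denoiser.
}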

they got the crown for a month before PC got its best one. Afterward, PC remained the king.

The next-gen consoles will not be able to compute that much, period. Otherwise it would be too expensive. A $2,000 console. lol

The N64 GPU was a bleeding edge SGI GPU. The Dreamcast GPU was a bleeding edge fully custom PowerVR. The Gamecube's Flipper chip was ready in 2000, but Nintendo delayed the hardware. The Xbox and Xbox 360 had prototype Nvidia and AMD tech respectively before it was even on the market. It's only this generation that both consoles took low-end consumer cards to save costs. Consoles typically had bleeding edge tech.

>in fucking quake 2
come the fuck on

Scarlett has hardware raytraced lighting
PS5 has software raytraced audio?

Too bad AMD isn't bleeding edge in GPUs right now and won't be in 2020 either.

Consoles are just PC APUs now, all those consoles you named were far more bespoke.

their games run at 30fps

>The N64 GPU was a bleeding edge SGI GPU.
It got btfo'd by 3dfx before the N64's launch.
>The Dreamcast GPU was a bleeding edge fully custom PowerVR.
PC still beat it
>The Gamecube's Flipper chip was ready in 2000
Low-end PC lmao

>Consoles having bleeding edge tech
Pure meme

supposedly all their games could run on PC, they apparently have an in-house PC version of every game they've released

All their game logic is optimized for 30fps.

Even if you can run it on PC, it doesn't mean you'll really reap the benefits.

DA POWUR OF DA CLOUD

Microsoft is the market leader in ray tracing APIs right now, and all those Nvidia RTX games are just using DXR, which is Microsoft's API. MS can easily make their own custom RT hardware the same way they made their own custom sound card for the XB1 or their own ARM processors for Windows 10.
Dunno about Sony though, they've always been incompetent and just copied what everyone else has done.
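On the API point at least, that's plausible: DXR is just a capability tier you query on a D3D12 device, so whatever RT silicon ends up in a console could sit behind the same interface. This is the real public D3D12 query (Windows 10 1809+ SDK, link d3d12.lib):

#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main() {
    // Any DX12-capable adapter; a console part would answer the same query.
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));
    // 0 = not supported; D3D12_RAYTRACING_TIER_1_0 (10) and up = DXR.
    std::printf("raytracing tier: %d\n", int(opts5.RaytracingTier));
}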

all devs have PC versions, Kamiya used to play Bayonetta on PC long before the PC version actually came out

>MS can easily make their own custom RT hardware the same way they made their own custom sound card for the XB1 or their own ARM processors for Windows 10.
AHAHAHAHAHAHAHA

Consoles haven't used high-end GPUs since the PS3/Xbox 360 era, because high-end PC GPUs went from consuming ~100W to ~250W, and a 250W GPU doesn't fit into a small closed box that sits under a TV.

That's the only reason. Cost saving has little if anything to do with it.

>leveraging cloud computing
They're hoping your internet connection can handle data transfers to/from their servers in order to share the load. Sounds stupid as shit, and probably nobody will use it except one or two in-house studio games/tech demos. You would have to gather the position data from the game in real time, send it to their servers, let their servers do the math, then download the data, all in under 300ms. Couple that with input lag from your TV and it's going to feel like playing at 20fps no matter what the actual screen is outputting. Looks like they have some kind of proprietary shader tech as well to help streamline the process. This is stupid because it means devs will need plugins for their engines at the very least.
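Tallying up that round trip with charitable guesses (none of these are measurements) shows the budget problem:

#include <cstdio>

int main() {
    const int uploadMs   = 15;  // send positions/scene deltas to the server
    const int serverMs   = 20;  // server-side ray tracing
    const int downloadMs = 25;  // pull the computed lighting back down
    const int tvLagMs    = 40;  // typical TV input lag on top
    const int total = uploadMs + serverMs + downloadMs + tvLagMs;
    std::printf("added latency: %d ms (%d full frames at 30 fps)\n",
                total, total * 30 / 1000);
    // 100 ms is three whole frames late at 30 fps, with kind assumptions
    // about home internet. Well under the 300 ms ceiling, and it would
    // still feel terrible -- which is the point.
}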

>Kamiya used to play Bayonetta on PC long before the PC version actually came out
I think he hinted there was a PC version for a long time.

picrelated: Kamiya playing Bayonetta with Nvidia 3D Vision glasses on PC. This happened in 2010.

Attached: 840937099quxo[1].jpg (600x450, 96K)

consoles literally holding the industry back

I totally agree.

>high end GPUs
>4K/8K gaming
>raytracing
biggest memes of our lifetime
The only games that have more or less proper raytracing are modded Minecraft and Quake 2. Both run like shit, especially Minecraft (which also looks better).
There is no fucking way we will have actual games with this shit until 2030 or something.

What works properly this gen, however, is physically based rendering and 1080p.

Attached: iStock-155379351.jpg (730x486, 75K)