Being announced March 3rd

13.3TF Custom RDNA 2 GPU @ 1.7GHz with 60 Compute Units
AMD Zen 2 8-core @ 3.4GHz (Sony is working on boosting to 3.7GHz)
RAM: 16GB GDDR6 + 4GB DDR4
[email protected]/S @ 1TB
Dedicated RT and 3D audio cores
565GB/s bandwidth
Full digital backwards compatibility with every PlayStation console and handheld, for a library of thousands of games on day 1
Enhanced DualShock 5 with haptic triggers, heartbeat monitors and built-in microphone
PlayStation AI assistant that lets you change games, create parties and more with voice commands

Estimated retail price is $499, with a Holiday 2020 release date.

Attached: 1582577818534.jpg (1400x700, 31K)

Is this like a 2048-bit machine?

>Full digital backwards compatibility with every PlayStation console and handheld, for a library of thousands of games on day 1
Wrong

Where the fuck is the announcement? Just get it over with, mate.

It's really just sad. Sonybros keep spamming this to combat the 12 TFLOPs of raw RDNA 2 horsepower the SeXbox has, when we all know by now that the AMD GitHub leaks were true and they're getting a 9 TFLOP RDNA 1 machine with RT slapped on. Sad!

>tflops
As useful as bits as a measure of power.
Not saying the PS5 will be more powerful than the Xbox Sex Hardcore, but you should let this bone go and ask for more relevant numbers, such as, you know, gigarays/s.
If those consoles truly support raytracing, it would be nice to know how fast they are at it.

I dunno, my friend's GPU is faster because it has more TFLOPs. It may not be the end-all be-all of performance, but it still matters. I guarantee you that a 12 TFLOP GPU will outclass a 9 TFLOP GTX 1080 in a PC.

It's still a dumb measure, because it's literally "number of multiply doodads × clock speed".
RAM speed not involved, number of ROPs not involved, scheduler performance not involved; you know, things that can crush the performance of the thing.
In their last relevant video, Digital Foundry showed the newest AMD architecture running at 4.7 TFLOPs and consistently matching 7 TFLOP parts from older generations.
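That "multiply doodads × clock" arithmetic is easy to sketch. A minimal Python version, assuming the usual 64 FP32 ALUs per CU and 2 ops per clock (fused multiply-add); the CU counts and clocks plugged in below are just the rumored figures from this thread, not confirmed specs:

```python
def peak_tflops(compute_units: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """Theoretical peak FP32 TFLOPs: CUs x ALUs/CU x 2 ops/clock (FMA) x clock."""
    return compute_units * alus_per_cu * 2 * clock_ghz / 1000

print(peak_tflops(60, 1.7))  # OP's claimed config -> ~13.06
print(peak_tflops(36, 2.0))  # GitHub/Oberon rumor -> ~9.22
```

Note that none of the real bottlenecks (memory bandwidth, ROPs, scheduling) appear anywhere in that formula, which is exactly the point being made above.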

>sony damage controller posted it again

>13.3TF Custom RDNA 2 GPU @ 1.7GHz with 60 Compute Units
AHAHAHAHAHA

AMD Oberon leak proves you wrong.
PS5 will be 9.2TF RDNA1 GPU @ 2.0GHz with 36 compute units.
RAM is 14GB GDDR6.
The SSD will be specialised "removable" storage starting at 500GB, with you being able to buy "PS" SSDs that only work with the PS5.

Oh, and by the way, it wasn't one leak.
The Gonzalo leak suggested the CPU will be clocked at 3.2GHz.

All the information about the PS5's specs is out there already.
Just retarded damage-control idiots who know they're about to get an inferior product.

Won't be out until April 2021 thanks to Corona.

Oh and all these leaks came out under a year ago.
It takes 18 months to make a new APU.

>RDNA1 GPU
absolutely embarrassing

Wow, 50-year-old fake leaks, lol. Hang yourself.

wonder if the PS5 also comes with blessed RDNA drivers like on PC

>console GPU
>2.0 GHz
Are you talking memory clock or core clock? Because I doubt it'll push that.

Attached: jags.png (250x250, 84K)

Sample clocks, but it will downclock itself to PS4 Pro and PS4 clocks for backwards compatibility.

Cope, Oberon leaks were April 2019.
Oberon A0 (PS5) APU 9.2TF + 14GB GDDR6 - PS5

>Enhanced DualShock 5 with haptic triggers, heartbeat monitors and built-in microphone
nope, not buying

And in the same leak it confirmed the 12TF for Series X

Fuck off, retard.
60 × 0.128 × 1.7 ≈ 13.1
16Gbps (top of the GDDR6 spec) × 256-bit bus (the width of Navi 10's bus, the chip the GitHub leak puts in the PS5) ÷ 8 = 512GB/s

You need to cope

Attached: HERES GITHUB.png (500x310, 207K)

snoyfags are in full cope mode making up fake specs because they know they're about to be BTFO LMAO

arch.b4k.co/v/search/filename/1582577818534.jpg/
mental illness caused by lack of games

C O P E

Github doesn't lie, PS5 is a 2GHz AYYYYYMD housefire

Cope posting because they're going to be worse than the XSX

The SeX likely has at least a 384-bit bus to maintain BC with the X1X (384-bit bus).
384-bit bus × 14Gbps ÷ 8 = 672GB/s minimum memory bandwidth, which should be decent for RT, though a faster 768GB/s would be preferred.
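The bandwidth arithmetic in these posts follows one pattern: bus width in bits times per-pin data rate in Gbps, divided by 8 to get bytes. A quick Python sketch; the bus widths and pin speeds below are the thread's guesses, not announced specs:

```python
def mem_bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak GDDR6 bandwidth in GB/s: bits per transfer x Gbps per pin / 8."""
    return bus_width_bits * pin_speed_gbps / 8

print(mem_bandwidth_gbs(384, 14))  # guessed SeX minimum -> 672.0
print(mem_bandwidth_gbs(384, 16))  # the "preferred" case -> 768.0
print(mem_bandwidth_gbs(256, 16))  # Navi 10-style bus -> 512.0
```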

GRayS performance is the big unknown, we have literally zero data or rumors on RDNA2 GRayS performance

A true leaker probably would know the number.

Come on now, we're not really expecting consoles to have raytracing, or at least optimized raytracing, are we?

Even Nvidia's 2000 series can barely do it unless it's a Ti.

lol @ thinking sony could outspend microsoft

This. OP is a fucking faggot; it was stated that it will only be compatible with PS4 games. Fucking larping cucklord.

14-16 Gbps is the official GDDR6 spec
>jedec.org/standards-documents/docs/jesd250b
Also note I said LIKELY 384-bit bus. I have no way of knowing for sure, but it's an educated guess since it's BC w/ X1X and that has a 384-bit bus
Why not? RDNA2 is going to have HW RT
>pcgamesn.com/amd-confirms-high-end-navi-gpu-2020-ray-tracing

I'm talking about the gigarays

Yes, high end Radeons which are yet to be released to the PC market. I don't know if consoles in development for at least a couple years now would have high end unreleased GPUs.

>digital backwards compatibility
Yes goy, pay for your games AGAIN!

Oh, well, yeah, only AMD/MS engineers know that. Supposedly AMD is going to talk about RDNA2 on the 5th (their financial analyst day).
It's not out of the realm of possibility. The 360 had the world's first unified-shader GPU, on the TeraScale uarch, which didn't appear in PCs until 2008. And RDNA2 has already been confirmed for the SeX.

>March 3rd
1 more day before this leak goes away

I guess we'll just have to wait and see. Though that RDNA raytracing will have to beat GeForce 2000-series RT performance, and at half the cost.

March 5 is also rumored

fuck

Let's see if mommy su will spill the beans.

IMO Turing's GRayS is decent, it's the memory bandwidth that fucks it over

$700 2080S: 500GB/s
$1200 2080 Ti: 616GB/s
That's super expensive for that little bandwidth. I'm lucky the VRAM in my GPU plays nice with an OC, because it makes RT a little more bearable.

Knock it off, it's 9.2tf and that's that.

b-b-but xbox announced 12 TF. Playstation can't be weaker than Xbox!! It's not fair!!!!!

Attached: TODD REEEEE.jpg (540x547, 32K)

OH NONONO DAMAGE CONTROL POSTS FOR THE 4TH TIME TODAY ON SNOYS 9.2 TF CONSOLE

I just want some news
i just want this to be over

this is getting direct begging tier

2GHz is easy on the 7nm node and Navi if you keep the core count low to stop it being a nuclear reactor. Hell, if you can fucking cool the thing, even the monstrous Radeon VII will dance around 2GHz.

Yes, but a console?

Again, TDP doesn't need to be that high if you aren't trying to hotclock a million cores. TSMC has been building 7nm dies for AMD for the last 14-ish months.

the only people who use flops as a measurement of GPU power are console retards who have no idea what they're talking about. There are a lot more important factors.

I'd be interested (not that it will ever exist) in what sort of performance a 2080 Ti could push, particularly on the tensor cores, when fed the 1TB/s bandwidth HBM2 has been used at.

>using flops as a measurement for GPU performance

Attached: 1582410036315.png (733x510, 410K)

I play games not hardware. Is there any Japanese game that will utilize 12 teraflops?

Attached: serveimage(304).jpg (1280x720, 158K)

>full bc
Pre ordered. Goodbye xbox, you served me well but it's time to part ways.

If you'd pay attention you'd realize people are not only mocking the compute, but the inferior uarch of the PS5
Yes? It's become commonplace for Japanese devs to dump extra processing power into framerate because that doesn't require any extra work for them

Skipped the PS4, but if this is real I'm getting it.

Wouldn't these specs be for the PS5 Pro? We already know Xbox is doing 2 consoles, so wouldn't Sony do the same?

>13.3TF
I finally know where this number is from.
It's the Radeon VII.
What does the Radeon VII have to do with the PS5? Well, the Radeon VII's power is almost the same as the 5700 XT's, and the PS5's GPU is basically a modified 5700 XT. Before Navi came out, Sony used the Radeon VII as the PS5 devkit; that's why we keep getting 13.3TF numbers.

>RDNA2
This part? Tell me, "insiders": before Microsoft announced the Xbox SX specs, you guys kept saying the Xbox SX wouldn't be RDNA2 (both would be RDNA or RDNA 1.5), and after the announcement you "insiders" start saying the PS5 will be RDNA2 too? Do you think we're goldfish?

Also, 1.7GHz with 60 CUs should be 13.1TF, not 13.3. Stop posting fake leaks.

Well, the Xbox Series X is better than the PS5 in every way.

That's bullshit. But I believe it

"PS5 Pro" is damage control from retardera/sonygaf over the Github leak. They've been absolutely anally pained over it since the day it leaked. No one who has said PS5P is real has been revealed to be correct. Meanwhile the people leaking Lockhart were spot on about the 12TF, RDNA2, VRS, etc...

If there is a PS5 Pro it won't be a thing until mid way through the gen like 2023/2024

The PS5 won't be 13TF.
The Oberon leak already leaked the specs.

You forget that PC players expect 60fps while console players expect 20-30.
Also, it may be easier to code some special lightweight raytracing if you're making it for only one hardware target.

That's what I'm saying: the Radeon VII is almost equal to the 5700 XT in power, and that's why we have the 13.3TF number, because that's 13.3 GCN teraflops.

Nvidia slaves thinking these specs are just fantasy makes me laugh. You got conned paying $1000 for that 2080S. Deal with it.

Attached: Xbox_ShortBullets_JPG.0.jpg (1200x800, 51K)

Yes but this is the generation that was talking about 4k 60 fps performance.

See
You retard. RDNA2 is leaps and bounds better than GCN. AMD has unfucked their uarchs.

Yeah, and we know the PS5 will be 9.2TF Navi and the XSX will be 12TF Navi.
But I definitely think the XSX will be more expensive.

>It's Radeon VII.
Did you see this?
>pastebin.com/zeiA2rdQ
I mean it's obviously made up but I wouldn't put it past Sony to try something like that

reddit.com/r/XboxSeriesX/comments/fc9o0r/microsoft_attends_amd_financial_day_rdna2/

OH NO NO NO NO NO SONYBROS HOW DID WE LET THIS HAPPEN NOOOOOOOOOOOO

Are you even aware which console I'm talking about? I know the XSX is 12TF RDNA; I'm talking about the PS5's early devkit using a Radeon VII.

My bet is MS is trying to get the SeX to the same price as the PS5. Right now the X is cheaper than the Pro despite being newer and more powerful. MS is probably willing to eat the cost if it means undermining Sony.

>full scene RT using the cloud
possible but unlikely considering what exactly makes up an xCloud blade

but where the video games

Attached: 1574283230983.png (1258x926, 1.17M)

Vega will never go near a console. The radeon VII is a monster but the underlying architecture is really, really not designed to push pixels.

I like how every time you post this thread the GPU speed increases lol

That's physical. Digital backwards compatibility is entirely possible because your downloads are on one list across all Sony platforms.

>heartbeat monitors and built-in microphone
Thanks for the warning

Guys, this thread scares me because I was thinking of buying a PS4 brand new this month.

should I do it?

Attached: 3-1260.jpg (1260x710, 80K)

>PlayStation AI
absolute crap

Hell no. Buy a secondhand Pro for $200-$250 and whatever games interest you, or wait a couple months and see if whatever you're interested in gets ported to PC.

>i'm talking about PS5 early devkit using Radeon VII.
I doubt it, as RDNA is a huge departure from the GCN of old.

They're gonna undersell it and rely on Game Pass for sure.

Vega's double speed half-precision math optimization is in the PS4P
Remember "SECRET 8.4TF MODE!!" PS4P nonsense?

What if the secondhand PS4 Pro breaks down minutes after I buy it? There's literally no guarantee.

I think this too. MS has spent years rebuilding their image after the Xbone launch fiasco. If taking a loss on each unit is what's needed, then Phil will do it. MS, unlike Sony, has infinite money and can outspend them in hardware, software and services alike. MS won't give up this gen like they did the last. They want the 360 glory days back, and then some.

Attached: 1581086894502.png (801x629, 646K)

It's just the early devkit from before the 5700 XT became a thing.

Oh, and remember the report about the XSX devkit being late? That's because AMD didn't have a GPU of the same power to simulate the XSX's GPU back in 2019.

I know, but xCloud blades are made up of Xbox hardware.
Real-time full-scene RT would need a Threadripper and a datacenter GPU; I highly doubt they are crossfiring a shit-ton of xCloud blades together to do that.

Not him, but a PS5 devkit did use a Radeon VII, and that's where the 13.3TF rumors from early last year come from.

There's no guarantee with a new-in-the-box one either; buy secondhand off eBay and you'll get your money back if it falls apart immediately after it arrives.

Off the top of my head, Polaris was the first architecture to do FP16 at something more than 1:1 (Hawaii and Fiji certainly couldn't). Variable rate shading would be a cool feature if it ever actually becomes a thing, not that the PS4 is capable of it to my knowledge.

At least post a blurry photo of your badge or something, LARPer.

That rat jew Jason Schreier sure became quiet after SeX reveal. He was doomsaying MS's next console for weeks before SeX reveal.

Nope it was Vega. RDNA also keeps the double speed half precision math
en.wikipedia.org/wiki/Graphics_Core_Next#Precision_Performance_2
techpowerup.com/gpu-specs/radeon-rx-590.c3322
techpowerup.com/gpu-specs/radeon-rx-vega-64.c2871

>13.3TF @ 1.7GHz
Is this what the GPU can achieve, or what it will achieve in the PS5?
The CPU is weaker than AMD's comparable mobile CPU, and the DDR4 could be at 1600MHz for all we know.

>60 fps
Every fucking time new consoles come out, people talk about 60fps, and every fucking time devs target ~30fps to get higher-quality graphics and higher resolution.
Even if a console could theoretically pull 120fps or higher, devs will still target 30fps unless it's VR shit or something.

I just want the PS5 revealed already. Weaker or stronger, the Internet will become an absolute shitshow either way the pendulum swings.

Attached: triss.png (680x900, 820K)

Current xCloud blades are. In the future they won't use Xboxes; they'll use massive servers with Threadripper and stacks of RDNA 2 GPUs, the same way Stadia does it. They were forced to use Xbox Ones for xCloud because AMD doesn't make server solutions using Jaguar, or GPUs with DDR3 and ESRAM. Phil Spencer himself has said the real xCloud servers will be used for tasks other than gaming when they're not in use, which is what led people to think they're using Vega because of its high compute capability, but we now know RDNA 2, aka Big Navi, will have really good compute capability too. xCloud will use standard hardware built in collaboration with MS, hence the rumors that MS actually developed RDNA 2 hand in hand with AMD while Sony developed RDNA 1 with AMD.

That's just the peak theoretical compute.
1.7GHz would be the boost clock (the boost clock is what's used to quote compute).

Of course, it doesn't matter, because the math doesn't pass a sanity check and OP is obviously lying.
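That sanity check can be run by inverting the same peak-compute formula used elsewhere in the thread (64 FP32 ALUs per CU, 2 ops per clock; the 13.3TF and 1.7GHz figures are OP's claims, not confirmed specs):

```python
def implied_cus(tflops: float, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """How many CUs a claimed TFLOP figure implies at a given clock."""
    return tflops * 1000 / (alus_per_cu * 2 * clock_ghz)

print(implied_cus(13.3, 1.7))  # ~61.1 CUs, not the 60 OP claims
```

60 CUs at 1.7GHz only gets you about 13.06TF, which is presumably why OP's numbers don't add up.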

No server is going to run Threadripper.

>They were forced to use xbox ones with xcloud because AMD doesn't make server solutions utilising jaguar or GPUs with ddr3 and esram
fair point
>Phil spencer himself has said the real xcloud servers will be used for other tasks other than gaming when they're not in use which is what led people to thinking they're using Vega because of the high compute capabilities but we now know RDNA 2 aka big Navi will have really good compute capabilities. Xcloud will be using standard hardware built in collaboration with MS hence the rumors MS actually developed RDNA 2 hand in hand with AMD whilst Sony developed RDNA 1 with AMD.
Sauce? I'm interested in reading up on that.

EPYC, Threadripper, whatever, CPUs with absurd core counts and support for a fuckton of memory

This.
Believing 60fps will come to consoles is lunacy

>Being announced March 3rd
Being announced on the Switch's anniversary? For what purpose?

60fps games come out on console all the time... they just look like ass.

Can't find it for shit now, but it was a massive point of discussion on retardera around the first half of last year. He said something along the lines of: if there's downtime, the xCloud servers will be used for other tasks, which basically sounds like the next-gen xCloud servers are going to use standard server hardware, with a specific instance created using x amount of cores and x amount of GPU power for the Xbox emulation. Right now the whole Xbox OS runs in a hypervisor; it's how they make the back-compat games think they're running on actual Xbox/360 hardware.

Just to piss on Nintendo because they can.

Give me one AAA title that is not a fighting game and runs at 60fps on consoles.
[Spoiler] You can't [/Spoiler]

Battlefield 1
Gears 4/5
Halo 5
Doom
Phantasy Star Online 2
Call of Duty

>FPS games

Attached: 1579725370645.png (500x522, 167K)

Ya, no, it ain't going to be 3.4GHz on all cores unboosted; it's a mobile Zen 2 part, unless the PS5 is going to literally use 600W with the GPU.

>getting a console on day 1
just lol
wait 2 years at least

Gears and PSO2 aren't FPS games. You asked for 60fps big games on console, and there you have it. Also a lot of Nintendo games.

The PS4 wouldn't let you play downloaded PS3 games on it, so I see no reason to assume differently.

At best, we'll only get a few small handfuls of selected PS1/2/3 games and nothing else.

OK, well, there's probably some degree of truth to it. I just remember them repeating "LITERALLY AN XBOX IN THE CLOUD" over and over again, so I was wondering.
>It's how they make the back compat games think they're running on actual xbox/360 hardware.
I know games run in Hyper-V, but they do use the Xbone's Fission emulator to run OG Xbox/360 games, because those systems have some custom hardware not shared with the Xbone/PCs.

Might as well have given 60 FPS games

"Pro" console versions don't count

>heartbeat monitors in a fucking videogame controller
probably an amerifat-only feature

That's funny, 'cause I was just playing Battlefield 1 on a 2014 Xbox One last night and it was 60fps in Argonne Forest with a million lighting and shadow sources, endless explosions, weather effects, and audio sources.

If your heart rate is too high, the console calls an ambulance.
No more fapping to games for console players.

fuck off

Attached: 2643445 - K-On! Mio_Akiyama PatrickDJA.jpg (1996x3480, 578K)

They've said they can completely emulate the Xbox/360 in software without any special hardware assistance. And they aren't wrong to claim it's "an Xbox in the cloud": if the Xbox uses 8 Zen 2 cores, for example, they just designate that many cores from their already existing EPYC servers to that instance until the player logs off. Stadia does that right now. Putting whole Xboxes in a server would be costly and take up a massive amount of server space. Also, there's the fact Sony will be using xCloud for PS5 streaming, and there's no way in hell Microsoft would allow Sony to put their hardware in Microsoft's servers. They'll use similar hardware and run the PS5 cloud stream in similarly virtualized servers.

Damn, in a video I saw it drop to 50, so I was pretty impressed, but then I saw in the comments that
>dynamic resolution
>mainly 900p, dropping to 720p
Well, they tried. I'm still surprised this game actually runs at semi-60fps.

A mobile Zen 2 8-core boosts to 4.2GHz; what OP's PS5 is using would have to be overclocked to reach 3.7GHz.

The only time you notice dynamic resolution is when it's done really poorly, or in autismal video comparisons. In BF1's case it's done well. I honestly never even knew the game had DR until reading this, because it always looks clear and it's easy to spot things no matter the distance. At worst you get the reliable old-school fixes, where textures drop in quality at long range. But considering how much is going on at once with that amount of players, it's solid 99% of the time. Easily DICE's biggest production and most ambitious project ever, and it shows. Damn it, now I'm going to start autistically foaming at the mouth over how shitty BF5 was.

Attached: 1427864097325.png (601x595, 312K)

Shareholders will never allow Xbox to lose billions.