Next gen consoles tflops

For more comparison

Xbox Lockhart = GTX 1060++ at $299
PS5 = GTX 1080 Ti at $499
Xbox Anaconda = RTX 2080 Ti at $499

These will be the biggest generational leap ever: stable 60fps, better AI, better graphics, raytracing, fastest loading times ever! No one is holding back!

Attached: IMG_20190422_194947.jpg (950x632, 29K)

Other urls found in this thread:

youtube.com/watch?v=oJ-kW6UZ5z8
pcpartpicker.com/list/xL3HcY
pcpartpicker.com/list/Ygvb3b
businessinsider.com/consumers-are-watching-less-tv-on-their-televisions-2018-5
anandtech.com/show/11992/the-xbox-one-x-review/3
en.wikipedia.org/wiki/Xbox_One#Xbox_One_X
eteknix.com/ps4-developer-kits-spotted/
youtube.com/watch?v=dGa_7Ds13Ls
twitter.com/NSFWRedditImage

By the way, they are using Zen 2 CPUs so don't worry about them being held back.

But will Xbox have games?

these are still shitty overpriced PCs

What in the fuck even is a tflop

>500 dollars
>overpriced
>for a small form factor PC with supposedly Zen 2 and motherfucking RTX 2080Ti GPU performance

>MUH FLOPS
Just as dumb as the 'bits' debacle back in the day.

>supposedly
is the key word here
you're dumb if you believe console marketing, and even dumber if you believe rumors about consoles that aren't even announced yet

PS5 is already confirmed to be more than 16Tflops though, this chart is made up.

FLOPS = floating point operations per second, which is the standard basis for how fast a GPU is. A teraflop is 10^12 FLOPS. Floating point operations are used to process things like linear transformations (when shapes change size or rotate on a screen), so they're important in graphics.
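
To put rough numbers on that (an illustrative Python sketch; the 6 TFLOPS input is just the commonly quoted Xbox One X figure, nothing to do with next gen):

import math

# A teraflop (TFLOPS) is 10^12 floating point operations per second.
def to_tflops(flops_per_second):
    return flops_per_second / 1e12

# The kind of work being counted: rotating a 2D point takes
# 4 multiplies and 2 adds, i.e. 6 floating point operations.
def rotate(x, y, angle_rad):
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x * c - y * s, x * s + y * c)

print(to_tflops(6e12))                 # 6.0, roughly the Xbox One X figure
print(rotate(1.0, 0.0, math.pi / 2))   # (~0.0, 1.0)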

You don't buy an Xbox for "muh exclusives", you buy an Xbox for the experience and privilege of being part of the Xbox Fanbase and being able to fully appreciate Phil Spencer's basedness.

i never understood why consoles advertise "teraflops" when no one knows what the fuck they are

it is practically impossible to build a PC with similar specs to a modern console for less money

BIG NUMBER GOOD

Devkits are already out. PS5 will be announced late this year and release early 2020, and Xbox Scarlett will be revealed this E3 and comes out in late 2020.

thats what i was thinking, even google's laughable streaming thing said it had more teraflops than the rest

>which is the standard basis for how fast a gpu is
Only in the console world

>GPU power in tflops
>better AI
>fastest loading times
>raytracing just because of muh flops
I fucking swear to god you retards need to stop.

teraflops is a pretty good indicator of raw power.

>people think the xbawx will actually have a full 2080 inside it
Even if it's a 2080 it will be underclocked big time. Also I'm pretty sure both consoles are going AMD again

Diehard consolefag here. 60 FPS will never be the standard. Devs are more interested in graphical fidelity. Which is sad, but the games are enjoyable, so meh.

Because in the past the system-on-chip solutions used by consoles were an odd mix of extreme low-end CPU cores coupled with mid-range graphics hardware which were never offered on the general market to system builders. Plus any OEM who might have tried to offer such a system would have run head on into competition with Microsoft which the company has never dealt with very well.

Now the consoles seem to be switching to what will be the common mid-range laptop-grade SOC hardware that will be generally available to OEMs, and the solutions for console-style PCs outside of the limitations imposed by Microsoft on OEMs are much more mature than they were in the past.
The days of custom hardware are basically already over, the era of closed platforms might be over soon as well.

>source: my ass

console devs will ALWAYS prioritize resolution>frames
>muh 4k
>muh 8k

a FLOP is just a unit of measurement and FLOPS is just the rate of that unit. It's been used for computing performance for 50+ years. Not hard to understand. Basically this:

So XBOX is releasing an Xbone and an Xbone X again, except this time it's available straight from release day. The more things change, the more they stay the same. I remember Nintendo tried releasing a white 8gb Wii U model, and that was just a different hard drive, not different specs. How'd that work out for them?

This time it will be ray tracing taking the games down to cinematic 24fps

>tflops to measure gpu power
rtx 2080 capable of 10.1 tflops
radeon vii capable of 13.8 tflops
rtx 2080 either equal to or better than radeon vii in almost every single game, also these consoles have amd socs so prepare your anus to be disappointed again

>xbox will be 599 US dollars
The memes have come full circle

>Now the consoles seem to be switching to what will be the common mid-range laptop-grade SOC hardware that will be generally available to OEMs
There's no SOC with the specs of the next gen PS5. The GPU is far better than anything available.

FUCK I BOUGHT A RTX 2080 FOR $1100 CANADIAN DOLLARS. FUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUUCK

Attached: rtx_2080_trio.jpg (1000x664, 297K)

>what are economies of scale anyways

What's the conversion from Canadian dollars to italian lire?

Great question user. What are economies of scale anyway? This may be a question better left to future generations.

Italy uses the Euro

>still buying consoles

Attached: becca_face_unimpressed2.jpg (539x558, 85K)

>CANADIAN DOLLARS.
dude im still using my 670. Fucking things are so expensive. I don't play enough new games to justify that kind of money.

Attached: kill_me_please2.jpg (3250x4425, 748K)

I legit thought this was that trans suicide rate meme chart.

>better ai
that would require devs who aren't lazy and don't have 'diversity' requirements

if devs unionize it'll literally never happen

>Believing the lies
Nigger they said the exact same shit about the ps4. The PS5 will be equivalent to a 1070 and a mobile processor.

>talking about nvidia cards when the consoles will have AMD cpu and gpu
Retarded consolefags

>There's no SOC with the specs of the next gen PS5.
Not yet. The economics of consoles though demands a single chip solution, and process improvements essentially guarantee it will happen.

Won't PS5 lose if its specs aren't stronger than Anaconda's?

Nearly every console generation has been won by the weakest system because it's usually the lowest price that wins.

This is legit, i know you are scared by this. Next gen consoles will render pcs useless, just overpriced tablets. Xbonex already does phenomenal and competes with a gtx 1080 while only being 6tf with a tablet cpu.

>console
>no mods
I just can't get back into them

>consoles will render pcs useless,
Are you retarded?

>no games to show off that power

You can't have a 50% more powerful system for only $100 more, that doesn't make any sense.

What did they call the xbone and ps4 again? Supercharged PCs or something?

Attached: 1508724173483.jpg (400x315, 47K)

Gaming wise. Pc gamers shouldn't be allowed on this board to begin with, retarded epic games store rants and sjw shit.

>tfw minecraft in 8k

>ray tracing
>consoles

They all use amd chipsets
Because amd is cheap and ez to customize
So no goyvidia exclusives 4 u

Attached: 2005.jpg (200x200, 15K)

Ps4 marketing lies.

My anaconda don't
My anaconda don't
My anaconda don't want none unless you got games, hun

PC has all the games though

>book isnt a masterpiece if i dont immediately read it again
>movie isnt a masterpiece if i dont immediately watch it again
>song isnt a masterpiece if i dont immediately listen to it again
>food wasnt a masterpiece if i dont immediately eat it again

most retarded logic

wrong thread

>Supporting Microsoft

Attached: hmmm.png (2688x2688, 173K)

AMD demoed real-time Ray Tracing with standard Vulkan over a year before Nvidia added hardware to keep up.
As always AMD hardware is great and held back only by the absolutely incompetent Windows driver team.

Attached: 1534901658590.png (600x499, 28K)

build me a $400 PC that can play medium settings at 4k.

Attached: 1553021173091.jpg (1077x1078, 529K)

>1 bazillion memeflops
you are still going to play at 30 fps if you have a 2ghz CPU

This. Idk why retards think power is everything.

Show me a console that does it

4tflops makes no sense for the mainstream Xbox SKU unless 4 NAVI tflops = 4 Polaris tflops. I would think that MS would make it equivalent to the X on a GPU basis.

Those are AMD FLOPs, so it's actually
>Lockhart
Sub-RX 570 (5 TFLOPs)
>PS5
Sub-Vega 56 (10.5 TFLOPs)
>Nextbone
Sub-Vega 64 (12.5 TFLOPs)
So based on these specs, these machines will be outperformed by an RTX 2060.

I would expect a 12.5 TFlop console to be close to a 2080, at least better than the 2070.

Didn't you guys say this exact same shit about the PS4, PS4 Pro, and XboneX?
This ain't it chief.

No, it won't.
Because it's using AMD hardware.
TFLOPs alone don't mean much if we don't know anything about clock speeds, memory speeds, or even compute core counts.
Hence why a 12 TFLOP Vega 64 is largely outperformed by a 6.5 TFLOP RTX 2060.
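
A minimal sketch of that point, assuming purely for illustration that the two cards land at roughly the same real-world game performance (the thread itself argues about the exact gap):

# Paper TFLOPs vs an assumed equal game performance (illustrative only).
gpus = {
    # name: (theoretical FP32 TFLOPs, assumed relative game performance)
    "Vega 64":  (12.5, 1.00),
    "RTX 2060": (6.5,  1.00),
}
for name, (tflops, perf) in gpus.items():
    print(f"{name}: {perf / tflops:.3f} performance per paper TFLOP")
# If real performance is similar, the Nvidia card extracts roughly twice
# as much from each theoretical TFLOP, which is the point being made above.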

A 2080 alone costs $700, how the fuck are they gonna release a console with a BETTER gpu for $300 less? This is insane. At best we'll see 1060 performance

Fuck did you shit something out and call it a source or something?

We've been through this OP, Xbox Scarlett Anaconda will not be over $499, that's the max.

Insane, where the fuck do you go after next gen?

You think Microsoft will take that much of a loss?

You do realize, they are just using the next generation of GPUs. The ones that will be available on PC before they become available on console. Sony and Microsoft aren't making their own super secret chips, they are just buying AMDs solutions and having them slightly modified.

>xbox releases with 2080 inside it for 499
>buy xbox and move the 2080 to my pc
What now consolefags?

xbox one x

Oh, you were talking about cinematic 24fps

The way I justified my RTX 2080 was to play my main game (Black Desert) on the highest settings. It's just fucking amazing playing in remastered mode at 90+ fps. A luxury for sure at 1440p.

Have you tried playing any good games on it?

Ahh, this takes me back.
Consolefag stupidity will never cease to amuse me.

Attached: Next gen 8.jpg (2048x1152, 461K)

No, maybe you can tell me some good games :)

nope 60 fps

youtube.com/watch?v=oJ-kW6UZ5z8

The uarch will be a Navi derivative, not just a repackaged Vega. AMD will likely have improved memory compression and their version of hardware based tiled deferred rendering. Also Navi is supposed to be like Maxwell, Polaris, and Turing in that it is designed for high frequency with large localized caches. So I would expect to see the gap between AMD TFlops and Nvidia TFlops close. Memory bandwidth will be in the ballpark per TFlop of Nvidia gpus; maybe in the best-case scenarios Nvidia will be significantly ahead due to better memory compression.

Also the Radeon VII might not be a great card, but its performance is around the 2080, not the 2060.

>500 dollars
>600 dollars
Do they just want to fail or something?

>Checkerboard rendering
>Dynamic resolution
Oof

> supporting sony

>Ps5 = gtx 1080ti

Attached: 1530206602558.jpg (500x616, 54K)

Are you stupid? They took that much of a loss with the X. Microsoft has an endless amount of cash, please stop getting them confused with Sony.

>Also the Radeon VII might not be great card, but its performance is around the 2080 not the 2060.
The Radeon VII is a 14 TFLOP card.
The RTX 2080 is a 10 TFLOP card.
Sorry m8, but AMD hasn't delivered on the GPU side since GCN 1.0, and considering Navi's been delayed like 4 times, they won't deliver this time either.

only piss4 uses checkerboard.

and oh wow 90% of 4k at the lowest. only 8 million pixels some of the time instead of 8.8

BotW

pcpartpicker.com/list/xL3HcY
And this one'll be able to play games like Forza Horizon 4 at 4k30 on ultra settings.

Attached: FH4 benchmark.png (1323x2566, 115K)

If the Radeon 7 has 4 more TFLOP than the 2080 how come it performs worse?

They're poorer than Sony. What are you even smoking. They can't take that loss with the Anaconda.

kek me too, I can't see 3 bars getting shorter the same way ever again

They subsidise it by using their shit ton of cash they made from the current gen, in order to stay competitive for the next gen.

Oh boy, a literally second generation with no games because companies spend 90% of dev time on bullshit visuals

>only piss4 uses checkerboard.
Nope, XboneX also uses checkerboard rendering in BFV.
As seen in this image thanks to the horribly jagged bridge.

Attached: bfv-xb1x-s1.jpg (3840x2160, 1.79M)

> pcfriends still in denial
The Xbone X proved to be just as good as a midrange PC with a 1060 6GB for a fraction of the cost considering it's an all-in-one package that costs basically the same as the graphics card alone, and native keyboard/mouse support is already a reality in most games
PC will be relegated to a more expensive alternative for porn games and porn mods only.
We're going home bros

Attached: 1522346003723.jpg (982x853, 329K)

>14TFlop/10TFlop = 1.4 ratio
Even if the uarch does not improve the equivalent of a 12.5 TFlop AMD GPU would be a 9 TFlop Nvidia GPU. That would be above the 2070
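
The arithmetic behind that conversion, spelled out (the 1.4 ratio is the poster's own assumption from the Radeon VII vs RTX 2080 comparison):

# If ~14 AMD paper TFLOPs trade blows with ~10 Nvidia paper TFLOPs,
# treat that as a conversion ratio and apply it to a 12.5 TFLOP console part.
amd_to_nvidia_ratio = 14 / 10       # ~1.4, the poster's assumption
amd_console_tflops = 12.5
print(amd_console_tflops / amd_to_nvidia_ratio)   # ~8.9 "Nvidia-equivalent" TFLOPs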

no windows +$150

designated shitting street tier m/kb

motherfuckin HDD

no monitor + $400

shitty no name brands - enjoy your house fires

>posting beta screenshots

holy shit how do you even breathe

>wanna build a PC nowadays that is PS4 Pro in terms of performance
>Cheapest GPU is $150
>RAM prices out the ass even when they've already gone down significantly

pcpartpicker.com/list/Ygvb3b
This is a "Xbox One X" tier PC. For around $100 more you could get a better CPU and SSD but still fucking $600 for what can be considered to just barely be better than a current gen console is fucking bullshit. Admit it, the PC days are fucking over because of how expensive all this shit is. If you want a truly good rig that will last you the entire gen you need a god damned $1000-1500 PC because of how inflated GPU prices are. This situation might improve a little once AMD releases their Navi GPUs but as it stands, this is the best price:performance PC you can get

>the equivalent of a 12.5 TFlop AMD GPU would be a 9 TFlop Nvidia GPU.
No, the equivalent of a 12.5 TFLOP AMD GPU would be an RTX 2060, as we've already discussed.

Who is making these magic gpu's? AMD? The company that just released a card as powerful as a 1080ti for $700? They're going to develop a gpu as powerful as a 2080ti that fits in the $500 price range?

>save a couple hundred on hardware
>all games are more expensive
>nowhere near as good sales
>have to pay a subscription fee to use your already paid for internet

I hope Xbox wins next gen, fratboy gaming needs to come back to weed out the effeminate movie game faggots.

Attached: 1465257091884[1].jpg (1360x452, 276K)

>no windows +$150
Windows is free.
>motherfuckin HDD
The XboneX uses an HDD.
>no monitor + $400
XboneX doesn't come with a TV

Sorry m8, but just admit that you lost here.

Attached: 1508991076815.jpg (565x575, 36K)

Oh FUCK yeah, have you played MD:HW2?

Just let the consolefags believe user, they need this.
They don't want to believe that their platforms will be replaced by streaming.

it’s the same cycle every 5 years. some bullshit charts with no sources and endless speculation until everyone ends up getting raped by both Sony and Microsoft

MS nabbed up a bunch of studios so i hope
skipped out on 8th gen consoles myself, hoping the 9th gen sucks less

Dude I fucking wish this was true. I could have 2080ti for $200

The GPU cost will be subsidised by Sony/Microsoft cash reserves and they will probably only be sold to these two companies.

>people actually BUY Windows in 2019

Attached: PapaG.png (292x257, 100K)

This

>RX 570
Jesus fucking christ user why. Also stop talking out of your ass, I have a Ryzen 1200 + 1050 Ti PC and while it is certainly a powerful little machine that can run DMC5 at 60fps 1080p, don't fucking overhype things. At best that PC is roughly equivalent to a PS4 Pro in performance

Jesus dude, do you have any idea at how much of a loss they'd have to sell this shit at?

MS releases all their games on PC too

yeah they got ninja theory

any niggas reading this that havent played Hellblade: Senua's Sacrifice better get on it

The RX 570 takes a giant shit all over the 1050 Ti, the fuck are you talking about?
Especially in RE Engine games like REMake 2, DMC 5, and RE7.
Your fault for buying Nvidia in the sub-1070 range.

Attached: Screenshot_2019-04-22 Devil May Cry 5 тест GPU CPU Action FPS TPS Тест GPU.png (900x993, 128K)

memes back then had so much soul

Why would they use Nvidia GPUs? Seems very unlikely.

Well fucking excuse me for not having the money to buy a RX 570 when it was fucking $300 because of cryptotards. Also if you really were gonna recommend the 570 might as well have gone with the 580, you know an ACTUALLY GOOD GPU.

Actually scratch all that might as well wait for the Navi GPUs at this point because only a retard would buy a 500 series AMD GPU at this point in time

Nigger, the RX 570 is $50 less than the 580, it is THE king of price-performance at the moment and no other GPU even comes close.
Fuck outta here Nvidianigger.

They take temporary losses in order to secure market share for the future generation. Both Sony/Microsoft will use their cash reserves to subsidise the costs of the new consoles. If one of them decide to not do that it would mean that their console would cost $200-300 more than the other. The consumers of course would choose the cheaper option, and the company whose console costs more would lose significant market share and thus future profit.

Also you do know the only reason the RX570 gets those framerate specs is because DX12 right? All DX12 games have been confirmed to run better on AMD cards, but running the games on DX11 gives better and smoother performance on Jewvidia GTX cards

What fucking GPU is the Anaconda going to have?

>literally say to wait for the Navi GPUs because the 500 series will be obsolete soon
>durr nvidia shill
never change retards

You can't compare tflops like that lmao. Vega 64 and 2080ti almost have the same tflops, consoles use AMD hardware so the top $499 console will be V64/1080 performance, not fucking 2080ti

The ones below are weaker than a Vega 56, most likely inbetween 580 and V56 performance which is where low end Navi will be

Wow, I'm not at all surprised that an Nvidianigger is also a tech illiterate.
No, RE Engine games run better on AMD PERIOD because the engine is extremely compute-heavy.
You fucked up by paying $180+ for a 1050 Ti, a GPU that's literally RX 560 tier.
Yea, just like the 300 series made the 200 series obsolete.
And the 400 series made the 300 series obsolete.
And the 500 series made the 400 series obsolete.
Oh wait, none of that happened because the days of insane performance increases in GPUs are over.

Attached: 1080p_Medium.png (1328x2839, 159K)

Pretty much dis. I thought Radeon VII would btfo the 2080 because of its high tflops. Nope, it just got btfo'd by the mere 2080. It's all about good drivers. Nvidia is excellent at making drivers that can run all games better.

The one dipped in magic dust to make it go real fast

Drivers actually have nothing to do with it, Nvidia usually does better due to architectural differences and vastly superior clockspeeds and IPC.

>Used to despise fratboy Halo/COD consolefags
>Now would rather have the culture back more than ever

That era was full of fucking casuals but at least those people still had backbones. If only I knew how bad things would really get.

Attached: 1368159563362.jpg (680x517, 524K)

The fact that SJWs completely took over console gaming should tell you that those people never actually had any backbone, they were always pussies.

You are one stupid nigger

Are you aware of the cooling required to keep a card like a 2080ti running even at stock speeds? A comparison like this is moot considering the form factor of consoles.

They weren't pussies, they just got banned.

Joke's on you retard I got the 1050 Ti at $140 because at the time a 570 was FUCKING $300 THANKS TO BITCOIN RETARDS FUCK YOU GOD DAMN CRYPTOJEWS. If I had a say in the matter I would've obviously bought a 580 even
>And the 500 series made the 400 series obsolete
They're literally the same architecture you monkey, BIG difference with the Navi cards that are coming out. Enjoy playing on outdated hardware when you can just WAIT and get a top end card for relatively the same price. What do you honestly prefer? Being cucked forever by the nvidia jew and paying $1000 for a top end card or maybe waiting just a tiny little bit and seeing what AMD does with actual good, affordable GPUs based on new architecture?

Imagine not being forced to pay $500 for a fucking Vega card, wwow such a concept

Yeah, now we have to deal with woke fuckers.

Consoles will always be more expensive because piracy is easier on PC.
Seriously Yea Forums, calculate how much money you've spent on games in the past 7 years on your console, and compare if you'd just bought a midrange PC in 2013.

>7 year generation
>A new game every month = 12 games
>7 x 12 x $60 = $5,000

You can buy an absurdly powerful future proof PC for that money that outperforms every console by a huge margin.
"Next gen" consoles are just midrange PCs that will become outdated as the gen progresses. There is literally no reason to own a console except exclusives.

I haven't bought a game since 2012, and pirate over a thousand dollars worth of games every year, most of the time not playing them, but it's all available at no cost and no risk.

Attached: 1536130260114.jpg (600x600, 32K)
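
The back-of-the-envelope spending math from the post above, written out (the game-a-month habit and $60 price are the poster's assumptions):

# Game spending over one console generation under the poster's assumptions.
years = 7
games_per_year = 12        # "a new game every month"
price_per_game = 60
print(years * games_per_year * price_per_game)   # 5040, i.e. roughly $5,000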

>BIG difference with the Navi cards that are coming out.
Hey, dumbfuck.
Navi is still GCN.
Are you fucking stupid or something?
And I can go out and get a Vega 56 for $270 right now because I don't live in a fucking shithole.
You are utterly moronic if you think the performance difference is going to be #insane.
The fucking RX 400 series made the jump from 28 nm to 14 nm AND brought in a new architecture (Polaris, but it was still fucking GCN so it's not really a new architecture, just like Navi) and the 480 STILL just BARELY competed with the R9 290.

Dont forget the online fee which can almost add up to the price of a high-mid range GPU upgrade every 3-4 years

>a new game every month

>not mastering one game for years

Casuals. Pathetic casuals.

Attached: skinner.jpg (310x163, 7K)

Don't fucking kid yourself user the only actual reason why PC is cheaper is because you can just get new games for $40-$50 on key sites instead of being jewed by official stores. And even that is going away if the Epic Store meme goes on because the fucking chinks are somehow even worse than actual jews

And did nothing to fight it, instead they sat down and took it.
So, again, pussies.
Believe it or not, the average person doesn't give a fuck about anything, so they'll never complain about getting shit on by companies.
Hence why consoles are such a fucking shithole for consumers nowadays.
Lotta faggots who lie down and take it up the ass because they don't know any better and they don't care.

You're talking 100s of millions of dollars in loss. You'd have to be a legitimate retard if you truly believe what you just puked out on your keyboard.

>most of the time not playing them

classic Yea Forums

Everyone has a TV, not everyone has a monitor, not everyone's going to sit on the floor and operate their PC from the carpet or have shitty wireless on a coffee table, it's just not viable for most people.

You can get them for $0 on the CPY/CODEX/3DM/SKIDROW etc. stores. No reason to pay.

businessinsider.com/consumers-are-watching-less-tv-on-their-televisions-2018-5
Tick tock boomer.
Zoomers don't have TVs, and they'll be doing all their gaming on their phones via streaming.
Even millennials are likely to not own a TV unless their boomer parents already have one.

Attached: 59a07df379bbfd2b008b4e80-750-563.png (750x563, 84K)

I actually wanna play some of these games online user

Such as?

Both companies made billions with this gen of consoles. Even if they invested $1-2 billion into subsidising the console costs, they would still make billions in future profit. Sony's cash reserves alone jumped from $9 billion to almost $15 billion thanks to the ps4.

Get out of your mothers basement, go to your local community college, and take a basic macroeconomics course so you understand how companies secure market share.

Monster Hunter World

Attached: 1477413380804.png (657x527, 44K)

Do you people really not own a desktop computer? How are you even posting on Yea Forums?

Attached: 1531054783283.jpg (1280x720, 499K)

They're phoneniggers.
Which is ironic, considering phones are going to kill console gaming.

>company profits
>macroeconomics

>as we've already discussed.
except you are wrong nigger, an aircooled Vega 64 is 12.5 Tflops and it trades blows with the stock 2070

How is it going to feel when you're wrong?

Out of all my friends, I'm the only one that owns a desktop. They all have a laptop which they do all their important stuff on. Most normies really don't need anything beyond a basic screen for their activities. Also console gaming is still a majority for most people, then couple that with the fact that Smart TVs are now much more popular because they come with both Netflix and Youtube so there's no reason to get a desktop other than being a videogame enthusiast. Most people nowadays are phoneposters, shit even tablets are dying out because phones are just so much more convenient.

>and it trades blows with the stock 2070
Maybe in your dreams, but in the real world it trades blows with a 2060.

Attached: relative-performance_2560-1440.png (500x890, 51K)

Nice source.

PS5 will be over 8TFs at $399.

>PS5 1080Ti $499
Doubt.jpg
>Anaconda RTX 2080Ti $499
c'mon you're not even trying. More realistically

>PS5 = Vega 56 $499
>Anaconda = Vega 64 $550

>Also console gaming is still a majority for most people
You mean phone gaming.
I have not had a single conversation about games with co-workers that wasn't about phone games.
Especially women.

I'm still waiting for rare replay, gears of war 2, 3, judgement; fable 2 and fucking halos

I guess? Everybody only uses their phone for the following
>messaging
>Youtube
>Instagram
>Spotify
I don't know of a single person who uses their phone to play games except for my ex, she loved Risk and even bought the pack thingy so she could play indefinitely, we used to play together and she was just so cute and innocent when it came to small phone games like that. I fucking miss her ;_; /blogbpost

The people who only use their phones for those things are the same people that don't play video games at all.
Anyone who would be playing on a console also plays on their phone, and tons of people don't own consoles and only play phone games.

chad is throatfucking her every night

I don't know how much the end consoles will cost. But I assure you that both Sony and Microsoft subsidise the cost of their consoles.

>Cheaper Console = Higher Market Share = Higher Stock Price = Happy Investors

Both companies don't give a fuck about current profit. They are only interested in market share for the next gen.

Dunno, know a couple of people who don't game on phones but have their own consoles but I guess you're right

I doubt they will be using those exact graphics cards.

Attached: A3E.jpg (320x484, 22K)

They're not going to sell a console with the equivalent of an RTX 2080 Ti, you fucking ape

You can't compare AMD FLOPS to Nvidia FLOPS

Not one of the Next gen Consoles will get close to RTX 2080 Ti level of performance & devs will continue to target 30fps for better graphics

Attached: 74557.jpg (457x386, 41K)

>Both companies don't give a fuck about current profit
You're an idiot and don't know dick about the stock market.
Shareholders only care about quarterly performance and profit, low/no profits = unhappy investors = lower stock price = less money

Consoles are the only topic where people unironically use flops as a unit of measurement for comparison, and the number is only used by marketing types as a way to quantify how much better the current gen console is than the last gen, despite not being a useful measurement at all.

This tells us jackshit.

Don't forget 8K resolution output that no-one will have a TV for but the console will still render it in that resolution.

PS5 is literally just using a downclocked Navi 10 & a Ryzen 2 8 core, these will both be coming in 2019 for about $200-250 each

These super special Console parts are just off the shelf PC parts

you mean upscale it heavily?

>i dont understand business
comp "science" faggots are illiterate niggers

Profit is fucking irrelevant in the stock market. The only thing that matters is revenue and market share. That's why all these tech company stocks like Netflix, Amazon, Alphabet etc are through the fucking roof.

Those people just stopped gaming as a whole.

see

Dude we have seen Navi, it's just Vega with worse compute, also confirmed by Digital Foundry more or less

Navi is not going to be magic since it's stuck with GCN limitations due to wanting BC with PS4/Xbone

>Profit is fucking irrelevant in stock market.

Attached: 1356800578331.jpg (1024x1024, 376K)

>were going home bros to paid online & no exclusive games

Okay

8 tflops for PS5 is 100% believable.

If you take the RX 580 and give it a 30% boost due to the shrink to 7nm plus some mild GCN architecture improvements that puts you right there. I have no idea what the xtwo would use to put it near 12 tflops though.
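
A quick sketch of that estimate (the ~30% uplift is the poster's assumption; 6.17 is the stock RX 580 figure):

# RX 580-class throughput plus an assumed ~30% from the 7nm shrink and GCN tweaks.
rx_580_tflops = 6.17
boosted = rx_580_tflops * 1.30
print(round(boosted, 2))   # ~8.02 TFLOPs, which is where the 8 TFLOP PS5 guess comes from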

>profit matters in the stock market

If that was the case then Tesla's stock would have already been at 0. No one cares about profits in the stock market.

Stock prices are also affected by speculation. If you sell a console with the capabilities of an rtx 2080 for $400, guess what the value of those stocks is gonna be? Also in the long run the shareholders with patience are gonna make a huge profit. Thank god people like you don't manage any business

You're a fucking mongoloid.
>if you sell a console with capabilties of a rtx 2080 for $400
You are also a fucking mongoloid.

Glad I'm fucking done with consoles.

tflops is a meme

UserBenchmarks: Game 116%, Desk 108%, Work 105%
CPU: Intel Core i7-8700K - 117.2%
GPU: AMD RX Vega 64 - 125%
SSD: WD Blue 3D 500GB - 110.5%
SSD: Samsung 850 Evo 250GB - 104%
HDD: Seagate Barracuda 7200.14 2TB - 56.4%
HDD: WD Blue 6TB (2015) - 81.8%
RAM: G.SKILL Trident Z RGB DDR4 3200 C16 2x8GB - 95.2%
MBD: MSI Z370 GAMING PRO CARBON (MS-7B45)

never buying another console in my life

Based becca_face_unimpressed2.jpg poster

vega 56 is already ~10.5 tflop and in theory they could do the same thing with that as you just explained, with the 7nm shrink and some refinements here and there. vega 56 cards are going for as cheap as the rx 580 now ($250) so i guess if amd are able to sell it at that much, microsoft would be able to get it in bulk for cheaper obviously. don't know if that $250 price is just to price itself lower than the rtx 2060 though, they might be taking a big loss on it because it was once a $350-400 card.

>hurr durr I lost the debate and have no more arguments
>whattodo.jpg
>oh yea I will call him a mongoloid

absolute STATE

>and give it a 30% boost due to the shrink to 7nm plus some mild GCN architecture improvements
Oh my fucking god consolefags are so delusional it hurts.
You people actually think you're going to see a bigger boost in performance moving from 14 nm to 7nm than we did moving from 28 nm to 14 nm.
Or actually, I mean 10nm, since the "7nm" AMD is using for their upcoming products isn't actually even 7nm.
Fucking insane.

There is something worse than a computer "science" retard and that is a economics major subhuman, that think their field is science. So which are you?

Good, less nigger faggots in psn and xbox live

Based. Love MHW myself in BoneX

Attached: 20180707_193441.jpg (3264x1836, 1.35M)

Careful with that user, wouldn't want to get banned from XBL for using the n word or the f word.

he's right you clinical retard. amd officially said going from 14nm to 7nm (the 7nm process they're using right now) provides a 25% performance boost at the same power.

Attached: AMD-Radeon-VII-Specs.jpg (1234x642, 42K)

more like lockshart

>There is something worse than a computer "science" retard and that is a economics major subhuman

Yea there is, a fucking jobless NEET like you that doesn't even have a highschool degree and doesn't even know how modern corporations operate.

Teraflops are for console peasants who don't know anything about GPUs, what does 1 Teraflop equate to in core count, clock speed, and Vram?
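
For what it's worth, the paper number comes from shader count and clock alone; VRAM never enters it, which is part of the complaint. A minimal sketch using published reference specs:

# Theoretical FP32 TFLOPs: shader cores x boost clock (GHz) x 2,
# since one fused multiply-add counts as two floating point ops per clock.
def theoretical_tflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2 / 1000

print(round(theoretical_tflops(4096, 1.536), 2))   # Vega 64 reference: ~12.58
print(round(theoretical_tflops(2304, 1.266), 2))   # RX 480 reference: ~5.83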

And they lied, see
The Radeon VII actually has slightly better specs than the Vega 64 AND has a smaller process and it's only around 10-15% faster.
This is the same shit they pulled when they said that 2 RX 480s would outperform a GTX 1080*
*Only in this one specific instance that won't be seen in any other game.

retard. it only needs cooling if its outputting mega graphics. my 2080 ti fans dont even kick in unless I play at 4k (but I dont usually bother because I prefer max settings with high FPS and 1440p)

You could actually play MHW online with a pirated copy you frog posting retard

Can I buy a used 4K-capable PC for $260 usd? If so I honestly wanna buy one

Attached: Screenshot_20190423-005512.png (1440x2560, 1.2M)

Handheld bro here (unless you count the Switch as a home console now), is it really that big of a deal for the Xbox 3 to be better than the PS5? They're both ~10 TFlops which is downright excessive while at the same time insignificant compared to Stadia. Just how many flops do you guys need when you drop the game like a hot pocket 2 days after buying it?

If there's anything to be interested in, it's the introduction of the blockchain tech to vidya consoles. Then you'll finally have a way to trade digital games.

>checkerboarded dynamic resolution """"4k"""" at 30 fps with shit tier medium setting console graphics

Why would you do that?
Frames are more important than resolution.

Nah, because a 4k capable PC isn't so shitty and worthless that someone would want to get rid of it.

are you fucking retarded? the vega 64 has 64 compute units, the vega 7 only has 60. if the vega 7 had 64 compute units and the same clockspeed boost applied, it would achieve that. the ps4 pro and xbox one x only have a 4 CU difference between them yet that 4 extra cu in the xbox one x paired with a higher clockspeed gives the xbox a whopping 40-50% performance advantage over the ps4 pro.

educate yourself moron

Attached: f4fnktxiyqw11.jpg (2560x1280, 218K)
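
For reference, both consoles' paper TFLOPs follow directly from CU count and clock (GCN: 64 lanes per CU, 2 ops per lane per clock), using the published clock figures:

# GCN paper TFLOPs from compute units and clock speed.
def gcn_tflops(compute_units, clock_mhz):
    return compute_units * 64 * 2 * clock_mhz / 1e6

print(round(gcn_tflops(36, 911), 2))    # PS4 Pro: ~4.2 TFLOPs
print(round(gcn_tflops(40, 1172), 2))   # Xbox One X: ~6.0 TFLOPs
# The paper gap works out to ~43%, combining the extra CUs and the higher clock.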

i hit a nerve there kek, do you really think your shit field is hard or remarkable? You know damn well that the stock prices go up if you are mad enough to release a console with very high-end specs for $400 and in the long run this movement makes a massive profit, surpassing initial losses

The X's GPU is a lot better than the Pro's, it's not just 4 more CU's

Based subhuman ESL console warrior falling for POWER OF THE CELL in 2019

>the stadia meme
silly user

So you are retarded, glad that's settled.

Just this little boy

Attached: HBMMS.png (1365x599, 55K)

Look I'd like 4K60 but as we both know very few games can achieve it (mostly non open world games). But it is a 4k-capable console.
>inb4 you post a screenshot of rdr2 dipping to 21fps for 3 seconds in the big city

Attached: Screenshot_20181229-053656.png (1440x2560, 383K)

>the vega 64 has 64 compute units, the vega 7 only has 60.
The Radeon VII also has 1 TB/s memory speeds compared to 484 GB/s for the Vega 64, not to mention a 4096 bit bus compared to a 2048 bit bus, 200 MHz higher core clock speeds, higher IPC thanks to the process shrink, more compute power, etc. etc.
And you're a fucking moron, the XboneX doesn't outperform the PS4 Pro because it has a measly 4 extra CUs, it outperforms the PS4 Pro because it has a FUCKING 20% clock speed advantage and a 50-60% bandwidth advantage.
Fucking hell, consoleniggers are so stupid it hurts.
You are dumber than the dumbest AMDdrones, and that's saying A LOT.
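
The bandwidth figures quoted there follow straight from bus width and effective memory data rate; a quick check:

# Memory bandwidth (GB/s) = bus width in bits x effective data rate in Gbps / 8.
def bandwidth_gb_per_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_per_s(2048, 1.89))   # Vega 64 HBM2: ~484 GB/s
print(bandwidth_gb_per_s(4096, 2.0))    # Radeon VII HBM2: 1024 GB/s, i.e. ~1 TB/s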

>costume servers
pc-owards everyone. your shit platform is dying, kek

>Admitting that it's a compromised media box

kek

I don't use my desktop anymore and it's down in the basement collecting dust.

This
Stadia will cater to absolute casuals and poor people that somehow have a good internet connection

>You know damn well that the stock prices go up if you are mad enough to release a console with very high-end specs for $400 and in the long run this movement makes a massive profit, surpassing initial losses

First of all take the black cock out of your mouth before you type shit that makes no sense. Second of all, thats not how this principle works. Take a macroeconomics and business administration course to understand it, since you are just too retarded to understand this.

Maybe you should graduate high school first before talking about economics, zoomer.

Nah, they're just falling for the AMD hype machine.
Consolefags are just dumber, lazier, and all-around shittier versions of AMDdrones, so it makes sense.

XBOX TWO is going to win by default because Sony just decided to Seppuku themselves with censorship bullshit

it's LITERALLY the same GPU at its core. both are based on polaris arch and the ps4 pro has the same specs as the rx 580 whereas the xbox one x is rx 580 + 4 extra cu and a ~260 mhz clock boost.

are you dense? i explained all of this including the cu and clockspeed. neck yourself moron.

Attached: 04b36bcb489a23ce9b001ce86d0fe621.png (452x799, 382K)

Phoneposting is much comfier, plus you can quickly post pictures
>mfw just found out FF7 got released on Xbone. Tempted to get it...

Attached: 1555996196956517080323.jpg (3264x1836, 1.83M)

>2080ti in a console

You're delusional and I hate you

HAHAHAHA this low iq faggots still doesnt get it
>i-i swear economics is science

>Yea Forums loves consoles
>Yea Forums hates AMD
>Their favorite consoles are AMD powered

Attached: 1500870400961.jpg (600x906, 49K)

>privilege of being part of the Xbox Fanbase

fuck off

Attached: 1552713390406.jpg (584x905, 117K)

>i explained all of this including the cu and clockspeed.
No you didn't you fucking mongoloid, because the clockspeed and bandwidth are the biggest contributing factors to the performance difference.
Hence why a Vega 64 is only ~5-10% faster than a Vega 56 despite having 8 (EIGHT) additional CUs.
You're a fucking moron consolenigger that's out of your element, stick to talking about sales or some similar shit.
You're also very obviously underageb&.

>economics is science

Economics isn't a science, which makes it even worse that you don't understand a principle that a literal 18 year old high schooler understands.

Consolefags think AMD puts some secret super special sauce in their binned SoC parts.
Mainly because they're stupid.

Those are rumors.

>because the clockspeed and bandwidth are the biggest contributing factors to the performance difference.
meanwhile in my original post >the ps4 pro and xbox one x only have a 4 CU difference between them yet that 4 extra cu in the xbox one x paired with a higher clockspeed gives the xbox a whopping 40-50% performance advantage over the ps4 pro.
>that 4 extra cu in the xbox one x paired with a higher clockspeed gives the xbox a whopping 40-50% performance advantage

neck yourself now :)

This is some old school trolling, just straight out pretending to be retarded.

>2 XBOX
Why??? Did they take the Xbox One literally so now they have to release Two?

Since you seem to be illiterate let me break it down for you.
You stated that the Vega 64 had stronger specs than the Radeon VII because it had 4 more CUs, you went on to mention the XboneX being more powerful than the PS4 Pro and mentioned that it has 4 extra CUs, implying that that's where the majority of the performance increase comes from.
I rightfully called you a fucking moron and explained that the performance increase comes from the increased clock speeds and bandwidth, and that the 4 extra CUs are largely irrelevant.
Again, you are a genuine fucking moron, have a smug loli just as an additional "fuck you" since I'm sure it'll piss you off.

Attached: 1356795887294.jpg (477x530, 75K)

Reminder that these are shill threads with empty promises and falsified information building hype. If you remember the same shit happened before the PS4 and Xbone were released as they were overhyped only to be average at best.

Attached: 130221984383.jpg (600x578, 49K)

Yes, I recall that the PS4 was originally supposed to feature an HD 7970/GTX 680 equivalent and an i5 3570k.
What we got was an HD 7870 with some ACE units slapped on and a laptop CPU.

>still proving you can't read basic english

no i said amd themselves said the same specs being shrunk from 14nm to 7nm would provide a 25% performance boost. this 7nm node amd are using isn't some miracle shit moron. reduction in core count whilst boosting clock speeds by just under 100mhz (1668 -> 1750) providing a ~15-20% performance boost is about right. educate yourself.

MS's recent track record is clean. The XboneX was everything it was promised to be. And this time big-time insiders are confirming the PS5 and Xbox 2 rumours.

>no i said amd themselves said the same specs being shrunk from 14nm to 7nm would provide a 25% performance boost.
And I said that they lied, as AMD is apt to do.
>reduction in core count whilst boosting clock speeds by just under 100mhz (1668 -> 1750) providing a ~15-20% performance boost is about right. educate yourself.
Well first of all, it's boosted by over 200 MHz (1536 MHz boost clock to 1750 MHz boost clock), which is a ~14% increase in clockspeed.
Secondly, since it's on a 7nm node and an improved version of Vega, it also features higher IPC than the Vega 64, which makes that clockspeed boost even more prominent.
Lastly, the Radeon VII is only about 10-15% faster than the Vega 64, and this is with all of the improvements PLUS a smaller node, the only thing it lacks is the CU count which it makes up for with its clockspeed increase.
Again, you are a fucking moron, like all console niggers, have another "fuck yourself" smug loli.

Attached: 1356857295123.jpg (1440x900, 586K)
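
The clock arithmetic from that post, checked:

# Vega 64 reference boost vs Radeon VII boost.
vega64_boost_mhz = 1536
radeon_vii_boost_mhz = 1750
print(radeon_vii_boost_mhz - vega64_boost_mhz)                          # 214 MHz
print(round((radeon_vii_boost_mhz / vega64_boost_mhz - 1) * 100, 1))    # ~13.9% higher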

Does it have buns? Because my anaconda don't want none if it doesn't.

Retard? do you not know the form factor of a console? There isn't much room for passive airflow like a case nor large fans that aren't noisy as fuck. And if your gpu's fans aren't kicking in playing modern games at 1440p max settings then there is something wrong with your fan curve.

>haha PC is a shit platform
>now excuse me while i load up another battle royale game, a genre invented on PC

>Well first of all, it's boosted by over 200 MHz (1536 MHz boost clock to 1750 MHz boost clock), which is a 12-13% increase in clockspeed.
lower CU count + higher clocks with small refinements gives better performance
oh wow it's what i've been saying all along! it's also what the original user said initially.

what are you even trying to argue again?

not even going to respond to the rest of your BS because it's not worth my time.

>oh wow it's what i've been saying all along! it's also what the original user said initially.
The original user said that a shrink to 7nm and some mild improvements would bring a 30% performance boost, which won't even happen in your wettest of dreams.
Sorry user, it's not my fault that you're a moron, but I am glad that the smug lolis pissed you off like I expected.
Have one last one for the road.

Attached: 1353810918681.jpg (888x535, 44K)

Lockhart
>1080p60fps
Anaconda
>4K60fps

Would make a lot of sense imo

actually yes it can happen for several reasons and if you were educated enough you'd know them. amd in consoles =/= amd on the desktop because of drivers. it's the main reason why amd does so well in vulkan based titles because there's less of a reliance on amd's single threaded drivers which are much less efficient than nvidia's and it takes amd years to optimize them to extract the full potential of their gpu. see: 1060 vs 480 on release vs now.

also i didn't even realise what you were talking about because i have thumbnails hidden to prevent spoilers for something. why would i care about cartoons? never been on Yea Forums in my life and don't watch cartoons.

Lockhart is 1440p60
Anaconda is 4k60 + raytracing

>amd in consoles =/= amd on the desktop because of drivers.
Stopped reading there.
AMD in consoles is actually largely worse than AMD in desktops since AMD parts produce fucktons of heat and end up having to murder clockspeeds in consoles in order to not melt.
But, as expected, you're another retarded consolenigger who thinks AMD puts some super special secret sauce in your binned SoC parts.

Attached: 1355019337158.png (852x371, 418K)

>AMD has a GPU to put in a Console that can do 4k/60 + ray tracing with next gen visuals

& im donald trump

once again proving how tech illiterate you are. no one is talking about fucking heat. an identical amd gpu in consoles with same clocks and everything as one on pc will always perform better on the console due to the low level optimization. amd on pc has sup par drivers and hence you have idling cores especially in dx11 games due to the single threaded driver. it's why going from dx11 to dx12 (in games with a functioning dx12 mode not dx11.3) produces huge gains on amd hardware upto 40-50% in some cases. as i said before, educate yourself before replying like a retard again.

>an identical amd gpu in consoles with same clocks and everything as one on pc will always perform better on the console due to the low level optimization.
The point is that no such GPU exists because, again, console form factors preclude high clockspeeds that are possible on desktop AMD parts (and even many laptop AMD parts).
>it's why going from dx11 to dx12 (in games with a functioning dx12 mode not dx11.3) produces huge gains on amd hardware upto 40-50%
There is not a single game where that is the case.
Not one.
And I say that as someone who owns an AMD GPU.
Again, you are a retarded consolenigger who thinks AMD is sticking some super secret sauce in your binned SoC parts.

It's not me! I'm just the prophet bro. This is what all the insiders are saying. Also it's not just amd, ms is working with them and xbox anaconda will be used as servers so they need to be strong

>This what all the insiders are saying.
Yea, just like all the insiders said that the PS4 was going to have a GTX 680.

>Xbox One X will be 4K with no compromises

Attached: 3512277-0969290310-ezgif[1].gif (1200x675, 1.2M)

>The point is that no such GPU exists because, again, console form factors preclude high clockspeeds that are possible on desktop AMD parts (and even many laptop AMD parts).
except the xbox one x managed to pack in more CU whilst having only a 94mhz clock deficiency compared to an rx 480 (the gpu it's based on). not everyone is as competent as sony.


>There is not a single game where that is the case.
no you're right i wrongfully used open gl to vulkan numbers. amd's open gl drivers are even worse than their dx11 drivers.
dx11 to dx12 in games like hitman provides a 30% performance boost on a vega 64. still massive gains and once again proving i was right about the state of amd's single threaded dx11 driver.

>thinks AMD is sticking some super secret sauce in your binned SoC parts.
where did i ever claim this? i don't even own a console you thick moron. i own an nvidia/intel pc with a 144hz monitor. jesus christ.

Incompetent*

>except the xbox one x managed to pack in more CU whilst having only a 94mhz clock deficiency compared to an rx 480 (the gpu it's based on)
The XboneX is based on neither the RX 480 nor the RX 580, considering it's literally almost twice the size of each.
Not to mention it's still based on GCN 2.0.
If anything it's a beefed up 16nm R9 390x, and even with its vapor chamber design it has lower clockspeeds than both the RX 480 and RX 580.
>dx11 to dx12 in games like hitman provides a 30% performance boost on a vega 64.
Hitman is literally one of the only examples where there's a large performance increase, in most cases the difference is minimal or DX12 actually performs worse.
>i don't even own a console you thick moron. i own an nvidia/intel pc with a 144hz monitor.
It amazes me that every time a consolefag gets called out they always swear up and down that they actually own a high end Nvidia/Intel PC.
You even brought out the "w-well optimization and low level APIs mean console GPUs perform better" when that's blatantly wrong since a low level API simply reduces CPU overhead and largely has no effect on your GPU performance unless you're running AMD and moving from OpenGL to Vulkan or you have a dogshit laptop-tier CPU like consoles do.

I'm never buying a Sony product again so I really hope Microsoft goes hard. Last E3 is a good indicator that they're starting to get it.

Xbox is irrelevant without exclusives.

>The XboneX is based on neither the RX 480 nor the RX 580, considering it's literally almost twice the size of each.
this has got to be bait. seriously. are you comparing a SoC to a discrete GPU? of course the SoC is going to be bigger fucking hell it includes the fucking cpu. and yes it is based on the rx 4/580 because it's literally a gcn 4.0 based GPU. claiming it's based on fucking grenada XT is one of the dumbest things i've ever seen. congrats.

anandtech.com/show/11992/the-xbox-one-x-review/3

>Hitman is literally one of the only examples where there's a large performance increase, in most cases the difference is minimal or DX12 actually performs worse.
and that's not because of the gpu and anyone who is educated on the subject knows this. most dx12 implementations aren't actually utilizing dx12 but are either a wrapper of some sorts (which provides worse performance) or is using dx11.3 for the async compute. you have to completely rebuild a game engine for dx12 support because dx11 and dx12 are fundamentally different.

>You even brought out the "w-well optimization and low level APIs mean console GPUs perform better" when that's blatantly wrong since a low level API simply reduces CPU overhead and largely has no effect on your GPU performance unless you're running AMD and moving from OpenGL to Vulkan
which is exactly what i fucking said. amd gpu on desktop (with shit single threaded drivers) =/= amd gpu in consoles which actually benefit from low level optimization. you're literally parroting what i said and trying to use it against me, and failed.

this is your last reply. you just proved how fucking retarded you really are especially with the 390x comment.

oh yeah and my specs. i'm not using speccy in case it shows too many identifiable specs in which case you'd be able to harass me in a speccy thread. goodnight.

Attached: 39ab43c8.png (448x218, 48K)

>The XboneX is based on neither the RX 480 nor the RX 580, considering it's literally almost twice the size of each.
>Not to mention it's still based on GCN 2.0.
>If anything it's a beefed up 16nm R9 390x, and even with its vapor chamber design it has lower clockspeeds than both the RX 480 and RX 580.
That's the Xbone you're describing here. It's bigger, because it is on TSMC's 28nm and yes, it most likely is based on a 390x, as it is GCN 2.0, but the XboneX is GCN 4, on TSMC's 16nm process and based on the 480 design.

en.wikipedia.org/wiki/Xbox_One#Xbox_One_X


>Hitman is literally one of the only examples where there's a large performance increase, in most cases the difference is minimal or DX12 actually performs worse.
That's because most games are still not made with DX12, but instead use a DX12 wrapper, that does not perform up to the task and is poorly implemented. Proper DX12/Vulkan games show an increased speed for AMD cards and even that new World War Z game is a proof of that, where in the Vulkan version, the Radeon VII outperforms the 2080TI. Hell, even the Vega64 outperforms the 2080TI

Attached: World-War-Z-1920x1080-Vulkan.png (805x935, 61K)

>you have to completely rebuild a game engine for dx12 support because dx11 and dx12 are fundamentally different.
Nigger, are you retarded? Both PS4 and Xbone also make use of high-level APIs that function as wrappers (GNMX on PS4 and a DX"12" wrapper for Xbone) so this entire fucking point is moot.
It basically boils down to "If the devs don't give a fuck about rebuilding the game to take advantage of the lower-level API, it's going to run like shit," which is the case with 90% of console games, especially nowadays.
>which is exactly what i fucking said. amd gpu on desktop (with shit single threaded drivers) =/= amd gpu in consoles which actually benefit from low level optimization.
Again, the performance increase has fuckall to do with the GPU unless it's an OpenGL game, it's not 2010 anymore, AMD's DX11 drivers are largely fine, hence why in many cases even an RX 570 is able to outperform an XboneX.

>Proper DX12/Vulkan games show an increased speed for AMD cards and even that new World War Z game is a proof of that
WWZ is a bad example since it's an example of a game with poorly-implemented Vulkan support on Nvidia hardware, hence why Nvidia performs much better in DX11 than it does using Vulkan.

Attached: Screenshot_2019-04-23 World War Z тест GPU CPU Action FPS TPS Тест GPU.png (900x1043, 101K)

Why can't consolefags just admit that they're tech illiterate?

Oh god, are you that same nigger that's always in these threads whining about your i5 being shit?

So are you going to completely ignore the Vulkan results? I mean, Nvidia's cards rarely, if ever, perform better in DX12/Vulkan scenarios than DX11. That kinda changed with the RTX cards a bit, but that Vega 64 performs better by ~30fps in Vulkan. I mean, that level of performance increase is pretty difficult to ignore.

no i haven't posted in a speccy thread in nearly a year now and the fact you think i'm someone else just goes to show how right i was to avoid giving away the rest of my specs so you could harass me if i do post in one. imagine being this much of a neckbeard loner.

no, faggot, we use benchmark software or even framerate of a game with certain settings.

I can guarantee you that you will not get the performance of a $1200 nvidia card and a $500 cpu in a $600 console.

>I mean, Nvidia's cards rarely, if ever, perform better in DX12/Vulkan scenarios than DX11.
It's not 2015 anymore user, that hasn't been the case since Pascal.

Attached: Screenshot_2019-04-23 Comparing DX11 DX12 Performance using Vega 64 GTX 1080.png (1280x720, 367K)

When people say that their PC can't do 4k games, they're talking about 60 fps at max settings. Mid-range pcs can do 30 fps at medium, but nobody bothers because why the fuck would you choose 30 fps and lower settings?

>no i haven't posted in a speccy thread in nearly a year now
This isn't a speccy thread nigger, I'm specifically talking about these kinds of retarded consolewar specs circlejerk threads.
The same nigger whining about your shitty i5 being too shitty to run Battlefield trash.

1. that game isn't optimized by nvidia yet and you're using a literal who website for performance metrics

2. in absolutely no reality will a 580 ever match a 1070 when nvidia has released drivers for a game

you're deluding yourself amdrone. seek help.

Yes, it's changing. I admit that. But can you invalidate the fact that the top AMD cards gain ~30fps in Vulkan compared to DX11?

are you really telling me i'm the only person who owns an i5 in the world and thus every claim ever made by literally anyone with the same processor must be me? i don't even play battlefield you retard.

>woah, look at this great hardware that the next consoles will have that PCs won't!
>all of it is shit that's releasing a year before the consoles do
Fucking consolefags, holy shit.

Attached: DcUTJGBW4AA8yfo.jpg (640x640, 55K)

And the top Nvidia cards gain 20 FPS in DX12 compared to DX11.
They gain less than AMD because their DX11 drivers aren't as shit.

I'm not saying the 580 is a better card than a 1070. I'm saying AMD performs better in DX12/Vulkan, than it does in DX11, when that DX12/Vulkan version is properly implemented. Yes, it is poorly optimized on Nvidia hardware, no it is not a literal who website, yes the tests are repeatable, yes you can find multiple links that confirm it, I am not trying to make an AMD vs Nvidia comparison, I am making an AMD DX11 vs proper DX12/Vulkan comparison.

>that meme
Why are underage faggots allowed?

>But this one has RAY TRACING!

Attached: pathetic console ant.png (640x480, 447K)

So you agree, AMD works better in proper DX12/Vulkan than it does on DX11

What?

so they'll finally be able to hold 1080i/30 this time?

nvidia fully supports dx12/vulkan now with turing and nvidia actually gains more from it than amd because their implementation is actually superior this time.

Attached: aaab.png (779x770, 108K)

Sure, but I'm not sure what your point is besides that.

You mean the 1050, 1050 Ti and 1060 Ti.

>FAKE NEWS

PS4 is 16+ confirmed by dev kits, and retail console kits.

That has nothing to do with the point I am making. It has nothing to do with nvidia. I am making a comparison between AMD's performance in DX11 vs DX12/Vulkan. I am very happy, that after 7 years or so, nvidia decided to support Vulkan/DX12. I am very happy for you.

>dev kits
eteknix.com/ps4-developer-kits-spotted/
>the PS4 dev kits that game studios are currently working with pack an AMD A10 APU (a combo CPU/GPU), between 8GB and 16GB of RAM, 256GB of storage and a Blu-ray drive, and standard ethernet, HDMI and Wi-Fi capabilities.

can they just fucking announce the damn things already? fucking tired of these stupid rumours

Attached: 1402519037366.jpg (396x382, 37K)

That yes, AMD does perform better in games when DX12/Vulkan is properly implemented, compared to DX11.

Gtx1060*

No it does not. Ps4 pro can't even run Sekiro at 60fps while my GTX 1080 holds a constant 60fps.

>I am very happy, that after 7 years or so, nvidia decided to support Vulkan/DX12
nvidia supported async compute in hardware up till the 700 series; they only stripped it out with the 900 and 1000 series because barely any games required async compute, and stripping it out allowed massive efficiency gains. even today barely any games support dx12/vulkan. i can't even name 10 vulkan games off the top of my head from the last year yet i can name 10 dx11 games.

PC dorks don't know what they are talking about. These idiots are people who build $1200 midrange PCs and complain about consoles that can achieve the same at 1/3 of the price. They boast about playing games, yet they don't. They ruin threads that have nothing to do with them with their dry sense of humor; the rest of the time they are busy being triggered by petty shit. They ruined gaming with simpler gameplay, easier levels, and casual sjw and anti-sjw bs, and they have the audacity to turn around and blame console gamers.

It's prolly gonna be FP16 Tflops, you gullible faggots. Ever heard of marketing?

>durr consoles are gonna sport a TitanX lol

Attached: 1528651695146.png (625x773, 143K)

See
>Hitman is literally one of the only examples where there's a large performance increase, in most cases the difference is minimal or DX12 actually performs worse.

Yea I'm totally gonna get rid of my PC to downgrade 100+fps to the console standard.
Literally 100 fps higher than console plebs are even hoping for.
Disgusting

Scaled up 1080p at 30fps? Retard. Stop believing sony lies.

Shh consolecuck, don't you have some pronouns to memorize.

>$500 console as strong as a 2080ti
Don't you console peasants ever get tired of this?

Right.
What have you shown to disprove that?
I can't think of a single other DX11 game with a DX12 mode that vastly increases performance.

Op is lying.

That's because games are developed for consoles and then ported to PC, using DX11 instead of Vulkan or DX12. Which will probably change come PS5. Then you will start seeing a lot more DX12 titles, with proper implementation of the API. Just like, before the PS4 and XBone, we had mostly DX10 games and even still DX9 games, for a long time.

>AMD

Attached: 1530463019147.png (611x587, 19K)

>teraflops

biggest meme word this generation.
xboneX proved that having more tflops didn't mean shit, since all their games barely looked any better and barely performed any better than what was seen on ps4.

>i can't even name 10 vulkan games off the top of my head in the last year yet i can name 10 dx11 games.
Wolfenstein 2
Strange Brigade
Sniper Elite 4
World War Z
Rage 2 (soon)
Doom Eternal (soon)
X4 Foundations
Total War Thrones of Britannia
F1 2017

Are you kidding me? No game in which the AMD hardware we were talking about performs better in DX12/Vulkan, other than Hitman? I mean, I just posted results proving it.

>That's because games are developed for consoles and then ported to PC, using DX11 instead of Vulkan or DX12.
Other way around, games are ported from PC to consoles porting from DX11 to the high level wrappers used on consoles.
It wouldn't make any sense to create a console version running on a low-level API and then port it to PC using DX11, especially when you could just use Vulkan or DX12 as a wrapper and achieve better results.

The purpose of core count and clock speed is to increase operations per second.

It's fake marketing, they're FP16 teraflops
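Rough sketch of where those headline numbers come from (spec-sheet shader counts and boost clocks; the cards and the PS4 Pro figure are just the commonly quoted examples, used here purely as an illustration):

#include <cstdio>

// Theoretical throughput: shader cores * 2 ops per clock (one fused multiply-add) * clock.
// The result is in GFLOPS, so divide by 1000 for TFLOPS.
double fp32_tflops(int shaders, double clock_ghz) {
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main() {
    // Public spec-sheet numbers, illustrative only.
    std::printf("Vega 64:  %.1f FP32 TFLOPS\n", fp32_tflops(4096, 1.546)); // ~12.7
    std::printf("GTX 1080: %.1f FP32 TFLOPS\n", fp32_tflops(2560, 1.733)); // ~8.9

    // On GCN parts with rapid packed math (the PS4 Pro is the well-known example),
    // the FP16 figure is simply double the FP32 one on the same silicon.
    std::printf("PS4 Pro:  %.1f FP32 / %.1f FP16 TFLOPS\n", 4.2, 2.0 * 4.2);
    return 0;
}

Same formula, very different real-world results, which is why the number doesn't translate across architectures, and why quoting the FP16 figure doubles the headline number without changing the hardware.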

Note that I specifically said DX12, not Vulkan.

Just buy a PC holy shit.

>60fps stable
60fps could have been achieved on EVERY console.
the devs will keep pushing for graphics so console games will continue to be 30fps

Attached: wf2.jpg (300x150, 9K)

Attached: 1459348772511.png (473x561, 229K)

>RTX 2080Ti GPU performance
Impossible

Have you seen xbonex performance? Or are you blind?

>games is everything!
tell that to the wii u

So you are saying that games are developed with DX11 in mind.

>release early 2020
>Not Holiday 2020
Snoy being Snoy

Have you?

Attached: Next next gen 34.jpg (2049x1232, 575K)

>POWER IS EVERYTHING
if you were thinking like this you'd buy atleast a PC .

>60fps
LOL IS IT 2008??

NO NO NO DELETE THIS

When you push a fat tranny over

But that is not the argument I was making. So if it wasn't the argument you were making, why argue? I set a very specific parameter right up front. It is not like I tried to deceive you. You could have just said "if you take Vulkan into consideration, then yes".

Nothing wrong with 60 fps as long as it's constant at 4k with maxed out graphics and meme tracing.

a vega 64 is only ~1 tflop less than a 2080 ti. don't compare amd compute to nvidia. nvidia cards are way more efficient. a vega 64 is nearly 13 tflop yet only competes with a gtx 1080 which is ~9 tflop.

Yeah, there's 84 things wrong with 60fps at a minimum.
>4k
lmfao

>horizontal resolution
console dumbass

>1 game

meanwhile the one x averages a 50% resolution advantage over the pro, sometimes ranging into the 90% range in games like bf1.

Attached: piss4pro.png (1920x1080, 3.55M)

The original argument was that consoles have better performance because they use a low-level API (which isn't true, since both consoles have their own high-level wrapper that most multiplats use), and that this is because AMD's DX11 drivers suck.
I pointed out that most DX12 (and even a lot of Vulkan games) don't see increased GPU performance compared to DX11, most of the time it just alleviates a CPU bottleneck.

>The original argument
I presented a counter argument, to a specific subset of your argument. You disagreed.

Real life kino comment right here

game

Attached: Next next gen 23.jpg (2048x1152, 502K)

The issue is with the game itself, not the hardware. The game was rushed and performs lukewarm everywhere.

Microsoft's net income is 4x snoy's

Uhh no, that's a telltale example of a CPU bottleneck family.

Attached: DYCV43d.jpg (2960x1440, 346K)

>4k vs 1440p
>near identical performance despite ~50% higher resolution

yikes the pro is absolute trash. if they were both the same resolution you'd have an argument there. also as time has progressed xbox one x versions have progressively become better performing as well as higher res in most games.

>Has to cherry pick when the engine is at max load
>Gun on the Pro has more details

Embarrassing for you and the Xbonex desu

actually 62% higher resolution according to DF. my bad. that's assuming the pro isn't using a dynamic res in corvega factory which would widen the gap even further.

>He has to slow it down by half to make a scruffy beard look slightly less scruffy

This isn't having the impact you think it's having.

>that graph

Attached: pepe.png (334x346, 120K)

Does somebody have the image of that mgsv soldier with the cone helmet on xbox? I need a good laugh

Attached: 1542003950439.gif (1000x700, 28K)

what the fuck are you even looking at?

Attached: metro.jpg (765x968, 481K)

this entire debate is stupid. a few fps difference is not going to make or break the experience of a great game. all this bitching and whining about the technical side of gaming is so pointless. yes graphics matter to an extent, but at the end of the day video games are about the game. having extra pixels, or shadows that you'll never even notice being slightly more defined, or water being a little more reflective doesn't improve a game experience at all. only autistic retards care about this shit. specfags need to be euthanized.

Attached: 57FB9234-73E0-486A-9D87-5070E51A9CDA.jpg (600x600, 66K)

>Next gen consoles tflops
more like next gen console flops

you should use this one instead. shows how terrible the pro version is.

Attached: DF.jpg (770x970, 440K)

>1080 ti
>Rtx 2080 ti
what an absolute retard

Attached: Gtspain+gtelf+entirely+accurate+french+salt+_6ba3a259b0e4368043945291b77ee9ed.jpg (912x905, 96K)

>Zen 2 and motherfucking RTX 2080Ti GPU performance

yeah sure, marketing is a hell of a drug.

vega 64 already has nearly 13 tflops and doesn't beat a 1080ti, so you can't compare amd to nvidia

Oh look its the nigger who thinks 1080p and 4k make no difference

my PC shits on this.

>He has to zoom in that hard

So this...is the power of Xbox...One...X...wow....

unless you're sitting halfway across the street and using binoculars to play your games, this difference in clarity is easy to see with the naked eye. we're talking about a 77% difference in resolution there. that's the same as a game being 1080p on one console and fucking 248p on another.
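For reference, since these percentages get thrown around loosely in this thread, here is the raw pixel arithmetic (plain math, no benchmark claims):

#include <cstdio>

// Pixel count is what the GPU actually pays for; "p" numbers only describe the vertical axis.
long pixels(int w, int h) { return static_cast<long>(w) * h; }

int main() {
    long uhd = pixels(3840, 2160); // "4K"
    long qhd = pixels(2560, 1440);
    long fhd = pixels(1920, 1080);

    std::printf("4K vs 1440p:    %.0f%% more pixels\n", 100.0 * (uhd - qhd) / qhd); // ~125%
    std::printf("4K vs 1080p:    %.0f%% more pixels\n", 100.0 * (uhd - fhd) / fhd); // ~300%
    std::printf("1440p vs 1080p: %.0f%% more pixels\n", 100.0 * (qhd - fhd) / fhd); // ~78%
    return 0;
}

Note that a ~77% pixel-count gap is roughly 1440p-vs-1080p territory; at the same aspect ratio it works out to about 1080p against ~810p, not 248p.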

>Can't show those "demonstrable differences" Uncle Phil promised outside of a few zoomed in screenshots

That'll be $499 please, Sir! We'll through in the Xcuck prescription lenses for free. :)

Worked fine for PS2 and Switch

imagine not knowing how to spell throw

and stay mad

What good is making a powerful system if there's nothing to use it for? I have yet to buy a console for this generation and I'm still on PS3 and Wii but if I had to choose a system today I'd easily go Sony or Nintendo again.

Don't talk about the Switch, you'll make the Xbox shills angry because they'll remember they didn't understand why the Xbone got blown out in the first place.

>comparing APU tflops to GPU tflops and thinking they are a good measurement of performance

You're all actually braindead

nintendo needs to get on that switch pro asap so i can buy it alongside the nextbox. it's going to be the sweetest combo again like the wii60 hopefully.

Consolefags are actually delusional. It will be another 5+ years of fake 4k and sub 30 frames gameplay with slightly shinier graphics. Too bad I'm an idiort and will have to buy it

>Goes for the easy deflection
Classic. Stay desperate.

i'm not the one who somehow can't visually see the difference between a nearly 80% resolution change. cope.

>Comparing Teraflops between different architectures.
Just the kind of retardation you would expect of an Xbox shill

Xbox one X is almost exactly a RX 580. Not a "gtx 1060++".
PS5 and Xbox Anaconda are rumored to be based on the same Navi 10 design so a 50% performance difference is extremely unlikely.

Attached: photo-52578.jpg (300x300, 14K)

>switch pro
Only if Nvidia is capable of making Orin sub-10W

>xfag trying to leech off nintendo again
"wii60" was never a thing outside of xbox marketing hopes, so we know where you're from

>Thinking APU TFLOPS are different than GPU TFLOPS

tflops? more like FLOPS lol.

>Lockhart weaker than Xbox One X
Are you retarded OP?

>wanting to buy a console is leeching off them

imagine being this asshurt. the fact you're taking issue with xbox and nintendo in the same sentence tells us all where you're from. early start to the day over at sony HQ?

Microsoft probably is. Look at the SAD.

Shouldn't every game running at native 4K30fps be able to run at 1080p60fps as easily as flipping a switch, as long as a dev bothers including that option? Isn't it possible for such an option to be on the firmware side, as long as the game's framerate isn't locked?

Attached: A164F284-7FE5-40BA-8702-2814608A628F.jpg (636x920, 99K)

>Probably

>Xbox 1 x - 6.5 tflops
>Gtx 1060 - 4.4 tflops
>still way better
Console turds are absolute retards

Yeah, good point.
>Definitely

>comparing amd to nvidia

and you just outed yourself as being 10x more retarded

who are you quoting?

It's too bad there will never be a console generation like the 5-6th again, where the hardware was more proprietary and specifically tailored towards rendering games. The original Xbox was an early warning sign of what consoles would become, for better or worse. At least Nintendo have consistently tried to be innovative in one way or another.

Console costs are backloaded, while PC costs are frontloaded. They get your money on consoles from expensive game sales. PC, on the other hand, gets 80-95% off Steam sales every few months, so sees a large cost up front with cheap games afterward.
And, of course, all the perks that come with a PC, like browsing the internet, having a mouse/keyboard setup, being able to upgrade piecemeal, mods, reading books, emulating consoles, downloading things, and so on.
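As a rough sketch of what front-loaded vs back-loaded means in practice (all prices assumed for illustration only, not actual market data):

#include <cstdio>

int main() {
    // Assumed prices, purely illustrative.
    double console_hw = 400.0, console_game = 55.0; // close to full price; resale ignored
    double pc_hw      = 900.0, pc_game      = 25.0; // typical seasonal-sale price, assumed

    // Find the number of games where total spend crosses over.
    for (int games = 0; games <= 40; ++games) {
        double console_total = console_hw + games * console_game;
        double pc_total      = pc_hw + games * pc_game;
        if (pc_total <= console_total) {
            std::printf("Break-even at %d games: console $%.0f vs PC $%.0f\n",
                        games, console_total, pc_total);
            break;
        }
    }
    return 0;
}

With those assumptions the PC only pulls ahead after roughly 17 games; change the prices and the crossover moves, which is the whole front-loaded vs back-loaded point.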

>fable 2
Red pill me on this, m8

all the console companies have been using generic parts from amd/nvidia with their own small tweaks and enhancements for ages because it's simply cheaper. there would be no reason for sony or microsoft to try and develop their own hardware. even the switch is just an nvidia shield with a built in screen.

>expensive game sales

I just checked and Sekiro is the same price on Steam and Xbox ($60 USD). Where does this stupid meme come from? Do you PCturds even constantly check console game prices? Of course you don't. And with console games you actually get a physical copy which you can trade or resell after you beat the game

>

It's pretty funny that console gamers compare console power to Nvidia cards. Even they know AMD cards are crap compared to Nvidia, but they don't want to admit consoles use AMD hardware and act like it's some special version of "Navi".

>GUYS PS5 IS GONNA BE LIKE 2080 TI IT HAS 16 TFLOPS
>WHY ARE YOU COMPARING AMD TO GEFORCE?

seriously, I can't tell if you're being ironic at this point or you're actually retarded

The copy-pasters at Playground are doing it. If they can do the bare minimum they usually do and straight rip-off BOTW it might be OK.

Actually it's because GCN and most of AMD's microarchitecture is much better set up for asynchronous compute, which DX12 utilises heavily and which Nvidia only really focused on properly with their 900 and 1000 series cards, and even then only to a moderate degree. Ironically their Fermi (400 and 500 series) cards actually had hardware built in that would have made them capable of running DX12-like features, as that architecture was heavily based around what would become asynchronous compute, but it was very power heavy, hence the old memes about the GTX 480 and friends being so power hungry. AMD cards usually excel at DX12 because the GCN and Vega GPU architectures were designed around the Mantle API (which DX12 largely copied a lot of elements from).
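For the curious, a minimal sketch in C++ (assuming the Vulkan SDK/headers are installed) of the submission-side half of async compute: finding a queue family that can do compute but not graphics, so work sent there can overlap the graphics queue when the hardware allows it. No device creation, command buffers or synchronization shown.

#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Minimal instance, no layers or extensions.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        uint32_t famCount = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, nullptr);
        std::vector<VkQueueFamilyProperties> fams(famCount);
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &famCount, fams.data());

        for (uint32_t i = 0; i < famCount; ++i) {
            bool compute  = fams[i].queueFlags & VK_QUEUE_COMPUTE_BIT;
            bool graphics = fams[i].queueFlags & VK_QUEUE_GRAPHICS_BIT;
            // A compute-capable family without the graphics bit is the "async compute" queue:
            // work submitted here can run alongside the graphics queue if the hardware allows it.
            if (compute && !graphics)
                std::printf("queue family %u: dedicated compute (%u queues)\n",
                            i, fams[i].queueCount);
        }
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}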

Yeah I know, Nintendo use as much off-the-shelf parts as Sony and MS, but they utilize the hardware to do some more interesting things than either of those combined. Their experiments fail as often as they succeed, but they try to offer something unique every new generation.

There isn't a sale happening at the moment, and that's a new game, you retard. I specifically said
>sales every few months

Attached: SmuggyBowsette.png (577x651, 162K)

No, because the AMD CPU in current-gen consoles is absolute trash and bottlenecks the whole system.
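Rough toy model of why (numbers invented for illustration): per-frame CPU work is resolution-independent, GPU work scales roughly with pixel count, and the frame takes whichever side is longer.

#include <algorithm>
#include <cstdio>

int main() {
    // Invented numbers: a frame that hits exactly 30fps at 4K.
    double cpu_ms      = 25.0;             // simulation + submission, resolution-independent
    double gpu_ms_4k   = 33.3;             // pixel work at 3840x2160
    double gpu_ms_1080 = gpu_ms_4k / 4.0;  // quarter the pixels at 1920x1080 (rough scaling)

    double frame_4k   = std::max(cpu_ms, gpu_ms_4k);
    double frame_1080 = std::max(cpu_ms, gpu_ms_1080);

    std::printf("4K:    %.1f ms -> %.0f fps\n", frame_4k,   1000.0 / frame_4k);   // ~30 fps
    std::printf("1080p: %.1f ms -> %.0f fps\n", frame_1080, 1000.0 / frame_1080); // ~40 fps, not 60
    return 0;
}

So dropping from 4K to 1080p only gets you to 60fps if the CPU side was already under ~16.7ms per frame, which Jaguar often isn't.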

>I-its n-not amd fault

>And, of course, all the perks that come with a PC, like browsing the internet

Just... stop posting ITT for your own good user

Attached: 1556006977496-1483143936.jpg (3264x1836, 1.49M)

and yet amd is abandoning the push for compute with navi, which is supposed to be all about pixel-pushing power. GCN will end with navi according to most of the rumours.

based

>browsing with a controller

Attached: 1449416603079.jpg (250x250, 9K)

Amd consoles =/= amd pc
Xbonex proved itself to be better than a 1060 by a mile, even with a shit CPU. Give the 1060 a Jaguar CPU and see if it can even run games.

>sales
There are sales on consoles as well
Look, AC7 is cheaper on Xbone than on Steam which charges full price (80 CAD)

Attached: 15560073009951181124569.jpg (3264x1836, 1.34M)

The system targets 1080p60, and the whole architecture is much superior to the Xbonex's. Games will look and perform better than on the Xbonex, just at a lower resolution.

>Xbonex proved itself to being better than 1060 by a mile
except that it gets its ass kicked in every single game

youtube.com/watch?v=dGa_7Ds13Ls

>with GOLD

>with gold
Nice try

fucking get on, you bitches

Attached: 1528908786345.jpg (1920x1080, 838K)

How can you be this retarded

well, I can't redpill you, it isn't on pc, only fable 1 and 3

I can buy physical games at half price only a few days after their release, and consoles get much better sales than PC nowadays.

ps4 pro is much weaker than a xbox one X

>physcial games
That have to be installed to the HDD and updated with day-one patches anyway.

>implying
2080ti already gets btfo by 2yo vega lmao.

Attached: file.png (805x935, 479K)

If multiplats are gonna have to cater to the lowest power system anyway, and both consoles will be 4k standard, what's the point of being the more powerful console?

>World War Z
literal who game

If literal whos can optimize a game for AMD hardware to the point that a $400 GPU BTFOs a $1200 GPU, then what do you think happens when Sony makes games for its PS5?

Better performance, what else would it be? Same as how console ports to PC work.

are you pretending to be retarded?

>Eating shit is the same as eating a fine steak
Spoken like a true peasant

>Vulkan

Attached: karen-meangirls.jpg (968x681, 32K)

The jew fears the Vulkan, it reminds him too much of an oven.

Attached: 1510404972940.jpg (1920x1080, 409K)

yikes that pic made by 12yo clearly proved me wrong.

there are a shitload of Vulkan games now and none of them run any better on AMD GPUs; this one POS game is an exception

Yep sure did kike

Now fuck off back to resetera

That doesn't even make any sense. Are you just an AI learning how to post?

Are you learning how to be a tranny?

what's the point of the Lockhart? are they too greedy to start selling the xbone x for cheap?

>implying that they won't just make it prettier instead of ever increasing the framerate
Eyecandy sells to normies, and normies are the majority by definition.

>RTX 2080ti
>a GPU that costs over 1000 dollars alone
>in a 499 console
nigger are you fucking stupid

inb4 all games run at 30-45fps anyways
why can't console developers optimise their damn games, they have pretty decent hardware to work with even right now

>Xfags have to pay for the privilege to get sales

Attached: gjYwB0B.png (680x629, 1.5M)

My fuckin 970 can run Sekiro at 1080 60fps

my fucking 960 dips to 50

graphics > fps for most people.

I don't doubt this will be a big leap in performance, which isn't hard considering current hardware, but expecting 2080 Ti horsepower in a $500 box is just delusional.
If anything the biggest leap will be the CPU; going from Jaguar to Ryzen 2 is a revolution.
The GPU side won't improve that much, obviously; you need to take the form factor and heat dissipation into account, so you won't get much more than a GTX 1070/RX 590 equivalent.
Overall, they will probably kick the shit out of poorfags' PC builds, but enthusiasts have nothing to fear.