Is he right?

Attached: carmack.png (602x317, 54K)

No.

No.

>same specs have different performance

Attached: 1354268624153.jpg (613x533, 107K)

Key words are "for the same given paper spec"

he means optimizing for the same preset hardware is easier, but true performance stays the same
PC = bruteforce
console = iterative optimisation

No. John Carmack is an out-of-touch hack. He claims story in a video game is as unimportant as story in porn, but meanwhile the most popular games tend to have a focus on story.

He's right about console vs PC, and anyone can test this for themselves. Build a PC with specs equivalent to a PS4 and try to run a multiplat game. Lack of optimization and the usual shoddy PC port mean it will perform MUCH worse than the PS4.

This is obvious but forbidden knowledge among pckeks.

>twitter screencap thread

Attached: 74564589.jpg (593x595, 45K)

I wonder who would know better, a genius or the bunch of retards on Yea Forums
Also you have no idea how powerful console APIs are compared to PC.

that's a 1993 opinion

Yes. You aren't smarter than Carmack.

>le wojak face
you have to go back

Is Carmack retarded?

Yes, but there's a caveat here.
A console with the same specs as a PC with the same specs as a mobile phone is not something that would ever realistically be taken to market. A console with the specs of a PC of the same generation would result in a "599 US dollars" scenario, maybe even worse. You can easily buy the hardware to make a console as powerful as a PC; no one wants to do it, because that's how you get the PS3.
A PC with the specs of a mobile phone, you could totally do that, but what would this PC look like? Would it even be a gaming PC at that point? I bought a Raspi 3 that has every bit the same performance as my mobile phone, if not better. To everyone else, it's a PC.
A mobile phone with the specs of both of them probably wouldn't be much of a mobile phone at all. Or the other two wouldn't be much of gaming machines if that were the case.

It's a "Pound of Feathers vs a Pound of Lead" word problem. Yes, they weigh the same, but if you were to actually see a visual representation of what that would look like, your immediate response would be "Oh, of course."

has this guy ever done anything worthwhile except "living and breathing in a minecraft world"?

Wojak is literally no better.

still true in 2019 tho

>it's another "consoles are easier to optimize so they're better" episode
Yawn

Not really, no

Yes, though he obviously doesn't go into detail.
When you have one specific set of hardware to work on, you can use all sorts of tricks to squeeze out that last bit of performance.
On other hardware with equivalent paper specs, those optimization tricks won't carry over to every configuration, yielding worse performance. Whether it's really twice the performance depends on the software, of course.
Mobile stuff is often limited by heat, which is a non-issue for PCs, and that obviously hampers performance too.
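A toy sketch of what "one specific set of hardware" buys you. The tile size stands in for any hardware parameter (cache line, wavefront size); the value 64 is made up for illustration. The point is that a constant known up front lets a compiler specialize and unroll, while the PC path has to stay generic:

```python
KNOWN_TILE = 64  # console case: exact hardware parameter fixed at build time (made-up value)

def sum_tiled_fixed(data):
    """Tile size is a compile-time constant; a real compiler could
    fully unroll/vectorize this loop for the exact hardware."""
    total = 0
    for i in range(0, len(data), KNOWN_TILE):
        total += sum(data[i:i + KNOWN_TILE])
    return total

def sum_tiled_generic(data, tile):
    """PC case: tile size only known at runtime, so the loop stays
    generic and the compiler can assume nothing about it."""
    total = 0
    for i in range(0, len(data), tile):
        total += sum(data[i:i + tile])
    return total
```

Both produce identical results; the fixed version simply admits optimizations the generic one can't, which is the "same paper spec, different perf" gap being argued about.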

Somewhat, mainly because crossplatform games are optimized for consoles and then ported to PC, and sloppy ports can run like absolute shit on PC.
If the opposite were true and games were optimized for PC and then ported to console, you'd see great performance on PC and abysmal performance on consoles because of their limited specs.

Who cares how good your console is if developers don't optimize their games themselves.

They objectively are for game developers. It's just that by the time a console goes to market, the PC market has already outpaced it.

I don't know where people are getting "brute forcing" from, all of the hardware in a PC is compatible with one another. The difference is that console doesn't have the overhead that a PC has to have in order for it to remain a PC. A "Shitty" PC can't multitask two or three different games while browsing the internet and chatting on discord. A Good Console can play a single game while chatting in a platform-specific messenger.

Yes, I actually have fun and play games on my console compared to my PC.

dunno about that, but Carmack changed his opinion as witnessed by Quake 2, Doom 3 and Rage having more story than a text scroll

He's probably saying console games are more optimized because the developers can target a very specific hardware configuration.

No, he's leaving out so much that a correct statement ends up entirely wrong. The same specs have the same power everywhere; the only difference is that with a static hardware configuration, developers can tune the game to it much, much better. However, probably only 10-15% can actually be gained losslessly; all other optimizations mean removing something. Saying it performs twice as well is false, because to perform twice as well it has to optimize a lot of stuff away.

Usually the first things to take a hit are areas you don't usually see, textures the developers deem non-primary (won't be close to the camera), and view distance (especially when the game is expected to run at barely 1080p on a large TV meters away from the user). You can see all of this in console games: they do run the same game as PC on much shittier hardware at settings better than what you can get just fiddling with .ini files, but they don't do so without big losses.

Attached: 1563954824516.gif (335x500, 2.61M)

He'd be right if everything weren't multiplat these days. It's clearly observable to anyone with eyes and a functioning brain that PC always looks and runs better, albeit you occasionally need to do some finagling with settings and drivers and whatnot.

"perf" as in Carmacks/second, aka my ass unit, aka top fps in muh specific game, aka BS
The only take away from that claim is that consoles overhead was inferior back in 2014... now that stopped being the truth a while ago, thanks to anti-piracy measures, social features... current year consoles are PCs without an antivirus and a desktop running
hint: have a PC for gaming and another for working

Attached: 1557218594519.jpg (400x391, 41K)

>he's leaving out so much
well, to be fair, he is limited in how much he can say in a single Twitter post

no, he's old and completely out of touch.

More so in the past, games are more generalized now for multiplat releases, though you can definitely squeeze out much more if you're targeting a completely uniform hardware spec.

>meanwhile the most popular games tend to have a focus on story.
What's the amazing Pulitzer Prize or Newbery Medal winning story of Fortnite? Breath of the Wild? Pokemon? Street Fighter? Minecraft? Carmack is mostly right even now; there are games with great stories, but story is still not nearly as important as the gameplay, graphics and sound.

Story is the least important element to a video game.
Great games can have bad stories.

Yeah everybody plays tetris, minecraft and now fortnite for the lore

A PC with the same terrible specs as a PS4 wouldn't be able to run anything, the fact that they're managing to run somewhat decent games on such a shitty laptop is impressive.

john carmack isn't a game designer. he sucks total ass at the actual things people play games for. his opinions on level design or game balance or story or whatever else are about as relevant as a doctor's opinion on paleontology.

so why can't consoles do 1080p 60fps for most games? the ps4 pro has like an rx570, right

name ONE game that has a good story.
dumping lore is not story, well animated cutscenes are not story

Carmack is a tech guy, he doesn't care about your gay ERP fanfic. He just wants performance.

>Tetris
Lol
Everyone plays the other 49 of the top 50 games for the story though

Eternal Darkness
>n-no it's bad
You asked for an opinion. There's an opinion.

@476174981
>for the story though
such a bad bait.

>dumping lore is not story, well animated cutscenes are not story
Yes it fucking is, you don't get to decide what counts as story

Too bad consoles haven't had the same paper spec as PCs since the Super Famicom, contards

Carmack the little hack

Good thing he fucked off to Facebook now where he continues to be irrelevant

that's correct though?

His engines were a collage of hacks, there's nothing in it that even qualifies him as a 'genius'.

old man that doesn't realize consoles and pcs are now on the same architecture

>Build a PC with equivalent specs of a PS4 and try to run a multiplat game.
I have 2 PCs
One which is slightly more powerful than a PS4 Pro because it has separate RAM and VRAM, while the PS4 Pro has 8GB shared between regular RAM and VRAM:
FX 8350
RX 480 8GB RAM
this quite literally kills the PS4 Pro: 1080p 60 FPS at high or highest settings (some games even on ultra) on all multiplats.
The other PC, which is, in a way, more similar to a PS4 Pro since it uses an APU and shared RAM/VRAM:
Ryzen 3 2200G
8GB RAM, 2GB shared on the Vega 8 iGPU
No dedicated GPU
This too performs better than a PS4 Pro: almost all games run 1080p 60 FPS at mid-high settings (still higher than the PS4 Pro's graphics settings); some games struggle to maintain a stable 60, but switching to 900p fixes that.
The only exceptions so far are Shadow Of the Tomb Raider which runs no higher than 45-52 fps on mid settings, and Monster Hunter World which runs like shit on mid-high settings and can only run decently on mid settings if the fps are locked at 30.

So yeah, a PC with the "same" specs as a PS4 Pro does not run worse than a PS4 Pro; at the very worst it has the exact same performance.

No. The only thing consoles have over PCs is that games can be optimised more efficiently.

That's exactly what he's saying.

i'm sorry i have to be the one to tell you this user... witcher and elder scrolls do not have good stories. they may have interesting settings and lore, but the stories are bargain-bin tier.

And even then, he's wrong, read

No actually you are the one who is wrong.

What the fuck is he even trying to say? The same exact chips are used in all of those devices, they're just each designed for different thermal loads.

He is right, though it might be closer to a 60% performance boost than to twice the performance.

No, just that there's less compatibility layer tomfoolery and less OS overhead.

>PC with slightly better specs than a PS4 Pro
>PC with "same" specs as a PS4 Pro
>both performs better on 99% of all multiplats
legit question, how am i wrong?

with the way console OSes are now, with that menu running in the background all the time, I doubt the lower-overhead claim is even true anymore

You're wrong for thinking you know better than an actual genius like Carmack.

Because the premise is that they are optimized for the hardware. Multiplats are never quite that optimized, especially not compared to things like Naughty Dog titles or end-of-generation first-party titles in general.

Consoles used to be some real shit. Custom made hardware for specific purposes. No or very minimal OS. The sky is the limit when programming.
Nu-consoles are just locked-down PCs with all the OS and background-process bloat you'd expect. High level of abstraction. It used to be assembler games; now you get Java shit.

literally who is this retard and how can anyone have an opinion this wrong

So he's purely talking about software? If that's the case, then he got mobile and pc mixed up because mobile operating systems are far more optimized than desktop.

Yes

Attached: 1564929680050.jpg (2048x8704, 2.19M)

what makes him a genius? that algorithm he stole and claimed he came up with?

>muh SJW story reader

The comparison is retarded because you can't get the same specs in the first place. Basically all modern mobile devices are based on the ARM architecture while desktops and consoles are almost exclusively x86-64.
I sure hope people don't still believe "this processor has 4 cores and runs at 3.0ghz so it must perform the same as every other processor with 4 cores running at 3.0ghz" in 2019.
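To put numbers on that post's point: "paper spec" (cores times clock) ignores instructions-per-cycle, which differs wildly between architectures. A back-of-the-envelope model; all IPC values here are invented for illustration, not real chip figures:

```python
def peak_ops_per_sec(cores, ghz, ipc):
    # crude throughput model: cores * clock * instructions retired per cycle
    return cores * ghz * 1e9 * ipc

# Same "8 cores @ 3.0GHz" on paper, very different silicon (hypothetical IPC values):
mobile_class = peak_ops_per_sec(cores=8, ghz=3.0, ipc=1.0)
desktop_class = peak_ops_per_sec(cores=8, ghz=3.0, ipc=4.0)
# identical paper spec, 4x difference in modeled throughput
```

Real IPC depends on the workload, cache behavior and instruction mix, so even this model flatters the spec-sheet comparison.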

>put disc in
>play game
Was this image made in 2002?

He was back in 2007 when consoles weren't basic off-the-shelf x86 PC parts.
Not so much anymore.

Difference is that a PC with twice the console specs is considered entry level.

His premise being wrong (even if it should be true by logic) doesn't make the facts I showed wrong.
It's not even a matter of opinion: both my PCs are on the same level as a PS4 Pro, and they both perform a lot better than it.
If an actual comparison were not possible, a debate would make sense; but since it is possible, and both my PCs (and those of many other anons without higher-end builds) match a PS4 Pro's specs and perform better than it as a matter of fact, I don't see how there's even room for opinion on it.

He was before consoles just became small PCs with general purpose laptop hardware.

Name ONE book that has a good story
And no, dumping tons of description and dialogue is not story

Attached: 1559513220523.jpg (449x546, 37K)

>2014
he was, but thats no longer the case

It was still wrong back then; some of the best-selling games were adventure games.

He is right, and Yea Forums is full of retards who talk out of their ass. Forza Horizon 3 runs like wet dogshit on PC with considerably better hardware. Optimization is important.

They still aren't.
Xbox One and PS4 have had double-rate FP16 since launch; only just now, with Vega/Navi and Turing, is PC getting that.

Depends on the game.

>Xbox One and PS4 have had double rate FP16 since launch
Oh god, here we go again.
Time for the tech illiterates to start flooding the thread.

The PS4 has a literal tablet CPU running at 2GHz, not an FX 8350.

>ad hominem

pcfags just cant cope with the fact our hardware is more powerful

Not really, FP16 has been around on PC since GCN 3.0, which was around 2015, and the PS4/XB1 do not have FP16 support.
Educate yourself:
techpowerup.com/gpu-specs/playstation-4-gpu.c2085
techpowerup.com/gpu-specs/xbox-one-gpu.c2086
techpowerup.com/gpu-specs/radeon-r9-380.c2734
This is what I mean when I call you tech illiterate.
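Whichever side of the "which GPUs support double-rate FP16" fight is right, the trade-off itself is easy to show: half precision doubles throughput on hardware that packs two FP16 ops per FP32 lane, at the cost of roughly 3 decimal digits of precision and a max finite value of 65504. A round-trip through IEEE 754 half precision using only the Python stdlib (`struct` format `e`):

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

to_fp16(1.0)      # exactly representable: survives the round trip
to_fp16(0.1)      # comes back slightly off: only a 10-bit mantissa
to_fp16(65504.0)  # the largest finite half-precision value
```

That loss is why engines only use FP16 for things like color and lighting math, where the error is invisible, and keep positions in FP32.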

PS4 Pro is faster than that FX abomination.

What was the OG Xbox? Wasn't it x86?

Yeah, you. Are you saying a laptop with the same spec as a PC will run the same as it?

The CPU?
Not at all, the PS4 Pro is still using the same shitty Jaguar CPU as the PS4, just overclocked.
And the GPU in the PS4 Pro is literally about on par with an RX 470.

>Techpowerup is a valid source of information
lol.
Xbox One and PS4 both have FP16.

"They do because I say so"

Oh cool, where's your source then?
I'll wait.
I look forward to whatever blog post you cite.

Google is my source, I'm not here to spoonfeed trolls.

Oh, okay, so you're just a tech illiterate talking out of his ass.
Filtered :^)

They are built under the same microarch.
Jaguar is a ULV clone of Carrizo which is the APU line of Bulldozer.
They are all Excavator cores.

Attached: 12.jpg (640x426, 133K)

Oh for fuck's sake, user.

Attached: 1547880410858.png (847x476, 284K)

Good thing Consoles are mobile parts then.

aren't today's console CPUs made for mobile?

>2014

this hasn't been true for years now, outside of rare cases. with modern low-level APIs like Vulkan and DX12, any PC with hardware equivalent to a console will perform identically or even exceed the console. the only time you might see better console performance is if the game is very demanding on hardware / a shit port, or if the devs specifically optimise for the console by using settings that aren't even available on PC, like Dying Light, where the PS4 uses settings lower than the lowest PC setting. most devs don't do that though, and usually the console settings and PC settings can be matched up equally.

Attached: 1423151039237.gif (580x578, 474K)

no

>optimizing
nope

Attached: EXCLUSIVEforPS4knackfps.webm (1280x720, 2.81M)

lmao

Attached: Mark Cerny PS4 game.webm (1920x1080, 1.26M)

No, he is not.

Attached: Next next gen 38.jpg (3838x2157, 793K)

Like this?

Attached: KillzoneSF_draw_distance.gif (285x340, 1.88M)

>console: upscaled 720p running at

he's right in the context of gamedev where you have a limited amount of time to optimise for infinite PC configurations

Carmack is right. Imagine if you'd built a PC back in 2013 with the same specs as the PS4. There's no way you could play titles that are coming out today at the same framerate/level of fidelity as a PS4. First party titles like GoW even more so.

>38
obsessed

oh no no no no

Attached: 1566978984141.webm (720x480, 1.51M)

This doesn't happen anymore either, ever since devs stopped coding their own game engines and just use UE4/GameMaker/Unity/Godot.

>First party titles like GoW even more so.
nope see

OK, but you can have a great game with great gameplay and a shit story (Super Mario Galaxy, TF2, Factorio), while there's no such thing as a game with a great story and shit gameplay. Maybe VNs like Steins;Gate, if you count them as games.

those are launch titles

Yes, he's saying that on paper consoles are better than PCs. That's because they lie about console specs, like Sony saying the PS5 surpasses PCs up to 2023 or something like that.

>non-comparison webm
what are you trying to prove here?

lmao. Those games have been developed for more than 4 years on PS4 devkit.

The creator of the PS4 made a game for the PS4 only: it is Knack.

>First party titles like GoW even more so.
bruh god of war isn't technically impressive at all. it's your run of the mill hack and slash except the whole game is a series of extremely tight corridors and closed spaces in general. rarely do you actually have a vast open area and in literally none of those areas can you actually run around freely in. there's always some limitation to keep it tight like surrounded by water and shit.

You don't understand, bro. The PS3- I mean, the PS4- I mean, the PS5 is going to play games at 120fps! And at 4k too!!!

There is no such thing as optimization on consoles.

Your thoughts would be less retarded if you stopped assuming all three devices had to come out in the same year.

and look at those compared to games made late in the PS4's lifecycle. That's my point, same hardware, but devs have had time to learn it and optimize for it.

He's saying that since there isn't an OS running all sorts of shit on the console it should perform better than an equivalent PC build.
It's why people wanted Linux ports of games because of SHOCKING DISCOVERY at how much better a game can run in Linux compared to Windows OS when you don't have two dozen background processes all quietly hampering performance.

Yeah, it's about the APIs you're using and the amount of optimisation you can achieve by using them.

then explain why Bloodborne runs like shit on PS4. see

Yes
Theoretically, the only thing a console has to do is play vidya.
A PC has to do all sorts of stuff, even when it looks like it's doing nothing.
What makes the PC end up with better performance is that with consoles you are stuck with the same hardware for 5-10 years, while with a PC you can upgrade whatever you feel like, and the old stuff doesn't magically stop working.*
Consoles are cheaper and easier to set up.
Now knock it off, you consolewarriors.
*You still need to do some research to play Win98 or even WinXP stuff, because Microsoft stopped supporting older DirectX versions and dumb DRMs like SafeDisc; for older titles, Linux is strangely easier to set up.

Attached: 8186907_f520.jpg (520x579, 49K)

>There's no way you could play titles that are coming out today at the same framerate/level of fidelity as a PS4.
youtu.be/rlTqMFRqzCs
The rig in this video is about on par with the PS4 in terms of GPU specs (not so much CPU, since it's borderline impossible to buy a desktop CPU as bad as the one in the PS4), and it runs the game just as well, if not a decent amount better.

That's fromsoft giving the game the genuine "souls" experience

>and look at those compared to games made late in the PS4's lifecycle. That's my point, same hardware, but devs have had time to learn it and optimize for it.
ok

Attached: consolecuck.png (1920x1080, 1.64M)

Only a weebshitter could have such a horrible, basic take on system optimization.

Optimization is dead. Publish, or perish.

This is what the absolutely delusional Sony diehards believe.

Attached: ps4s japanese support.webm (960x540, 2.96M)

did he and Joe Rogan 1v1 in Quake?

He's absolutely right. If you play games primarily for story, then you need to be ejected from the entire hobby.

Well, I'm sorry if I've said bullshit.
Can you please say why I'm wrong?

Have you tried running any games at 4K on your supposed PS4 Pro-style rig? The Pro easily gets 60 at 1080p (see Infamous Second Son), but multiplat games don't take advantage of the Pro hardware, and Sony have said they don't want people with normal 30fps PS4s playing online against Pros at 60fps. You'll never be able to properly compare a PS4 (Pro or otherwise) with a PC with similar specs, because only first- and second-party games (i.e. exclusives) fully take advantage of the native hardware. Look at the technology behind GoW 2018 and Horizon Zero Dawn, along with Spider-Man, and you'll see the specific way they cater to the limited RAM with streaming technology and trickery designed for the console, whereas if it were a port they'd just have load times, less detail, or smaller draw distance. You're trying to compare apples and oranges.

...

Yes.

Numbers are numbers, idiot.

>Theorically the only thing a console has to do is play Vidya
Literally just wrong. Please stop assuming that the only function consoles have to upkeep is running the game process. Any system with the ability to do so has to run much more.

The parts he's referring to are the buses and pathways that run between the core components, which are, of course, part of the specs. Or simply that it's easier to optimize software for a single configuration. So no, he's completely wrong; he's just excluding part of the specs. All things being equal, all things ARE equal.

>8 core AMD/Intel cpu clocked in 3.0 ghz will run the same

based retard

>Look at the technology behind GOW 2018 and Horzion Zero Dawn, along with Spiderman, and you'll see the specific way they cater to the limited RAM with streaming technology and trickery designed for the console, whereas if it were a port then they'd just have load times, less detail, or smaller draw distance
Nah, all PS4 games (including the exclusive ones) have been downgraded heavily.

Attached: 1400911045355.gif (1280x2212, 932K)

Because it has the same shitty mobile CPU that struggles to run 900p/1080p games at 30fps

pretty much this

Attached: Sony.jpg (1500x760, 223K)

Wow, consolefags actually are genuinely delusional.
Tell me user, does Days Gone do this?
How about Drawn to Death?
Oooh, or what about The Last Guardian?

The first shot obviously isn't from a PC with specs anything like the PS4's; it's just the early demo from a much stronger computer they used to develop the game. The lower shot is also obviously from a YouTube video, and neither picture is at a high enough res to really tell what the level of detail and resolution is.

Grim Fandango

Attached: PS4 eye problems.jpg (1552x2000, 825K)

devs crank settings up too high and don't let you put them at sensible settings for the hardware

then what about this lmao

Attached: Unreal Engine 4 PS4 No SVOGI vs PC SVOGI.jpg (849x470, 117K)

Attached: 1418406079354.png (829x989, 673K)

I have an FX 8350 with a GTX 960 and 8GB RAM, and I can't even run Dark Souls 3 at high settings at 1080p

I can run DS3 pretty well with an RX580 and FX-8320.

>I’m smarter regarding computer hardware than John Carmack, trust me

That's because you have a shitty 960.
If you're not going for the absolute highest end hardware, you go AMD.
Otherwise, enjoy your trash performance.

Attached: Dark Souls 3 benchmark.png (1303x3250, 143K)

>why, yes, I am a subscriber and an active member of /r/buildapc

I'm glad I dumped consoles entirely several years back; they don't even care about consistency anymore.
I don't remember The Getaway: Black Monday having any drops.

He's right in a perfect world.

If you wrote your whole game in assembly instructions for the CPU the console would run better because of the code overhead for accounting for different hardware in the PC.

Pretending that has any bearing on the real world is moronic, because that kind of optimization hasn't existed for 20 years.

So what was Carmack smoking?
He is too good at computing to think he's wrong

Did you know that with the same specs, the laptop also has an 8-core AMD/Intel CPU clocked at 3.0 GHz? That's the definition of a specification.

The point being made here is that games made specifically for a preset platform will perform better than what is possible with similar specs elsewhere.

More simply put, Spider-Man (or another high quality exclusive, designed from the ground up for PS4 Pro) looks better on a PS4 than any game on an equivalent PC. Talking about multiplatform performance is irrelevant because it is highly unlikely that the devs developed the game specifically for each individual console/platform in order to maximize optimization. An objective comparison is thus largely impossible since you would need to run a PS4 exclusive on a PC and compare quality, which is for obvious reasons not possible.

I'm not saying I agree or disagree with any of the points made thus far, I'm just trying to clarify the arguments made.

>meanwhile the most popular games tend to have a focus on story.
says more about the kind of audience gaming has accumulated over the years than it does prove carmack wrong

Attached: oh no no no no.gif (300x190, 96K)

people always like a good story in games, why do you think rpgs and jrpgs are popular

Oh, optimization
Making a game for one particular set of hardware will result in better performance than making a game that has to run on every possible combination of built PCs.

>rpgs and jrpgs
these genres are nothing compared to the best-selling and most popular genres

>but there's no such thing as a game with a great story and shit gameplay.
Look up Goichi Suda

okay cool, but that has absolutely nothing to do with what i said: that it's always been a thing for games to have decent stories

Attached: 1557856324298.png (367x381, 8K)

Drakengard is great story shitty gameplay

yeah, but it wasn't expected, and hell, it still isn't expected now. look at Fortnite, CS:GO, PUBG, Apex Legends

Delusion.

>Spider-Man (or another high quality exclusive, designed from the ground up for PS4 Pro) looks better on a PS4 than any game on an equivalent PC.
I am sure someone will post that Spider-Man pic showing the difference between the E3 and retail/final versions.

Attached: 1418948949481.jpg (748x1728, 644K)

Idk what he means by a mobile part, but properly optimized PC ports are still rare to this day, so he's correct.

I recognize that fucking area!
It's Dying Light, all those ledges and stuff are for parkouring, there's cars in the middle of the street and most vans have goodies in them if you picklock them, I think I remember going there for a mission

Literally a 2014 opinion.

Whenever consolefags talk about how good a console exclusive looks it's obvious that they have shitty eyesight.

Attached: 1548012983635.png (1920x1080, 3.85M)

>So what was Carmack smoking?
He has since gone back on his word, actually. I don't have the source, but look around and you'll find it.

i know right.

Attached: sonypo.jpg (600x600, 58K)

Look what he's done with the Quest. He's not wrong; most devs just don't have the technical know-how, or don't bother to take the time to squeeze out all the performance they can get.

>This is a bad case of chromatic aberration
lmao there is no way this is real

PC gaming is a joke

I agree, and PS4 is absolutely joke.

Sorta. DX11 is bloated and limited; on consoles you have full access to the hardware and can fiddle with it much more to achieve higher gains.

>my pc that's built very much like a certain console runs games that are optimized for that console just as well as the console itself!
Big fuckin surprise. His only crime here is speaking imprecisely.

I have yet to see a single DX12 game that runs significantly better than its DX11 mode.
Outside of maybe Metro Exodus.

>wojakposter complaining about shitty post content

Attached: dx11vsdx12-2.jpg (1920x2160, 1.28M)

DX12 is a bit of a mess but Vulkan games seem to run quite well. The improvement in draw call throughput is a huge benefit. Once mesh shaders make their way to shipping software it will be another nice boost.
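A toy cost model of why draw-call throughput matters. The overhead constants are invented; the shape of the result is the point: a fixed per-call driver cost dominates when you issue thousands of tiny calls, which is exactly what lower-overhead APIs like Vulkan attack:

```python
CALL_OVERHEAD_US = 10.0  # hypothetical fixed driver cost per draw call, microseconds
PER_VERTEX_US = 0.001    # hypothetical per-vertex processing cost, microseconds

def frame_cost_us(num_calls, verts_per_call):
    """Total CPU-side submission cost for one frame under this toy model."""
    return num_calls * (CALL_OVERHEAD_US + verts_per_call * PER_VERTEX_US)

unbatched = frame_cost_us(10_000, 100)  # 10k small calls, 1M verts total
batched = frame_cost_us(1, 1_000_000)   # one big call, same 1M verts
# same vertex work, but the unbatched frame pays the fixed overhead 10,000 times
```

The absolute numbers are meaningless; what carries over to real engines is that cutting call count (batching, instancing) or cutting per-call cost (Vulkan/DX12) both shrink the same term.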

Where is the story in sports games and The Sims?

He's making a lot of assumptions about whether that game is optimized and whether parts scale linearly in performance. He should just stick to programming.

dx12 works best when you have a weaker/older cpu and a faster gpu: it utilises the cpu way more to get the best out of the gpu. a game like Control runs identically between dx11 and dx12 because the game is gpu-heavy as it is; whichever api you use, you're going to be maxing out your graphics either way.

Attached: dx11vsdx12.jpg (1920x2160, 1.11M)

>purple links
>image was literally created by a redditor

Attached: 1554218252458.png (394x355, 170K)

The custom Jaguar architecture is closely shared with the FX 8350, the two most stark differences being:
a) that Jaguar's CPU cores are all downclocked to run at about 50% nominal speed. Ostensibly this was done for better longevity, as Sony had issues with heat dissipation causing cores to run hot all the time and greatly reduce lifespan.
b) the unified memory setup allowing extremely fast CPU/GPU data hand-off, without a bus transfer to a dedicated, isolated graphics card as in PC architecture.


The reason many PS4 games manage to perform as well as they do, given the extremely shitty hardware conditions from (a) is that a lot of them optimize on (b).


Also, it's the reason Carmack is flat-out wrong. He's setting up a false comparison, as the above means there is no such thing as an equivalent hardware spec.


PS4 can (and has to) rely on smaller consecutive render batches, and benefits greatly from delegating work to the GPU. I.e. it is an excellent fit for GPGPU-heavy material, like particle or physics engines that have to feed back data to the main CPU process.

Meanwhile, PC is a better fit for bulk-transfer to GPU and large render batches; prioritizing and optimizing for resources that can remain in GPU VRAM for extended time; render-ahead frames (upload everything needed to GPU -> have GPU render things while next frame is uploading); etc. while the CPU simultaneously can handle heavier loads than the PS4 like a boss - needing much less GPGPU assistance to pull through.

They are conceptually two entirely different approaches needed to maximize performance gains for different architecture.
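The "render-ahead frames" idea above reduces to simple pipelining arithmetic: while the GPU renders frame N, the CPU prepares frame N+1, so the steady-state cost per frame is the max of the two stages rather than their sum. A sketch with illustrative numbers only:

```python
def serial_frame_time_ms(cpu_ms, gpu_ms, frames):
    """No overlap: each frame waits for CPU prep, then GPU render."""
    return frames * (cpu_ms + gpu_ms)

def pipelined_frame_time_ms(cpu_ms, gpu_ms, frames):
    """CPU prepares frame N+1 while the GPU renders frame N; after a
    one-frame fill, cost per frame is max(cpu, gpu)."""
    return cpu_ms + frames * max(cpu_ms, gpu_ms)
```

With cpu=5 ms and gpu=10 ms over 100 frames, the serial model takes 1500 ms and the pipelined one 1005 ms: same work, restructured to fit the architecture, which is the post's whole argument in miniature (at the cost of a frame of extra latency).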

Honestly the shared memory is rarely useful and PCI-E bandwidth and latency is almost never a bottleneck. Unified architectures are always hyped up and never seem to deliver. The reason the consoles use one chip and one memory bus for both CPU and GPU is cost savings.

>Dota
>pubg
>csgo
>GTAV
>Siege
>Warframe
>Rust
>Football Manager
>ARK
>TF2
>MH:W
>Underlords
>RL
>Gmod
>Remnant
Top 15 games played on Steam. Of these, 7 have no story whatsoever, 4 have only lore and no playable story content, 1 has some story quests, and only 3 have an actual campaign: MH, GTAV and Remnant. You'd have to be delusional, though, to think people play them for that.
The first "story-centric" game is at 26, and it's Fallout 4. The first actually good story-driven game that isn't a looting Skinner box is Witcher 3 at 27. After that, the next single-player game you would play for the story is New Vegas... at 119.
Story-driven games really killing it

Attached: 1547359625862.gif (200x200, 948K)

>story.
Nobody gives a shit. Except trannies.

k

Attached: 1556749460159.jpg (3840x2160, 2.72M)

He's right about the PS3 and 360 and he's right about old generation Android. The differences have mostly washed out by now. Remember that John Carmack never made a game for the PS4 and Xbone.

John Carmack should be enslaved and jacked into the matrix as the tool he was made to be.

>ITT: I know more than john carmack who codes for a living

Idort here. As someone with a decent PC (gtx1080, i7-9900k) and a PS4 pro, I straight up wouldn't notice the graphical disparity between these comparisons when playing on a 65" tv 8 feet away from my couch. They'd look completely identical

It's framerate where the difference lies. The console is pulling that off at 30fps where as you're getting 90+ on PC. Anyone concerned about the visual fidelity has the wrong priorities

This scene really is a testament to the power of baked lighting. Sometimes I wish developers would forget about large environments and dynamic objects so we could always have graphics like that.

I think he's an old has-been
>PC is only twice mobile
Is he actually autistic?

He was right at one point in time when consoles and PCs used different hardware bases. But now that consoles have become PCs, that's no longer true.

That said, what is true is that long-term stable optimization curve of a platform will yield improvements in software capabilities. If you look at launch-day Xbox One and PS4 games versus what we have now, it's quite an impressive jump.

Horizon: Zero Dawn is quite impressive, all things considered, for the PS4. Death Stranding is basically using the engine behind that game, but with a lot of refactoring done to facilitate Kojima's bat shit perfection requirements and absolutely animu levels of brain fuck with his plot, story lines, and game play ideas.

Yeah you retard, but they never have the same specs and desktop has better cooling, so no throttling

Ironically, baked lighting is a fuck lot more work than just using whatever basic dynamic lighting comes with Unreal or whatever. Always wondered why people shit on carefully hand-crafted lighting so much

Those screenshots are identical. The difference between DX12 and DX11 isn't visual quality unless they're using DXR. The point of the image was to show off the frame rate.

>He was right at one point in time when consoles and PCs used different hardware bases. But now that consoles have become PCs, that's no longer true.
No. His point was literally that PCs are held back by their APIs and non-static hardware builds/performance.

>I straight up wouldn't notice the graphical disparity between these comparisons when playing on a 65" tv 8 feet away from my couch. They'd look completely identical
there is no graphical difference in those screenshots. the difference is the performance as shown in the top left corner. the games run significantly faster in dx12 compared to dx11. you're looking at 20 to nearly 70 fps difference.

Carmack was right in an era where consoles were so different architecturally from PCs (with customized shit like the cell processor, or the Emotion Engine) that forced Devs to optimize and redesign entire sections of code for the hardware. The Carmack situation is still possible on bonafide console exclusives, but the truth is that all consoles these days are just using off the shelf parts like a PC would, with only the smallest of customizations and add ons. So outside of your Uncharted 4s and Horizon Zero Dawns, Consoles perform about the same as PC specs, due to the homogeneity of the hardware ecosystem these days. Even the Switch is just an off the shelf tegra processor, and it is the one that is ironically punching above its weight due to the architecture of that being so different from most of the ports devs are trying to run on it.

>rocket scientist facts vs. the opinions of internet manchildren that are too terrified to even leave their house

he's probably mostly correct. consoles are made solely to play games. pcs are made to do far far more at any time than a console could. and just because he's right doesn't mean pc is bad for games. stop getting upset over petty shit.

The API advantage does not exist since DX12 and Vulkan. Static hardware doesn't mean much when you have a settings menu.

Now compare that game to: youtube.com/watch?v=H7OP0g3Arzw

Same platform.

Which IS my point. The long-term stable optimization curve of a platform. Case in point. Intel's fucking Skylake Architecture. Intel has optimized it on the SAME process node for so long that the performance difference between Skylake and Icelake is pretty massive. It's even more obvious when you factor in that all studios make vertical slices on PCs to take advantage of the raw capabilities of the platform and optimized for that TINY little slice to sell to Publishers. Those "slices" show the true capability of the hardware compared to consoles, if developers properly developed and optimized for the platform.

Vertical slices, AKA "gameplay reveals", show the divide between a high-end PC taken advantage of in full vs consoles taken advantage of in full.

You do not understand that the PS4 and Xbox One are both made using moderately repurposed PC parts.

>appeal to authority fallacy

This post really is a testament to the limits of human retardation. Fuck you, fuck large textures and fuck graphics. Go watch a fucking movie you nigger.
>man, I was peer pressured into playing games by others in my kindergarten. Unfortunately my brain hits its limit just by breathing so I'm not very good at them. How I wish they were less reactive to what I am doing, I love static shit that just exists and doesn't interact with me in any way
I'd say go watch some paint dry but I guess even 'drying' would be too dynamic for your lobotomised ass

It's an interesting exploration of a certain style of video game. I would never wish for large dynamic games to die out. I just enjoy seeing some good technology and genuine quality lighting on occasion.

Console hardware is just PC hardware now. So that's not really true at all anymore. This guy hasn't worked on a game since the Xbox 360 was around.

Of the 5 most sold games of all time:
-4 have no story at all
-1 doesn't have the story as a focus
-3 of them were released in the last decade, 1 of them 2 years ago

In fact, not even the first 10 have a focus on the story, but on their mechanics.
So tell me again how story is so important.

>The API advantage does not exist since DX12 and Vulkan.
Wrong.
GNMX is even lower level.
>Static hardware doesn't mean much when you have a settings menu.
That is probably the dumbest thing I have heard all day.
>Talks about optimization of chip design vs optimization of game engine architecture to a certain hardware.
Not the same, but I get your point. Also, it isn't true that all studios make vertical slices on PCs. Multiplat devs might do so, but not the SIE first-party devs (unless it is prior to the console's launch, when they may use devkits or prototype hardware). The thing is, you can't optimize for the PC platform because there is no build or performance consistency to target.
Intel CPUs are great for latency but AMD CPUs are great for parallel work, for instance. Which one will you pick to optimize your game engine for? There is no correct choice. So you have to keep both in mind, which will perhaps decrease performance by 10% on one chip but make it 200% better on the other. It is a compromise and they HAVE to make it on PC. Not when developing exclusively for one console, however.

How's it feel to be peak zoomer? Here are two words that BTFO any zoomer argument you might have Sierra, Cyan

>GNMX is even lower level.
GNMX is a high level API moron. it's literally a wrapper for DirectX. GNM their main API isn't any more low level than DX12/Vulkan.

yes, but with the added context that it's only made possible through exclusive deals and making your customers pay a yearly internet fee. consoles technically get sold at a loss because the makers know they can make it back by nickel-and-diming elsewhere

Carmack is right and you all know it

Attached: 0JTg3lV.png (1021x718, 425K)

> but not the SIE firstparty devs

That's bullshit. Naughty Dog has done vertical slices for Uncharted several times. Uncharted 3 and 4 gameplay reveals were vertical slices. The final game was heavily downgraded. The gameplay reveal of Last of Us 2 is a vertical slice. Watch Dogs was a vertical slice. The Division was a vertical slice. Anthem had a vertical slice. God of War sequel had a vertical slice.

>Intel CPUs are great for latency while AMDs CPUs are great for parallel work

Only partially true. Intel CPUs are good at gaming because the industry optimized for them for nearly a decade, mostly because AMD's offerings were dumpster fires. With Zen and Zen 2, the tables have turned. On top of that, next-gen consoles are getting Zen 2 and Navi cores; they're a full AMD stack. This means over the next 10 years we'll see a long-term optimization curve for the Zen platform, and as a result gaming performance with AMD CPUs will go up significantly compared to Intel, which won't improve as much because they are keen on pumping out more *Lake architectures.

He is right in theory and it's because consoles don't have to account for performance lost through swappable connections.

>GNMX is a high level API moron
You are right I mistook it for GNM, which is what I meant.
>it's literally a wrapper for DirectX
PS4 doesn't use DirectX.
>GNM their main API isn't any more low level than DX12/Vulkan.
Yes it is.

yea in 2014, and talking about ps3 you fucking idiot

John Carmack isn't above saying completely retarded shit.

>Listening to John Carmack
>Listening to John Carmack when he's being deliberately misleading when it comes to the state of optimization of engines for various hardware platforms

>PS4 doesn't use DirectX.
which is why GNMX exists. holy shit, you're not clever, are you

>Yes it is.
you clearly have no idea about what you're talking about

Dumped the FX for an Intel; it opened my eyes. Everything got smoother. I'll never get AMD again

But computers are more advanced than laptops...

Attached: steelheavierthanfeathers.gif (500x281, 1.88M)

this

Yes.
The majority of the time that it doesn't is down to preconfigured power settings, mostly so that the laptop lasts more than half an hour before needing to plug in

He's roughly right but the information largely isn't useful. Most PC gamers that are playing seriously aren't running anything on the low end as a console. 2x is also a pretty bold statement, and less relevant than when he made the statement because the commonalities between Xbone and Windows architecture have rendered a lot of the logistics of making a decent PC port when you're already releasing for Xbone much simpler.

For mobile he's absolutely right, they've got great stuff in them but battery life and heat issues mean performance takes a heavy hit.

>The final game was heavily downgraded.
No, they really weren't. The photo modes included in these games have shown that you can easily capture something that looks better than anything in the promotional material.
All vertical slices have been 100% representative of the final product, especially from Naughty Dog. We can't speak to TLOU Pt. 2 yet for obvious reasons. But God of War easily got a graphics update after its initial showing; the lighting engine in the reveal vertical slice was straight-up not finished and looks worse than it ended up in the final game.
Watch Dogs, The Division, and Anthem weren't demoed on PS4s pre-release.
>With Zen and Zen2, the tables have turned
Not entirely. Intel still has the edge a lot of the time (in gaming performance) due to the game engines not being optimized for the parallel nature of Zen. Intel has better latency and that isn't that shocking. AMD has better everything else now.
>why is why GNMX exists holy shit you're not clever are you
GNMX is a wrapper for GNM. Not DirectX since PS4 doesn't use it. That was my point.
>you clearly have no idea about what you're talking about
You can go on some dev forum if you don't believe me. Good luck.

It will as long as the software isn't throttling the performance for battery life and the heat doesn't force the system to throttle due to inferior cooling.

So basically, yeah, if you want to run a laptop under unrealistic conditions, it will perform the same, but it's so much easier to achieve the optimal desktop conditions that it isn't practically worth it.

Leave it to mentally ill autists to doubt an expert's opinion about the matter. Won't even read this shithole of a thread, I already know how it went

>it opened my eyes
nah it just seems like you closed your eyes and became a Intel fanboy. FX was a shitty architecture for games due to its lacking single core performance, but AMD fixed that with ryzen and now it's able to compete more closely with Intel.

Literally unplayable

>HE SMART GUY
>DISAGREEING MEAN U WRONG
Interesting addition
I bet you defend Bill Nye, as he is the Science Guy, despite the fact he's a Bachelor of Engineering flunkee with a couple of honoraries granted to use as a social tool.

Attached: 389-1-23.jpg (500x383, 20K)

It's because of external factors. For example, the hardware is more tightly connected and the software stack has less overhead. Because the configuration is unique, optimization that only works for the specific hardware can be leveraged, etc.
Still wrong. Maybe 1.2x, but not 2x.

>says he wrong
>let me just pull shit out of my ass: maybe 1.2

lol.

>GNMX is a wrapper for GNM
do you not understand how retarded you sound? why the fuck would it have a wrapper for an API it already uses you moron.

>Not DirectX since PS4 doesn't use it.
which is why it has GNMX you clinical retard. GNMX is essentially wrapping/replicating the DirectX code to work on PS4, hence the X at the end of GNMX. GNMX is a high-level wrapper and slower than low-level GNM. GNMX is designed to be almost identical to DirectX so porting is quick and easy, but as with all high-level APIs you lose that "to the metal" functionality. it's not hard to understand.

Please post more based console destruction

>there's no such thing as a game with a great story and shit gameplay
any yoko taro game pre nier automata

This. People that think optimization is still a mindset in your average studio is fucking retarded.

t. retard

>GNMX is a wrapper for GNM.
DirectX 11 is a wrapper for DirectX12.
Think about this for a second m8.
That's not how it works.

John Carmack doesn't know better regarding important aspects of "modern" gaming - his opinion is dated as fuck.

Ghost Trick wouldn't be the same without the story, nor would Planescape: Torment. In some cases, having a well-established lore makes a world (no pun intended) of difference, like in WoW.

However, it is true that story in games must be conveyed in different ways than in film or books. Games like The Order, or the neverending cutscenes of The Last of Us, are clearly a mistake. In this sense, the comparison with porn can work, but I guess a better quote would be:

"Written story or cutscenes is like playing soccer and pausing every 5 minutes to be told how everybody's been feeling. You don't need that, just play the fucking soccer and experience how the game is developing"

Snoys confirmed for the most retarded fanbase.

Yes
Why wouldn't it?
The only difference would be, like, the physical size of the casing; why would that affect performance in any way whatsoever?

Why would John "Based" Carmack betray us, bros?

>why would that affect performance in any way whatsoever?
>I don't know what thermal throttling is!
Come on user, this is pretty simple.

Carmack actually believes gamedevs do optimization still.

>do you not understand how retarded you sound? why the fuck would it have a wrapper for an API it already uses you moron.
Because it is higher level and simpler to use. GNMX uses GNM as a subsystem. It has less depth but it is a more stringent API. How is this hard to grasp?
>which is why it has GNMX you clinical retard
Are you even fucking reading my posts at this point? GNMX has NOTHING to do DIRECTLY with DirectX.
GNMX uses GNM to function. Like to even work AT ALL it uses GNM. GNM is the lowest level API that the PS4 uses, and GNMX is an abstracted version that is easier to use but has less depth and potential uses. In other words, GNMX is wrapped around GNM.
>GNMX is designed to be almost identical to directx so porting is quick and easy
Not identical. It is designed to make porting easier true. That isn't the same though. The rest of your post I can agree with.

if that's true, why do most triple A console games have unstable fps?

Games are to the point where any great game can be played on any device, the only thing holding it back is graphics/resolution.
I would literally play nothing but early 00s games on a tablet if it were viable.

Well all the best looking games are on PS4 so he's not wrong

It's another quote taken out of context. We've discussed it tons of time.

>It's another quote taken out of context.
i don't see a @ in the tweet

Your thoughts would be less retarded if you didn't share them. At no point did I say they had to come out the same year. Just in the same generation of hardware.

Your supposition is like saying that all Calculators are as good as computers because my calculator is more powerful than the Amiga I grew up with.

>476173985
>NOBODY ignored this
This is why the board is dead

This is just a dumb word problem that people don't understand because they want to make kneejerk 'hot takes' on everything.

Some devs still do, even for PC. Rez Infinite is a very recent PC port and it performs extremely well on even budget GPUs with AA and internal resolution turned up; even the much better-looking Area X performs very well.

I CONFESS
I HATE MY FUCKING TN PANEL MONITOR
PS4 HAS BEAUTIFUL OLED TV
WITH PERFECT COLORS AND CONTRAST
PS4 WINS

But they look like any multiplat

Just want to play my Flight Simulator

The fact that my R9 280 plays Control at a higher framerate than a PS4 pro tells me otherwise.

but the worst pc you can buy is still 4 times as strong as the best console on the market lol

This entire thread

Attached: CD47BC21-41FD-4D45-ADD0-DC5FF6DDCFB8.gif (480x480, 1.37M)

Carmack is now in on it with the elite. The moment you start developing rocket science you're basically a slave to the ayys.

Just play your PC on the television you dingus.

Ok nigger, here's how you optimize code for specific hardware.
If you know the L1, L2, and L3 cache sizes, you can set up your data structures to only have cache hits, with N+1 prefetching. This greatly reduces memory-read latency, from ~100 cycles to ~4 cycles.
You can also set up memory accesses in such a way that every subsequent access hits the same row in a DRAM bank, or a different bank altogether. The former saves the RAM from doing a precharge and column select, saving you roughly 30ns on a memory read. The latter effectively parallelizes the read with others, giving it effectively zero delay in overall execution.
If you know the length of microcode paths, you can, for example, use float multiplications instead of integer divisions because the former takes fewer cycles.
Assuming a PRAM model for the GPU, when you know the core count, you can use work-inefficient shading algorithms that are fast instead of work-efficient algorithms that are asymptotically slower.
And these are just the hardware-specific optimizations I can list off the top of my head.

based user

uhm in english please ?

it means I get paid hundreds of thousands of dollars a year to do shit you'll never understand in your life

so why are you shitposting on a console war thread huh ? I didn't even believe your first post anyway, nice try

maybe several decades ago

Who will I believe? A literal genius with computers or some user on an anime website?

Now imagine if most devs actually did any of this to begin with instead of just booting into unreal and working exclusively in C++

I like the Indiana Jones books

Still shit for emulation though

Attached: 1565232636538.jpg (1138x1558, 213K)

Attached: 1554097773494.gif (144x182, 98K)

in english, doc. Explain it to me using a food analogy

You can do all that shit in C++. C++ gives you plenty of control over memory layout; it's the codegen that's abstracted away from you.

he was right, but then things like vulkan happened and the gap isn't nearly as wide as before

Where's the MGSV pointy hat collage?

maybe in like 2006 lol.

On console GPUs he's absolutely correct, first off because developers can go as low level as to write actual microcode where DX11 and OpenGL have a lot of driver and CPU synchronization overhead, second because it's a fixed spec developers will get better at targeting over time. Things have really changed on the API side though since 2014, with DX12 and Vulkan delivering far more efficient and explicit programming of the GPU.

On mobile GPUs he's correct simply because of thermal throttling and battery life concerns. The mobile SoC you're targeting as a dev is always underclocked to some extent.

he still works at oculus on the software side though. he's still keeping up with everything

Cringe

I need to have a PC regardless, and it needs to be good enough for comfortable use. Why buy a 500€ PC and a 300€ console (+games) when I can just buy a 800€ PC? The library will be bigger and cheaper and the games will run better than on console

is this because consoles will run everything at upscaled 1080p and 30fps and pcs will run 4k60fps?

At the time somewhat yes, but his numbers were off IMO, it was more like 50% slower on average. This was before Vulkan/DX12.

I would say it's even now that Vulkan can be done on PC, consoles and mobile.

probably
os overhead and power source

the fuck does vulkan have to do with anything. if every platform is using it then what changes here? what makes his statement false?