Will we see a noticeable jump in graphics next gen?
No.
Maybe better framerates on Consoles.
No. It's purely optimization from here on out.
No, probably never again. Last gen was the last to offer big graphics jumps
Did we see one from the end of last gen? It's all just been resolution and processing power used for more filters and effects
I meant the PS3/360 gen btw, especially on PC
This generation already offers stunning native 4K graphics, all we need is better framerates desu
>mfw the jump from PS1 graphics to PS2 graphics
>the hud actually curves like you're wearing a helmet
Now that's cool.
No, the selling point will be resolution now since retards love to wank over arbitrary meme numbers
native 4k this coming gen
4k60 next gen (maybe)
8k the one after
Meanwhile 10 years from now games will look about the same as RDR2
lighting is gonna be the big thing this gen,
>SNES to N64
This. Drooling console retards only want bigger resolutions. At least PC gaming will be pushing the forefront of high-hz gaming. Can't wait for 120hz to be the standard on PC.
When the cost of living goes down and education isn’t fucking half of our income being used to pay it off
Even if there is, what's the point? Developers will have to hire more people to spend more hours adding a million polygons to the models, which will end up taking away from the budget for other areas, as well as resulting in less content, since more time spent working on each individual model means less in total, so you'd be paying more for shorter games. All of this for what? So that I can see Ultra HD wrinkles on a guy's forehead if I zoom the camera in on it? Focusing so much on graphics is a waste of development resources, not to mention the unnecessarily high system requirements needed to run these games (which they'll probably try to curb by reducing framerate to nauseating levels as usual) and the ridiculously bloated file sizes.
No, only optimization. Hopefully this means devs will spend more money on mechanics, art/style, and gameplay
We could have photorealism, but what you stupid motherfuckers never seem to understand is the IMMENSE amount of work AND time it would take to do that, and now with things like RTX the probability of that happening has gone down even more because it takes away part of the graphics designer's job and lets them be lazy.
from here we are going to see more and more of the RTX shit and less of the polygon shit
consoles are getting it as well so it is official now, it is starting with the next gen
If you want games to take 7-8 years to be made then sure
>Sega Megadrive to Sega Dreamcast
>playing Sonic Adventure 2 as a kid after playing 2d sonic games for years
I hope this is the case, then again we're talking about consolefaggotry here so I'm sure some other graphical gimmick will crop up that will make performance take a hit.
Who the fuck is asking for games in 4K, I really want to know
This. Increasing graphics to stupid wank levels is largely responsible for the by the numbers AAA trash we get these days. Half the budget goes into getting 4k photorealism at 12fps on console, and the other half goes into marketing.
Indies are truly the only hope for gaming desu.
Did you see one this generation? Imagine that but halved or even quartered.
no chance, Sony are going to keep pushing that ridiculous 4K HDR standard and smoothbrained consumers are going to eat it up
Tech companies. To justify selling 4K TVs.
Then in 5+ years they'll be selling affordable 8k TVs and games will have to follow suit.
It's all so tiring. I'd be perfectly fine with never going beyond 1080p. I'd rather have 1080p at 144fps than 4K at 60fps
look on and see the morons saying that careful retopology isn't necessary anymore. Moore's law doesn't end because technology stops improving, it ends when the people using it become complacent
Literally nobody. It's a dumb gimmick only used so Microsoft and Sony can add another """feature""" to their consoles that normies can gawk at. Disregarding the fact that it takes a monster PC to even get 60 fps at 4K. Also an excuse to sell stupid 4K televisions.
You're a funny guy. Trust me, one of these days the big AAA companies are going to look for some stupid graphical feature like ray tracing or rendering ass hairs or some stupid shit and we'll be getting 300 gb file sizes and keep the 12 fps console standard.
>family bought a 30'' Trinitron WEGA in like 2002
>always thought about upgrading but never did
>ended up just using this until now
I mainly play on PC and just use my consoles on the old CRT. Is it really worth an upgrade?
Side by side my monitors don't look that much better than the CRT at the distance I'm sitting (especially with my eyes).
Do you think I should upgrade? Would it significantly improve my experience to buy a big new TV (My vision is blurry more than like 6 feet away anyhow).
This.
Just think about the amount of talent and skill that went into Battlefield 5 and how much more advanced it is compared to Battlefield 3, yet it looks pretty much the same when you're actually playing it. Now imagine how much money it costs and how much stress the teams are under.
It's just a marketing thing for normalfags that eat this shit up, they're the kind of people you hear IRL boasting about their new and pointless 4K TV purchases.
I just want to see a game that's just focused on cool ass physics particle simulations like they show at SIGGRAPH, even if it's at the cost of good graphics.
>rendering ass hairs or some stupid shit
we literally have games where each individual eyelash and head hair is 3d modelled and placed individually
No, we don't
Have you ever seen another human being before?
yes, we do
there are people where that is their entire job
Not in "next gen" - The top PC games aren't looking much better than anything else since we don't have the processing power for a big leap any more - mainly because the GPU (and other) industries have stagnated because of Trump's trade war with China - meaning things cost more for manufacturers, and that cost is sent down to consumers, who don't really want to pay $1300 for a GPU
No we don't.
That's not an individual eyelash/hair
That would be intensive and dumb for little gain
Maybe a token few games will stand out looking next-gen, but otherwise potential optimization can definitely go down the drain further for no real discernible improvement in graphics quality. I wouldn't be surprised if the bulk of PS5 games or whatever essentially bank on the SSD's faster speeds or maybe some lavish effects kinda like PhysX used in more games.
>everyone hops on the open world train but well done LODs, view distance, and lots of complex, seamless interiors are still rare as shit for video games
As someone who plays everything at 4K Ultra on PC: no, graphics technology has stagnated because people care more about god rays and shit than what really matters. Individually rendered blades of grass, permanent footprints, and beaches with functioning waves add more to realism than anyone gives credit for. Games look good nowadays, but I expect raytracing to be legitimately underused on the consoles. The best choice would be to push 60 fps and 4K medium, because that still looks better than 1080p 16xAA Ultra
yes we do
look at the eyebrows, each card is a hair
It is insanely intensive, that's why the reveal trailer had different hair to the final game because the overdraw was absurd
Yeah absolutely, anyone saying otherwise is delusional and i wish i could rub it in their faces in a few years.
PS5 is looking to be more powerful than the toasters that most of Yea Forums's master race uses, while the PS4 Pro was basically a shit tier potato.
Just look at how much visuals have improved just from launch of the PS4 to now, a few years into next-gen, games will blow your fucking mind, especially multiplats that make it to PC.
This is still the "jury rigged" style of hair that is done on models that are rendered in real time, in pre-rendered, the hair is rendered literally hair by hair, therefore it can look realistic with all the shit like translucency, reflections, thickness and so on, unlike the real-time rendered shit that still looks like some brush bristles caked in grease.
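Some rough back-of-the-envelope numbers on why real-time rendering settles for cards instead of true strands (every count below is a made-up ballpark for illustration, not from any actual game):

```python
# Ballpark comparison of strand-rendered hair vs. card-based hair.
# All numbers are illustrative assumptions, not measurements from a real title.

STRANDS = 100_000          # offline renders can afford this many rendered strands
SEGMENTS_PER_STRAND = 32   # each strand tessellated into short segments
VERTS_PER_SEGMENT = 2      # each segment of the polyline is two vertices

CARDS = 3_000              # a decent-looking real-time hairstyle: a few thousand cards
VERTS_PER_CARD = 4         # each card is a textured quad (two triangles)

strand_verts = STRANDS * SEGMENTS_PER_STRAND * VERTS_PER_SEGMENT
card_verts = CARDS * VERTS_PER_CARD

print(f"strand hair: {strand_verts:,} vertices")  # ~6.4 million
print(f"card hair:   {card_verts:,} vertices")    # 12,000
print(f"roughly {strand_verts // card_verts}x more geometry for strands")
```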
>push 60 fps and 4K
Stop this you stupid nigger. Consoles can’t even hit 60 fps at 1080p on a consistent basis, why would you push for something that is obviously unattainable for the ~$500 console market?
PS5 will use Zen 2 + Navi and sell at a loss, it'll easily be equivalent to an 800+ dollar system from right now.
4K 60 fps isn't hard to achieve at that point, especially with console optimization magic.
I do disagree with that poster though, i think they'll go for 30 fps instead so they can fit in more raytracing and other graphical bs, most people don't care about fps as much as they should.
To all the niggers saying graphics have stagnated:
there's yet to be a game entirely built on physically based rendering from the ground up
there's yet to be a game entirely built with accurate fluid caustics and physical objects in mind
there's yet to be a game entirely built without loading in any capacity
You sound like a damaged individual
Arkham Knight for me was a massive leap graphically for an open world game. Witcher & Dying Light came out around the same time but AK did textures/lighting far better in my opinion, though that's partly because all of it was set during nighttime.
Why's that?
It will not be an $800 PC in a $500 box; it will be an 8-core Ryzen at 2.0 GHz and a GTX 1070 equivalent. They're gonna push that hardware to its breaking point. Some people would say the 1070 isn't a 4K card, but I've played a lot of 4K on it so I don't ever know who to trust.
I'm not saying consoles should play at 4K, but they are gonna try, no matter what
I'm not saying it'll be equivalent to a 2020 $800 PC, but to an $800 PC from right now.
A 1070 equivalent I can believe, maybe 1070 Ti-ish, but I think they'll use a much better processor than what you're describing.
The best Zen+ is better than that and it's confirmed that they will be using Zen 2, pair that with them most likely selling at a loss and having some special deals with AMD and you're looking at a possibly quite capable system imo.
>1080p in 2019
>640K of memory is all that anybody with a computer would ever need.
People that spout this shit have no sense of imagination. Why the hell would current tech be our peak? Fuck, 1080p isn't even current tech, it's ancient; shit is like 14 years old. It reminds me of when I thought my nice boobtube TV looked glorious and I would never need anything else - see how well that turned out, even though the TV still looks great running Mario Sunshine on it.
One day I'll upgrade my 1440p 144Hz G-Sync monitor to a 4K HDR 120Hz+ one, no way this Dell is the best experience humanity will ever offer.
My b, i meant to say "the worst Zen+ is better than that"
hopefully never
>50 years from now
>"We present to you the PS11, capable of rendering at 16k the individual hair on the ass of this hobo in real time"
>"You can even see the lices in his hair, with each of their individual legs"
>"Still 25fps tho sorry"
Unironically not unrealistic, 60 fps simply isn't a term that seems to appeal to general console audiences.
that is inevitable, hair cards are an approximation just like normalmaps, baked AO and cubemaps are. they're never going away, they're only going to get more and more complex and detailed.
hell you even sometimes see the old Shadow of the Colossus fur shell technique show up these days
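For anyone curious what that shell trick actually is, here's a toy sketch: draw the same mesh several times, each copy pushed a bit further out along the vertex normals, with each layer's texture getting sparser so the stack reads as fur. The triangle and constants below are made up purely for illustration.

```python
# Minimal fur-shell sketch: generate the vertex positions of each shell layer
# by offsetting the base mesh along its vertex normals.

def shell_layers(positions, normals, num_layers=8, fur_length=0.05):
    """Yield the vertex positions for each successive fur shell layer."""
    for layer in range(num_layers):
        t = (layer + 1) / num_layers          # 0..1 along the fur length
        offset = t * fur_length
        yield [
            (px + nx * offset, py + ny * offset, pz + nz * offset)
            for (px, py, pz), (nx, ny, nz) in zip(positions, normals)
        ]

# One dummy triangle with upward-facing normals (illustrative only).
tri_positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]
tri_normals = [(0.0, 1.0, 0.0)] * 3

for i, layer in enumerate(shell_layers(tri_positions, tri_normals)):
    print(f"shell {i}: {layer}")
```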
>it's ancient, shit is like 14 years old
Just like your mom
Woah
Better to have 60 fps and actual AI. I swear that AI got worse this console gen.
That's exactly what's not going to happen as the constant chase for muh grafix pretty much always sets back the performance quality
Shut up You liberal piece of shit you are stupid
No. Practical photorealism was achieved many years ago. Because of the exponential growth in hardware capability, future generations will instead be closing in on simulated realism -- using physically-based rendering to refine and expedite what used to require mountains of careful detailing. Expect quicker development cycles, larger game sizes, and more experimentation with technology.
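For anyone wondering what "physically-based rendering" actually evaluates, one of its standard ingredients is the GGX microfacet distribution used by most modern specular BRDFs. A minimal sketch (the roughness values are arbitrary examples):

```python
# GGX / Trowbridge-Reitz normal distribution function D(h), the usual
# microfacet term in physically based specular shading.
import math

def d_ggx(n_dot_h: float, roughness: float) -> float:
    """GGX distribution; n_dot_h is cos(angle between normal and half-vector)."""
    alpha = roughness * roughness        # common "roughness squared" remapping
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# Lower roughness concentrates the highlight around the mirror direction.
for rough in (0.1, 0.5, 0.9):
    print(f"roughness {rough}: D(n·h=1.0) = {d_ggx(1.0, rough):.2f}, "
          f"D(n·h=0.8) = {d_ggx(0.8, rough):.4f}")
```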
Naughty Dog's next IP
Hair doesn't work in squares
"each card is a hair" is just incorrect
Why do you keep going on as if you're pretending to be retarded?
>Hair doesn't work in squares
it does in video games
>"each card is a hair" is just incorrect
except it isn't. each card is textured as a single hair. they are individual hairs
I'm not on Yea Forums to argue semantics, feel free to send Danilo an email and ask him yourself
>it does in video games
great we also model heads as cubes it's so realistic 1 head = 1 cube
They haven't MODELLED one HAIR as one OBJECT
You can see this with your eyes
What's after 16k? 32k?
Lighting is key to realistic graphics, I agree
Reminder that Crysis 3 was able to run in a downgraded form on hardware from 2005.
I’d imagine that developers will start paying more attention to HDR implementation now that HDR displays are more common, I’m rather giddy about the possibilities. Of course, many new games support HDR already and some look glorious with it (Horizon: Zero Dawn, FF XV etc.), but there’s also quite a lot of games where the implementation is straight up lazy or bad. But now that it is becoming more and more common, it will become a higher priority and will end up giving a very nice boost to graphics quality without any serious performance hits.
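For reference, the HDR10 signal those displays take is encoded with the PQ (SMPTE ST 2084) transfer curve; here's a minimal sketch of just that curve, nothing engine- or game-specific:

```python
# PQ (SMPTE ST 2084) encoding: maps absolute luminance in cd/m^2 (0..10000)
# to a 0..1 signal value. Constants are the published ST 2084 values.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Encode an absolute luminance (cd/m^2) into a PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

for nits in (0.1, 100, 1000, 4000, 10000):
    print(f"{nits:>7} nits -> PQ {pq_encode(nits):.3f}")
```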
At 15 fps
It will be noticeable, but only because games run so shit on consoles nowadays. Maybe consolelards will look at the same graphical quality they had last gen but at proper resolutions and 60FPS and demand this from now on. I guess that's good for the industry to some degree, but this should have been the norm since the PS3/360 days.
600k triangles when? What about 6 million or 6 billion? I want a redo of this image. It's outdated now.
except they have. there are hairs on this face that are single sub-objects of a few polygons that are textured as hairs.
one object, one hair.
what about this concept are you finding so challenging?
artstation.com
what you're arguing is like saying the face is not a face because it is made in the exact same way, a collection of vertices connected to form triangles that are textured in order to represent a face.
literally the only difference is that the hair uses a different texture and lighting model
Crysis 3 ran a fair bit better than Crysis 2 (and especially Crysis 1) on consoles. The PS3 version ran significantly worse, but the 360 version generally holds to the mid-upper 20s. It runs quite well via XBO BC, but the 32fps framerate cap causes some annoying judder.
DESU we’re kind of hitting the peak with resolution now, going above 4K doesn’t make much difference unless you have supermassive screens to go with it. And no matter how big screens tech companies make, the limiting factor will ultimately be the available space at people’s homes - I doubt that many people would find space for screens larger than 65 inches, and something like 70+ inches starts getting ridiculously large. What good is 8K or 16K if your screen is so small in proportion to viewing distance that it looks the same as 4K does?
I have no doubt that the screen technology will evolve, probably in directions we can’t even imagine now, but the resolution game really has hit the point of diminishing returns by now.
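Quick sanity check of that diminishing-returns claim, using the usual ~1 arcminute (20/20) acuity rule of thumb; it's an approximation, not a hard limit, and the screen sizes are just examples:

```python
# Distance beyond which a viewer with ~1 arcminute of resolvable detail can no
# longer distinguish individual pixels on a 16:9 screen of a given size.
import math

ARCMIN = math.radians(1 / 60)  # one arcminute in radians

def acuity_distance_m(diagonal_inches: float, horizontal_pixels: int) -> float:
    """Viewing distance (metres) at which one pixel subtends ~1 arcminute."""
    width_in = diagonal_inches * 16 / math.hypot(16, 9)
    pixel_in = width_in / horizontal_pixels
    return (pixel_in / math.tan(ARCMIN)) * 0.0254

for pixels, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    print(f'65" {label}: pixels blur together beyond ~{acuity_distance_m(65, pixels):.1f} m')
```

For a 65" panel that works out to roughly 2.6 m for 1080p, 1.3 m for 4K and 0.6 m for 8K, which is why 8K only pays off if you sit absurdly close or buy an absurdly large screen.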
>this meme image again
I'm all for saying fuck graphics in favor of literally any other aspect of game development, but this was made using the 6000 version as a base, so the 60000 isn't as detailed as it would be if they hadn't, and is misleading. Someone post the image debunking this.
Real life graphics. Real question is, what's the next jump after that?
Graphics don't do much if the game is shit.
>What good is 8k or 16k if your screen is so small in proportion to viewing distance that it looks the same as 4K does?
But how do we know this is true? Maybe the smallest differences is enough to break the illusion of reality.
fun games
Wake me up when photo-realistic rendering is a thing.
You say that as though 90% of AAA games aren't "really good" at worst. Tiny-dicked internet forum warriors like to pretend that games they aren't interested in are "terrible" but the truth is that when you get 60+ million dollars and 3+ years of hard work, the results rarely turn out bad.
>photo-realistic meme
>not just real world rendering
stop chasing graphix
It's the way to maximum immersion.
There was a look into one of the new 8K monitors and even looking at it through a compressed youtube it looked beautifully sharp.
I thought the same as you - that phones having a 720p screen was ridiculous for such a tiny screen, that there was nothing much to gain from a higher res - and look at phones now. Anyway, most of my viewing is up close at a monitor, so I think 4K will not be the peak for me; desktop space, DPI, sharpness etc. will all be sweeter with higher-res monitors.
the fuck is Gaunter O'dim doing in sci-fi shooter?
Graphical fidelity goes hand in hand with emotional connection. Crytek were founded to create the "ultimate emotion". They chased extremely high graphical fidelity and extremely high end digital acting. They've also dabbled in VR. Their end-goal was always to create a game where you truly care about a digital human being and want to protect them. If they hadn't been fucked over by the GFC, Redemption might have been that game. It was TLOU years before TLOU, except the little girl could die and you would have to pay careful attention to her emotional state.
and as soon as they move, the continued ridiculousness of games as a representation of reality is starkly shown: characters slide across the ground, glitch between unnatural movements and basically have no weight or presence. FUCKING FIX IT
>2019
still frame quality: photo
animation/motion quality: same as half life 1
Games are expensive because of marketing and poor management, not because of graphics. The Old Republic cost nearly as much to make as GTAV and half of GTAV's 260 million dollar budget was literally just marketing. Crysis 3 cost 66 million by comparison. Crysis 1 cost 22 million and there are plenty of games released within a decade of it that cost more and don't look as good.
Sounds real.
I think lighting and animations are going to be the most noticeable improvements in this next upcoming gen
Yeah, because they'll buy it for the movies. Sony is absolutely based.
NO
But that's because they don't exist in games, right? They still exist as simulations.
Wow EA must be desperate if they're shilling on Yea Forums. It's not gonna work though, their time's up buddy.
console plebs will finally be rid of loading screens
and better lighting is coming, nvidia got spooked and prematurely released their tech
Consoletards can't see above 25fps.
Not him but I'm pretty sure there are multiple hairs per triangle, otherwise his head would look pretty barren.
>missing the point this hard
To be honest, I'd be perfectly fine with graphics staying on PS2 level, in higher resolution and updated controls. It's getting ridiculous how they sacrifice performance so they can add another piece of hair on character's ass.
modern hair tech isn't based on polygons, it's compute shader
nvidia uses its tessellation bullshit, but AMD tech won anyway because it runs well on consoles
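A toy, CPU-side sketch of the strand-simulation idea behind that kind of compute-shader hair (TressFX-style: Verlet integration plus a length constraint per segment). Everything here is simplified and the constants are made up; real versions run this per strand on the GPU.

```python
# One hair strand as a chain of points: Verlet-integrate, then pull each
# segment back to a fixed length so the strand doesn't stretch.

GRAVITY = (0.0, -9.8, 0.0)
SEGMENT_LEN = 0.1
DT = 1 / 60

def step_strand(points, prev_points):
    """Advance one strand by one timestep; the root point stays pinned."""
    new_points = [points[0]]  # root is attached to the scalp
    for p, p_prev in zip(points[1:], prev_points[1:]):
        vel = tuple(a - b for a, b in zip(p, p_prev))
        new_points.append(tuple(
            pi + vi + gi * DT * DT for pi, vi, gi in zip(p, vel, GRAVITY)
        ))
    # Distance constraint: keep every segment at SEGMENT_LEN.
    for i in range(1, len(new_points)):
        ax, ay, az = new_points[i - 1]
        bx, by, bz = new_points[i]
        dx, dy, dz = bx - ax, by - ay, bz - az
        dist = (dx * dx + dy * dy + dz * dz) ** 0.5 or 1e-9
        s = SEGMENT_LEN / dist
        new_points[i] = (ax + dx * s, ay + dy * s, az + dz * s)
    return new_points, points  # second value becomes prev_points next frame

# A strand sticking straight out sideways, then let gravity act for a few frames.
strand = [(i * SEGMENT_LEN, 0.0, 0.0) for i in range(5)]
prev = list(strand)
for _ in range(3):
    strand, prev = step_strand(strand, prev)
print(strand)
```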
Same, but they'll never stop chasing it. Gotta give those modelers and artists jobs after all.
>Maybe better framerates
hold my 24fps
besides framerates, no
unless memetracing becomes the next """big""" thing even though it's just the game looking the exact same but at 10 fps and triple the size of the game
Not the polygons fault they aren't being used well.
you say this as if it's an accomplishment.
Seems like animation is the way to improve now.
4K even isn't standard yet
>triple the size of the game
What are you talking about?
>hell you even sometimes see the old Shadow of the Colossus fur shell technique show up these days
I think they used that for the grass in the latest Mario Tennis
Fucking hell, do people post that meme just to piss off people who actually know what they're talking about?
Also, videogames are not photo-real yet. They can get close, but they're not there yet.
>4K
>Zen 2 cpu
>Amd Vega gpu
>12GB Ram
I hope so.
Not him, but the general idea of raytracing = quality is increasing the model quality significantly because performance doesn't suffer much from it. You can have massively higher scene complexity without performance loss with raytracing. This, however, still means a lot more data to store. Imagine real 3D scans of buildings used as source material. It's possible with this.
It's not missing the point at all.
The 60,000 mesh isn't a true 60,000 mesh. It's a 6000 mesh that's been sloppily uprezzed.
It's like taking a tiny picture and then stretching it to HD size, rather than using an image that was originally HD in the first place.
Also, the 60 and 600 versions are shit too because they're not constructed like how any 3D modeller would use the tris they had.
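To illustrate that complaint: naive midpoint subdivision multiplies the triangle count by four every pass without adding any actual surface detail, since every new vertex just sits on an existing edge - which is more or less what the image's higher counts amount to. Toy mesh below, purely for illustration:

```python
# Split each triangle into 4 coplanar triangles: more tris, zero new detail.

def subdivide(triangles):
    """Each triangle is a tuple of 3 vertex tuples; returns 4x as many."""
    out = []
    for a, b, c in triangles:
        mid = lambda p, q: tuple((pi + qi) / 2 for pi, qi in zip(p, q))
        ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

mesh = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]  # one flat triangle
for i in range(4):
    mesh = subdivide(mesh)
    print(f"pass {i + 1}: {len(mesh)} triangles, still the exact same flat surface")
```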
4K is kind of the standard if you're buying a new, good, TV.
Sort of like how 3D became the standard for a while ago.
Having said that, much like 3D, I'm not sure how many people are actually running it.
I have a 4K TV, but I still watch standard blu-rays and, while 4K gaming looks great, I prefer having decent framerates more.
You have no idea what raytracing is. Raytracing is a lighting/rendering technique. It literally tanks framerates, so saying it allows for higher complexity in other assets is literally backwards. You can drop raytracing in a scene and change nothing else but maybe some lighting actors/values, and leave everything else alone. Activate it and framerate takes a noticeable hit, and the overall game size only increases a few KB at most due to the added information for the lighting values. Nothing you said makes any sense.
>Activate it and framerate takes a noticeable hit, and the overall game size only increases a few KB at most due to the added information for the lighting values.
How does a few KB make it drop framerate? This doesn't make any sense.
> amd
> not nvidia and intel
poorboxstation
Are you trolling me?
I'd settle for current graphics staying the same if they brought up game quality. Better worlds better characters, more content, etc. I foresee tools that make the process of making 3d games easier. Enough of this 2d shit.
You don't understand where the cost comes from. Raytracing cost is the amount of rays you use which mostly depends on your desired screen resolution. It's vastly more cost intensive than regular rendering to raytrace a single 3d model but it costs almost nothing to change the 3d model to a fullblown city in a raytracer compared to making that change in a regular renderer.
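Rough scaling comparison behind that point: with an acceleration structure (BVH), per-ray traversal cost grows roughly with log2 of the triangle count, while a rasterizer's geometry work grows roughly linearly with the triangles submitted. This ignores shading, culling, LOD and so on; it's only about how the two approaches scale, and the scene sizes are arbitrary examples.

```python
# Compare how per-frame work grows when the scene gets 1000x more complex.
import math

small, huge = 1_000_000, 1_000_000_000  # 1M-triangle scene vs 1B-triangle scene

raytrace_growth = math.log2(huge) / math.log2(small)  # BVH traversal depth ratio
raster_growth = huge / small                          # brute-force geometry work ratio

print(f"scene is {huge // small}x more complex")
print(f"ray traversal cost grows ~{raytrace_growth:.1f}x")       # ~1.5x
print(f"rasterizer geometry work grows ~{raster_growth:.0f}x")   # 1000x
```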
>we live in an age where it's possible to use countless creative, stylish art styles due to hardware capability
>make dusty, boring, grey realistic shit with iffy gameplay
>You don't understand where the cost comes from.
Yes I do, and while your definition of the raytracing cost is right, I disagree with almost everything else. Making more complex models is likely cheaper than raytracing, but it really depends on the scene. That said, I understand your point now. Seems like a little bit of mental gymnastics, though, because whether or not raytracing catches on, it wouldn't be its fault that game sizes increase. They've been doing that forever without raytracing being a thing.
in a perfect world, but since parts get better every month they don't bother to optimize anything. websites included.