DUDE LET'S MAKE $1200 GPUS MELT FOR DYNAMIC LIGHTING THAT YOU'LL HARDLY NOTICE DURING THE HEAT OF THE GAME LMAO

youtu.be/blbu0g9DAGA
>DUDE LET'S MAKE $1200 GPUS MELT FOR DYNAMIC LIGHTING THAT YOU'LL HARDLY NOTICE DURING THE HEAT OF THE GAME LMAO
>DUDE IT DOESN'T MATTER THAT LIQUIDS AND WALLS DON'T REFLECT LIKE MIRRORS IT LOOKS SO COOL LMAO

How do we stop this detrimental meme called Raytracing foisted upon us by the scum at Nvidia? This garbage waste of tech that'll make devs focus more on shitty boring corridor shooters than making vibrant level design with proper art direction.

Attached: ray tracing.jpg (1310x688, 100K)

Other urls found in this thread:

newegg.com/powercolor-radeon-rx-5700-xt-axrx-5700xt-8gbd6-3dhe-oc/p/N82E16814131752
youtube.com/watch?v=5jD0mELZPD8
streamable.com/5uzrn
streamable.com/3jer1
streamable.com/fbsdd
youtube.com/watch?v=a2IF9ZPwgDM
dorian-iten.com/fresnel/
dsogaming.com/news/here-is-how-you-can-completely-fix-the-annoying-dx12-stutters-even-with-ray-tracing-in-control/
youtube.com/watch?v=blbu0g9DAGA

It will get better when it passes the meme stage, like bloom did

>>DUDE IT DOESN'T MATTER THAT LIQUIDS AND WALLS DON'T REFLECT LIKE MIRRORS
What are you talking about?

I remember the Battlefield showcase of RTX where they were boasting about the building reflection in a tram window, and it wasn't even correct lmao. I can't be bothered to look it up, but maybe someone who wants to can pay close attention to that spot in the video

>DYNAMIC LIGHTING THAT YOU'LL HARDLY NOTICE DURING THE HEAT OF THE GAME
Oh, god, it's the console kiddy argument.

Even though ray tracing is the future Nvidia goofed hard by trying to push it before it was ready.

How are they pushing it before it's ready? Control works flawlessly.

>falling for the raytracing meme

It will be used more in the future, for sure, but it's emerging tech that isn't worth the price -- considering how few games even implement it.

Remember when bloom was the hot new graphical thing and devs added bloom to everything for a while before they realized how retarded it looked?

This upcoming gen will be looked back on as the time of fucktarded looking reflective concrete surfaces.

Attached: bloom.jpg (900x506, 127K)

>How do we stop this detrimental meme called Raytracing
By not buying nvidia

>fucktarded looking reflective concrete surfaces.
I'm sorry, but that's what polished fucking concrete looks like. I'm starting to think a lot of gamers are incels who have never seen real light interacting with the environment in their lives.

I still see that as a good thing. The more raytracing is pushed by the industry and the more hype it builds in consumers, the faster it'll be implemented efficiently. Solar was dogshit and cost a shit ton per kWh until it didn't.
Then again I'm a pirate and a smelee so I'm really only behind progression for progression's sake.

>How do we stop this detrimental meme called Raytracing foisted upon us by the scum at Nvidia?
actual competition in the high end market

Bloom isn't reflective of reality. The entire point of ray tracing is that it is an irrefutably accurate portrayal of how light works. You can't have an opinion on this. You're either correct or you're a fuckwit who can't do math.

Devs are going to turn down the roughness settings on surfaces to an unrealistic degree, just to show off the technology. They're already doing that.

They're not, though. All the surfaces in Control have appropriate materials for what they are.

Are you fucking retarded?

Nvidia unironically are the only company to actually push the industry forward even if they do tend to use that early advantage to price gouge. If nvidia didn't exist and we were left with AMD and intel we'd be stuck with intel HD graphics and last gen nvidia performance from AMD. The radeon 7 is/was the best and most powerful GPU AMD ever made and it barely matched nvidia 2016 performance with the 1080/1080 ti.

We won't realise it now but nvidia practically forcing devs to update their engines to support ray tracing and to utilise low level API like vulkan and DX12 to do it will benefit us all in the long run or else we'd be stagnating.

Blood dries very quickly. This looks fucking dumb.

Attached: blood ray tracing.jpg (1303x696, 94K)

Respectfully, games and film alike have been cheating with "pools of blood" for decades, so why stop now?

>literally a console build demo as if the controller wasnt fucking proof enough

Still optimistic, and Yea Forums's opinions suck
>in b4 Yea Forums isn't one person

AMD fans for you.

blood doesn't reflect like that; in general it looks more like shining a light on semi-reflective plastic

What the fuck is this post even trying to say?

I think I'll have to switch over to AMD since Nvidia wants me to spend a small fortune for slightly different looking explosion effects.

Attached: WOW GAYTRACING.jpg (1320x727, 78K)

Digital Cuckry shills for the Njewia

Never mind this game looks like a PS3 game with its jank animation and mediocre graphics

Blood pretty much only reflects big light sources like floodlights, etc irl
>t. paramedic
>bonus story: got a call yesterday for a 60~ y/o male unresponsive.
>walk into his 1 bedroom apartment
>wine bottles everywhere
>entire floor is covered in a dark red liquid.
>step into room
>squelch
>that isnt wine
>this poor fucker got shitfaced and ended up having an AAA
>hematemesis everywhere

it looked like a horror movie desu. thats my story hope you enjoyed it.

Attached: E84C4EEC-D6CB-4D96-AE6C-B87859A742A6.gif (440x330, 1.24M)

>Never mind this game looks like a PS3 game with its jank animation and mediocre graphics
It's the best looking game you can buy at this very moment. You should probably get your eyes checked.

In real-life, not everything is polished like a fucking mirror you fucking fuck-tard. Brb with every floor being a fucking hall of mirrors.

this guy gets it tho desu famalam

Dude, fuck off. Look at those jank fucking animations and look at that character model. Resident Evil 2 Remake looks 5 console generations ahead of this trash, and unlike this Quantum Break reskin it runs at 60fps on fairly mediocre hardware, with an excellent engine meant to scale that manages photorealism.

I read this story and I enjoyed it

I love it. Gonna buy one of those crazy GPUs and not need to replace it for years upon years, because everyone will be obsessed with the hit new gimmick effect and I'll have that shit turned off.

American government buildings are often highly polished. The architecture and materials you see in Control are exactly what you'd expect.

Attached: FOR YOU.jpg (1200x778, 152K)

>t. has never seen blood in larger quantities than a paper cut

It depends on the surface a bit, yes, but large amounts of blood do not dry as quickly as you think.

>Resident Evil 2 Remake looks 5 console generations ahead of this trash
RE2 remake has severe issues with its scene reflections, a weirdly common problem with Japanese games, has basically no physics, and is completely missing most of the sophisticated lighting, shadowing, and shading effects found in Control. Even Quantum Break is more sophisticated than RE2 Remake by a significant margin.
>Look at those jank fucking animations and look at that character model.
Weeb detected. Also, a huge complaint about Alan Wake and Quantum Break was that they prioritized animation over responsiveness. Control is more like Max Payne 2, and that's a good thing.

I see you've only seen the literal first hour of the game.

Attached: Control_Screenshot_2019.08.29_-_20.57.36.63.png (1920x1080, 2.31M)

Ray tracing will be mandatory in AAA games within 3-4 years just as DX11 became mandatory around 2013.

at least you get something different with that extra money

Attached: rebrandeon.jpg (1340x1588, 529K)

RE2 Remake isn't photorealistic in the slightest.

Ray tracing is not something like DX11. It's not some framework-tier thing.

Mad cause poor.

That's just not gonna happen. Not after the 290X burned, both literally and in sales.
That was AMD's last real attempt at a high-end GPU, they made a huge fucking slab of silicon that was faster than a Titan and sold it for half the price of one.
Naturally, the reference blower cooler was loud as fuck and didn't actually cool the card and even if some of the aftermarket models were pretty good the damage was already done.
You've got a similar situation with the 5700 XT, which chose to expose junction temps, which are naturally going to be higher than package temps. But people saw 100 degrees Celsius and were immediately turned away. It doesn't help that, again, decent cooler designs weren't available until like a month and a half after launch.

Even if by some miracle AMD's next gen is actually good for high-end and sold at a lower price their image has been ruined so hard that it won't matter. They've failed too many times and their PR really isn't helping.

ray tracing won't be "mandatory" it's just the natural evolution of graphics and as hardware becomes better and more capable developers will use it to make better looking games which has been the case throughout the history of games development. AMD are going to be debuting their ray tracing cards next year if rumors are to be believed. the next gen consoles are also going to be supporting ray tracing hardware (confirmed in the case of the next xbox anyway).

It's a good analogy for how developers will quickly abandon any attempt to make their games work on non-compatible GPUs. In 2013, along came Crysis 3 and Call of Duty: Ghosts. Neither of those games worked on a GPU that didn't support DX11 features. The number of such games rapidly increased. By 2024, you won't be able to play the latest Call of Duty game without a hardware ray tracing accelerator of some kind.

>ray tracing won't be "mandatory"
Yes it will because nobody in the AAA space is going to make games with traditional rasterized rendering within a few years.

Well first of all, no one really cares about CoD on PC anymore.
Secondly, DX11 has been around since 2009.
You'd have a point if we were rapidly seeing the number of games mandating DX12 increase, but we're not, and I doubt we ever will.
And we're almost 5 years into DX12 being on the market; by late 2013/early 2014 DX11 was mandatory in the vast majority of non-indie games.

If you honestly believe games will be using fully ray traced rendering in 2-5 years you're retarded. The ray-traced reflections and GI will probably become standard but we're still a decade off fully raytraced rendering at acceptable framerates.
Especially as 4K and higher framerates become more and more common, 4K just fucking kills any chance at fully raytraced graphics any time soon.

do you understand what mandatory means? there won't be a compulsory law in games development saying all devs have to use ray tracing, which is what it would take for it to be mandatory. if most devs choose it, it's because they can and it's of their own will. you'll still have games which don't use it because it is still a huge performance impact. competitive games are the obvious ones which won't use it, or will use an extremely lightweight solution for it.

If you've actually played the game on a good graphics card, then you'd know how bad the lighting set up is. It's not just the ridiculously bright and huge floor reflections either, the amount of bloom coming off certain surfaces like white concrete is fucking blinding. The art direction in the game is pretty bad to be honest, not sure why people keep sucking it off.

>do you understand what mandatory means?
Yes. If you want to play the latest AAA games, a GPU with hardware ray tracing will be mandatory within a few years. How is that difficult to understand?

>The art direction in the game is pretty bad to be honest, not sure why people keep sucking it off.
Someone isn't a fan of brutalism.
>Especially as 4K and higher framerates become more and more common, 4K just fucking kills any chance at fully raytraced graphics any time soon.
4K is a dead meme. A gimmick pushed by TV companies, and roundly rejected by graphics people who know what they're talking about. 1080p and 1080p with a shitload of reconstruction is going to be the norm going forward.

all ray tracing right now is hardware ray tracing, even on pascal GPUs which support RTX. what you're saying doesn't make sense. you'll have dedicated RT hardware for assisted RT, but that doesn't mean people who don't have that dedicated hardware can't run it.

5700 xt aftermarket cards sold out senpai.

>all ray tracing right now is hardware ray tracing even pascal GPUs which support RTX.
It runs like dogshit without a significant chunk of the die being dedicated to ray tracing.
>you'll have dedicated RT hardware for assisted RT but that doesn't mean people who don't have that dedicated hardware can't run it.
The performance will be so cripplingly bad that developers will simply prevent the games running on older GPUs.

No. I seriously doubt ray tracing will go with the dedicated hardware accelerator route. That's just really fucking dumb. You'll likely be using a vendor-agnostic API like DXR if games go fully raytraced and that won't happen for quite a while.
For assisted ray-tracing only for reflections and global illumination it obviously won't be mandatory.

imagine buying a 2060, 2070, or a 2080 and not waiting for the 3060, 3070, or 3080ti.

Isn't there still only one with a non-blower cooler, the Sapphire Pulse? And that one's not even on the shelves in Europe yet. All the earlier "aftermarket" cards were still using the blower design.

fixed

Attached: cbt.jpg (1303x696, 285K)

The 285 and 380 are a more accurate comparison since they literally used the same chips there.

The chart doesn't mention that the 390 uses GCN 3 vs the GCN 2 in the 290, which is a common misconception. It's one of the main reasons why the 2xx series works well in Linux but kills itself when it encounters a 3xx series: the devs were lazy and wired everything up almost the same for 2xx and 3xx cards despite this difference, and it didn't get better until very recently, with Linux 5.1 or so.

>3900X
>wonders why his framerate is low

I'll say that this game looks great even with raytracing off. The screen space reflections are crazy good, not sure how they pulled it off. MY RX 480 runs it great.

they always test on AMD, it's just a superior cpu

>Crysis will never be topped, why don't they care about making true next gen graphics
>Ray tracing is a meme
Why do we need amd/nvidia consolewar faggotry

ty user stay safe out there

No, your information is outdated. There are over 5 now; here's an example:

newegg.com/powercolor-radeon-rx-5700-xt-axrx-5700xt-8gbd6-3dhe-oc/p/N82E16814131752

And they are all sold out. I ended up getting a reference design because I don't give a shit about the noise, because every single aftermarket card is sold out, and the Asus one is backordered before even coming out.

9900K is better for gaming. Should have used it instead

This game is making my RX580 get to 80°C for some reason, what the fuck?

not even looking at your image, but if you can damage your gpu by using it, then your cooling solution is not adequate and your product is defective

oh boy, same graphics card, and same problem.
>turn everything down to minimum
>turn on vsync
>still gets up to 58c. even with my crazy fan set up
the fuck is going on here?

>>still gets up to 58c
Are you retarded?
58c is insanely cool at max load.
Is every PCfag on Yea Forums this technologically illiterate?
Fucking blows my mind.

It's "better" if you're CPU limited because you're aiming for 240 FPS at 1080p. If you're aiming for 60 it really doesn't matter and more and more games are becoming parallelised to the point that AMD's core count can actually matter.

it's significant because it's 58c at 100% fan speed. I remember hearing horror stories of solder softening and hardening repeatedly, leading to cracking at temps as low as 67c, let alone the 70C that some people let their shit get to, so I set my fans to run at max at 55. is this wrong? was I misinformed? or is chink solder as bad as I was led to believe?

Whoever told you that is either trolling or a fucking moron.
Unless you're mining and have the GPU maxed out LITERALLY 24/7, none of that will ever be a concern unless you get SUPER unlucky in the silicon lottery.
Even the fucking Fermi housefire GTX 480s were fine at upwards of 85c for the most part.

>80 is safe
well, guess that means I can turn my fans down a bit. thnx user.

I have this 6 year old Tri-X 290X that I never upgraded from because when I wanted to the cryptofags ruined the market and now I'm just waiting for 2070 or 5700XT prices to come down a bit. It's been running at honest to god 80C-90C on full load as I've had to push the voltage and clocks ever higher to keep up with modern titles, it's currently at +80mV and +200 core and +500 memory clocks. Fans set to ramp up to 100% at 65C, not that it helps with those voltages. They've been running so hard for so long that the bearings are worn to shit, 2 of them can only spin at ~40% of their max RPM when set to max.
The soldering is still completely fine; I replaced the VRM heatsinks and the thermal paste once and it wasn't even that bad when I did.
The card was made to run at 80, and it can definitely do that for years.

this leads to another question though, how fucking hot were 360s/ps3s running that they died due to soldering issues?

no, 4k will be the standard going forward. They may use tricks such as checkerboarding, or intelligent upscaling, but 4k is going to be the new standard for consoles like 1080p was this gen.

AFAIK the PS3 idled around 70 on the CPU so... VERY hot. Another problem is that there's just not enough airflow so the inside of the case gets insanely hot and the PCB lights up as well.
In a gaming PC even if your core temps are high you'll usually have enough airflow to keep the case itself relatively cool.

Generally speaking, anything below 90c is fine for most modern GPUs
The issue there was the combination of inadequate cooling for the form factor consoles target and cheap chinkshit solutions.

>DUDE LET'S MAKE $1200 GPUS MELT FOR DYNAMIC LIGHTING THAT YOU'LL HARDLY NOTICE DURING THE HEAT OF THE GAME LMAO
>implying people who can afford $1200 gpus can't afford liquid cooling systems to adequately cool an overclocked video card, let alone a card running at stock speeds.

poor shits kill me with shit like this. just because you have garbage specs that'd fry when rendering a modern game doesn't mean everybody else does.

Attached: 32.jpg (390x310, 28K)

>implying that the majority of people with $1200 GPUs even know what the fuck they're doing
Let's be honest, a lot of people with 2080tis and ESPECIALLY the retards with TITANs have no fucking clue about building PCs and are daddy's boy rich fucks or e-celebs.
The retards who actually fried their 2080s and then just bought 2 more are good examples of this.

Right now it's just a tech showcase.
RT will become mainstream in less than 5 years.
Every AAA game will be designed around RT, not just adding it as an afterthought like they're doing now.

oh god I still remember that retard who argued for an entire thread that still bodies of water did not act like a mirror.

>meme
>garbage waste of tech
Degenerate retard.
youtube.com/watch?v=5jD0mELZPD8

No they aren't. Just look at it: blood doesn't reflect like that. IRL it only reflects big light sources, more like a shiny plastic surface.

I mean it's closer than shadow mapping but it's still not 100% accurate. It portrays light as only a solid particle and not as both a wave and a particle. And obviously it doesn't simulate any of the reality-breaking weird quantum shit.

I think it's really good tech but performance wise it should cost around 10 fps not fucking 40+

Attached: Control Screenshot 2019.08.29 - 18.49.05.46.jpg (2560x1440, 538K)

Attached: 1547924051620.jpg (1247x808, 603K)

Developers are inept, lazy retards however so this is nothing but a detriment and a drain on hardware resources

So I spend 1600 dollars to get 1080p and 60 fps?

>performance wise it should cost around 10 fps not fucking 40+
It's an absolute miracle of optimization that it runs faster than 1fps.

That's a lot of blood

It's a bit of a cheat because it's behaving like a liquid, just not like human blood, and that's an artistic dispute. If you spilled a red syrup on the floor that was already reflective, it would look like that.

Can't wait to play this game....in 2021.

>I mean it's closer than shadow mapping but it's still not 100% accurate. It portrays light as only a solid particle and not as both a wave and a particle. And obviously it doesn't simulate any of the reality-breaking weird quantum shit.
All fair points.

4K is a technological dead end on par with 3D TVs.

based gamer thread (CBT) poster

You say that but people are buying 4K tvs like nobody's business. Even though there's nothing to watch in 4K. It's stunning.

Because it's water tinted with blood.

Point is, being accurate to reality is a stupid goal to set. It doesn't really matter if your simulation is accurate, it only matters if your results are.
So if your tech completely hacks it but still ends up with accurate results 99% of the time, great. Fully raytraced rendering is definitely the future and it will happen some day, probably not in the next 5 years though.
But its main advantage is that it can get more accurate results, not that it's a more accurate simulation. For now a lot of the shit Nvidia is pushing with RTX is just really silly, compared to traditional rasterised rendering it's not enough of a difference in results for a huge difference in performance. And once fully raytraced rendering starts being popular it will almost definitely not be done by dedicated RT hardware. That would make no sense as essentially 25% of your GPU would be doing 90% of the rendering work.

That's still a lot of blood. The color is very intense.

Hardware accelerated real-time ray tracing has been part of the DX12 spec since early 2018. Nvidia hasn't pushed anything forward technologically, they've just found another way to gouge early adopters with half-baked silicon.

They're going to get BTFO extremely hard when a generalised implementation drops and $200 GPUs are outperforming the 2080 Super.

You're delusional. The 2080 Ti's performance is not that great, and it costs 1200 dollars. Nobody would buy an AMD graphics card that expensive, but they could make one if they had a gun to their head.

>Nvidia
>Forcing devs to utilise low level API like vulkan and DX12
Except for that time when they were stopping devs from utilising async shaders because they ran better on AMD, and even incentivised benchmark makers to change their benchmarks so their cards would look better. There's a reason that AMD cards run proportionally much better in Vulkan and why the Fury was suddenly kicking ass in those DooM Vulkan benchmarks.
Nvidia only pushes the industry forward when it benefits them.

try using a real cpu next time instead of one that still can't run crysis at 60fps

Attached: 60fpsdenied.webm (854x480, 1.61M)

t. poor

How is this an argument? No, honestly, would you actually buy a $1100 GPU from AMD if it was similar in performance to the 2080ti but with standard AMD "quirks" of higher power draw and heat?
AMD's entire brand revolves around price-to-performance nowadays; nobody would buy their expensive GPUs.

The game runs great with both RTX on or off

Attached: Control 2019.08.30 - 17.07.25.06.webm (684x384, 2.93M)

>Pay more than a 1000 bucks for a gpu
>Can't even get 60fps on max settings

Attached: yNlQWRM.jpg (1813x2111, 912K)

Anything sub 144 isn't "great"

Ray Tracing is the future, people have to stop forcing the 4k meme already
it isn't worth it and the cost of multiple games forcing T-AA is awful
especially since it isn't the standard yet

Retard.
The telekinesis reminds me of Saints Row 4

i don't see that huge a difference in anything except reflections in glass.

how does that even possiburu with this ultra realtime ray tracing?

Attached: b2sc4zhkp3j31.jpg (616x479, 14K)

>another remedy tech demo that is unoptimized to shit and mediocre or outright bad

Attached: 1435745488225.png (423x454, 193K)

Pretty good story, but the illustration didnt really fit the mood of the tale.
Keep up the good work and you'll nail it in no time.

>Solar was dogshit
and it is now. retard

it's cost inefficient and location dependent,
like windmill power plants.

fucking nature loving idiots holding back good things from progressing and becoming cheap + ubiquitous. Nature is doomed and you can not do anything about it. and why should you even care about this anomaly of a planet.

roughness is THE option to drop your PC performance to the ground. That's why everything is a polished turd. It's not about "show-off"

It might not be very realistic but that sure looks cool.

I'm excited for Ray tracing simply because it means we can have working mirrors again

The entire point of ray tracing is to no longer cheat with reflections.

no way, is crysis 1 still used as a benchmark?

Could've gotten max score on your story if it wasn't for the cheap whore. Pretty nice short read tho

No, not really. They did it to test Ryzen single core performance, since Crysis still runs like shit on a lot of CPUs by being bound to a single core.

Attached: Control Screenshot 2019.08.31 - 00.53.47.25.png (1920x1080, 3.51M)

How is 4k a fucking meme? I'm so tired of this lie. Hook up a 2080ti to a 4k tv, put it on 4k. Oh snap, it's sharp. Put it on 1080p. Oh snap, it's blurry. Add a shit ton of AA. Oh shit, the edges are gone but some faggot smeared vaseline on my tv.

I've been running a gtx 270 or some shit like that for years without cleaning, all day every day; load temps were nearing the 80s if not 90s, and it only died after like 5-6 years of such abuse, long after it became completely obsolete. You are being extremely paranoid about your hardware, it's not nearly as fragile as you think, and maxing fans out that low will only lead to more noise and fans dying sooner... Fans getting misaligned is a much more common thing than you getting a card with bad silicon or soldering.

i have 1080ti for VR and will wait
for 4080ti.

fuck this 5-10% bullshit advantage.

Attached: 1562514580962.jpg (1224x896, 99K)

THANK YOU BASED NVIDIA!

Attached: 1023948710234.png (1920x1080, 2.31M)

please stop with this graphics madness

>why you should even care about this anomaly of a planet.
Because the next generation has to live on it.
Although if some people weren't so piss scared of nuclear we probably wouldn't need solar.

It's a meme because no hardware can adequately handle it, let alone at 144hz, which is THE thing that changes how you see the game. 2k on a good monitor is more than enough for the foreseeable future. If you want to hook your stuff up to inferior tvs with the only thing going for them being muh size then be my guest, you are still likely to enjoy it if you think that it's AA that smears vaseline on your screen and not the tv itself. Tv is a fucking meme and you should treat them as one, it's like expecting quality from gamer branding. Inferior colors, horrendous input delay, various internal hardcoded mechanisms of "enhancing the picture" that do nothing but smear it all up. The only reason to use a tv is if you are a dirty little console peasant who won't ever see what good graphics look like anyway, but we are talking about 4k and no console does that yet, or in the next gen, or in the next two, not without ruining the framerate for sure.

It has value as a benchmark in some situations because it can't handle multiple cpu cores very well.

aesthetics is above realism
of course, things don't reflect like that in real life as if everything was shining clean, but it's prettier

RT makes AMDfags seethe

can't wait to see how they react once AMD comes out with their own RT capable graphics cards. they'd better continue with this narrative that RT is dead and pointless or else there are some serious double standards and hypocrisy here.

These. Anti-nuclear luddites should unironically be put against a wall and fucking shot.
>abloo abloo, a soviet power plant made out of pig iron and run by perpetually drunk and incompetent commies exploded because they did literally everything wrong, so no-one is allowed to use nuclear reactors ever
Daily reminder that nuclear energy is literally the cleanest and safest energy source known. Daily reminder that solar is a fucking meme, and wind generation costs more energy to build than it will create during its entire fucking lifecycle + it fucking ruins the landscape and kills thousands of birds.

>How is 4k a fucking meme.
Because you're pissing performance up the wall for no real benefit. 4K was pushed to sell TVs. Nothing more. Graphics engineers have always disliked it.

lol. retard.
there are no good options in 50+ inch TVs with lower than 4k resolution. People buy them because there is no choice (the only choice is Not to buy)

thank god you did not see blood in large amounts. this is just water colored by blood.

not made by the blood undone by the blood

Attached: maxresdefault.jpg (1280x720, 80K)

>How do we stop this detrimental meme called Raytracing

can't stop the progress. it will take some time until devs start using it correctly and we should encourage richfags to buy more 2080 cards so nvidia can move on to better and cheaper RT cards faster.

Attached: 1552787003408.jpg (1280x720, 198K)

resolution is a meme if you're not taking into account screen size. playing on a 24" 1080p monitor has the same pixel density as a 32" 1440p monitor, so naturally if you want to increase screen size and keep the same crisp visuals you need to increase resolution. for big screen content like 65" TVs, 4k is an absolute necessity. my dad has a 1080p 65" TV and you have to sit far away to not be able to see the pixels anymore.
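
If anyone wants to check the math, here's a rough sketch (assuming standard 16:9 panels, numbers approximate):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # pixels per inch for a panel of the given resolution and diagonal size
        diagonal_px = math.hypot(width_px, height_px)
        return diagonal_px / diagonal_in

    print(ppi(1920, 1080, 24))   # ~91.8 PPI
    print(ppi(2560, 1440, 32))   # ~91.8 PPI, same density as the 24" 1080p panel
    print(ppi(1920, 1080, 65))   # ~33.9 PPI, why a 65" 1080p TV looks pixelated up close
    print(ppi(3840, 2160, 65))   # ~67.8 PPI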

>WOW single thread only engine
>WOW still less than 60 fps
>WOW how could it be vros?

but we dont have other options
amd is for niggers and poorfags and cant even run android emulators

Imagine being so fat, you're trying to make a point about hardware but you end up talking about food instead.

Is this normal in the US?

>dat NOISE
just fuck off

Attached: 1561694989590.jpg (1600x1200, 275K)

Precisely how could it be. Just because you have 6 cores or double the amount of threads there is no excuse for the first logical core being slow shit.

>make puddles reflect like mirrors

that'll be 599.99 plus tip

VRChat has mirrors just fine. Literally a non-issue.

I don't know how all modern games have been fucking it up so bad for so many years when a shoddy, glitchy VR social game has had working mirrors from the start.

Attached: Lame.png (440x440, 79K)

he is right though. but only if you are using m+kb

Attached: 1560345007792.jpg (1280x882, 229K)

fucking retard

Attached: Control 2019.08.30 - 16.54.29.03.webm (684x384, 3M)

alan wake was good

Because the modern games want to reach the biggest amount of users unlike VR shitapp devs that care fuck all about optimization or performance.

>4k meme

resolution will continue to increase forever

Is this the true power of RT cores?

That's mixed with a lot of water.

>puddles don't have reflections on real life
>you don't notice better graphics while you're playing the game
>I didn't buy a 1200 dollar gpu for better graphics
maximum cope

And for what reason would you need to increase screen size that much for pc gaming? Consoles won't handle it anyway and there are barely any films or shows to watch in 4k. There is literally no reason to buy one if you aren't a pc gamer with thousands ready to go to build a top of the line machine. Turning your 5 grand pc into a console-like thing and couchsurfing with it seems to be the only option, but at that point you'd be much better off with one of the bigger 144hz monitors set up appropriately. You can even still have your ass in a couch. And you will have much better image quality; for what little you'd lose in sharpness you'd gain much more in fluidity and responsiveness.

Neither of those looks like glass. Glass is transparent.

cat is fine too

Attached: 1560372696692.jpg (500x456, 36K)

>DUDE IT DOESN'T MATTER THAT LIQUIDS AND WALLS DON'T REFLECT LIKE MIRRORS IT LOOKS SO COOL LMAO
Please leave your room from time to time.

Attached: 34531125214_d0146d150b_b.jpg (1024x838, 372K)

>Already used to 4k/60hz or 1080/144hz for PC gaming
>Buy this $1200 GPU and enable this new graphical feature
>Can't even run at stable 1080/60 anymore, but hey it's more realistic
Definition of a pointless gimmick being forced ahead of its time. It's PhysX and Tessellation all over again.

Because current high-end graphics are too demanding for the brute force way of doing reflections in the current graphics model. In essence, you have to render everything again for every reflecting surface. This works fine if your polycount is low, your textures are small, AA is barely a thing and you don't have any particles. But with current graphics at the resolutions they are pushing for? Way too demanding.
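
Here's a rough sketch of what "render everything again" means for the old planar-reflection trick, assuming a flat mirror plane (numpy just for the vector math, names made up for illustration):

    import numpy as np

    def reflect_point(p, plane_point, plane_normal):
        # mirror a point across the plane defined by plane_point and plane_normal
        n = plane_normal / np.linalg.norm(plane_normal)
        return p - 2.0 * np.dot(p - plane_point, n) * n

    def render_with_planar_reflections(render_scene, camera_pos, mirrors):
        render_scene(camera_pos)          # one full scene pass for the main view
        passes = 1
        for plane_point, plane_normal in mirrors:
            mirrored_cam = reflect_point(camera_pos, plane_point, plane_normal)
            render_scene(mirrored_cam)    # ...and another full pass per reflective surface
            passes += 1
        return passes

    # toy usage: one reflective floor, camera floating above it
    floor = (np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]))
    print(render_with_planar_reflections(lambda cam: None, np.array([0.0, 2.0, 5.0]), [floor]))  # 2 passes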

I would rather raytrace with my RTX than play in 4k. I don't even own a 4k monitor. 1080p 144FPS raytraced games are the ultimate experience at this time.
Nvidia are greedy bastards but they are not wrong.

>NOOOO WINDOWS DONT MAKE A REFLECTION!!! NOOOOOOOOO

Attached: shop-window-reflection-cheapside-london-england-EN2115.jpg (1300x956, 243K)

chill
in a couple years it'll be more practical.
more cards will enter the market and drop the price.
more developers will learn how to run it more efficiently.
in a few years you'll be shit talking anything without ray tracing.

this game looks so kino
i cant wait for it to arrive

>NOOOOOOOOO RAYTRACING IS UNREALISTIC!!! WINDOWS DONJT REFLECT LIKE THAT HRUAAAAAAAAAAAAAAA

Attached: 9b50017ad6cd3a689a564fdd146bcb86.jpg (1508x1124, 886K)

>NOOOOOOOOOOOOOOOO BUT ITS A MEME!!! MEEEEMEEEE!!! MEMETRAYING!! THIS CANT BE TRUE!!

Attached: 29847864601_fe8fb862bf_b.jpg (1024x714, 229K)

>Tessellation
>pointless gimmick
neck yourself

>nvidia hairworks
>nvidia-ageia physics
>nvidia-ray tracing

Nvidia is some kind of perverted King Midas,
turning everything to shit and holding tech hostage for many years.

Attached: gold-shit.jpg (350x336, 31K)

aug 2020.

the game uses physx too

Attached: CNTRL53.jpg (1851x2160, 1.77M)

right. i forgot nvidia invented raytracing

Attached: 1566993001639.jpg (555x555, 53K)

>LOL
>Let's just subdivide all this water under the level that the player doesn't even see just so we can sell more expensive video cards.

Attached: IMG0032910.jpg (600x338, 136K)

nice avatar u have m8 )))

PhysX is mature now and arguably this is the time for it, not 2006 when they were pushing people to buy a second dedicated card just to run the PhysX simulations.

Attached: proxy.duckduckgo.com.jpg (1280x720, 81K)

PS5 is confirmed to have ray tracing support. Good luck playing those ports next year on your gtx card.

what's next after ray tracing?

>muh realism

guess how i know you're a low iq retard

poorfags like you are just as bad as consoleniggers

Attached: ghost recon15.jpg (3840x2160, 3.68M)

Damn, that looks good. Too bad CP2077 movement isn't as smooth as this.

So you don't actually have an argument. Sad.

So when they nail down graphics will we finally get onto things that matter like AI and physics?

This is for the retards who keep talking about how windows don't reflect like that or that puddles aren't mirrors.

Attached: 30789864703_6ca6a6586d_k.jpg (2048x1365, 360K)

lolno

>great physics
>great particle effects
>best implementation of ray tracing in a modern game ever

why did people ever expect to run this with raytracing without a top of the line card?

it runs decent on mid tier cards WITHOUT the raytracing, you know.

you have to have an argument first.
saying new tech is bad because you might have to upgrade your old gpu is not an argument.

Attached: avellone.gif (680x381, 2.75M)

>physics

what the fuck do you mean? physics is solved in gaming. so is AI, since that mostly has to do with how skilled the programmers coding the game are.

I did have an argument but that wasn't it, Strawman harder faggot.

Forget AI and physics. When are they going to finally focus on actual gameplay?

>b-buh if i don't get 9000fps on ultra it's poorly optimized!

Seething

>Jannie menstruated again...

Yeah your argument was
>waahhh tesselation under water!! muh gpu cant handle
And then I posted a good implementation of tessellation in Wildlands.
Keep seething, poorfag.

Yes i'm sure graphics programmers and GPU vendors will be instrumental in advancing gameplay.

that fucking grain everywhere...
in almost every rtx game, especially with shadows, you get this shitty grainy look

Wow, it's like you didn't even fucking watch the video you posted.
Did you just read the title? Holy fuck, what a dummy.

Ray tracing, but for real this time. As in, fully raytraced rendering and not this hybrid dedicated RT cores shit. Basically your entire GPU will be doing raytracing like it would in a CGI movie.

>Yeah your argument was
Wrong. Try again faggot. This time actually read.

I fucking love the really chunky mist the Hiss leaves behind

maybe he didnt turn off filmgrain

Attached: CNTRL65.jpg (3840x1677, 3.72M)

I did. Here's your "argument": one bad implementation in Crysis 2, which of course means the whole tech is pure garbage, a meme, and worthless!

2017:"consoles are holding graphics back"
2019:"HAHA PC GAMING IS HOLDING GRAPHICS BACK!"

wow how did we come to this

ray tracing doesn't belong to nvidia.
they just manufactured the first consumer grade chip that can kinda handle it.
Other manufacturers will follow suit.

?

Crysis is going to continue to run like shit until the processors get good enough that the game loop can fit entirely in cache.

my 2080ti plays this at mostly stable 60fps and the raytracing actually looks pretty good, as in i actually notice it when playing.

Attached: contrl.webm (1100x680, 2.83M)

Thank you for your civil discussion I enjoyed reading both of your posts

all that fucking grainy noise

We're getting closer.... now you just read the whole reply chain instead of cherry picking something out of context and we'll be there. They really should pay me for walking you room temperature IQ faggots through basic reasoning like this.

>literally removes some parts of the shadows
god even physx was a better memefeature
literally nothing new with this shit, it just removes reflections/shadows if you dont have a $1200 gpu

Attached: file.png (1208x677, 1004K)

better graphics sells
better AI doesn't sell

It runs fine on the 2060 which is the weakest RTX card

>but that's what polished fucking concrete looks like
And how often do you find concrete polished to a mirror sheen?

you guys do realize physx is not the basis for all video game physics now?

Yeah and if I do that we're back to you complaining about your garbage bin laptop not being able to run fucking tessellation, which barely affects performance anymore.

Attached: Control 2019.08.31 - 12.22.01.05.webm (684x384, 2.95M)

I get solid 60 fps with all raytracing on with my 2080ti at 1440p. Stop being poor.

blood is mostly water, dumbass

So now we're back to the strawman, we were so close too. I almost thought you were capable of human levels of discussion. I guess I was wrong.

No you don't. You drop into the 40's during heavy combat.

Prove it.
Record some footage of heavy action with RTSS overlay on to show GPU clocks, utilisation, and frametimes.

Cyberpunk 2077 Threads
>GRAPHICS BAD THEREFORE BAD GAME
Control, KCD and other high end PC game threads:
>GRAPHICS TOO GOOD THEREFORE BAD GAME.
What did Yea Forums mean by this?

Attached: 1564334689040.jpg (602x602, 125K)

Nah I really dont see your argument there.
Basically what you're saying is that you're used to 4k 60 (lol) and that with better visuals you cant get 4k 60 (lol) anymore.
Oh noooo!!!! Fuck progress and fuck better and new tech! Btw consoles are holding back graphics!!

It's almost like Yea Forums isn't one person, you dumb fucking memeposting redditor.

Control isn't bad. But it's mediocre, and not because its graphics are good.

every location looks the same
played for 30 minutes and got bored

>Raytracing
>GPU clocks
You have no idea what you're talking about, do you? The limiting factor in the fps with raytracing on is not your GPU clocks, it's the number of raytracing cores that you have. Your clocks will be lower with RTX on because your GPU won't need a higher clockspeed.

yeah its mediocre because it does something similar to another game from 2004. fucking telekinesis in every game!

>every location looks the same
>played for 30 minutes

Attached: 1515907874233.jpg (660x574, 31K)

No. It's mediocre because the combat is fucking boring shit.

So now you've actually read the comment and you still present half of it as strawman. I just want to see if you can make an argument that isn't a strawman or otherwise in some other form of bad faith.

>tfw i turn off bloom if possible
fuck bloom and fuck depth of field and fuck every meme tech

Because that's not an argument, that's just what's happening. Better tech comes around so now you can't play 4k 60 unless you upgrade.
And raytracing also is not a pointless gimmick, I really hope you don't think Nvidia invented it.

in all the videos I see it's still the same locations:
an office building, sometimes with red light

Actually wind energy is pretty efficient compared to every other form of renewable; the major issue is storing it during off-peak times, since batteries are dogshit

My only issue with nuclear is 3rd world countries getting it and 1st world countries shipping in 3rd world workers to run it. You see how incompetent our energy structures and future planning are, and you don't think it'll end up being a complete shit show down the line?

>Better tech comes around
If it's not optimized to run well, then it isn't "better" tech. You can hand-carve potatoes into fries because you think that makes them taste better, which'll take you a while to do, or you can run them through a machine and be done in a minute or two. Is either technique inherently better than the other? No. They have the exact same result: making french fries. Once they figure out a way to make raytracing run well (4K 144Hz), then it might be considered a true successor to other lighting techniques.

You can't make raytraced reflections run as good as screen space reflections.

left one is better

rtx is a scam. That shit will never run on full at decent frame rate on anything other than shit from the early 90's. It's just another tool given to developers who are too lazy to do their own lighting and want to make shortcuts at the expense of performance.

you know next gen consoles will have it too
and amd are working on their own raytracing stuff?

Attached: 1544155878605.png (385x343, 53K)

>Sub 60 drops for 1080p footage assuming you have top of the line gpu
>Higher resolutions? Lower frametimes? Don't have top of the line gpu? Don't wanna see drops? Fuck you, buddy
>Flawlessly
Nigger fucking really. Don't even argue that you can turn down other settings, because it's the raytracing that fucks the framerate, you will have to turn down a lot to.have any significant improvement in framerate stability

>You can't make raytraced reflections run as good as screen space reflections.
So that means it isn't "better" tech. That means both of these methods can exist side by side, one appealing to better performance, the other appealing to cleaner looking graphics at an exceedingly high cost.

>Because that's not an argument
Explain how it's not an argument? Use examples. I made my case quite clearly and you disagreed with it. You couldn't argue in good faith so you resorted to strawmanning my position for several posts. Hell, the latter part of this post is even looking for another position to strawman because you still don't have anything substantive to say.

Devs still add bloom to everything, it's just higher quality bloom and it's way less pronounced. And that's a good thing; it's a decent post-processing effect when it's properly tuned. Devs couldn't do that before due to the technological and engine constraints of the time.

This doesn't mean it won't be a giant performance hit and eventually people will give up on this crap for better frame rate

4K is a gimmick though

No that isn't what it means.
It's better tech which is why you need to fucking upgrade your GPU aka buying better tech.

When next gen consoles come out and graphics as a whole are getting better again and your current GPU won't be able to max new games out anymore, will you say the same? Ohh, these new graphics, despite being much better, are actually shit and worse because your old GPU can't handle it?

Attached: 20180730152253_1.jpg (3840x2160, 1.82M)

Get better hardware kiddo ;^)

240 hz 4k or bust

>It's better tech
That's like saying 3D games are better than 2D games. They are not. They are two different genres, existing aside one another. Just because 3D games need better hardware to run, does not make them inherently superior.

they are only pushing RTX because they've hit a wall on performance and want to counter AMD who has consoles on lock. This is all about marketshare and continuing a near monopoly. That is all.

>most games still dont run at 4k 60
>expects games with raytracing to run at 4k 144

Attached: 1533677267145s.jpg (250x246, 7K)

At the time when only 2d games were possible, what made it possible to finally create 3d games? Oh yeah, tech getting better.

I guess Crysis also has garbage tech because PCs couldn't run it back then.

>this is all about money
Of course it is, doesn't mean that there's nothing in it for you.

I'm fine with raytracing at 60fps thank you.

Attached: Control 2019.08.31 - 12.14.19.02.webm (620x348, 2.97M)

>poorfags

Anyone who sees first hand the transparency ray tracing in Control knows it's not a meme. That shit is the future. First game in the history of video games where glass is actual glass.

Correlation is not causation. Tech is the best it's ever been, and people are still making 2D games. One hundred years in the future, people will not use memetracing as a mainstay feature. Instead, there will be games with memetracing, and games without it, just like there are 2D games and 3D games. It is not "better tech", because it does not replace anything, it merely provides an alternative, one that will never be fully adopted because it is too computationally expensive.

>there will be games with memetracing, and games without it
I'm not sure about that, do you see many games with no SSAO or any form of AO?

>zoomers finally get to see the cutting edge visuals for the first time after years of complaining that consoles have been holding graphics back
>WAH, WHY DO I HAVE TO PAY A PERFORMANCE PENALTY

ban children from the internet

Performance impact of ambient occlusion: minimal.
Performance impact of memetracing: MAYBE IN LIKE SIX GENERATIONS WE'LL BE ABLE TO GET STABLE SIXTY FPS AT 720p

Attached: 1515426269501.gif (357x479, 252K)

New consoles will have raytracing and it'll be become a standard.

Well it's more like saying 3D games are better than 2D games during the PS1-era when there were plenty of 2D games which were nearly just as demanding as the fully 3D ones. I can't think of many modern 2D games which are anywhere near as demanding as a fully 3D game is graphically. I mean you could theoretically build one that takes it to the extremes and you have something like 10,000,000 sprites on screen at once. We're talking about graphical horsepower and efficiently spending that performance budget, not really gameplay.

There is always a trade off between performance and visual fidelity. The current gold standards are 4k/60 and 1080p/144 or 240hz; those are what people target these days. It doesn't matter how good your GPU is, you can't have RTX on, 4k and 60fps, or RTX on, 1080p and 144fps, in a modern game. So you have to pick 2 out of the three for either of those, and it's usually the performance which suffers. The choices are equivalent from a hardware perspective, and when you need to sacrifice the standards it tells you this feature is ahead of its time.

I'll take better lighting over 4k.

>One hundred years in the future,
I'll stop you there. It will be replaced by tech that performs more efficiently and uses less resources. I would know I'm from just 20 years in the future.

>Performance impact of ambient occlusion: minimal.
Once upon a time AO absolutely murdered your performance. Guess why it doesn't anymore. Here's a hint, it has something to do with
>MAYBE IN LIKE SIX GENERATIONS WE'LL BE ABLE TO GET STABLE SIXTY FPS AT 720p

>Once upon a time AO absolutely murdered your performance
Until they found new ways to implement ambient occlusion, to the point where it no longer matters. Until they do the same for memetracing, it will remain a framekiller barely used for fringe cases.

>Nvidia only pushes the industry forward when it benefits them.
So does AMD. The only reason AMD even made Mantle back in the day was because their drivers suck complete ass and still do. Their DX11 driver is extremely single threaded, which is why a card like the RX 580, which has a similar amount of cores to a 1070, performs like a 1060. That same card utilising a low level API renderer gains like 10-15% performance over the 1060. AMD only gave Mantle away for free to the Khronos Group because it benefitted them.

shame about the terrible game part

If you take the OP's video's framing of Crysis as pioneering ambient occlusion, it still doesn't run better today. That was already a good SSAO implementation. Any new ways of doing ambient occlusion have been quality focused, and run worse.

>NO STOP MAKING NEW TECH AND SOFTWARE
>NO STOP HOLDING GAMES BACK

Which one is it you faggots?

Then you're taking better lighting not only over 4k, but also framerates higher than 60fps.

Even with a 2080ti, a $1200 card, most RTX enabled games can't even maintain a stable 1080p/60. RTX on means 1080p/60 with regular drops to the 40s. That's a technology ahead of its time.

Attached: 2019-04-16-image.png (1440x1000, 59K)

>Look, guys! I took a 16k HD picture of poop and made a game around it! Look at the lifelike graphics! Isn't this great?
Nice "progress", user.

they want the magic videogame fairy to come and add OPTIMIZATION to a rendering problem a multi-billion dollar industry has been working on for a few decades now.

the only way through now is better hardware or distributed compute, but the kids here don't want to accept it.

The best part is that I got a 2080 Super mostly because I was due for an upgrade and I wasn't really expecting to make use of raytracing in games too much since it's been running poorly. But Control actually runs 60 fps maxed on 1440p with DLSS, which this time actually looks acceptable. And I'm talking 60 fps for like 98th percentile.

>This garbage waste of tech that'll make devs focus more on shitty boring corridor shooters than making vibrant level design with proper art direction.
raytracing just simulates how light works in real life; that didn't put lighting technicians out of a job in the film industry, user.

Weird, because I was sure this thread was about an actual game that introduces new visual features and runs fine on new tech.

>Consoles forced to feature not only 4k but Ray Tracing as well
Jesus Christ the horror. It'll be Goldeneye all over again.

Attached: Framerates like this aren't normal, but on consoles they are.jpg (1594x893, 146K)

I still think consoles moving to 4K is such a retarded decision. Native 4K is almost literally four times the workload of 1080p, and most console games barely even ran 1080p at the time. I know it markets well to normies buying 4K displays but Jesus Christ. In that regard, raytracing is a way better image quality/performance tradeoff, but alright.
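
The "four times the workload" figure is just pixel count (shading cost scales roughly with it, everything else being equal):

    pixels_1080p = 1920 * 1080        # 2,073,600 pixels
    pixels_4k    = 3840 * 2160        # 8,294,400 pixels
    print(pixels_4k / pixels_1080p)   # 4.0 -- four times as many pixels to shade per frame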

50FPS at 1440p with raytracing is fine.

It's weird that new effects hit performance hard. We should just stop any development of new software, physics, animations etc. It will just hit performance at some point and level, so might as well stop it so everyone can play and gaming doesn't change.

yeah but you're just a retard

It is, it's literally a gimmick meant to sell the pro versions, they don't actually display 4K, and how would they when they can't even display 1080p at a stable framerate.
But with the majority of people today being tech illiterate, it works just fine.

>runs fine on new tech.
OH NO NO NO BUY THE DIP

Attached: Untitled.jpg (348x158, 19K)

>50FPS is fine.
I've got a 144hz display that I've been using for 4 years now and I can't even comfortably go back to 60fps because it feels so sluggish which is why I just can't into 4k either. For my money Performance >>>>> resolution > general visual fidelity.

If new consoles run at 4k graphics will be held back for ANOTHER decade.

You jest, but you forget you are on a website where numerous people frequently express the genuine opinion that graphics reached their "peak" in the PS2 era and there was no reason to go beyond that.

Sucks for you but thank god I didn't fall for the 144hz meme

It's just a stutter

DX12 version has frequent microstuttering.

Yeah this is an issue of the DX12 version of the game, even without RT. Weirdly enough I barely get it if I use DLSS but it's still something that should be fixed.

I've been using 120 to 144Hz monitors for most of the past decade as well but I know I can't expect every game to maintain a perfect 144fps so I'm fine as long as it's above 60.
60fps on the desktop however? Fuck that, office PCs feel so sluggish now.

thank god I'm not retarded like you

>didn't fall for the 144hz smooth crisp gameplay

Attached: 1418564756824.jpg (250x250, 10K)

I would agree that GRAFFIX is not the be-all end-all and I'd rather have advancements in physics, AI, and mechanics than pretty textures and reflections, but cool new tech is still cool new tech.

>he bought a 144hz monitor bcs Yea Forums told him to do so
>he cant play anything that isnt at least 120 bcs he gets motion sickness or some shit

>couldn't afford it because mom's basement

Attached: 1425277593632.jpg (7680x4320, 2.82M)

>totally fine with 60hz
>get 144hz monitor
>WTF I ONLY HAVE 115FPS THIS IS FUCKING UNPLAYABLE!!!

4k60 is a meme, ray tracing is the future.

I really hope you are joking. I am having one hell of a laugh m8

Attached: 1377187853890.gif (500x249, 485K)

seethe and cope

Attached: 20190808103531_1.jpg (3840x2160, 2.4M)

No he's not, you are. Bloom is always fucking garbage. DoF is usually fucked and not realistic either; I'll just let my eyes do it for me.

ok

Attached: comfy157.jpg (3840x2160, 2.62M)

1080/144hz > 4k/60

Enjoy your "screenshots" bro

Of course not, but the difference in hive mind and the fact that one hive mind does not defend its property from the other hive mind makes me wonder what organised shill groups operate here.
I know, but it seems a lot of people in this thread are complaining because the graphics are too good. The game itself is quite mediocre from a gameplay and level design standpoint.

Attached: I guess I'll drink and go to bed.png (640x478, 392K)

>1080p
>not even 1440p
enjoy your blurry taa ridden visuals and jaggies

Attached: 4ol07ife2kw11.png (1007x720, 812K)

No it isn't considering it's below low quality "ray tracing"

I drop graphics settings until I get 144hz on pretty much everything. It's really not hard when you have a 2080 to get 1080/144hz even on the most demanding games.

The only problems arise if you want to go 4k or enabled RTX.

>reslets coping with 1440p instead of glorious 4k
How are those jaggies, faggots? Don't worry, im sure your post-processing AA will blur things enough to make it unnoticeable.

Downsampling to 1080p looks better than native 4k.

based

Attached: CNTRL54.jpg (3840x1676, 1.7M)

How's the 60hz, no gsync/freesync screen tearing user?

Which should be an indicator that it's probably not worth the performance cost. This reminds me a lot of when Ambient Occlusion first started taking off and enabling it meant a drop from 60fps to 34fps; most of the time it just wasn't worth the performance hit even if it did make games look better.

What 4k monitors don't have freesync?

No one cares about your blur filter.

Damn... that's a low blow and totally uncalled for.

>i need 144 fps because im a wannabe pro gamer
yikes

Reminder that Unreal already had reflective surfaces in 1998 that even let you see your own character model.

Attached: 1551612034579.png (446x435, 74K)

>144fps

he didn't even imply that. a 144hz monitor, on the other hand, makes everything between ~45 fps and 144 fps feel smooth (especially with adaptive sync). you don't need to be at 144 fps.

Framelets BTFO
How will they EVER recover?

>I have money for a better experience therefore I must be a pro gamer wannabe
This level of projection is yikes

>blurry
Not in Control it's not.

Attached: unknown.png (2560x1440, 3.41M)

>he needs 144hz in a singleplayer slow third person shooter

Attached: i-am-fine-wojak-feels-guy-know-54350991.png (500x458, 78K)

Wasn't raytracing supposed to be a "it just werks" thing that had no performance impact?

No, implementing it just works.

Brute-force reflections don't work anymore. "lol just render everything twice" is not feasible with how detailed graphics are nowadays.
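A rough back-of-the-envelope sketch (my own illustrative numbers, not anything from the thread) of why "just render it again" stops scaling: every planar reflector that needs its own mirrored view adds roughly another scene pass.

# Rough cost model for planar "render the scene again" reflections.
# All numbers are made-up assumptions for illustration, not measurements.
def frame_ms(base_pass_ms, reflective_planes, reflection_res_scale=1.0):
    # Each reflective plane gets its own mirrored scene pass, costing
    # roughly one extra render pass scaled by the reflection resolution.
    return base_pass_ms + reflective_planes * base_pass_ms * reflection_res_scale

base = 8.0  # assumed ms for one full scene pass
for planes in (0, 1, 2, 4):
    print(planes, "reflective planes ->", frame_ms(base, planes), "ms")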

literally no one ever said or implied this. ray tracing is extremely demanding on hardware.

it has a huge performance impact

Imagine having enough money for things to look nice user.

Keep imagining because that is all you are ever going to be able to do living in your mom's basement.

Attached: 1430125632429.jpg (329x281, 23K)

>1080p
>having enough money for things to look nice

Attached: loll.jpg (258x245, 12K)

>implying 1080

Attached: 1375437000122.jpg (390x406, 19K)

That sounds hilarious considering this game is a blurfest by default thanks to TAA, and with your additional blur filter it's even worse.

The RT work runs concurrently with the regular shading, so it depends on what framerate the devs target, typically 60 fps. In some cases it can even run better around that framerate, because work like shadows is no longer done by the shader units but by the RT units. I forget which game did this, but yeah. When people say "if you turn on raytracing it drops 30 fps" and the game went from 90 to 60, it most likely just means the RT cores can't finish their workload before the shader part is done, so it's essentially an RT bottleneck.
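If the RT work really does overlap with the shading work, the frame time ends up bounded by whichever queue finishes last. A toy timing model (the numbers are assumptions for illustration, not measurements from Control):

# Toy model: RT cores and shader cores working concurrently on one frame.
def frame_time_ms(shader_ms, rt_ms, serial_ms=0.0):
    # Overlapped work costs the slower of the two, plus any serial portion.
    return serial_ms + max(shader_ms, rt_ms)

# Hypothetical numbers: shading alone would allow ~90 fps (11.1 ms),
# but the RT workload needs 16.7 ms, so the frame lands near 60 fps.
print(round(1000.0 / frame_time_ms(11.1, 16.7)))  # ~60 fps, RT-bound
print(round(1000.0 / frame_time_ms(11.1, 9.0)))   # ~90 fps, shader-bound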

>rtx shit flinging thread turns into resolution shit flinging thread
I really appreciate just how far some people are willing to go when it comes to shitposting instead of just enjoying the gimmicks they have and leaving other people and their gimmicks be

Attached: 1566285552535.gif (500x375, 425K)

>that had no performance impact
I'm pretty sure nobody who knows what they are talking about ever said anything even close to this.

If only it were twice: you're rendering that part of the scene again for each ray that hits it, and with multiple light sources and multiple reflective surfaces that can be dozens of times. You're effectively rendering large portions of the screen again from a different angle.
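To put a crude number on that (illustrative assumptions only: one primary ray per pixel, one shadow ray per light at every hit, one reflection ray per bounce):

# Crude per-pixel ray count under the assumptions above.
def rays_per_pixel(lights, bounces):
    hits = 1 + bounces            # primary hit plus each reflection hit
    return hits + hits * lights   # view/reflection rays + shadow rays

width, height = 1920, 1080
for lights, bounces in [(1, 0), (4, 1), (8, 2)]:
    per_pixel = rays_per_pixel(lights, bounces)
    print(lights, "lights,", bounces, "bounces ->", per_pixel, "rays/pixel,",
          round(per_pixel * width * height / 1e6, 1), "million rays/frame")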

Using DLSS turns off temporal AA, genius.

We've had raytracing in 3D software for decades, so why do you think we're only getting it in games now?
Fuck, I still remember leaving 4 PCs on all night to render some shitty 3D animation in time for a school project.
RTX is also a very basic implementation of raytracing compared to what's done for movies, where a couple of frames can take literal days to render.

>4k
>rtx
imagine the slideshow...

You wont even get CLOSE to 144fps at 1440p even with a 2080 Ti, user.

Attached: 6f24f5cba049bdc534e467729514a12f.png (1152x769, 46K)

Well now you made it look even worse since i gave you the benefit of the doubt and assumed TAA was on since the image was so blurry.

You could also turn on nvidias sharpening filter you know

Bloom is good for simulating glow on appropriate surfaces, retard.

>this amount of cope
The only game you're not gonna call blurry at this point is a 2D indie game.

cope

>it doesn't work unless it's right on 144hz
>forgets about g-sync
Retarded

>mess around with some settings by accident
>suddenly my fps shoots up to like 120 fps
>think wtf and check my settings
>all the same as i had it before which was medium settings
>thinking something in my system unfucked itself
>proceed to enjoy this silky smooth motion and combat
>randomly check through settings again for another thing
>see "render resolution"
>tfw my game was internally rendering at 720p the whole time

this game is so blurry i literally couldn't tell the difference between the two. dunno if that's a good thing or bad thing desu.

Well, the 4k/60 people vs the 1440p/144hz and 1080p/240hz people war was already heated, with neither side able to agree on anything.

Now we have the 1080p/~60fps/RTX-on people as well, who are like "FUCK 4k, FUCK FRAMERATES, RTX IS GOD, ALL HAIL NVIDIA" and want in on the shitflinging too.

depends on the game you fucking retard

>buying a 144hz monitor to play at 80fps
>Not getting a 60hz monitor and upgrading it to 75hz
yikes

I had the same experience except with 1440p and rendering at 1080 lmao

you only play games from 2010?

>I bought a $1200 GPU to make Minecraft and Quake II look better

Attached: 1545279744187.gif (448x708, 135K)

>own graph shows 128
>must be 80fps
yikes

streamable.com/5uzrn
streamable.com/3jer1
streamable.com/fbsdd

control's protagonist is cute. CUTE!

Attached: Control 2019.08.29 - 03.36.09.21.DVR.webm (832x1026, 2.93M)

It uses accumulative temporal reconstruction to get back to that, so if you don't move the camera much it should look vaguely native. Where it falls apart is when you move at all. Still better than most other games, though. The Division 2 had it as well.
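A minimal sketch of what temporal accumulation boils down to (heavily simplified: real implementations also reproject the history with motion vectors and reject stale samples, which is exactly the part that breaks when you move):

import numpy as np

def accumulate(history, new_sample, alpha=0.1):
    # Exponentially blend the new frame into the history buffer.
    return (1.0 - alpha) * history + alpha * new_sample

ground_truth = np.random.rand(720, 1280)          # pretend this is the ideal image
history = np.zeros_like(ground_truth)
for _ in range(60):                               # a second of frames, static camera
    noisy = ground_truth + np.random.normal(0.0, 0.1, ground_truth.shape)
    history = accumulate(history, noisy)
print("mean error after accumulation:", float(np.abs(history - ground_truth).mean()))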

At 1080p which he apparently doesn't have

Which game?

nah fuck of raytracing is the future

Mr dev, can't you go and spend an hour fixing her face instead of shitposting here every day?

the one this thread is about

Raytracing will still result in something 99% accurate to how light works. I doubt quantum shit is much of a factor to how day-to-day lighting works.

>75hz when 85
yikes

You're getting schizophrenic user.

85 is closer to 75 than 144.
sorry, you got meme'd.

>The part where it's shit is when you move at all
looks fine because the PC version has subtle motion blur you can't even turn off. normally i hate motion blur but this one covers up the artefacts in motion and it's not headache inducing.

Yeah yeah but really can't you do that, the game flopped because of her face.
Had you made her pretty and made the astral dive suit not fridge tier the game would have sold 3 times as much

>144 > 85; 85>75
>losing 10; claiming somehow 85 can't be seen on 144
damn u dumb

So when is the new driver that fixes performance coming out?

>spends $200 more for 10 frames

Attached: zZ86SqQ.jpg (891x717, 77K)

That plus the Sharpen Freestyle filter looks as good as native, really.

Yeah it looks fine. I personally still opted for DLSS because it looks a tiny bit less disorientating in motion and is a bit sharper albeit a bit jaggier.

>implying I play only one game
>implying 10 fps isn't measured in seconds.
Sit for an hour and cry about all your missed frames idiot

Attached: whatatard.jpg (768x926, 114K)

Do you even have a 2080 ti user?

Attached: 48ce219fd5adeff6e458a6a20840d508.jpg (1925x1163, 1.27M)

No but he pretends he does online for cred.

>I just bought a RTX 2060 Super

did I fuck up?

Attached: 1465699911736.jpg (365x355, 17K)

>low point; avg looks around 83
The meaning of avg is hard for you isn't it?

Honestly I think max settings 720p is good enough in most cases and 1080p/1440p/2160p are severely overrated. You can also get crazy high FPS at 720 that just aren't possible with those higher resolutions.

It made me want to experiment with a modern game outputting @ 480i to a CRT. Maybe the HD revolution was a mistake and games would look much better if we just went back to SD and focused on high fidelity visuals.

Yeah, however, 50 seconds later..

Attached: 76ccd5c97441d7b4995515ac23e218e4.jpg (1947x1161, 1.12M)

Turing is just early access for RT, you'd be a fool not to wait for Ampere next year

And again this is with a 2080 Ti, and if you don't have, you're probably not even getting stable 60FPS at 1440p.

>assassin's creed
>not control
Rtx on metro did better user. Petty af trying to single out games though 0/10 try

>>did I fuck up?
Only if you were dead set on 1080/60/RTX on

Change any of the variables and it should be fine. 720/60/RTX on is easy with a 2060 Super, as is 1440/60/RTX off and 1080/144/RTX off

Yes rtx runs like trash on 2060

>blo0m
I always turn off bloom. ALWAYS!

If the goal is to spend performance on things that will noticeably make the game look better and as little as possible on things that will mostly go unnoticed unless you're looking for them, then suddenly 720p makes a lot more sense, because rendering cost grows roughly with pixel count: 1080p pushes 2.25x the pixels of 720p, and 2160p pushes 4x the pixels of 1080p (exact counts below).

Attached: re2_2019_01_30_13_26_47_305.png (1280x720, 850K)
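The pixel counts behind that claim (shading and fill cost scale roughly with pixel count, to a first approximation):

resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}
base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w*h:,} pixels ({w*h/base:.2f}x the pixels of 720p)")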

You are not Donald Trump.

>inb4 muh futures games
calling it now. user is Butthurt and won't let go

We already talked about Control.
And Metro, you say? Yeah that seems WAY better.
>Petty af trying to single out games
Keep in mind those are all with a 2080 Ti, user, a card you don't own.

Attached: 527dd9549e9cd8085a9bde6f0b1efeab.jpg (1964x1169, 1.32M)

Who?

Why exactly would you do that in modern games? You always turn off blur because it's a stupid effect only useful to mask low framerates.

>game drop graphics and shaders for performance
>Yea Forums livid
>game keeps graphics and shaders while performance suffers
>Yea Forums absolutely quaking with rage
What did user mean by this?

>2080 Ti, user, a card you don't own
Projection all day doesn't make it fact

Nah, my base 2060 maintains 60fps and above with full RTX 1080p with DLSS
see

>DLSS
Have fun with your shitty blur filter

I own one, but you don't apparently since you repeatedly ignore me asking if you even have one.

So let's see, you apparently spent $1300 on a GPU to get 144FPS on your 144hz monitor but instead you get an average of 70-85 with drops to 50FPS in pretty much every new AAA game.

You did great user.

>ray tracing can be set to off, high, or ultra in metro
Guess you don't have a 2080 ti either

Attached: 1566635181264.png (604x584, 149K)

Yeah, I'm sure you're getting 144FPS with RTX on low when it drops to 50 even with RTX off.

I agree, DLSS sucks and I think downsampling is the ultimate form of AA. I use DSR on everything, and I think Control at 720p with DSR (downsampling from a 1080p render) looks better than native 1080p with DLSS.

Downsampling eliminates jaggies and shimmering in a way no other form of AA does, and it does it without making everything look like it's covered in Vaseline.
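At its core that's all supersampling/DSR is: render more samples per output pixel and average them down, which is why it kills jaggies and shimmer without a blur pass. A minimal 2x-per-axis box-filter sketch (NVIDIA's DSR uses a smoother Gaussian-style filter, and the 720p-from-1080p case above is a 1.5x-per-axis factor, but the principle is the same):

import numpy as np

def downsample_2x(image):
    # Average each 2x2 block of the high-res render into one output pixel.
    h, w = image.shape[:2]
    return image.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

rendered = np.random.rand(1440, 2560, 3)   # stand-in for a 1440p render
output = downsample_2x(rendered)           # shown on a 720p display
print(output.shape)                        # (720, 1280, 3)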

>back to 144 again instead of 85+
Circle-jerking an argument to get your way is yikes

>pc
>200fps

>console
>15fps

what went wrong?

>extreme
You know you don't have to set it to extreme user

Why does the woman look like a tranny?

how does this game perform on PS4 Pro?
I don't want to buy it on epic chink store

>he thinks he gets stable +85fps at 1440p in modern games
elmao

Attached: cb13ba71bf6698573b58e18c9f559de6.jpg (1913x1016, 889K)

Depends on the amount. This looks like a puddle.

>bang4buck
The credibility of this source is lacking

>playing on low

>I'll just let my eyes do it for me
wow thats just about the dumbest thing you could've said

>has to be on extreme
bang4buck is crap on the credibility end though
You wouldn't know because you don't own the card

Your pic destroyed any interest in your story we could have.

>playing at less than 60
Why? When simply turning some stuff down makes the whole game smoother?

I'm more amused by them advertising dynamic fucking ropes and DOF. This is 2019 people, come the fuck on.

Attached: aux_comparison_games.gif (796x270, 2.76M)

I think it's you who doesn't own. Enjoy your 70 average FPS on your 144hz meme monitor

Attached: 7f13f4f6c32cb9cecb1b4e0792788f01.png (1203x789, 94K)

Prepare yourself for a generation of everything being shiny and mirrors placed everywhere. When theres a new tech gimmick devs use it everywhere

Whoops wait I meant 57 hahaah, if you even have a standard 2080 that is.

imagine they tried to make a good game instead of piling on gimmicks and uncanny valley tier graphics

You're only seeking to discredit the source because you don't like the facts it presents.

I can always bump the res to 4K if I want to take screenshots.
It's more important for me that it looks good and fluid in motion.

>720p in 2019
Actually kill yourself

WHY IS THERE NO GIANTESS GAMES FUCK

I bet you think 1080p with TAA looks good.

>I play everything on max sub 60 fps because I am cool
You don't own the card, playing at less than 60 fps with that hardware is retarded, and thinking 144hz doesn't look different is literally retarded.

You can try to spin this as retarded for the rest of your life, but until you see the 144hz difference with your own eyes you will never know. Linus did a video on this; 144hz wins. Time to upgrade, poorfag.

>30k subs, once runs
probably your channel isn't it?

youtube.com/watch?v=a2IF9ZPwgDM

Eat shit kid

Yeah facts aren't going to matter to this guy, he'd rather play at max settings sub 60 anyways.

>DYNAMIC LIGHTING
Physically accurate lighting is not simply "dynamic", you ignorant buzzword-slinging fuck. You have no fucking idea how ugly games often look because of all the workarounds devs have to do for lighting. The reflections are the least interesting part of ray tracing. Not having massive spotlights shining through objects and walls, and having real cast shadows, is the big one.

>he plays in 720p
Nothing else needs to be said 1440p is the minimum these days.

>buying 144hz monitor for 70fps with drops to 40fps
Retard.

Not a single new AAA game runs even close to 144FPS at 1440p and most of them even drop below 60FPS in busy areas.
>actually playing on low settings to have worse visuals than consolefags just bcs he fell for the 144hz meme

Attached: 1525699932146.jpg (359x364, 23K)

Still not worth the price tag

That looks like water mixed with blood

>1440p is the minimum these days.
Thanks for the input BenQ, now politely fuck off if you don't have anything to add.

if framerate matters so much you would have never bought a 2k monitor in the first place, just saying

>3019
>some people are still on 60 hz
imagine

> huuuduuruuuu there's water in blooood therefore adding more water won't make a difference to its appearance huuuuurrr badruuuuuruurr

don't know how anyone can argue that 144hz isn't superior. even browsing the web is so much nicer because of the smoothness. people need to remember 144hz doesn't mean you need 144 fps. anything up to 144 fps is smooth as fuck, and with adaptive sync you get no tearing and low input lag because you don't need vsync.
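The refresh-interval arithmetic behind that, for reference:

# Time between refreshes at common refresh rates; higher refresh means a new
# frame can be shown sooner and a missed frame costs less perceived time.
for hz in (60, 75, 85, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000.0 / hz:.2f} ms per refresh")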

>playing on max arguing about 70-40fps instead of turning it down and playing at 144
Only you can be this retarded

>Not a single new AAA game runs even close to 144FPS at 1440p
Citation or didn't happen. inb4 more max option bullshit. user consoles don't even run medium on new AAA games

Superior to what? 60hz? Yes it is.

>he says as he plays on a resolution that even consoles stopped using in 2013

the sweet spot is 90+; there are tons of games that hit that

Because you have to lower your settings to even get stable 60FPS at 1440p. With a 2080 Ti.

No, that's not what it meant; "it just works" as in there's less effort needed when developing the lighting.

BETAMAX WILL WIN! It'S CLEARLY SUPERIOR TO VHS

i literally sit with 3 screens. 144hz in the middle, the others are 60hz i guess? the difference is NIGHT and DAY when it comes to smoothness of movement. that guy is a fucking clown

or you could get a 1080p monitor, max everything out and actually get +120fps.
1440p isnt a big upgrade over 1080p.

You know you can just turn RTX features off instead of setting everything else to low just so you can have memetracing on with your shitty TN panel.

>actually playing on low settings to have worse visuals than consolefags just bcs he fell for the 144hz meme
consoles can't run on max user so this is stupid as fuck. Changing goal posts at will between fps and max settings is something a nigger would do.

>playing on console pleb tier graphics

>double the pixels isn't a big upgrade
Yeah maybe if you're blind.

Agreed. Imagine being that dumb

>you can turn graphics down to get 144FPS so you don't have to play at 70-40fps!!

>consoles cant do max unlike pc!!!

which is it?

if my 1070 can hit 144 fps at medium/high in Metro at 1080p then a 2080 Ti can easily match that at 1440p, because 1080p has only 56% of the pixels of 1440p while the 2080 Ti is 67% more powerful than the GTX 1070 according to TechPowerUp.
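Plugging that reasoning into numbers, using the post's own figures (which I haven't verified) and the naive assumption that fps scales inversely with pixel count and linearly with GPU performance:

fps_1070_1080p = 144.0          # claimed Metro result on a GTX 1070
pixels_1080p = 1920 * 1080
pixels_1440p = 2560 * 1440
perf_ratio = 1.67               # the "67% faster" 2080 Ti figure, taken from the post

est = fps_1070_1080p * perf_ratio * (pixels_1080p / pixels_1440p)
print(f"estimated 2080 Ti at 1440p: {est:.0f} fps")   # ~135 fps under these assumptions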

I like the blur effect, it reminds me of them old movies when I was kid.

Attached: Robin-Williams-as-Peter-Pan-robin-williams-37438479-1000-670.jpg (1000x670, 213K)

>playing on low settings claiming that's the only way to 144hz

Attached: 1418769488919.jpg (234x216, 19K)

it really fucking isnt. if you go for resolution, go for 4k or dont do it at all, youd still have the option to downsample

>WAAAAH WHY DON'T GAMES HAVE WORKING MIRRORS ANYMORE????
>WAAAH WHY ARE THERE MIRRORS IN MY GAMES AGAIN WAAAH
Christ you people are impossible to fucking please

you can't max everything rtx on and get 120fps nice try though

*bloom

>You know you can just turn RTX features
true

>tn panel
IPS stop projecting

that's what he is saying, you absolute retard
if you don't turn them down it looks like this. enjoy your sub-60fps, at least it has 144hz written on the box

Attached: Rg_laugh__large.jpg (590x590, 76K)

And AO is pretty much ubiquitous and required these days unless you want your game looking like it came straight from 2000. The same will happen with ray tracing in time

YOU JUST DONT GET IT!!! RAYTRACING IS A MEME!!

Attached: images (4).png (225x225, 17K)

> Polished concrete

Both are playable; it depends on what frame rate you are going for. Max on consoles isn't an option in most cases. The argument that turning down PC graphics immediately makes it console graphics is stupid as fuck though.

Regardless of how the consumer feels about raytracing, there are real economic benefits to development workflow by not having to bake fucking everything like they do in modern games where dynamic direct light to indirect light transitions happen all the time. The edge cases become fucking nightmares.

There are no mirrors because devs are lazy and have been pandering to consoles that can't handle extra drawcalls.

If your goal is to literally get +70FPS it's not that stupid.

You seriously don’t know what polished concrete is?

>stopped using in 2013
Yet consoles still can't maintain a stable 30 fps, it's almost like that decision was driven by TV companies rather than what looks best for the performance, just like they're doing with 4k right now. If I had the option to switch my PS4 into 720 mode and suddenly get 60FPS in every game I'd do it.

As soon as you enable raytracing, downsample from 1080p, and watch all the jaggies melt away without blur filters while still getting a stable 60fps, you'll thank me for enlightening you. Native resolution is a meme and you fell for it. Always render at a higher res than your display and downsample down to it; it always looks clean. 2x DSR is the best form of AA, bar none.

Shooters push the medium forward in terms of graphics. It takes a lot of effort now, but it won't be long until we have easy-to-apply settings that create photorealistic lighting on what will by then be low-end hardware.

>every single chart: ultra, max, max ,max
>Not a single medium/high settings
>enjoy your sub 60fps at (implied max, ultra) least it has 144hz written on the box

Attached: 1375436292586.jpg (550x500, 127K)

cope harder ayyymd poorfag
It looks fucking amazing. I will get 3080ti.

It is a real PC vs console comparison.

Attached: PC vs consoles in 1997.png (913x479, 412K)

RTX is shit now but it's good for the industry overall.

All the retards ITT who don't understand how surfaces work and have never left their fucking room need to read this
dorian-iten.com/fresnel/

> soul vs soulless

cope hard, console peasant

Attached: 1386069672579.jpg (375x508, 50K)

ok, but first you HAVE to dilate dude

Playing at 720p is your blur filter you retard.
>pushed by TV companies
I had a 1600x1200 4:3 monitor back in 2005. And you're here using a significantly lower resolution 14 years later.

>it's not that stupid
Reading charts for ultra and max settings and concluding you won't be able to get 70+fps is, though.

Yet my games both look and run better than yours. It's almost like you've been duped into believing something that's not true because it made TV companies more money. But that can't be true, it'd make you a total sucker. You're not a sucker, are you?

>I posted a good implementation of tessellation in Wildlands.
Not that guy, but Wildlands has a trash implementation. Rocks and dirt look like marble nuggets. It doesn't look right at all.

The only way you're going to casually double your framerate with only a few settings is to kill the RTX features. Otherwise have fun setting everything below console level just so you can have slightly nicer reflections.

>You can disable it
>You're fucking blind, it's the biggest leap forward since ambient occlusion, retard

>people with no technical knowledge are also /pol/esmokers pushing the tired "dilate" buzzword
Colour me shocked

oh fug, rejoice my non-poorfag friends
dsogaming.com/news/here-is-how-you-can-completely-fix-the-annoying-dx12-stutters-even-with-ray-tracing-in-control/

>look better
720p looks absolutely horrible, no thanks. Even trying to watch 720p video with heavy upscaling looks nasty compared to native.

Imagine being such a retard that you fell for Nvidia's meme rays so hard that you're willing to play at 720p.

I just read the thread. This is a spiderman thread now

Attached: 1356877644963.jpg (484x357, 41K)

>leap forward
>ambient occlusion,
What? Ambient occlusion was a heavy and gimmicky way of adding contact shadows that often left everything looking distinctly pillow-shaded. It wasn't until years later, when HBAO+ came around, that it actually looked palatable, and by then GPUs were beefy enough that the performance hit for turning it on was minor and worth it.

The biggest leaps were things like the adoption of deferred rendering.
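For reference, the core idea behind any ambient occlusion technique is estimating how much of the hemisphere above a point is blocked by nearby geometry; here is a toy Monte Carlo version against analytic spheres (not how HBAO+ works internally, just the underlying concept):

import numpy as np

def ray_hits_sphere(origin, direction, center, radius, max_dist):
    # Standard ray/sphere intersection with a unit-length direction.
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return False
    t = -b - np.sqrt(disc)
    return 0.0 < t < max_dist

def ambient_occlusion(point, normal, spheres, samples=256, radius=1.0):
    # Fraction of the hemisphere above the point that is NOT blocked nearby.
    rng = np.random.default_rng(0)
    blocked = 0
    for _ in range(samples):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) < 0:        # flip samples into the upper hemisphere
            d = -d
        if any(ray_hits_sphere(point, d, c, r, radius) for c, r in spheres):
            blocked += 1
    return 1.0 - blocked / samples       # 1 = fully open, 0 = fully occluded

floor_point = np.array([0.0, 0.0, 0.0])
up = np.array([0.0, 1.0, 0.0])
nearby = [(np.array([0.3, 0.3, 0.0]), 0.25)]  # one sphere resting near the point
print(ambient_occlusion(floor_point, up, nearby))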

same here.
My 980ti shows its age slowly.

The TV companies literally control your mind. It's not too late to break your conditioning and see the light. I shall pray for your soul.

Says the Nvidia shill

>How do we stop technology getting better
Are Yea Forumsirgins legitimately the lowest IQ people in existence?

>PS5 is confirmed to have ray tracing support.
Only one path meanwhile RTX 2060 can do 10 paths while 2080 ti does 50 paths.

I got a 1080 Ti for around $200 when the cryptomeme died, and I don't think I'll need to upgrade anytime in the next 5+ years the way things are going. If the future is pushing the RTX meme and everything else is incremental performance improvements, I think I'll be more than happy to go without some reflections if it saves me dropping 40+fps and $1000 every 2 years. Actually, imagine being one of those people who fell for the 2080 Ti meme and have to settle for 1080/60 in 2019.

>He bought a miner's 1080TI

Attached: come on now.jpg (211x239, 17K)

The amazing graphic on PC makes console peasants and poorfag SEETHE

Resident Evil 2 Remake is the only good use of bloom I can think of. Fable on Xbox is the worst

Consoles are literally medium or low settings across the board, at least on Xbox. There is some room there. Source: youtube.com/watch?v=blbu0g9DAGA around the 23 minute mark or so.

Do you think Nvidia invented Ray Tracing? Shit has always been the future and the future is now, you're just too hung up on your TV company propaganda to experience it. Imagine being so cucked by a corporation that you deny yourself better-looking games because of some arbitrary number of pixels.

Nah, you notice it a whole lot actually.

He does think Nvidia invented Ray Tracing.

TV companies didn't invent higher resolutions either moron. Nvidia is the one pushing realtime raytracing years before it's ready.
Imagine being so cucked by a corporation you're willing to play at 720p for some reflections.

>Yea Forums is one person

People do greentext me, but I got it for less than half the retail price and it's still working fine. Every stress test I've put it through shows the exact same performance and thermals as a new stock one. People are giving up $200+ in savings because they're afraid of some unsubstantiated FUD.

>Urge to install Epic Game Store intensifies

I need to hold out and play some Steam-exclusive weeb shit to distract me. I'm hoping GOG Galaxy 2.0 makes me stop caring about how many "stores" are installed and running.

Miner cards that ran undervolted and at constant load and temps are probably in a better condition than gamer cards that ran heavily overclocked and overvolted at constantly varying loads.

Just pirate it like everyone else, buy it later when it's cheaper or on some other platform.

>TV companies didn't invent higher resolutions either moron.
No, they just pushed impressionable retards like you into buying them because "bigger numbers = more betterer so you better buy the 4k even though you can hardly tell a difference from 2k and it's 4x as hard to run games lol."

>FUD
>Buying a product from someone that had it under high, unnatural stress for extended amounts of time in an environment that was unnaturally hot for the card and significantly killed the lifespan
The point being, you bought a card with a significantly shorter lifespan. My company is currently throwing out a shitload of server-class hardware that they're letting me have, and I'm fully aware that the high heat and constant use has significantly shortened its lifespan. I see the /biz/ miners are still trying to push their shit lmfao

because console is console

Attached: PC vs consoles.png (2318x4199, 2.77M)

If you run 4K with ray tracing on, you are fucked up.

Just stay with 1080p with ray tracing on.

>even though you can hardly tell a difference from 2k
This is literally what consoletards were saying in 2013 and everyone was fucking laughing at them. It wasn't true then and it's not true now unless you're fucking blind.

I dunno bro, you seem awfully short on actual facts and full of FUD. You scream "It shortened the lifespan" but you can't prove it, meanwhile mine works fine, all diagnostics says it's perfect, it matches stock performance and thermals and I paid half as much for it. Seems the person with an agenda is you.

You put 2k and 4k next to each other under normal play conditions and the only thing you'll immediately notice is that one is running significantly better than the other. Guess which one that will be, I wonder. Anyone with a 4k monitor is the absolute king of falling for memes.

>but you can't prove it
Nigger, the laws of thermodynamics prove it. Hell, we can test this on you if you want. Go jump into a closet with 50 other things generating 100 + C temperatures and see how you feel after wards.
>mine works fine
I'm sure it does work fine now. A GPU doesn't just stop working over time. Enjoy for now :)

Fucked that up, meant to say
>A GPU doesn't just stop working immediately
A server HDD that has had a million write cycles still has a million write cycles worth of damage done to it. Fucking brainlet hahahahahahahahahahahhahahah

So just more FUD without any actual sources. Not all miners were run to the point of overheating and destroying the hardware, that's just FUD.

Don't move the goalposts now, you're the one trying to say 720p is acceptable nowadays. 4K is a long way away from that.

How is that moving the goalposts exactly? I don't think you even know what that means because it certainly doesn't fit this context.