>he doesn't own an RTX
>he's going to play Cyberpunk without raytraced reflections
yikes
He doesn't own an RTX
got 2080 but probably going to play without the raytracing because fuck losing 1/3 of your frames
>going to play Cyberpunk
nope lol
turn on DLSS
desu it worked pretty flawlessly in Metro and didn't tax my 2070 at max settings that bad, still silky smooth 60-70+ fps
>MUH REFLEXIONS
You'll have paid shitloads extra for barely anything
I won't even pay for the game lmao
>all that bloom and lens flare
It's like having the sun in your eyes. There's no escaping the sun in this game, is there?
>Silky smooth 60fps
Lmao over 100fps is butter in a shooter, 60fps is borderline peasant for PC gaming
thats why 120hz is a meme
everything under it becomes unplayable
you can turn off lens flare and bloom in EVERY game
and there arent even any lens flares
60fps is a meme at this point, i can't play anything under 80-90fps anymore, it's all so blatantly stuttery now with anything under.
Games like DMC 5 and Sekiro are fine at 60, but once you go 120hz beyond it's too good to go back
Ray tracing is dope but I can wait until it's done without RTX cards, e.g. CryEngine
Raytracing is overrated and the weakest gimmick Nvidia has ever come up with.
DLSS mostly compensates for the lost frames so that's a thing. Or you could go w/out ray tracing and turn on DLSS for a huge jump in base frames. Might be what I do, ray tracing hasn't been something super cool to me personally.
This has been debunked and disproven. Besides this is ray tracing which involves shaders and lighting not polygons you nigger
This is how pathetic white "peoples" genetics are, literally get headaches and dizzy from seeing 60fps now
The same weak genes that get sick from peanut butter lol
This means that we've pretty much peaked when it comes to graphics, so where will innovation come from, AI ?
bait pic. The bust model isn't complex or detailed enough for you to see a profound difference.
nigga you got conquered by those genetics
I'll buy a 2070 soon but fuck raytracing. I prefer high framerates
>polygons are all there is to grafix
lighting is the most important part
play Quake RTX and you wont repeat these words
how about good gameplay instead?
>Nigger who plays sub-30fps games on his Kangzstation 4
RTX feels like Physx all over again, so far only a handful of games support it, and from what we've seen it only adds a bit more realism and "shine" for the cost of performance, just like physx.
>caring about graphics when we're already at uncanny valley levels
I haven't had a job in over a year so I can't upgrade my PC
/Thread
Challenge accepted.
why bother when faking reflections looks good enough with 10 times the performance?
RTX is hardware, you do know that right?
AW MY HEAD, MARKETER TOLD ME ANYTHING UNDER 120FPS WILL KILL ME!
based
So was Physx.
we arent
>He’s going to play Cyberpunk
I don’t play games made by liars
>Announced as an RPG
>Quietly changed to “story driven open world adventure”
>headaches and dizzy from seeing 60fps
Dumb nigger, it's about how SHIT it looks after you've seen how good it can be.
>Oy vey just run your game at 720p and we’ll fill in the gaps with fuzzy generated shit
Fuck you fag
Haha he doesn't use his frontal-lobe for cognitive thinking haha
This is fucking hilarious.
You know that the gaming industry, and publishers in particular, are the ones who were and still are trying to fight AGAINST high FPS expectations, right?
Jesus, how fucking brain-dead are you?
Should I wage slave a bit to afford a RTX bros? I'm a neet with a lot of free time
I’m black tho
>Buying a card for one game
Sorry, but all of the best looking games last gen were Sony exclusives, if you pay full price for an Rtx you're a sucker.
You are incredibly dumb, did you know that? Nvidia RTX is a graphics rendering development platform created by Nvidia, primarily aimed at enabling real time ray tracing.
PhysX can be accelerated by either a PhysX PPU (expansion card designed by Ageia) or a CUDA-enabled GeForce GPU thus offloading physics calculations from the CPU, allowing it to perform other tasks instead.
Those reflections aren't even fucking correct, they are missing the actual ads on them
bet you cant wait to save $40 in one whole year huh
I'm not going to be playing that shit.
>calls others braindead
>thinks there is some conspiracy of industry wide attack on high FPS
ok retard
Wait for the newer cards dropping soon-ish, there's gonna be a price drop on RTX (rumored)
Yeah bro can't play anything unless its 1000 FPS, like bro I die if its less!
Nice direct copy paste from wikipedia my friend.
There actually has been one. It's not a conspiracy though, it has been done completely openly. How old are you?
I have a 2080 Ti but I doubt I'll use RTX, the game won't run well enough to maintain 60FPS with that enabled.
You jest, but it's true that going from 60 fps to 30 fps is like going from 1000 fps to 60 fps. It's incredibly jarring and honestly makes me feel ill.
It's not a conspiracy if it's public knowledge you actual fucking clown.
>one game
Every single videogame is going to feature raytracing due to nextgen consoles adopting the tech. You'll have to upgrade for those effects no matter what.
>Sony exclusives
Are shit.
>Imagine not having some shekels for a 144hz monitor
Lmaoing at the peasantry in this thread
>new game comes out
>fuck the gameplay and game
>rage on internet about some rays
???
will an rtx 2070 be able to do ray tracing at 60fps/1080p?
Just when I thought this place couldn't get any stupider.....
>doesn't have 165hz monitor
>calls other peasants
kys poorfag
>Half the game's appeal is the visuals and graphical fidelity
Hah yeah just give em gameplay lads, heck don't even call it cyberpunk just call it rpg 2077, fuck art direction and visuals lmao
>fat obese mutt doesnt understand raytracing
Don't talk shit if you're shelling out for a half-assed 240hz, retard.
I agree with you.
Is nvidia cheating again?
>fat
>gaming chair
>retarded
Sounds about right
because it doesn't have reflection raytracing, just some weird illumination shit
I'm not going to use ray tracing, and I'll probably grab a 2080 next year unless they announce something before then.
its a single player game losing frames won't mean much
I'd rather play on a stable fps thanks
>he fell for the 20XX meme
>he isn't using a 1660 TI
>Tfw has 165hz monitor
Nice try nigger
Actually asking, has there been reflections in surfaces like that (in games) that actually shows a proper reflection?
Dude, again:
How. Fucking. Old. Are you.
This is something anyone who has followed the industry in the past ten years would fucking know. Your amazing ignorance of the history of the market is not our fault. Major publishers as well as console manufacturers have been fighting against 60 FPS as an industry standard for ages, ESPECIALLY during the last gen. There were even these fucking hilarious claims about how 30 FPS is actually better because it feels "more cinematic".
Did you really not know about the fucking "silky smooth cinematic 30 FPS" jokes flying around here even just a couple of years ago?
The fuck is wrong with you?
slow third person shooters like control can be played with 30fps no problem
*Can be. It's neither enjoyable nor optimal. Also this is a first person shooter.
:)
This illustrates the general IQ among nvidia clients.
??
180 million people play at 30fps
stop crying
You might find this shocking but it's not a conspiracy, people are ok with 30fps crazy right!
Ah there we go, there's the no argument post.
>doesnt take the time to put gay shit in his tab-
>Epic Launcher
Nevermind carry on
But OP it's not out for nearly a year (even their slated release is April 2020, and that's assuming no slips). It probably won't really start to get POLISHED until another 3-6 months after that when it has a good initial set of patches and mods under its belt.
So unironically you're a retard if you own an RTX for that or any other game coming out down the road, because even taking for granted that ray tracing really is valuable, RTX is still Nvidia's 1.0 version of it. Whatever they do next time around will be a lot better and more refined. And with AMD actually starting to rev up at long last, it might even be cheaper too; Intel is already being forced to start getting more competitive again.
or tl;dr: THANKS FOR BETA TESTING RAY TRACING FOR US FAGGOT
Of course they're okay with it when they have absolutely no choice in the matter
There goes the word again, you fucking retard. It's not a conspiracy, it was a completely open, 100% logical and explicit attitude of those companies. The push for a 60 FPS standard has been 100% customer based. And YOU are the fucking spastic mongoloid that claimed that people only want decent FPS because they are "victims of marketers". You are the obsessive, idiotic, paranoid cunt.
PC gaming is a huge fucking meme. I've honestly been thinking about just going back to consoles
you can play games at 30fps no problem if the framepacing is stable
you dont need high fps for a game where you hide behind cover most of the time
isn't DLSS game specific?
>Need
All games should be 60 FPS. There is no argument against this that is valid.
>fuck losing 1/3 of your frames
>1/3
It loses more than half from doing basically nothing in various Battlefield V levels, so more like 2/3 in a game as lighting-source heavy as Cyberpunk 2077.
>Buying RTX 2070 but won't use memetracing
>Not waiting one week for reviews of AMDs 5700XT, to at least make an informed choice
The absolute state of noVidyaniggers.
RTX is basically AMD not offering any competition, so instead of increasing performance in any notable way (2080ti excluded, but what do you expect from a $300 more expensive replacement), nVidia decided to further milk the idiots that buy literally anything they put out.
I'll be accused of being an AMD shill for sure, and I freely admit that I will never buy nVidia. I refuse to support a company with such business practices. But even if I wasn't, I'd say wait a week and see how the new AMD cards perform. They replace (up to) 2070. nVidia are also pushing out new, even more expensive cards, but I'm working under the assumption that you aren't actually retarded.
The reason AMD didn't include a DXR-ASIC on their cards, was because they knew that at the price points and current hardware, it would be pointless. Check any review, memetracing with anything other than a 2080ti or a titan is basically pointless. Even then, you're looking at 35-50 FPS with a $1000+ card.
>muh one-sample puddle tracing
New consoles are on AMD hardware. Draw your own conclusions.
It's a single sample, and if you see it before de-noising (look it up), it's pretty cool that it can even do that quickly enough. Still, the result is only generally simple reflections and generalisations over larger areas. It can't do every pixel in the ad, because it's not truly reflecting everything, it's only doing it in a very general sense.
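If you want an intuition for why one sample plus de-noising only gives you those soft, general reflections, here's a throwaway Python toy. To be clear, this is purely illustrative and NOT how RTX or any real denoiser works; every number in it is made up. Each pixel gets ONE noisy estimate of the incoming light, then a dumb box blur stands in for the denoiser:

# toy sketch, not RTX: one noisy sample per pixel, then a crude "denoise" blur.
# all values here are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 64

# pretend ground-truth incoming light per pixel (a smooth gradient)
truth = np.linspace(0.2, 0.8, W)[None, :] * np.ones((H, 1))

# 1 sample per pixel: unbiased but very noisy
one_spp = truth + rng.normal(0.0, 0.25, size=(H, W))

# "denoiser": average each pixel with its neighbours (5x5 box filter)
k = 5
pad = np.pad(one_spp, k // 2, mode="edge")
denoised = np.zeros_like(one_spp)
for dy in range(k):
    for dx in range(k):
        denoised += pad[dy:dy + H, dx:dx + W]
denoised /= k * k

print("mean error at 1 spp:      ", np.abs(one_spp - truth).mean())
print("mean error after denoise: ", np.abs(denoised - truth).mean())

The blur kills most of the noise but also smears fine detail, which is roughly why you get approximate reflections over larger areas instead of every pixel of the ad being reflected exactly.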
>RTX
There is a new gimmick every year, good lord.
How long have we been stuck in this plateau?
Fuck consoles too.
>You'll have to upgrade for those effects no matter what.
But upgrading NOW for games coming out in 2020 or 2021 is fucking dumb. Ray tracing hardware will be a lot better by then, and will no longer be Nvidia exclusive either if what you're saying is correct. Hell by your argument it's extra retarded. Next gen consoles are AMD APUs again, so all games built with PS5/XTwo platforms as one of the launch targets will also optimize with AMD GPUs in mind. Even more reason to wait until next summer or summer 2021 and see what both Nvidia and AMD are doing at that point.
Funny because all I hear is console gamers dying to get into PC gaming
they should but its not necessary
Should = necessary.
raytracing will stay though just like ao
Only autists care about frames. Maybe you should be in the SGDQ thread.
It's either an issue with RTX or with the game.
But you are correct, the reflection isn't realistic.
RTX is failing completely in this example.
Everyone cares about frames. They just don't know it until they're puking on the carpet.
But there is, retard. If 30fps was as bad as you claim then they would either not play that 30fps game, play a 60fps game like CoD or switch to PC and play at 60fps, yet a lot of people still enjoy games at 30fps
>Push for 60 FPS standard has been 100% customer based.
Weird, I see none of my gamer friends crying about 30fps or pushing for 60fps. If 30fps didn't sell millions then it wouldn't exist anymore. It's that fucking simple, retard
need = necessary
should = could be but i dont need it
Overall fine user, except
>and I freely admit that I will never buy nVidia
This is dumb. NONE of these companies are our friends. I would 100% absolutely buy Nvidia again if the market became more even and competitive and Nvidia chose to become value competitive once again. If they actually brought down prices and didn't hold back on performance. That's the whole point of wanting competition in the first place after all right? If competition produces exactly the result desired why not just buy the best thing available?
see
Only a low test beta faggot gets sick while playing video games
Assuming it gets delayed and doesn't release in April. Nvidia's next gen cards might release before Cyberpunk and I'll have a new build from the ground up by then.
Another great render illustrating the problem, though I assume this one was rendered offline with a Tesla or something? But nvidia is clearly cheating and CDPR is incompetent like always.
i kno rite just terrible
>Render
>spending 1000$ on a videocard
I'll wait for the minimum system requirements before upgrading my PC.
>Weird I see none of my gamer friends crying about 30fps or pushing for 60fps
I don't give a flying fuck that you are an idiot, and so are your friends. You have the audacity to accuse others of being manipulated while you literally zealously advocate what is simply a low fucking standard that allows companies to get away with more easily marketable, but worse games. Nothing else. You are the fucking marketing-brain-washed suck-up here.
If you don't mind your games having poor responsiveness, fine. But at least have the fucking dignity of not insulting people who have higher expectations than you do. Now fuck off.
UNIRONICALLY ACTUALLY GO OUTDOORS AT NIGHT ONCE EVERY FEW YEARS AT LEAST HOLY SHIT
Should denotes a minimum standard. Meaning if it doesn't meet that standard it's below minimum.
Your body don't give any shits how much testosterone you have flowing in you, you omega.
I've played metro 2033 in its entirety at 10 fps once, getting sick at some point becomes expected and then unavoidable.
No current RTX card will be able to handle it on max. I doubt even the RTX 2080 Ti will achieve 60 fps at max settings in that game. You'll have to upgrade. Trust me
seethe
Only if you're a genetic failure like this tranny
>says something retarded
>gets pointed out its retarded
>s-seethe y-you m-mad :(
Whew.
You can stop projecting your removal of your penis now.
That's from a much larger distance and not a good comparison idiot.
Nvidia RTX is a complete failure in Cyberpunk.
This is how neonlight reflects in reality: valleyadvocate.com
DILATE
but you're wrong so keep seething
new gpus will come out and the early adopters will be fucked over as always
Visually it's not even that great
Bar tops are coated with protective sealant to keep the wood from rotting. Of course it's going to be hyper reflective you dumb fucking mongoloid
>haha guys concrete is a mirror
The complete and absolute state of nu/v/.
I won't buy Intel, I won't buy nVidia. Intel for being repeatedly sued over being a shitty company (backdoored made in Israel security holes aside), and nVidia for similar reasons.
>If competition produces exactly the result desired why not just buy the best thing available?
nVidia has a long way to go before they aren't a piece of shit company. AMD may only be nice because they can't afford to be anything but (historically, that hasn't actually been the case, but as a disclaimer), however for me that's a good enough reason. There is no other competition, so I choose the less shitty company. I'd rather skip generations of cards than buy from someone I don't agree with. I do have an R9 280, so that'll need to be replaced. Either Vega 56 for $300, or wait until 2020 for a price drop in AMD's 5700.
If this guy is right, and mind you, it's nothing beyond "wild speculations", then there might be a reason to buy nVidia in the far future: youtube.com
What would be cool, would be if you could offload memetracing (DXR, not nVidias specifically) to CPU cores.
>polished sealed wood bar tops are the same as concrete or constructed stone flooring
ikr? its pretty bad
I'm going to play with an RX 580, on a 10 y/o 1080p monitor.
You can't stop me.
>at night
I drive home at night daily.
>1660ti
>Better FPS/Performance compared to even a gimped 2060
>Hurr durr price > performance
>2060 Barely 100 dollars more
Gtfo nigger.
>nVidia has a long way to go before they aren't a piece of shit company
Well yeah. I'm talking years down the line though user, I only object to "will never buy" because that's just another "loyalty". I have no illusions that AMD is somehow immune for example to standard profit influences. I would be fucking DELIGHTED if it all goes well for them and they get up to flat out 50% marketshare, however unlikely that is. Full, major competition. And since silicon design lead time is like 3-5 years, there is plenty in the pipeline based on their current competitive culture.
But if they ever somehow got to 90% or something and took Intel's or Nvidia's place, it'd only be a matter of time before they started to do the same shit. That's why monopolies are troublesome no matter who it is. A good leader can help put things off for a while, but that's even harder in a public corporation because shareholders can just push for milking and get rid of management that opposes it too hard.
>goes from conspiracy rambling to now crying WAHHH muh low standards
Grow up kid, you're in that age group where you think the world is wrong and you are right
Genetics don't contribute to that, they contribute to baselines. Even the most abject genetic failure of a person can get better in some way so long as they live and want better. Low framerates causing motion sickness or whatever the fuck it is isn't limited to just the people that "lost" the 1st lottery; beyond a certain threshold it becomes everyone feeling it rather than just those most sensitive to it, because the "fault" exists in everyone to some degree.
And does the asphalt you drive upon reflect light like a bathroom mirror? Or does it reflect "unrealistically" like OP image?
>What would be cool, would be if you could offload memetracing (DXR, not nVidias specifically) to CPU cores.
What good would that do? CPU cores can't handle that kind of heavily parallelized workload.
Wood is actually less reflective than wet and smooth tiles.
It's pathetic that you don't know this.
For a densely populated open world game it's more demanding than any other game out there right now. An open world game of that caliber with global ray tracing will be very demanding. Are there any other examples on the market right now? None that I know of. What if you want to play at 4k? The highest end cards currently do open world games at 4k at around 50-80 fps. If you include RTX you can imagine what the framerates will be.
You can turn on DLSS.
>playing Normiepunk 2077
That's where you wrong kiddo
>Grow up kid, you're in that age group where you think the world is wrong and you are right
You do realize that it is YOU who talks about conspiracies: you who both tried to argue that higher standards are a product of corporate brainwashing, and YOU who constantly tries to fucking push some incoherent ramblings about conspiracies that nobody else ever fucking mentioned. And screeching about the popular vote is definitely not fucking helping you either.
Again.
FUCK. OFF. Eat shit and die. You are literally all that is fucking wrong with this place, and this industry. Just keep your fucking mouth shut and let the adults do the talking and thinking, for christ sake.
>RTX
>Still play on PC AD 2020
I'm going to play it on console, like it was intended
It's not the wood that's causing the reflection you retard. It's the PROTECTIVE COATING on it. Holy fuck you are dumb
Pretty much like this.
>NO UUUUUU
holy shit are you 10?
>AD
>implying god existed in the first place
This isn't a realistic reflection on smooth tiles.
Even a 5 year old can understand this.
>Fat retard thinks he can play raytracing whilst also playing in 4k
Are you that retarded?
Says the person who literally does not remember the massive push for 30 FPS?
Are you kidding me? Do you even know what any of the words you are using mean?
And yes. YOU. I told you four fucking times, as well as other people, that nobody said anything about any conspiracies. That is 100% your fucking bullshit. And YOU threw the fucking accusations around. How are you even alive being this fucking dumb?
>He expects the game to look as good as advertised.
Looks better desu
>He expects the game to look as good as advertised.
what we've seen from the game looks unfinished as fuck and outdated though?
Your mom looks unfinished as fuck and outdated. Doesn't stop me from busting my nut in her though
Calm down, I have literally a year to worry about it.
This guy gets it!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Exactly. Witcher 3 looked like shit at E3 and then it came out even worse. I wouldn't waste a grand to see this shit with RTX at 30fps.
>spending $1000 to play console port with reflections
mustards have reached a new low
That's because no modern game (they're already demanding on their own) can have all the features that Raytracing can offer at once: Metro Exodus (GI only), BF5 (reflections only) and Tomb Raider (shadows only). Also don't confuse RTX (Nvidia's branding) with actual Raytracing (which is something that CGI movies have been using for decades).
Just look at Minecraft with all the Raytracing features (GI, reflections and shadows) all at once, and it doesn't even need the newest RTX cards to run:
youtu.be
youtu.be
>>Buying RTX 2070 but won't use memetracing
>>Not waiting one week for reviews of AMDs 5700XT, to at least make an informed choice
AMD is shit, user.
Dude you are trying too hard. inb4 a shoddily edited screenshot to somehow prove he's not being a faggot.
>What would be cool, would be if you could offload memetracing (DXR, not nVidias specifically) to CPU cores.
You can do that already, but it's too slow for realtime use. In fact the whole reason RTX exists is to avoid doing that.
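For a sense of scale, here's a crude pure-Python sketch; it's a made-up toy with invented numbers, not DXR and nothing like a real renderer. It just times the cheapest possible step, one ray/sphere test per pixel, and compares that with what 1080p60 ray tracing would roughly ask for:

# back-of-the-envelope toy: how many ray/sphere tests can plain Python do per second?
import time

def hit_sphere(ox, oy, oz, dx, dy, dz, cx, cy, cz, r):
    # ray/sphere intersection: solve |o + t*d - c|^2 = r^2 for t (d is unit length)
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = 2.0 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - r * r
    return b * b - 4.0 * c >= 0.0   # discriminant test only

W, H = 200, 200
start = time.perf_counter()
hits = 0
for y in range(H):
    for x in range(W):
        # camera at the origin looking down -z, pixel mapped to [-1, 1]
        dx = (x / W) * 2.0 - 1.0
        dy = (y / H) * 2.0 - 1.0
        dz = -1.0
        inv = (dx * dx + dy * dy + dz * dz) ** -0.5
        if hit_sphere(0, 0, 0, dx * inv, dy * inv, dz * inv, 0, 0, -3, 1):
            hits += 1
elapsed = time.perf_counter() - start

rays = W * H
print(f"{rays} rays ({hits} hits) in {elapsed:.3f}s "
      f"-> {rays / elapsed / 1e6:.2f} Mrays/s")
# 1920x1080 at 60 fps is already ~124 million primary rays per second,
# before bounces, shading or denoising, so plain CPU code is nowhere close.

Compiled, multithreaded CPU code does far better than this of course, but the gap to realtime is still the whole reason dedicated hardware exists.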
He's being intentionally stupid to appease his desire for (you)'s. He's probably some Mexishit, though, so don't feel too bad. This is all he has.
Give it up user, there's obviously no reasoning with the guy, let him believe what he wants and enjoy his "totally by choice" 30fps on games that should be way over 60fps as a standard by now
This guy isn't even getting 60 fps in Shadow of the Tomb Raider with dlss and rtx on. I imagine cyberpunk will be more demanding.
You never seen wood with protective coating? Where in the world do you buy your furniture?
Meh, I'm still waiting for raytracing to offer me a noticeably better experience over what I'm getting now without murdering performance.
Which won't happen for at least another 5 years.
He also isn't even getting 60 fps with both of them off. What the fuck kind of shitbox is he using?
His system specs are in the description. He's using gtx2080ti oc.
>not 60 fps
how
The last two tomb raider games are very demanding especially in 4k. I have a 1080ti and I get sub 60 fps in certain parts of the game at 1440p nevermind 4k. Anybody that bought an rtx card now is in for a rude surprise if they want to play cyberpunk on max settings with Ray tracing. They're going to have to upgrade.
I have a 1070, I was thinking about upgrading. Thanks for the tip, think I'll hold off for the next generation.
>AMD Ryzen 2700X 8x3.70 GHz
Oh, there's his problem.
AMD is fucking shit if it's not running vulkan-based games
OwO
nigger go play smash and make some loud obnoxious noises whenever something happens.
you're better at that, fuck out of this thread.
There's no Cpu bottleneck. At 4k the workload goes to the gpu.
Pro tip: The best time to upgrade your PC is when the new PC hardware launches in the same year as a new console launch. Because of the development cycle of consoles, the PC hardware that launches that same year will outperform the consoles. Then, because so much shit is multiplat these days and the consoles are frequently the lead SKUs for game development, your hardware will be guaranteed to last the entire duration of the console generation unless you play some particularly intensive PC exclusives or something.
>samefagging this much
Again no argument and pathetic
That would be the smart move. In a year from now when the game comes out I pretty much guarantee there will be a new generation of gpus rolling out. Nvidia is currently going to roll out what they call a "super rtx" card which is just a stop gap to compete with AMDs Navi.
PS5 is coming out soon, that means I'll be upgrading sooner than I think.
Ok I'll go play a good game, you enjoy your time in this thread :)
>when the new PC hardware launches in the same year as a new console launch
Consoles are generally behind even older PC hardware. I don't think the new consoles coming out will match even a 1080, let alone a 2080.
I'm personally planning a full upgrade in 2020 or 2021, depending on when the next round of GPUs comes out.
That's why fully upgrading your PC with the latest hardware at the launch of the new consoles puts you far enough ahead that you can safely go without upgrading until the next consoles come out. I've been doing this a long time and it's always worked really well and has basically eliminated my need for incremental upgrades.
Apparently the next Gen consoles are promising 4k and 8k at 60 fps. They'll be pretty powerful if that's true.
Unless Sony and Microsoft have developed large hadron colliders capable of pulling some quantum shit, I highly doubt they'll be 4K at 60 FPS, let alone 8K.
upscaled from 1440p and 1080p respectively, maybe. I'm sure at 8k the only games getting 60 fps are basic 2D games. You should already know console claims are utter horseshit.
>includes RTX to hide the other downgrades they made because of consoles
Same shit with the Witcher 3 and Shitworks.
No, they are saying they're "capable of OUTPUTTING UP TO 8K at 120fps." That's cheeky marketing speak for "Our console supports the latest HDMI revision" but what idiots HEAR is "Our console will RENDER 8K at 120fps" but they absolutely will NOT.
Show me on the doll where CDPR has touched you.
imagine still being mad about Bloodbourne missing out of GOTY.
I don't doubt that it will probably be some upscaled fuckery especially for 8k. But a year from now when they launch I think it's entirely possible they could offer hardware that can produce native 4k at 60fps.
>promising
imagine actually falling for that
native 4k 30fps and consider yourself lucky
>I think it's entirely possible
Well then you're deluding yourself.
Yes, I enjoy playing my games at a high frame per second.
Ah yes, just like how everyone was envious of nvidia hairworks
>I do
>probably still not going to use RTX because even with a ti the framerates will still tank
I have a 2 year old pc that can pull off 4k at 60 fps on most games. What makes you think they can't manufacture a console that will be able to do the same in a year to two years from now? I'm a pc gamer through and through and you sound like you're the one that's in denial.
>I think it's entirely possible they cold offer hardware that can produce native 4k at 60fps.
Only if they want to price their console at like $1200.
Because a 2070 cannot do 4K at 60 FPS in anything but Risk of Rain.
XBX can already do 4k with some games, new hardware will make it easier
That's today's prices. You do realize technology advances fairly quickly right? I'm sure the cost will be down by the time they launch. If you've been paying attention to what AMD has been doing recently I won't discount the possibility that it will be possible.
No, it won't, because as usual devs will use the extra power to add more graphixx at the expense of resolution and framerate. Much in the same way how PS4 has spent the last gen doing 900p30 instead of 1080p60.
My gtx1080 can do around 60 fps at 4k on mostly high or max settings on almost every game I own. With the exception of a couple open world games.
Have you ever considered the possibility that you are, in fact, absolutely fucking retarded, and it not just some crazy conspiracy that is after you?
Arguments were provided. You are just retarded.
This is me, and I'm buying Cyberpunk
I know for a fact that it cannot because people with 2080s bitch about how difficult it is to drive 4K at 60 FPS.
raytracing is more like AO you dumbass its not at all like hairworks
>I'm sure the cost will be down by the the they launch.
You really think that shit's going to get down to the $500 range by Holiday 2020? They're going to have to start manufacturing these things pretty soon, user.
I have a bridge to sell you
If you have been paying attention to what AMD has been doing and have any idea of how console design and manufacturing works, you should know next gen consoles won't be doing 2160p60.
>8k
>Under $5,000
I have an Intel i3 at 1.8GHz and a Intel HD 4000.
I live a life not being a graphicsfag and playing games I like that aren't taxing on my hardware.
Most ps4 games are 1080p and there are a bunch of games that are 1080p60fps
PS5 will be able to do 4k60fps and there will be a bunch of games that will be 4k60fps but yeah there will be developers also pushing graphics
>Any new first person game
>Numbers fly out of enemies when hit
This is a universal red flag.
>You can turn it off
It's not the numbers themselves that are the problem but the unresponsive AI that mostly soaks damage and has predictable damage thresholds for stagger and other scripted animations. It's literally "we want the mmo/borderlands crowd".
Yeah, aren't these consoles now using integrated CPU/GPU combos?
>Arguments were provided.
GOOD FUCKING JOKE, your "argument" is conspiracies and name calling
>PS5 will be able to do 4k60fps
>Unresponsive AI
But we literally saw the NPCs reacting to being shot in the 45 minute demo. Were you not paying attention or were you one of those NPCs?
>NPCs reacting to being shot
Yeah by flipping the fuck out like a bethesda game
A PS3 could do 4k60fps if all it has to do is play fucking pong. Of course the benchmark here are games that are actually graphically intensive, not 2D indie platformers and last gen ports (aka 95% of 1080p60 games on PS4).
With my gtx1080 I can play bf1, bfv, project cars 2, forza horizon 4, gtav, sunset overdrive, watchdogs 2, street fighter 5, among other games at 4k 60fps with high or max settings.
Off the top of my head I've had trouble with assassins creed origins and odyssey, shadow of the tomb raider and some others that I couldn't play at 4k with high settings.
Of course at 4k I generally turn off AA because it's not really necessary.
>because it's not really necessary
Okay this is where your post fell apart and I'm calling out your bullshit. AA has been absolutely necessary on all 4K games I have seen. Jaggies did not go away with the higher resolutions. Did it help a bit? Sure, but it did not go away.
There are more than one reason that could be the case in the demo. Demo environment might not represent release or the majority of the release due to scaling, difficulties or just to give a better impression. I know Dead Island sure as fuck didn't show high level zombies just standing around taking it 500 times in the head.
>like a Bethesda game
>implying that it's going to be in the final version
They'll remove it and ban you on the forum if you mention it.
Did you not see the bad guys practically teleport once they died?
We get it, you like seeing Geralt in the bath.
post the video of the sailing comparison to the trailer
Wow, what an argument, I'm utterly convinced!
So who defines "graphically intensive", BF4/BF1/BF5 are graphically good looking and run at 1080p60fps
>he pays thousands of dollars to have slightly better graphics that don't even matter when you play it
I saw the guy get his head blown off by the shotgun and fly backwards.
Depends on the settings you're aiming for.
I can nearly pull that off (90-95% resolution scaling at 4k, 60 fps on anywhere from medium to ultra settings) in games like DMC 5, REmake 2, RE7, BF1, GTA V, Nier Automata, Shadow Warrior 2, etc.
Hell, I can even pull off a native 4k60 in games like KF2, Rising Storm 2, Titanfall 2, MGSV, and less demanding games.
That said, I usually stick with 1800p60 if the game's particularly demanding, like pic related.
Both are shit, pc is better but still shit. You will get a console then want a pc, you will go back to pc and realise even with high fps and res gaming is still garbage. Stick with your pc user, at least you can mod and have access to way more types of games.
Yeah when baddies do their death animation and ragdoll they flip out
>BF4/BF1/BF5 are graphically good looking and run at 1080p60fps
BF4 runs at 900p60 with drops on PS4.
BF1/BF5 run at dynamic resolutions that drop as low as 1000p and struggles to hit 60, let alone maintain it, at least in Conquest and Operations and similar modes.
I'd like to see you get blasted by some future shotgun and not do a flip.
youtu.be
>"I've got news as big as my balls!"
The fuck is this writing
Oh, and this is all on an R9 Fury.
>Expecting druggies that literally cave their own faces in to speak the King's English
>your "argument" is conspiracies and name calling
This just got really boring. I'll give you the one (You) out of pity, but unless you can improve your game, I'm giving up on you.
That line's up there with "maybe because I shot him in the face"
I guess technically, but aren't all RTX features game specific? Like some ray tracing games (metro) work with only lighting, not reflections and shit like that. Pretty sure DLSS can make a 25-30 FPS difference in some games, and as the tech improves, especially with a release as big as Cyberpunk, we might be able to see similar frame rate jumps.
Funny you didn't refute it. Remember, saying HURR U GOT LOW STANDARDS is not an argument, 30fps is fine and millions enjoy many games at that framerate
Ok look at CoD WWII, again what is your definition of graphically intensive
I love how scripted this shit is
>all these people bitching that 60 fps is too choppy
This is why I NEVER play games above 60. It still feels perfect to me
That is SLIGHTLY better. At least these are idiotic arguments, rather than you just doing the thing I already told you that you are doing, which is stupid. "Millions of people" thought Transformers were great and bought every fucking god-awful Halo and CoD game, so that is one fucking laughable argument out of the window.
The other, again, is actually already answered. If you are fine with 30 FPS, you have low standards. That is my point. It does not require any further support.
>30fps is fine and millions are enjoy many games at that framerate
That's like saying a lot of people live for years with tuberculosis before it kills them
>comparing 30fps to TB
And you faggots wonder why no one takes you seriously
It doesn't really matter if you take us seriously because you're the one paying for motion blur and 20 fps, not us.
That also uses dynamic resolution on the horizontal axis.
>implying I don't pirate everything on PC
Stay mad, retard
>nvidia meme effects
>buying overpriced hardware just to have simple reflections
>RTX
Nice gameplay user
Someone post the pic that debunks and destroys this before some poor kiddo falls for it
I don't pledge myself to AMD forever and ever, or anything dumb like that. But out of the current field of competitors, I much prefer them. That video I linked might be interesting though. For example if AMD keeps making pretty decent mid/upper mid-range GPUs, but simply making them smaller and smaller, eventually you can fit it all in one die with a CPU, and you can sell the whole thing (built in HBM for RAM/DRAM), instead of buying a cpu, RAM and a GPU, you now buy just an "8K gaming PC" APU. In that case, they might get a lot of market share. But nVidia will still be around to make the highest end GPUs, I'd think.
GPUs aren't great at ray tracing either. Ray tracing ASICs are. That's why nVidia's previous generation (non-RTX) GPUs are dog shit at it, too. We're talking about one sample as well, it looks like dots with several pixels in between, and then the algorithm somehow connects the dots (literally, in a way, I guess) and blurs it out to make a shape.
>Whaaaaa--
Just wait a week, nigger-san. Why would you say no to higher FPS at the same price, and one non-meme feature which is the low latency mode? If it's bad, reviews in ONE WEEK will tell you anyway.
>nVidia can't win in Vulcan
>nVidia can't win in DirectX 12
>AMD can't win in DirectX 11
Oh no..
>4K 60fps console
I'll believe it when I see it. But you have to remember, a console is not equivalent to a normal desktop computer. A console game is custom-made to run on that specific hardware. It's as if someone made an entire build of a game, with the only purpose of running perfectly on your exact hardware (and no bullshit background processes either). I haven't even played much consoles, but the PS4 can actually put out some fairly impressive graphics for its hardware, so I think it's entirely possible at least with later games.
It's a webm, and it's replaying at live speeds.
>Did it help a bit?
Thanks for bolstering my argument. I just don't use smaa in 4k with my gtx1080. In some games I can get away with smaa 2x but I usually use fxaa or no aa and it looks perfectly fine to me. I have no reason to lie numb nuts. My pc can do 4k at 60fps just fine in most games.
>>he's going to play Cyberpunk
that's where you made your first mistake
AMD can't win in OpenGL either
>I'll believe it when I see it. But you have to remember, a console is not equivalent to a normal desktop computer. A console game is custom-made to run on that specific hardware. It's as if someone made an entire build of a game, with the only purpose of running perfectly on your exact hardware (and no bullshit background processes either). I haven't even played much consoles, but the PS4 can actually put out some fairly impressive graphics for its hardware, so I think it's entirely possible at least with later games.
The issue is that all that extra optimization work always goes into improving graphics further. No dev gives a shit about framerates, and if anything they tend to get worse as the gen goes on (in third party games especially).
Yeah, probably true. Well, we may be looking at one of the last generations of consoles anyway. Streaming (like Google Stadia) will probably become the norm within a few more generations, max. It's more profitable for them, easier to handle for those who play on consoles and don't notice the enormous latency anyway, and it means that they can avoid the whole manufacturing and servicing of machines thing.
>A story mission is scripted
HOLY SHIT NIGGER
>Thanks for bolstering my argument
Yes, I supported your argument by calling out you saying AA isn't necessary at 4K when it literally is necessary. 4K doesn't remove jaggies. They're still very present, thus I have come to the conclusion that you've never played in 4K.
It's being released for current-gen consoles; RTX will be an unoptimized piece of shit gimmick included in Nvidia's black box just like Hairworks was in Witcher 3
You said yourself it helps. Me personally, in most games at 4k, AA off does not bother me. I have a 4k tv, a 1440p monitor and a second 1080p monitor. I always change the resolution to 4k when I'm on my tv. I just usually have to tweak the settings to low AA or off to get a steady 60fps. Doesn't bother me in the slightest.
My pc is
7700k
Asus maximus ix formula
32 gb trident z ram at 3200
Gtx1080
M.2 and ssds
It's a very capable machine.
>raytracing
>simple
enjoy your ssr lol
fuck off to /g/ AMDrone, you wont sell your shit here
>You said yourself it helps
HELPS != REMOVES you utter fucking philistine.
That's just the dilemma, designing good non-cinematic gameplay becomes exponentially more difficult the better the graphics. Indie games exist because the people behind them obviously lack funds for a hyperrealistic production. And that's good; their passion doesn't suffocate under the tedium that is programming a realistic looking game. Instead it's full steam ahead on the initial vision, with enough room to sprinkle in soul in the form of little details. Just compare Hollow Knight to RDR2. Both insanely detailed, both made with passion, but it's clear which one has been bottlenecked by misplaced hyperrealistic aspirations.
>"8K gaming PC" APU
There are like 5 things wrong with this.
Retard.
>using raytracing in a game with neon light fixtures that have STATIC placement.
Already wasted computations that don't use raytracing to its advantage. Why would you care for raytracing from a static lightsource to moving objects or vice versa. The only way to use raytracing to its advantage over prebaked shadows or other simple lighting systems is to use complex and moving light sources against moving objects.
>prebaked garbage
The whole point of raytracing is dynamic environment which shitty console movies will never have.
Is this the PC thread?
pcpartpicker.com
Is this good for 1440p gaming with 80+fps? Though I am having second thoughts on 1440p? Is there much of a difference between 1080 and 1440p? Higher fps or higher resolution?
because screen space reflections suck dick
>30fps is fine
Imagine being that retarded.
My point still stands. My card can play most games at 4k in the 60 fps range. Stay salty you incel. I've concluded you're either a butthurt poorfag that can't test it himself or you're just a retard that bought an rtx card too early and are realising you've been duped.
Wait 8 more days and get a Ryzen 3600X
>ryzen meme
Yeah, good luck with 99% of other games that still work on a single thread.
>tranny thinks processors pull from a magically homogenous pool of resources to perform computations
enjoy your stutter
>yfw both Hairworks and RTX on
he put a ryzen 2600x in his cart.
if he wants a ryzen he should at least wait a few more days.
There's something really comfy about this picture.
I don't think I have the budget for it
I only put Ryzen because I was recommended it over intel
>uses the word "philistine" to signal he thinks he's smart
No surer sign of a genetic disaster.
>pic very related
How expensive is a new full rig with an RTX?
I'm saving currently for april 2020.
it's just $70 more than your 2600x
you could also get the normal 3600, its better than the 2600x too and is just $30 more
Do I need a CPU as strong as that, since I think my gpu might bottleneck it? But I guess it'll last longer
All these cyber punk threads are made by cd project red employees
AAAAAH GOLEM GET YE GONE
I think for $1k you can get one with an r5 2600, rtx 2060 and 16gb 3000mhz ram
I did a 9900k, 2080ti build for under 3k. Can't remember exact total but it was well under it.
>I'll continue eating shit, because I like the taste of it
That's your choice man. For the record, ni/g//g/ers are even worse than you guys in terms of raw autism. I went there for phone advice, and all I got was "buy chinkshit and put a pajeet ROM on it".
Name them, nigger. A future APU that contains the CPU, GPU and RAM/DRAM. You need the motherboard, PSU and storage.
>Comparing film to games
That aside, if you imagine that webm in "3D", that's what my first "3D" movie looked like. Only years and years later did I finally see one that didn't look like garbage (and only because someone got the tickets for me and forced me to come along).
The Ryzen 3000-series have had clock and IPC improvements good enough to make up for that, if you are to believe official numbers. That's why if you wait one week, you can have a proper debate on all of this, without default-recommending the big security vulnerability that is an Intel CPU. Ironically, the more expensive the chip, the better the binned chiplets are, so the 12 and 16-core actually clock even higher.
Wait one week, see reviews for CPUs and GPUs before deciding. Anything else is just dumb, in my opinion. The GPUs are probably going to be "eh"/good enough, but the CPUs are no doubt straight up better in every single application. No wonder Intel strained hard and pushed out a half-wet turd in the form of 9900KS, a CPU clocked to do what every 9900K can already do. The 12-core Ryzen already does better than the 9900K, and the 16-core straight up destroys it.
Wait for the 3000-series. The IPC and single thread performance improvements at the same price point are definitely worth it. If nothing else, they have learned their lesson and VRMs across the board are looking better. So you could even just buy an older CPU for cheap (if you can find), but get the newer motherboard if you wanted to upgrade the CPU later.
All the reflections in this screenshot can be done with screen space reflections today.
Damn boys, I want a Cyberpunk game set in Beijing or something
This is why I'm never even going to bother with 120hz. Because once I spend the extra for a 120hz monitor I then have to get all new hardware, because what cuts it for 60hz won't cut it anymore, and then once you experience what it is to go even further beyond, you'll never want to go back. Next thing you know you're constantly buying 700 dollar graphics cards every 2 years to keep up with the pace.
Nah, 60fps is fine for me. I'm content with that, and I don't know what I'm missing, and I don't WANT to know what I'm missing.
CDPRs blatant fucking Nvidia bias is really pissing me off.
Do you exclusively play indeshit or something?
If it's a good game then it won't rely on cheap gimmicks that a corporation used to whore for attention.
Still looks like SSR or parallax cubemaps, nothing in that image evokes raytracing.
> People want to play at 30 fps to experience raytracing woah effects that have been doable with screen space reflections or a second camera for probably a decade now
Does that look better than Bioshock remastered?
spbp
>>he's going to play Cyberpunk
>implying
What I'm actually going to do is replay Quake while listening to Razor because modern shooters are too fucking anemic for human processing.
>screen space reflections
>good
lmao is that pic of you? You realize avatarposting is a bannable offense, sweaty ;)
Subtle trannyposting is subtle.
Nobody will give a shit, oh I need reflections behind the wall of a shitty tranny ad oooh
Just use bigger triangles with better textures
SSR already gives you 80% of what you need. You will have all the rain puddles in the world.
>posts weebshit
>sub-80 IQ
Like clockwork
Wonder how it will look after the downgrade. Already doesn't look that good.
As long as I can get 60fps at 1080p I am fine. After I pirate it of course.
fuck me daddy uwu
t. posts literal not raytracing
Why am I alive in a world where I can't own this as property Anons?
>RTX 2060
>using RTX features
pick one. to make existing RTX implementations playable you need to wait for the next generation of RTX cards. I haven't looked up if they added more tensor cores. the RTX 2000 cards are all underpowered for RTX features, future generations need to quadruple the tensor cores to make shit worth activating. also the "super" refresh cards only move up one tier each, so the new 2060 will be like a 2070 - not good enough for RTX games.
give it time
shut the fuck up toady