Who else is waiting for RTX 30XX?

Attached: NVIDIA-Ampere-Feature.jpg (1920x1080, 230K)

Other urls found in this thread:

wccftech.com/intels-core-i9-10900x-10-core-20-thread-comet-lake-cpu-benchmarks-leaked/
youtube.com/watch?v=mFSLro_OXLQ
youtube.com/watch?v=IicEyJkgnvw
youtu.be/S1o24hvcNNs?t=3207

I'm waiting for PC game developers to have the balls to push graphics far enough to actually warrant buying a new card again.

Cyberpunk will

I'm waiting for this and Incel's new CPUs before I decide what my new build will be

>600
>700
>800
>900
>1000
>2000
>3000
Why

*inhales*

when will the new cpu come out

>refresh HEDT
>refresh desktop
from intel? never

Unless they lower their prices, I'm not really waiting for them.

I'm waiting for good games first.
Not console ports.

is the PC ever going to get PC-only games that aren't MMOs or indie games?

but it won't

>tfw poor
>tfw the 10xx series will be cheaper
>tfw I'll finally be able to afford one

Attached: 1549824333074.png (514x710, 577K)

Benchmarks for Intel's 10th-gen X-series CPUs just got leaked, and the chips lag behind the Ryzen 3900X in single-core performance.
wccftech.com/intels-core-i9-10900x-10-core-20-thread-comet-lake-cpu-benchmarks-leaked/

I was gonna get both a Ryzen 3900X and an RTX 2080 Ti (I'm coming from an R7 1700 and two GTX 1080 Tis), and I don't see a reason to. Minecraft RTX isn't coming out until next year, and AMD will come out with their own version of raytracing. I believe Ampere will launch next year strictly because AMD may have something comparable with raytracing enabled. All in all, next year should be exciting.

Attached: maxresdefault.jpg (1280x720, 158K)

Same here, I'm waiting for comet lake at least. Not sure if I'll wait for Ampere or just pull the trigger on a Turing card though.

>pc gaming the worst it's been in years
>"heck yeah bruh who's ready for the next extremely minimal upgrade that no devs will utilize that will cost an arm and a leg"
>"you're not poor are you? Le jej"
What a sad state of affairs pc gaming is in. Should be ashamed.

>releasing on current gen consoles
expect another witcher 3 downgrade

this shit needs to run on base xbox one

Cyberpunk comes out next year. That's something to be excited about.

Well, it's lagging behind because Intel is just gluing cores onto an existing architecture, cranking up power draw, and patching a billion security flaws at the cost of performance. It's only going to get worse next year.

Attached: 1568747484858.png (720x719, 675K)

>200
>300
>400
>500
>56
>5700
It's not as bad as amd.

It's just dumb consumer whores. I'm still sitting on my 380X and play all the games I want, no issue.

It's actually
>200
>400
>500
>600
>700
>900
>1000
>2000
>1600

I wish Nvidia would stop price-gouging. :(

this right here. Game devs need to OPTIMIZE their shit and stop relying on GFX cards to brute-force everything.
For reference, Crysis was made for video cards about 50-60% less powerful than modern ones.

that's what I mean by nothing: it's just Skylake-X again, and 10th-gen desktop is Skylake++++. Apparently it needs a new socket again, but for now that's just rumors.

Probably. Still rocking a 980 Ti; I'll probably need something new once the next-gen consoles come out.

Attached: tumblr_ptyzrkNKuI1rbmgyro5_500.gif (500x210, 2.89M)

AMD has to gain some kind of foothold; if it isn't hardware, it's software, and their chance is raytracing. If AMD can come up with a raytracing solution that outclasses Nvidia's while performing better, it will sink the marketing campaign Nvidia will have been running for two years by the time RDNA 2 launches. AMD has already shown they're crafty; their CAS is superior to DLSS. They need to do the same with raytracing.

Attached: 1563313888761.png (602x564, 584K)
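
For anyone wondering what "contrast-adaptive" even means here, a toy sketch of the general idea: sharpen hard where the neighborhood is flat, back off across strong edges so they don't ring. To be clear, this is not AMD's published FidelityFX CAS math, just a simplified illustration; every constant in it is made up.

```cpp
// Toy contrast-adaptive sharpen: the adaptive "amount" shrinks as local
// contrast (hi - lo) grows. Simplified illustration, not AMD's CAS kernel.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int W = 8, H = 8;
    std::vector<float> img(W * H, 0.5f), out(img);
    img[3 * W + 3] = 0.9f;  // one bright pixel so the kernel has an edge to chew on

    for (int y = 1; y < H - 1; ++y) {
        for (int x = 1; x < W - 1; ++x) {
            float c = img[y * W + x];
            float n = img[(y - 1) * W + x], s = img[(y + 1) * W + x];
            float w = img[y * W + x - 1], e = img[y * W + x + 1];
            float lo = std::min({c, n, s, w, e});
            float hi = std::max({c, n, s, w, e});
            // Flat neighborhood (hi - lo small) -> sharpen hard;
            // strong edge (hi - lo large) -> back off.
            float amount = 0.5f * (1.0f - (hi - lo));
            // Plain unsharp-style Laplacian, scaled by the adaptive amount.
            float sharp = c + amount * (4.0f * c - n - s - w - e) * 0.25f;
            out[y * W + x] = std::clamp(sharp, 0.0f, 1.0f);
        }
    }
    std::printf("center pixel before %.2f, after %.2f\n",
                img[3 * W + 3], out[3 * W + 3]);
    return 0;
}
```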

I'm waiting for AMD's Big Navi chip (never ever).
I'm never giving money to Njewdia ever again.

they're already shitting out the 3k series?
2k meme tracing failed, huh?

Yeah, I think this is the part people are really missing when they think Cyberpunk will be some groundbreaking new evolution in game graphics/AI.

It has to run on budget pc hardware from 2013 to be playable on the standard xbox one and ps4. It's the same issue that hamstrung GTA5. Sure, they might release an HD version at some point, but the core engine is handicapped by the limitations.

The only fucked parts of the PC market right now are Intel and the GPU sector. AMD has brought actual competition back to the CPU market, while cheap SATA devices and entries like the Intel 660p have made all-SSD builds possible without having to spend an exorbitant amount of cash. AMD just needs to get their shit together with their GPUs and start competing with NVidia at the high-end (x80-tier cards) so NV can fuck off with their price gouging.

Attached: consider the following.gif (500x374, 500K)

Doubt it. CAS isn't tied to game implementation, so it's able to compete, but something like raytracing is, and devs will push Nvidia's solution because of THE WAY IT'S MEANT TO BE PLAYED (aka money). Even many console ports, which are all AMD hardware based, end up on PC under that program and in turn favor Nvidia.

Honestly, do you even need a new high-end graphics card for anything other than AAA releases?

Not even AAA games need that. Only a few pieces of shit actually require high end pcs.

From what I understand, the implementation of raytracing isn't tied to Nvidia; it runs off Microsoft's DXR. AMD just doesn't have a solution yet that runs it as effectively as Nvidia's, but the minute AMD has one, their cards will be able to play current RTX games with no lockouts or poor performance, unlike PhysX.

Attached: 1563668770595.jpg (600x596, 20K)
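
For what it's worth, you can verify the "not tied to Nvidia" part yourself: DXR capability is queried through plain D3D12, no vendor API involved. A minimal sketch, assuming Windows 10 1809+ and the D3D12 SDK headers:

```cpp
// Query DXR (raytracing) support through vanilla D3D12 -- any vendor's
// DXR-capable GPU reports a tier here, which is the whole point of DXR.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device5* device = nullptr;
    // Default adapter; ID3D12Device5 is the first interface with raytracing.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no D3D12 device");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));
    std::puts(opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0
                  ? "hardware DXR supported"
                  : "no hardware DXR on this card/driver");
    device->Release();
    return 0;
}
```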

New high-end GPUs are pretty much just for the 4K meme. If you're happy with 1080p or even 1440p, you can maintain 60fps and even 120fps with mid-tier or older-gen GPUs just fine.
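
The raw pixel math backs this up: 4K pushes exactly 4x the pixels of 1080p every single frame, 1440p only about 1.78x. Quick sanity check:

```cpp
// Back-of-envelope pixel throughput: why 4K is the expensive part.
#include <cstdio>

int main() {
    const long p1080 = 1920L * 1080;  // ~2.07 Mpix
    const long p1440 = 2560L * 1440;  // ~3.69 Mpix
    const long p4k   = 3840L * 2160;  // ~8.29 Mpix
    std::printf("1440p = %.2fx the pixels of 1080p\n", (double)p1440 / p1080);
    std::printf("4K    = %.2fx the pixels of 1080p\n", (double)p4k / p1080);
    // At 60 fps that's ~498 Mpix/s shaded at 4K vs ~124 Mpix/s at 1080p,
    // which is why a mid-tier card holds 1080p fine but chokes at 4K.
    return 0;
}
```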

since this is a GPU thread, here's a PSA:

DON'T FUCKING BUY AN RTX 2060
ESPECIALLY NOT TO TRY OUT RAYTRACING

I fell for the RT meme, and any time I want to try RT with this card it's either
RT off and everything maxed = 60fps+
or
RT on + turn off something else + 1080p + dynamic resolution = 50-60fps
Shit is not worth it.

Attached: 1565026581082.jpg (554x554, 47K)

You'll need one next year when the next gen starts.

Attached: halo-infinite-3840x2160-e3-2018-4k-14375.jpg (3840x2160, 2.98M)

VR needs it, but that's kind of niche still.

Stop kvetching

Crysis ran like shit on even the best PCs in 2007. Seriously, just google benchmarks.

>>"heck yeah bruh who's ready for the next extremely minimal upgrade that no devs will utilize that will cost an arm and a leg"
Not really true. The goalposts have moved a lot in PC gaming. Sure, Crysis was a beast of a game at the time, but back then 720p 60fps was the target. Now with xx80/xx80 Ti cards the target is dubbel the resolution and 3 times the fps. Games didn't evolve that much, but how we play them sure did.

Now post an actual in-game screenshot.

Attached: 1565987535984.jpg (462x500, 46K)

>dubbel
nice. I like it.

Attached: 1543829735029.gif (332x263, 2M)

>he didn't wait and buy an RTX 2060 SUPER instead

Attached: 1375580057444.jpg (139x140, 5K)

Not really. If new consoles start pushing the 4K/30fps meme, then devs will have to optimize more to hit that target, and you should get a pretty good experience at 1080p or 1440p even with lower-tier hardware. If they add raytracing and you're on a budget, it's probably another option you can turn off on PC to get more frames.

>VR needs it
Not even. A GTX 970 or 1060 can do VR at 90Hz without any issues, and a 1070 can do it at 144Hz.

shit got announced like 2 months after I got mine
still got no faith in something named SUPER

>2020
>1080p
yikes

Attached: c8569243230cdfb8c75ba37072c47462.jpg (1919x821, 878K)

I welcome raytracing because it will allow a plethora of effects and combine existing ones, as opposed to how splintered things currently are
>screen space reflections, ambient occlusion, etc., etc.
And it will let devs focus on just making the games instead of fucking around with individual effects.

Attached: 1568396314808m.jpg (1024x576, 98K)

Doubtful. It'll be mostly cross-gen titles; maybe the year after that.

Game already looks like shit

Attached: 1567191233997.png (1376x1070, 1.3M)

Hear, hear!
spbp

Attached: Cyberpunk2077 can run 1080p60 at Ultra setting on my $500 PC Gaming.jpg (1397x881, 283K)

Is raytracing really just fancy lighting that costs an unreasonable amount of resources? Would you really sacrifice fps or resolution for something that isn't going to affect or improve gameplay in any significant way? I don't see the appeal. It seems like something corporations are shilling simply to con people into buying overpriced shit day one.

Sounds good to me. I prefer Nvidia cuz it just werks. With AMD, you just get mad.

Kind of. Raytracing is the step towards path tracing; when applied correctly, it will eliminate the need for a billion different effects, put them under one umbrella, and add new ones. It was needed, and a step toward it is better than no step.

Attached: AU_Billboard01.jpg (800x400, 99K)

Will the new consoles even be powerful enough to utilize it well? This just seems like something that might be worth it in 5 years or more when devs start actually using it efficiently. I'd rather consoles and games in general right now not use it and maintain 60fps+ at high resolutions.

>Is raytracing really just fancy lighting for an unnecessary amount of resources?
Raytracing is essentially the computer trying to simulate realistic lighting using rays of light and real reflections rather than tricks and special effects. It's much more computationally intensive than current GPU rendering, but it's the end goal for realistic graphics.
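
That really is the core of it: one ray per pixel into the scene, shade from the actual hit point instead of rasterizing and faking it afterwards. A toy ASCII sketch of the idea (one hard-coded sphere, no bounces, all numbers made up for illustration):

```cpp
// One primary ray per pixel -> intersect -> shade from the hit. A toy
// sphere renderer in ASCII; no bounces, no denoise, just the concept.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return {a.x / l, a.y / l, a.z / l}; }

// Distance along the ray to the sphere, or -1 on a miss (dir is unit length).
double hitSphere(Vec orig, Vec dir, Vec center, double r) {
    Vec oc = sub(orig, center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * c;
    return disc < 0 ? -1.0 : (-b - std::sqrt(disc)) / 2.0;
}

int main() {
    const int W = 48, H = 24;
    Vec sphere = {0, 0, -3};
    Vec light = norm({1, 1, 1});
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Pinhole camera at the origin looking down -z.
            Vec dir = norm({(x - W / 2) / (double)H, (H / 2 - y) / (double)H, -1.0});
            double t = hitSphere({0, 0, 0}, dir, sphere, 1.0);
            if (t < 0) { std::putchar(' '); continue; }
            Vec hit = {dir.x * t, dir.y * t, dir.z * t};
            double diff = dot(norm(sub(hit, sphere)), light);  // Lambert shading
            std::putchar(diff > 0.6 ? '@' : diff > 0.2 ? '+' : '.');
        }
        std::putchar('\n');
    }
    return 0;
}
```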

I'm not a fan of Nvidia. I bought into the 4K meme and needed a GTX 1080 Ti for that, but no fucking way was I paying 1500 Canadian leaf bucks for the 15 to 20 extra frames the 2080 Ti offered, especially since before the 1080 Ti I had an R9 290, and that card was a beast for 2 years. AMD will strike back next year with R9 290 electric boogaloo: when the R9 290 launched, it went head to head with the first Nvidia Titan while costing a third of the price. RDNA will have a 1080 Ti challenger next year, which isn't that bad because the 2080 Ti is barely more than that, but it will do raytracing better than the 2080 Ti, and at least comparable to the 3080 Ti, whilst costing way less, and Nvidia will drop the price. Mark my words, that is what will happen, y'all.

Attached: 058.jpg (1200x675, 82K)

>Will the new consoles even be powerful enough to utilize it well?
We're told the next-gen consoles are supposed to come out next year. They won't have enough power to do full RT, or only one path at most. Semi-realistic, but you can tell it's fake; not as realistic as on PC.

It's not only the hardware that's fucked. The software is bad too.
Many games run like shit on Win 7, and Win 10 is a buggy piece of shit with worse driver support than A FUCKING LINUX. Also, you can't turn off the desktop compositor in 10, so you have forced input lag.
And Linux isn't mature enough to migrate to.

Yup, that's why I skipped the 2080 Ti: ridiculous price and immature RT. I can't wait for the 3080 Ti.

What will the names be after they run out of numbers before 10000? Will they just go on with something like 10800 Ti, or go back to reusing "GTX xxxx" like they did before the GTS/GTX 100/200 series?

Truthfully, yes. It will probably be like Metro Exodus's low raytracing setting: instead of two light bounces, it will do 1.5 or 0.7 bounces per pixel. Maybe they'll go two bounces in the AMD version, because they can optimize the shit out of it on console since it's a fixed platform. Microsoft especially isn't going to settle for less than two bounces; why else would they come up with the DXR standard unless they had something in the pipeline ready for it?

Attached: 1506925782437.gif (480x480, 3.44M)
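
In case "1.5 or 0.7 bounces per pixel" sounds like nonsense: fractional averages are real if each path is terminated probabilistically (Russian roulette). Not claiming this is what Metro actually does; just a sketch of how an average of 1.5 falls out:

```cpp
// Russian-roulette path termination: each path bounces a whole number of
// times, but the *average* bounce count is fractional and tunable via p.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    const double continueProb = 1.0 / 3.0;  // E[bounces] = 1 + p/(1-p) = 1.5
    const int paths = 1000000;
    long long totalBounces = 0;
    for (int i = 0; i < paths; ++i) {
        int bounces = 1;                 // every path gets at least one bounce
        while (u(rng) < continueProb) {  // each extra bounce survives a coin flip
            ++bounces;
            if (bounces > 16) break;     // hard cap, like real renderers use
        }
        totalBounces += bounces;
    }
    std::printf("average bounces per path: %.3f\n",
                (double)totalBounces / paths);   // prints ~1.500
    return 0;
}
```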

I would like to imagine they'll keep the 2XXX brand for RTX products while keeping 1XXX for GTX. 1660 is fucked though.

I'm waiting to see how Navi 20/ RDNA 2 turns out.

The only way that price is coming down is if AMD comes up with something. Like I said here, AMD has to trounce Nvidia in raytracing as opposed to pure brute-force GPU throughput. Nvidia will release Ampere next year to anticipate this, but AMD will still be competitive in this arena: a card as strong as a 1080 Ti but that does raytracing better, or similar 2080 Ti performance at nearly half the cost, will fuck Nvidia up. And I don't think I'm telling tales out of school either; AMD is pretty close already. The Radeon VII and 5700 XT didn't hit 1080 Ti, but next year's cards will, and they'll have raytracing. Next year will be fun.

Attached: 1500611261559.png (1536x2048, 3M)

Did they announce something?

If the price is not as ridiculous as the existing ones, yes. Otherwise I'll stick with what I have.

>4000
>5000
>6000
>7000
>8000
>9000
>introducing Nvidia GeForce XXX 100 series

Attached: 1568144553947.jpg (1920x1080, 300K)

>posting a 1919x821 image while making fun of 1080p

Halo 2 looking good

>plays console shit at 1080p

I'm gonna jump from classic 1080p@60 to 1440p ultrawide 144Hz as soon as I find the right monitor for it, so yeah, it better hit soon and be good.

Attached: 1547044273137.png (534x494, 259K)

Not really.

It's actually

>200
>4000
>5000
>6000
>7000
>8000
>9000
>200
>400
>500
>600
>700
>900
>1000
>2000
>1600

We'll see... I always get Nvidia no matter what because of its superb drivers that always run old and new games better. More stable. Better experience. I don't want to deal with troubleshooting the bullshit on AMD.

Highly doubt that. The "next gen" is a console concept and consoles are consoles. Whatever the "next gen" is for them is just a setting on PC, one they've been able to operate on for years already at this point.

Attached: 1568178535769.png (638x359, 314K)

I plan on going to a Zen 3 12-core and a 3080 Ti next year, from a 6600K and a 1070.

Attached: 1532268026204.jpg (357x384, 30K)

It's actually

>1
>2
>3
>4
>5
>Everybody in the car, so come on, let's ride

Imagine having a $1200 rig, a beautiful screen and top notch optimized game graphics just to be crippled by the DRM.

I don't see any reason to upgrade my 1080ti. I don't mind playing games at lower settings, because I was poor for like 20 years and just had to make do. Even with my 1080ti I still play at 1080p/60fps just because it keeps the components of the PC cooler and is less prone to stuttering/freezing than higher resolutions. I don't even notice. I only use 144hz on my desktop for that buttery smooth mouse pointer action. I will probably get the best 30xx card right before the 40xx comes out and they discontinue it, like they did the 1080ti.

Actually 4690k, HDD, 8gb DDR3 and 970

I'm waiting for Zen 3, RTX 3000, and DDR5.

Planet Zoo beta will be dropping in 2 days. PC only. Planet Coaster was also PC only.

Attached: 1555887705922.png (992x558, 307K)

Don't have to imagine it. That's reality for some games these days.

Good goy, consume product and get hyped for the next product.

What's a good AMD CPU for high refresh rate gaming? I got a 180Hz monitor recently, and a lot of my games are CPU-bottlenecked now that I try to run them at frame rates higher than 60.
>50-70% gpu usage
>framerate varies wildly from 100 to 200 fps
very cool!
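
The frame-time math shows why this happens: the per-frame budget shrinks fast at high refresh rates, and the mostly single-threaded sim/draw-call work has to fit inside it. A quick sketch (the 8 ms figure in the comment is just a made-up example):

```cpp
// Per-frame time budgets at common refresh rates.
#include <cstdio>

int main() {
    const double hz[] = {60, 100, 144, 180, 200};
    for (double h : hz)
        std::printf("%3.0f Hz -> %5.2f ms per frame\n", h, 1000.0 / h);
    // 60 Hz leaves 16.67 ms; 180 Hz leaves just 5.56 ms. If the CPU side of
    // a frame takes, say, 8 ms, the GPU sits partly idle (the 50-70% usage
    // in the post) and the framerate swings with per-frame CPU load.
    return 0;
}
```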

Zen 3 is going to be more of a Zen 2+ built on TSMC's 7nm+ process while still using AM4 and DDR4. The real next jump will come with Zen 4, which is currently in R&D and planned to be built on a 5nm process.

>fell for the 3700x meme

Attached: 1566651632869.webm (540x960, 1.79M)

yeah, just like last gen when people said the 770 would do just fine

Attached: 1568485457579.png (382x349, 138K)

video game

Attached: 1512660596707.png (672x794, 422K)

>that awkward feel when you have to upgrade this year +/-1 but still want to wait longer because RTX cards are so fucking bad
I've been buying PS4 versions instead because my PC can't run this shit anymore. Might do a 2070 Super if they're on sale for a bit off, or whatever the AMD equivalent is.

City builders, sim games, and RTS are among the handful of PC exclusive genres.

I know the webm is old but idgaf rn.

Attached: PC Games.webm (600x338, 665K)

>remember when the 970 was the most popular card and how everyone thought it was the best
>then 3.5GB happened and it became obsolete within 3 years

Attached: fuck.png (273x437, 74K)

That's why I held off on upgrading my PC. I already have an R7 1700; the 3800 is only 15-30 fps faster and can only hit 4.4GHz across all cores. The 7nm+ parts won't be too bad, honestly: they'll be on a more mature process and will likely hit 4.7GHz across all cores at the same TDP. Not great overclockers, yes, but they'll be good, so I'll probably upgrade next year. The 3000 series was kind of a letdown on PC, but for consoles it will change the game for most.

Attached: 1510030797152.gif (194x199, 69K)

>People unironically upgraded their GPU before the next console cycle
Bruh. You should know better. Buy a GPU after the consoles release so you're on top for the next several years. Also big ouchers on the people who bought first gen RTX. You should REALLY know better.

>Nvidia

Attached: 8GB vs 6GB.webm (1280x720, 2.92M)

anon, that's clearly BF2142.

AW3418HW works for me

Fucking kikes and nvidia gimping their hardware, why because if they made the 2060 8gb, it would have great legs and be useful for years, bastards

Attached: 1511189186968.gif (380x214, 1.22M)

>1080p tier card playing at 4K
what for?

>Fucking kikes and nvidia gimping their hardware, why because if they made the 2060 8gb, it would have great legs and be useful for years, bastards

Attached: 1566610374964.png (1029x1040, 73K)

>it would have great legs and be useful for years
It would still be gimped anyway. Nvidia always gimps its drivers for previous-gen cards when it launches a new lineup.
youtube.com/watch?v=mFSLro_OXLQ
youtube.com/watch?v=IicEyJkgnvw

that represents you retard

>using an intel cpu with the nvidia gpu instead of doing a mirrored comparison that only changes gpus

They should have done that, but when the 2060 is so clearly being bottlenecked by vram usage it probably wouldn't make that much of a difference.
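
You can actually watch this happen: DXGI reports a per-adapter VRAM budget, and when current usage rides against it, the driver starts paging resources over PCIe and frame times spike. A sketch, assuming Windows 10 and the dxgi1_4 headers:

```cpp
// Print the adapter's local VRAM budget vs. what's currently in use.
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter* adapter = nullptr;
    if (FAILED(factory->EnumAdapters(0, &adapter))) return 1;

    // QueryVideoMemoryInfo lives on IDXGIAdapter3 (Windows 10+).
    IDXGIAdapter3* adapter3 = nullptr;
    if (FAILED(adapter->QueryInterface(IID_PPV_ARGS(&adapter3)))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    std::printf("VRAM budget %.2f GB, in use %.2f GB\n",
                info.Budget / 1073741824.0, info.CurrentUsage / 1073741824.0);

    adapter3->Release(); adapter->Release(); factory->Release();
    return 0;
}
```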

LITERALLY already been downgraded
personally, art style is king
realism is the death of art

>that represents you retard

Attached: 1569163575474.jpg (300x300, 47K)

I believe the point was that both systems cost about the same.

I'm just waiting for an amd card on the level of a 2080 ti

>1080p tier card
playing a fairly recent AAA game in 4k at max settings

>Buying nvidia when next gen games will be optimised for amd
yeah no

You are a slobbery mouth corporate cock sucker, your father is a faggot, you retro actively made him a homo
and your mom used fucks niggers, now lap up some more corporate cum and post something else retarded, you fucking idiot

Attached: 1239199_1372438846169_full.jpg (355x403, 36K)

This is the dumbest fucking meme. Consoles have already had AMD GPUs and CPUs this entire generation.

That's not true across all software right now, but in some cases, yes. And once true gen-2 headsets start popping up with higher res and more features, we're gonna need new cards more than ever.

I'm worried about it. I was so hyped for Evolution, but it fell apart (then again, anything based on Jurassic World is doomed to fail).

I hope Zoo is really, really in-depth and not shallow.

Current-gen consoles have a weak-ass CPU though, and many games nowadays run much better on AMD GPUs than the Nvidia equivalent (RX 580 vs GTX 1060), except for obvious Nvidia dev shills like Ubisoft.

Is it thooooooouuuuuugh???

youtu.be/S1o24hvcNNs?t=3207

Attached: RedEngine presentation Cyberpunk.png (1680x1050, 1.31M)

Never gonna happen with consoles being the main market for games

Cost isn't really a good basis for comparison.

Exactly that. CPUs have held this generation back; the graphics are awesome, but the games are anemic in terms of interaction. GTA V had to degrade its vehicle deformation on PS4 and XB1 because both were weak sauce compared to what the 360/PS3 launched with, and it hasn't been reinstated since. I can't find the source, but Ubisoft demonstrated that the PS3/360 CPUs were stronger than their successors.

Attached: 1511238324079.gif (636x386, 3.01M)

But Zen 4 won't come until 2021 or something?