Oh yeah, just got myself a brand new GTX 1650...

>Oh yeah, just got myself a brand new GTX 1650, can't wait to join the master race and play those games at a silky smooth 60 frames per sec-

Attached: gtx.png (1920x1080, 324K)

Damn, if I'd known you could make a computer with just a graphics card I wouldn't have wasted so much money on other components

>Vega 56 120 FPS average at medium settings, beating the GTX 1080 and only a bit below the 2060
still silky smooth 100+ at ultra, based AMD

Give it a few days for new drivers; AMD's just dropped today, that's why Vega's frames are up there.

lol pcfags
How're the crashes?

This isn't a shit hardware issue, it's an "id Software can't fucking optimize for shit" issue

Attached: 1539995930077.jpg (848x900, 36K)

No, it's because the game uses Vulkan.
Pretty much every Vulkan-based game runs much better on AMD than Nvidia with a few exceptions.
No, this is a "the GTX 1650 is fucking trash" issue.
Much like the GTX 1050.

Does Rage 2 feature Megatextures™?

>Rage 2 utilizes Avalanche Studios' Apex game engine instead of id Tech.

Rage 2 is not id Tech 6 (Doom 2016)/7 (Doom Eternal) though, that's why it's shit

>1650 is pretty much identical to 970
looks like I have an excuse not to upgrade

>1920x1080 at medium
Literally who is this benchmark for?

Attached: 1555204251252.webm (230x276, 152K)

Your average consumer

>RTX 2080 Ti, Radeon VII, RTX 2080, RTX 2070 etc...
>"""""Your average consumer"""""""

Even the most brain-dead go for low or high; medium is the redheaded stepchild of graphical settings, no one likes it

>RX 570 now has the same performance as a GTX 1060 6G
Is there a more based GPU?

>High adds a lot of subtle eye candy at a significant cost
>Low completely compromises the visual integrity of the game
Medium is where it's at.

>Turn down a bunch of meme settings which don't really add much to the graphics
>Game suddenly runs well
Rage 2 doesn't even run on idtech

Attached: 1541197900030.gif (474x272, 2.86M)

>on the fence condensed into a setting
>being indecisive
Low IQ

Nvidia is targeting PSUlets. And PSUlets deserve that treatment.

>Naïvely going for one extreme and not calculatingly finding the sweet spot for the best performance/visuals ratio
Two-digit IQ.

-ond

It says 63.5 average right there you fucking dunce.

>2070 over a 1080Ti

Just buy a used 1080Ti they said. Don't fall for the RTX meme, it's just as good as a 2080 they said.

>Average frame rates are meaningful
>Ignoring the much more crucial minimum

Attached: 1554905689914.png (478x523, 168K)

>3 fps difference
wow it's nothing

You've caught me user, I am, in fact, low IQ. I'm sorry I was lashing out at you feverishly. I am but a simple man, and my two-digit IQ knows no bounds. Medium settings are the pinnacle and I was wrong. Enjoy your day, user

I'm just so used to being able to run everything at the highest settings, and I'm mad I have to use medium settings on newer things

>going for extremes only even with fucking graphics settings
Holy shit you are a literal caveman.

If you're getting Nvidia, you should always get the top end of that generation: 1080 Ti, 2080 Ti, etc. It will last twice as long and work twice as hard as any of the budget models, effectively paying for itself. If you bought a 1080 Ti last year, you could realistically expect it to last until 2021 at bare minimum. Even in 2022 its performance will still be considered mid-range. Or you could buy 2 or 3 shit video cards that are obsolete after a year.

Not gonna lie, the current RTX gen is fucking garbage; even the high-end 2080 Ti struggles to get to 144 FPS at 1440p and 60 FPS at 4K. What a wasted generation.

1650 is probably 2x the performance of a base console. That's what I see here, not a problem.

Try the AMD 570, absolute king of the cardlets.

Did you have a stroke user?

>2080 Ti struggles to get to 144 FPS at 1440p and 60 FPS at 4K.
only in shit games, or in games which are CPU-bottlenecked.

Dorks just go for Custom. There are things that gobble up performance, like reflections, volumetric lighting, and shadows, that default game presets are too retarded to turn off. You can turn the important stuff up to High, like textures and anti-aliasing, then just turn off the unnecessary stuff listed above, and your performance will be in the "Medium" range but the game will look a lot better. I dunno why these people can't make decent presets.
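In practice the custom preset ends up looking something like this (a rough sketch; the setting names are hypothetical, every game labels them differently):

    Textures: High
    Anti-aliasing: High (TAA or SMAA)
    Shadows: Low
    Reflections: Off
    Volumetric lighting: Off
    Motion blur / post-processing junk: Off

Frame rate lands around what the "Medium" preset gives you, but the image keeps the stuff you actually notice.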

based and redpilled as fuck

First, the GTX 1650 is a $150 card. Second, there are a lot of settings that ruin performance while adding no benefit, and these tests always leave them on or turned up. Third, averaging 63 FPS is actually really good if you compare it to consoles.

Ok, retard.

Attached: jpg2.jpg (690x421, 64K)

>Witcher
>Forza
>Warhammer
CPU limited
>The rest
shit, and also mostly CPU limited

try again

OH also, you posted the 2080, not even the Ti. We're both retarded.

Guys, when should I upgrade? These are my specs:
>Windows 8.1 64-bit
>i7-6700K
>16 GB of DDR4 RAM
>Nvidia GTX 980 Ti
>MSI Z170A M7 mobo
Or should I just max out my current motherboard?

>at 1440p
>nowhere near 144Hz
I'm showing examples of my point.
I mean, the Ti isn't that much better; and honestly, if you got a 1080 or a 1080 Ti a few years ago you could at least play at 60 FPS in all games, at 1080p or 1440p.

Attached: 1440.jpg (690x421, 59K)

Why would you buy a 1650? The 570 is better, cheaper and has better drivers.

60FPS @ 1080p is more than adequate to enjoy your games. You really don't need more than 60FPS @ 1440p. Anything higher than that is a luxury. It is a very nice luxury, of course. But you just don't need that much heat to enjoy your games.

I have a 1080ti/i9 setup and I still play at 60FPS/1440p just because it's less taxing on my computer (less heat) so the components will last a little longer. idgaf about 4k or 144hz except on my desktop - for that buttery smooth mouse movement. It's addicting.

Not for ages. I have pretty much the same build, as a 980 Ti is on par with a GTX 1070, and I still play everything at 1440p maxed out, minus Ubi wank. I'm hoping RTX gets more support and gets cheaper before I even consider upgrading in the next 2-3+ years.

Nigger, no games are CPU limited at 4k, the fuck are you talking about?
If they were CPU limited then the 2080 would be performing on par with the Radeon VII.

That looks fine. Why would you want to upgrade?

Don't know; I've had this build since early 2016, when I got my job. My first build, by the way; not sure when's a good time to upgrade.

Attached: 3f2.jpg (341x354, 161K)

>Windows 8

You need to upgrade right now; you shouldn't have anything other than Windows 7.

>not sure when's a good time to upgrade.
When you can't do the stuff you want to do anymore.

8.1 runs fine for me, meme 7 all you want. At least it isn't 10. RIP Windows 9, whatever happened to ya?

10 is fine; lest ye forget, Steam straight up stopped supporting XP a few months back. It's going to happen. Just run the Enterprise version and install Classic Shell or some shit; you wouldn't even notice the difference.

about the same as console, these days.

>cpu limited at 4k

on what fucking planet

Proud 1080 Ti investor right here, saw the red flags before it was too late

Windows 9 couldn't exist because of all the legacy software that detected Windows 95/98 by checking whether the OS name started with "Windows 9". That's the commonly cited explanation, anyway; Microsoft never officially confirmed it.
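For anyone who hasn't seen it, the check in question looks something like this (a minimal sketch of the alleged legacy pattern, not from any real codebase):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical legacy check: "Windows 9" is a prefix of both
       "Windows 95" and "Windows 98", so a real "Windows 9"
       would be misdetected as a 90s OS. */
    static int is_win9x(const char *os_name) {
        return strncmp(os_name, "Windows 9", 9) == 0;
    }

    int main(void) {
        printf("%d\n", is_win9x("Windows 95")); /* 1 */
        printf("%d\n", is_win9x("Windows 98")); /* 1 */
        printf("%d\n", is_win9x("Windows 9"));  /* 1 -- the collision */
        printf("%d\n", is_win9x("Windows 10")); /* 0 */
        return 0;
    }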

Someone at Microsoft is just making fun of people. The internal version numbers have little in common with their retail boxes.

>Windows Nine

>Vulkan
How much does it stutter?

Yes, and that's why nobody checks the kernel version; they just go to the registry and check the ProductName key.
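On Win32 that read looks roughly like this (a sketch; error handling mostly omitted, link against advapi32):

    #include <windows.h>
    #include <stdio.h>

    /* Reads the marketing name (e.g. "Windows 8.1 Pro") from the
       registry instead of trusting the kernel version number. */
    int main(void) {
        char name[256];
        DWORD size = sizeof(name);
        LSTATUS rc = RegGetValueA(HKEY_LOCAL_MACHINE,
                                  "SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion",
                                  "ProductName", RRF_RT_REG_SZ,
                                  NULL, name, &size);
        if (rc == ERROR_SUCCESS)
            printf("ProductName: %s\n", name);
        return 0;
    }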

this
who the fuck sticks to the game's definition of a setting?

People who use GeForce Experience and allow it to optimise games. I legit know friends who came from console and were too lazy to understand settings, so they left it on. One of them even complained about the framerate in PUBG because it was optimising around his 4K TV, so it naturally chased native 4K on a 1070, which led to barely 30 FPS.

It has been really strange; I've noticed that "high" settings aren't even the fucking highest allowed, so when changing it to the highest setting it goes to 'custom'. I miss when games would just optimize automatically and enable all the good shit.

Attached: letthemdie.gif (474x250, 637K)

Or games where the setting scales horrifically and without consistency.
>Ultra View Distance 25000
>High 8000
>Medium 2000
>Low 1000
Can't trust these hoes.

>horrifically and without consistency.
Rust is like this, hands fucking down; it has 8 different presets, and the highest doesn't even max out draw distance. I blame toaster users who bitch about performance issues in Steam reviews.

Thank god someone finally fucking said it.
So sick of hearing all this terrible advice on this board when it comes to PCs

Any AA option other than TAA is placebo.

You also have to fuck around with your graphics card control panel thanks to Nvidia's fuckery. On Nvidia cards the presets enable all kinds of extra bullshit, and on AMD cards you have to force-disable tessellation because GameWorks games will crash your frame rate with no survivors if you don't. Oh, and Vega has to draw something like 130% of its rated power to even hit P7 clocks.

>12 TFLOP PS5 in less than a year

Shit's going to blow past most cards in the mid-range and lower spectrum for the price of a 6 TFLOP RTX 2060

>bought an AMD Vega 64 plus DMC V, RE2R, and Division 2 for $280
Good ol' AMD

My man

Attached: 1401428542083.jpg (245x237, 25K)

Not trolling: blame poor 3rd-worlders or NEETs with no money who shill AMD or low-end cards, or overall retards who don't know how computers actually work, so they just go with what's most marketed or say shit like "diminishing returns"

Yea, 12 TFLOPs on an architecture that apparently can't even hit Vega clocks.
I can't wait for Navi to finally come out and disappoint so you faggots will shut up.

>the frames dropped slightly in this one spot, it's all over, better drop my resolution and settings!
Ironic image

>tfw 980
What GPU can I use as a rough baseline for my shitty card's performance? These benchmarks never include it for some reason

The 980 is either equivalent to or slightly faster than a GTX 1060.

You legit don't need ultra at 4K; even medium looks good because the resolution is so high

It's crazy how good AMD cards are with low-level APIs. Goes to show that their hardware is damn great but their software team is composed of monkeys.

>this

I always watched those Potato Masher Pro vs Xbone X videos and I'm surprised by how good medium and a mix of high looks at 4K. If it wasn't for having to settle for 30 FPS, I'd be tempted.

PS5's GPU is coming out Q3 this year, it's called Navi 10 & it will be $249

I see Nvidia's starting their performance-downgrade drivers for the 10 series already, with a 2070 somehow beating a 1080 Ti

With a 2080 Ti you get 100 FPS on med/high in BFV and like 70+ in AC: Odyssey

>this just in: vidya developers can't optimize their games for shit, and they run like potatoes unless you spend several thousand on the most cutting-edge GPUs
>tune in for our later story, where we investigate whether water is wet.

Attached: 30e0acf90b3ced9c7f697ed11a97dbf2cbabe9e7_hq.jpg (680x383, 26K)

>he fell for the 1660 ti meme

enjoy your cut down 1070 pcb

And all that power will be wasted chasing 4K. If a 6 TF X can't hit native 4K in a tonne of games, and they say you need 8 TF for everything to be native, that leaves you with a 4 TF overhead. Not exactly much.

This is extreme hyperbole; the 1650 is a $150 GPU, and the $279 GTX 1660 Ti smokes it and is good enough for modern settings

>Buy console
>Game runs at 24 fps on what would be medium settings on PC
>No customization

>tfw 1366x768 pleb monitor but paired with a 1060 6GB and I can play anything I want at MAX settings and 60 FPS

Attached: dsg.jpg (835x872, 92K)

>medium

LOL
Also LOL at anybody who bought the GTX 1660: less VRAM and fewer FPS than a GTX 1070 for the same exact price. WTF, Nvidia is Jewvidya.

I'm on a 1070, so 4K means 30 FPS unless I lower everything down a tonne, and that's the best-case scenario, sadly.

>buys cut down, reject bin shit
>wow wtf this thing sucks!

Attached: 1551674833141.png (300x323, 94K)

This. People are in a positive spin cycle on what is realistically achievable with Navi. I see 8 tflops being leagues more realistic than 12 tflops if it's basically an RX 580 replacement.

>970 is still more or less suited for 1080p/60 FPS all these years later

Attached: 1553727284231.jpg (727x727, 39K)

I bought the 290X over it when the whole 3.5 GB meme wank happened, and it's funny how that turned out to be a non-issue as far as I'm aware.

Consoles have been holding gaming back because they're the lowest common denominator. If devs have to make their games work on a shitty tablet CPU and an underpowered 2011 GPU, then any product from that era or later will serve just fine. Launching as outdated tech didn't let these consoles age well, unlike 7th gen: the 360 aged really well because it launched with the best GPU of its time, and over the years devs learned new techniques to extract more and more power from it.

So for someone looking to build a new PC from scratch with a budget range of $1000-$1500, what should I go for?

A Switch

logicalincrements.com/
It's not perfect, but it's a start.

Get an 8700K or 9700K and an RTX 2080 and you'll be smooth sailing for a good while.
>inb4 Zen 2
Who cares, it's gonna come in slightly under Coffee Lake IPC and a little under the price

Why do so many of those builds still recommend 8 GB of DDR4? The amount of stuttering I alleviated by going to 16 GB is absurd. Forza Horizon 4 went from frustrating to perfect.

>8gb of ram still
Ooh boy.

>Used

Attached: drake_still.jpg (1440x1080, 87K)

TAA looks awful though. At 1080p it just blurs the hell out of the image. Watch Dogs had some AA setting that didn't blur everything; besides that, there's been no good solution.

Is an RTX 2060 a good card for a first-time PC builder?

>2080

coward, do you buy used cars?

TAA done right won't blur the image.
SMAA looks just like FXAA with a sharpening filter, which ends up making everything look jaggy anyway. TAA is the way to go, aside from obvious MSAA, but that's taxing as hell.

Should probably add that the most intensive shit I play is limited to Total War games and revisiting SupCom.

pretty much, RTX ain't shit on it tho

Why the fuck is a 2070 outperforming a 1080ti
Fucking Nvidia.

RTX ain't shit period.

>97th percentile
>One spot
It's gentle enough that it shows the 99th percentile rather than the 99.9th.

Lower-level APIs work better on Turing, brainlet

Different GPU architecture, I think, is what it's called. It's Pascal vs Turing; newer would be better in this case and will probably age better

I'm not particularly mad since I got a new 1080ti for £250 less than a new 2070 would have been, but that's still fucking retarded.

Yes? That one spot is where the most shit is going on, therefore it's the most important spot to have a high frame rate.

Yeah, the average consumer definitely needs to know how well a 2080ti can run a game.

The medium settings at 1080p are the average-consumer part. The high-end cards are there for reference.

Yes and I don't throw it away when it dies on me

>1060 3gb still good enough
Feels good

New hardware is faster than old
Hmmm

Except that the 1080ti is about 60% faster than the 2070, you utter cretin

Attached: smugpost.png (858x1200, 795K)

Or that one unoptimized spot with useless particle effects that really has nothing going on.

Nvidia has been hampering performance on older GPUs since before you were even born, faggot

>he fell for the /g/ troll
youtube.com/watch?v=GBkGu4Wd7vg
absolute lmao

You are a bad consumer and need to be purged, only the latest end game garbage that will be outdated next year should exist

Who in the actual fuck uses the in-game presets? Just buy a console at that point.

>1650

literally a retard trap

I bought a 1060 and it was also a retard trap, fuck you

>Used vs new
Yeah it is something.
Don't tell me you're actually stupid enough to buy the most-used Bitcoin-mining card after the bubble burst

>Implying performance actually goes down on a piece of hardware designed to be turned on all day
God damn, why are people here so fucking STUPID

Attached: dissaproval.gif (204x195, 1.03M)

I don't think that's what he implied at all, but I'm not one of those stupid people you mentioned

>shit optimization
>probably just as shit as a game
Yet another AAA game I can ignore exists

>not sure when's a good time to upgrade.
three months ago, when the Bitcoin meme finally popped

So is it finally over?

Considering a new 1080 Ti is running you twice the price of a 2070 unless you wait for Nvidia to restock 3 months from now, retard, you're looking at a used one that obviously got fucking pounded, risking premature hardware failure.
Dumb faggot

But that's not true you retard. lol

Just change the fucking fans if you are so afraid of them dying
The components are designed to work all day and night, imbecile

Why do all that work for the clearly inferior card?

The 1080 Ti was actually worse than the 1070 for mining per dollar (its GDDR5X handled the memory-hard mining algorithms poorly). Did you miss the entirety of 2017, when 1070s were costing the same as 1080s because of it?

>1080ti shitting on the 64

loooooooooooooooooooooooooooooool

I guess you have hundreds of dollars to waste then.

You do know that the Vega 64 was designed to compete with the 1080, right?

Playing catch-up means you'll always be left behind. AMD needs to actually make a great card, not just an alternative to Nvidia card X.

>4096 cores vs 3584

With a low-level API the cores are utilized more fully, so cards with more cores will do better than cards with fewer cores, not taking clock speed into account and assuming bandwidth is identical.
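Rough numbers, counting an FMA as two ops and using reference boost clocks (so treat these as ballpark):

    Vega 64:     4096 cores x ~1.55 GHz x 2 ops = ~12.7 TFLOPS
    GTX 1080 Ti: 3584 cores x ~1.58 GHz x 2 ops = ~11.3 TFLOPS

So when a low-level API actually keeps the shaders fed, Vega's raw throughput advantage starts to show.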

k
Link me some shady-ass used 1080 Ti for $300 so you don't look retarded

Attached: Poortards lmao.png (911x992, 295K)

So, if I wanted to build a high-end PC, the 1080 Ti is the one I should go for, right? Because the prices for them are WAY higher than what I expected

Are Ultra settings a meme relative to the performance hit?

>64 beats 1080ti
>64 is a 1080ti competitor
>1080ti beats 64
>UHHH 64 IS NOT A 1080ti COMPETITOR

>64 beats 1080ti
>64 is a 1080ti competitor
Said no one ever.
If it beats a 1080 Ti, that's even more impressive, since it's a 1080 competitor.

what the fuck is a 1650

Fug. Thought I was the only one

Attached: 1517264557234.jpg (1262x709, 110K)

Yes.
youtube.com/watch?v=A8VrFUi79yo

I got an EVGA GTX 1080 SC2 + a 9400F and it's doing me fine.

Preset graphics options are the tard wranglers of the gaming world. You should be setting graphics options yourself to cut out things you don't like or don't care about to get both a better performance and experience.

>lowering settings
literally a console gamer

Nvidia's replacement for the GTX 1050 Ti. It's a fine card, but it has a shit price, so no. People who bought into the Optiplex meme should buy it, though.

Oh are you not going to reply now?