Why haven't you bought a new NVIDIA GPU?

It's time to upgrade.

Attached: Clipboard01.jpg (1677x942, 945K)

I already have the Geforce GTX 1080Ti though.

>bullshit image taken out of context
DLSS doesn't look amazing, but still looks and runs better than running at a lower res.
Source: my ass running BF5 at 4K vs 1440p without DLSS.

Does this count?

Attached: DC9EF97A-8014-48BA-A470-2CD0AF6C4BBB.jpg (3264x2448, 1.33M)

for a 3.5 meme card my 970 is still holding up pretty well at 1080p

Why does it look like shit? In any case, a crop doesn't mean a whole lot here. We'd need to see it in a full frame, with comparison to other frames, and see full motion video of the effect.

i have a 1070 and it kicks ass at 1080p 144hz. my processor is the only thing holding me back but i'm going to upgrade that soon hopefully.

They fixed the blurriness like 4 months ago

Attached: F-1.jpg (3840x2160, 873K)

Because I don't have any money. I'm broke.

Attached: 1499097240975.jpg (540x472, 47K)

There are no games worth upgrading for.

>graphics
>fps
The forever ultimate pleb filter.

cause I'm not a fag

Attached: 20190603_161412482.jpg (621x796, 219K)

With just a couple of blowjobs you could have the money, user.

I have money. I'm just waiting.

Isn't DLSS effectively just a meme TAA?

it's like console checkerboarding, but yeah, also like TAA.

Waiting to see what Navi offers, and if Nvidia cuts prices

Fellow 970 bro. 3.5 can suck my dick, this thing still chews through anything I throw at it. I imagine I might retire it when next gen hardware starts rolling out, since GPU requirements will start to scale as vidya takes advantage of more modern hardware.

Attached: yeahboi.png (256x256, 168K)

They're not similar at all. One uses temporal sampling to merge pixel data from previous frames with motion vectors, the other uses neural network hardware to estimate what a pixel would be when upscaling.
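
For the curious, here's a toy numpy sketch of the temporal half (made-up names, nobody's actual engine code). The DLSS side has no loop like this at all; schematically it's just output = network(low_res_frame) running on the tensor cores.

```python
import numpy as np

def taa_accumulate(history, current, motion, alpha=0.1):
    """Toy TAA: reproject last frame's resolved colour along per-pixel
    motion vectors, then blend in a little of the new frame."""
    h, w = current.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Where each pixel was last frame (motion is (H, W, 2), in pixels).
    prev_y = np.clip((ys - motion[..., 0]).round().astype(int), 0, h - 1)
    prev_x = np.clip((xs - motion[..., 1]).round().astype(int), 0, w - 1)
    reprojected = history[prev_y, prev_x]
    # Mostly history, a little new frame; this is the "temporal" part.
    return (1 - alpha) * reprojected + alpha * current
```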

Still sitting comfortably on my 980ti.
Skipping this meme early adopter generation and waiting for the next cards to deliver properly.

new nvidia cards leaked to be shown at e3. faster than the 2060 and 2070, and all the cards will get a price cut. honestly amd is dead and buried at this point, because even the best navi card releasing this year, utilizing the cutting edge of manufacturing technologies, can barely beat out a 2070 which is using an old manufacturing process (essentially 16nm) and has half of its die space taken by meme cores. once nvidia releases their own 7nm cards it'll completely blow the fuck out of amd and it won't even be funny. it'll be sad.

Good thing Intel wants to compete and actually has the funds for the poo in loo

>they fixed the blurriness
>still looks like a photoshop oil paint filter was applied to it

The original image isn't max quality at all

>1080Ti
Like OP said, it's time to upgrade.

Attached: d880359008c3eeddeacc9aa8a87be3e9.jpg (846x1200, 169K)

Unironically this, fuck an upgrade. I can run any game I want. my next gpu will be in 7 years.

Attached: 1552664233409.jpg (511x671, 33K)

There is literally no point in buying an Nvidia GPU beyond Pascal.
RTX 20 series is a meme scam. Turing GTX is overpriced for its performance. The new cards are also going to be inflated in price.
If you absolutely must, just buy a used 1080ti or 1080/1070ti.

All of those have shit framerate

what is the point of a gpu if a cpu is enough?

My MSI Vega 56 at $250 with three free games (RE2, The Division 2 and DMC 5) still satisfies me.
Why would I upgrade?
Why would I upgrade?

how is she able to afford to buy high end computer hardware like that? also aren't PC parts in japan more expensive?

>Spending $1000 on a GPU so you can have blurry upscaling

Attached: DzYIrtMXQAA-F-N.jpg (2048x1152, 217K)

Holy shit, right looks so much more real. I guess DLSS is good.

>30fps vs 60fps at 4K

I "acquired" a free 1660ti, I can't justify an upgrade.

Graphics don't really improve much anymore.

Hundreds more dollars for a 5% increase in fidelity.

Attached: cat.png (1150x1548, 2.36M)

So what do you guys think the "Super" is going to be at E3?

They're higher-binned (better silicon chip yields through production improvements) 2060/2070/2080 cards with higher core and memory clocks, slight performance gains (like 10% at most) at the same or lower cost than the current equivalent cards.
Nvidia's answer to AMD's upcoming Navi graphics cards.
A Super RTX 2070 or Super RTX 2080 will be pretty good. The 2080 Ti won't get a Super version; it'll remain their flagship.

>30 FPS at 4k vs 60 FPS at 1440p
FTFY
Running a game at a lower resolution and upscaling to 4k is not the same as running the game at 4k.
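
The raw pixel math behind that framerate gap, if anyone doubts it:

```python
# Pixels shaded per frame at each resolution.
native_4k = 3840 * 2160   # 8,294,400
qhd = 2560 * 1440         # 3,686,400
print(native_4k / qhd)    # 2.25, i.e. 4K shades 2.25x the pixels of 1440p
```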

My 1070 plays everything I want so I'm fine. Maybe the next RTX cards won't be overpriced wank and that'll entice me. Or maybe AMD will make a decent GPU range. LMAO.

no way fag. they're fag enablers

Temporal resampling (not the same as TAA) isn't 4K either, but people still call it that. DLSS works, and looks much better than native 1440p. Have you ever tried upscaling a video with NNEDI3 instead of a typical bilinear or bicubic upscale? Shit looks a lot better.
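
If you want to see the baseline being compared against, this is all a "typical" upscale is (Pillow here, filenames made up). NNEDI3 itself isn't a Pillow filter; it ships as a VapourSynth plugin and as mpv user shaders.

```python
from PIL import Image

img = Image.open("frame_1440p.png")   # hypothetical 2560x1440 capture
target = (3840, 2160)

# The naive upscales NNEDI3 gets compared to:
img.resize(target, Image.BILINEAR).save("up_bilinear.png")
img.resize(target, Image.BICUBIC).save("up_bicubic.png")
```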

meanwhile i have to pay 800 for it lol because im a third worlder and good things arent allowed here.

>Temporal resampling (not the same as TAA) isn't 4K either, but people still call it that.
Only retards call it that.
If it's not native, it's not 4k.
Period.
Running at a resolution that's sub-4k and then upscaling or reconstructing to 4k isn't 4k.

the character models are hitting a point where more detail doesn't do much, however we still have a long way to go regarding scale of environments, density of objects, and lighting (real-time ray tracing is just now being touched on at the consumer level).

Graphics can go much much farther.

I'm not disagreeing with you, it's not 4K yeah. It runs a lot better than 4K, and looks a lot better than 1440p, that's what I'd call it.

Fuck.
I have a Nigerian online friend who needed to save for 5 months just to get a shit CPU, but it was still a huge upgrade for him.
You would think prices would be better in the third world for non-essentials, but it's worse.

checkerboarding actually resolves a native 4k image as in 1:1 pixel mapping

If nothing moves for over a second, sure.

That's not how it works. It uses previously rendered buffers, usually two. Most CR is 1920x2160 native, then previously rendered pixels saved in an ID buffer to fill the rest in. You always get 8.3 million pixels every frame, half native, half sub-native. Hence why it looks so much better than TI or standard stretch scaling. Ever played Half-Life 2? No matter what resolution you use, the geometry render is CR, yet I bet you never noticed.
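
Toy numpy version of the resolve step described above (real implementations reproject the old half through motion vectors / the ID buffer instead of copying it straight, but the pixel accounting is the same):

```python
import numpy as np

def checkerboard_resolve(new_half, prev_resolved):
    """new_half: full-size frame where only this frame's checker cells
    are valid; prev_resolved: last frame's resolved output."""
    h, w = prev_resolved.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    rendered = (ys + xs) % 2 == 0   # cells actually shaded this frame
    out = prev_resolved.copy()
    out[rendered] = new_half[rendered]
    return out                      # always 8.3M pixels, half of them fresh
```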

Nah, even a slow 30fps game with bad checkerboarding still produces a near-native frame every other 33ms. So technically you'd have to stand still for 66ms if you wanted true 4k, 33ms if you wanted the mix of old frame data and new.

>ID buffer
this is exclusive sony tech. xbox and other platforms don't have this but they still resolve a 4k image just like, if not better than the pro (e.g. xbox one x).

The only technical paper I've read on this was Frostbite Engine's implementation from their presentation slides, and I think they use up to 4 or 5 frames of accumulated pixel data in conjunction with TAA for resampling.
Sampling just from one single previous frame creates too much jitter.
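
The jitter they're talking about comes from offsetting the camera a sub-pixel amount each frame, usually with a low-discrepancy sequence like Halton (sketch below; the exact sequence and length are engine-specific):

```python
def halton(index, base):
    """Standard Halton low-discrepancy sequence."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

# An 8-frame sub-pixel jitter pattern in [-0.5, 0.5) pixel units.
jitter = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
```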

>Graphics can go much much farther

Since we've already hit the diminishing returns wall years ago with graphics, we need better CPU utilization. Current consoles have terrible CPUs; with a proper next gen CPU as the baseline we could have games with proper open worlds with thousands of unique enemies and NPCs, all with good AI.

Sony tech that anyone can utilise. Technically nothing stops Nvidia's next GPUs having a dedicated processor for it. The ID buffer's real strength is its capacity for saving an entire framebuffer. There are ways around it engine-side though; Frostbite and Source support it natively.
I know very little about CR on Frostbite as I mainly play PC. I heard the ME Andromeda implementation is ridiculously good though.

>we've already hit the diminishing return wall years ago with graphics
niqqa you ain't even seen the potential of photorealism and ray tracing in computer graphics. we've barely scratched the surface.

They're too expensive.
I think I might go AMD this time around.

Call me in 5 years when RTX or another RT tech is actually optimized and feasible and usable by all GPUs in that time period. Ray Tracing right now is a meme. BFV only traces reflections, Metro uses global illu-meme-ination tracing. Nothing is actually using real ray tracing.

ok. this is based.

that's the point. we've barely scratched the surface of this tech. once next gen consoles drop in a years time more and more games will support it and the tech will get even better. there's so much more we can do in computer graphics.

The last Nvidia GPU I owned was sold away to one of my friends because his computer blew up and he needed it more than I did. I bought up from a GTX 770 to an RX 560. He ended up passing it along to one of his friends for the same reason, and buying up to an R7 3** something or other. The first game that made me consider buying a new GPU was the DLC to A Hat in Time, other than that TF2 and Quake and Oldschool Runescape run fine on it. I wouldn't pay for the privilege of having to deal with Nvidia shit ever again, and if you paid me to take one, it'd still be a hard sell.

Attached: 1558817386551.jpg (1080x1062, 359K)

The next gen consoles won't have it though. I refuse to believe in a years time even a 2060 will be able to fit and be affordable in a console. And that's just a 2060, next gen absolutely will be chasing 4k and need that power without RTX on top.

Cause I don't need to play the latest fotm shovelware with shitty optimisation at meme fps. My rx480 does just fine for everything else.

Gains in the actual price versus performance of GPUs are slowly grinding to a halt. The market isn't exactly bearing prices like the RTX 2080's at the moment. Maybe AMD's (and Intel's, I guess) pushback will help things, but the overall performance gains have been middling at best. It's way more likely that eventual progress is made through smart workarounds for some of the math involved in the more demanding features.

don't know about sony because their engineers are retarded and incompetent as fuck but for the next gen xbox, seeing as MS is literally the one who wrote the RT api (DXR), i'd bet they've been working with AMD on what kind of custom RT tech they'd need for an acceptable level of RT in their upcoming console.
also the upcoming AMD GPU, specifically the higher end one, is literally half the die size of the 2070 for the same performance or thereabouts, so it's not hard to believe consoles can potentially have ~2060 performance + RT-specific hardware all for a smaller die size than a 2060.

It's just the cost that gets me. RT sure. 4k sure. Both? Seems a lot of money to spend, even if you trade off 4k RT for 1080p RT. I guess it really does depend on AMD though, since we know fuck all about Navi. I just feel like as of right now consoles are in a weird place; they'll have pushed 4k for 4 years by the time next gen starts, and I can't help but feel like they can't really back down and suddenly go after ray tracing.

Get some glasses and everything won't be so blurry anymore.

And 1080p also resolves with 1:1 pixel mapping.
Is this a joke?

>And 1080p also resolves with 1:1 pixel mapping.

I'm not sure what you're implying but if you're saying that 1080p looks fine on a 4K display you're wrong. No TV or monitor out there (excluding CRT) uses integer scaling, so 1080p on a 4K display will look worse than on a native 1080p display.

I’m not a graphicsfag. I can still run new games on my 2017 Nvidia after updating the drivers

What is it SUPPOSED to be doing? Because to me it just looks like a gaussian blur filter. Why would I want that exactly?

Attached: 1556387986586.png (610x495, 415K)

no it doesn't. it's 4:1. every pixel in a 1080p output uses 4 pixels on a 4k display.
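
Integer scaling is trivial to express, which is why it's annoying no display does it. Nearest-neighbour 2x in numpy (dummy frame, just to show the 4:1 mapping):

```python
import numpy as np

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy frame
# Each source pixel becomes a clean 2x2 block on the 4K panel.
frame_4k = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)
print(frame_4k.shape)  # (2160, 3840, 3)
# Real displays filter instead, which is where the blur comes from.
```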

whats wrong with the image? genuinely curious. im hoping the right image isnt actually what DLSS looks like, because that looks terrible

im holding on to my 1080ti for at least 5 years

no, did you even read OP? its time to upgrade

but i did

Attached: [email protected] (1282x754, 98K)

There are no games worth upgrading for. Cyberpunk 2077 has no release date and will be out 2020 at the earliest. RDR 2 on PC is not confirmed and is still probably a year away if it does in fact come to PC. Bethesda has proven to be a garbage dev, and even if you are still hyped for their games, TES6 is still years away. 2020 will be a much better year for upgrades. I'd suggest anyone who doesn't have at least a 1070 should definitely upgrade to the 20 series.

>whats wrong with the image?
It's a bit softer, but far from that extent, and performance is better than an 80% scale.

>bought a 1070 at launch
>already starting to feel the age

Attached: 1514433861277.jpg (464x686, 33K)

DLSS lowers the quality. Fuck Nvidia

I got a 1660 not too long ago. did i do okay? i was told its the best cheap card

is it expected that i cant see any aliasing on the left or any improvement on the right since im on 1440p?

are you talking about what is clearly an effect applied because the player is wearing a facemask?

and to either of you, can you point out the areas of the image that best show either the improvement or the shittiness of the DLSS?

I have an i5 and 1070 and I only get the urge to upgrade when I play a game online and that's playing @ 1440p. The single player games I've played so far this year have ran fine at 1440p.

Looks like blurry shit. Might as well play with FXAA on. Why is everything about the 20XX cards so shitty and gimmicky?

I have a gtx 1080 and I still have no interest in newer games at all.

fuck yeah 970 bros, i have not encountered a game yet that was throttled by my gpu instead of my cpu and i'm running an i7 8770. probably got another year of life at least, best card i ever bought.

Holy fucking artifacts, batman!

you unironically have to be borderline blind to not run games at native resolution

It just looks like complete blurry crap no matter what; the entire concept was designed for consoles where they sit 20 feet from their TV and can't see detail anyway

A more proper DLSS comparison. Metro Exodus had some issues at launch but the quality of it got patched.

It's a decent card for the price, but I don't believe it's an RTX card, so it doesn't support any of these features.

Attached: ZX44Y45.jpg (1878x901, 224K)

I'm reminded of this.

Attached: yoshitoadbomberman.jpg (1280x1920, 272K)

literally what is the difference between the two images here? one is just darker, i don't get it.

>consolelard coping with his cellphone-level hardware

Attached: 1556741263886.jpg (523x512, 61K)

The jump will come this upcoming gen

The blur helps, the left is too gamey.

i tried to read about how to use this faggot shit and they just fucking have entire articles and webpages, even NVIDIA's own site, dancing around the fucking topic and not saying how to actually enable it

yes i have played games that support it and looked in the options, yes i have a recent nvidia driver, yes i have a card that supports it

>they all look the same
pc gaming is a meme

Don't call that 4K. That's not being rendered at 4K. I don't give a fuck if you spent your life savings on your meme card and you're in a buyer's remorse stupor, you ugly cunt.

But left side looks better

The difference is like 50% better performance on the DLSS

Attached: 1407039024494.jpg (600x800, 265K)

Attached: Untitled.png (1075x801, 867K)

>Wow it's almost like the one that takes more resources and runs at a lower framerate looks better
Holy shit, every single fucking one of you is fucking retarded. No wonder these threads are always full of autistic shitposting. None of you faggots have ANY CLUE what you are talking about

i really wonder what kind of leap consoles will make next gen. i cant imagine it will be that big, but i hope it will be, because even though they wont run PC ultra settings, they most likely have a pretty big impact on the image quality devs are willing to implement, which in turn affects PC ultra settings

It's very similar if you look really close. It's essentially the same type of technology on a fundamental basis.

is it not just
>open nvidia control panel(should be in your system tray at all times)
>manage 3d settings
>DLSS y/n
?

here you go

Attached: temp.jpg (2200x942, 1.05M)

barely discernible from each other in a still image, in game literally not noticeable.

so the idea is for it to be used for games that you can't run at max settings already? is this more of a technological foundation for games to start taking advantage of in the future?

nope

i think its supposed to show up in the options of games but ive never seen it in any of them

Depends on the GPU, for which there are no specs; even if there were, we wouldn't know as it's not even out yet. Ryzen 3000 and RDNA or whatever it was called are bleeding edge, not even on the market yet. Hopefully PS5 has RX 580-tier power

come to think of it, i think i remember seeing that in far cry 5 settings, though it was greyed out for me. i guess the devs have to implement it, or work with nvidia to have them implement it