THANK YOU BASED NVIDIA

nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/

THANK YOU BASED NVIDIA

Attached: gamescom-2019-geforce-game-ready-driver-faster-performance.png (3244x1703, 134K)

Other urls found in this thread:

us.download.nvidia.com/Windows/436.02/436.02-win10-win8-win7-release-notes.pdf
google.com/search?tbm=isch&q=puddle
store.steampowered.com/app/993090/Lossless_Scaling/
software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics
youtu.be/Vd-8fV_AWGM
guru3d.com/news-story/nvidia-preps-geforce-436-02-gamescom-driver-exclusive-geforce-experience-download-only.html
techpowerup.com/257773/amd-radeon-19-7-3-drivers-increase-rx-5700-series-idle-fan-speeds-by-over-50
twitter.com/NSFWRedditGif

INTEGER SCALING

HORY SHIT

Attached: gamescom-2019-geforce-game-ready-driver-integer-scaling-ftl.png (1801x1347, 1.35M)

>big performance improvements in AMD favoring games
>fucking INTEGER scaling
>low latency mode
>customizable sharpening filter
>RTX is getting wide adoption
What does Navi offer? 100$ off for welfare recipients?

>20xx
Kys nvidia

Fuck is that real?

>UP TO 23% FASTER
>*in this one specific game nobody plays
>*with this one specific gpu nobody bought
>forza was struggling to run on nvidia so this is more of a fix than anything
>the rest is just what you'd expect from new drivers anyway
lol
epic

>RTX is getting wide adoption
of mirror-perfect puddles.
actually funny they implemented all the features AMD already had this quickly. still getting navi because fuck paying 14k local tugriks more than a 5700xt

>buying a shit monitor with an ass ppi in the first place

hey it's great that these retarded people can at least enjoy retro style games now

>nvidia
>RTX cards in Australia are still easily $800+
>fucking 1660Ti is still $450 at the cheapest
I’ll pass

I've been using nvidia for 2 years and reading this article I already feel nausea just thinking about that clusterfuck of a control panel. since they're copying AMD features, can they also make a good control panel?

Why does the 2080 super do worse in some games than the 2070 and 2060? What the fuck? Oh
>FPS Improvement
What a strange chart

This is a feature? Couldn't you already do this?

Actually nobody has this feature except for Intel

Attached: 1564983438583.gif (336x238, 3.89M)

why is nvidia intentionally gimping its hardware and trickle-down unlocking its full performance for its consumers?

not to mention the hilarious "freestyle filter", did those fuckers just use AMD's open-source FidelityFX?
which is part of the driver software btw, no need for GeForce bloatXperience
RIS does the same thing integer scaling does too
but nvidia will be praised for it while AMD will stay "lol sharpening filter" in the mass mind

Doesn't AMD have that feature that lets you upscale to 4k without losing much quality at all? That looks pretty good. Quasi-4k/60+fps on a relatively cheap card vs 1200 dollars for the RTX 2080 TI

You sound mad. AMD doesn't have integer scaling, the freestyle filter has been a thing since 2015, and wtf is FidelityFX? Even on the reshade forums they say the radeon image sharpening creates way too many artifacts to be playable compared to the normal reshade sharpening.

literally why should I care
my card already runs everything at max settings in 100+ fps, I give zero fucks about new hardware or new drivers until they make games which push the graphics more (never going to happen since consoles exist)

us.download.nvidia.com/Windows/436.02/436.02-win10-win8-win7-release-notes.pdf

Release notes already available

>driver update still not available

>all those shit games
>all those ebin gaymin gpu's
wake me up when they do something meaningful

Because people eat that shit up. Better than releasing broken hardware and trying to fix it all with patches like AMD does, though.

The levels of cope in this thread are off the charts.

Intel doesn't have this feature because Ice Lake isn't available yet; only the Gen 11 GPU has integer scaling

Meanwhile Nvidia has it first now, today on all Turing GPUs

Nvidia Chads always win
It will take some time for AMD to copy those features

>Even on the reshade forums they say the radeon image sharpening creates way too many artifacts to be playable compared to the normal reshade sharpening.
>the soviet union is a peaceful utopian state, according to soviet newspapers

nvidia copied amd features, retard
except integer scaling

It says 6 am dumbass

>sharpening is an AMD feature
I'll give you the low latency mode though.

What time zone nig

kek name one feature amd faggot
Nvidia brings new features to gaming and all AMD does is copy them

What do you mean?

google.com/search?tbm=isch&q=puddle

Wew finally regretting that RTX purchase a little but less now.

when the sun rises

It's simple.
When AMD gets performance improvements from driver updates it's Fine Wine.
When Nvidia gets performance improvements from driver updates it's gimping.

Imagine being an AMD poorfag in the current year holy shit AMD is fucking finished.

What's to regret? They're objectively the best graphics cards ever made.

but how much performance do you lose if you use a 10 series

>WAAAAAH ITS NOT TRUE

Attached: 1f5b0da35639282e68a015883bf19c143657d74678709855836fbd862ef30c02.png (1279x723, 1.29M)

>nvidia copied amd features
Why are you lying, faggot?
Literally every single feature in the AMD graphics drivers was stolen from nVidia.

It wasn't "stolen".
More of a knock-off version a year or two late, that's how it usually goes.

1-2% same as every driver update.
Time to upgrade be a good goy.

>First introduced in our Studio Driver in July, our new Game Ready Driver adds support for 30-bit color across all product lines, including GeForce and TITAN, allowing for seamless color transitions without banding.
so are 30-bit monitors not a meme now? does that even matter when the game most likely doesn't support that shit anyway?
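Banding is just quantization running out of steps: 8 bits per channel gives 256 shades, 10 bits gives 1024, so a slow gradient bands much sooner at 8-bit. A toy sketch of the difference (pure Python, function and variable names made up for illustration):

```python
# Quantize a smooth 0.0-1.0 ramp to n bits per channel and count the
# distinct output shades. "30-bit color" = 10 bits per channel.
def quantize(values, bits):
    levels = (1 << bits) - 1  # 255 for 8-bit, 1023 for 10-bit
    return [round(v * levels) / levels for v in values]

gradient = [i / 999 for i in range(1000)]  # 1000-pixel smooth ramp

shades_8bit = len(set(quantize(gradient, 8)))    # 256 shades: visible bands
shades_10bit = len(set(quantize(gradient, 10)))  # 1000 shades: no banding
```

Same ramp, same width; the 8-bit version has to reuse each shade across ~4 pixels, and those flat runs are the bands you see.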

>driver updates are now good goy tier
Getting desperate aren't we.

HDR is probably the impetus.

Obviously meant upgrade GPU from 10 series.
You want all those new features don't you?

can't find the drivers on the website

not mirror-perfect in your examples; they need to fix the materials for it, but knowing gameworks history they won't do shit

you dont seem to understand what this does
store.steampowered.com/app/993090/Lossless_Scaling/
look at the screenshots

up in 3 hours or so

it's still a little funny Nvidia started using AMD marketing tactics, that lack of RTX sales must be getting to them. the q3 revenue report won't be perfect, I'd short their stock now while it's up. q4 might be fine though after all the minecraft kids buy the 2060S

Imagine owning an AMD card.
Imagine having to wait a year minimum to get rough performance parity at a measly 100$ discount.
Imagine having to wait another year minimum for actual high end cards and feature parity, or rather parity with features the competition had last year.
Imagine having to come up with excuses why all the tech you're missing is shit anyway.
Why do people do this to themselves? It's not even 10$ a month.

The only reason the reflections in some of those images are even slightly soft is because the camera is focused on the ground instead of the content of the reflection. Unperturbed water is perfectly smooth and has perfectly sharp reflections.

>implement basic bitch nearest neighbour upscaling
>call it "hardware-accelerated programmable scaling filter available in Turing"
>lock it to your newest cards
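For reference, nearest-neighbour integer scaling really is just pixel repetition, which is why it keeps pixel art sharp. A minimal pure-Python sketch (names made up, nothing from the driver):

```python
def integer_scale(pixels, factor):
    """Nearest-neighbour integer scaling: repeat each pixel `factor`
    times horizontally and vertically. No filtering, no interpolation,
    so hard pixel edges stay hard."""
    out = []
    for row in pixels:
        scaled_row = []
        for px in row:
            scaled_row.extend([px] * factor)              # repeat horizontally
        out.extend([scaled_row[:] for _ in range(factor)])  # repeat vertically
    return out

# A 2x2 "image" scaled 3x becomes 6x6
img = [[1, 2],
       [3, 4]]
big = integer_scale(img, 3)
```

The hard part was never the math, it's that GPU display scalers historically only exposed bilinear/bicubic modes.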

Attached: imminent laughing.jpg (960x720, 51K)

it's not a damn mirror like in the RTX demos where it always looks like you're staring into one
it's a materials problem. we're at the same stage as morrowind's shader water with this.

So what, should I enable low latency globally on my 2080 Ti?

Hey faggot, Intel only enables integer scaling on Gen11 GPUs

software.intel.com/en-us/articles/integer-scaling-support-on-intel-graphics

I tend to set everything on a per game basis, except AF 16x.

It will probably reduce performance and make no difference with vsync, so I would only enable it in games where you want low latency and already have a high frame rate.

sure why not? it's like free FPS essentially

>Muh gimpworks
When will this meme stop? Hardware Unboxed BTFO that claim, meanwhile AMD does it for real and nobody says anything.

Attached: Screenshot_20190713-172101_Chrome.jpg (1343x1320, 518K)

3 driver releases have happened since then. also stop posting pcjewer, they take money from everyone on a daily basis

>stop using this site it doesn't show my favorite company in the best light possible 24/7!

youtu.be/Vd-8fV_AWGM
it just works

>muh shill
>muh jews
Every fucking time. They still didn't fix the performance loss. Meanwhile Polaris can't even run Wolfenstein the New Order.

Attached: Screenshot_20190801-132932_Chrome.jpg (1928x794, 426K)

>AMDepiphany

there are actual respectable sites out there, pcgamer is only good for news feed

Intel being fags does not excuse Nvidia being fags.

Nvidiots are desperately coping after AMD btfo them with based Radeon Image Sharpening, Anti Lag, Freesync, etc.
Enjoy your overpriced Turing house fires, suckers.

>Meanwhile Polaris can't even run Wolfenstein the New Order.
what? the thing runs on 1050ti at 60

>Freesync is supported on Nvidia
>sharpening filter is somehow a wow feature, also on Nvidia
>Anti Lag also on Nvidia
So AMD btfo'd them with nothing, got it.

here's your gameready driver goy

Attached: 1534118790-aezal.jpg (1919x2160, 2.01M)

guru3d.com/news-story/nvidia-preps-geforce-436-02-gamescom-driver-exclusive-geforce-experience-download-only.html

is this for real? are they really going to lock it behind geforce experience?

"AMD's" sharpen filter was already ported to Reshade & the "Anti-Lag" shit was just copying Nvidia's "max pre-rendered frames = 1" setting, which has been a thing for over 10 years

Nvidia always copying superior AMD technologies and trying to pass them off as revolutionary. Already seeing all the drones say how epic these features are when last month they were just "useless gimmicks" according to the same nvidiots.

>>Freesync is supported on Nvidia
5 years later when everyone and their grandma knew they could do it day 1
>sharpening filter is somehow a wow feature, also on Nvidia
DLSS is worse than RIS, period. also yes, fidelityfx sharpening is a different feature from your normal cheap sharpening
>Anti Lag also on Nvidia
after AMD explicitly explained how it works

See the cope.
If it's just "max pre-rendered frames = 1", then what does the new Nvidia anti-lag do? Secret magic sauce?

>sharpening filter is somehow a wow feature, also on Nvidia
The difference being Nvidia needs "super advanced AI deep learning" to do it and calls it by a superfluous name, while AMD does the same with a few lines of code. Better yet, it's also open sourced, and someone already ported it to Reshade so everyone can use it.
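To put "a few lines of code" in perspective: a basic convolution sharpen really is tiny. This is not AMD's actual CAS shader, just a generic unsharp-style 3x3 kernel on a grayscale image, purely to show the scale:

```python
# Generic 3x3 sharpen (unsharp-mask style convolution) on a grayscale
# image stored as a list of lists of 0-255 values. Not AMD's CAS, just
# an illustration of how little code a basic sharpen filter needs.
KERNEL = [[ 0, -1,  0],
          [-1,  5, -1],
          [ 0, -1,  0]]

def sharpen(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]      # border pixels left untouched
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    acc += KERNEL[ky][kx] * img[y + ky - 1][x + kx - 1]
            out[y][x] = min(255, max(0, acc))  # clamp to 8-bit range
    return out
```

Flat regions pass through unchanged (the kernel sums to 1); any local contrast gets amplified, which is also where cheap sharpeners start producing halos and artifacts.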

>the "Anti-Lag" shit was just copying Nvidia's "max pre-rendered frames = 1" setting, which has been a thing for over 10 years
that's not how it works, look it up.
>"AMD's" sharpen filter was already ported to Reshade
because it's open source. yaaay NVIDIA!

According to sub 80IQ nvidiots that is a loss for AMD.

Anti-Lag is just AYYMD lame copy of Nvidia features from 10 years ago

It's garbage and fake news

Here comes the nvidiot schizo.

we just never stop winning, can we nvidia bros.

okay, so nvidia reinvented it in this new revolutionary driver?

Here comes the buttmad AYYMDPOORFAGS with no drivers and baited into buying Poovi garbage

techpowerup.com/257773/amd-radeon-19-7-3-drivers-increase-rx-5700-series-idle-fan-speeds-by-over-50

No.

>"Anti-Lag" shit was just copying Nvidia's "max pre-rendered frames = 1" setting
except it isn't you dumb fuck

Attached: Capture.png (1169x825, 149K)

That article is false, the news article on the nVidia website has a link directly to the driver download page.
It's not out for another two and a half hours.

Seething nvidiot.

LOL

Attached: n.png (938x422, 73K)

No. It only works if the game is GPU bound. In CPU bound games it reduces fps with no benefit.

the current driver version is a little higher though. or do you want me to post how the cursor got corrupted last month? or how drivers black-screened some monitors? or how idle 144hz clocks were like full-load clocks last year?
oh and let's not forget how nvidia murdered several laptop models with a driver update

Look, all the SEETHING AYYMDPOORFAGS coming to post in this thread because they realize their AYYMD HOUSEFIRES garbage is shit with no drivers :^)

will this benefit my 2060 too or just the super

COPE

okay, let me explain: the technique syncs the CPU and GPU
the GPU can't make more frames than the CPU allows it under any condition, so when they're synced, frames move through the driver overhead faster and the monitor starts processing the signal several ms sooner
it's a win-win. the ~10ms this technique saves is basically the responsiveness and control feedback of free 100 extra fps
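The render-queue argument can be sanity-checked with a toy model: if the CPU is allowed to queue frames ahead of the GPU, every queued frame adds roughly one GPU frame time between input sampling and display. The model and numbers here are illustrative only, not anything from the driver:

```python
# Toy model: input is sampled when the CPU submits a frame, and that
# frame is displayed only after the render-ahead queue drains plus its
# own render time. Each queued frame therefore adds one GPU frame time
# of input lag.
def input_latency_ms(gpu_frame_time_ms, queued_frames):
    return gpu_frame_time_ms * (queued_frames + 1)

gpu_ft = 16.7  # ms per frame, i.e. ~60 fps

deep_queue = input_latency_ms(gpu_ft, queued_frames=3)  # deep render-ahead queue
synced     = input_latency_ms(gpu_ft, queued_frames=1)  # anti-lag style cap
saved = deep_queue - synced                             # ~33 ms at 60 fps
```

This is also why the feature only helps when the game is GPU bound: if the GPU is waiting on the CPU, the queue never fills and there is nothing to cap.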

Well, I guess the games aren't completely GPU bound at 1440p...

>23% faster performance
yeah for the newer cards and all the other cards get fucking gimped

Depends on the game.

it's 23% in AMD-favored games, so for example the 5700xt was running BF5 at 146 fps, faster than the damn 2080ti
now the 2080ti is a little faster, as it should be

And only in a few specific games that ran like shit on Nvidia before.

So instead of 15 FPS we now get 30? Yaaay...

Did you just copy&paste that from the shill guide without understanding the tech first?

This technique simply doesn't work if the game is CPU bound because in that case the GPU has to wait for the CPU anyway. There is no way the driver can improve latency in that scenario.

You are not in the AMD thread, sorry!

name a single CPU bound game from last decade.

Far Cry 5

Anno 1800
Rising Storm Vietnam

Battlefield V. Runs like shit even on 9900k.

funny.
do you two know what being bound by one component means? it means the GPU won't matter for the title. it means all GPUs would perform the same in testing.
the benefit of anti-lag features would only be lost in that condition. are you one of the retards who think the smaller the resolution, the more work the CPU does?

Attached: untitled-5.png (712x1034, 58K)

is bf5 cpu bound in this case?

okay, and see anti lag works anyway.

This is actually an advertised exclusive feature of their flagship hardware?

Attached: 1464019061116.gif (346x342, 3.35M)

I picked up an RTX 2070 on ozbargain for $600 tho?
Why are you dumb

Likely depends on the map and player count.
PC magazines tend to test in single player mode where CPU is less stressed.

Stellaris

You posted so much dumb and wrong shit.
Stop.

5700 XT STRIX but no STRIX 2080 Ti
Haha.

>Yea Forums talking about tech again
If there is one thing you never do it's listen to Yea Forums when they talk about anything tech related.

> nvidia
> driver

I'll let some other sucker test drive the drivers and tell us if they're broken. Chances are, they'll nerf older card performance in an attempt to get suckers to buy their newer overpriced cards.

>If there is one thing you never do it's listen to Yea Forums when they talk about anything
FTFY.

>Added Beta support for Ultra-Low Latency Mode. New control in the NVIDIA Control Panel->3D->Manage 3D Settings page. Offers improved latency for DirectX games. Currently not supported under SLI mode, DirectX 12, or on Microsoft Hybrid notebooks.

>[HDR][Integer Scaling]: Blue-screen crash or system hang may occur when changing the resolution while using integer scaling with HDR. [200542578/200542870]

Not sure if i want to try those beta features to be honest.
One doesn't work under DX12 and the other can simply BSOD, what is this shit.

>2080ti STRIX costs $1300, 5700xt strix costs $450
haha.

It cannot work with DX12 because DX12 works differently from DX11 and DX9.
"Ultra Low Latency Mode" is a carbon copy of AMD's "Anti-Lag". That is also a DX9/DX11-only feature.

Disappointing.
Copying AMD features is almost always ill-advised since they are pretty much universally shit.

NO FUCK YOU NIGGER go look at their forums if you want an idea of how they'll "revamp" the control panel.

FUCK NVIDIA, FUCK YOU
FUCKING SHIT MONOPOLY

seething lmao

Anti Lag is great. Makes games feel a lot more responsive.

Where are my rtx chads at?

2080Ti reporting in, can't wait to play the Metro DLC.

2060 owner here
Shoot me.

You're all retarded. this wasn't a move against AMD, this was a move against their actual competition, which is their own 9xx and 10xx series. locking this shit to their newest cards is just another move to try and get you to upgrade, since RTX sales were and are disappointing.

Attached: jewish.gif (320x240, 2.03M)

Will probably use my dividend next year to upgrade.

Attached: 1552339307659.gif (600x645, 2.5M)

You won't get dividends next year. Earliest will be January of the year after.

This. Tell me how Pascal can't do it. A 1080ti is still more powerful than or on par with the RTX range, which is hilarious.

Good program. Use it to play Open RSC and Super Lemmini

Attached: logo (1).png (325x325, 58K)