Are pc gamers ready?

Attached: Screenshot_20220421-200820_Chrome.jpg (839x1144, 424.13K)

>we run less efficiently now.
SUCH INNOVATION!!!

>2000W power supplies will not be a meme in your lifetime

>he doesn't power his PC with 2500 hamster bros ngmi

dude you need 1kW to power your Intel CPU to get that 5 percent increase over AMD. To match your meme AI-driven GPU that is barely noticeably better, that AI is working hard!

does "power excursions" mean built in crypto miners?

GPU owners are a menace to the environment.

Can a typical house's wiring handle this being drawn from a single outlet?

3D graphics and their consequences have been a disaster for the human race

>can play games at 4k 144fps and cook you a steak to perfection
sounds based to me bro

>you will NOT get a pay raise
>you WILL pay more for power
>you WILL pay more for computer parts
>your computer parts WILL draw more power
>your dollar WILL continue to devalue
>YOU WILL BE HAPPY.

I for one welcome the return of the Fermi maymays.

lol this

just turn off your central heating :^), the costs will balance out

HIGHER WATTS BETTER BRO
just like video games with realism. Fuck this shit. Optimization is fucking dead

yeah, it will just cut power to the rest of the block you're on

>tfw live in a cyberpunk dystopia without ANY of the cool robotics and cybersex shit

Attached: 1637290523089.png (550x700, 159.79K)

we have cool robotics, they're just mostly in the """defense""" sector.
it's about as pure cyberpunk dystopia as you can get :^)

Not long ago I heard that 1200W is overkill, but my PC is always hungry: OC'd CPU, OC'd GPU, WiFi/BT dongle, 2x USB for a Corsair keyboard, mouse with LEDs, two 1.5m tall LED bars connected via USB, a Govee ambient light connected via USB, an external DAC that eats watts from the wall and from the USB connected to the motherboard, a fast charger station connected via USB, and a monitor hub with 4 USB 3.0 ports (1 charging my phone, 2 charging my dex, 3 charging my gamepad, and the 4th is a spare for a USB memory stick)

>You will have to get a 30 amp circuit for your PC
Well okay then

As someone with a GTX 1080 going on 4 years now, my next GPU will definitely be a 4070. No way in hell I'm letting my power bill go up 20-35%.
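
Rough sketch of what that bill math could look like, assuming a hypothetical 200W jump in GPU draw, 4 hours of gaming a day, and $0.15/kWh; every input below is an assumption, swap in your own numbers.
# Back-of-the-envelope GPU power cost estimate (all inputs are assumptions).
extra_watts = 200          # hypothetical increase in GPU board power (W)
hours_per_day = 4          # assumed daily gaming time
rate_per_kwh = 0.15        # assumed electricity rate in $/kWh

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
extra_cost_per_month = extra_kwh_per_month * rate_per_kwh
print(f"~{extra_kwh_per_month:.0f} kWh/month extra, about ${extra_cost_per_month:.2f}/month")
# -> ~24 kWh/month extra, about $3.60/month at these assumed numbers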

Can't they just... make a 2080ti that uses like a 3rd of the power it used upon initial release instead of this crap?

We have Cybersex, but you faggots complain over it

Attached: 1649977682812.png (5963x4862, 319.16K)

>can’t make better designs
>just put more power through it

this is because everyone refuses to move to RISC architecture

you asked for this by continuing to buy these dogshit CPUs

Ah, I forgot that I also have a 1TB SSD, updated it with another 2TB SSD, then I put in my old 500GB HDD, then another 2TB HDD, and as I was short of space I got another 24TB helium HDD. I never used all of them at the same time, but they can easily eat 150W.

Haven't you heard? We're moving backwards now, pretty sure 3090 and 3090 Ti are LESS energy efficient than a stock 2080 Ti.

Shut it brainlet

>the ukrainian flag
>that color scheme
>UUUUOOOOOOHHHHH
it all makes sense now

That is false. They are more efficient chips in terms of performance per watt, but they consume more power overall to reach a higher overall performance.

No. You're likely not going to be drawing the full 2kW, but if you were, that's 16.67 amps at 120V. Some household wall outlets are on 20 amp breakers, some are on 15s, but those are designed to carry the cumulative load of multiple rooms, not sustained power on a single line.
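
For reference, a minimal sketch of the watts-to-amps math behind that 16.67 A figure, ignoring power factor; the 120 V / 230 V values and breaker sizes are just typical examples, and continuous loads are commonly derated to around 80% of the breaker rating on top of that.
# Watts to amps at common mains voltages (simplified: amps = watts / volts).
load_watts = 2000
for volts in (120, 230):
    amps = load_watts / volts
    print(f"{load_watts} W at {volts} V is about {amps:.2f} A")
# -> 2000 W at 120 V is about 16.67 A (tight against a 20 A breaker, over a 15 A one)
# -> 2000 W at 230 V is about 8.70 A (comfortable on a typical 13-16 A circuit)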

Oh noo. Demiboy is way too sexist! Where's Demigirl, Demiwoman or DemiLoli?

That's because graphics cards never get hardware revisions like consoles. If there was just one gpu for 5 years with efficiency updates every year it would be better.

you're angry because I'm right

Damn, I'm pro-Ukraine now!!

Attached: 1626110430260.png (1220x700, 688.12K)

No, no it is not false. The top-tier cards are pushed so hard they are literally less efficient than a 2080 Ti from 2018 on a worse node.

Attached: file.png (500x1090, 168.82K)
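
You two are arguing past each other because "efficient" can mean perf-per-watt or total draw; a quick illustration with made-up numbers (the performance ratios and wattages below are hypothetical, not benchmark data).
# Performance-per-watt comparison with hypothetical figures.
cards = {
    "older card":     {"relative_perf": 1.00, "board_power_w": 260},
    "newer flagship": {"relative_perf": 1.60, "board_power_w": 450},
}
for name, c in cards.items():
    perf_per_watt = c["relative_perf"] / c["board_power_w"]
    print(f"{name}: {perf_per_watt * 1000:.2f} perf per kW")
# The newer card is faster outright, but if its perf/W comes out lower,
# both claims in this argument can be "true" at the same time.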

they could, but GPU manufacturers are dialing up the power consumption each new gen to hit arbitrary targets like "50% more powerful than the previous gen" or to take the performance crown versus their competitors' models

>3070 is the most energy efficient nvidia card in multiple generations
3070 CHADS WE CAN'T STOP WINNING

>7%
The difference is so small it can barely be appreciated. After all, everyone has different mobos and the difference could swing by around 5%.
So yes, it's good that they consume almost the same, but with smaller chips one could expect better performance than this.

Or it's because amd is finally competing so nvidia is reeeing, housefires be damned.
Some fucking retard youtube reviewer out there will buy a 600w rtx charcoal edition, get some click bait review out and make children think nvidia is the best so they all buy a 4050ti for $350 for $200 of performance.

Every big graphics card for the last eighteen months has sold despite having 300W+ power consumption, so companies assume the market will take it, ignoring that it was all miners who underclock and undervolt in open-air setups.

SLAVA UKRAINI!!!
UOOOOOOOOHHH

Eh, I'm fine with my 3060 Ti for another 5 years or so, or at the very least until the next round of energy efficient cards after the 4xxx series.

>MAP flag uses emoji colors
bros how deep does it go

Sure but I was talking about 3090 and 3090 Ti. Also no, you're not going to get 2080 Ti performance at 80-90W out of some garbage like Ampere.

I went AMD this generation, and if Nvidia really is going to require me to use a 1000 watt PSU for their housefire cards then it's another AMD generation for me.

Attached: 1608788600968.jpg (1280x960, 322.8K)

To be fair, as you said yourself, it's because they are pushed to the absolute limit by default. If you limit the TDP yourself to something more reasonable (300W?), the 3090 Ti is waaay more efficient and performs more or less the same.
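
If anyone wants to try that themselves, a minimal sketch of capping board power from software; it assumes an Nvidia card, the nvidia-smi tool on PATH, admin/root rights, and that 300 W falls inside the card's supported limit range.
# Sketch: query and cap GPU board power with nvidia-smi (needs admin/root).
import subprocess

# Show the current/default/min/max power limits first.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Cap GPU 0 to 300 W (the value must be within the card's allowed range).
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "300"], check=True)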

>making flags for a specific group of people without their permission
Rude as fuck, and isn't this shit way off from their inclusivity spiel? Instead of pitching one flag to unify the entire human race, these faggots just segregated themselves. What a bunch of retards

Attached: UUUUUOHH.gif (220x220, 120.73K)

Don't care. I'm still using the ancient (and still in good form) 1080 Ti overclocked to 2212 MHz, so I don't envy the 3070 users aside from the ray tracing and some extra fps.
Maybe next gen when I can get a 4080 for +/- 1000...

The 3080 was reportedly running at 240W with ~5% performance loss according to some early buyers, all three of them.

The 6800 XT was in a similar boat: shown at 300W, but the AIB overclocked cards give a tiny bit of extra performance and go over 350W.
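
Taking those reported numbers at face value (240 W with ~5% performance loss), and assuming a ~320 W stock board power for the reference 3080, the perf-per-watt gain from the limit works out roughly like this.
# Efficiency change from power limiting, using the figures quoted above
# (the 320 W stock board power is an assumption about the reference-spec 3080).
stock_power, stock_perf = 320, 1.00
limited_power, limited_perf = 240, 0.95

gain = (limited_perf / limited_power) / (stock_perf / stock_power) - 1
print(f"~{gain * 100:.0f}% better performance per watt")  # -> ~27%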

I love my IGP computer.
It's insane, a machine you can't even be sure is fucking on until there's something onscreen.

>outdated lgbt flag
You gonna get lynched by the tranny orbiters.

Honest question as a 1660 Ti owner, but why weren't you interested enough in ray tracing or DLSS to upgrade to a 30 series card? I can understand how your card is still decent for the current gen, but I got hooked on the selling point of DLSS, and I'm wondering about your take on the feature and why it didn't feel like a worthwhile purchase.

They can't even pass the performance of Apple chips except by clocking 2k or more MHz higher. And they still put out vastly more heat vs an SoC that has the equivalent of a GeForce 1650 or so. It's not even competing CPU to CPU on efficiency, but CPU vs every component in an SoC.

I have a 500 watt PSU with a Ryzen 2700X and RTX 2080 Ti. Never had a problem.

>got hooked on dlss
You mean that temporal artifacting piece of shit?

>The 12900 iGPU was a bad joke; 2x the power draw and half the performance of AMD's iGP
Intel never learns.

Attached: k.jpg (2130x1347, 371.36K)

Why did they start sucking again?
My HD 530 from back in the day could play Counter Strike GO at 1080p. I thought future chips would have excellent prospects

>dude lets just start ramming as much electricity as physically possible through video cards to compensate for devs not being able to optimize anything

If it's able to retain graphics fidelity from a lower resolution, thereby increasing performance, we get the best of both worlds. Just seems like a good deal to me.

Only the laptop Alder Lake CPUs got the full fat 96 EU iGPUs.
Desktop parts swapped it out for a 32 EU one and used the extra space for more cores.
The 6-core is the best because it doesn't have the crummy e-core shit either.

first world countries that run on sensible voltages can do it with ease

Desktop power consumption does not matter so long as breakers aren't tripped and throttling isn't necessary.
That being said, I really wonder how they intend to draw 1800 watts without tripping a breaker.

By IGP I mean an integrated GPU; I'm using a 2200G.
It does a lot more than it is supposed to.

Maybe we're just past the point where optimization can meet the demands of further improvements in graphics technology. Optimization just might not cut it anymore, and now it seems more like cutting corners than resource management.

That doesn't change that they cut corners to implement a cheap iGP and promote these chips. We know that a 1050 can melt these, but the performance from last gen was better than that.

Attached: eeeee.jpg (1679x1313, 197.75K)

You can easily draw 1800 watts in a transient spike without tripping a breaker. It happens for a microsecond; it's not a big deal.
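
Rough numbers on why a short excursion doesn't trip anything, assuming 120 V mains and a millisecond-scale spike (the duration is an assumption); a thermal breaker needs sustained overload to heat up and trip, and the PSU's bulk capacitors buffer much of an event this short anyway.
# Transient vs sustained draw (illustrative; 120 V mains and 1 ms duration assumed).
volts = 120
spike_watts, spike_seconds = 1800, 0.001   # assumed millisecond-scale excursion

spike_amps = spike_watts / volts
spike_joules = spike_watts * spike_seconds
print(f"{spike_amps:.0f} A for {spike_seconds * 1000:.0f} ms, only {spike_joules:.1f} J")
# -> 15 A for 1 ms, only 1.8 J: nowhere near what a breaker's trip curve cares about.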

how the fuck do you go from a 350W card like a 3090 to needing a 600W connector in the OP image?

The trouble is it decreases fidelity with ghosting artifacts when in motion. If you can't see it, then it's probably hidden by the TXAA or motion blur that should be turned off anyway.
If you don't care then fine, but it's eye cancer to me, along with DoF and chromatic aberration.

DLSS looks really nice in screenshots, and stuff like text gets really crisp, but pay attention to complex objects like trees and you notice problems.

user, the names on the right are the names of the iGPs that these processors are using.

all that power just to play...
>scam citizen
>console ports
>unoptimized multiplats
>archaic emulated console games
....Y*KES

Attached: 1624507512802.png (1454x1468, 2.83M)

Meanwhile the West is too busy sucking Taiwan's shrimpdick to invest in actual fabs that can get these chips small enough to NOT set a house on fire.

Attached: 1598767443034.jpg (1536x864, 235.22K)

DLSS is an abomination. It's the opposite of what we should be doing which is increasing resolution and pixel density so that AA isn't even needed anymore.

Nobody needs these, and the 30 series is already touching the point of enabling bad optimization and development practices by crutching on the raw power of new GPUs to do what was already possible with 1-2 generation old GPUs.

>the poorfag cope has arrived

I was trying to seem reasonable; I'm firmly in the give-me-supersampling-or-death camp.

Intel's building new fabs as we speak. They just suck.

Supersampling would be great as an option as well. Unfortunately that means we'd have to reduce the graphical fidelity which would make retards mad, so we are stuck with TAA, DLSS and similar awful garbage.

"Moooom! I need a Razer™ Gaming Breaker®"

>poorfag cope

Attached: 1643251190875.png (404x402, 17.72K)

multiple new fabs are being built in Arizona as we speak; TSMC even has a plant being built there.

This. Heat is a meme anyway. Just put on another jacket.

>Americans are going to need to get a 220/240V circuit wired in just to run a gaming PC

Pork projects. The chips currently being made are old, and China backed its pressure off Taiwan. TSMC's projections went through the fucking roof this quarter.
They'll start asking for more govt gibs, then halt construction once they can't steal any more

You say it's hungry, but have you ever measured how much it actually uses at max load?

Attached: KillAWatt.jpg (507x680, 93.57K)
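
Before buying a bigger PSU, it may be worth sanity-checking against a wall meter like the one pictured; here is a rough tally with assumed per-component figures (every wattage below is a guess, the meter reading is the ground truth).
# Very rough component power tally (all wattages are assumptions, not specs).
estimated_draw_w = {
    "overclocked CPU": 180,
    "overclocked GPU": 350,
    "motherboard + RAM + fans": 80,
    "SSDs + HDDs": 40,
    "USB gadgets (LEDs, chargers, DAC, hub)": 50,
}
total = sum(estimated_draw_w.values())
print(f"estimated DC load ~ {total} W")
print(f"at ~90% PSU efficiency, wall draw ~ {total / 0.9:.0f} W")
# -> estimated DC load ~ 700 W, wall draw ~ 778 W: still well under 1200 W.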

Meds

Not an argument

if you think any gpu is "too fast" then you are poor, no question about it

>you WILL eat zee bugs

power is getting cheaper to produce
what's the problem?