>game auto-detects your hardware
>"Graphics have been set to Ultra Mega Super Duper High"
>set graphics to Ultra Mega Super Duper High
>computer can't handle it and it shuts down
>game doesn't reset graphics settings when you open it again
>"Graphics have been set to Ultra Mega Super Duper High"
>immediately lower graphics to Medium for better performance
>game auto-detects your hardware
>"Graphics have been set to Ultra Mega Super Duper High"
>get into the actual game
>30 fps with dips
t-thanks
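For the curious, auto-detect in most engines is just a crude lookup against your GPU model or VRAM amount, which is why it falls over so easily. A rough C++ sketch of the idea; every name and threshold here is made up for illustration, no real engine works exactly like this:

#include <cstdio>

// Hypothetical preset picker. Real engines usually check GPU model tables,
// driver versions or a quick benchmark; the thresholds below are invented.
enum class Preset { Low, Medium, High, UltraMegaSuperDuperHigh };

Preset pickPreset(unsigned vramMiB, unsigned cpuCores) {
    if (vramMiB >= 8192 && cpuCores >= 8) return Preset::UltraMegaSuperDuperHigh;
    if (vramMiB >= 4096) return Preset::High;
    if (vramMiB >= 2048) return Preset::Medium;
    return Preset::Low;
}

int main() {
    // Plenty of VRAM on paper but a weak GPU core still lands on the top
    // preset here, which is roughly how you end up at 30 fps with dips.
    Preset p = pickPreset(8192, 8);
    std::printf("Graphics have been set to preset %d\n", static_cast<int>(p));
    return 0;
}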
>Set to medium
>Game still runs like shit because PC is a potato
>auto-detect sets graphics on low-medium
>runs like shit with constant fps drops
>set graphics to ultra max for a giggle
>game runs steady with a high fps
Name 3 games
Counter Strike
Doom 2016
STALKER
the one I had in mind while posting was Spec Ops: The Line. Ass Creed Black Flag is another. I can't think of a third
Not the same user, but Team Fortress 2.
now this is a Yea Forums original post
>settings have been auto-enabled to big dick ultra mode!
>have piece of shit toaster
>this was always intentional because 30fps but nice looking is still acceptable on consoles
>game starts with unskippable segments and jumps straight in without giving you a chance to adjust the settings
>it begins in ultra max everything
>you literally cannot play it and have to edit the ini to lower the resolution so you can actually play the game long enough to get to the menu
fuck me. happened more times than i would like
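If you ever have to do it, the hand-edit is usually just a couple of lines in the game's config/ini under Documents or the install folder. The key names below are made up and vary per engine, but it tends to look something like:

[Display]
Fullscreen=0
ResolutionWidth=1280
ResolutionHeight=720

Drop it to something any GPU can push, get to the menu, then set things properly in-game.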
iirc, Elder Scrolls Oblivion can't properly detect your hardware if it's newer, so it sets itself to a lower quality.
I FUCKING HATE BLOOM AND DEPTH OF FIELD AND LENS FLARE
WHY DO PEOPLE THINK IT'S A GOOD THING TO HAVE YOUR SCREEN BE OBFUSCATED LIKE THAT
>accidentally set in-game resolution to higher than what your monitor can handle
>game crashes on start-up every time now
>have to delete the .ini to fix it
i can't exactly blame RtCW seeing as it was made in like 2000 but still
I know this pain
It looks cinematic and better
>autodetect sets everything to medium
>toaster can't take it and dips below 20 fps at times
I'll do you one better OP
>have 2080 Ti
>launch Oblivion
>Automatically detects graphics quality
>Set to low quality
>mfw
>Game suggests difficulty based on how well you do in the tutorial
>Difficulty gets set to hard
>have 1000 dollar computer that can run games at high settings
>put everything to the lowest
>have fan running 24/7 so graphics card never gets hot
If it's a single player game I don't mind that stuff so much, but if it's multiplayer it can fuck right off. I turn off motion blur no matter what game I'm playing, dumbest shit ever.
>die like a bitch due to not being familiar with controls
>game suggests: very ultra turbo easy, white woman mode
>proceed to win the game on hard
>die tens of times during hard fights
This. How the FUCK do people aim when moving if 75% of the screen is fucking blurred out
I always immediately disable DoF and lens flare. Why the fuck are those a thing?
>playing late 2000's PC game
>auto-detects hardware
>set graphics to medium
DoF and lens flare DO NOT look "cinematic". The stated purpose of DoF is to mimic the human eye; I am looking at a screen with my human eyes, I don't need the edges blurred twice for no reason, thanks. Also, lens flare. My character is NOT holding a fucking camera; I don't experience LENS FLARE when I'm standing outside on a sunny day.
>over at my friends house
>has game I don't have
>asks if I want to try it, say yes
>immediately turn motion blur off in the settings
I don't know how he beats as many games as he has
C H R O M A T I C A B E R R A T I O N
No Mans Sky
Doom 2016 (with vulkan though)
Fallout 4
>play one of the unreal tournament games
>it can't auto-detect what my modern rig is
>auto sets to ultra low 640x480
why is motion blur even still a thing in games
I used to get this back in early versions of minecraft. For some fucking reason turning up shadow quality improved my performance.
That game is a fucking confusing mess.
>game auto-detects your 32c/64t cpu
>"Application Error"
This is how you know your rig is far beyond even their development setup.
Consoles
>we noticed your last session didn't close correctly, we must restart again
Nani?
>game auto detects your hardware
>can't comprehend all these devices from the future
>sets everything to very low just in case
>buy $2k computer
>play HOMM 3 and Civ 5 because basically i have retard disorder
But those are fun games. Try Age of Wonders III, it's like a mix of those two games
consoles
>game is 10 years old so it runs at ultra on a potato laptop even though it was a graphics benchmark back in the day
Console kiddies think (post effects = better graphics)
>I don't experience LENS FLARE when I'm standing outside on a sunny day.
>he doesn't own eyelashes
Imagine being this poor
actually it's "I can't see how shitty this looks because the scenery is blurred and there's dirt and lens flare in front of it"
>32 cores
>each running at 1ghz
>wtf why won't it run?
Even though Overwatch ended up a shitty game, they do have pretty good auto-detection for low-end PCs. During the beta I only had my shitty laptop with me, one that can't render anything 3D, and it managed to autodetect and set up the settings for 60 FPS.
Sure, the game looked like the original Doom, I think the effective resolution was around 400p, and I couldn't tell the characters apart, but it was running at 60 fps.
>4 year old graphics card
>Auto detected max settings
>Game runs like shit
Every damn time.
>blur
>bloom
>lens flare
at least bloom can be disabled
Subnautica
Blur legit makes me fucking sick. I've been playing video games since I was 3 years old and the only game that's ever given me motion sickness was fucking FFXV during that Leviathan fight where you couldn't turn the fucking blur off and the screen was going batshit insane.
Ambient occlusion, bloom, and fuzz can all fuck off too. Stop burning up GPU cycles for bullshit that makes the game look worse.
>autodetect
>super high
>15 fps
>B-BUT IF THE GPU USAGE GRAPH ISN'T AT 100% IT'S NOT WORKING!!!1!11!!!1
ARK
>game runs like shit, running worse the more of the sky is visible at once
>turn shadows off in settings
>shadows are still there
>but the game now runs perfectly
I don't even.
Thank you, I definitely will
Hardware
Lots of laptops and anything branded green will keep rendering on the CPU's integrated graphics instead of the dedicated GPU unless it hits a threshold or software tells it otherwise. So when it seems to do better all of a sudden, it's because the dedicated GPU actually kicks in.
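The "software tells it otherwise" part is, on Windows at least, often just a pair of exported symbols the game's .exe can define so the NVIDIA/AMD drivers prefer the dedicated GPU. Minimal MSVC-style C++ sketch that would sit in the executable's source; the two symbol names are the vendor-documented hints, the comments are mine:

// Exporting these from the game's executable hints laptop drivers to run
// the process on the dedicated GPU instead of the integrated one.
extern "C" {
    // NVIDIA Optimus hint: nonzero requests the high-performance GPU.
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    // AMD PowerXpress equivalent.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Otherwise you can force it per-application in the NVIDIA control panel or the Windows graphics settings.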
Sun shadows, my man. The columns of light that shine through the clouds. That's what you turned off. Occasionally called God rays.
Metro 2033. You get better performance by setting vsync on in the original release.
Had that happen with Rome 2 Total War.
Bunch of graphical errors and stuttering on medium, ran fine on high.