What are your thoughts on anti-aliasing? Do some games look better without it?
Higher resolutions don't need AA. Supersample the fuck out of it.
Yes, low poly or early 3D games with not too much fidelity look pretty good without it. Games with really high framerates and high dpi displays don't need it too much either. That said, FUCK FXAA, SMAA and all the other shitty 2010 AA's that killed MSAA. Vaseline sparkly aliased garbage that doesn't fix jack shit.
Isn't MSAA just intelligent supersampling?
>do some games look better without it?
Older titles with nearest neighbor "filtering" do.
Skyrim looks better without AA
Skyrim looks best when you turn the game off.
Nothing worse than seeing jaggies stick out like sore thumbs.
I prefer the sharp edges on objects 2bh
>Supersample
It's the most expensive form of AA there is.
just play in 4k so that you dont need to even worry about it
It's generally shit; I find most techniques just blur image quality, with a few exceptions. I'd much rather downsample from 4K or disable AA entirely, with the exception of maybe using SMAA.
1440p masterrace don't need no filthy AA
kek
But where's the pro-aliasing?
>do some games look better without it?
pixel art games
I can't stand the N64's anti-aliasing. I've got my N64 RGB'd and it still just looks blurry and shitty.
There is no game on planet Earth that doesn't look better with AA. That's the whole point of emulation to begin with, after all: upscaling PS2 and 3DS games to 4K resolution so they actually look like real games.
nigger AA doesn't have anything to do with upscaling
Any game with TXAA looks undeniably worse.
I'd say most games do as long as it's at least 1080p and up. Pretty much no matter the type of AA.
I'd say anything running at lower than 720p doesn't benefit too much from it, as the implementation is generally poor and extremely taxing.
I personally play with it off if it's any kind of competitive game.
That's what super sampling is. It's literally SSAA.
I don't think it's that important. Jaggies are very noticeable in stills, but not so much in motion, especially when you're paying attention to the gameplay.
I always turn it off because it tends to affect framerate on my medium tier computer.
Jaggies are one of the first things I notice unless it's an extremely fast paced game.
Pop-in is always the first thing I notice. RDR2 has massive issues with it and it put me off the game. What's the point in making a game look so good that you add horse balls just for shits and giggles, but you still can't load things in across the map in perfect detail? That's the only graphical evolution I care about anymore; I would take a game that looks like vanilla Morrowind but has 100% LOD for every object in the game as soon as you load in.
That's honestly more of a memory issue than a graphical one. Plus being limited by the PS4's paltry 800MHz GPU with a max of 8GiB of total system memory doesn't help.
No anti-aliasing looks better in 2D games with the right display
>rasterization
On my 4k I don't even use it anymore since at my normal viewing distance I can't make out individual pixels anymore so the whole thing has become a non-issue for me.
Every game looks better without it. Jaggies > blurries.
Quake agrees with this statement.
Most games look like shit without AA; if I can literally see the pixels move, it needs AA.
It's this
Real resolution increase > antialiasing, always and forever
>what are your thoughts on anti-aliasing?
I'm the type that goes to great lengths in the Registry Editor to turn off ClearType in Windows. Does that answer your question?
anti-aliasing is too energy costly, every game should avoid raster and just use vector.
>2010 AA's that killed MSAA.
Deferred rendering is what killed MSAA (and mirrors).
because it only has FXAA
i turn it on for single player games to help immersion.
Someone give me a rundown on all the types of anti aliasing please. There's so many options I just leave it at the default AA option at 2x.
same place where "antilapse anus" is
Meme Blur Filter Tier:
TAA
TXAA
FXAA
SMAA
MLAA
Actually good AA:
MSAA (killed off in most modern engines)
Just downsample from a higher resolution, it's better than any other form of AA.
I like how MSAA has tiers to choose from but all the other ones are just ON/OFF.
jaggies ew
Aliasing: Soulless
Antialiasing: Soul
Lmao just downsample: Soulless
>SMAA
>Blurry
The only time SMAA is blurry is when it's used as a post-processing tool like ReShade, and then it only blurs UIs, HUDs and text.
Anti-aliasing is always shit, it's either a blurry shitshow or takes a big performance hit for marginal improvements.
TAA is *somewhat* enjoyable, the rest of the post-process methods are just rotten, MSAA will be sorely missed
2-4x MSAA is mostly unnoticeable on performance and clears up most jaggies.
Just turn on 50% scaling.
What is deferred rendering and why did it kill off MSAA? Why are the two incompatible with each other?
It's a tradeoff, how much do you want to reduce jaggies vs how much blur you are willing to tolerate.
Strange Journey UI was way better than SJ Redux one.
there's some games I prefer without AA on; it's usually the first thing I turn off to save performance.
not sure about the relevance to this thread but I agree
How do I play games at higher resolutions in windowed mode?
I want to use the nice AA from super resolutions, but it doesn't work when you want to tab out a lot
AA is cancer for pixel art games
AA is good for HD games, since it smoothens the line
Run desktop in super resolution?
Pointless garbage for graphicsfucks. If you can't play games on 640x480 on ultra low settings, you don't like videogames.
Drop your resolution bro
Weird sentiment coming from somebody who wears glasses.
Deferred rendering is when you do your lighting and rendering through screen-space shader shenanigans. It allows you to have as many lights as you can calculate instead of the 8 the hardware would limit you to. Because lighting essentially gets relegated to another post-processing effect, it can play havoc with how your scene is rendered, breaking all kinds of things unless you waste draw calls and processing power to work around it.
MSAA expects a "straight" 3D scene to work on, and deferred rendering doesn't provide it.
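In code terms the two-pass split looks roughly like this. A toy numpy sketch, purely illustrative (all the names are made up; real engines do this on the GPU with G-buffer render targets):

import numpy as np

# Minimal sketch of deferred shading; illustrative only.
W, H = 320, 180

# Geometry pass: rasterize the scene once, storing surface attributes
# per pixel instead of shading immediately.
gbuf_albedo = np.zeros((H, W, 3))   # surface color
gbuf_normal = np.zeros((H, W, 3))   # world-space normal
gbuf_pos    = np.zeros((H, W, 3))   # world-space position
# ... rasterizer writes into these buffers here ...

def lighting_pass(lights):
    # Screen-space pass: shade every pixel straight from the G-buffer,
    # so cost scales with lights x pixels, not lights x geometry.
    out = np.zeros((H, W, 3))
    for light in lights:  # as many lights as you can afford, not 8
        to_light = light["pos"] - gbuf_pos
        dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
        ndotl = np.sum(gbuf_normal * to_light, axis=-1, keepdims=True) / np.maximum(dist, 1e-6)
        out += gbuf_albedo * light["color"] * np.clip(ndotl, 0, 1) / np.maximum(dist ** 2, 1e-6)
    return out

image = lighting_pass([{"pos": np.array([0.0, 5.0, 0.0]), "color": np.array([1.0, 0.9, 0.8])}])

The MSAA problem falls out of that structure: the hardware resolve wants to run on a final shaded image, but here the G-buffer holds one surface sample per pixel and the "image" only exists after the lighting pass.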
i'm not an autist so I really dont notice it unless it's on trees or mountains.
>4K
>not 8K
Poorfag
I think I'd trade having an arbitrary number of light sources and shader passes for having MSAA back. I don't really care how many lights are in a scene if I have to choose between everything shimmering at me with obvious jaggies, or a big ugly blur filter followed by a sharpening pass to cover everything up.
My boy
It's because modern games have lots of PBR materials, and the aliasing created by specularity (light flickering/shimmering) comes from shading rather than geometry edges, so it can't be fixed with MSAA.
he probably means upsample.
Supersample it or get the fuck out. I'm not going to have less information on my screen at the cost of performance
This. I exclusively play all my PS2 games in 8K.
Mario Odyssey graphics are horrific and crying out for AA. Truly horrific jaggy mess
Purely 3D games look better with AA 99.9% of the time.
However, filters like that can completely ruin 2D games, particularly pixel art ones.
One of the most egregious examples is KOFXIII. Some fucking spastic on the dev team put in a video filter that can't be turned off and it blurs the absolute shit out of all the beautiful sprite work the artists worked so hard on, removing all the crispness and definition. What should have been one of the greatest looking pixel art games of all time was completely ruined.
>left picture doesn't have grey pixels
>right picture does
Good job adding things that weren't originally there.
Congratulations, you've figured out that AA changes an image.
There is no such thing as a 3D model that doesn't look better with (in-process) anti-aliasing.
The concept doesn't exist in the rasterized world (read: 99% of 2D games).
Sprite-based 2d games look better with antialiasing. Fuck your shitty emulator filters.
then.. you mean without anti-aliasing.
I turn it on if my card can handle it without going below my monitor's refresh-rate, but I don't really care too much. I'm not a graphics whore though.
Some games do look better without AA
But most games look better with AA.
2d in general look better with no AA or filters, as noted here
If a game uses Depth of Field effects for whatever reason, it should use AA for at least those parts.
If not, then AA is a waste of resources.
My card's AA is always way faster & lighter on the GPU than the game's AA.
Yes, fuck, I edited my sentence after typing and made it wrong
I don't like AA.
I have enough blurring in real life with my shit-tier myopic eyes as it is.
Motion blur can also go fuck itself.
Turn anti-aliasing off if you play on a normal/small screen with a normal resolution. You only need anti-aliasing if you've got a big screen with low resolution.
Also, you only need vsync for games with a higher framerate than 60
>Also, you only need vsync for games with a higher framerate than 60
So 100% of the ones you'll be playing because if you're not hitting at least 60 you should drop your settings until you do.
I hate these threads because they're a reminder most people on Yea Forums are technologically inept
lol, you new to gaming eh? Let ol' gamer pa tell you a secret, uneducated kiddo:
you dont need more than 10fps to have fun
graphics dont really matter. a fun game will be good and a shit game will stay shit
Protip: If you want to pretend like you're older than you actually are, use a PC game with a vsync option instead of a console game in split-screen multiplayer for your example; otherwise you out yourself as a toddler who doesn't know what he's talking about.
I know you hear this a lot, but man, you are one dumb fucking motherfucker. Fun != tech
When did I say it was? Oh right, I didn't. I just pointed out you're a moron and now you're trying to save face any way you can.
finally someone who gets it. Are there any open world games that have 100% LOD that aren't Minecraft-tier?
Uhh, Minecraft has pop-in too.
>want to downsample with DSR
>it's blurrier than most TAA
t-thanks
anti-aliasing is a meme and looks like trash
I don't like anti-aliasing.
Not sure why.
The performance boost because i play without it is nice though.
>supersample lol whats that I'm dumb
What is your DSR smoothness factor set to?
Ugh, fuck that's unplayable
I know gameplay over graphics, but when your first-person shooter is turn based, there's a problem.
I used to go above and beyond with AA back when I was on a 1080p screen. Now that I use higher pixel density I no longer care, maybe some shimmering is annoying sometimes.
Scrolled past this and it made me laugh. Nothing to say about the OP, but I'm stealing that quote.
Shimmering is worse than aliasing in my opinion, it's so distracting. Fucking Cry Engine.
>some
Name a single game that looks better with this garbage setting.
Off every time.
I like FXAA
Alan Wake. Its visuals literally require AA to function.
is it possible to have too little aliasing?
This but unironically.
16x baybee
aliasing unironically looks good
I usually turn AA off. It's resource-hungry and usually takes things from grainy to blurry, and I would rather live with grainy.
There are two universal truths with 3D vidya:
If you 'need' anti-aliasing, your resolution isn't high enough.
If you 'need' motion blur, your framerate isn't high enough.
this
antialiasing is a meme and only here because of "high-definition" 1080p console garbage
I, for one, am pro-aliasing
>trying to actually talk about games and game technology on Yea Forums
madman
this shit pc should just literally burn
If you think you don't need either then you don't understand their purpose and shouldn't weigh in about how much you think you need them.
This is true with the advent of 4k.
With what Apple terms a "retina display", in other words a screen with a pixel density so high that your visual acuity can't distinguish individual pixels, you really don't need anti-aliasing anymore because your vision isn't sharp enough to see the jaggies anyway. A 4K monitor is pretty much a retina display at normal viewing distances; if you're the type that likes to hunch forward and shove his face a few inches from the screen you might need more than 4K, but for like 99% of people 4K is more than enough.
This wasn't the case with 1080p or 1440p though; both still needed some anti-aliasing at typical monitor viewing distances. You can do a little test to see at what distance your monitor becomes a retina display: designcompaniesranked.com
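If you'd rather skip the site, the math is simple enough to do yourself. A rough sketch assuming the usual 1 arcminute per pixel figure for 20/20 vision (the function name and example monitors are just illustrations):

import math

# Rough "retina distance" calculator, assuming 20/20 vision resolves
# about 1 arcminute (1/60 degree) per pixel. Illustrative numbers.
def retina_distance_inches(diagonal_in, res_w, res_h):
    ppi = math.hypot(res_w, res_h) / diagonal_in    # pixels per inch
    pixel_pitch = 1.0 / ppi                         # inches per pixel
    # distance at which one pixel subtends 1/60 of a degree
    return pixel_pitch / math.tan(math.radians(1.0 / 60.0))

print(retina_distance_inches(27, 3840, 2160))   # 27" 4K    -> ~21 inches
print(retina_distance_inches(27, 2560, 1440))   # 27" 1440p -> ~32 inches
print(retina_distance_inches(24, 1920, 1080))   # 24" 1080p -> ~37 inches

Which lines up with the claim: a 4K monitor goes retina closer than typical viewing distance, while 1080p and 1440p require you to sit further back than most people actually do.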
I liked it when I didn't even know what aliasing was. Now it seems I notice it in all my games.
4k and fxaa
I have a 1440p monitor, and I /always/ turn off any anti-aliasing. I'm aware the hard pixels are there, but it's not to the point it's a problem like it is with 1080p.
Fuck motion blur HARD, no exceptions. I don't want my image to be a blurry mess any time I move. And if someone feels like things are "Too stuttery", get a higher refresh rate monitor.
It is strange isn't it. Before you know about Aliasing you don't give a shit, but as soon as you learn about it suddenly you're okay dropping 20% of your performance just to get rid of it.
Fuck everything, roll back technology a bit and bring back MSAA.
Unironically, I guarantee MSAA is more of a visual improvement than any of the shitty technology that makes it impossible to use.
Get a tape measure out: how big is your screen and how far are your eyes from its center? Maybe it is retina for you.
What exactly do you think polygons are?
If you're running a game at your monitor's native res you don't really need AA.
Ever since the switch from forward to deferred shading in game engines and the subsequent disappearance of MSAA, internally rendering games at a higher resolution and downsampling to the output res is the only method of AA that has looked acceptable to me. All the post process forms of AA just look shit.
Even downsampling can look shit if you're trying to do something dumb like cram 1440p into 1080p and end up with gross downscaling artifacts.
Going to go out on a limb and say that all post process Anti-Aliasing is shit, and I'd only tolerate it if a game has uncapped framerate support, as I'd rather prioritize a high framerate.
That said, 4K supersampling to 1080p eliminates the need of post-AA, and it's better than recent MSAA implementations I've seen.
Couldn't upload the images I wanted onto here, since they're WAY too big.
i.lensdump.com
i.lensdump.com
i.lensdump.com
I'm wishing I got a 2070 or 2080 instead of a 1660Ti so I could run more modern games at UHD. 2160p on a 1080p display looks really fucking nice.
here's a question, what game has the best AA solution? I like the one used in doom 2016, where you can tweak the sharpening to your liking
Vector displays, going raster was a mistake.
I usually sit pretty close to my monitor, unless I have reason to view the entire display at the same time, like with playing certain vidya. It's a 34" ultrawide though, so 'sitting back' is probably farther than the average person, unless the game is ass and pillarboxes.
MSAA is the only antialiasing that doesn't hurt my eyes and look like shit
Serious Sam 3/HD. You get like four different AA options, can adjust them to your liking and put them all on if you want.
protip: if you have an Nvidia card, turn on MFAA in the control panel. It will double the effect of MSAA with little to no performance cost.
it's a temporal solution where it takes the MSAA result of the previous frame and combines it with the next one, giving close to double the MSAA effect
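Conceptually it's something like this. A hand-wavy sketch of the idea only, not Nvidia's actual driver implementation (the offsets are made up):

import numpy as np

# Sketch of the MFAA idea: alternate the 2x MSAA sample pattern each
# frame and blend with the previous frame, so two frames together
# cover something like a 4x pattern.
PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]   # sample offsets, even frames
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]   # different offsets, odd frames

prev_frame = None

def mfaa_frame(frame_index, render_msaa2x):
    # render_msaa2x: callback that renders a frame with the given offsets
    global prev_frame
    pattern = PATTERN_A if frame_index % 2 == 0 else PATTERN_B
    current = render_msaa2x(pattern)
    # temporal resolve: average with the previous frame's samples
    result = current if prev_frame is None else 0.5 * current + 0.5 * prev_frame
    prev_frame = current
    return result

Which is also exactly where the "trail" complaints come from: half of each displayed frame's edge data is from the previous frame.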
Definitely. I'm running a GTX 1060 and while I can do very light 4K gaming, most games I can say goodbye to my framerate (it's really good for screenshots though).
I recently upgraded from the i5-7600k I had for two years to a Ryzen 7 2700X, because I had really bad thermal throttling and bottlenecking issues, but I'd like to get a GPU upgrade too, so I can play current games at really high framerates (for when I get a 240Hz panel), but 4K with DSR for when a game is 60FPS capped.
Played on a Vectrex for the first time the other day. That display was AMAZING. If there were a way to recreate that in full colour, drawing to the entire screen without linearly going line by line, it'd be the most amazing display ever.
tell me more nvidia control panel secrets
Skyrim SE has TAA, which looks far better than going without it. If you get blurry issues just counter them via an ENB or ReShade (if you're on a potato)
AA does more than just get rid of jaggies. You can have the highest resolution in the world, it won't do shit against shimmering and blowouts.
>but eventually you hit a point where
No, you won't. As long as a single raw pixel is capable of resulting in a visual anomaly, it will always occur regardless of resolution. You will need to filter it in some way to eliminate it completely.
that's pretty much the only actually useful performance hack
then there's DSR, which allows you to enable supersampling in games
then there's the display RGB range settings which allows you to fix your color range settings (full range RGB looks noticeably better on most PC displays because they're full range displays)
>Games with high dpi
>dpi
What fucking games do you play?
MSAA is dead. TAA with sharpening is the only thing that even has an effect honestly.
I don't know why people praise SMAA so much, in most modern games it will miss so many edges and it still impacts performance noticeably. Seems to only be worth it in games with simpler polygons or textures like Risk of Rain 2.
Assuming 20/20 vision your monitor becomes a retina when you're 37 or more inches away which is further than most people sit from their monitors.
RoR2 doesn't even have AA, which is odd since they use the Unity Post Stack which comes with TAA and SMAA options.
deferred renderers can implement MSAA actually
Yeah, it's strange. I use Reshade with SMAA though, and it looks nice. Eliminates all jagged edges in that game. Rare example of that happening with SMAA.
Not worth the performance impact. And 1080p games and above don't really need it
In nvidia inspector you can enable high quality upscaling, which will make sub-native resolutions much more bearable.
Nobody uses that term anymore you old coot.
I dislike AA on about anything. I like my jaggies sharp. It's hard to be objective about it, I just dislike artificial smoothness
Literally a de-soulificator.
I think it's part of their artsyle to have flat colors and crispy unfiltered pixels.
literal nostalgia goggles.
It's not bad, but I personally prefer smooth edges and SMAA works well in its case.
Is this bait, or just a retarded OP?
I can assure you I don't ever sit that far back, maybe 25" at absolute most. On average sitting 15" or less assuming I'm not using the entire monitor at that moment.
I think jaggies work on cartoony games but not in shit like RDR2
>it's a temporal solution where it takes the MSAA result of the previous frame and combines it with the next one
thats a bad thing
temporal AA is laggy
will DLSS ever be good? why did they even push such a failure of a feature if it can't work?
You can definitely use MSAA with deferred rendering, but with deferred you need to store the geometry information and material information in buffers to do the lighting afterwards. MSAA means the buffers need to be bigger to account for the extra samples, so the memory usage increases substantially, hence why we don't see MSAA much anymore.
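Some back-of-envelope numbers to show the scaling. The G-buffer layout here is made up but in the right ballpark for a typical deferred renderer:

# Back-of-envelope G-buffer memory cost at 1080p; layout is assumed,
# but typical: four RGBA8 targets (albedo, normal, material, misc).
width, height = 1920, 1080
bytes_per_pixel = 4 * 4
for samples in (1, 2, 4, 8):    # MSAA multiplies every target's storage
    mib = width * height * bytes_per_pixel * samples / 2 ** 20
    print(f"{samples}x MSAA G-buffer: {mib:.0f} MiB")
# 1x ~32 MiB, 4x ~127 MiB, 8x ~253 MiB, and bandwidth scales with it too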
because it's meme technology, AI can fix everything
They have no idea what the fuck they're doing when it comes to software. Did you see their post-processing thing they added to try to be like Reshade? Only works with specific games and enabling anything in it kills performance, even a simple sharpening effect that costs like 1 FPS in Reshade costs 15 FPS in theirs.
it works fine; I did side-by-side video and screenshot tests in GTA V and it improved the image quality hugely. it's literally just MSAA where you change the sampling pattern every frame and accumulate the result of the previous frame, and I couldn't see any noticeable artifacts
having 2xMSAA with MFAA on looked almost as good as 4xMSAA without MFAA, that's a big performance win
DLSS is just a more advanced version of checkerboard rendering, which is primarily used for upscaling games from a lower resolution.
I'd argue the artifacts from DLSS are worse than aliasing. It may be decent on a 4K TV for those who don't have the raw horsepower to run games at native 4K and 60FPS, but it's definitely not a good AA solution.
the artifacts of temporal AA aren't visible in a screenshot obviously, the artifacts appear over time, because it's temporal
Left looks better - it's less blurry, the edge is sharper, color contrast is stronger between one pixel and its neighbour.
It's also less processor intensive.
I'd take checkerboarding on PC though
well if you read again what I wrote you'll see I also did video tests, I also played the game obviously and looked at the results, can try it out yourself though
also artifacts caused by temporal aa can easily show up in screenshots as weird dithering as an example
I thought you said you couldn't notice artifacts?
They appear on fast moving objects like effects
and not to mention it makes the whole game feel laggy because the anti-aliasing is basically a 'trail' of the frames that came before the current one
Checkerboarding is possible in DX12 on the latest Windows 10 versions. I'm not sure which version exactly, but devs can now specify the MSAA sample positions, which lets you implement checkerboarding.
yes I couldn't, that's the point, MFAA worked wonders and gave a good performance boost without any noticeable artifacts
retard. UE4's TAA is top of the class and the best solution right now. MSAA never even applied to transparent effects or props the way a post-processing solution like TAA can.
also, one good way to combat smearing with temporal AA solutions is to use motion vectors for pixels: every frame has a velocity and direction for each pixel, so you can predict where that pixel will be in the next frame. it works fairly well, dunno if MFAA does this
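The reprojection step, roughly. A toy sketch that assumes the motion buffer stores per-pixel offsets pointing from the current frame back to the previous one:

import numpy as np

def temporal_resolve(current, history, motion, alpha=0.9):
    # current/history: (H, W, 3) images; motion: (H, W, 2) pixel offsets
    # from this frame back to where the pixel was last frame.
    H, W = current.shape[:2]
    ys, xs = np.mgrid[0:H, 0:W]
    # reproject: sample history where the pixel used to be, so moving
    # objects blend with themselves instead of smearing
    py = np.clip(np.round(ys + motion[..., 1]).astype(int), 0, H - 1)
    px = np.clip(np.round(xs + motion[..., 0]).astype(int), 0, W - 1)
    reprojected = history[py, px]
    # exponential blend; real implementations also clamp or reject
    # history samples that changed too much (disocclusions etc.)
    return alpha * reprojected + (1 - alpha) * current

Nearest-neighbour lookup here for simplicity; real resolves sample the history bilinearly and do a lot more rejection work.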
MSAAx4 or MSAAx2+FXAA are the only ones that should exist; both options look crisp with almost no performance loss. TAA looks like absolute garbage, kills performance and adds input lag. Fuck modern AA.
we need aliasing acceptance
all edges are beautiful
you'll still never get rid of the laggy, delayed feel of the whole technique
extrapolating information will never give you true image quality. games would be better if they used real AA techniques like they used to, but I suppose console players can't even notice, sitting meters from their screens
I can't fucking stand playing without AA; it's like watching something through shattered glass. "Vaseline" effect be damned, every game at least gets a minimum FXAA tick.
If there's one reason I'm glad TB's dead, it's for butchering the meaning of fidelity
yeah sure, point is that MFAA works really well and I think for most it's easily worth the performance boost
you're looking only at the image quality and failing to notice the temporal lag
I already explained to you that nope, I've thoughtfully compared the results in motion as well as images
have you even tried MFAA? lmao, you seem like you're just blindly against anything temporal and dismiss it by assuming it has to fail miserably before even taking a look at it
Worst explanation ever.
>I've thoughtfully compared the results
lol what the fuck does that even mean
temporal AA gives temporal distortion; that's not an opinion, that's a fact. you might consider the lag insignificant but it's still there
I mean it's better than disgusting looking FXAA but MSAA or SSAA is still the best
>its okay because i said so
Nigga please, not even the same guy but temporal solutions will literally always be worse for true fidelity.
Depends on what anti aliasing is available, and also the performance change.
I explained what I meant, how do you not understand what it means to
1. play the game
2. compare footage
3. compare images
???
literally can try it out yourself which is what I suggested
holy shit with you people
I'll argue that pretty much anyone would prefer mfaa over lower quality msaa if your frame rate stays the same
Some form of AA is necessary, anything that isn't 2D looks like shit without it. But I don't know what all the different algorithms are and which is best.
He's just a Unity user bullshitting
>8 light limit in forward rendering
>built-in MSAA disables if you switch to deferred
That said he's not 100% wrong, Star Citizen devs have talked extensively about the problems with Deferred / AA solutions.
>do some games look better without it?
A lot do actually. You can buy modded N64s that remove that gross smearing filter the N64 inherently uses and sharpen everything. Makes the games look beautiful.
But you can no longer force it via GPU driver settings, so it's shit.
Your opinion doesn't mean anything
temporal anti-aliasing adds temporal artifacts, that's just a fact
MSAA on its own removes no information from the image
yeah the performance impact is not worth it
If he thinks forward rendering has an 8 light limit he's not a Unity user; things haven't been like that since the early 2000s
I could swear that when I was playing on CRTs as a kid, aliasing didn't show on the screen. Am I crazy?
because CRTs were blurry, at least TVs were
when did I say anything other than it literally being good because I can't even notice it and it improves the AA hugely with no impact to performance? do I write like a person who has no idea about how temporally accumulating shit works in games? I even showed that I at least know enough that I can list different methods for combating artifacts
I'll argue that pretty much everyone would prefer the MFAA result over the pure MSAA result at the same framerate
Horizontal lines might blur a bit depending on how sharp the cathode ray going across the screen isn't.
Only when using composite video. Component/RGB looks incredibly sharp, but because CRTs aren't fixed-pixel/resolution, low-res sprite art and polygonal games look infinitely better.
>I'll argue that pretty much everyone would prefer the MFAA result over the pure MSAA result
I don't because I like games that are responsive and don't feel laggy
I'll admit it's good for slow-paced games like RPGs
scanlines and shadowmasks are good at masking aliasing
it's different from anti aliasing and it's why some prefer CRTs for old games (and also because they have basically no lag)
3D games: always on
2D games: usually off, maybe on if graphics are hand drawn...? usually off though
sprite-based/pixel art games: always, fucking always off
CRTs don't have square pixels, that's why low res content looks better on them.
I think Skyrim looks worse with TAA. Too blurry.
PSA: AA is only needed on lower res. Above 1440p you don't need AA at all. The high pixel count is basically what AA is simulating anyway.
>when did I say anything other
Once agan not the same guy but when I chimed in here
>temporal solutions will be worse for true fidelity
You responded with
>but i said stuff
If you cant handle multiple anons talking at you maybe go somewhere else
>it's literally just MSAA
>except for all the reasons its not at fucking all
If you want TRUE FIDELITY do not use temporal data. Thats it.
Faggot
>Above 1440p you don't need AA at all
just because you have bad eyesight doesnt mean everyone else does
bitch please, I really don't understand why this is even any sort of argument when all I am saying is that it's a very real way to improve AA quality without a performance impact. you're the people starting with some true fidelity argument, while all I have said so far is that the quality is very good
this true fidelity shit is something you people introduced for whatever reason
I feel like this is a really newb question, but do you guys only enable AA on the nvidia control panel, and leave it off in game? or turn it on in game?
QCSAA>MSAA>SMAA>anything else
>my opinion is fact
You can stop now
LMAO. bro, have your eyes checked
mfaa doesn't feel laggy at all, though I guess you'll play with no MSAA instead since that'll improve your frame rate and reduce your input lag
again I literally told you it's my view and suggested you can try it out yourself
I'll say for like the third time though, I bet pretty much anyone would prefer the MFAA result over regular MSAA at the same framerate
Aliased art is god tier.
>and suggested you can try it out yourself
Why do you keep thinking I havent?
Just because something is "good enough" for you doesnt mean others will just fall in line and not have differing views. MFAA has factual downsides that are not worth it for me compared to other solutions. Thats it.
Fuck outta here with that shit m8
SMAA T2x (SMAA w/ Temporal AA) is the best anti-aliasing solution for modern day vidya.
I don't know if you have but you keep implying I am trying to push some objective truth and as if I am denying that it's a temporal solution even though I explained from the beginning how it works
this is such a stupid thing to argue about and that's why I love it
I'll think it's more than good enough for pretty much anyone too
if I had some game installed with MSAA I could do some comparisons to post here for everyone to see
Most games have anti-aliasing settings, though a lot of them are only on/off
>mfaa doesn't feel laggy at all
feels laggy to me
>oblivion can't turn on AA when HDR is used
why?
how are you gonna post comparisons of a technique where the downside is input lag on an image board, genius
okay, but it introduces no input lag, and any potential smearing artifacts are pretty much confined to the edge areas due to the nature of how MSAA works
deferred rendering
it can, just needs to be forced
Ahh yes, play at 4K and supersample it. Can't wait to get 12 fps on a $10,000 rig.
having 4x msaa has more input lag than 2x mfaa due to it having a bigger performance impact
it doesn't introduce lag from your inputs to the game I guess but it certainly introduces lag from your inputs to the rendered image seeing it's interpolating based on past images
Where did I disagree with that?
I just think you're being a pretentious ass.
Saying shit like "its literally just msaa" and then listing how it isnt is stupid.
Claiming that it's "good enough" because modern consumers are used to potato tier hardware that fakes its resolutions is also stupid.
Dont settle for less nig.
Strive for better.
Otherwise we wouldnt have even had temporal solutions to get around hardware limitations to begin with.
Now we should be striving for solutions that dont cause all the problems that come with using temporal data rather than fellating workarounds.
is it possible to learn this power?
2D games usually have no anti-aliasing...
sure, but the interpolation is only on the edges and very subtle. I mean, wouldn't you agree that getting equivalent quality through MSAA would introduce higher input lag due to the lower frame rate?
Deferred rendering is old news anyways. Clustered tiled forward is the new hotness.
Get a higher res monitor peasant
Maybe on older cards, are you still on the 9 series or something?
>but the interpolation is only on the edges and very subtle
no it's on the entire image, that's how temporal anti-aliasing works
it's not input lag, it's output lag
in academia they do, when contrasting it with ray/path tracing
pcgamingwiki.com
>Anti-aliasing (AA)
well you're annoyed over semantics but you know I have been clear on how it works
MFAA is so efficient, with pretty much no perceivable differences (to me, to be clear), that I'll take it any day. though no games use MSAA anymore, so it's dead
MSAA has a huge impact in GTA V for example, which is why I played with MFAA in the first place on a gtx 1070
it's only accumulating on the sampling pattern which is applied on the edges of polygons
Jaggies are the very definition of soul you niggardly mongoloid
>but you know I have been clear on how it works
Aside from all the times you werent sure.
Faggot
well I did explain how it works from the beginning so :)
there's other more important stuff, like texture detail up close and tessellation/LOD fuckery. AA is not worth the performance penalty (for my midrange rig at least)
>literally msaa
>except for all these reasons its not
You should sell cars.
tried supersampling in yakuza kiwami. 8x looks so pretty holy shit fuck.
>it's only accumulating on the sampling pattern which is applied on the edges of polygons
yes and? are you implying that's insignificant? because it isnt
rasterization is a term used by everyone what the fuck are you talking about
the other poster is at least sane but you're just some jaded asshole who cherry picks parts of sentences for some lame ass vidya argument
in the very same post I also explained how it combines the results from different frames temporally
i have no doubt he's already selling cars. he could also be selling skyrim
fxaa is disgusting
msaa is cool
>AA is worthless
>tessellation is important
you're retarded
>yes and? are you implying that's insignificant? because it isnt
umm, I explained it to you because you just claimed it's applied on the entire image... just a technical clarification of your misunderstanding of how it works
that's what the entire image means you fucking dumbass, stop pretending to be smart
MFW Battlefront 2 forces you to use TAA
>jaded asshole
>just agree with the retard he cant be wrong he used big words
Yeah because we already covered the actual flaws previously.
If you came to a fucking graphics thread and didnt expect this you are fucking retarded.
That's completely wrong. At low resolution AA looks like shit because you notice much more how it's just blurring the pixels. Above 1440p is when post-process AA actually starts to be worth it and FXAA stops being blurry shit.
wrong and right about what? I explained what it is and you're the one going with this argument over how it's not a ground truth solution, even though I never said it is
are you trolling now? I'd say it's different having temporally accumulating information on every pixel versus having temporally accumulating information on some pixels
Are there any recent games that still use MSAA (and therefore support MFAA)?
>those zoomer turbohipsters who play old 2D games with sharp pixels blown up on a modern LCD
>they sincerely believe "that's what it was meant to look like"
I thought you meant the edges of the screen, seeing it's a screen space effect
the edges of polygons are the most significant visual information in the image; it's misleading to say it's subtle just because it only affects polygon edges
probably but I can't think of any big ones
>even though I never said it is
>literally msaa
>it works fine
>i couldnt see
>double the effect of msaa
All while conveniently dismissing any and all criticisms towards your favorite AA solution
No its a dead technique for many reasons.
>No its a dead technique
It's not dead, it's just less viable, hopefully becoming more viable now that people are moving away from deferred rendering
>Not keeping a CRT connected at all times explicitly to play old vidya on.
I played Dragon Quest XI with Downsampling, TXAA, FXAA and SMAA and it looked glorious.
I don't know about that, but now it seems you understand what I meant. many other temporal solutions would apply on everything, including texture detail; MFAA applies wherever MSAA applies
what am I dismissing? literally what? I am just saying how it works and you're the one arguing over some ground truth shit even though I have not made any efforts refuting that it's temporal, in fact that's what my explanation on how it works is
I think I just left the AA off in that game.
>many other temporal solutions would apply on everything
No they wouldnt. Now you are just making shit up.
The visual information in a scene of a modern game is filled to the brim with polygon edges. Trying to pretend it's subtle is just wrong.
>literally what?
All the shit the anons have brought up.
>its laggy
>nuh uh
>it doesnt look as good as other solutions
>nuh uh its good enuf u troll
Fuck off
And yes I'm paraphrasing for dramatic effect
Couldn't handle the shimmering in the grass so I had to force TXAA with engine configs, good thing about the game being in UE4 is you can enable a lot of shit that the developers didn't intend to use.
>No they wouldn't. Now you are just making shit up.
thanks for genuinely showing you don't even know the subject at hand
I thought it was you who wanted to clarify how it's just an opinion if it looks better or not, to which I have conceded many times already, adding that I bet most people would take the MFAA result any day at the same framerate
what's even more funny is that first you claimed it impacts the entire image and now somehow it's preposterous to say other solutions can impact the entire image
>Trying to pretend it's subtle is just wrong.
well, seeing how the result looks very close to just having the higher MSAA, I don't think so
When you have the resolution high enough you don't even notice.
>back in the day you had only one antialiasing option
>now you have supertxAAhyperultrawhatthefuck options
how do I know which one is better?
guru3d.com
here's a good article about MFAA
>thanks for genuinely showing you don't even know the subject at hand
Honestly I misread and didn't notice you said temporal solutions; I thought you meant AA in general throughout the ages, so my bad.
>what's even more funny is that first you claimed it impacts the entire image
Not the same guy user.
Like I said, if you can't handle multiple people calling out your shit, post somewhere else.
>well, seeing how the result looks very close
>to me
Just admit that you are able to deal with "good enough"
it's literally just easy to pretend you're all one person because it changes nothing
>Just admit that you are able to deal with "good enough"
I have in pretty much every single post said that you fucking retard, I also bet pretty much anyone would take it at the same framerate and I think barely anyone would be able to tell the difference as well
Basically all the new ones suck and are kinda terrible, but their purpose is filling the gap left behind now that we can't use MSAA with deferred rendering.
Okay.
All I have said is that its not a prefect solution.
That we shouldnt settle for less.
Whats wrong with that
Faggot
How did the DS, a handheld from 2004, have such god-tier anti-aliasing while most modern AAs are blurry shite?
I always turn it off. Same with motion blur
2xMFAA and 4xMSAA look close to identical and one is significantly faster than the other
I mean might as well ask why should we settle for MSAA at all if SSAA exists?
Because super sampling is a meme.
Fuck you, pro-aliasing fag.
>why should we settle for MSAA at all if SSAA exists?
SSAA is too slow
well it's multisampling applied on every pixel so I don't see why it's inferior to msaa
>new solutions dont come up
I dont like you or your brain.
Before we didnt even have temporal solutions to get around hardware limitations to begin with.
Now we should be striving for solutions that dont cause all the problems that come with using temporal data.
Its called pushing technology forward not using shit thats literally outdated already because its "good enough"
That screenshot most definitely has the ingame anti aliasing setting cranked up. But yeah of course downsampling is the best solution, game is just demanding as fuck and it's hard to get 60fps in it.
>if SSAA
Way too heavy and ends up blurring shit too much.
>and one is significantly faster than the other
No its not.
Now I know you're trolling. SSAA is literally just a fancy word to say the game is rendered at a higher resolution and scaled down to fit the screen.
well there you go, also why I took 2xMFAA over 4xMSAA in GTA V, had a huge performance boost and I couldn't even tell the difference
SSAA isn't a temporal solution after all :)
yes it is, it allowed me to maintain over 60fps in gta v
blurring can be adjusted fairly precisely, hence why temporal supersampling is used in many games instead...
>ends up blurring shit too much.
are you retarded
SSAA is literally perfect anti-aliasing, it's just too slow to use, mostly
SSAA is the ultimate AA solution though, why settle for less? :^)
>That screenshot most definitely has the ingame anti aliasing s
Nope, I was just playing at 8K resolution. Can't upload the whole picture because of the filesize limit; had to resize it down to 4K.
not him, but while you're right, SSAA looks janky if you're supersampling from odd resolutions; ideally you need to go from a 4x higher res
>SSAA isn't a temporal solution after all :)
>SSAA is the ultimate AA solution though, why settle for less? :^)
>if i ignore previous parts of the conversation i win
Never change Yea Forums
irony
I just imitated your retarded ass
but please do clarify what you wanted to say
>yes it is
>literally works on my machine tier reasoning
Its not a HUGE gain.
Its modest at best in most games.
>SSAA looks janky if you're supersampling from odd resolutions
yeah and my pants dont fit if i put them on backwards but why would i ever do that
You were playing in 8k or using Ansel super sampling to take screenshots? What fps were you getting? Specs?
literally just make diagonal pixels already
well good thing I wrote significant :)
but it close to doubles your MSAA performance basically
if MSAA has a big impact on your performance then it has a big impact; GTA V is a good example of such a game
Try reading, Im done repeating myself from 50 posts ago
1080ti got like 50fps average in 8k without any AA on.
because 4xSSAA is way too expensive for most and there's the option to sample like 1440p to 1080p, which looks somewhat better but doesn't map cleanly
haha, you lack self-awareness. at least I bother reiterating myself; I think you just missed the point I made with my SSAA remark
>but it close to doubles your MSAA performance basically
kek try again
25% to 30% isnt even close to 100%
why are you two retards arguing over MFAA which is pretty much a dead technology that never even gained any popularity?
that's a chart for game performance, not MSAA performance, brainlet. did you just not read what you replied to??
also read the article you took that image from :)
For the THIRD TIME then
>Before we didnt even have temporal solutions to get around hardware limitations to begin with.
>Now we should be striving for solutions that dont cause all the problems that come with using temporal data.
It WAS good that we had temporal AA to work around the ever increasing demand games have been pushing but now we have more than enough power to work on much better solutions that dont require shit tier workarounds.
SSAA is not a temporal solution, it's the best possible aa solution you can come up with if you're not thinking about performance
>i didnt mean in-game performance just theoretical performance
So youre a liar.
Cool
shit, man. I have a 1080 and couldn't get full 4k. Maybe it's better now that I've updated my cpu.
MSAA is the process of only supersampling coverage and depth; shading still runs once per pixel. There's nothing intelligent about it, but at least you don't get pixely edges, although depending on your shader you'll still get pixely surfaces.
game?
no, MSAA plays a part in your game performance. it literally allowed me to have very few jaggies in GTA V while maintaining 60fps on ultra on a GTX 1070
Yeah, people praise MSAA too much. it'd kill the polygonal edges, but with all the shaders and effects going on in games it did nothing to kill specular aliasing and shit like that, and it looked horrible.
>still talking about muh SSAA when I said this shit literally an hour ago
>it's the best possible aa solution you can come up with if you're not thinking about performance
Which no one fucking said you absolute retard.
You are literally arguing that progression in AA solutions is bad at this point.
are you retarded?
How long have you been on 4channel?
>no one said
I just did...
you're the one asking why settle for less when better solution exist
my point being that the better solution might be more demanding and not worth the tradeoff
>people praise MSAA too much
it's the best anti-aliasing solution
aliasing on textures can be fixed by other means
Yeah, no shit. We're adopting 4K resolutions and already looking beyond; we're almost at pixel densities beyond what humans can discern.
Antialiasing as a whole is going the way of the dodo, not much point wasting any more time developing that stuff.
what? AA techniques are constantly being developed
Might be down to the difference in vram since everything else about the cards is so similar.
The process of antialiasing requires that a signal (image) be rendered at a high resolution and then filtered to remove any frequency content that cannot be expressed at the lower target resolution before it is downsampled. This is the technical definition, based on Shannon's sampling theory. Supersampling (distinct from super resolution) is therefore the only true form of AA. Everything else is post-aliasing trickery that tries and fails to hide aliasing after the fact. FXAA is the most bullshit, because it doesn't even blur evenly; it uses edge detection to blur things that look like they would be jagged due to aliasing, but generally all it does is give you slightly blurry jaggies instead of regular jaggies.
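In code terms the render-filter-decimate chain is just this. A toy sketch using a box filter, the crudest acceptable low-pass (real resolves often use nicer kernels):

import numpy as np

def ssaa_downsample(hi_res, factor=2):
    # hi_res: (H * factor, W * factor, 3) image rendered above target res.
    H, W = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    # average each factor x factor block: filter and decimate in one step
    return hi_res.reshape(H, factor, W, factor, 3).mean(axis=(1, 3))

hi = np.random.rand(2160, 3840, 3)    # stand-in for a 4K render
lo = ssaa_downsample(hi, factor=2)    # clean 1080p output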
>we're almost at pixel densities beyond what humans can discern.
do pixels become so small you can't actually discern aliasing? is that what it's like on phones with high resolutions?
>my point being that the better solution might be more demanding and not worth the tradeoff
To you.
People that complain about performance limitations usually dont even have 144hz monitors.
The funny thing is that I have said it's been opinions for over half an hour, but you're apparently way too invested to let anyone else live how they want.
And thats something I just wont stand(sit) for
yes. also what the Apple iPhone retina display was all about: at a certain distance you can't see the pixels with an average human eye
Pretty much, phones have already passed that point. There are people who claim they can tell apart 1440p from 4K on a 5 inch screen, but we call those people 'liars'.
>Antialiasing as a whole is going the way of the dodo
>has no idea how game developement works anymore
You should take a look at what games are these days, since the PS4 took charge.
I wish I had a phone to test but I'm doubtful, I'm sure phones still use anti-aliasing and you'd be able to notice it on things like fonts if it wasnt there
to me and pretty much anyone I bet, including those who argued to me about lag because lowering performance is a sure way to introduce lag
4xSSAA is basically the same as rendering the game at 4x the resolution
it was like an arms length from your face or something with the original retina display
but look at those modern high res amoled screens on android and you'll realize it's actually really hard to see pixels, games might still have jaggies on those though because they often render at like 720p
Adding blur to something sharp is retarded.
And they will all be outclassed by new AA techniques that dont use flawed premises to begin with.
Thats how technology works user
>to me and pretty much anyone I bet
>literally im right you wrong
Good post!
also I'll add that I laughed because I've been very clear and I have many times clarified that I recognize the temporal nature of MFAA, it's literally you who's fixated on this imaginary agenda of pushing something as objective truth, lmao
SSAA is the perfect solution, you can't match it with another ground truth solution
>they will all be outclassed by new AA techniques
there isn't that much room for new AA techniques, we've done alot with fake extrapolating AA already and it's likely to move back towards true AA like super/multisampling as technology gets better
well, seeing how even the best GPUs out there would struggle to hit 60fps with proper SSAA, I guess so
also, say you own a 2080 Ti and are interested in image quality over everything, to the extent where you're willing to quarter your performance for it: you probably then also own a 4K monitor
>pushing something as objective truth
Because I merely stated its drawbacks.
Stop acting like they dont exist for the sake of proving yourself right.
>my opinion better than yours
Faggot
>he thinks he can predict technology
Now that is funny.
I have like TEN FUCKING TIMES stated how it works and not denied the temporal nature of mfaa, you are genuinely retarded
you've also made it clear that you don't even really grasp how these things work in the first place, you're just arguing over some fee fees
>still straw-manning with muh ssaa
Good post!
how so? how about you backpedal or something
>flawed premises
Sampling theory isn't a flawed premise. SSAA is the best one can do with regards to AA because it's the only technique that actually is antialiasing. The problem is it's too computationally costly to use in games that push current hardware to its limit.
And I have stated TEN FUCKING TIMES how its literally down to opinions on trade offs.
Why cant you handle others views?
Faggot
advanced fake AA techniques only exist because computers arent powerful enough to do proper AA. They're going to be powerful enough to do it soon enough
yeah but I have been telling you that I know, how fucking dumb are you if this is genuinely the feeling based arguing you're going with? I guess you're just incredibly shit at reading comprehension
Life isn't rendered in pixels, smooth edges are completely normal.
I never once brought up ssaa.
You decided to prop up your argument with it as the example of what I consider "true fidelity", when that's the furthest thing from what I was arguing, which was that using temporal data is factually an inferior method of AA and always will be.
There's a reason that MSAA is still what most people want even though it also has tons of drawbacks.
The AA in REmake 2 was a fucking crime against humanity. I've never seen a worse implementation.
I wish I could say yes AA is good but I just can't. Developers today think that reducing aliasing means rubbing vaseline on the lens.
>it also has tons of drawbacks.
its slow
one drawback
>yeah but I have been telling you that I know
And then following that up with shitty arguments about how you're so much smarter than everyone in the thread, even though several anons have proven otherwise.
>literally cant read
>again
I said msaa in that sentence
Faggot
I brought it up when you reiterated your "why settle for less" shit
you're arguing about something literally no one is arguing against
lmao now you're projecting some weird insecurity because how have I done any of what you claim?
>I brought it up when you reiterated your "why settle for less" shit
>that means just use the least efficient thing possible
Literally the opposite of what I said.
Settle for less means in image quality, performance, usability and more.
Fucking hell m8
yes hence MFAA is very useful, you're the one wondering why on earth would I "settle for less", which is why I started imitating your retarded arguing by bringing up SSAA as an example
>claims his opinion is what "everyone else would prefer"
>dismisses any arguments against his view without even trying
>this all started because he wouldn't budge on his inability to see temporal artifacts
hrmmmm
huh
what are the drawbacks to MSAA apart from it being slow?
Thats literally your opinion.
I dont settle for temporal solutions when other shit works just fine and still achieves 60fps.
It literally can't deal with certain objects.
That and the fact that it doesnt work on everything in the scene.
false narrative, let go of your constructed narrative and move on so you can actually advance this shit if you want to
I have many times reiterated that I think most people would take the MFAA result over the MSAA result at the same framerate
yes but I have been very clear about that like twenty times now
MFAA allows better performance; in GTA V it allowed me to maintain 60fps at ultra without significant jaggies, which 4x MSAA would not have allowed
Yeah, because recent AA techniques like Temporal "Just fuck my screen up" Aliasing are so much better than techniques we had ten years ago.
well, it doesn't handle alpha, but it doesn't need to, since that anti-aliases itself and it's a separate pass anyway
>false narrative
Lets see
>claims his opinion is what "everyone else would prefer"
>I also bet pretty much anyone would take it
>dismisses any arguments against his view without even trying
You did it with the visual lag several times and the fact that its still just msaa which has drawbacks.
>this all started because he wouldn't budge on his inability to see temporal artifacts
>because I can't even notice it
Looks like you dont have a LEG to stand on!
>didnt even follow the clusterfuck of a thread
I dont blame you
I was saying that temporal solutions were a workaround for all the games with god awful performance on modern hardware.
Now that hardware is getting past the need for shit tier workarounds we can expect new techniques to leverage all this potential in surprising ways.
how do I dismiss something I am not arguing against???
I just pointed out that if you care about lag you probably disable AA altogether because frame rate is sure to have a bigger impact
you're dumb beyond belief. first you start by saying I am making some superseding claims about shit, and then you end up wondering why I say it's my experience. so fucking stupid
I'm just saying it looks very good to me and I bet most people would take it any day, you're being delusional over it
>I just pointed out that if you care about lag you probably disable AA altogether because frame rate is sure to have a bigger impact
Not if you play at 60fps.
Only csfags try and pretend that 120-144 matters
>now that hardware is getting past the need for shit tier workarounds we can expect new techniques to leverage all this potential
no we can expect us to go back to the old, already existing perfect AA techniques
all the extremely complicated AA techniques we use now are just fakes because we can't do the real thing
>and I bet most people would take it any day
>has never been over to /g/
Its called the pc MASTER race for a reason.
Elitist as fuck about the most minuscule things.
richard stallman larping zoomers are not most people thankfully
Not going to happen because the graffix and performance requirements are still going to increase a shit load next gen.
Do you really think youll be able to use those outdated expensive ass techniques on skypunk2077?
I don't know; I only had a 60Hz monitor, but I swear that playing at 240fps in CSGO felt WAY smoother than 60fps when it comes to responsiveness
Well you are on Yea Forums
How did it feel smoother?
The only way you can tell is through the visuals.
Your mouse isnt tactile
you should def try it if you can because it's not even close to placebo I swear
I guess it has something to do with the fact that the game can match your inputs with the final image more precisely; for example, at 240fps it can take my input very close to the point where it renders an image to the screen
anyway seriously I swear it's very obvious
>outdated
That's like saying 2+2=4 is outdated
supersampling is perfect anti-aliasing, literally
when graphics pipelines are big enough it becomes more viable because it greatly reduces the amount of passes you need to do and the amount of pixel reconstruction you need to do
it's kind of like ray tracing, it's slower but it's simple and perfect, tech just needs to reach the point where it's viable
although supersampling is much closer to viable
>then there's the display RGB range settings which allows you to fix your color range settings (full range RGB looks noticeably better on most PC displays because they're full range displays)
Where can I find it?
layout should be the same in english
thanks
That's basically it. With dumb vsync at 60Hz, the faster your machine, the longer it is between input and the next frame. You generally can't do anything about a game that doesn't use adaptive triple buffering, so the simplest fix is to run unlocked and minimise the time between input and frame display.
Also, CSGO has minimal latency to begin with and no motion blurring which will increase the perceived "snappiness".
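Rough arithmetic behind that, assuming a simple model where input is sampled at the start of a frame and one more frame sits queued for display (real pipelines add more buffering on top):

# Rough input-to-photon arithmetic under an assumed one-frame-queued
# model; real drivers and displays add more buffering on top.
def worst_case_latency_ms(fps, queued_frames=1):
    frame_ms = 1000.0 / fps
    # one frame to render the input + queued frames waiting for display
    return frame_ms + queued_frames * frame_ms

print(worst_case_latency_ms(60))     # ~33 ms
print(worst_case_latency_ms(240))    # ~8 ms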
You mean it would be if games weren't still increasing the poly count and other parts at ridiculous rates each gen.
Yes things look like they stagnated because we're at the tail end of a generation.
That won't last
What are the improvements that clustered tiled forward is supposed to bring? The whole point of deferred is that point lighting is effectively free.
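as I understand the pitch: you keep forward shading (so MSAA and transparency still work) but stop paying for lights that can't touch a pixel, by binning lights into screen tiles first. Rough sketch of the binning step only, screen-space tiles without the depth slices, and the light format here is made up for illustration:

def bin_lights(lights, width, height, tile=16):
    # lights: dicts with screen-space 'x', 'y', 'radius' in pixels
    # (assumes projection already happened -- illustrative format only)
    tiles_x = (width + tile - 1) // tile
    tiles_y = (height + tile - 1) // tile
    bins = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for i, l in enumerate(lights):
        x0 = max(0, int(l["x"] - l["radius"]) // tile)
        x1 = min(tiles_x - 1, int(l["x"] + l["radius"]) // tile)
        y0 = max(0, int(l["y"] - l["radius"]) // tile)
        y1 = min(tiles_y - 1, int(l["y"] + l["radius"]) // tile)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                bins[ty][tx].append(i)  # pixels later shade only their tile's list
    return bins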
yeah sure and it's surprisingly noticeable too
Things look like they have stagnated because they have stagnated, graphics cards are making diminishing returns
Improving polycounts hasn't been a thing for ten years; we can already render more than enough polygons. Post-processing, lighting and shadowing are the expensive parts
There's only so far you can go with fake AA algorithms. They're already quite sophisticated and delving into meme territory like neural networks, but extrapolating information will never be perfect; you can feed it more data to make it more accurate, but at that point you're basically halfway to doing SSAA anyway
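to illustrate, every post-process AA is at heart something like this (toy sketch, nowhere near real FXAA/SMAA, threshold picked arbitrarily):

import numpy as np

def toy_post_aa(img, threshold=0.1):
    # guess edges from the finished image alone via luma contrast, then blend
    # suspects with their neighbours -- it can only smear existing pixels,
    # it can never recover the true edge coverage a higher-res render has
    luma = img @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 weights
    out = img.copy()
    h, w = luma.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            hood = (luma[y-1, x], luma[y+1, x], luma[y, x-1], luma[y, x+1], luma[y, x])
            if max(hood) - min(hood) > threshold:    # treat as an edge pixel
                out[y, x] = (img[y-1, x] + img[y+1, x] +
                             img[y, x-1] + img[y, x+1]) / 4
    return out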
>tfw have two cheap shitty monitors and my second 144hz monitor doesn't support newer HDMI so I'm stuck with one at only 60hz, both with uncalibrated colors because I have no idea how to even set that stuff up
>Improving polycounts hasn't been a thing for ten years
You realize that we are just following console graphics right? And that they have increased a SHIT LOAD on character models alone just from the PS4 launch titles to what is coming out now.
No game has even tried to use all the power of a capable modern PC because it literally wouldn't sell.
I'm a 30yo boomer so I always disable AA cause it helps me spot shit faster in certain larger-scale or twitch multiplayer games. My eyesight just isn't the same anymore.
Consoles are behind PCs mostly in memory, not raw rendering power. The difference is much smaller than you think. Console vs PC usually just means more detail in the distance, not significantly better rendering. Same with polygons, cards can draw a shitload of polygons, the detail on screen is capped by other things like memory and animation, not poly count
more like CPU, but capping to 30fps helps with that
CPU has the least to do with it, games use CPU the least. It's all about GPU power and graphics memory speed/bandwidth/whatever
nah man games max out quad core i5s all the time, all them drawcalls and shieet
Not really, anon.
A ton of modern games look like butt on console and can't even run at 1440p on the high-end versions of the consoles half the time.
You clearly don't understand consoles then.
They underclock the CPU a shit load compared to retail parts
drawcalls are what the GPU executes dumbass
just because it's "maxing out your cores" doesn't necessarily mean it's doing anything useful with them
The only games that demand anything of your CPU are crappy looking games like Dwarf Fortress or Hearts of Iron
>They underclock the CPU a shit load compared to retail parts
That's because the CPU speed matters the least
>drawcalls are what the GPU executes dumbass
lmao at you, what do you think drawcalls are? it's the CPU telling the GPU to draw shit
That's not true
yes and for the CPU it's literally one command, for the GPU it's a huge batch of work. Drawcalls are what the GPU executes
That's exactly what Star Citizen was selling itself on, among other things. Who knows if that'll ever actually release for anyone to push that though.
That's misleading and you know it.
It's because of the form factor and TDP, buddy.
They would overclock the shit out of consoles if they could, but they would literally catch fire even running at stock frequencies
you issue thousands of drawcalls every frame in games, it's one of the reasons why vulkan is even a thing, to reduce the CPU overhead
That, and overclocking isn't good for stability or longevity, two things a console designer has to take some consideration for.
>you issue thousands of drawcalls every frame in games
and? You add millions of numbers together each frame. A thousand of anything is peanuts to any computer. The GPU is doing the work. If I tell you to run 10 miles, you're doing the work by running, not me for telling you to do it. CPUs have it easy for the average AAA game with 10 guys on screen at once
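the numbers here are completely made up, but the shape of the argument is easy to sketch:

# hypothetical per-drawcall CPU costs -- real figures depend on driver,
# state changes, etc. The point is only how they multiply out per frame.
PER_CALL_US = {"dx11-ish": 3.0, "vulkan-ish": 0.3}

for calls in (1_000, 5_000, 20_000):
    for api, cost_us in PER_CALL_US.items():
        print(f"{calls:>6} calls, {api:>10}: {calls * cost_us / 1000:6.2f} ms CPU submit time")

so both posts are sort of right: a thousand calls is peanuts, but twenty thousand on an old API can eat a whole 16.6ms frame budget, which is exactly the overhead Vulkan was built to cut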
Well I was using it as an exaggerated example.
The point is that they can't even run stock clocks.
well most console games are limited to 30fps due to a CPU bottleneck
basically the CPU has to figure out where everything should be drawn, which involves a lot of math in the end when you get a bunch of dynamic shit and different materials
but yeah it depends on the game, open world games are more CPU hungry etc.
>CPUs have it easy for the average AAA game
kek
You haven't played many modern games
also CPUs don't have thousands of cores
No they're limited by the GPU and memory bandwidth, all games are. Slow CPU doesn't help, but it's not a bottleneck
Oh you want to tell me how hard it is to animate and process AI for the literally 5-15 enemies there are on screen for a typical AAA game?
I still turn it off sometimes because I like the sharpness. At any rate, downsampling is far superior to AA.
well substantiate your claim I guess
who are you and what knowledge do you actually possess when it comes to the subject?
>implying that's all the CPU does in a game
kek
I doubt many of those games would run at 60fps if you paired the same CPU with better hardware
I thought I did. Not much is actually happening on the screen in most games, if you'd notice. Games are extremely complex of course and that demands a lot from the processor, but the things that actually scale and start demanding more and more from your processor are the amount of simulation happening and the amount of characters in your world moving around doing things, and if anything that's gotten LOWER for most games so they can render them as beautifully as they can
They literally would
>and if anything that's gotten LOWER for most games
It really hasn't, anon.
Hubs have more NPCs milling about than ever.
Crowds in games are denser than ever.
Even shooters thanks to BR have to account for many more players than the usual 16 players of last gen.
Firefights have only gotten larger scale in game as well with every shooter being mmo-lite with raids and other shit.
What are you comparing to, because it's clearly not the 360 gen
the CPU also has to handle streaming of information and many games do shit like culling on the CPU side, you have to handle worldspace shit as well as viewspace depending on what you're dealing with
but you're just claiming it's trivial without actually explaining why, I have to wonder how you tell the difference between it being trivial enough for some 8-core 1.6GHz laptop CPU or for example a quad core 3.5GHz desktop CPU
so many remasters are 30fps though
You think Pubg invented 100 player servers? Look at an MMO 15 years before that
Some games have crowds and that's a good use of hardware, most don't though
Games have mostly gone down in scale
64 player HL servers were common, I'm pretty sure they went up to 128 players too
Have SSAO.
If it's a modern game I just slap FXAA on it and call it a day
older games I'll crank up MSAA since I have more leeway with performance power
you are actually retarded. The CPU handles anything and everything that isn't "crunch pixels". And even then, the CPU determines what information gets sent to the GPU.
You said most games.
Those are only a few examples, not the standard.
Most games have increased in scale and complexity significantly.
Those games also had nowhere near the amount of shit happening in them or the number of props used to decorate maps as, say, something like BF5 on a big vehicle map.
I dunno it's hard to hit 144fps generally largely due to CPU limitations
>but you're just claiming it's trivial without actually explaining why
Do you want me to write you a book? How detailed an explanation do you need? You tell the difference by looking at what the game is doing
I mean in many games you can lower your resolution all you want but your frame rate is capped by your CPU
based non-sequitur
well I'm not convinced of that, something like GTA V is CPU heavy for sure
Half Life was the most popular online shooter for the better part of a decade
Same with WoW
Are you a console exclusive player? I'd understand if you were. Games have certainly increased in complexity, and maybe scale as in world size, but not CPU complexity, i.e. the amount of things happening on screen at the same time
Yes. Some games look awful without AA, like FO4, which already looks bad, but it looks even worse without AA, see for yourself if you gave Todd your money.
But in most cases, I play with AA off, or just the default Nvidia FXAA turned on, which is pretty 'okay', if blurry. My eyes have adjusted to all the blur in modern games, so when I turn off AA and see how much detail was being lost, it blows my mind.
Those are not the standard, anon.
Most PC games were single-player.
I get amazing FPS on GTA V with my mediocre computer
>hasn't played newer bethesda titles
It eats CPU alive, doesn't matter how low the graphics are because muh simulation.
Why would you rather play 8k with 50FPS than 4k with 60 or higher?
The worse framerate has a much bigger effect on visual fidelity (unless you're just taking screenshots)
Is it modern?
I dipped below 60fps with a 4690k on highways, to almost 30fps with increased distance scaling, CPU temps went high and utilization maxed on all cores
You probably didn't max it out either, right?
Is there an easy way to force games into 1440p and downsample them? Seems like most games don't have the option, and modern AA is such a blurry fucking mess I want to scream. Every single god damn game is so fucking blurry.
if the two most popular PC games aren't standard then what is?
it's hard to think of a single player game from that era because they pretty much all moved to console
Sure, use the GPU's own settings
I don't know how on AMD but Nvidia is easy so I'd figure it's also easy on AMD
This, supersampling > everything else.
I have Nvidia. I guess I'll look it up then.
yeah GPU control panel settings
DSR on Nvidia and VSR on AMD, once enabled you can increase the game resolution
grey pixels = soul
you never need vsync, I'll take responsive inputs over tearing every fucking time unless it's a turn based slow ass RPG or something
RTS/strategy games
Sim games
Stalker and the like
People still love the fuck out of Gothic 2 and that was early 2000s
The TONS of cross platform titles like FEAR
2 games are not the standard for an entire platform just because they had the highest user count. And because FFXI was out just as early as WoW, and on console, your MMO argument doesn't hold up.
>contrast = bad
>blurring = good
I'm not hugely into PC gaming, I play stuff on my PC about as much as I do on console, and I have no job right now, but do you think anti-aliasing is worth it on a 1080p monitor and a 1050ti?
Are 1440p monitors becoming more standard, or do you think the jump to 4k is becoming more significant and happening within a shorter timespan?
Also is my GPU worth it over getting a 1060? A friend who's way more into playing on PC told me to go with the 1050ti, I paid about $250aud for it while 1060s go for around $300-$350 here
>killed off in most modern engines
It was killed off because of d3d. Find any game that you can run in both dx9 and 11/12 that has msaa, you'll notice a huge drop in performance if you turn it on in 11/12.
Half Life was a Quake engine game. It was the bog standard 3D engine for games at that time. They could do 64 player servers without breaking a sweat. When most big devs moved to console a few years later games took a massive hit in complexity. The only way you can think that games are just getting more complex is if you were a console exclusive player.
That's pretty interesting, so would you say if you play anything that can run in dx9 it's worth using it?
>player count
Since when is that the argument?
Games have gotten way more complex in other ways.
I'm aware that high player count servers were standard and was disappointed as fuck that 64 player servers didn't come to console for fucking ever, but that's not the argument.
Idk man I'm still just in disbelief that we basically destroyed aliasing this generation, games literally just do not have jaggies or artifacts of any kind anymore even on consoles.
I mean look at the 4k footage from the xbone x / ps4 pro in games like MK11, it's legitimately blemish free and sharp as hell but you guys aren't satisfied with temporal anti aliasing? I don't get it man.
If your gpu isn't absolute shit-tier then yeah it is.
doom 2016 has my favorite aa ever
I'm talking about computational complexity
They've certainly gotten more complex in other areas, I'm trying to make the point that they aren't taxing the CPU
I'm on a 1050ti
But they literally are.
Compare Gothic to a modern RPG like Deliverance.
If you strip away all the graffix one is clearly more computationally taxing than the other.
Haven't played either of them
On average they seem the same to me
Like Doom 2016 is not really doing much more with the computer than Quake 1
1 player, a dozen enemies, basic AI attack patterns, it just has way better graphics, most AAA games follow this formula
what does Uncharted 4 do that Prince of Persia Sands of Time didn't
doom has this megatexture system where it streams textures
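the gist of megatexture as a toy sketch (page size, load_page and the feedback pass are all assumed here; the real thing renders a feedback buffer to find visible pages):

resident = {}  # page_id -> texel data, the small in-memory cache

def stream_pages(visible_pages, load_page, budget=8):
    # visible_pages: page ids collected from last frame's feedback pass (assumed)
    # load_page: disk loader callable (assumed); budget caps disk I/O per frame
    missing = [p for p in visible_pages if p not in resident]
    for page_id in missing[:budget]:
        resident[page_id] = load_page(page_id)
    return missing[budget:]  # leftovers get picked up on a later frame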
That's because you are using the most static examples possible.
Shit like GTA V and Fallout 4 are heavy on the CPU because of the simulation and are entirely single player.
The map size of Fallout 4 has not even increased since the original Xbox with Morrowind, yet the CPU requirements are astronomical, because a static environment will always be cheap on the CPU.
Yes, shit like Sekiro is nothing because it's simple levels with simple enemy AI, but that's ignoring tons of games with an insane amount of variability.
Old survival sims were like STALKER; new ones are like ARK, with massive player-built bases and literally tens of thousands of creatures being simulated live in the world at once, to the point that it makes an i7 sweat.
GTA 5 only simulates important objects and things close to you. Same with Fallout 4. Could be done on CPUs 10 years ago
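which in toy form looks like this (entirely my own guess at the scheme, not either engine's actual scheduler):

import math

ACTIVE_RADIUS = 150.0  # metres, hypothetical cutoff

def tick(entities, player_pos, dt, full_update, coarse_update):
    # full_update / coarse_update are stand-ins for the engine's real paths
    for e in entities:
        if math.dist(e["pos"], player_pos) <= ACTIVE_RADIUS:
            full_update(e, dt)    # physics, AI, animation -- the expensive path
        else:
            coarse_update(e, dt)  # cheap bookkeeping, maybe at a lower tick rate

the expensive set stays roughly constant no matter how big the map gets, which is why map size alone says little about CPU load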
Serious and constructive question, just what settings should I always use as sort of a template for every game?
based redditposter
give me source
1050ti has its uses, but for general gaming, I'd go higher tier than that. 1060/1070 isn't a bad area to shoot for, only because price/performance's been borked ever since (actual) consumer cards broke the $1000usd threshold.
4K even today is difficult to drive with non meme tier priced cards, and even at that it's still a pain. 1440p is the sweet spot.
Vehicle physics have improved a lot and definitely cost more than they used to, anon.
I'm talking about GTA V online as well.
And no, Fallout 4 could not; it simulates the entire map as you play, obviously not in real time because that would destroy performance, but the persistence combined with that is what kills the CPU.
It saves EVERYTHING down to the position and orientation of the crumpled piece of paper you bumped into in an alley. Or the tons of shit that went flying in a firefight.
You could argue otherwise because Skyrim is a 360 game, but Skyrim also has a massively pared-down local area to simulate compared to the current gen titles.
Persistence is what kills performance in Beth games, all other games toss that info out and load areas from scratch like Red Dead 2
who is the thot?
Persistence doesn't require CPU power because it just saves objects away in memory and doesn't have to touch them. It does do non-realtime basic simulation on characters, but that's very simple stuff, again stuff that could have been done a decade ago
the 1060 would've lasted you 1 or 2 years more than the 1050ti.
why don't you go online and get somebody else's opinion before listening to your tard friends.
Wouldn't it make WAY more sense to offload that kind of stuff into saved chunks that can be removed from current computations, and try to approximate the changes between when you left and got back?
I don't pretend to know how Bethesda set up their engine, but trying to calculate the physics constantly for all moved objects, even if they're on the opposite side of the map, would explain why performance is utter crap.
>trying to calculate the physics constantly for all moved objects
it doesn't
they're just frozen in place until you disturb them
that's why they do that weird thing where they jump, they're being woken up into an active simulation state
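that's bog-standard rigid-body sleeping; stripped down it's something like this (integrate() is a stand-in for the real physics step, thresholds are made up):

SLEEP_SPEED = 0.05  # m/s, hypothetical stillness threshold
SLEEP_AFTER = 0.5   # seconds below threshold before the body sleeps

class Body:
    def __init__(self):
        self.velocity = 0.0
        self.asleep = False
        self.still_for = 0.0

    def step(self, dt, integrate):
        if self.asleep:
            return                   # sleeping bodies cost ~nothing
        integrate(self, dt)          # stand-in for the real physics step
        if abs(self.velocity) < SLEEP_SPEED:
            self.still_for += dt
            if self.still_for > SLEEP_AFTER:
                self.asleep = True   # freeze in place
        else:
            self.still_for = 0.0

    def poke(self, impulse):
        self.asleep = False          # the characteristic wake-up jump
        self.still_for = 0.0
        self.velocity += impulse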
SMAA is fine, though. TXAA can be corrected with sharpening. FXAA is beyond redemption though.
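by "corrected with sharpening" I'd assume a plain unsharp mask, something like this textbook version (not any driver's actual filter):

import numpy as np

def unsharp(img, amount=0.5):
    # blur with a cross-shaped neighbourhood average, then push the image
    # away from the blur to exaggerate the detail TAA smeared out
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:] +
                        img[1:-1, 1:-1]) / 5.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)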
I just said it was in Skyrim already though.
That doesn't change the fact that larger local areas of simulation factually cost more to compute, combined with the fact that it also costs CPU power to save and load all that clutter data as you move across the open world.
Even NPCs' interactions with clutter are all saved.
The background stuff outside the player's area has always been easy, that's why shit like random dead bandits exist in Skyrim on 360. They were traveling and killed randomly by the simulation, but until it's loaded into your local area it costs basically nothing.
That's why on PC you could up uGridsToLoad to 4x more than the console version, we had more CPU.
Modern titles have uGridsToLoad cranked up WAY beyond what was ever possible before on those old titles so it does in fact cost more to run.
They do that already, and its "approximation" is way more accurate than it needs to be; the problem is that we want an ever increasing size of live simulation each game and they can't help but try.
You would be surprised at the amount of improvements to their engine that come with each game.
Too bad it's literally all background stuff no one cares about, or severely needed optimizations that they put off until they literally can't anymore.
Like I said I'm not hugely into PC gaming, I'm happy to run stuff at 1080p mid-high settings and 60fps; this seems to do the job for me. I'm satisfied with it for now and it was only just within the budget I had at the time for the new GPU, monitor, and mouse
Good point...
you're making incorrect assumptions about things you don't understand
it's moving from argument territory into bullshit
Literally the only thing I said that disagreed with you was that increasing the local simulation that runs in real time costs more CPU.
I agreed on everything else.
So please tell me more
I honestly prefer ultra crisp textures at high resolutions
it looks a lot better
AA looks good in screenshots, but when shit hits the fan there's plenty of moments where it looks tacky as fuck, like the portrait camera mode on smartphones that does all that faux blur
Gimme dem raw pixaaayyylllsss!!!
saving and loading stuff is trivial
consoles in that era were limited by their memory
SAUCE NAO!
And you seemed to misunderstand: when I say it costs to save clutter data, I mean that it grows quadratically with the size of the local simulation that we love to push so much with mods and ini tweaks.
Before, there were only a few cells to load each time you transitioned, so those thousands of clutter objects were manageable to load with all the NPCs and world data.
In the newer games it has to load a shit load more than just that because the scale has increased so much, especially in hyper-dense areas like downtown Boston, so on systems with weak CPUs it can literally hang on transitions while it struggles through the spaghetti code left over from Oblivion.
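the cost curve is easy to eyeball, since uGridsToLoad is the side length of the live square of cells:

# 5 is the vanilla default in Skyrim/Fallout 4, so 25 cells are live at once
for ugrids in (5, 7, 9, 11):
    cells = ugrids * ugrids
    print(f"uGridsToLoad={ugrids}: {cells} live cells ({cells / 25:.1f}x vanilla)")

quadratic, not exponential, but it blows up fast enough either way: 11 is nearly 5x the simulated clutter of vanilla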
Disable all AA, texture filter, motion blur, DoF and all other post-processing effects.
it's complicated. The kind that looks for aliasing on the finished image is dumb and reduces the overall amount of detail you actually get from the game. The ideal of course is to render at absurdly high resolutions, but if you can't do that you might as well use anti-aliasing on the edges of polygons
can we admit whoever started this trend of games only having TAA or FXAA as options now needs to be shot?