Is anti-aliasing important to you?
It helps, but I'd rather have a game that looks jaggy but runs at a stable minimum of 60fps than a game that looks clean but has low fps.
Stable, acceptable fps should always be your priority.
not so much.
mostly set on 2x at 1080p
looks fine
I care more about stable frames
It really depends on the game, some look fine without it, others look like jaggied shit.
>cardlets
yes AA matters for me and I use it whenever possible
some console ports look like HD remasters with supersampling
yes unless playing with a CRT
Just fuck my framerate up
Most AA is good, but I prefer no AA to a really blurry AA
only when it doesn't look like the screen is covered in Vaseline. That being said, downscaled resolutions are the best form of AA.
sure
but there are other lovely things in life I like to blow my money on other than video cards
I could buy an amazing rig but I like to travel, for example
Sharp TAA with a good filter is a godsend for anti-aliasing.
Fuck FXAA.
Fuck MSAA.
Fuck CSAA.
Fuck MLAA.
Fuck TXAA.
SSAA is based, though a good internal downscaler set to 4K or 1440p with TAA is usually better.
Reminder that if you don't have AF at 16x, you're a massive faggot, especially since it only ever has a 1% impact on performance, and at most 5% if they're pajeet graphics engineers.
yes, I care more about af
This.
based AF poster
I play without it most of the time in modern games, with my monitor a good metre away from where I sit. A lot of games are developed using PBR and with TAA in mind as part of the final presentation (some games even force it on now), so when it's disabled the shaders and specular highlights on the character models alias like crazy and practically make them glow, which makes target acquisition extremely easy.
Yes. Quake looks great with 8xEQ MSAA, usually use SMAA though
Reminder of diminishing returns.
how the fuck do consoles not use AF at all still
shit is mindboggling
AA and Textures on max, post processing and effects on med/low so they don't interfere with the gameplay
I remember seeing some guy on the Warframe forums claim that there is no difference between 8X and 16X AF after the devs removed the 16X option
No. I started playing vidya before AA was even brought in, and soft edges only meant the resolution didn't match our monitor. Pixelated edges meant native resolution, which to me meant the highest graphical fidelity. To this day, pixelated edges register as high fidelity.
I might turn AA to a minimum, but it's not necessary.
Depends on the game, a lot of the time it defaults to TAA now then I realize it looks like a blurry fucking mess and go in and turn on MSAA.
That's one shitty comparison.
Still, rather than AA, I'd prefer to have a higher native resolution.
Still, especially at lower resolutions, even some minimal AA does wonders; it's one reason even N64 games look surprisingly decent, once you strip away that extra blur that is.
The impact of AF is often minimal, but it can still affect performance in a negative way.
Still, some AF is always better than none.
4x offers a great balance between quality and performance.
You got some weird ideas there.
MSAA simply does not work in most games nowadays, thanks to Deferred rendering.
Why should I care? It's not like I masturbate looking at the fucking ground. I have shit to do, graphics faggot.
>temporalshit
Always looks fucking awful in motion
Only plebs use mipmapping so AF is never an issue
There is no difference. I don't remember the specifics, but it's something like: at the point where 16x would actually do anything, the circumstances are so extreme there's no texture detail left to save anyway. It's beyond diminishing returns; there are no returns.
found the console player
TAA gets better on resolutions over 1080p
It's due to console architecture itself.
iirc it's because AF is mostly on the CPU for its calculations and console CPUs are always super underclocked. Console games also already have a really tight performance budget anyway, so even a 1% performance drop can be the difference between a locked 30fps and stutter with dips to 20fps.
Yeah, no, fuck off with those useless filters, I can play the game just fine without my computer hitting 80°C and melting my room
After playing dmc4se on pc, yes
>I love my games to look like I'm viewing them through a window smeared with shit!
>filters
Techlet.
>super underclocked
For what purpose.
Whats with the CRT spam lately?
Look at any console. That shit has laptop-tier airflow.
there's a very good reason to use mipmapping, you know
viewing full-resolution textures at grazing angles or far away causes a lot of image noise in motion, as the subpixel-sized texels fight over which of them gets displayed on your screen
mipmapping gets rid of this issue
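That texel-to-pixel ratio is exactly what mip selection keys off. A toy sketch of the idea (hypothetical Python, not how any real driver computes it; the level is just the log of the ratio):

```python
import math

def mip_level(texels_per_pixel: float) -> int:
    """Pick the mip level whose density is closest to one texel per
    screen pixel. Level 0 is the full-resolution texture; every level
    halves it, so the level is roughly log2 of the texel:pixel ratio."""
    if texels_per_pixel <= 1.0:
        return 0  # texture is magnified, full resolution is fine
    return round(math.log2(texels_per_pixel))

# A 1024-texel texture squeezed into 64 screen pixels crams 16 texels
# into every pixel; mip level 4 (1024 / 2**4 = 64) restores ~1:1,
# so neighbouring frames stop flickering between different texels.
```

With the ratio clamped back to roughly 1:1 there are no subpixel texels left to fight over, which is the shimmer described above.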
Have you ever seen the GPU of a console? They have minimal cooling
Performance > Visuals. Rather have the game be full of jaggies and meet my new standard of 144 HZ.
What the other user said, plus the fact that normalfags love to put their consoles in tight living room furniture and entertainment centers with little to no airflow. The manufacturers absolutely take that into account.
They're cheap, have shitty cooling and have to fit under a TV with a wall blocking its airflow.
Aliasing is caused by the fact that literally no display resolution is currently high enough to properly resolve 3D geometry. Fucking "4k is a meme!" faggots.
No, because I play at 4k
Man if only these comparison pictures could be even FUCKING SMALLER
>ordinary jagged edges
Whatever
>shimmering
End my life
I usually put it on, but on the lowest setting
There's not really any noticeable difference and it doesn't impact performance as much as having it on high
No, I see it as a meme that makes you think you need a higher resolution
Don't forget that console hardware needs to be capable of running even when Roaches find their way in.
It's not resolution but the arrangement of individual pixels on LCD screens that makes aliasing apparent.
Holy hell you are the biggest retard in this thread. At least research what you're talking about before you start spewing shit
Fucking hell, can't find an in-depth video explanation of how AF works and what causes the blurriness in the first place. They're all just lol it looks blurry at angles and AF fixes it.
Yes, I have ultra wide shit load of inches monitor with only 2560x1080 resolution so pixels are huge and AA makes it bearable.
Dumb and short version: the blurriness is a result of filtering not being done with respect to three-dimensional space; anisotropic filtering adds that respect for three-dimensional space.
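A rough numeric sketch of that (hypothetical Python; `du`/`dv` are made-up stand-ins for the screen-space texture gradients a GPU actually measures, and real hardware is subtler than this):

```python
import math

def aniso_samples(du: float, dv: float, max_aniso: int = 16):
    """Toy anisotropic filtering decision for one pixel whose texture
    footprint spans `du` texels one way and `dv` texels the other.
    Trilinear must pick its mip from max(du, dv), blurring the short
    axis too; AF instead takes several taps along the long axis and
    picks the mip from the footprint divided by that tap count."""
    longer = max(du, dv)
    shorter = max(min(du, dv), 1e-6)
    ratio = min(longer / shorter, max_aniso)   # capped anisotropy ratio
    samples = max(1, math.ceil(ratio))         # taps along the long axis
    mip = max(0.0, math.log2(longer / ratio))  # sharper than trilinear's log2(longer)
    return samples, round(mip)

# Floor at a grazing angle: footprint 16 texels deep, 1 texel wide.
# 16x AF takes 16 taps and stays on mip 0; trilinear would drop to mip 4.
```

The cap is why the settings menus stop at 16x: past that ratio the footprint is so stretched there is barely any detail left to recover.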
How much of a dumbass do you have to be to let this happen?
Somewhat. A high and stable frame rate is still the number one priority.
Not having AF looks horrible in motion. The boundaries are extremely obvious.
Eurogamer did an article claiming that CRTs have better picture quality. Shame you can't get anything in 1080p or higher. Such a meme
Nowadays, x4 FXAA is pretty much the lowest you should go. The difference in performance when turning it off is less than 1 fps and the image quality changes drastically. Only faggots who force themselves to look at a shitty picture, for autistic reasons I don't really comprehend, are the ones turning it off.
Newer games don't even give you the option to go below x4, or "turn it off" automatically when selecting other options that already have their own AA configuration.
Playing with no AA at all should only be an option when testing mods compatibility and stuff.
my crt is 1600x1200@85hz
They don't have better quality, the pixels are just bigger, so older games, designed for those, look worse when played on LCD/LED.
High resolution pixels look a lot better on digital screens.
>Only faggots who force themselves to look at a shitty picture, for autistic reasons I don't really comprehend
Is it really that bad? I play all my games with AA at the lowest setting or entirely off because I honestly can't really see any difference when I'm playing. When my buddy comes over he immediately cranks AA to whatever its highest setting is and I can only tell because the framerate drops from the hundreds to 50-60. Now, obviously, if I'm sitting completely still in-game looking around for jaggies I can tell, but when I'm actually playing the game I don't notice it at all.
My eyesight is absolute dogshit btw maybe that's a factor.
Yes and I absolutely FUCKING HATE new games and deferred rendering
FUCK TAA
FUCK SMAA
FUCK FXAA
GIVE ME 4X MSAA
Literally no one doesn't turn AF to 16x, what the fuck is the point of this post?
>Is it really that bad?
The human brain works by comparison.
You can immediately tell if graphics belong to PS1, PS2, PS3, or PS4. At the time, we all thought what we were seeing was the best thing ever and looked realistic.
It's just when we leap forward that we realize what we were missing.
As soon as you get used to FXAA x4 as a minimum, you can immediately tell when it's turned off. It is really uncomfortable to be able to see the pixels move randomly without a proper shape. It isn't nitpicking; your eyes see those pixels as noise without even paying attention.
Described like this, it sounds impractical to turn it on before getting used to it, but boy, it looks so good once you do.
Actually, I don't have 16xAF globally, because new games all have it by default and old games that don't have it look better without it
I would post example if I wasn't tranny janny rangebanned, but if you play an old game that has a repeating texture off into the distance, the texture tiling is immediately obvious with 16xAF, but the blur from trilinear filtering hides it. It also looks more authentic to how it would have looked when it released.
But that's just me. I play old games at a resolution that would have been appropriate when they came out rather than stretching everything to widescreen 1080p like everyone else does.
Hell no, makes everything look blurry
I have a high-end setup and disable it every time
jaggies are fucking disgusting.
id rather play with taa + sharpening than have jaggies
I love the blurry look of TAA with a little bit of sharpening. Vermintide 2, Siege and Fallout 4 have my favorite types of TAA.
Yes, absolutely.
It's especially required in cel-shading.
Why yes, I still prefer FXAA, how could you tell?
4x MSAA or equivalent is an absolute must if you're using a sub 1080p monitor
>taa
ah yes I absolutely adore ghosting when rotating any faster than a sloth with neck problems
>that floor texture turning into fine mush in the distance
>zero impact on real world game performance from the 16x AF
Yeah I disable that shit even at 1080p. Supersampling is the real man's solution to jaggies.
MSAA x2 or GTFO
what about 4k?
Every single thread like this there are retards who think antialiasing is just FXAA and TAA
Depends on the game.
>Why yes, I still prefer FXAA, how could you tell?
SOUL SOULLESS
most games don't ACTUALLY run at 4K, they render at a checkerboarded 2K and then use TAA to fill in the blanks
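What "fill in the blanks" amounts to can be shown with a toy reconstruction (hypothetical Python; real checkerboard renderers reproject the previous frame's half rather than averaging neighbours, which is where the TAA comes in):

```python
def checkerboard_fill(frame, parity):
    """Pixels where (x + y) % 2 == parity were actually shaded this
    frame; every other pixel is filled from its shaded horizontal
    neighbours. Halves the shading work at the cost of guessing."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 != parity:  # this pixel was skipped
                neighbours = []
                if x > 0:
                    neighbours.append(frame[y][x - 1])
                if x + 1 < w:
                    neighbours.append(frame[y][x + 1])
                out[y][x] = sum(neighbours) / len(neighbours)
    return out

# One shaded row [0, ?, 4, ?] comes back as [0, 2.0, 4, 4.0]: the holes
# are plausible but invented, which is why fine detail goes soft.
```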
People ITT don't know what the fuck anti aliasing actually is so I will school you on the VERY VERY basics, simplified to a level anyone can understand without being 100% accurate
FXAA is Fast Approximate AA. It's a slightly more intelligent blur filter. Covers the whole screen including HUD. Can be used on literally anything but looks shit. MLAA is the same shit but it's AMD's equivalent.
SMAA is a slightly more intelligent FXAA. Not as blurry as FXAA, looks better overall and is fairly light. Won't clear up all the jaggies though.
MSAA is multisampled AA. This is proper AA. Cleans up all jagged edges with zero blurring. Can't be used in nu-games because of the way their meme renderers work. Looks fantastic in PS3/360-era games at 1080p. Can be enhanced with CSAA and SGSSAA but it's so good on its own it doesn't really matter
TAA is temporal and it's a blur filter except it's applied over several frames, which smears the fuck out of the screen in motion. It's designed for consoles to make shit look more """cinematic""". It looks awful and arguably worse than just using FXAA.
Lesson is over.
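For anyone convinced "proper AA" is just another blur: a toy coverage count shows the difference. One sample per pixel turns a 45-degree edge into a hard 0/1 staircase; 4x4 supersampling measures real fractional coverage (a sketch, not any actual rasterizer):

```python
def coverage(px, py, n):
    """Fraction of an n x n sample grid inside the half-plane y < x
    (a 45-degree edge), for the pixel whose corner sits at (px, py)."""
    inside = 0
    for sy in range(n):
        for sx in range(n):
            x = px + (sx + 0.5) / n  # sample at each sub-cell centre
            y = py + (sy + 0.5) / n
            if y < x:
                inside += 1
    return inside / (n * n)

one_tap  = [coverage(x, 0, 1) for x in range(-1, 2)]  # [0.0, 0.0, 1.0]
four_tap = [coverage(x, 0, 4) for x in range(-1, 2)]  # [0.0, 0.375, 1.0]
# The 0.375 is measured edge coverage, not a blur smeared on afterwards;
# that's the information MSAA/SSAA has and post-process AA can only guess.
```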
If you can't "tell the difference" with AA on or off you're a fucking retard who has never actually compared the two or never got it working in the first place. If you think AA is just "blurry" you're a retard whose only experience with AA is FXAA and TAA
Depends on the game. I intentionally scale down the resolution of old games because they look better.
This is why the only valid "antialiasing" method worth using these days is to upscale using DSR.
That's not particularly viable for nu-games because that's just going to tank your framerate
TAA is so blurry devs actually use it to soften busy scenes or unify dithering (e.g. hair in FFXV).
RE2make with TAA off is atrocious
I prefer to keep it off and set it on 4K so I get both no jaggies and no blurring
most important thing after resolution
Exactly, and FFXV without TAA means you're gonna be looking at ugly, distracting hair.
I feel like this approach is regressing back to the 90s, when devs expected players to use shitty composite cables that would merge colors to create new ones.
Impossible to live without at 1080p.
Not AS essential now that I'm on 1440p.
Probably near useless at 4K.
I use a 4k and at that resolution I can't make out individual pixels so any form of AA is wasted on me.
Damn, that's the tastiest cope I've seen all day.
>The impact of AF is often minimal, but it can still affect performance in a negative way.
>Still, some AF is always better than none.
>4x offers a great balance between quality and performance.
In any modern game the difference between 4x and 16x AF is usually less than 1-2 frames per second. There's literally no excuse not to max it every time.
Yeah. I don't give a shit about any ENBs or anything like that, gimme 8x SMAA any day of the week.
Quake's look really just comes down to turning off any texture filtering at all. Higher resolutions look fine, it's all about those unfiltered nearest neighbor looking pixels. As soon as you apply texture filtering everything turns into a blurry mess and it clashes with the gritty aesthetic the game is going for.
Quake was developed with 3D acceleration in mind. That blurry mess is what the devs intended it to look like. You're imagining the clash with a "gritty aesthetic". At the time there was no debate; the 3D accelerated version looked vastly superior. What you're talking about is just a personal preference for the more "primitive" one because it goes better with nostalgia and rose-tinted glasses.
>Sharp TAA with a good filter is a godsend for anti-aliasing.
>I only look at stationary things
NICE VIDEO GAMES NERD! YOU SURE YOU AREN'T WATCHING A PICTURE?!
None of this is true by the way. Quake was developed without texture filtering, that's how the textures are supposed to look. Nice and crisp. Texture filtering was added to it after the fact to sell graphics accelerator cards and while it's more technically impressive and harder to do it looks worse and undermines the game's intended visuals.
this dude gets it
>Quake was developed with 3D acceleration in mind
You are wrong. GLQuake came after.
>During the development of quake we talked to all of the 3d board vendors.
According to Carmack you are wrong. The only thing they fucked up is they backed the wrong horse. They chose to go with Rendition for their target platform instead of 3Dfx who won out.
The models and textures weren't designed for filtering, so they lose a lot of clarity with filtering, just like filtering a modern pixel-art game would look bad
>Quake was developed with 3D acceleration in mind.
you're thinking of Quake 2, Quake 1 was exclusively software rendered
and yet quake shipped with only software mode
>Rendition had a cool looking architecture, with good performance and a programmable risc chip. Of all the designs that we saw, we liked theirs best, so we decided to endorse them as a target platform.
No, I'm not.
>According to Carmack
Name dropping isn't a source
Explain why the initial release was software only then.
>quake.fandom.com
>John Carmack felt that Vérité provided the best performance per dollar at the time, resulting in the effort by him and chipset maker Rendition to make this the premium Quake experience.
Yes, a lot
Because the hardware manufacturer fucked up. The cards were late. It would only work on a specific piece of hardware. This wasn't OpenGL or DirectX or something.
That doesn't actually support your initial claim, if anything you just proved the other guy right when he said "Texture filtering was added to it after the fact to sell graphics accelerator cards " and yourself wrong. So, try again.
Based AF
Except id went looking at the 3d hardware manufacturers DURING the development of Quake, not after it was done. It was built with 3d acceleration in mind. This is not up for debate. I'm just repeating what Carmack, the guy who made the fucking game, has said. You are not arguing with me, you are arguing with him and who the fuck are you?
You're arguing with things he didn't actually say and not backing up your claims with any sort of citations and then when you did provide a citation it proved the other guy right. So, good luck trying to spin this.
No. I always turn that shit off.
>fandom.com
you cannot be seriously using that shit as a source. especially considering the openness of pre-zenimax id Software
>fabiensanglard.net
>Some work that will be going on in the near future:
>...
>Porting Quake on the metal to some 3D accelerator boards.
Quake was not hardware accelerated on launch
>I haven’t been working on QuakeWorld for the past week, because Michael has been working on the rendition port, which backburnered the win32 native port, which is a prerequisite for the QuakeWorld release.
Abrash, not Carmack, ported it to Verite
>I have been using OpenGL for about six months now, and I have been very impressed by the design of the API, and especially it’s ease of use.
>A month ago, I ported quake to OpenGL. It was an extremely pleasant experience. It didn’t take long, the code was clean and simple, and it gave me a great testbed to rapidly try out new research ideas.
Carmack ported it to OpenGL, preferring open APIs
>Rendition Verite: ($150) It won’t set any speed records, but it is an excellent value, the quake port is pretty good, and the risc architecture allows a lot of room for optimization.
>3DFX: ($300) The highest textured fill rate of anything short of high end SGI hardware. 50 mpix, almost irrespective of options. You can’t run a desktop or high res display on it, but at 640*480 it will stomp any other consumer level board, given equal optimization effort.
Carmack considered the 3DFX more powerful
>You're arguing with things he didn't actually
>During the development of quake we talked to all of the 3d board vendors.
This is Carmack's own words.
I like things sharp. I don't want intentionally blurry realism chasing shit.
I don't like the stairstep jaggies from aliasing either.
MSAA still blurs the image and it doesn't get rid of jaggies in 2D foliage and shit like that, which makes games like the original Crysis look like shit.
it's any texture that you look at from a flat angle
>tfw 1070 barely manages to run skyrim with enb in 60fps 1080p
It's one of the first things I turn down to improve performance.
Doesn't actually prove your point, dude. They talked; that doesn't mean he gave a wholesale endorsement of texture filtering, or that the guy who drew the textures wanted you to apply a bilinear filter over them, and the game still shipped without it, again proving you wrong. Sorry buddy, you'll need to actually find a source that supports your claims; you've only proved the other guy more right with each post.
>If I had to make the decision over again, I might very well have chosen 3dfx, but at the time we made the decision, I couldn't have known that:
>1: 3dfx would get their price down $100.
>2: z buffer performance would become important for quake, which helps 3dfx and hurts rendition.
>3: rendition would be as late shipping their board as they were.
>4: rendition would have the problems it does with poor windows performance and dma incompatability.
>Rendition was supposed to do most of the work for Quake, but it still occupied over TWO MONTHS of Michaels time. We absolutely can't afford that again.
The difficulty in working with Rendition is exactly why Carmack prefers open shit. And the z buffer comment makes it clear the choice was made early in the game's development. 3D acceleration of Quake was not some aftermarket release. What it was was a colossal fuckup.
>Vanquish
Based, I played through that the other week. Game looked fairly jaggy no matter what I did with it
I like both, but I really do prefer hardware Quake at 640x480. It doesn't look as gritty, but it looks so smooth and gorgeous.
not on PC
The cod beta had horrific blur on everything that I honestly would have taken jaggies over the Vaseline rubbed on my eyes. Luckily I could force sharpening through Nvidia and make it clear.
>Game looked fairly jaggy no matter what I did with it
Get a 4k and don't rub your nose against it; the pixels are small enough that you can't make them out individually from a normal viewing distance, which means the jaggies aren't apparent unless you get really close. When you're sitting back it takes care of any aliasing problem I've ever encountered.
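The viewing-distance argument survives back-of-envelope math (Python sketch; the one-arcminute acuity figure is a common rule of thumb, and the sizes and distances below are assumptions, not anyone's actual setup):

```python
import math

def pixel_arcminutes(diag_inches, res_w, res_h, distance_inches):
    """Angular size of one pixel, in arcminutes, for a viewer at the
    given distance. Normal acuity resolves roughly 1 arcminute, so
    pixels smaller than that stop being individually visible."""
    diag_pixels = math.hypot(res_w, res_h)
    pitch = diag_inches / diag_pixels              # inches per pixel
    return math.degrees(math.atan(pitch / distance_inches)) * 60

# A 32" 4K panel at ~24" lands right around 1 arcminute per pixel,
# while 24" 1080p at the same distance is over 1.5; lean back from the
# 4K screen and individual jaggies drop below the acuity threshold.
```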
Totally not. I often turn it off. I like crisp.
>Get a 4k
No
People mentioning airflow as the issue are only half correct. CPUs are notoriously unstable and difficult to manufacture, a huge number have major defects during quality testing and never even reach the market. To claim any sort of reliable lifespan CPUs are underclocked.
>Get a 4k
Fuck off Elon.
Imagine being this fucking wrong
AA is useless when you go 4K. The resolution is so high you can no longer see edges even on a 25" monitor.
AA is quite useful at 1080p and can improve some visuals, but most of the time you should be fine with just FXAA.
get 2.5k and a GPU that can render at 4k and downscale that shit, then you really don't need AA and you're not paying for a screen that has literally no decent 4k content
I use a little. Usually no more than 4x.
>2.5k
I wish you retards would stop using terms like this. Do you mean 1440p?
user, the conversation has been had. Several quotes from Carmack have been posted in this thread proving Quake had a 3D accelerated system as its target from early development, but it all went south because the hardware manufacturer was a bunch of fuckups.
>user, the conversation has been had.
So you mean that part where you got proven wrong repeatedly, accidentally proved the other guy right, and everyone made fun of you for it? That's how I remember it going down.
No, I mean the part where I quote Carmack's own words proving the claim correct and retards just dismiss it because they don't like being wrong.
>Carmack's own words proving the claim correct and retards just dismiss it because they don't like being wrong.
So exactly what you're doing right now?
Careful user you're actually ousting yourself as the actual retard.
2.5k refers to the 2560 horizontal resolution; it's logical, concise, and you recognised what it meant instantly. 2.5k is also consistent with the horizontal dimension being used when referring to UHD, like 4k. Additionally, 480/720/1080p were used to refer to HD resolutions; 1440p sounds far less impressive when marketing a new generation of products compared to 2.5k
Makes it quite clear 3D acceleration was not something added offhand after the game was complete. A decision was made early in development. The decision wound up being the wrong one and so 3D acceleration was missing from launch. You're all wrong. That's all there is to it.
It almost always helps newer games.
Older games or nu-retro games often get blurred by it, especially when it is imposed on the game outside of the in game settings.
Except for the part where you don't actually have a source for any of this.
If you googled it you'd find the source. What doesn't have any source is the claim Quake was not built with 3D acceleration in mind. There's just a bunch of faggots assuming they are right even when faced with multiple quotes from Carmack contradicting them.
None of the quotes contradicted them, if anything the quotes support them more than they support you.
Not him but why are you being such a huge fucking faggot? user made his point and you're just thinking you're in the right because you're too much of a pussy to admit you're wrong. On an anonymous board where saving face amounts to nothing. Shut the fuck up
groups.google.com
Here's an engineer at Rendition discussing how early on Quake was intended to have 3D acceleration, and how problems arose which resulted in the shipped game not supporting it at all because Carmack needed more time to get it working with GL instead.
Not that it matters because no matter how much evidence you're provided with, nothing will get through to that thick fucking skull of yours
Internal downscaler is inherently inferior to SSAA. SSAA uses a rotated grid sample or other non-square grid sample. Internal downscaler is square grid by definition but it's trivial to implement and keep supported so it's all we've got in the post-DX10/deferred shading era.
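That rotated-vs-square point can be made concrete. For the near-horizontal edges that alias worst, what counts is how many distinct sample rows the pattern covers (toy Python; the four offsets are the commonly cited 4-tap RGSS pattern, used here purely for illustration):

```python
# Four taps inside one pixel, as offsets from the pixel's corner.
ORDERED_2X2 = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_4X  = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def edge_steps(pattern):
    """Distinct coverage levels the pattern yields as a horizontal edge
    sweeps through the pixel: one per distinct sample row, plus the
    fully-uncovered state. More steps means smoother gradients."""
    return len({y for _, y in pattern}) + 1

# Square grid: the 4 taps share 2 rows, so only 3 coverage levels.
# Rotated grid: 4 distinct rows, so 5 levels from the same 4 taps.
```

An internal downscaler is the square-grid case by construction, which is why it needs more raw samples to match a rotated pattern on the worst edges.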
Wow, why are you getting this mad about being proven wrong?
If I remember right GTA5 has deferred rendering and it has MSAA, although it's far less effective than in some older games. I really wonder why they didn't implement SMAA into the game out of the box; FXAA is nice and all, but SMAA is simply superior in aliasing removal for the performance cost
Deferred rendering allows way more granular control over particles, lighting, and shader effects. The industry traded sharp lines for better graphics and it was the right choice.
MSAA works really well for certain parts but modern games have so much alpha texturing and various effects that MSAA can't do shit with so devs choose to use a post-process AA that applies evenly to the whole screen.
I really hope more games will implement what Gears 5 does which is let you select a resolution but can then downscale that resolution while keeping the effects and post-processing at the native resolution.
The developer has to specifically support MSAA, and the multisample part is preceded and followed by other processing which is not sampled at a higher resolution. MSAA started as a speedhack for SSAA which is now not very relevant but still very demanding on performance. On the other hand both the base resolution of games is higher than before, and there's a lot more going on than just hard geometry edges, so post-process AA is just better now.
Nvidia and ATI tried to make MSAA on alpha layers a driver option but it was too much work to support across games.
Not really on 1080p and above. It was a huge deal back when everything was 800x600. Whoever decided not to have hardware AA on the 3DS should be fired
I like FXAA. It does its job of removing jaggies at virtually no resource cost. Looks somewhat blurry by a pixel of vicinity or so, but I'd rather max everything else and have excellent framerates over having to lower other options or getting a shit framerate.
but current gen consoles can't do 16x AF
Good to know. I remember when I first fired up the game with MSAA I was surprised at the jaggies it left uncovered, switched to FXAA instead and it works fine, but I really have to try SMAA injection one of these days to see how well it will perform, mrhandys injector did a great job in GTA4, but perhaps it won't work on GTA5. Guess I have to do sweetfx reshade fuckery then
1080p/1440p with DSR 1.5-2x looks better than any other Anti-Aliasing method. I even think it looks better than native 4k, because when you go native the aliasing immediately comes back.
So I don't give a fuck MSAA is gone, I don't need it anymore.
>Shame you cant get anything in 1080p or higher
CRTs from the 90's were capable of going way higher than that.
if it gives me a gun like in that picture then hell yeah
>switched to FXAA instead and it works fine
Man I wish I could enjoy FXAA like you young people. I just can't get past the fact that instead of making everything sharp and clear like I'm used to, it just blurs the problem area away instead. For me FXAA looks worse than leaving the jaggies unfiltered.
I have a 32 inch 4K monitor, so I barely notice any aliasing even with anti-aliasing off when I run a game at my monitor's native resolution.
I don't like FXAA but I will use it over no AA at all. Jaggies are noticeably bad, but FXAA blur is just a constant thing; it's easier to tune your brain out of noticing it
FXAA + Downsampling gives you the sharpness back.
My rig is pretty old and I have an RX580 8GB so I only turn on AA when it runs well
Resident evil 7 uses TAA+fxaa and it's perfect.
Gotta be honest here. A lot of people hate TAA but it's my favorite anti-aliasing as long as you have a way to sharpen the image in-game or using Reshade.
MSAA misses so much like the leaves on trees and costs a lot of performance, FXAA is just looking at the same jagged edges moving around through a blurry filter, SMAA is only worth it on stylistic games and misses half the edges on most games.
TAA really does get rid of all jagged moving edges through excessive blurring, something that FXAA doesn't actually do, and it's really not bad at all sharpened.
>TAA really does get rid of all jagged moving edges through excessive blurring, something that FXAA doesn't actually do, and it's really not bad at all sharpened.
It creates really bad artifacts wherever there's hard contrast, though; weapon optics and tree canopies literally get solid white outlines around them.
It is true that it's not perfect. I guess it's always case by case per game. Like how I downplayed SMAA, but actually in a game like Risk of Rain 2 it's perfect.
FXAA has very little impact on performance nowadays so I'd at least have that.