He has a point you know

I still don't know what the fuck anisotropic filtering is except it should always be x16 because it has no real performance impact and looks much better.

Attached: antialiasing.png (568x369, 53K)

You're a fucking dipshit then.

What is it then. In your words.

It took me years to figure out what SSAO does
I would look at pictures before and after and squint and still nothing

I think I figured it out with Kingdom Come Deliverance, it was a bit obvious there

Attached: 1556643447880.png (527x349, 91K)

Anti aliasing just makes everything look bad and the game run slower.
Aliased art is kino.

Attached: back-in-green.gif (600x450, 14K)

smoothing out pixels.

Wasn't kanye west going to run for president next?

>this is an actual presidential candidate in burgerland

Nice country you've got there, 56ers

Real life art would never be aliased though so anti-aliasing simply tries to make the computer result look closer to reality

What is what? Anti aliasing is removing aliasing (artifacting you get when displaying higher resolution on lower resolution, aka jaggies). Anisotropic filtering is improving textures that are viewed at an angle, most noticeable on ground textures since those are almost always at an angle.

anti-aliasing is the 'make your graphics card go vvvvvrrrRRRRRRRRRRRRRRR and add an extra zero to the power bill' option

Anti-aliasing is only a bandaid
Once resolutions are high enough we won't need it

What does 'aliasing' even mean in graphics programming?

>antialiasing is "supposedly" necessary
what a fucking retard

That much eh?

The best example I've seen of anisotropic filtering is Sleeping Dogs (not pictured)
The reason is because at least in the original version of the game, there was no AF setting. When it rained in-game, roads looked terrible in the distance. I forced it on through drivers and there was a huge improvement at no performance cost.
Before that I always just ticked it on without knowing what it did, until I saw a game that was absolute eye cancer without it.

Attached: 3267843-20170208154726_1.jpg (1920x1080, 800K)

AF makes textures at sharp angles relative to your camera look more detailed, it's most noticeable on distant ground.

anti-aliasing is rendering the image at a higher resolution than output and downscaling it to output to make edges appear smoother
anisotropic filtering is taking extra texture samples across a surface's stretched screen footprint so it looks sharper at oblique angles
without knowing what they mean or how they work, thinking "it should always be x16 because it has no real performance impact and looks much better" makes you smarter than most people and gives you a drastically better experience than consolefags

Seems like more governmental overreach, we should get rid of these so we can decrease taxes

>buy 10000000-dollar graphics card to make the pixels sharper/brighter
>turn on the option that makes it all blurry again
??gamers rise up??

Attached: 6A9SWU3.jpg (960x672, 64K)

en.wikipedia.org/wiki/Aliasing
Read and you shall learn

You know how if you stand on a flat floor in a game the ground texture gets blurrier the farther away it gets from you?

Anisotropic filtering pushes out how far it goes before getting blurry, it makes textures at oblique angles sharper. Would also apply to lots of other stuff but large flat floors are the most obvious way to see it directly.
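A toy sketch of that idea in Python (illustrative numbers, not any real GPU's exact algorithm): a pixel looking at an oblique floor covers a stretched texture footprint, and plain trilinear filtering has to pick its blur level from the long axis, while AF picks it from the short axis and compensates with extra samples.

```python
import math

# Toy sketch, not a real GPU's algorithm: a pixel's footprint on an oblique
# floor is short across the screen (du texels) but long into the distance
# (dv texels). Trilinear filtering must pick its mip level from the LONG axis
# to avoid shimmer, so it over-blurs; anisotropic filtering picks the level
# from the short axis and takes up to 16 samples along the long one instead.

def trilinear_mip(du, dv):
    return math.log2(max(du, dv))

def anisotropic_mip(du, dv, max_aniso=16):
    ratio = min(max(du, dv) / min(du, dv), max_aniso)
    # sharper mip level, plus how many samples to take along the long axis
    return math.log2(max(du, dv) / ratio), ratio

# A footprint 1 texel wide but 8 texels long (floor at a grazing angle):
print(trilinear_mip(1, 8))      # mip 3: an 8x-downscaled, blurry texture
print(anisotropic_mip(1, 8))    # mip 0 with 8 samples: full detail
```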

>artifacting you get when displaying higher resolution on lower resolution, aka jaggies
The vectors, or polygons, have theoretically infinite resolution. You're displaying those at a lower resolution.
Take a line, if you draw that at an angle and you have a finite amount of pixels to work with, you need to staircase it up instead of smoothly moving it up. Those staircases, or jaggies, can be noticeable. That's aliasing.
>anti-aliasing is rendering the image at a higher resolution than output and downscaling it to output to make edges appear smoother
Not necessarily, that's supersampling. Most current AA methods don't do that, since the alternatives are less resource intensive.
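Both halves of this post are easy to reproduce. A hypothetical sketch (made-up edge, not a real rasterizer): one hard sample per pixel gives the staircase; averaging a grid of sub-pixel samples, which is what supersampling amounts to, gives fractional greys along the edge.

```python
# Hypothetical sketch, not a real rasterizer: draw the edge y = 0.4x on a row
# of 8 pixels, once with a single hard sample per pixel (the staircase) and
# once averaging a 4x4 grid of sub-pixel samples (the supersampling idea).

def hard_edge(x, y):
    # one sample at the pixel centre: fully black or fully white
    return 1 if y + 0.5 < 0.4 * (x + 0.5) else 0

def coverage_edge(x, y, n=4):
    # average n*n sub-pixel samples: fractional grey where the edge crosses
    hits = 0
    for i in range(n):
        for j in range(n):
            if y + (j + 0.5) / n < 0.4 * (x + (i + 0.5) / n):
                hits += 1
    return hits / (n * n)

row = [hard_edge(x, 1) for x in range(8)]
aa_row = [round(coverage_edge(x, 1), 2) for x in range(8)]
print(row)     # abrupt 0 -> 1 jump: a jaggy
print(aa_row)  # the jump is spread over a few grey pixels
```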

Maybe you don't know those things because you're stupid

I'm puzzled why games still have a setting for anisotropic filtering. It has zero cost and works on every card that can run the games that expose it. Why don't the devs just crank it to x16 in-engine and leave it at that?

>Yea Forums is now a bunch of fucking brainlets who don't even know what the settings in graphics options do
Not even surprised...

This tweet actually made me laugh

supersampling is rendering the entire image at higher resolution; multi-sample takes several coverage samples per pixel but only shades once, so effectively only the edges get the extra resolution
image-based methods like fast approximate (FXAA) detect edges and then soften them without rerendering anything, which is technically anti-aliasing but also STOOPID
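A hedged toy model of that cost difference (a fake one-colour "shader", nothing like a real pipeline): both schemes take 4 geometric samples per pixel, but supersampling shades every covered sample while multisampling shades once per covered pixel.

```python
# Toy model, not a real pipeline: count shader invocations for SSAA vs MSAA
# over one row of 8 pixels crossed by the edge y < 0.4x.

SHADE_CALLS = {"ssaa": 0, "msaa": 0}

def shade(mode):
    SHADE_CALLS[mode] += 1  # stand-in for an expensive fragment shader
    return 1.0              # flat white triangle

def inside(sx, sy):
    return sy < 0.4 * sx    # coverage test against the edge

def subsamples(px, py):
    return [(px + dx, py + dy) for dx in (0.25, 0.75) for dy in (0.25, 0.75)]

def ssaa_pixel(px, py):
    # supersampling: shade EVERY covered sub-sample
    return sum(shade("ssaa") for s in subsamples(px, py) if inside(*s)) / 4

def msaa_pixel(px, py):
    # multisampling: same coverage test, but shade only ONCE per pixel
    covered = sum(1 for s in subsamples(px, py) if inside(*s))
    return shade("msaa") * covered / 4 if covered else 0.0

for px in range(8):
    ssaa_pixel(px, 1)
    msaa_pixel(px, 1)

print(SHADE_CALLS)  # MSAA needs far fewer shader runs for the same coverage
```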

>Most current AA don't do that
Most current AA sucks ass because it just smears jelly over the screen and calls it good.

AF off

Attached: 71340_screenshots_2013-11-14_00001.jpg (1280x720, 325K)

lol Nerd

AF on
this was the first time i ever manually enabled AF for a game, while trying different GPU control panel settings, and i thought wow this shit's magic i'm always using this now.

Attached: 71340_screenshots_2013-11-14_00002.jpg (1280x720, 363K)

I just set everything at max because i'm not poor. Imagine being frugal with your fps "i better not put shadow detail to max i wont be able to pay my electricity bill" stay poor.

>2013-11-14
jajajaja

nice dude bro science

But do you know how it works?
And what is a mip map?
That is an interesting question.

That subject is well researched and I think that kid is being sarcastic.

anybody who meets the only 3 requirements can run for president. Even some high school dropout could run. No party would represent him, but he could run.

>setting post processing effects to max
You won't be rich very long being such a brainlet.

usually better angled shadows, most noticeable on grass and shit

>motion blur
>DoF
Ok man you do you

It's not 100% free, see neogaf.com/threads/anisotropic-filtering-in-modern-game-engines-is-16x-af-still-considered-cheap.1359136/
Specifically a bit down for actual comparisons. Seems like there's about 6fps of difference. Personally, I don't generally notice the difference between 8x and 16x, so I'd take the extra frames.
>But do you know how it works?
Not him, but does it matter? It's some specific filtering to make angled textures less blurry. Same for mipmaps really, they're important and do shit internally, doesn't matter what specifically.

Supersampling works a bit differently than that.
Rendering something at a higher resolution, for example doing 4k on 1080p monitor will only make jaggies worse, MUCH worse.
It renders it at a higher resolution to smooth out colours.
The bigger resolution it renders it at originally the more accurate the colours are.
But it's a really shitty way to do anti-aliasing cause it's old and basic, and really outdated.

Attached: index.png (225x225, 4K)

what is ASSAO guys

like regular ass, but plumper. trust me, ive never had sex but i'm a respected contributor on the wikipedia page

YFW you realize that the higher res (i.e.: the more pixels) you go, the less AA is needed

at 4k there's no point to even go beyond x4 AA, there's literally no visual difference. I wouldn't even put it on x4, x2 is totally fine at 4k. AA exists for lower res, even at 1080p x4 is fine for most games and anything beyond has no benefit to visuals.

alright and comapred to HBAO+ ?

hbao to u 2 ;)

Adaptive SSAO, I'm guessing it's a less resource intensive version, or one that dynamically adjusts the severity at least to better manage resources.

>pokes ur belleh

>tfw MSAA doesnt work on modern games because most games use deferred rendering now so youre stuck with a shitty TAA/FXAA anti aliasing that just smears vaseline everywhere

Attached: 1485757467548.jpg (807x659, 37K)

pixel size and the numbers are important.
If you could somehow shrink a 4k monitor down to the physical size of a 1080p monitor, then you probably wouldn't need AA at all.

TAA is literally the best form of anti-aliasing shut your mouth
FXAA is garbage though and TAA should be backed up by a higher sample rate or upscaling instead

Most people probably don't know what DLSS does

Even worse now that AMD has its own application-agnostic thing.

Attached: nvidia dlss.jpg (1533x869, 447K)

TAA produces some nasty ass artifacts in many games

i havent kept track off it but isnt it basically checkerboarding?

If I was gaming president the first thing I would do would be the banning of Motion Blur, and make Subtitles be on by default.

kek
alright ty

my first act as Gaming President would be to delete this website.

In motion it's a lot better, and at least it doesn't eat up 20 - 40% of your FPS at all times. TAA makes things look smooth even at 1X and performance is kept high.

no motion blur is ok if it's done properly and not to mask low framerate

TAA is an alternative to motion blur, would you rather have that or a choppy animation?

subtitles are too distracting, I'm always reading ahead of the actor so it defeats the point of there being voice acting in the first place.

Based.

hmm yes let me just alt-tab to wikipedia to find out what all the fucking abbreviations mean in the fucking options page

like, why isnt there a 'just make it fucking look good' switch, instead of making some incel feel smarter than normies or whatever

If you are computer illiterate you are beneath me as a human being

This but also default invert y-axis
anarchist btw

thats why games have presets you dumb fuck. Just put shit on high/very high or ultra.

Something like that, it's AI based upscaling.
It'll theoretically produce a nicer image than traditional upscaling techniques such as checkerboarding by using stored samples of higher resolution images to fill in the missing pixels more accurately.

nigga what the fuck is you talking about

Not this election because he doesn’t want to interfere with Trumps 2nd term lmao

>subtitles are too distracting
I don't get it. If you know how to read they take a second to look at, while you're still able to play the game. Hell, your brain ends up reading them automatically, you don't even notice them.

Are you okay?

Ubisoft knows what's up, they have little image previews in the options menu to show what exactly each setting does.

>Turn on Anti aliasing
>play game
>turn off anti aliasing
>play game
>notice difference
bros....what happened....

thats a good idea

Stop being racist bigots Yea Forums, aliasing has a right to exist. You being all anti-aliasing is what led to the genocide of 60000000000 pixels. It is our fight to prevent you bigots from ever rising to power again. We will undermine your graphics cards and infiltrate your drivers and do what we can to take them down. Why don't you fight for us and wipe out the ray tracing that is taking our god-given place?

What I want to know is, why don't we just take the GPU power used for anti-aliasing and increase resolution instead?

The point of anti-aliasing is to reduce the appearance of blocky pixels, but increasing the resolution should do that too, no?

where can i download pictures of girls boobers

en.wikipedia.org/wiki/Temporal_anti-aliasing

Yea Forums

Anti aliasing in its most basic form takes edges and smooths out the serrations caused by pixels.

So in this picture you see the shield is blocky. If this were in a game, the white background would be see-through and invisible, so all you would see would be the black blocks. Anti-aliasing adds colors between the black and the background: it checks the color of the background and picks intermediate colors between the black shield and the background in an ongoing process.

Super sampling is another form of anti-aliasing that effectively runs the game at double your screen resolution, which looks really nice but is a huge resource hog. So if you run the game at 1920, super sampling effectively runs it at double that and scales it down to get rid of jaggies. If you are gonna use super sampling, try to use adaptive instead, since it won't destroy performance but still makes the jaggies smoother.

Attached: anti aliasing.jpg (474x546, 45K)
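The two-step process this post describes can be sketched in a few lines (hypothetical hard-edged "frame", nothing engine-specific): render a double-size frame with plain one-sample-per-pixel shading, then average each 2x2 block down to one output pixel.

```python
# Hypothetical sketch: supersample a hard black/white edge by rendering at 2x
# the output resolution, then box-filtering every 2x2 block to one pixel.

def render(w, h):
    # the "game" frame: one hard sample per pixel of the edge y < 0.5x
    return [[1 if y < 0.5 * x else 0 for x in range(w)] for y in range(h)]

def downscale_2x(img):
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, len(img[0]), 2)]
            for y in range(0, len(img), 2)]

big = render(8, 8)       # rendered at double the output resolution
out = downscale_2x(big)  # 4x4 output: edge pixels land on fractional greys
for r in out:
    print(r)
```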

but i hate that place

I'm talking about cutscenes mainly, but again, whats the point of voice acting if I already know what they are going to say.

yeah but increasing resolution cost more resources

>2019 and devs still focus on useless shit instead of focusing on draw distance and reducing pop in
>>''IT LOOKS SO PRETTY WITHIN A 10 FOOT RADIUS''

Attached: 1505280321255.gif (470x360, 2.34M)

TAA is not an alternative to motion blur and its lack (or the lack of motion blur) does not introduce choppy animations.

Congratulations, you've discovered anti-aliasing

GTA V's piss poor render distance on shadows and filtering is coming back to me.

Attached: 1334803112401.png (1000x1000, 550K)

>(artifacting you get when displaying higher resolution on lower resolution, aka jaggies)
That's not why jaggies exist though, rendering at a higher resolution is literally how they get rid of jaggies in some forms of AA

>why don't we just take the GPU power used for anti-aliasing and increase resolution instead?
That's literally how SSAA works

Anti aliasing is just blending adjacent pixels to dejaggy the picture

geforce.com/hardware/technology/txaa/technology
>TXAA is a new film–style anti–aliasing technique designed specifically to reduce temporal aliasing (crawling and flickering seen in motion when playing games).
Wut

This is why any anti-aliasing method that isn't MSAA is cancer.
>FXAA
>just makes everything blurry
>SSAA
>renders the entire image at multiple times the resolution, resulting in massive performance cost
>MSAA
>gets rid of any noticeable aliasing while only having a modest performance cost
I really don't understand why so many modern games have gotten rid of MSAA in favor of FXAA or SSAA.

anti-aliasing doesn't increase resolution, it just BLURS the pixels. you are simply playing a blurrier game when you use it.

Yes, it's designed specifically to reduce temporal aliasing (crawling and flickering seen in motion when playing games).
Not to be an alternative to motion blur.
Are you done with non sequiturs?

>anti-aliasing doesn't increase resolution
SSAA does.

SSAA literally increases resolution and then downsamples it to your monitor's native res.

It's the most brute-force version of AA and it works best, but makes games unplayable.

>The absoltue state of Yea Forums's computer literacy

>but increasing the resolution should do that too, no?
no it won't do that.
It will look even more blocky and jaggied if you do that.

It's definitely why jaggies exist. Just rendering at higher resolutions without any proper downsampling would also introduce aliasing. SSAA isn't just rendering at a higher resolution, the next step is to properly downscale it to minimize aliasing. Besides, I think you're assuming only 1 or 2 resolutions exist in a game, while nearly everything rendered has its own resolution.

when in doubt, blame it on Nvidia since they're probably the culprit.

DLSS is great until you see an object with thin details like a wire fence or a car's grille turn into a mess as you get farther away. But for the performance it gives I'm willing to tolerate it in games that I can't push to 4k@60fps at max.

I hope we can all agree that post process AA should always be turned off if possible. Need my games crispy yo

SMAA is better than no AA

Motion blur is often used to mask low framerate and temporal aliasing isn't it?

Yes. Consoles 101.

>Just rendering at higher resolutions without any proper downsampling would also introduce aliasing.
Yeah okay that's fair enough

It's because deferred rendering is the norm nowadays, which doesn't work well with MSAA.
en.wikipedia.org/wiki/Deferred_shading
stackoverflow.com/a/34982286

>substituting solid pixels for blurry pixels
I seriously hope you guys don't do this.

Why must there be 5 different types of AA? And why not just give a small explanation in the options menu to help understand it?

don't agree but also I play at a decently high res to begin with (1440p)

>Motion blur is often used to mask ... temporal aliasing isn't it?
No, the closest thing to temporal aliasing would be screen tearing, which motion blur does not fix.

>Motion blur is often used to mask low framerate
Yes, and TAA is not built to do that.
>and temporal aliasing
No, motion blur does not reduce crawling and flickering.

This thread is a perfect example of how retarded people on Yea Forums are.
Well maybe not retarded but they know fuck all about computer graphics and genuinely believe that supersampling is a good technique cause IT RENDERS IT AT A HIGHER RESOLUTION BIGGER NUMBERS = BETTER RIGHT???!1

Attached: 5e2.jpg (554x439, 38K)

>Yea Forums is completely tech illiterate
always a somber reminder of who I share this board with

>No, the closest thing to temporal aliasing would be screen tearing
What the fuck?

Attached: 1315598260938.jpg (526x300, 52K)

He could be president and he still wouldn't nearly be the most stupid president the US has ever had.

>Why must there be 5 different types of AA?
Different strengths and weaknesses, and different preferences. Some people are fine with TXAA, others prefer MSAA or nothing.

>And why not just give a small explanation in the options menu to help understand it?
Dipshit devs that don't understand how to UI. Or ones that don't care about PC and just include the options because they have to.

Because traditional MSAA is not compatible with modern graphics APIs, they don't use it because they cannot use it.

>bigger numbers aren't better
Get a load of this faggot.

Temporal resolution is framerate. Too much detail for your spatial resolution leads to aliasing; likewise, a framerate too high for your refresh rate leads to two frames landing in a single refresh (the temporal version). I'm thinking he meant temporal anti-aliasing, which is something completely different.

Wrong.
github.com/Microsoft/DirectXTK12/wiki/Simple-rendering#multisampling
vulkan-tutorial.com/Multisampling

It is though. It's the best option you have, though almost always not a valid option because of the cost. In older games it's probably a good choice.

That user had the right idea but wasn't exactly right. See

>like, why isnt there a 'just make it fucking look good' switch
You mean the "Low, Medium, High, Ultra" settings that every game has? Are you retarded?

Supersampling IS a good technique if you can run it.

That's a guy? Why the lipstick and earrings?

anti-aliasing is completely negligible

its one of those things you need a magnifying glass to notice

>Different strengths and weaknesses
seems like a load of bullshit to me. like if there's a fundamental, noticeable difference when using KRAZ over VOIT, why even have the option for VOIT in the first place

Not really.

Yeah well it isn't bullshit, it really is the state of affairs.

Because VOIT might be less blurry, or less resource intensive, while leaving more aliasing.

fpbp

I can't even begin to describe how low my opinion of ignorant console fags is. They should be shot.

>I still don't know what the fuck anisotropic filtering is except it should always be x16 because it has no real performance impact and looks much better.
Consoles have limited bandwidth, especially the PS4 since it uses GDDR5; that's why early this gen the PS4 lacked any AF in some cases compared to the Xbone version. While 16xAF is obviously the best, in motion there's not much difference between 4x and 16xAF.
On PC, AF is no problem though.

more and more i suspect 'pc gaming' on this website is just a fig leaf for 'i have the specific kind of add/autism/whatever that compels me to spend 1000000 hours learning exactly what all the individual switches and buttons do, instead of playing the damn game'

You type like a high school dropout

Imagine being proud of being ignorant

I learned 90% of the typical options menu with the first game I bought for PC. Amazing how little effort brainlets will give to something as simple as googling.

Absolutely wrong. It depends on the type of anti-aliasing, of course, but aliased graphics will always have the wrong colour information for the area each pixel covers. Higher resolution won't fix that an area which should contain 50% black and 50% white gets rendered as either black or white, when it should be grey.

When an object moves at a high speed and you film or render it at low fps (by low i mean not 2000FPS) it looks choppy.
This is called temporal aliasing.
TAA eliminates that by calculating a function of time of a dynamic object and then figuring out which pixels this function covers (and then applying colours).
It can also be used for each pixel which somewhat eliminates jaggies.
The downside is that TAA only works in motion. Objects will still have normal aliasing when they're still.
You can apply FXAA which is actually a really cool and smart anti-aliasing method and there ya go.
Modern AA in games and movies.
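For what it's worth, the TAA most engines actually ship is closer to jittered accumulation than to evaluating motion functions: nudge the sample point by a different sub-pixel offset each frame and blend the result into a history buffer. A toy sketch with made-up jitter and blend values:

```python
# Toy sketch, illustrative numbers only: one pixel watching a still edge at
# x = 2.3. Each frame we jitter the sample inside the pixel and blend it into
# a running history, so over time the pixel settles near the fractional
# coverage a supersampler would compute in a single frame.

def sample(px, jitter):
    return 1.0 if px + jitter < 2.3 else 0.0  # one hard sample

jitters = [0.125, 0.375, 0.625, 0.875]  # cycling sub-pixel offsets
alpha = 0.1                             # how fast new frames replace history

history = sample(2, jitters[0])
for frame in range(1, 200):
    current = sample(2, jitters[frame % 4])
    history = (1 - alpha) * history + alpha * current  # exponential blend

# only jitter 0.125 lands inside the edge, so true coverage is 1/4;
# history hovers around that (with some ripple from the repeating pattern)
print(round(history, 2))
```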

It doesn't take very long to google what Anti-Aliasing or Anisotropic Filtering does. My retarded 11-year-old self learned what they were as soon as they were introduced to Runescape back in 2008.

It's more "spend 10 minutes reading about a subject to know the best setting and never need to worry about it again".
Look at this thread: most here just pick blindly and move on, which is fine, but if you're paying $2000 for a computer it makes sense to do some research on stuff like AA.

>1996
>Ugh, so much aliasing at 640*480, good thing it'll be gone when we reach higher resolutions
>2006
>Ugh, so much aliasing at 1024*768, good thing it'll be gone when we reach higher resolutions
>2016
>Ugh, so much aliasing at 1920*1080, good thing it'll be gone when we reach higher resolutions
>2026
>Ugh,

The left is how my eyes work in real life though. It's more realistic.

blind retard

>his eyes don't have AF
Enjoy being literally disabled, breh.

So, motion blur can be used to counter temporal aliasing, unlike what I said previously?

The left isn't how the fucking ground works in real life though.

If you need 1000000 hours to learn what a few acronyms mean you might be retarded.

>You can apply FXAA which is actually a really cool and smart anti-aliasing method
ah yes I too prefer my monitor to be smothered in vaseline

Attached: image.jpg (600x400, 83K)

You're both idiots.
Anti-aliasing reduces aliasing, where pixels on the edge of an object become jagged because they don't exactly align with the grid of pixels being rendered.

Anisotropic filtering filters textures being rendered at an oblique angle with respect to the camera position, where parallel lines on the texture don't align with the grid of pixels, so they don't become blurry garbage as the texture fades into the distance.

Attached: Anisotropic_filtering_en.png (1300x615, 1.67M)

Sad thing is, I bet a high school dropout loser would be able to run the country better than any fucking establishment shithead they have now.

SSAO is a shitty way to emulate global illumination.
HBAO is an upgraded SSAO.
Here's the presentation if you're interested
developer.download.nvidia.com/presentations/2008/SIGGRAPH/HBAO_SIG08b.pdf

This "Ace Watkins" character doesn't look over 35.
I think I can still see pimples on his profile pic there.

trilinear is more realistic

I'm not talking about visuals, I'm talking about its technical side.
It's a very smart way to do AA.
Much smarter than MSAA and supersampling.

AA is only needed if you're playing below 1440p. Up to a 27 inch monitor there's no AA needed at 1440p. 4k is great for 30+ inches, but at such sizes you're mostly sitting too far away to notice anyway. If you're playing at 1080p in the range of 22+ inches, no amount of AA will save you.

Attached: 332.jpg (585x586, 72K)

When I played Pokemon snap on N64 years ago my mind was fucking destroyed by how realistic the graphics were. We are in the age right now of diminishing returns, where exponentially more effort is required to get even the smallest graphical boost. But I see that as a good thing; I predict that the next decade will play more with unique art styles, color palettes, and visual design or optimization to render more objects at the same time (for example large unit sizes in strategy games) as developers are forced to differentiate themselves from all their competitors and can't rely on just intense detail to do it.

The whole point of AA is to make visuals look better. If one form of AA (MSAA) results in a better-looking image than another form of AA (FXAA), then it's a smarter way to do AA.

more realistic is to use simulated materials like sand, dirt and stone and not triangles covered with textures.

>GTA V's piss poor render distance
i reinstalled it because i thought there was an issue with the game, but its literally made like that, fuck me

>1440p
Not an actual standard resolution.

It's a good compromise between quality and performance, using machine learning to upscale rather than the traditional supersampling method of over-rendering every frame.

The smaller the pixels are the better.

So?

You just gave me an idea: make some aliased as shit sculptures.

Let's see, you're European? Are you a member of the European union?

You're not saying anything meaningful.

I play 1440p and there are still noticeable jaggies. I still prefer to play with aa off though if there's only post process aa available

Maybe you haven't noticed, but a meme candidate is already in office. Everything is possible. Sucks for you that your country isn't free enough to have meme candidates.

I have a 24 inch 1440p monitor and it is absolutely noticeable when there's no AA

>Try to create a videogame
>Ten billion random abbreviations you need to learn
>Then you need to implement them unless you want your thing to look like dogshit
Shaders were a fucking mistake. Give me the fixed function pipeline back.

Attached: 1547305569827.png (480x480, 212K)

What size is your screen? because its all about PPI

>this thread
If anyone needed any proof that Yea Forums are retards who have no idea what they're talking about then here it is.

Not that guy, but neither are you. Did you even read his post? Where did he claim 1440p is a standard resolution? What the fuck are you trying to say?

It's not distance blur, moron.
You're obviously too young to remember the time when you couldn't even run Anisotropic filtering on most games because of the performance penalty, Bilinear filtering was the best you could often get, but most games initially were MIP mapped, where they actually made several different levels of detail for a given texture that was shown depending on view distance.

It looked like ass, there was literally always a sharp line a certain distance from your viewpoint on every texture.
Trilinear filtering blurred the line a bit better, but you still lost detail.

To add to this, AF and trilinear mostly work on mipmaps, which are smaller-resolution copies of a texture generated ahead of time. They're both a performance-saving measure and a fidelity-enhancing one, because at longer distances regular textures have too much detail, so they go from a nice pattern to highly pixelated shit that looks like a nest of ants when moving. Make a giant cube of cobble in Minecraft with the old texture and mipmaps & filtering off, then look at it and see it flip out.
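The chain itself is trivial to sketch (toy code, not how any driver actually builds it): keep averaging 2x2 blocks until one texel remains. A checkerboard, the classic ant-nest case, collapses to flat grey just one level down.

```python
# Toy mip-chain builder: every level halves the previous one by averaging
# each 2x2 block of texels into one.

def next_mip(tex):
    n = len(tex)
    return [[(tex[y][x] + tex[y][x + 1] + tex[y + 1][x] + tex[y + 1][x + 1]) / 4
             for x in range(0, n, 2)]
            for y in range(0, n, 2)]

tex = [[(x + y) % 2 for x in range(8)] for y in range(8)]  # 8x8 checkerboard
chain = [tex]
while len(chain[-1]) > 1:
    chain.append(next_mip(chain[-1]))

for level, t in enumerate(chain):
    print(f"mip {level}: {len(t)}x{len(t)}, texel[0][0] = {t[0][0]}")
# the shimmering checkerboard is already uniform 0.5 grey at mip 1
```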

Why does it matter if it's a standard resolution or not? You can still display games in it

>trilinear is more realistic
Are you retarded?
Does ground get more blurry the further away it is from you?
If so buy fucking glasses.

Creating games isn't for brainlets, user

Can't remember fully but its definitely not above 27 inches and pretty sure its below that

it's dumb that people just call any resolution [height]p when it's meant for TV resolutions and shit, just say 2560x1440
he's not contesting using the resolution, just what you call it

they're all bloat, gaming peaked with RPG maker 2003 anyway.

I have a PhD in computer science.

>Not an actual standard resolution.

Everyone here without some form of autism knows what 1440p means.

Yeah, if you're myopic in real life.

>Not that guy, but neither are you. Did you even read his post?
Yes.

>Where did he claim 1440p is a standard resolution?
Nowhere, didn't say he did. Did you even read my post?

>What the fuck are you trying to say?
That his post doesn't mean shit if he uses made up names.
If you just use names that don't mean anything, then you're not saying anything.

>You can still display games in it
Well no, because it technically doesn't exist. The standard name is QHD, or WQHD.

Okay? People generally know what you mean when you say you turned on aliasing (you meaning AA in that case), even though that's fucking wrong.

1440p refers to 2560x1440 though. That's pretty common knowledge. Obviously you know what resolution it refers to so why get so anal about it? Is this autism?

Not sure where the logic comes from in abandoning the naming scheme for 720p and 1080p. 1440p is the next step up from 1080p; it only makes sense to name it the same way.

You know exactly what people mean when they say 1440p though. And it's not like it's inaccurate, the 'p' still means something

en.wikipedia.org/wiki/Graphics_display_resolution#2560_×_1440_(QHD,_WQHD)

>QHD (Quad HD), WQHD (Wide Quad HD), or 1440p, is a display resolution of 2560×1440 pixels in a 16:9 aspect ratio.

Ironic calling it 1440p is more descriptive of what you're actually talking about.

>If you just use names that don't mean anything, then you're not saying anything.
But it does mean something, retard.

I assume that guy's autism is triggered because there's more than one resolution with 1440 vertical pixels.

>People generally know what you mean when you say you turned on aliasing

Literally nobody says this.

By your logic, 1080p is also incorrect and it should always be referred to as 1920x1080 so it doesn't get confused with the other 1080p resolutions.

To work in any industry you still need to learn shit.
No university will help you.

>1440p refers to 2560x1440 though.
No it doesn't, not in any standard.
There's no interlaced 2560x1440 for example, no need for the progressive abbreviation.

No it isn't, 4 times HD is plenty descriptive.

Not in any standard.

People definitely do. But fine, how about "I could care less".

>Not sure where the logic comes from in abandoning the naming scheme for 720p and 1080p.

They were only used in TV broadcast because there were other formats, such as 1080i that needed to differentiate.
HDTV can mean 1080p, 1080i or 720p, so if you're buying a new TV, what are you actually buying?

Well no, 1080p is a standard. In contrast to 1080i.

If all you have to show for your intelligence is a meme degree that's the equivalent of a construction worker for computers I've got bad news m8.

I refuse to believe someone can be this retarded, not the guy in the cap, I'm talking about you OP.

How is this shit not instantly noticeable? It's an approximation of the lack of lighting at points where ambient light fails to fill a space: think corners, or places where objects meet. It improves the sense of depth within a scene considerably. Ambient Occlusion itself isn't even new either; older games would bake it into their scenes, new games do it dynamically.
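If it helps, the idea in that post can be sketched in a few lines of toy Python. This is not how any engine actually does it; the depth values, kernel size and bias below are made up purely for illustration:

```python
# Toy screen-space ambient occlusion (SSAO) sketch: for each pixel,
# sample nearby depth-buffer values and treat samples that are closer
# to the camera than the pixel as occluders. The more occluders, the
# darker the ambient term, so corners and creases end up shaded.

def ssao(depth, radius=1, bias=0.05):
    h, w = len(depth), len(depth[0])
    ao = [[1.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            occluded = samples = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if dx == 0 and dy == 0:
                        continue
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        samples += 1
                        # neighbour nearer to the camera -> it occludes us
                        if depth[ny][nx] < depth[y][x] - bias:
                            occluded += 1
            if samples:
                ao[y][x] = 1.0 - occluded / samples
    return ao

# A wall meeting a floor: the far row sits "under" nearer geometry
# and picks up occlusion, while the flat wall stays fully lit.
floor_and_wall = [
    [0.2, 0.2, 0.2],   # wall (near)
    [0.5, 0.5, 0.5],   # corner
    [0.9, 0.9, 0.9],   # floor (far)
]
ao = ssao(floor_and_wall)
```

Real implementations sample a hemisphere around the surface normal in view space rather than a flat screen-space box, but the darken-where-geometry-crowds-in principle is the same.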

And you're any better, bashing on a cripple? At least he managed to post despite being blind, what the fuck have you done?

>HDTV can mean 1080p, 1080i or 720p, so if you're buying a new TV, what are you actually buying?
Anything with a higher resolution than 720p. But most 1080p sets will differentiate themselves with FHD, which doesn't apply to 1080i or 720p, since that attracts the "premium" users.

Dude, you probably need to be checked for autism. Assuming you haven't already been diagnosed

"p" is the to not confuse 1080p with 1080i.

en.m.wikipedia.org/wiki/Multisample_anti-aliasing

You can already see that it's almost non-existent at 4k. Soon enough we'll be at resolutions even higher than that.

it's not the bandwidth that's the issue on consoles.
most consoles are a single SoC with the CPU and GPU on the same die, so communication between them is really quick, often quicker than on PC, where it has to happen over PCIe lanes. GDDR5 is also incredibly fast.

the issue with consoles is the CPU and the system RAM. manufacturers have to take heat into account in a closed system and design a console that runs the same in any condition for any user, so they skimp on CPU power. developers work with a single system that always behaves the same. on PC it's different: the various APIs and varying hardware between the game and the system are offset by CPUs and GPUs that are multiple times more powerful than console hardware, so optimization is a moot point for most developers, the reasoning being "meh, it's strong enough". on consoles, however, the very limited CPU cycles have to be properly rationed across everything that's going on.
ever wonder why games are getting bigger so quickly?
consoles aren't strong enough to decompress assets on the fly like a PC, so to save the CPU cycles that decompression would eat, asset files simply aren't compressed at all. the same applies to the console's graphics unit with things like anti-aliasing, filtering and other effects.

Attached: boomer greg.jpg (1029x1000, 275K)
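The disk-space vs CPU-cycles trade-off that post describes can be sketched with Python's standard zlib module; the "asset" here is obviously fake and deliberately repetitive:

```python
import zlib

# Shipping assets compressed saves disk space, but every load then
# costs a decompression pass on the CPU; shipping them raw makes the
# install huge, but loading becomes little more than a memory copy.

asset = b"brick_texture_row " * 4096   # fake, highly repetitive asset data

compressed = zlib.compress(asset, level=9)

# The compressed copy is far smaller on disk...
ratio = len(compressed) / len(asset)

# ...but reading it back requires CPU work before it is usable,
# whereas the raw copy would be usable as-is.
restored = zlib.decompress(compressed)
assert restored == asset
```

Real shipping pipelines use faster codecs than zlib level 9 (or hardware decompression blocks on newer consoles), but the trade-off is the same shape.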

You can google that information yourself, you know. This pretty much confirms that twitter users are mostly lazy fucks who are too lazy to even search for the information they need.

>m
get out

It's no more a standard than 1440p. Both technically refer to a family of resolutions, not one specific resolution. But when you hear 1080p, you know it refers to the 16:9 resolution. So why should 1440p be treated any differently? Oh right, because you are literally autistic.

>Anything with a higher resolution than 720p

720p is still classified as HD. Full HD was still used for 1080p, but it's still easy to confuse with 1080i which is not technically "Full HD".

The 'p' stands for progressive scan, while the 'i' stands for interlaced. They're not just random fucking letters chosen arbitrarily

I'm at work off the clock waiting for someone to show so I can leave

I'm not even referring to i vs p. All I'm saying is there are several 1080p resolutions with a different number of horizontal pixels. So, if 1440p is "not a standard", then neither is 1080p.

You don't know what the p and the i refer to, do you?

>It's no more a standard than 1440p.
It absolutely is, stop talking shit.

>Both technically refer to a family of resolutions, not one specific resolution.
Not true, 1080p is 16:9.
Yeah that's what I meant, higher than or the same as 720p. My bad.

Did you know that the driving in that game was developed by a gay rattlesnake?

Attached: 619A7A1B-9845-4A27-989B-6CDA2635D30E.jpg (465x430, 28K)

8k is a meme.
Just look at the size and the price
amazon.es/UltraSharp-UP3218K-Display-Ultra-Plana/dp/B0727ZQ21F?SubscriptionId=AKIAIPHVZTVH6LZ5BFZA&tag=tech0ae4-21&linkCode=xm2&camp=2025&creative=165953&creativeASIN=B0727ZQ21F&ascsubtag=cbq-1353689129624292711-21

en.wikipedia.org/wiki/1080p
>1080p (1920×1080 px; also known as Full HD or FHD and BT.709) is a set of HDTV high-definition video modes characterized by 1,920 pixels displayed across the screen horizontally and 1,080 pixels down the screen vertically
Stop bringing up non 16:9 resolutions as if those are also 1080p.

Bleeding edge tech is expensive who would've thought.

>Spain

Fucking wetback

>Not true, 1080p is 16:9.

>implying 1440x1080 doesn't exist
>implying 2560x1080 doesn't exist
>implying 2160x1080 doesn't exist
>implying 1728x1080 doesn't exist

All of these are 1080p, you dumb fucking retard. I'll say it again, both 1080p and 1440p are a FAMILY of resolutions. Is any of this getting through that thick fucking skull of yours, you dense mongoloid?

>>implying 1440x1080 doesn't exist
>>implying 2560x1080 doesn't exist
>>implying 2160x1080 doesn't exist
>>implying 1728x1080 doesn't exist
No, just saying they're not part of the 1080p standard.

>All of these are 1080p
Nope. Provide the source that says they're part of the same standard.

>There's no interlaced 2560x1440 for example, no need for the progressive abbreviation.
Sure, but it's still accurate to call 2560x1440 1440p. It's a convenient shorthand. Nobody is going to say or type the full resolution when you can simply use the vertical resolution and still get the message across. Plus, by adding the p, people know you are referring to the resolution.

Attached: Screenshot (78).png (382x546, 18K)

>All of these are 1080p

No, they're not.

You're fucking wrong, just use the name, WQHD.

i hate all of you autistic faggots

I'll be sure to tell YouTube to rename their 1440p option to "WQHD", then. Sorry.

Go cry about it.

>there are several resolutions with 1080 vertical pixels
>there are several resolutions with 1440 vertical pixels
>1080p refers to the 16:9 resolution that has 1080 vertical pixels
>1440p DOESN'T refer to the 16:9 resolution that has 1440 vertical pixels

Autism.

Go try that, but google would most likely prefer to be retarded and pander to dipshits.

Only a fucking faggot loser little kid doesn't know what antialiasing is. Jesus christ, you losers play on PC and don't know what that is? God damn it I hate all of you faggots

Yes, though 1080p is named that for a specific reason that doesn't exist with WQHD.

>implying Youtube is a credible source when it comes to resolution standards

lmao

8k is a meme, but it helps drive the price of displays down and the power of graphics technology up. You could ignore 8k as a whole and still benefit from it.

>people still denote progressive with a p
I don't get this sheer retardation, we need to try and convince normies that it means pixels.

Okay, I'll just start using 1080i and call it 1080p then.

>google "WQHD"
>1440p comes up

hmmmmmmmmmmmmmmmmmmm

Attached: hmmmmmmmm.gif (1080x1080, 504K)

Are resolutions video games?

Anti aliasing makes the edges smoother to explain simply. So instead of jagged edges you'll have smooth ones. How you think it makes games looks bad makes my brain hurt. You might need glasses or eye surgery.

Yes, people are wrong on the internet. Besides, it comes up in the context of "also known as", which denotes that it's referring to people calling it that, not necessarily with a good reason.

wisegeek.com/what-is-1440p.htm

WQHD = 1440p

CO CO CO CO COMBO BREAKER

Attached: Screenshot_20190616-201115.png (1920x1080, 1.11M)

>not necessarily with a good reason

You know, conveying thoughts using words seems like a pretty god damn good reason to me because everyone and their fucking grandma knows that 1440p means 2560x1440. This shorthand only bothers you because you are literally autistic.

>720p
>1080p
>1440p
>2160p
logical smooth sailing

>HD
>Full HD
>QHD
>4k
arbitrary retard roller coaster

Wouldnt it be 4k UHD then 8k UHD?

i just wanna go back to simple AA instead of having all these retarded movie AA filters
>2x 4x 8x

If those numbers are so logical give me the next 3 sequential resolutions

lmao, that's not an actual source, that's a dipshit catering to retards.
We've been over this already.
>HD
>FHD
>Q(uad)HD
>UHD

real life has no subtitles you faggot
learn to hear

>provides no source for his claims
>y-your s-source is b-bad!

lmao, pathetic

>me reading this thread
y-yeah sure guys I get it
>puts preset to "very high"

Attached: 1561309001058.jpg (719x601, 49K)

I think its UHD for 4k and UHD2 for 8k

it's logical because it's obvious which resolution is bigger than the next

I provided a source already. I'm not going to repeat myself.

It’s worth knowing, there are effects you might want off purely for your tastes

Same, then i go into the advanced just to toggle motion blur, Chromatic Abortion and bloom off.

Imagine literally having autism

More games need Adaptive Resolution setting like Titanfall 2

Then point me to that post because it wasn't in a reply to me. Do you mean the Wikipedia article? That's your source? Will it shock you to know what the source I provided was from a source on Wikipedia? fucking lmao

No they fucking dont. Just turn your settings down

>Then point me to that post because it wasn't in a reply to me.
Learn to read please.

>I don't actually have a source

Okay then. I win. Sucks to be you lol

> FXAA + Sharpening

solved it for you

Nah, rather just turn down the sample rate in Realtek HD Audio from 192kHz to 48kHz

You're just putting words in my mouth now, shut the fuck up.

I'd rather have a setting that temporarily changes a setting from ultra to high to keep a stable fps instead of messing with the resolution

Attached: puke.webm (720x1276, 1.9M)

>old good
>new bad

Didn't RAGE do that, just very aggressively?

Someone fill me in what the difference is between HDR 10 and HDR 10+ and what the fuck is HLG?

>y-your source b-bad!
>b-but I don't actually h-have a s-source
>s-shut up!

lol, stay mad kid

Attached: 318271da980706f7a18a811c3456a77d.png (633x758, 16K)

>there are people ITT who don't know what resolution 1440p is

You all need to go back

a bunch of bullshit and who fucking cares

HLG is basically HDR that can be broadcast over traditional television means, which other forms of HDR cannot.

HDR10+ is a proprietary version of HDR, like Dolby Vision. The main gimmick is that it adds more data for TVs to use to adjust the image on playback.

Why is texture filtering even a setting when everyone just wants it at max since it costs 0 FPS? Why didn't PS3/360 games (usually) have much AF since, again, it doesn't cost any performance and gives very noticeably improved quality?

Attached: 1562263626163.png (404x404, 118K)

I learned what AA is with the game AA2

>it costs 0 FPS
It doesn't tho on low spec systems with low vram

turning down AF from 16x to 4x gave me 10 fps in RotTR

Writing shaders is fun.

upscaling with nearest neighbor would make something more visibly aliased at higher resolutions. If the internal resolution itself is cranked up higher, it wouldn't.

user, why do you think he said "this was the first time i ever manually enabled AF for a game [...] i'm always using this now."

4K used to be expensive too.

>not playing all your games on Very Low settings to own people on the internet
hahaha fuck you guys, bet you feel stupid now

>just filter all of it xD

>Why is texture filtering even a setting when everyone just wants it at max
PC Master Race Gaming is all about having a billion garbage options and one option that looks nice. like what if you WANTED your game to look like garbage?????

that's a huge amount. That is nowhere close to 'free'. That is like what I expect a setting with moderate to heavy impact to have

Are you actually autistic user

he was referring to you saying that about whoever wrote that article was pandering, you fucking mongoloid

Shut. The. Fuck. Up.
Who the fuck cares about muh shadows? Stupid fucks like you are why Sony makes movie like experiences instead of games. Always MUH GRAPHIX with you faggots. Back in my day, all you needed was a few pixels and a game could look gorgeous. Now every game looks the same, plays the same too. Devs focusing on the presentation of a game instead of how it plays, the game always ends up shit. How about you go outside and have fun with your precious AA, faggot. Leave my video games out of it.

God leftist """"memes"""" are so cringe.
What happened to this place these past few months?

Attached: 1562987721655m.jpg (1024x777, 99K)

Being president isn't about qualifications, it's about dedication to your fellow countrymen. Keeping your promises. Being strong and firm. Not backing down from other world leaders. Defending your land from invaders. Sadly, most people don't give a fuck about others, never mind their country. These same people get butthurt when someone with power acts like them.

t. seething poorfag

>Who the fuck cares about muh shadows?
Right here!

Attached: 1548063500632.jpg (462x500, 46K)

>Only the left could meme about gamer government.
Nice bait.

>Even some high school dropout could run. No party would represent him
One party already has.

Attached: My list of crimes is THIS big.jpg (700x700, 99K)

>volumetric clouds
>no difference in quality levels
wtf

>make Subtitles be on by default
Most games do that already.

This. The only time I turn subtitles on is if the voice actor is slurring their words or something and I can't understand them.

>I don't get it. If you know how to read they take a second to look at, while still being able to play the game.
And knowing what they're going to say before they say it takes a lot of impact out of their line delivery.

>Hell your brain end up doing it automatically you don't even notice them.
That's the *problem*. I don't like that my eyes are reflexively drawn to them even when they're not needed. It means I might miss something visual going on in the scene.

Or when they bear no resemblance to the actual spoken line.

Attached: That's not what I said.jpg (926x1024, 122K)

>Right here!

Attached: 99e4f635663da3d1f28853f631246771.gif (448x252, 1.83M)

Every fucking game uses TAA now because of consoles, so I always have to turn it off.

I'll take jaggies over fucking blur any day of the week

I noticed this really bad on MHW for pc

It's obviously a variation of the gamer joker meme and that is so fucking reddit

>upscaling with nearest neighbor would make something more visibly aliased at higher resolutions.
But it wouldn't introduce aliasing.

>If the internal resolution itself is cranked up higher, it wouldn't.
It would if you don't downscale properly.
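The point being argued can be sketched in a few lines of pure Python (illustrative only, single-channel "pixels"): nearest-neighbour upscaling just repeats existing pixels, so it introduces no new information and the jaggies merely get bigger, while a proper downscale should average source pixels (a box filter here) rather than drop them.

```python
def upscale_nearest(img, factor):
    # Each output pixel copies the nearest source pixel: no new values
    # appear, the staircase just gets scaled up.
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def downscale_box(img, factor):
    # Each output pixel averages a factor-by-factor block of the source,
    # instead of discarding samples (which is what causes aliasing).
    h, w = len(img) // factor, len(img[0]) // factor
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            block = [img[y * factor + dy][x * factor + dx]
                     for dy in range(factor) for dx in range(factor)]
            out[y][x] = sum(block) / len(block)
    return out

edge = [[0, 1],
        [1, 1]]
big = upscale_nearest(edge, 2)   # every value is still 0 or 1: hard edge preserved
small = downscale_box(big, 2)    # box filter recovers the original
```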

I want to fuck that cat

What is LOD fading?

Youre a closeted homosexual who projects his need for daddy's big cock into his bootlicker political views. Kys.

Probably at what range detail fades to preserve performance.

I'm guessing it's a gradual fade between different LOD (level of detail) models. They use different models for different distances, since you don't need high detail at long distances. Fading between them would make pop-in less noticeable.

A lot of games got it tied to texture quality, which makes sense.
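That guess can be sketched like this; the distances and fade band below are invented for illustration, not taken from any particular game:

```python
# Distance-based LOD selection with a cross-fade band: inside the band
# around each switch distance, both models would be drawn and blended
# so the swap never "pops".

LOD_DISTANCES = [0.0, 20.0, 50.0]   # distance at which each LOD starts
FADE_BAND = 5.0                     # metres over which two LODs blend

def pick_lod(distance):
    """Return (lod, next_lod, blend) for a given camera distance.

    blend is 0.0 while only the current LOD is visible and rises to
    1.0 as the next (coarser) LOD takes over.
    """
    lod = 0
    for i, start in enumerate(LOD_DISTANCES):
        if distance >= start:
            lod = i
    nxt = min(lod + 1, len(LOD_DISTANCES) - 1)
    if nxt == lod:
        return lod, lod, 0.0          # already at the coarsest LOD
    into_band = distance - (LOD_DISTANCES[nxt] - FADE_BAND)
    blend = max(0.0, min(1.0, into_band / FADE_BAND))
    return lod, nxt, blend
```

At 10 m you get LOD 0 with no blending; at 18 m (inside the 15-20 m band) you get a partial blend toward LOD 1; past 50 m you are fully on the coarsest model.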

SMAA is the only acceptable post-processing AA.

>defining a word with the same word

nigga you just had to say makes sharp edges look smooth thats it

I don't define a word with the same word mate.

>makes sharp edges look smooth
Doesn't explain it completely.

that you grandma?

*sips mountain dew* yeah i play with DX7

what terrible game would this give you a tangible performance improvement in

we found him, we found the person that bought an mcable

i care about shadows. 8192 is the minimum acceptable shadow map resolution.

That's just one thing anti-aliasing may prevent, and it's not like anti-aliasing should purposefully make sharp edges look smooth either, especially in the cases where you are supposed to be shown a sharp edge.
To keep it simple, aliasing removes information, anti-aliasing tries to reduce the effect of this information being removed or add the information back in.
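That "add the information back in" idea is easiest to see with supersampling. A toy sketch (illustrative only, not any engine's actual code): evaluate coverage of a diagonal edge on a sub-pixel grid, then average each grid down to one pixel, so edge pixels come out as partial grey instead of a hard stairstep.

```python
FACTOR = 4  # sub-samples per pixel, per axis

def coverage(px, py, size):
    """Fraction of pixel (px, py) covered by the half-plane y > x,
    estimated by sampling a FACTOR x FACTOR sub-grid inside the pixel."""
    hits = 0
    for sy in range(FACTOR):
        for sx in range(FACTOR):
            x = (px * FACTOR + sx + 0.5) / (size * FACTOR)
            y = (py * FACTOR + sy + 0.5) / (size * FACTOR)
            if y > x:
                hits += 1
    return hits / (FACTOR * FACTOR)

size = 4
# Supersampled: edge pixels get fractional coverage values.
antialiased = [[coverage(x, y, size) for x in range(size)] for y in range(size)]
# One sample per pixel: every pixel snaps to fully in or fully out,
# which is exactly the information loss (jaggies) being described.
aliased = [[1.0 if (y + 0.5) > (x + 0.5) else 0.0 for x in range(size)]
           for y in range(size)]
```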

>Keeping your promises.
wheres the wall, why isnt hillary in jail cletus

>acceptable post-processing AA

Attached: 1486177266587.jpg (307x352, 25K)

what

look at that tight bodied little slut

based

so consoles don't decompress their assets? that's interesting.

I remember when Titanfall was released and 50GB of the install was uncompressed audio so that low tier computers could run it better.

Ok but why tho? What does Anti aliasing have to do with the state of the USA

Yes, in real life you automatically focus on something. So why do you also need the game to focus on something too? This is why I hate things like DoF and motion blur. Your eyes are already not focusing on things that they are not fucking focusing on. Absolutely no need to blur them out when you aren't even looking at them directly.

And on top of it, the game has to assume that you're always looking at the same spot your character is, so if you do decide to look around independently you get an eye full of vaseline. Fuck these stupid options.

Depth of field isn't trying to simulate the eye focusing on things. It's trying to simulate depth of field as used in cinematography. In movies depth of field is used to reduce visual clutter and make you automatically focus on what the director wants you to look at. It adds a sense of depth to the shot.

Just like any effect it was used poorly in the early days and by some games nowadays but mostly it's pretty good now. Games hold off on adding any heavy DoF unless it's a fixed camera cutscene.

but video games aren't movies.

I know I was playing a few games semi-recently that had DoF and it was fucking distracting, having to constantly make my character look at the same thing I wanted to look at. Or god forbid I walk into a bush or something and have a perfectly crisp and clear leaf on my screen while everything else looks like a concussion.

might just be playing the wrong games though

AA is the very first meme that I turn off in any game. He has my vote.

Attached: acceptable.jpg (710x888, 177K)

>720p
>1080p
>1440p
>2160p
>logical smooth sailing
what about when you talk about 21:9 or 16:10 res? wouldn't those get confused with 16:9?

You are kinda an idiot too. Aliasing doesn't only happen at geometric edges. If you had another render of that scene without any anti-aliasing or filtering, you'd see aliasing that looks similar to pic related. Pretty much all anti-aliasing methods (excluding anything that specifically targets only jaggies) will deal with that. Supersampling deals with it nicely, especially at 8+ samples per pixel. A cheap way to deal with it is mipmaps, i.e. using downsampled, "blurrier" versions of the texture as distance increases, as you can see in the left side of your image. Anisotropic filtering is another form of anti-aliasing that specifically targets this texture aliasing: it doesn't replace mipmaps, it works on top of them by taking multiple mipmap samples along the direction the surface is slanted, so textures viewed at an angle stay sharp instead of turning into a blurry mess, as you can see in the right side of your image.

Attached: aliasing.jpg (300x300, 39K)
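The mipmap half of that post can be sketched in a few lines of Python (toy code, a single-channel 4x4 texture): each level averages 2x2 blocks of the previous one, so minified texture lookups read a pre-filtered level instead of skipping texels, which is what causes the shimmering in pic related.

```python
def next_mip(level):
    # Average each 2x2 block of the previous level into one texel.
    h, w = len(level) // 2, len(level[0]) // 2
    return [[(level[2*y][2*x] + level[2*y][2*x+1] +
              level[2*y+1][2*x] + level[2*y+1][2*x+1]) / 4.0
             for x in range(w)]
            for y in range(h)]

def build_mip_chain(texture):
    # Keep halving until a 1x1 level remains.
    chain = [texture]
    while len(chain[-1]) > 1 and len(chain[-1][0]) > 1:
        chain.append(next_mip(chain[-1]))
    return chain

# 4x4 checkerboard: the classic worst case for minification aliasing.
checker = [[float((x + y) % 2) for x in range(4)] for y in range(4)]
chain = build_mip_chain(checker)
# Every coarser level averages out to flat 0.5 grey, which is exactly
# the "blurrier versions with distance" behaviour described above.
```

Anisotropic filtering would then take several taps from these levels, stretched along the surface's viewing angle, rather than a single isotropic one.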