It doesn't take a genius to tell you that games can look very different on different PCs, depending on the hardware and game settings.

One of the settings with the most performance impact is Anti-Aliasing; but what exactly is it, and how does it work? Alt639 is here to educate us, so read on to become an expert...

A Guide to Anti-Aliasing

Anti-aliasing is a process that corrects the jagged edges that appear on geometry in games. There are two different families of anti-aliasing, which work in different ways: deferred and shader algorithms. Deferred variants work by rendering each frame at a higher resolution (or with extra samples per pixel), then downsampling the image to be displayed. Shader algorithms work by analyzing the finished frame and applying a blur filter over edges. Generally speaking, deferred techniques are graphically superior, but more hardware intensive. Now that we have covered the basics, let's take an in-depth look at the different techniques.

Deferred Anti-aliasing 

• Full scene anti-aliasing (FSAA): Also commonly known as super sample anti-aliasing (SSAA), this technique simply renders each frame at a much higher resolution, then downsamples it to the display resolution. What you get is a much softer, more realistic scene, but, depending on your hardware, it also comes with a steep performance hit. By its nature, FSAA is the "best" choice for eliminating jagged edges, because every pixel of the scene is super sampled. It also anti-aliases the alpha, or transparency, channels of textures. This was the first type of anti-aliasing used, but it is now only viable in modern games for those with flagship GPUs, like the Radeon HD 7950/7970 or GTX 670/680.
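To make the downsampling step concrete, here is a minimal Python sketch; the function name and the toy 4x4 "frame" are my own illustrations, not from any real renderer. A scene rendered at twice the target resolution is reduced by averaging each 2x2 block of samples into one display pixel:

```python
import numpy as np

def downsample_2x(hires):
    """Box-filter a 2x-supersampled frame down to display resolution.

    hires: float array of shape (2H, 2W), one brightness value per sample.
    Each output pixel is the mean of the four samples that cover it,
    which turns a hard stair-stepped edge into a smooth gradient.
    """
    h, w = hires.shape
    return hires.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hard diagonal edge rendered at 2x resolution (1 = lit, 0 = dark).
hires = np.array([[1, 1, 1, 0],
                  [1, 1, 0, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 0]], dtype=float)

lores = downsample_2x(hires)
# Pixels the edge passes through land on intermediate grey levels
# (0.25 here) instead of snapping to pure black or white.
```

The cost is just as concrete: at 2x2 supersampling the GPU shades four times as many pixels, which is exactly why FSAA is so heavy.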

• Multi sample anti-aliasing (MSAA): A derivative of FSAA, this technique is optimized for performance by only super sampling the depth and coverage values associated with geometry. So, edges are super sampled, but nothing else is, which sacrifices the softened texture detail that FSAA confers. MSAA also does not anti-alias alpha channels by itself, so it must be paired with a form of transparency multi sampling if one wishes to anti-alias transparent textures. For Nvidia cards, this takes the form of transparency super sampling (TrSSAA), which specifically targets alpha-tested textures.
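A tiny Python sketch of what "super sampling only the coverage" means; everything here is a toy assumption of mine (the half-plane y < x standing in for a triangle edge, the sample offsets, the function names), not any GPU's actual pipeline. The pixel is shaded once, but coverage is tested at four sub-pixel positions and the colour is weighted by the covered fraction:

```python
def inside(x, y):
    """Toy geometry: the half-plane y < x stands in for a triangle edge."""
    return y < x

# Rotated-grid sub-pixel offsets, in the spirit of a 4x MSAA pattern.
OFFSETS = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def msaa_pixel(px, py, shade=1.0):
    """Shade the pixel once, then weight the result by the fraction of
    its sub-samples the geometry covers -- the heart of MSAA's saving."""
    covered = sum(inside(px + ox, py + oy) for ox, oy in OFFSETS)
    return shade * covered / len(OFFSETS)

# A row of pixels crossing the edge: interior pixels stay pure 0 or 1,
# while the pixel the edge passes through gets a blended 0.5.
row = [msaa_pixel(x, 2.0) for x in range(5)]
```

Because `shade` is computed once per pixel rather than once per sample, only the coverage test is multiplied, which is why MSAA is so much cheaper than FSAA.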

• Adaptive multi sample anti-aliasing (AMSAA): AMD graphics cards benefit from this technique, which combines traditional MSAA with transparency multi sampling, without a particularly noticeable effect on performance. The downside is that, occasionally, visual artifacts occur. DirectX 11 games, like Far Cry 3 and Battlefield 3, don't seem to suffer from this issue, but some DirectX 9 titles, most notably Skyrim, do encounter strange errors.

• Custom filter anti-aliasing (CFAA): Not exactly a form of anti-aliasing in itself, CFAA is another AMD-exclusive technology that changes the way MSAA samples are resolved into the final scene. In earlier releases of the Catalyst drivers, users could select from four different filters: box, narrow tent, wide tent, and edge detect. As of the 13.6 Beta, only standard (box) and edge detect remain. These filters change the radius over which samples are gathered, which effectively equates to more samples per pixel: box is no increase, narrow tent is a small increase, and wide tent is a larger one. The edge detect filter is a notable exception; it applies an algorithm that helps to, well, detect more edges, along with a sampling increase.
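The difference between the filters can be sketched in one dimension. This is a rough illustration under my own assumptions (two samples per pixel, 1-2-1 tent weights, made-up function names), not AMD's actual resolve: a box filter averages only a pixel's own MSAA samples, while a tent filter also gathers its neighbours' samples, weighted down with distance.

```python
def box_resolve(samples, i):
    """Box filter: average only pixel i's own samples."""
    s = samples[i]
    return sum(s) / len(s)

def tent_resolve(samples, i, radius=1):
    """Tent filter: also pull in neighbouring pixels' samples, weighted
    down with distance -- effectively more samples per resolved pixel."""
    total = weight_sum = 0.0
    for d in range(-radius, radius + 1):
        j = min(max(i + d, 0), len(samples) - 1)   # clamp at the borders
        w = radius + 1 - abs(d)                    # 1-2-1 weights for radius=1
        total += w * sum(samples[j]) / len(samples[j])
        weight_sum += w
    return total / weight_sum

# A 1-D edge between dark (0) and lit (1) pixels, 2 samples per pixel.
samples = [[0, 0], [0, 1], [1, 1], [1, 1]]
# Box leaves pixel 0 pure black; the tent spreads the edge one pixel
# wider, trading a little sharpness for a smoother transition.
```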

That's about it for deferred techniques. Keep in mind that performance can vary wildly between games using these types of anti-aliasing. Far Cry 3 becomes unplayable with more than 4x MSAA, while Left 4 Dead 2 stays above 60 FPS with 12x edge detect SSAA. Naturally, the complexity of the scene has a large effect on these results.

Shader Algorithms 

• Fast approximate anti-aliasing (FXAA): Perhaps the most well known shader algorithm, FXAA is an Nvidia technology that works in an extremely simple fashion. The algorithm searches for high-contrast edges in the finished frame, then blurs across them to smooth the jaggies. It also works on alpha textures, as do all the shader techniques, and where MSAA causes a sharp decline in performance, FXAA has almost no noticeable impact. However, texture detail is not as sharp when using FXAA, because the blur cannot fully distinguish geometry edges from detail within textures. It's great for those who don't mind sacrificing a bit of image quality.
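As a toy illustration of the idea in Python (the threshold value, function name, and the uniform 3x3 blur are my own simplifications; real FXAA estimates the edge direction and blends along it with carefully chosen sub-pixel weights), a post-process pass might look like this:

```python
import numpy as np

def post_process_aa(img, threshold=0.25):
    """Toy FXAA-style pass: find pixels whose local contrast exceeds a
    threshold, then blend them with their 3x3 neighbourhood average."""
    padded = np.pad(img, 1, mode="edge")
    out = img.copy()
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            n = padded[y:y + 3, x:x + 3]       # 3x3 neighbourhood
            if n.max() - n.min() > threshold:  # contrast test: is this an edge?
                out[y, x] = n.mean()           # blur only where an edge was found
    return out

img = np.zeros((4, 4))
img[:, 2:] = 1.0                  # a hard vertical edge
smoothed = post_process_aa(img)
# The hard 0 -> 1 step becomes a 0, 1/3, 2/3, 1 ramp across the edge,
# while flat regions are left untouched -- hence the tiny cost.
```

Notice the trade-off the bullet describes: the contrast test fires on any high-contrast detail, so sharp texture features get blurred right along with geometry edges.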

• Temporal anti-aliasing (TXAA): TXAA actually combines MSAA with a temporal filter, which means you can achieve quality above 8x MSAA with the performance impact of roughly 4x MSAA. It also doesn't blur the scene as much as FXAA does. It is another Nvidia exclusive, and carries the largest performance hit of all the shader techniques. Interestingly enough, temporal anti-aliasing was first used by ATI (later AMD) with the release of the X800 graphics card, but was abandoned because it became problematic with newer games.
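The temporal half of the idea can be sketched with a simple exponential blend in Python; this is only my own minimal model (the blend factor and the flickering-coverage setup are illustrative), and real TXAA additionally resolves MSAA samples and reprojects the history buffer under camera motion.

```python
def temporal_accumulate(history, current, blend=0.1):
    """Blend the new frame into a running history buffer. Jittering the
    camera by a sub-pixel offset each frame means this running average
    sees many different sample positions over time."""
    return [(1 - blend) * h + blend * c for h, c in zip(history, current)]

# A one-pixel "frame": an edge pixel whose coverage flickers between
# fully lit and fully dark as the sub-pixel jitter moves the edge.
history = [0.0]
for frame in range(200):
    coverage = 1.0 if frame % 2 == 0 else 0.0
    history = temporal_accumulate(history, [coverage])
# After enough frames the pixel settles near its true 50% coverage,
# for the cost of a single blend per pixel per frame.
```

The same blending is also why temporal techniques can smear or ghost on fast-moving objects: stale history gets mixed into the new frame.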

• Morphological anti-aliasing (MLAA): MLAA, a technique originally published by Intel researchers and offered exclusively in AMD's drivers, works much the same as the other shader techniques, by finding edges and adding post-process blur. However, AMD has tuned it to produce results similar to SSAA with a performance impact close to 2x MSAA. It also doesn't blur texture detail as much as FXAA does.

• Sub-pixel morphological anti-aliasing (SMAA): SMAA was independently developed as an even cleaner refinement of MLAA. It works in a similar fashion, but without any noticeable blurring or performance decrease. The injectSMAA tool has become a favorite of many gamers for its versatility. Pure SMAA is a single-sample, purely post-process pass, whereas Crysis 3 combines several different techniques under the same name: its 2x SMAA adds a temporal (frame-to-frame) component to the basic pass, and 4x SMAA is 2x SMAA along with 2x MSAA.

• Sub-pixel reconstruction anti-aliasing (SRAA): Nvidia has been working on a new form of anti-aliasing that works similarly to MLAA, but approaches quality comparable to 16x SSAA. It is also designed to render faster than the other forms of anti-aliasing, meaning it could deliver excellent image quality for very little performance cost once it is finalized. If you'd like to read more about it, Nvidia has a PDF on the subject here.

I'd also like to quickly address a glaring misconception. Many people believe that anti-aliasing has no benefit at 1920x1080 or above, but this is not true. As you can see in the following image, without anti-aliasing there are a significant number of jagged edges even on my 22" 1080p monitor. As seen in the images under each heading, the jagged edges are almost completely removed once anti-aliasing is applied. Until WQHD and 4K resolutions arrive in smaller form factors, anti-aliasing will have a very noticeable effect on image quality.

So how educational did you find it? Do you guys now feel more comfortable chatting about anti-aliasing? Tell us below!