The real AA has disappeared.
I recall when many games shipped with a solid MSAA implementation; now you're lucky to find anything better than FXAA or TAA. I miss the days when MSAA just worked. Even modern MSAA implementations often lack transparency support, and forcing it through the control panel often doesn't help. =(
I'm a bit puzzled about "the real AA" and why some games don't include proper AA features. It seems modern high-budget titles often skip supporting "true" AA.
Edited November 29, 2016 by jkjoonas: autocorrected wrong word
It's frustrating how hard Nvidia pushes FXAA. Killing Floor 2 only includes FXAA, and it just makes everything blurry.
Essentially, any antialiasing introduces some blur. But perhaps you're right that games now rarely provide high-quality anti-aliasing. I think it's because many players are shifting to 2K and 4K screens, and at 4K the need for AA is almost nonexistent. High-quality AA also puts a lot of strain on your hardware. I'm not sure these numbers are accurate, but they make the point clearly. See the image here: http://i.imgur.com/9MEMXrT.jpg
That reasoning has been around PC gaming since the early days of high-resolution displays. If you want to play at 4K or on ultra-high settings, that's perfectly fine, but game developers know that not everyone uses 4K, so high-quality AA options are still valued by many players.
It's frustrating that we seem stuck on this debate. I haven't seen any examples of modern big-budget titles offering good antialiasing options, which makes your point less convincing; real examples would help more than just opinions. Also note that many modern AA methods are post-process effects applied after rendering, whereas supersampling works regardless of the game's settings. And what exactly are you hoping to get out of this conversation? If it's just opinions, then I might be misreading your concerns.
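To make the distinction concrete: post-process AA (FXAA and the like) just filters the finished image, while supersampling actually renders more pixels and averages them down, which is why it works no matter what the game does internally. Here's a minimal sketch of the downsampling step, assuming the frame is a NumPy RGB array; `ssaa_downsample` is just an illustrative helper name, not from any real engine:

```python
import numpy as np

def ssaa_downsample(img, factor=2):
    """Average factor x factor blocks of a supersampled frame down
    to the display resolution (a simple box filter)."""
    h, w, c = img.shape
    assert h % factor == 0 and w % factor == 0
    # Group pixels into factor x factor blocks, then average each block.
    return img.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# Example: a 4x4 checkerboard rendered at 2x resolution...
frame = np.zeros((4, 4, 3))
frame[::2, ::2] = 1.0
frame[1::2, 1::2] = 1.0
# ...averages to uniform gray at display resolution: each 2x2 block
# contains two white and two black pixels.
out = ssaa_downsample(frame, factor=2)  # shape (2, 2, 3), every value 0.5
```

Real SSAA/DSR does this on the GPU with better filters, but the idea is the same: edges get smoothed because every output pixel is an average of several real samples, not a guess from neighboring pixels.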
I mean the blur: with normal MSAA you can barely notice it, it doesn't stand out.