Developers simulate the artifacts of low-quality cameras to create more realistic and immersive experiences for players.
Chromatic aberration is an artifact that should never appear in a good photo, yet simulating it in games can enhance the visuals. Visual noise likewise mimics the artifacts of a cheap camera sensor, and lens flare can work well when applied sparingly to sell bright elements like the sun. Done well, lens flare looks great, but it becomes distracting when artifacts smear across the screen, and the same goes for bloom. If games aim to immerse the player, shouldn't they replicate how the human eye works rather than a flawed camera?
I'm in full agreement, though I feel an important point is being overlooked. That's what I was getting at, @Sakkura. Basically, it's about making the game more cinematic, like being inside a movie with well-staged, dramatic shots. Chromatic aberration isn't just about bad cameras; it's a trend in game development that will hopefully fade eventually. Just like dirt, water, or blood on the lens, these effects come from film, and they should be adjustable in the game (though I still want the option to disable them outright). Bloom, lens flares, motion blur: all of these are borrowed from cinema, and each should have a slider for how much of it you want in the game.
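To make the slider idea concrete, here's a rough sketch of what an adjustable chromatic aberration pass could look like, done offline in Python with Pillow rather than in any particular engine's API; the function name, the screenshot.png input, and the 8-pixel maximum offset are placeholders I picked for illustration. The effect is nothing more than nudging the red and blue channels apart, and the slider value just scales that offset down to zero.

# Minimal sketch (not any engine's actual API): chromatic aberration as a
# post-process whose strength is driven by a 0.0-1.0 "slider" value.
from PIL import Image, ImageChops

def apply_chromatic_aberration(frame: Image.Image, intensity: float) -> Image.Image:
    """intensity is the 0.0-1.0 slider value; 0.0 returns the frame untouched."""
    if intensity <= 0.0:
        return frame
    # The per-channel offset grows with the slider; a few pixels is already very visible.
    offset = int(round(8 * intensity))
    r, g, b = frame.convert("RGB").split()
    # Shift red left and blue right so the channels split at high-contrast edges.
    # (ImageChops.offset wraps around at the border, which is fine for a sketch.)
    r = ImageChops.offset(r, -offset, 0)
    b = ImageChops.offset(b, offset, 0)
    return Image.merge("RGB", (r, g, b))

if __name__ == "__main__":
    frame = Image.open("screenshot.png")  # hypothetical input frame
    apply_chromatic_aberration(frame, 0.2).save("screenshot_ca_subtle.png")
    apply_chromatic_aberration(frame, 1.0).save("screenshot_ca_heavy.png")

In a real engine this would live in a fragment shader sampling the scene texture with per-channel UV offsets instead, but the point stands: the whole effect boils down to one offset term, so exposing a slider (or a zero setting) costs the developer almost nothing.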
Because it has turned into a trend, plenty of studios feel compelled to chase that visual style rather than a genuine one, which leads straight to the result described above. It's a cheap way to give a broad audience the impression of realism.