No one has developed foveated rendering for desktop yet.
The main idea seems clear: VR games could use foveated rendering if it worked well. On desktop, a decent webcam and affordable software can already track eye movements, so why not use that for foveated effects? There would be drawbacks, of course, such as streams or recordings looking degraded to viewers whose gaze doesn't match yours, but it could still help users on lower or mid-range systems run games at smoother settings.
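To make the idea concrete, here is a minimal sketch of the core loop: pick a coarser shading rate for screen tiles the farther they sit from the tracked gaze point. The function name, thresholds, and pixel-pitch value are illustrative assumptions, not values from any real foveated-rendering implementation.

```python
import math

def shading_rate(gaze_px, tile_center_px, screen_dist_mm, pixel_pitch_mm=0.25):
    """Return an (x, y) shading-rate divisor for a screen tile.

    Assumed setup: gaze and tile positions in pixels, viewing distance
    and pixel pitch in millimetres. Thresholds are rough guesses.
    """
    dx = tile_center_px[0] - gaze_px[0]
    dy = tile_center_px[1] - gaze_px[1]
    dist_mm = math.hypot(dx, dy) * pixel_pitch_mm
    # Visual eccentricity of the tile from the gaze point, in degrees.
    ecc_deg = math.degrees(math.atan2(dist_mm, screen_dist_mm))
    if ecc_deg < 5:      # roughly foveal: full resolution
        return (1, 1)
    elif ecc_deg < 15:   # near periphery: half rate
        return (2, 2)
    else:                # far periphery: quarter rate
        return (4, 4)
```

In a real renderer these divisors would feed something like variable rate shading rather than a per-tile Python call, but the mapping from eccentricity to detail level is the whole trick.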
You usually sit farther from a desktop monitor than from a headset display, so the screen subtends a smaller visual angle and more of it falls near your fovea at once. Cutting resolution in the periphery would therefore offer less benefit. On top of that, the distance and angle between your eyes and the screen change constantly, unlike in a VR headset where alignment is fixed, which makes precise tracking harder. Calibration seems like the key challenge. We're currently testing a Vision Pro for this purpose, where the cursor must align with your gaze; without proper calibration it simply doesn't work. Now picture that calibration applied to a monitor, where head position and viewing angle aren't constant.
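A back-of-envelope calculation shows why the distance argument matters: a fixed ~5° fovea covers a larger fraction of a narrow desktop field of view than of a wide headset one. The field-of-view figures below are rough assumptions (a 24" monitor at ~60 cm subtends roughly 45° horizontally; a typical headset shows on the order of 100° per eye).

```python
def foveal_fraction(fov_horizontal_deg, foveal_deg=5.0):
    """Fraction of the horizontal field of view inside a +/- foveal_deg cone."""
    return (2 * foveal_deg) / fov_horizontal_deg

desktop = foveal_fraction(45)   # ~0.22 of the monitor's width stays sharp
headset = foveal_fraction(100)  # ~0.10 for the headset
```

So roughly twice the share of the desktop image has to stay at full quality, which eats directly into the savings foveation is supposed to buy.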
this kind of setup usually means a budget graphics card, which is already a struggle for demanding games. it also tends to mean a smaller display, and the software needed for accurate webcam eye tracking can itself demand more processing power than the system has to spare.
Screen distance and field of view were my first concern too, but I wonder whether they're really the main obstacle. Even if it's feasible, it probably won't be easy or cheap. My instinct says this belongs in VR, but tracking eye movements on a monitor might be more manageable than expected.
That was the first thought that crossed my mind too, though remembering my first encounter with the Vision Pro made me question whether it's really a blocker. There are commercial options (such as the Tobii Eye Tracker 5), but I have little insight into their performance; even their marketing video shows noticeable latency, which suggests they may not be fast enough for foveated rendering. And as @manikyath noted, they aren't particularly affordable. At around 280 € for the tracker alone, it seems more sensible to put that money toward a faster GPU instead.
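Latency matters more than it might seem, because the eye can move very fast during a saccade. A quick sketch of the padding problem, with assumed numbers (eye velocities during saccades can reach several hundred degrees per second; 300°/s is a conservative round figure here):

```python
def padding_deg(latency_ms, eye_velocity_deg_s=300.0):
    """Extra foveal radius (degrees) needed to cover gaze movement
    that happens during the tracker's end-to-end latency.

    Assumed saccade velocity of 300 deg/s; real peaks can be higher.
    """
    return eye_velocity_deg_s * latency_ms / 1000.0

padding_deg(50)  # 15.0 deg of slack needed at 50 ms latency
padding_deg(10)  # 3.0 deg at 10 ms
```

At 50 ms of tracker-to-frame latency the sharp region would have to be padded so much that, combined with the desktop field-of-view numbers above 20°, there is little left to degrade; headset-class latency is what makes the technique pay off.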
This approach seems to have clear drawbacks. Eye tracking itself works, but on a monitor the region that still needs full-quality rendering is much larger. A dedicated headset that handles all of this natively would be far superior. Studios should focus on optimizing their games rather than leaning on upscaling or low-quality assets; otherwise this just adds problems without real improvement.