Yes, VXGI lighting can enhance a game's visuals, but so far very few games have actually incorporated it.
Nvidia has been promoting its VXGI lighting system since the 900 series launched, but from my understanding almost no games have adopted it; the only implementations seem to be Nvidia's own demos, like the Apollo 11 moon landing showcase. Is this a gap in implementation, or just slow adoption?
It's included in GameWorks' VisualFX suite, but currently few projects are leveraging it.
It's not the sole factor. Nvidia's technology rarely sees wide adoption because they rarely make it available to everyone; other players open up their innovations while Nvidia keeps theirs locked down. AMD, for example, opened up their Mantle API, which is already quite impressive, so anyone can use it: Nvidia could adopt it themselves if they wanted, and Intel has confirmed plans to integrate it into their iGPUs, which would be significant. FreeSync is an open standard, so it works with any compatible display as long as the hardware supports it. SMAA is a free anti-aliasing technique that looks good and has minimal performance impact compared to the alternatives, which makes it easy to add to most modern games. Despite those benefits, it remains underused because nobody markets it. Nvidia develops strong technology, but they are cautious about incorporating outside innovations and particularly protective of their own.
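To make the "easy to add" point about SMAA concrete, here's a rough C++ sketch of where it sits in a frame. SMAA (Jimenez et al.) is a pure post-process, three fullscreen passes over the finished image; the Texture type and runFullscreenPass helper below are hypothetical engine glue, not a real API:

    #include <cstdio>

    struct Texture { const char* tag; };  // stand-in for a render target

    // Hypothetical engine helper: draws one fullscreen triangle with the
    // named shader. Stubbed out here so the sketch compiles and runs.
    Texture runFullscreenPass(const char* shader, Texture, Texture = Texture{""}) {
        std::printf("pass: %s\n", shader);
        return Texture{shader};
    }

    Texture applySMAA(Texture sceneColor) {
        // Three passes, per the SMAA reference implementation:
        Texture edges   = runFullscreenPass("SMAAEdgeDetection", sceneColor);
        Texture weights = runFullscreenPass("SMAABlendingWeights", edges);
        return runFullscreenPass("SMAANeighborhoodBlend", sceneColor, weights);
    }

    int main() {
        applySMAA(Texture{"sceneColor"});  // no scene or geometry hooks needed
    }

Because it only needs the final color buffer, there's nothing vendor-specific about it, which is exactly why it can run everywhere.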
VXGI imposes an excessive performance burden at this stage, which makes it unsuitable for gaming right now. Even on a GTX 980 with a scene of only moderate complexity, the frame rate was poor. Picture a game on the scale of Assassin's Creed or Shadow of Mordor; turning it on would tank the frame rate immediately. Nvidia added hardware support for it with Maxwell, but I question whether any game will ship with it before Pascal, if ever. Developers will keep finding other ways to improve their lighting, but VXGI itself is unlikely to be adopted any time soon.
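For anyone wondering where that cost comes from, below is a minimal C++ sketch of the voxel cone tracing idea VXGI is built on. The grid, constants, and fake radiance lookup are invented for illustration (this is not Nvidia's implementation); the point is that every shaded pixel marches several cones through a mip-mapped 3D texture, on top of a per-frame voxelization pass:

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    Vec3 mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

    // Stand-in for the mip-mapped 3D radiance texture a voxelization
    // pass would produce; real code samples hardware 3D mips instead.
    struct VoxelGrid {
        int size = 64;  // 64^3 base resolution
        float sample(Vec3 p, float lod) const {
            // Fake lookup: brighter radiance near the grid center.
            float d = std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
            return std::max(0.0f, 1.0f - d) / (1.0f + lod);
        }
    };

    // March one cone: the step size and mip level grow with distance,
    // so far-away light is read from coarse, pre-filtered voxels.
    float traceCone(const VoxelGrid& g, Vec3 origin, Vec3 dir, float halfAngle) {
        float radiance = 0.0f, occlusion = 0.0f, t = 0.05f;
        while (t < 2.0f && occlusion < 1.0f) {
            float radius = t * std::tan(halfAngle);            // cone footprint
            float lod = std::log2(std::max(1.0f, radius * g.size));
            Vec3 p = add(origin, mul(dir, t));
            float s = g.sample(p, lod);
            radiance  += (1.0f - occlusion) * s;               // front-to-back blend
            occlusion += (1.0f - occlusion) * s * 0.3f;
            t += radius + 0.01f;                               // widening steps
        }
        return radiance;
    }

    int main() {
        VoxelGrid grid;
        // A handful of cones per pixel approximates the hemisphere.
        Vec3 dirs[5] = {{0, 1, 0}, {0.7f, 0.7f, 0}, {-0.7f, 0.7f, 0},
                        {0, 0.7f, 0.7f}, {0, 0.7f, -0.7f}};
        float gi = 0.0f;
        for (Vec3 d : dirs) gi += traceCone(grid, {0, 0, 0}, d, 0.3f);
        std::printf("indirect light for one pixel: %f\n", gi / 5.0f);
        // Multiply that loop by ~2 million pixels per frame, plus the
        // voxelization pass itself, and the GTX 980 numbers add up.
    }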
If a feature can't run smoothly on the current consoles, it's unlikely to appear in many PC titles soon. And if it doesn't even perform well on a high-end Nvidia graphics card, don't count on seeing it in an Nvidia GameWorks title in the near future either. Developers are still wrestling with issues like integrating Nvidia Hairworks, which technically runs on both vendors' cards but rarely performs consistently on either. What we actually get on PC is higher-resolution textures, sharper rendering resolutions, improved lighting and shadows, better physics, longer view distances, less visual noise, and the option to push anti-aliasing to the maximum if the GPU can handle it. The proprietary extras usually fall short, delivering subpar results that developers were paid to include. Occasionally one works well, but that's not something you can rely on. I never even enabled TXAA on my GTX 770 because I expected a poor experience. TressFX performed badly at Tomb Raider's launch, though it seems to have improved since, and we'll see how it does in future titles. To be honest, I'm surprised Nvidia Hairworks made it into The Witcher 3. I'd be even more surprised if it runs smoothly and is actually worth the performance cost compared to other settings like supersampling, ubersampling, or downsampling. PhysX? Exactly one game impressed me with it, Planetside 2, and Sony pulled the feature after the PS4 release. After that, I bought an R9 290 on sale.
Nvidia's game tech takes a long time to be production-ready. Hairworks was available for years before it saw even limited implementation in games, and VXGI won't appear in games for another few years at least. On the API side, DX12 looks like the stronger choice: like Mantle, it can handle on the order of 100,000 draw calls per frame, and while OpenGL can reportedly push around 150,000, its support on Windows is still limited.
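To put those draw-call figures in perspective, here's a back-of-envelope C++ sketch. The per-call costs are assumptions chosen to land near the commonly quoted numbers, not measured driver figures; the point is just that a thinner API stretches the same CPU frame budget across roughly ten times as many calls:

    #include <cstdio>

    int main() {
        const double frameBudgetMs = 16.6;    // one frame at 60 fps
        // Assumed CPU cost per draw call on the submission thread:
        const double dx11CostMs = 0.0016;     // ~1.6 us/call (illustrative)
        const double thinCostMs = 0.00016;    // ~10x cheaper (illustrative)

        std::printf("calls per frame, DX11-style API:        %.0f\n",
                    frameBudgetMs / dx11CostMs);   // ~10,000
        std::printf("calls per frame, Mantle/DX12-style API: %.0f\n",
                    frameBudgetMs / thinCostMs);   // ~100,000
    }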