Origins with a virtual machine included.
I was playing Assassin's Creed Origins when I noticed my CPU fan running at maximum speed. I checked the temperatures and found the CPU at 90°C; in Task Manager, CPU usage was at 100%. I've played numerous AAA games on the same PC, but Assassin's Creed Origins is the only one that consistently uses full CPU power, even in the menu. I looked it up on the Steam forums and discovered the game runs Denuvo, but I've played many other Denuvo titles without seeing usage this high. Later, I found out it may also use VMProtect, meaning parts of the game run inside a virtual machine. I'm worried about potential CPU damage from the heat, so I requested a refund; I've only played about 1.7 hours so far. I also tried other AAA games with Denuvo, and they showed much lower CPU usage (30-40%). In the end I closed the game after it spiked to 100% CPU again.
The thing is, the game shouldn't consume excessive CPU just because it's running in a virtual machine. A virtual machine does introduce some CPU overhead, but it's not like emulation, where the CPU must constantly translate code between different architectures in real time; virtualized code mostly runs slower by a constant factor. So if the game is pegging the CPU at 100%, the workload itself is the cause, and it would likely do so even without the VM.
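A toy sketch of that point (purely illustrative, not actual VMProtect internals): the same computation run natively and through a minimal bytecode interpreter gives the same result, with the interpreter adding only a constant-factor dispatch cost per step.

```python
# Toy model: the same loop run natively and through a tiny bytecode
# interpreter. Virtualization multiplies per-instruction cost by a
# constant factor; total CPU use is still driven by the workload.
# (Illustrative only -- not actual VMProtect or Denuvo internals.)

def native_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

# Three made-up opcodes for the "virtualized" version of the same loop.
ADD, LOOP, HALT = 0, 1, 2

def vm_sum(n):
    program = [ADD, LOOP, HALT]
    total, i, pc = 0, 0, 0
    while True:
        op = program[pc]          # dispatch overhead on every step
        if op == ADD:
            total += i
            i += 1
            pc += 1
        elif op == LOOP:
            pc = 0 if i < n else pc + 1
        else:                     # HALT
            return total

# Same answer either way; the interpreted version just spends more
# cycles per iteration.
print(native_sum(1000), vm_sum(1000))  # → 499500 499500
```

If a program sits idle inside the interpreter, it still idles; the VM doesn't invent work of its own.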
You can always cap CPU usage on Windows through the power settings (the maximum processor state option). Beyond that, this game is genuinely demanding (as Gamers Nexus has highlighted) and ships with multiple DRM layers. There are no magical solutions.
It depends on how the VM is set up and what the software does inside it. Here, it seems the game calls into the DRM library almost nonstop, leading to significant slowdowns with no real cause beyond the DRM itself. The VM setup as such might not be the issue; the DRM system as a whole is.
It's what's inside the VM that matters, not whether it's a VM at all.
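A minimal sketch of that idea, with entirely hypothetical names: if a protection layer runs a check on every call into a hot function, the overhead scales with call frequency, independent of whether a VM is involved at all.

```python
# Hypothetical illustration: wrapping a hot per-frame function with an
# extra check on every call. The wrapper changes cost, not results --
# it's the work done per call, not virtualization itself, that adds up.

def game_tick(x):
    # Stand-in for real per-frame game work.
    return x * x

def with_check(fn, cost=50):
    # Stand-in for a licence/integrity check performed on every call.
    def wrapper(x):
        for _ in range(cost):   # busy-work simulating the check
            pass
        return fn(x)
    return wrapper

checked_tick = with_check(game_tick)

# Identical output; the checked version just burns more CPU per call.
print(game_tick(7), checked_tick(7))  # → 49 49
```

At thousands of calls per frame, even a cheap per-call check compounds into visible CPU load.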
I'm looking for evidence or proof. Or am I jumping to conclusions by mistaking correlation for causation?