Anticipated performance in Skyrim depends on various factors such as mods, optimization, and hardware.
I’m really enjoying Skyrim. My brother bought it for us on the Xbox 360 a couple of weeks ago, and after a few days I decided to give it a try, since I’d been curious about the game ever since its launch. I’ve only reached level 6 or 7, so I’m not very experienced with it. I also got it for Windows because I usually prefer PC games over console titles, and I’m much more comfortable with a keyboard and mouse.
Avoid playing Skyrim in its vanilla form on PC unless you have the unofficial patches installed. Those patches fix a lot of bugs without compromising stability, and they often improve speed and responsiveness too. This isn’t aimed at your question specifically, just general advice for getting more out of the game.
I’m questioning some common beliefs here. I play Skyrim often, and my machine runs an AMD A10-4600M at high settings. My screen’s native resolution is only 1366x768, but from what I know of this APU it should handle medium or even high settings comfortably. Before upgrading, I could run Skyrim smoothly at medium settings on an A8-4500M and an A4-4300M without pushing the CPU at all. These processors aren’t fast, but their integrated GPUs perform impressively. At medium settings I can even play demanding titles like Far Cry 3 on the A10-4600M with almost no lag. This is based on real experience.
That’s about right. The A10-4600M isn’t ideal for video editing, but it performs well in games. Of course, I don’t rely solely on the GPU integrated into the APU; my laptop has a dedicated GPU that’s more capable, and the two support CrossFire. I’m not sure if I mentioned it, but I own Skyrim plus all its expansions. On the built-in screen I can play smoothly at high settings with frame rates consistently above 30 FPS. I understand 30 FPS might seem low next to professional benchmarks (like Linus’s tests), where cards pushing 70 FPS obviously rank above those managing 30. Still, 30 FPS is perfectly playable.
Indeed, keep in mind that the human eye starts perceiving continuous motion at roughly 15 frames per second, and film has gotten by at 24 fps for a century. Past that point, 70 fps simply doesn’t add much value.
I think this is really about the performance headroom of the video card. A card that averages 30 frames per second will dip below that during intense moments, and that’s what we notice. This comes from my years of experience in video editing. There’s a concept called persistence of vision, the way the retina keeps holding an image briefly: below roughly 16 frames per second we perceive individual frames, while above that motion looks continuous. A GPU averaging 30 fps has no headroom under heavy load, so the dips make the stutter obvious. I haven’t noticed this with Skyrim, but Far Cry 3 does show it under heavy load.
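The point about dips being what we notice can be made concrete with a little arithmetic: what matters to the eye is frame *time*, and a dip from 30 to 20 FPS stretches each frame noticeably. A minimal sketch (the function name is mine, just for illustration):

```python
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

# At a steady 30 FPS each frame lasts ~33.3 ms; a dip to 20 FPS
# stretches that to 50 ms, a jump large enough to read as stutter.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(20), 1))  # 50.0
```

This is why a card averaging 60 FPS feels smoother even on a display that can’t show 60 frames: its worst-case dips still land in comfortable frame-time territory.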
Sorry, I didn’t notice that was a link to a video. As far as I know, there’s a difference between the refresh rate of the monitor (determined by the combination of monitor and video card), which is measured in Hertz, and the frame rate. A low refresh rate produces flickering, whilst low frame rates produce lag, skipped frames and all those undesirable effects we know from video games. To my humble knowledge, the two are not comparable.
The term refers to the refresh rate: a 60 Hz display redraws its image 60 times per second. The frame rate the GPU renders at is a separate measurement.
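The relationship between the two can be sketched with a simplified vsync model (my own simplification, not how any particular driver works): each rendered frame is held on screen for a whole number of refresh cycles, so the display can never show more unique frames per second than its refresh rate.

```python
import math

def displayed_fps(refresh_hz, render_fps):
    """Effective on-screen frame rate under a simple vsync model:
    each rendered frame is held for a whole number of refresh
    cycles, so the panel shows refresh_hz / ceil(refresh_hz /
    render_fps) unique frames per second at most."""
    if render_fps >= refresh_hz:
        return refresh_hz  # extra rendered frames are never displayed
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(displayed_fps(60, 70))  # 60 -- a 60 Hz panel caps at 60
print(displayed_fps(60, 30))  # 30.0 -- each frame held two refreshes
```

Under this model, rendering 70 FPS on a 60 Hz panel shows no more frames than rendering 60, which is one reason the very high benchmark numbers matter mostly as headroom rather than as visible smoothness.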