My system can handle Oblivion well, but it experiences lag and frame loss
In 2006 the game shipped using just one CPU core. My setup was an AMD Athlon 64 4000+ (one core, one thread) and a Gigabyte GeForce 7950 GT with 512MB of VRAM. I didn't get a dual-core, two-thread CPU until the following year, an AMD Opteron 185. Back then you had to replace your hardware every year to stay current, a pattern that continued until Sandy Bridge arrived in 2011.
This raises questions about current performance expectations. A modern CPU should indeed be significantly faster per core than those older chips. The idea that Moore's Law stopped holding around 2011 or 2012 is partially true, but it doesn't fully explain why older hardware still holds up in some cases. Your Athlon 64 X2, run from around 2007 to 2018, performed adequately until motherboard issues arose, which affected reliability rather than raw speed.
I hope they can improve things. My i7 8086K with a 5GHz all-core overclock still struggled to stay in the 30s, even with my game running around 250 mods. The 5800X managed to hold 60 frames most of the time, but it wasn't as smooth as capping the game at 30. My Intel Q9550 from 2010 could only manage about 20 frames. I first played the game on an Xbox 360, which was rough in some areas too.

Between 2011 and 2018 I ran two i7 2600Ks, and I got an i7 6700K in 2016, though it wasn't a major jump; the only real upgrade was the i7 8700K. The 5800X and the i9 10900K were purchased for their higher IPC, but in most games they performed similarly to the six-core Intel. That was a bit of a letdown until I tried games like Total War: Warhammer 2 and saw about 30 extra frames where it mattered. Without those titles, the i9 10900K and 5800X would have been quite disappointing.