Reduced frame rate performance
Hi, I need some advice. I've been experiencing frame rate fluctuations while gaming, even on reduced graphics settings. My system details are as follows:
Operating System: Windows 10 Pro (64-bit)
Processor: AMD Ryzen 7 1700, running on the Summit Ridge 14nm architecture.
Memory: 16GB of dual-channel RAM, running at 1064MHz (timings: 15-15-15-36).
Motherboard: Gigabyte AB350-Gaming 3-CF (AM4), operating at a temperature of 34°C.
Graphics Card: NVIDIA GeForce GTX 1060 6GB (Gigabyte), running at 67°C, driving an ASUS VS247 monitor.
Storage: a 465GB Western Digital WD5000AAKS-00V1A0 hard drive (SATA) and a 119GB ADATA SX900 SSD.
Optical Drive: TSSTcorp CDDVDW SH-222BB.
Audio: NVIDIA Virtual Audio Device (Wave Extensible) (WDM).
Performance Metrics: UserBenchmark scores are 69% for gaming, 67% for desktop use, and 73% for workstation use. Component scores:
CPU: AMD Ryzen 7 1700 – 80.2%
GPU: NVIDIA GTX 1060-6GB – 73.7%
SSD: ADATA XPG SX900 128GB – 69.6%
RAM: G.SKILL F4-2400C DDR4 2x8GB – 78.1%
Motherboard: Gigabyte GA-AB350-Gaming 3-CF – 34°C
The current readings seem high in temperature for an idle system.
Which game is this? What are the CPU, GPU, RAM, and SSD usage levels while playing it?
The game is Friday the 13th. When playing at maximum settings and targeting high frame rates, the frames per second frequently dropped to between 40 and 55 while the game was actively running. I'm wondering if there's a way to monitor resource consumption while playing.
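One way to log resource consumption while a game is running is a small background script. Here's a minimal sketch in Python, assuming the third-party `psutil` package is installed (`pip install psutil`); GPU utilisation is read from NVIDIA's `nvidia-smi` tool, which ships with the GeForce driver. The loop interval and output format are just illustrative choices.

```python
import subprocess
import time

import psutil  # third-party: pip install psutil


def sample_usage(interval=1.0):
    """Return CPU and RAM utilisation in percent, averaged over `interval` seconds."""
    cpu = psutil.cpu_percent(interval=interval)
    ram = psutil.virtual_memory().percent
    return {"cpu": cpu, "ram": ram}


def gpu_usage():
    """Query GPU utilisation (percent) via the nvidia-smi CLI."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())


if __name__ == "__main__":
    # Log a sample every second; run this in the background while the game plays.
    while True:
        snap = sample_usage(interval=1.0)
        try:
            snap["gpu"] = gpu_usage()
        except (FileNotFoundError, subprocess.CalledProcessError):
            pass  # nvidia-smi not available
        print(time.strftime("%H:%M:%S"), snap)
```

Alternatively, on-screen overlays such as MSI Afterburner/RivaTuner or the Windows Game Bar (Win+G) show the same counters live without any scripting.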