I increased the speed of my 3200MHz CL16 RAM.
Hi, your setup looks solid. You've successfully taken your CL16 kit from 3200MHz to 3800MHz with Command Rate 1 (CR1) at 1.37V, which is a good improvement. The +2 FPS you're seeing in games confirms the gain. For further verification, look for benchmarks that match your specs.
Klesa,
If it remains stable, congratulations!
You can still compare performance, and you can also determine the actual latency. The calculation is as follows:
True Latency in nanoseconds equals 1 divided by (0.5 × DDR in GHz) multiplied by Column Latency.
So...
A DDR speed of 3200MHz equals 3.2GHz; halving gives an SDR clock of 1.6GHz, so 1 / 1.6 × 16 = 10.00ns true latency.
Similarly, for 3800MHz DDR = 3.8GHz, the SDR clock is 1.9GHz, so 1 / 1.9 × 17 = 8.95ns.
Even though your memory frequency rose by 600MHz (18.75%), your true latency dropped by 1.05ns (10.5%).
It’s worth noting that only the top-tier high-end modules can approach the long-standing 8.0ns latency mark. For reference, here are some typical frequency/time pairs for comparison:
- 3200 @ 16 = 10.0ns
- 3200 @ 15 = 9.38ns
- 4000 @ 18 = 9.00ns
- 3800 @ 17 = 8.95ns
- 3600 @ 16 = 8.88ns
- 3200 @ 14 = 8.75ns
- 3733 @ 16 ≈ 8.57ns
- 4000 @ 17 = 8.50ns
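The pairs above can be sanity-checked with a short script; the helper name below is mine, not from the thread, and the printed values may differ from a listed figure by 0.01ns due to rounding.

```python
# Hypothetical helper: "true latency" from DDR transfer rate and CAS latency.
def true_latency_ns(ddr_mhz: float, cl: int) -> float:
    sdr_ghz = ddr_mhz / 1000 / 2   # halve the DDR rate to get the clock in GHz
    return (1 / sdr_ghz) * cl      # ns per clock cycle times CAS latency

pairs = [(3200, 16), (3200, 15), (4000, 18), (3800, 17),
         (3600, 16), (3200, 14), (3733, 16), (4000, 17)]
for ddr, cl in pairs:
    print(f"{ddr} @ {cl} = {true_latency_ns(ddr, cl):.2f}ns")
```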
Your original setting was 3200 @ 16 (10.0ns).
If you go to 3733 @ 16 instead, the frequency rises by 533MHz (16.65%) and the latency drops by 1.43ns (14.3%).
Consider trying 3733 @ 16 first, as it's faster than 3800 @ 17, though you may need to raise the voltage to around 1.38V. At 4000 @ 17, the frequency gain would be 800MHz (25%) with a latency drop of 1.50ns (15%).
Be cautious: only the best high-end modules can match the 8.0ns benchmark. Intel's voltage spec is 1.35V + 5%, just under 1.42V, so you have plenty of margin.
A tip: always back up your system thoroughly before attempting any overclocking. BSODs are common during stability testing, and a crash can corrupt files like photos, documents, or game saves.
Hello, thank you for your message. I plan to attempt 3733MHz CL16 today. Could you clarify whether dropping the command rate to 1 would be worth an extra +0.01V?
Klesa,
If you can get it stable at CR1, that would be a bonus. If need be, you can increase the voltage, but don't exceed Intel's spec of 1.35V + 5%, which is about 1.42V.
Here's an excellent guide from Gamers Nexus, which also includes a video:
What Are Memory Timings?
CT
After today's tests, 3733MHz CL16 proved unreachable; I could only get 3700MHz stable. However, 3600MHz CL16 appears reliable. The 0.07ns latency reduction seems to come at the cost of bandwidth. Should I stay at 3800MHz CL17 (CR1) or switch to 3600MHz CL16?
As shown in my first post, the differences are minimal:
3800 @ 17 = 8.95ns
3600 @ 16 = 8.88ns
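As a rough sketch of that latency/bandwidth tradeoff (the bandwidth figures are theoretical dual-channel DDR4 maxima, 8 bytes per channel per transfer; both helper names are illustrative, not from the thread):

```python
def true_latency_ns(ddr_mhz: float, cl: int) -> float:
    # CAS latency divided by the SDR clock (half the DDR rate, in GHz)
    return cl / (ddr_mhz / 1000 / 2)

def peak_bandwidth_gbs(ddr_mhz: float, channels: int = 2) -> float:
    # MT/s x 8 bytes per channel per transfer x channel count
    return ddr_mhz * 1e6 * 8 * channels / 1e9

for ddr, cl in [(3800, 17), (3600, 16)]:
    print(f"{ddr} @ {cl}: {true_latency_ns(ddr, cl):.2f}ns, "
          f"{peak_bandwidth_gbs(ddr):.1f} GB/s peak")
```

By this estimate, 3800 @ 17 offers roughly 5-6% more peak bandwidth in exchange for 0.07ns (under 1%) more latency.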