Issue encountered while overclocking a 2080 Ti graphics card.
I can't push the core clock more than +50MHz on my Zotac 2080 Ti AMP, even with the core voltage and power sliders maxed out. I've tried both Precision X1 and Afterburner. Temperatures stay around 70°C. People usually get +150 to +200MHz, and the OC Scanner suggests +61MHz. What's wrong with my GPU? How can I get a higher overclock on this stock air-cooled card?
Which power supply are you using? Please specify the exact model. What is the power target set to? Have you made any other overclocking adjustments?
PSU: Corsair HX1000i. Power target: the maximum allowed on this card. The temperature limit is also at maximum, though it's rarely reached. I haven't touched the memory clock yet.
I should also mention that I connected the card's two 8-pin power sockets with a single 8-pin cable from the PSU (the cable is a splitter). I'm hoping this doesn't restrict the power delivered to the card.
There's your issue. Use two separate 8-pin cables and ditch the splitter. The card won't overclock because it isn't getting enough power.
I'll try this as soon as I get home; I just want to confirm the issue is really that simple. Still, I'm wondering: if the overclocking software shows the card drawing 110% to 115% of its TDP during tests (fluctuating constantly), doesn't that mean it's pulling that power from the PSU? And wouldn't that suggest the splitter isn't restricting the current?
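For what it's worth, here is a minimal sketch of how to cross-check the reported draw outside the OC tools, assuming the nvidia-ml-py (pynvml) Python bindings are installed and the 2080 Ti is GPU index 0:

```python
# Poll the card's own power/clock/temperature telemetry while a
# benchmark runs. Assumes `pip install nvidia-ml-py` and GPU index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # mW -> W
        core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{draw_w:6.1f} W / {limit_w:.0f} W limit | {core_mhz} MHz | {temp_c} C")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Running this alongside the benchmark shows whether the draw actually flattens out at a ceiling right before the clocks drop or the crash happens.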
Two 8-pin connectors combined deliver up to 375 watts in total: each 8-pin cable provides up to 150 watts, and the PCIe slot adds another 75 watts. The reference 2080 Ti has a 250-watt TDP, and overclocking may push it to 350-400 watts. Keep in mind that the auto OC scan forces the GPU to find its maximum stable point at a given overclock; actual gaming loads at that setting will usually be about half of that, rising with more demanding rendering.
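Putting those numbers side by side, here is a back-of-the-envelope sketch; the wattages are the connector spec limits quoted above, not measurements:

```python
# Spec limits per power source, in watts (figures from the post above).
PCIE_SLOT_W = 75    # PCIe x16 slot
EIGHT_PIN_W = 150   # per dedicated 8-pin PCIe cable

two_cables   = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # 375 W budget
one_splitter = PCIE_SLOT_W + EIGHT_PIN_W      # 225 W in-spec budget: the
                                              # splitter feeds both card
                                              # sockets from one 150 W cable

oc_draw_low, oc_draw_high = 350, 400          # rough overclocked 2080 Ti draw

print(f"Two dedicated cables: {two_cables} W budget")
print(f"One cable + splitter: {one_splitter} W budget")
print(f"Overclocked 2080 Ti:  {oc_draw_low}-{oc_draw_high} W")
```

By those spec limits, the splitter setup leaves the overclocked card well short of what it wants to draw, which is why the second dedicated cable matters.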
Lucky_SLS:
Two 8-pin connectors provide up to 375 watts in total: each 8-pin cable delivers up to 150 watts, and the PCIe slot adds another 75 watts. The reference 2080 Ti has a 250-watt TDP, and overclocking may push it to 350-400 watts. Keep in mind that the auto OC scan forces the GPU to find its maximum stable point at a given overclock; actual gaming loads will usually be about half that, varying with how demanding the scene is.
You're right about the load during actual gaming.
Currently I use one PCIe cable that runs from a single 8-pin socket on the PSU to two 6+2-pin plugs (one male 8-pin to two male 6+2-pin). The PSU is a Seasonic M12II-750 EVO. Are you saying the power is divided between these two connections? Would it be better to use two 8-pin sockets on the PSU, with only one 6+2-pin plug going to each socket on the card?
You can find more details here: https://imgur.com/a/5beNyMP
PSU link: http://www2.seasonic.com/product/m12ii-750-evo/
But as mentioned earlier, one 8-pin connector can only deliver up to 150 watts (depending on the power supply; an HX1000i won't have any problems, but a lower-capacity unit like a 450W model might not sustain even that). For a GTX 1080-class card or below, a single 8-pin plus the 75 watts from the PCIe slot provides enough power; higher-wattage cards such as the Vega 64 and 2080 Ti will run into trouble.
I now have two separate cables from the PSU to the GPU. Each Y-cable has an 8-pin connector on the PSU end and two 6+2-pin connectors on the other; one 6+2 goes to the card and the other is left unconnected, and I'm using two such cables for the single card. Despite this, the problem persists. I updated the BIOS with no improvement, and I'm not hitting thermal limits. The system still crashes to a black screen and needs a hard reboot whenever the core clock offset exceeds +50MHz during a 4K benchmark run.
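One way to narrow down where the wall actually sits is a simple bisection over the offset range. This is only a sketch: the `apply_offset_and_benchmark` callback is hypothetical and would need wiring to whatever tool applies the offset and runs the 4K benchmark.

```python
# Sketch: bisection search for the highest stable core-clock offset.
# `apply_offset_and_benchmark(offset_mhz)` is a hypothetical callback:
# it should apply the given offset, run the 4K benchmark, and return
# True if the run completed without crashing.
def find_stable_offset(apply_offset_and_benchmark, lo=0, hi=200, step=15):
    # Invariant: `lo` is known-stable, `hi` is assumed-unstable.
    while hi - lo > step:
        mid = (lo + hi) // 2
        if apply_offset_and_benchmark(mid):
            lo = mid   # mid passed: the stable region extends at least this far
        else:
            hi = mid   # mid crashed: the wall is below this offset
    return lo          # highest offset that survived the benchmark
```

In practice a hard crash means rebooting and re-running manually after each failed trial, but the same halving logic cuts the number of runs needed compared with stepping up 10-15MHz at a time.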