The maximum safe long-term voltage for a 2070 super is determined by considering its design specifications.
I’ve noticed many discussions about Pascal being safe at around 1.094v, but not much talk about Turing. There was also a comment on here claiming Nvidia officially keeps the product under 1.064v, which I haven’t been able to verify. Additionally, someone mentioned a Gamers Nexus interview where it was said that maxing the voltage slider at 100% could shorten the lifespan to as little as a year, though I’m not sure how seriously to take that.

Some of you might recognize me from previous posts, but here’s a brief summary: I own an Aorus 2070 Super. With manual offset overclocking I got to about +25 on the core at 2010MHz and +800 on the memory before errors appeared in the OCCT 3D test. Using the voltage/frequency curve gave better results: the card runs mostly stock at 1.050v, and the curve lets me hold 2070MHz and +800 memory without major issues. I can push up to about 2100MHz at around 1.068v, though I’m cautious about long-term stability. I know voltage tolerance varies per card and model, but I’m looking for a general safe guideline.

Also, I don’t see errors in OCCT when using this curve, possibly because it hits the power limit instantly, dropping the voltage to around 900mV and destabilizing the higher clocks. I suspect that because I only tweak one point on the curve, it otherwise reverts to normal behavior. It might also be worth noting that my card is marketed as “Built for Extreme Overclocking, 12+2 Power Phases,” versus Nvidia’s 8+2 reference design. I’m still unsure how much higher voltage really impacts lifespan, so I appreciate any advice you can offer.
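For anyone unfamiliar with the curve trick being discussed, here is a toy model of "flattening" a voltage/frequency curve at one point, the way curve editors like MSI Afterburner do it: every point at or above the chosen voltage gets clamped to the target clock, so the card never requests more than that voltage. The curve values below are made-up illustrative numbers, not real points from any card.

```python
# Toy voltage (V) -> clock (MHz) curve; values are illustrative only.
curve = {0.900: 1830, 0.975: 1920, 1.050: 1980, 1.068: 2010, 1.093: 2040}

def flatten(curve, lock_v, lock_mhz):
    """Clamp every point at/above lock_v to lock_mhz, capping the rest.

    After flattening, boost never picks a point above lock_v, because
    no higher-voltage point offers a higher clock.
    """
    return {v: (lock_mhz if v >= lock_v else min(mhz, lock_mhz))
            for v, mhz in curve.items()}

locked = flatten(curve, 1.050, 2070)
print(locked)  # points >= 1.050 V all sit at 2070 MHz
```

The reason this behaves differently from a plain offset is that under a power-limit dip the card just slides down to a lower, untouched point on the curve instead of running an offset clock at reduced voltage.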
If you need to supply a lot of voltage for a small jump, I wouldn't recommend it. For example, boosting my CPU speed by 25 MHz would require about a 30W increase in TDP, which isn't justified. I'd look up your specific model first, as some can handle more power than others. As for the other questions, I'm not sure myself; hopefully someone else can give you an accurate answer.
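The "lots of voltage for a small jump" tradeoff can be roughly quantified with the standard CMOS dynamic-power rule of thumb, P ∝ f · V². A minimal sketch, where the baseline power and the clock/voltage pairs are purely illustrative assumptions taken from the numbers discussed in this thread, not measurements:

```python
# Rule-of-thumb estimate: dynamic power scales roughly as P ~ f * V^2,
# so a small frequency gain bought with extra voltage costs
# disproportionate power. All inputs are assumed example values.

def scaled_power(p0, f0_mhz, v0, f1_mhz, v1):
    """Scale a baseline power p0 (watts) from (f0, v0) to (f1, v1)."""
    return p0 * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

base_w = 215.0  # assumed stock board power, illustrative only
p_new = scaled_power(base_w, 2010, 1.050, 2100, 1.068)
print(f"estimated power: {p_new:.0f} W (+{p_new - base_w:.0f} W)")
```

So even a ~4.5% clock bump with a small voltage step adds a noticeable chunk of power, which is also why a curve that hits the power limit immediately will clock down rather than hold the higher point.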