Which GPU to choose?
ClassicMan_YT
Member
140
12-07-2025, 07:27 AM
#1
Here is my power supply model:
MSI MAG A650BE
I purchased it seven months ago.
The LDLC website (French reseller) suggests a 750W power supply for the RTX 3080.
I won't push it to its limits or attempt overclocking.
Can you tell me whether the RTX 3080 would perform better at 1440p/4K than the RX 9060 XT 8GB? Thanks.

Sophcutegirl
Junior Member
49
12-07-2025, 07:27 AM
#2
I wouldn’t purchase the RTX 3080 for several reasons. These graphics cards were frequently used for cryptocurrency mining when they first launched, even by regular users. Running them under heavy load could cause problems today. It’s hard to confirm whether a previously owned RTX 3080 was ever used for mining.

As mentioned by @Lutfij, the RTX 3080 draws a lot of power and produces frequent transient spikes that can overload a mid-range power supply. This issue is often attributed to Nvidia's choice of Samsung's 8 nm process for the Ampere generation. Your current power supply won't suffice; it will struggle no matter how you tune it.

I’ve played MSFS 2020 and 2024, both of which are very demanding on CPU and GPU resources. The RTX 5060 8 GB is adequate for 1080p, but the limited 8 GB of VRAM could become a bottleneck over time. If possible, I’d suggest considering a higher-end option like the RTX 5060 Ti with 16 GB of VRAM to get better performance and more computing power.

Jetfact14
Member
193
12-07-2025, 07:27 AM
#3
In France, the most affordable RTX 5060 Ti 16GB starts at €430, while the RX 9060 XT 16GB starts at €330, so I'll consider the latter.
Are you completely leaving out the Intel Arc B580? Thanks.

drwings
Junior Member
6
12-07-2025, 07:27 AM
#4
The B580 generally performs about two full tiers behind the RX 9060 XT 16GB and 5060 Ti 16GB models; their 8GB variants may fall to only a single tier above the B580, or even lower in some scenarios, due to limited VRAM. When gaming at 1440p, the 8GB cap can become a concern in many recent (and some older) titles if you push settings too high, or in games that aren't well optimized for low-VRAM configurations. The 16GB versions avoid these problems, and the B580, with its 12GB, should largely do the same.

Regarding the RTX 3080, it sits roughly one tier above the RX 9060 XT and 5060 Ti, but its suitability depends on how the seller used it. You'd also need a solid 850-watt power supply to avoid the transient-spike shutdowns others have reported. I can attest that a 750-watt unit wasn't sufficient when paired with a 350-watt-TDP dual-8-pin 3090 and a 105-watt CPU (such as the 7600X) without lowering power limits. A 3080 could push such a system to the brink of sudden shutdown via overcurrent protection.

In short, choosing a 16GB card is preferable even if it exceeds your budget, since it offers better longevity and fewer low-VRAM limitations.
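
To put rough numbers on the headroom argument above, here's a quick sketch. The wattages and the 1.8x transient factor are my assumptions (spec-sheet TDPs and reviewer reports of Ampere power spikes), not measured values:

```python
# Back-of-envelope PSU headroom check. Assumed figures: RTX 3080 ~320 W TDP,
# Ryzen 7600X ~105 W, ~100 W for the rest of the system, and a 1.8x
# transient multiplier on GPU power, which is a rough guess.

def psu_headroom(psu_watts, gpu_tdp, cpu_tdp, rest_of_system=100,
                 transient_factor=1.8):
    """Return (sustained draw, worst-case spike draw, remaining margin)."""
    sustained = gpu_tdp + cpu_tdp + rest_of_system
    spike = gpu_tdp * transient_factor + cpu_tdp + rest_of_system
    return sustained, spike, psu_watts - spike

for psu in (650, 750, 850):
    sustained, spike, margin = psu_headroom(psu, gpu_tdp=320, cpu_tdp=105)
    print(f"{psu} W PSU: sustained ~{sustained} W, "
          f"spike ~{spike:.0f} W, margin {margin:.0f} W")
```

Under these assumptions, both the 650 W and the 750 W units end up with negative margin against worst-case spikes, which matches the 850 W recommendation above.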

Docmed123
Junior Member
5
12-07-2025, 07:27 AM
#5
I'm abandoning the RTX 3080 choice.
The RTX 5060 Ti with 16GB is out of my budget; it's not an option.
But I still have these alternatives:
RX 9060XT 16GB and Intel Arc B580 (more affordable).

LazerBeam2910
Junior Member
30
12-07-2025, 07:27 AM
#6
If you can allocate more funds for the 9060 XT 16GB model, it's probably the optimal choice. The 16GB capacity fits the resolutions you're considering and stays within a reasonable portion of your initial budget. Keep in mind that at 4K you'll still need to adjust settings in most games, even with the GPUs mentioned.

Isolatid
Member
59
12-07-2025, 07:27 AM
#7
At 1080p, VRAM won't be a problem, but at 1440p with high settings the cards may still struggle on raw GPU power alone, so you'd likely have to lower settings anyway. In my view, the 16GB model isn't worth the 1.5x price jump over the 8GB 5060 Ti; if you want the peace of mind of extra VRAM, the extra €50 for the 9060 XT 16GB is the more sensible route. The catch with the 5060 Ti is that it only has a PCIe x8 interface.
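
To put the x8 concern in numbers, here's a back-of-envelope bandwidth estimate using nominal per-lane rates; real-world throughput is somewhat lower:

```python
# Nominal per-lane transfer rates in GT/s; PCIe 3.0, 4.0, and 5.0
# all use 128b/130b encoding.
GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}

def link_bandwidth_gbs(gen, lanes):
    """Approximate one-way PCIe link bandwidth in GB/s."""
    return GT_PER_LANE[gen] * lanes / 8 * (128 / 130)

# An x8 card on a PCIe 4.0 board still gets ~15.8 GB/s one way;
# it only drops to ~7.9 GB/s if placed in a PCIe 3.0 slot.
for gen in (3, 4, 5):
    print(f"PCIe {gen}.0 x8 ≈ {link_bandwidth_gbs(gen, 8):.1f} GB/s")
```

So the x8 link mainly hurts on older PCIe 3.0 platforms, where the available bandwidth is halved.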

Rounyx
Posting Freak
838
12-07-2025, 07:27 AM
#8
The x8 bus on the 5060/5060 Ti only becomes a concern on older systems limited to PCIe 3.0, which isn't the case for the Ryzen 5 7600X. The VRAM problem is largely unrelated to this; it mainly stems from settings that are too high and consume too much VRAM.

Even at 1080p, 8GB of VRAM can still cause issues in certain games, particularly at maximum settings. Moreover, some poorly optimized games, such as The Last of Us and Hogwarts Legacy, show poor performance or graphical problems on 8GB cards even at 1080p. The Last of Us Part II still shows some issues on 8GB cards, though only when settings are pushed too high.

Investing a bit more in a higher-VRAM card often pays off, since you avoid the cost of an earlier upgrade. It's usually better to spend around €350-450 now on a 16GB card that could last 3 to 5 years or more, rather than roughly €600-750 in total for a lower-VRAM card plus a likely upgrade in two years once you run into low-VRAM problems. You might recoup only a third or less of the 8GB card's price after two years, given the wear it will have accumulated.

Personally, I wouldn’t consider spending €400-450 on a 5060 Ti, as it’s essentially comparable to a 3070 Ti and sits just above its performance level, which is starting to show signs of aging. The 5070 model seems like the better choice, offering roughly 1.5 to 2 tiers of improvement over the 5060 Ti.

In two years, the next generation of GPUs from AMD and Nvidia will be available. We're also expecting a Super refresh for Nvidia cards early next year, roughly in the first quarter.
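
The upgrade-path arithmetic above can be sketched like this. The prices are the rough figures quoted in this thread and the one-third resale estimate comes from the same reasoning, so treat it as illustrative only:

```python
def eur_per_year(total_spend, total_resale, years):
    """Net cost per year of ownership over a fixed horizon."""
    return (total_spend - total_resale) / years

# Option A: one ~400 EUR 16 GB card kept for ~4 years.
option_a = eur_per_year(400, 0, 4)

# Option B: a ~330 EUR 8 GB card sold after 2 years for about a third
# of its price, plus a ~350 EUR replacement for the remaining 2 years.
option_b = eur_per_year(330 + 350, 330 / 3, 4)

print(f"16 GB now: ~{option_a:.0f} EUR/yr; "
      f"8 GB then upgrade: ~{option_b:.0f} EUR/yr")
```

Under these assumed prices, buying the 16GB card up front works out noticeably cheaper per year than the 8GB-then-upgrade path.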

applez13
Member
138
12-07-2025, 07:27 AM
#9
In that scenario, I suggest the 9060XT with 16 GB.

Blue_Fox_Lady
Member
194
12-07-2025, 07:27 AM
#10
No problem at all. You do give up a chunk of bandwidth over the x8 link even on PCIe 4.0; games used to suffer for it, but it's fine nowadays. And at 1080p you won't be using the extra VRAM anyway.
