Is it beneficial to upgrade your GPU for additional vRAM?

chenglee1998 | Member | 09-12-2025, 06:26 AM | #1
I’m weighing two upgrade paths for my RTX 2070 8GB: a new RTX 5060 Ti with 16 GB of VRAM for €459, or a used RTX 3090 with 24 GB of VRAM, which goes for roughly €700 to €800 locally. Gaming isn’t a priority; this is mainly for AI workloads. One caveat: the 5060 Ti has the same memory bandwidth as my 2070 (448 GB/s), so I’d gain capacity but no extra bandwidth. I’m also curious about a 24 GB 5080 variant — would that be a worthwhile upgrade? Its bandwidth is around 960 GB/s, close to the 3090’s. Reports suggest the 3090 holds up well on PCIe 3.0, so I’d expect the same of the 5080. As for used cards, longevity comes down to condition, and I’d like something reliable. Which option offers the best value for my needs?
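
One simple way to frame the value question is raw price per GB of VRAM. A quick sketch (using €750 as the midpoint of the quoted used-3090 range; this ignores speed, warranty, and condition):

```python
# Price per GB of VRAM for the two options under discussion.
# The 3090 price is the midpoint of the quoted EUR 700-800 range.
options = {
    "RTX 5060 Ti 16 GB (new)": (459, 16),
    "RTX 3090 24 GB (used)": (750, 24),
}

for name, (price_eur, vram_gb) in options.items():
    print(f"{name}: {price_eur / vram_gb:.2f} EUR per GB")
```

By this crude measure the two options are close, so the deciding factors are bandwidth, condition, and how much total capacity the workloads actually need.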

Ireo | Member | 09-12-2025, 06:26 AM | #2
Does your AI software need that VRAM just to run at all, or only to reach optimal performance?

ZaitheGod | Member | 09-12-2025, 06:26 AM | #3
It covers a variety of uses. Right now I'm mainly working with ComfyUI alongside Flux dev and schnell, but I'm also exploring open-source AI video generators. I'm testing several LLMs too, such as Gemma 3 and DeepSeek, usually in smaller or quantized variants. In short, I stick to whatever fits within my 8 GB of VRAM.
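
For anyone wondering how the "fits in 8 GB" math works: a quantized model's weights take roughly parameters × bits-per-weight / 8 bytes, plus headroom for the KV cache, activations, and the CUDA context. A rough sketch (the 1.5 GB overhead figure is an assumption, not a measurement):

```python
def est_vram_gb(params_billion, bits_per_weight, overhead_gb=1.5):
    """Very rough VRAM footprint of a quantized LLM.

    params_billion: parameter count in billions (e.g. 12 for a 12B model)
    bits_per_weight: 16 for fp16, 4 for a 4-bit quant, etc.
    overhead_gb: assumed headroom for KV cache, activations, CUDA context
    """
    # 1e9 params * (bits / 8) bytes per param = GB of weights
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

print(est_vram_gb(7, 4))   # prints 5.0  -> a 7B model at 4-bit fits in 8 GB
print(est_vram_gb(12, 8))  # prints 13.5 -> a 12B model at 8-bit needs 16 GB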

helenma0301 | Senior Member | 09-12-2025, 06:26 AM | #4
For experimenting with LLMs at home, VRAM capacity is the main limitation, so on that basis I'd go with the 3090. With 24 GB you can run a good range of LLMs locally. Its inference speed should be comparable to, or slightly better than, a 5070 Ti's, depending on which tokens-per-second numbers you look at.
On the other hand, it draws more power, is an older CUDA generation, and would be a used unit rather than a new one.
For image and video AI, other factors come into play, and there the 5060 Ti may be the more suitable choice; you know that landscape better than I do.
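
On the "inference speed tracks bandwidth" point: single-stream LLM decoding is usually memory-bound, because each generated token streams roughly all the weights from VRAM once, so tokens per second is capped at bandwidth divided by model size. A back-of-envelope sketch (real throughput lands well below this ceiling; the 13 GB model size is purely illustrative):

```python
def decode_ceiling_tok_s(bandwidth_gb_s, model_size_gb):
    """Upper bound on single-stream decode speed for a memory-bound LLM:
    each generated token reads (roughly) the full set of weights once."""
    return bandwidth_gb_s / model_size_gb

# Ceiling for a hypothetical 13 GB quantized model:
for card, bw in [("RTX 2070 / 5060 Ti (448 GB/s)", 448),
                 ("RTX 3090 (936 GB/s)", 936)]:
    print(f"{card}: up to ~{decode_ceiling_tok_s(bw, 13):.0f} tok/s")
```

This is why the 3090's roughly 2x bandwidth advantage over the 448 GB/s cards translates fairly directly into decode speed, independent of its extra capacity.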

KEA_987 | Junior Member | 09-12-2025, 06:26 AM | #5
Used GPUs can still serve you well if you buy one in good shape. A 3090 that's around 4.5 years old and running a Bykski FE water-cooling block can last several more years, depending on usage and maintenance.

josbakmeel2000 | 09-12-2025, 06:26 AM | #6
Sadly, I don't have any experience with that. The card probably comes from a gamer rather than an AI user, though. Maybe someone here can give us a clue.