Is it beneficial to upgrade your GPU for additional vRAM?
I'm weighing two upgrade paths for my RTX 2070 8GB: a new RTX 5060 Ti with 16GB of VRAM at €459, or a used RTX 3090 with 24GB, which goes for roughly €700–800 locally. Gaming isn't a priority; this is mainly for AI workloads. The 5060 Ti has the same memory bandwidth as my 2070 (448 GB/s), so in bandwidth-bound workloads I'd expect similar per-token speed, just with more headroom. I'm also curious whether a 5080 with 24GB would be a worthwhile upgrade — its bandwidth is around 960 GB/s, close to the 3090's ~936 GB/s. My system is PCIe 3.0; reports suggest the 3090 holds up well on it, so I'd expect the same of the 5080. As for used GPUs, longevity comes down to condition, and I'm aiming for something reliable. Which option is the best value for my needs?
It covers a variety of uses. Right now I'm mainly working with ComfyUI alongside Flux dev and schnell, but I'm also exploring open-source AI video generators. I'm testing several LLMs like Gemma 3 and DeepSeek, usually in smaller or quantized variants. Basically, I stick to whatever fits in my 8 GB of VRAM.
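For deciding what "fits in 8 GB" (or 16 or 24), a rough rule of thumb is parameter count times bits per weight, plus some allowance for KV cache and activations. A minimal sketch — the 4.5 bits/weight figure and the flat 1.5 GB overhead are illustrative assumptions for a Q4-style quant, not exact numbers:

```python
def model_vram_gb(params_b: float, bits_per_weight: float, overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight bytes plus a flat allowance for KV cache/activations."""
    weights_gb = params_b * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# A 12B model at ~4.5 bits/weight (Q4_K_M-style quant):
print(model_vram_gb(12, 4.5))  # prints 8.25 -> already tight on an 8 GB card
```

Real usage varies with context length and backend, so treat this as a lower bound when shopping for VRAM.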
For home LLM experimentation, VRAM capacity is the main limitation, which argues for the 3090: with 24 GB you can run much larger models, or bigger quants and longer contexts, locally. And since single-user inference is memory-bandwidth-bound, its tokens-per-second should be comparable to, or slightly ahead of, a 5070 Ti.
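To put rough numbers on the bandwidth argument: at batch size 1, every generated token has to stream the active weights from VRAM once, so bandwidth divided by weight size gives a theoretical ceiling on tokens per second. A back-of-the-envelope sketch — the 13 GB model size is an assumed Q4-class weight footprint, and the bandwidths are approximate spec-sheet figures, not benchmarks:

```python
def max_tokens_per_s(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Theoretical batch-1 decode ceiling: each token reads all weights once."""
    return bandwidth_gb_s / model_size_gb

# Assumed ~13 GB of quantized weights; approximate spec bandwidths:
for name, bw in [("RTX 3090", 936), ("RTX 5070 Ti", 896), ("RTX 5060 Ti 16GB", 448)]:
    print(f"{name}: <= {max_tokens_per_s(bw, 13):.0f} tok/s")
```

Real throughput lands well below the ceiling, but the ranking holds: the 3090 and 5070 Ti sit in the same bandwidth class, while the 5060 Ti is at half that.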
On the other hand, the 3090 draws more power, is an older CUDA generation (Ampere), and would be a used unit rather than a new one.
For AI tasks involving images and video, other factors come into play — newer architecture and software optimizations, for instance — which may make the 5070 Ti the more suitable choice. You know that landscape better than I do.
Well, I don't have any background with that, sadly. The used card probably comes from a gamer rather than an AI user, though. Maybe one of the gentlemen here can give us a clue.