What happened to SLI dual cards?

Pages (2): 1 2 Next
AlbyKursal
Junior Member
49
03-06-2016, 04:02 PM
#1
I became enamored with those cards a long time ago. I believe I ran two SLI setups, each in a separate computer I built. In 2016 I gave up on the project and settled for just one card. So what happened to them? Just wondering.
GabeNewells
Member
55
03-14-2016, 09:06 AM
#2
Nvidia decided to ditch that technology, primarily because few developers were investing in multi-GPU configurations. I'm sure their market research found that the average user preferred a single, most powerful GPU: it avoids the headaches of driver incompatibilities, of getting a game to work with a dual-GPU setup, and of needing a high-wattage PSU. On the manufacturing side, you don't need to design the bridge connector on top of the card either. Dropping software support also frees up resources on the driver team.
It's not just SLI; CrossFire(X) was ditched as well.
SneakyFab
Junior Member
20
03-15-2016, 03:29 AM
#3
Complexity and latency played a part as well.
TheRealShrub
Senior Member
409
03-15-2016, 01:34 PM
#4
Heat, power, and the lack of real-world gains also played a role in the shift.
@Lutfij highlights an important point: given all of this, the standing recommendation was to buy the single most powerful GPU you could afford rather than end up with a mostly unfulfilling SLI/CF setup.
TruZZted
Junior Member
26
03-15-2016, 06:25 PM
#5
I really enjoyed my time with SLI. When it was set up properly it performed very well; when it wasn't, you could just disable one card. I never ran into the micro-stutter that was apparently quite common, though I suspect it depended on the game. The extras were what bothered me more: I needed a bigger case to fit the bottom card, a larger PSU, and so on. Still, the experiment was a learning experience.
iMetalcrime_PT
Junior Member
15
03-15-2016, 07:18 PM
#6
Even at their best, SLI and CrossFire were only marginally useful. There was no case where adding a second GPU doubled your performance; in most cases it added a 20-40% improvement, so buying a second card for such limited gains made sense for very, very few people.
On top of that, most games weren't optimized for dual GPUs, and a significant number of titles actually saw worse performance under SLI than with a standalone card.
Then on top of THAT, there were the setup issues, the latency issues, and the cost issues; it also complicated motherboard design. And over the last 10 years there has been a general trend toward smaller form factor PCs, while the size and power requirements of GPUs have increased massively. None of that was conducive to continuing SLI or CrossFire.
When you throw in the fact that only a tiny segment of the market ever used it, and the cost to Nvidia and AMD of maintaining that ecosystem, it was pretty much a foregone conclusion that it would go away.
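To put the value argument in numbers, here is a back-of-the-envelope sketch. The card price, the baseline frame rate, and the ~30% uplift (the middle of the 20-40% range quoted above) are all illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope: cost per frame-per-second, one card vs. an SLI pair.
# All numbers below are hypothetical, for illustration only.
single_card_price = 500.0   # USD for one card (assumed)
single_card_fps = 60.0      # baseline frame rate (assumed)

sli_price = 2 * single_card_price   # a second identical card
sli_scaling = 1.30                  # ~30% uplift, mid of the 20-40% range
sli_fps = single_card_fps * sli_scaling

cost_per_fps_single = single_card_price / single_card_fps
cost_per_fps_sli = sli_price / sli_fps

print(f"single: {cost_per_fps_single:.2f} USD per fps")  # 8.33
print(f"sli:    {cost_per_fps_sli:.2f} USD per fps")     # 12.82
```

Under these assumptions, doubling the spend buys only a 30% frame-rate gain, so each frame per second costs roughly 50% more, which is the "limited gains" point in a nutshell.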
renliff
Member
240
03-15-2016, 09:35 PM
#7
it was viable for a time, but once dx10 and 11 arrived they didn't play well with multi-GPU setups. i'm not sure about nvidia, but with amd cards in the '00s it gave you a nice upgrade path when next-gen GPUs launched, since you could usually pick up a second card cheaply to boost your system instead of buying the newest model. i recall it performed well in dx9 games most of the time.

dx12 has explicit multi-GPU support built into the API, without the hassle of SLI/crossfire, since PCIe now provides enough bandwidth for the cards to talk directly. occasionally i come across old articles on asynchronous multi-GPU in dx12, showing how it can even use an iGPU to assist a dGPU, and it works across different vendors since it's vendor neutral. but it's clear that approach never really took off.

sli lives on in the server world, though it's now known as nvlink. nvidia tightened its strategy to keep people from building cheap GPU networks, limiting the nvlink connector to only a few models like the rtx 3090. after that, they dropped it from the geforce line entirely, pushing users toward quadro cards. nevertheless, nvlink is the underlying technology that nvidia's high-performance systems rely on to coordinate multiple GPUs.
DaaarkPlayer
Member
153
03-16-2016, 02:49 AM
#8
Great answers, everyone. Thanks to all of you. I was wondering why I still have that 1000 watt PSU. It must be from 20 to 25 years ago. Damn, I'm getting old. Haha!
SkyMaster280
Member
214
03-16-2016, 09:29 AM
#9
Old is a state of mind.
LaniBooster
Senior Member
344
03-16-2016, 03:22 PM
#10
@COLGeek
... age seems like just a number, but some numbers stand out more than others... I'm 71 years old. I never ran SLI, but the main mystery for me was why a graphics card driver update would affect both of my cards at once.