Free Sync

Xitrax
Junior Member
40
05-28-2016, 07:07 PM
#1
Is AMD's FreeSync compatible with Nvidia graphics cards?

BlueBackChart
Member
84
05-29-2016, 02:56 AM
#2
Little is known about it in practice. What makes it work is the VESA Adaptive-Sync standard over DisplayPort, plus a compatible monitor and drivers. If AMD opens it up to other manufacturers, Nvidia could benefit from it, but I don't see that happening soon.

oOEmmaOo
Posting Freak
818
06-05-2016, 12:59 AM
#3
Support will come later than on AMD cards, since Nvidia aims to keep G-Sync going until it fails completely.

monxes1
Junior Member
18
06-06-2016, 03:43 PM
#4
Only if NVIDIA desires it.

NyanTwertle
Member
60
06-06-2016, 09:18 PM
#5
AMD is adopting DisplayPort 1.4B, following Nvidia's lead. They're embracing a fully open standard.

CatNinjaXD
Member
208
06-07-2016, 07:16 PM
#6
It won't prevent anything, because FreeSync is becoming an open standard and part of the DisplayPort spec. However, NVIDIA would still have to add support to their hardware, and that isn't automatic for every otherwise compatible card.

P3kena
Junior Member
46
06-08-2016, 02:26 AM
#7
AMD's FreeSync goes beyond standard VESA Adaptive-Sync. It won't function on NVIDIA GPUs, though some generic adaptive sync capability might. Remember that even on AMD's side, the older GCN 1.0 GPUs (like the 280X) can't handle FreeSync.

holylight1234
Member
50
06-08-2016, 11:50 PM
#8
This topic is full of confusion and misunderstandings. I won't search for the link, but you can check my previous messages if you'd like the details.

Adaptive Sync and FreeSync are distinct technologies, and only a limited number of displays support them. The discussion centered on an ASUS representative who clarified the technical side: he explained why G-Sync was favored over FreeSync, and that Adaptive Sync is simply a new protocol added to the DP 1.2a standard. It lets two devices exchange timing data if both can handle variable refresh rates. Just because a monitor or GPU has a DP 1.2a port doesn't guarantee full compatibility. For a display to support VRR it needs special components, similar to G-Sync, and those aren't free; the technology requires both hardware and software changes, which adds to the overall cost.

FreeSync isn't entirely free either; it demands specific equipment, just like G-Sync. As noted before, G-Sync won't need extra gear beyond what's already shipping, while FreeSync is still in development. It might take another year or more before FreeSync becomes available, as AMD hasn't officially launched it. Until then we can't compare prices or performance definitively. The reality is that both require specialized solutions, and there's no clear winner yet.
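To make the variable-refresh idea discussed above concrete, here's a toy Python sketch (my own illustration, not any vendor's implementation; the refresh-range numbers are arbitrary examples). With fixed-refresh vsync, a finished frame waits for the next scanout boundary; with VRR the panel scans out as soon as the frame is ready, within its supported range.

```python
# Toy model: display latency under fixed-refresh vsync vs. VRR.
# All timing constants below are illustrative, not from any spec.
import math

REFRESH_HZ = 60
PERIOD_MS = 1000 / REFRESH_HZ   # ~16.67 ms scanout interval at 60 Hz
VRR_MIN_MS = 1000 / 144         # fastest refresh this example panel allows
VRR_MAX_MS = 1000 / 40          # slowest refresh before the panel must repeat

def vsync_display_time(frame_ready_ms):
    """Fixed refresh: the frame waits for the next scanout boundary."""
    return math.ceil(frame_ready_ms / PERIOD_MS) * PERIOD_MS

def vrr_display_time(frame_ready_ms, last_scanout_ms):
    """VRR: scan out as soon as the frame is ready, clamped to the panel's range."""
    earliest = last_scanout_ms + VRR_MIN_MS
    latest = last_scanout_ms + VRR_MAX_MS
    return min(max(frame_ready_ms, earliest), latest)

# A frame that just misses a 60 Hz boundary waits almost a full interval:
ready = 17.7
print(round(vsync_display_time(ready) - ready, 2))      # → 15.63 ms added latency
# Under VRR (last scanout at t=0), the same frame goes out immediately:
print(round(vrr_display_time(ready, 0.0) - ready, 2))   # → 0.0 ms added latency
```

This is the core of what both G-Sync and Adaptive-Sync are after: removing the wait for a fixed scanout boundary.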

DaniilKozhuhar
143
06-10-2016, 08:10 AM
#9
@jmaster299 Yeah, the 'FreeSync' name is a bit of a misnomer. Like G-Sync, it requires both the video card and the monitor to support it, but on the monitor side it's an open VESA standard rather than a custom chip that only works with Nvidia hardware, so it's definitely superior to G-Sync in that regard IMO. Also, AFAIK there's nothing proprietary about the GPU support for FreeSync: AMD has stated that Intel/Nvidia could implement it with no restrictions or licensing fees (see the comments from Richard Huddy in the interview linked below). And many of AMD's cards already support it on the GPU side (almost all their APUs, and all of the R9 cards except the 280X and 270X), so for some AMD owners only a monitor upgrade will be required.

I did see a recent interview with AMD's Richard Huddy where he commented on FreeSync/G-Sync here: https://www.youtube.com/watch?feature=pl...FM8#t=3717 (starting at 1 hour 1 minute in). Some notable points from his comments:

1. FreeSync shouldn't be difficult or expensive for monitor manufacturers to support. He said it simply requires them to 'build the monitor in a slightly different way' to support Adaptive-Sync. He also said the GPU-side support 'isn't terribly complicated', and that chips which already support eDP (Intel chips) could potentially support FreeSync, since FreeSync is derived from the adaptive refresh spec in Embedded DisplayPort — though only Intel could confirm that, obviously.
2. He claims FreeSync adds no latency, and that G-Sync adds latency in comparison because of the way it communicates with the monitor and its 768 MB memory buffer on the monitor side, which induces an extra frame of latency. He says FreeSync works plug and play: no handshaking, no additional buffer.
3. R9 290 cards, R9 260 cards, and all AMD APUs have the FreeSync hardware.
4. There are no restrictions on when Intel/Nvidia can implement FreeSync support, and no license fees.

My personal opinion on G-Sync: I definitely give Nvidia credit for pushing the market in this direction (who knows if we would ever have seen FreeSync had Nvidia not come out with G-Sync). G-Sync kicked off the adaptive-refresh-rate trend for desktop gaming, which is a good thing, because tearing/vsync is something that hasn't been given nearly enough focus in PC gaming IMO.

However, I really dislike the way Nvidia implemented G-Sync. By getting to the market first with a completely proprietary solution, they were able to: 1. scoop up the early adopters willing to pay a big premium for this technology, and 2. lock those early adopters into the Nvidia ecosystem. If someone has a G-Sync monitor, you can bet their future GPU upgrades will also be Nvidia, or else they lose their nice adaptive refresh rate, because they're conveniently locked into a completely proprietary solution! Nvidia did a good thing by introducing tech to improve on vsync and pushing the market in that direction, but did a terrible thing by making it totally proprietary. The market could be segmented between FreeSync and G-Sync forever if Nvidia decides not to also support the open Adaptive-Sync standard, which would definitely be a *bad* thing for consumers, because it limits their GPU and monitor choices if they want this feature. But of course time will tell which solution works better; the only way to know for sure is when FreeSync monitors come to market and can be reviewed and compared against G-Sync.
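On the "extra frame of latency" claim above: the arithmetic is simple enough to sketch. If a monitor-side pipeline stage holds each frame for one refresh interval before it reaches the panel, every buffered stage adds one frame-time of delay. This is a toy model of the claim as stated, not a description of how G-Sync's module actually works.

```python
# Toy model: each buffered pipeline stage between render and panel
# adds one refresh interval of latency on top of the unbuffered path.
FRAME_MS = 1000 / 60  # one frame interval at 60 Hz, ~16.67 ms

def scanout_times(render_times_ms, buffer_depth):
    """Return when each frame reaches the panel, given how many
    one-frame buffer stages sit between renderer and scanout."""
    return [t + (1 + buffer_depth) * FRAME_MS for t in render_times_ms]

renders = [0.0, 16.7, 33.3]
direct = scanout_times(renders, buffer_depth=0)    # no extra buffer
buffered = scanout_times(renders, buffer_depth=1)  # one extra buffer stage
print(round(buffered[0] - direct[0], 1))  # → 16.7 (one extra frame of latency)
```

So at 60 Hz, "one extra frame" works out to roughly 16.7 ms; whether either solution actually behaves this way is exactly what reviews of shipping FreeSync monitors would need to verify.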