F5F Stay Refreshed Software › PC Gaming

Review of Watch Dogs Enhancements for SLI Systems

xXCiMangaxX
Junior Member
9
10-25-2016, 04:23 PM
#1
For those of you playing Watch Dogs with SLI, this is a post you'll want to pay attention to. First, let me get one thing out of the way. About a week ago a "ninja driver" made the rounds: a beta driver, numbered 337.81, that some people know as the "Chinese driver". Today the new WHQL driver, 337.88, arrived, promising increased performance across many games and an "improved" SLI profile for the highly anticipated Watch Dogs.

Let's debunk that last claim first. There is absolutely no difference, in any way, shape, or form, in the SLI profile found in 337.88. Going all the way back to 337.50, the Watch Dogs profile has not budged an inch. Nothing has changed. Here is a comparison between 337.81 and 337.88; a blind man can see these drivers are identical.

Okay, so no big deal, right? Wrong. The game is on the verge of release. Some people, myself included, received their copies legally because vendors shipped early. Others have been playing all week thanks to internet piracy; we'll ignore those folks, though. Let's get down to the game's performance. It's bad. REALLY, REALLY BAD. I am running dual GTX 780s and my framerate drops into the mid 40s just driving around the city. It's even worse during police chases, with explosions going off, slow-mo cameras, and all the hacking effects; during those sequences my FPS drops into the TEENS. That is unacceptable. The devs are pointing the finger at users' system specs. Um, excuse me, but how is my $3,000 computer not up to spec for a modern game? I run Battlefield 4 on 64-player servers at 200 FPS without dropping a single fucking frame.

Let's look at some screenshots; note the framerate in the bottom-right corner. The devs' main advice is to turn down textures, so I tested that.

Below are two screens. The first is at Medium textures (the lowest they go) with every other setting on Ultra and 4x TXAA. The second is at Ultra textures with everything else still on Ultra and 4x TXAA. Looking at these, the difference is a whopping 3 frames per second, with the only change being textures from the lowest possible setting to the highest. Tell me, how is this good optimization? How is it the users' fault for not having good enough hardware, like they want us to believe?

Don't worry, it gets better. Here are two more screenshots, both with every setting in the game on Ultra, including textures, and 4x TXAA. The difference? The first uses the Far Cry 3 SLI bits; the second uses the Assassin's Creed III SLI bits. I chose those because the Watch Dogs engine is a combination of two engines: the Dunia engine (Far Cry 3) and the AnvilNext engine (Assassin's Creed III).

So there you have it, folks. With just 15 minutes of my time, I took 74 FPS on Ultra and turned it into 93 FPS on Ultra. Why can't Nvidia do this? Why can't the developers? It isn't rocket science; it's pure laziness on the part of the developers and the people at GeForce making these drivers. 19 frames might not seem like much, but for a guy running a GTX 760, those 19 frames could be the difference between utterly unplayable and just playable enough. The effort is pathetic and, as consumers, we deserve better.
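For perspective, here's what those screenshot numbers work out to. This is just back-of-the-envelope arithmetic on the 74 → 93 FPS figures reported above, nothing driver-specific:

```python
# Rough check of the SLI-bits gain reported above: 74 -> 93 FPS on Ultra.
before_fps = 74.0  # default Watch Dogs SLI profile
after_fps = 93.0   # with the Far Cry 3 / AC III SLI bits swapped in

gain_pct = (after_fps - before_fps) / before_fps * 100
frametime_before_ms = 1000.0 / before_fps
frametime_after_ms = 1000.0 / after_fps

print(f"FPS gain: {gain_pct:.1f}%")  # roughly 25.7%
print(f"Frame time: {frametime_before_ms:.1f} ms -> {frametime_after_ms:.1f} ms")
```

In other words, a 15-minute profile tweak shaved almost 3 ms off every frame, which is exactly the kind of headroom a single-card GTX 760 would need.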
EMANKILLER12
Member
167
11-04-2016, 08:33 PM
#2
Day 1 patch requested.
husker53
Posting Freak
802
11-22-2016, 05:30 PM
#3
Hey there! Just a quick note about that 760 comment.
Saiizar
Junior Member
15
11-23-2016, 02:05 AM
#4
Rephrase your request with the necessary details.
FlameSquid32
Senior Member
501
11-23-2016, 10:08 AM
#5
Wup, done.
LightCloud
Member
145
11-23-2016, 04:27 PM
#6
Thanks
Galaxtico99
Junior Member
13
11-28-2016, 09:15 AM
#7
This game is the next one with hardware problems, really sloppy. @Misanthrope
aikorner
Junior Member
43
11-30-2016, 09:30 AM
#8
I also found another way to improve your experience; check out this screenshot. The textures look less detailed, but it runs smoothly and is actually really enjoyable! For those missing the point: seriously, take it seriously and save the $45: http://store.steampowered.com/app/8190/
Misch11
Junior Member
26
11-30-2016, 12:48 PM
#9
So you noticed your tweak worked and immediately saw its value. Makes you wonder what the developers, and companies like Nvidia and AMD, are actually doing with their time.
xXStrikeBackXx
220
11-30-2016, 02:44 PM
#10
I also achieve a stable 60 FPS at full settings with my 660 SLI configuration.