F5F Stay Refreshed Software › PC Gaming

Review of Watch Dogs Enhancements for SLI Systems

Pages (3): Previous 1 2 3 Next
iskela99 (Member, 247 posts)
12-04-2016, 12:50 AM #11
I’m really struggling with the game right now. I tried a 690, but it’s just too rough to play. Honestly, I’m going to go to bed and cry. I’d planned to stay up playing, but I guess I’ll just tell my boss I can take extra hours instead.

sKyiex (Junior Member, 3 posts)
12-04-2016, 01:43 AM #12
Extremely unlikely. The game isn't running well at the moment, and it's uncertain whether it will ever be fixed. It seems poorly optimized overall, and SLI support even more so. Better to wait until they resolve these issues.

RubyMine (Junior Member, 45 posts)
12-04-2016, 03:53 AM #13

FramezTheBest (Member, 222 posts)
12-04-2016, 04:28 AM #14
I adjusted the SLI bits in Nvidia Inspector. Initially I used the Far Cry 3 bits, then switched to the Assassin's Creed 3 bits, because the Watch Dogs engine combines elements from those titles. To access the SLI bits, open Nvidia Inspector and click the wrench icon. Modify the SLI bits as shown in the image below: select the SLI bits line in the donor profile and paste the entire line into the same field under the Watch Dogs profile. There’s a drop-down menu in the top left for navigating to Watch Dogs, or you can type the name directly. When you're done, click Apply changes in the top right. If you’re satisfied, keep it; otherwise, restoring the defaults is straightforward.
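When trying different SLI bits like this, it helps to compare actual frame-time captures before and after rather than eyeballing it. Below is a minimal sketch of that comparison, assuming a log of cumulative timestamps in milliseconds (the format FRAPS-style frame-time logs use); the numbers and the `fps_stats` helper are illustrative, not from this post.

```python
# Rough helper for A/B-testing SLI compatibility bits: compare two frame-time
# captures (cumulative timestamps in milliseconds) and report average FPS
# plus the worst single frame. Sketch only; the data below is made up.

def fps_stats(timestamps_ms):
    """Return (average FPS, worst frame time in ms) from cumulative timestamps."""
    frames = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return len(frames) / total_s, max(frames)

# Example captures: default bits vs. Far Cry 3 bits (made-up numbers).
default_run = [0.0, 40.0, 75.0, 130.0, 165.0]   # stuttery
fc3_run     = [0.0, 20.0, 41.0, 60.0, 80.0]     # smoother

for label, run in (("default", default_run), ("FC3 bits", fc3_run)):
    avg, worst = fps_stats(run)
    print(f"{label}: {avg:.1f} FPS avg, worst frame {worst:.1f} ms")
```

The worst-frame number matters as much as the average here: a bits value that raises average FPS but produces long individual frames will still feel stuttery.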

verybored (Junior Member, 24 posts)
12-04-2016, 08:07 AM #15
I managed to do that, but performance stays around 45 FPS even at the lowest settings, with drops down to 15 frames. The 650M SLI is really frustrating. Got any suggestions for a solution?

Meowables (Senior Member, 608 posts)
12-04-2016, 03:42 PM #16
Along with using the Far Cry 3 SLI bits, my average frame rate has reached 110 FPS. With v-sync disabled, I occasionally hit rates above 150. This improvement comes from Zawad Iftikhar’s detailed guide; his article includes many additional optimizations:

If you own a high-end system and wish to prevent graphics or performance problems before Ubisoft updates the game, try running it with AA disabled, GPU rendering level at 1, and v-sync turned off. In the Nvidia Control Panel, enable FXAA, set anisotropic filtering to 16x, enable AA Mode Override, set AA to 32x CSAA, set AA transparency to 8x, and set maximum pre-rendered frames to 1 (optional). Enable the multi-display setting, choose Prefer Maximum Performance, and disable texture filtering. Make sure Clamp and Threaded Optimization are active. Experiment with toggling v-sync to find the settings that best reduce stutter. Similar options exist for AMD users in the Catalyst Control Panel. -Zawad Iftikhar

P.S. A 650M in SLI is quite limited. Even the desktop 650 barely supports modern titles, and the mobile versions perform worse still, often matching older desktop parts. For instance, a 680M roughly matches a desktop 670. I recommend lowering the game to 1280x720 and turning down all the settings; you might reach playable performance. Honestly, a gaming laptop is rarely worth it, even for beginners. It’s usually wiser to assemble a custom PC: a $500 build can rival a gaming laptop's performance. This comes from my own experience; gaming laptops are generally not recommended.
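To put a rough number on the resolution advice above: if the GPU is the bottleneck, frame rate tends to scale close to inversely with pixel count. This is a back-of-the-envelope estimate only (games are rarely purely fill-rate bound), and the `scaled_fps` helper and numbers are illustrative assumptions, not measurements from this thread.

```python
# Rough estimate of the FPS gain from dropping resolution, assuming
# performance scales with pixel count (GPU fill-rate bound). A rule of
# thumb, not a measurement.

def scaled_fps(current_fps, current_res, target_res):
    cur_px = current_res[0] * current_res[1]
    tgt_px = target_res[0] * target_res[1]
    return current_fps * cur_px / tgt_px

# e.g. 15 FPS at 1920x1080 -> roughly how much at 1280x720?
print(round(scaled_fps(15, (1920, 1080), (1280, 720)), 1))  # → 33.8
```

1280x720 has 2.25x fewer pixels than 1080p, so the 15 FPS lows reported earlier could plausibly land in the low-30s range, which matches the "might achieve playable performance" suggestion.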

LegomakerLP (Junior Member, 26 posts)
12-07-2016, 02:05 PM #17
I've tested everything but still face significant stuttering and FPS drops. My setup is an i7 4770K with GTX 780 SLI and 16GB RAM at 1600MHz, and the game runs from an SSD. Are there any other adjustments to consider? (Most of the stuttering disappears when I set textures to Normal, though.) Even with just 2GB of VRAM in use, performance drops drastically: my average FPS falls from 80 to around 30, and with v-sync it drops from 60 to 30 frames. I don't even need to change much. The only time the game runs smoothly is on Normal settings, not even High. That's unusual. Edit: I realized that when I set the game to use just 2GB of VRAM, it actually uses 3GB, and with Ultra textures it tries to allocate more than 3GB. Even on Normal textures my GPU is using 2–3GB. Is this behavior typical?
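Since the in-game VRAM budget and actual usage seem to disagree, it's worth watching real usage while testing texture settings. `nvidia-smi` can report per-GPU memory use in CSV form; this sketch parses that output. The query flags are nvidia-smi's documented CSV mode, but the sample string and helper names are illustrative.

```python
# Check actual VRAM usage (per GPU, e.g. both cards in SLI) via nvidia-smi.
import subprocess

def parse_used_mib(csv_text):
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv` output."""
    lines = [line.strip() for line in csv_text.strip().splitlines()]
    # Skip the header row; one data row per GPU, e.g. "2187 MiB".
    return [int(row.split()[0]) for row in lines[1:]]

def vram_used_mib():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv"],
        capture_output=True, text=True, check=True).stdout
    return parse_used_mib(out)

sample = "memory.used [MiB]\n2187 MiB\n2201 MiB\n"  # two GPUs in SLI
print(parse_used_mib(sample))  # → [2187, 2201]
```

Running this alongside the game while switching between Normal and Ultra textures would confirm whether the game really overshoots its configured 2GB budget.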

jjmonkey13 (Member, 236 posts)
12-28-2016, 11:02 PM #18
Try repeating your previous steps, turning off AA in the game and forcing it through the control panel instead. Detailed guidance is available in the instructions linked below; the solutions are outlined under #3 on that page.

xBlue_Dod (Member, 57 posts)
12-30-2016, 06:27 AM #19
I also gave it a shot, and it doesn't work well for me. I'm sharing what I did so you can check whether I made a mistake. Translated in full:

Anisotropic Filtering: 16x
AA - FXAA: On
AA - Gamma correction: Default (On)
AA - Setting: 32x CSAA
AA - Mode: Enhance the application setting
AA - Transparency: 8x (supersample)
CUDA - GPUs: Default (All)
Triple Buffering: Off
Power management mode: Adaptive
Maximum pre-rendered frames: 1
Multi-display/mixed-GPU acceleration: Single display performance mode
Ambient Occlusion: not available
SLI Rendering Mode: Single GPU (this is the one I wasn't sure I set right; it actually reduced stutter)
Shader Cache: Default
Texture filtering - Anisotropic sample optimization: Off
Texture filtering - Quality: Performance
Texture filtering - Negative LOD bias: Clamp
Texture filtering - Trilinear optimization: On
Threaded optimization: Off
Vertical sync: Off (also tried On)

UnMuteLP (Member, 74 posts)
01-01-2017, 06:17 AM #20
It was actually a second test; I accidentally uploaded the wrong image at first. I tried it at level 1 and then at level 4 just to check, but neither helped much.
