
Slow transfer speeds on Manjaro

imnotben
Member
67
06-26-2016, 01:26 PM
#1
Computer: I'm using a NUC J4005 with a 2.7 GHz processor, 16 GB RAM, and a 120 GB SATA SSD. My internet connection is 30 Mbps, with gigabit LAN. I've been trying to run Linux on it, but data transfers are slow, and they're slow on Windows 10 as well. Fast.com shows 30 Mbps on both machines, yet when downloading from an FTP server I get only about 900 KiB/s on Linux and around 700 KiB/s on Windows. FileZilla behaves the same on both systems. I've also tried HTTP downloads, which cap out around 1.2 MiB/s. Pulling files from my NAS is much faster (up to 105 MiB/s on Windows), so Samba doesn't seem to be the issue: I get full speed on the local network but not over the internet. What should I do?
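For context, it's worth checking how far those numbers actually are from the line rate. This is plain shell arithmetic, no assumptions about the setup beyond the figures quoted above:

```shell
# Convert a 30 Mbit/s line rate into KiB/s and compare with the observed speed.
# 30 Mbit/s = 30,000,000 bits/s; divide by 8 for bytes, then by 1024 for KiB.
line_mbps=30
max_kib=$(( line_mbps * 1000000 / 8 / 1024 ))   # theoretical ceiling in KiB/s
observed_kib=900
pct=$(( observed_kib * 100 / max_kib ))
echo "link max: ${max_kib} KiB/s; observed: ${observed_kib} KiB/s (${pct}% of line rate)"
```

So roughly 900 KiB/s is only about a quarter of what a 30 Mbps link can deliver, which points at the protocol or client rather than the link itself.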

_KotoVasa_
Member
197
06-27-2016, 05:12 PM
#2
Have you tried wget for downloads? There are benchmarks comparing it with other methods against both Linux and Windows servers, and it often comes out ahead. Generally, HTTP is also considered safer and often faster than FTP. For a quick test, switch to a virtual console (Ctrl-Alt-F2) and run it there as root or with sudo instead of through the GUI; that can rule out desktop overhead.
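A rough sketch of that suggestion (the hostname, path, and username are placeholders, not from this thread):

```shell
# Fetch the same file over FTP and over HTTP and compare the average
# throughput that wget reports at the end of each transfer.
# ftp.example.com, pub/test.iso, and "user" are placeholders.
wget --user=user --ask-password "ftp://ftp.example.com/pub/test.iso"
wget "http://ftp.example.com/pub/test.iso"
```

If the HTTP fetch is consistently faster against the same server, the bottleneck is in the FTP path rather than the network.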

Sonicmolina
Junior Member
9
06-29-2016, 03:00 AM
#3
I tried this, and the server does roughly 1 MiB/s via FTP compared to 3 MiB/s with wget, so the problem seems to be FileZilla itself. Maybe it's time, after all these years without a hiccup, to try another FTP client. gFTP pulls at full speed, so I'll go with that for now.

OpSpambot
Member
57
06-29-2016, 04:25 AM
#4
Things change, I guess. Or just keep using wget for the speed. I avoid FTP altogether unless it's the only option, so I can't offer much advice there. Best of luck!

Craft_Mob
Member
73
07-06-2016, 11:13 AM
#5
Use the built-in FTP tool in your Linux distribution. Connect to the server via ftp://foo.bar and enter your login details when prompted.
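If you'd rather stay in the terminal, the same login works with the command-line clients (lftp is an assumption here; it may need installing from your distro's repos):

```shell
# The stock command-line client; foo.bar is the server from the post above.
ftp foo.bar            # prompts for your username and password

# lftp supports resumed and segmented transfers, which can help on slow links:
lftp -u user ftp://foo.bar
```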

ripa5000
Posting Freak
884
07-20-2016, 02:20 PM
#6
That seems to work, but it isn't the best approach for me, because I need consistent access to multiple servers beyond just downloading files. It wouldn't deliver a smooth user experience.

OmqDace
Posting Freak
798
07-22-2016, 02:52 AM
#7
You have options for managing that path. You can pin it as a quick-access item, add a shortcut to your Desktop or Panel, or set it to mount automatically at boot so it stays permanently in the file manager. Unless you need more advanced features, mounting the share from the terminal and perhaps scripting automatic syncing should be sufficient. But if that feels overwhelming and gFTP is working well, it might be best to stick with it.
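A minimal sketch of the mount-and-sync route; curlftpfs and every name and path below are assumptions, not something confirmed in this thread:

```shell
# Mount the FTP server as a local directory via FUSE (curlftpfs package).
mkdir -p ~/ftp-share
curlftpfs ftp://user:password@foo.bar/ ~/ftp-share

# Equivalent fstab entry, if you want it mounted at every boot (one line):
# curlftpfs#user:password@foo.bar/ /home/you/ftp-share fuse rw,allow_other,_netdev 0 0

# A simple sync you could run from cron to mirror new files locally:
rsync -av ~/ftp-share/ ~/Mirror/

# Unmount when done:
fusermount -u ~/ftp-share
```

Once the share is an ordinary directory, any file manager can pin it as a quick-access entry like a local folder.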