
extremely slow backup of small files

03-15-2016, 11:52 AM
"However, for optimum performance, we recommend that you do not exceed 50,000 files per folder."

I've tried asking the devs about this in private many times and never got an answer.
Anyway, I went over this limit and it still works. (I'm using a filesystem on top of hubiC, meaning I write to the filesystem and it writes its data files under a single folder.) It works with no problem, at 150 Mbit/s in both directions from an OVH VPS.

03-15-2016, 07:43 AM

I have the same issue. I NEED to back up a working directory with lots of small files, and the upload is crazy slow! It uploads just a few files per minute. I tried increasing the number of concurrent uploads, but then it fails very often, and the overall process doesn't seem any faster. There are long stretches with no real upload at all: all the files get listed but not uploaded.
I suspect some kind of very, VERY slow per-file connection setup. It is almost unusable, and I don't think I will ever reach the end of this folder.

I can see a few ways to improve this:
- faster connection/checking
- bulk upload for small files (or at least for the first sync)
- exclusion rules on file patterns. Most of those files are dev dependencies I don't need to back up, but they are all in the project tree. It would be great if I could exclude some files/folders with regexp matching rules.


05-28-2015, 12:17 PM
I see here that it is advisable not to upload a folder with more than 50,000 files:
"However, for optimum performance, we recommend that you do not exceed 50,000 files per folder."

Does it matter if the files are all in one folder, or spread out in subfolders?

05-25-2015, 10:26 AM
My workaround is to tar the folder, put the archive in a dedicated folder, and back up that folder.
But this is inconvenient if I want to back up a working folder with lots of small files that change daily.
I thought it would work like rsync (which is also slower for lots of small files, but not that slow).
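For anyone wanting to try the same workaround, a minimal sketch (the paths here are hypothetical; adjust them to your own setup):

```shell
SRC=/home/me/project          # working folder with many small files
STAGE=/home/me/backup-stage   # folder the backup client is pointed at

# Pack the whole tree into a single archive so the client uploads one
# large file instead of tens of thousands of small ones.
tar -czf "$STAGE/project.tar.gz" -C "$(dirname "$SRC")" "$(basename "$SRC")"
```

The `-C` trick archives the folder by its base name, so restoring unpacks a single `project/` directory rather than the full absolute path.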

05-25-2015, 10:09 AM
I started a linux backup of a folder that has about 50000 files and total size of about 1.3GB, and it was extremely slow.
After about 12 hours, it was only 33% through.
The upload speed shown by "hubic status" was about 1 KB/s (I remember seeing speeds of 800 B/s and 2.2 KB/s).

I run the backup from a Linux OVH server, so my network connection should not be the problem. With other backups containing large files, I get typical upload speeds of 500 KB/s and even up to 4 MB/s.

I cancelled the backup (which was a bit tricky to do) and started a backup of large files instead. It was fast.

So the backup seems to be extremely slow for small files (slower than one file per second).
I also noticed that the upload speed varies widely during a backup; apparently it changes constantly depending on the size of the file currently being processed.
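For reference, the per-file rate implied by the numbers above can be sanity-checked, assuming the 33% progress figure refers to file count:

```shell
# ~50,000 files total, ~33% done after 12 hours of running
awk 'BEGIN { printf "%.2f files/s\n", 50000 * 0.33 / (12 * 3600) }'
# prints 0.38 files/s, i.e. well under one file per second
```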