Random/Bulk Defined Upload Directory Structure

tryimg

Chevereto Member
I ran into another issue where all the uploads reside on local storage. After a few months, I found out that the upload folder had grown to 16GB of files. I then had to manually create another directory and point the uploads to it. This was just to manage the files, so that if I ever have to download a backup and restore it, I can do it in chunks instead of as one whole 50GB directory.

So the best I could do was to start naming the directories 1, 2, 3, 4 and so on.

I think it would be a nice feature if you let users select multiple directories instead of just one, or create directories on an incremental basis (say, once a directory reaches 1GB, a new directory is created with the next number or character).

Or maybe, if the user selects multiple directories, the uploads could be placed randomly among them. (But I guess that would add load on the database for websites with a huge number of files when accessing albums and view links.)
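
For illustration only, here is a minimal sketch of that size-based rotation idea in Python (not Chevereto code; the base path, the 1GB limit, and the numbered folder names are assumptions taken from the suggestion above):

    # Sketch of size-based directory rotation (illustrative, not Chevereto code).
    import os

    BASE = "/home/user/public_html/images"   # hypothetical upload root
    LIMIT = 1 * 1024 ** 3                    # hypothetical 1GB threshold

    def dir_size(path):
        """Total size in bytes of all files under path."""
        total = 0
        for root, _dirs, files in os.walk(path):
            for name in files:
                total += os.path.getsize(os.path.join(root, name))
        return total

    def current_upload_dir():
        """Return the first numbered directory still under LIMIT, creating a new one when needed."""
        n = 1
        while True:
            path = os.path.join(BASE, str(n))
            if not os.path.isdir(path):
                os.makedirs(path)
                return path
            if dir_size(path) < LIMIT:
                return path
            n += 1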
 
How are you handling the backups? Distributing the files among different folders on local storage only helps if you use old-fashioned backup methods.
 
I'm doing a normal backup via cPanel. Or, over SSH, I would zip the folder and download it.
http://tryimg.com/1
http://tryimg.com/2
http://tryimg.com/3
and so on is how I was managing all the uploads. Every couple of months, I would change the upload directory path and give it a new number, so I could zip that whole directory and download it. Instead of downloading a single huge 50GB file, I was splitting the backup into chunks like this.
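
As a rough sketch of that per-folder chunking (paths are hypothetical), each numbered directory can be archived on its own instead of zipping the whole upload root:

    # Sketch: back up each numbered upload directory as its own zip archive.
    # Paths are placeholders; this mirrors the manual chunked-download workflow.
    import os
    import shutil

    BASE = "/home/user/public_html/images"   # upload root containing 1, 2, 3, ...
    OUT = "/home/user/backups"               # where the per-folder archives go

    os.makedirs(OUT, exist_ok=True)
    for name in sorted(os.listdir(BASE)):
        src = os.path.join(BASE, name)
        if os.path.isdir(src):
            # Produces e.g. 1.zip, 2.zip, ... one downloadable chunk per folder
            shutil.make_archive(os.path.join(OUT, name), "zip", src)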
 
You should use rsync; between servers it is very fast. That way you will always have a 1:1 copy of that folder.
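
For reference, a minimal sketch of that rsync approach (hostnames and paths are made up; the same command can of course be run directly from a shell or a cron job):

    # Sketch: mirror the upload folder to another server with rsync.
    # -a preserves attributes, -z compresses in transit, --delete keeps the
    # destination a 1:1 copy of the source. All hosts/paths are placeholders.
    import subprocess

    subprocess.run(
        [
            "rsync", "-az", "--delete",
            "/home/user/public_html/images/",
            "backup@backup.example.com:/backups/images/",
        ],
        check=True,
    )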
 