
What's the recommended method of transferring from Bunny Storage to Amazon S3

Version: 4.0.12
PHP version: 8.2.29
Database driver: MariaDB
Database version: 10.6.23
Web browser: Chrome

Mortgage

Chevereto Member
Hey guys,

Stressful time, these last 24 hours.

What's the recommended method of transferring from Bunny Storage to an Amazon S3 bucket? I have 53.48 GB. I tried downloading locally via FTP and then re-uploading, but FTP is meh; it just hangs on certain files and is going to take me days.

I tried rsync, but it won't connect to Bunny; it kept saying the directory was not found even though the directory exists and I can connect via FileZilla without issue.

Thanks.
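
For reference, a common way to do this kind of transfer is rclone, with an FTP remote on the Bunny side (Bunny Storage exposes FTP/FTPS rather than SSH, which would also explain why rsync can't connect) and an S3 remote on the AWS side. A minimal sketch, assuming both remotes have already been set up with rclone config under the hypothetical names "bunny" and "aws":

Bash:
# copy everything from the Bunny storage zone root into the S3 bucket,
# with parallel transfers and a live progress display
rclone copy bunny: aws:<bucket-name> --transfers 8 --checkers 16 --progress

Running this on a small VPS with good bandwidth avoids pulling all 53 GB through a home connection twice.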
 
Okay, so I have made some kind of progress using rclone.

However, it's throwing a lot of Entry doesn't belong in directory "YYYY/MM/DD/" (too short) - ignoring and "Failed to read last modified" errors.

Does anyone from the community use rclone much and can shed some light on this? To me it sounds like I'm going to be missing a lot of media.
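
For what it's worth, the "Failed to read last modified" errors can usually be sidestepped by telling rclone to compare files by size only, and a follow-up check will show whether anything was actually skipped. A hedged sketch, assuming the same hypothetical remote names as above:

Bash:
# --size-only skips the modtime comparisons the FTP server is failing on;
# -vv logs exactly which entries get ignored and why
rclone copy bunny: aws:<bucket-name> --size-only --transfers 8 -vv
# one-way check: list anything present on Bunny but missing from S3
rclone check bunny: aws:<bucket-name> --size-only --one-way

The "Entry doesn't belong in directory" warnings come from the FTP listing itself, so the check step is the more reliable way to confirm whether media is really missing.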
 
Hey Rodolfo, I've checked the Bunny storage end and none of the folders are empty, unfortunately.

Since rclone was having so many issues, I'm currently waiting for the entire 49.9 GB .zip file to upload to S3, but I've got 11 hours to wait now... 1.3 MB/s transfer rate.

Then I'm going to attempt, with ChatGPT's help, to unzip the archive on an EC2 instance. I am not tech-savvy enough for all this (lol!)
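
For the unzip-on-EC2 step, the shape of it is simple; a sketch, assuming the archive was uploaded to the bucket as media.zip (a hypothetical name) and the instance has an IAM role or credentials with access to the bucket:

Bash:
# pull the archive from the bucket onto the instance
aws s3 cp s3://<bucket-name>/media.zip .
# extract it quietly into a local directory
unzip -q media.zip -d media
# push the extracted tree back up to the bucket root
aws s3 sync media s3://<bucket-name>/

Whether the extracted paths land where Chevereto expects them (the YYYY/MM/DD layout) depends on how the zip was built, so spot-check a few image URLs afterwards.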
 
You just need a VPS with a good network connection, so you can pivot the data through it. You can pay for these machines on an hourly basis; for your needs it would cost about 3.5 cents per hour... Just make sure to remove the machine once you're done.
 
I ended up using an Amazon EC2 instance and I think I have successfully transferred everything. I did the "Migrate external storage records" from ID: 2 > ID: 1.

Although I think the bucket policy is blocking things, as it's AccessDenied on all media.

domain.tld is obviously my domain and <bucket-name> is my bucket's name; I just removed them from here.

JSON:
{
    "Version": "2012-10-17",
    "Id": "http referer policy example",
    "Statement": [
        {
            "Sid": "Allow get requests referred by www.DOMAIN.TLD and DOMAIN.TLD.",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*",
            "Condition": {
                "StringLike": {
                    "aws:Referer": [
                        "https://www.DOMAIN.TLD/*",
                        "https://DOMAIN.TLD/*"
                    ]
                }
            }
        },
        {
            "Sid": "Allow upload of objects from any source",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        }
    ]
}

Should I be using this one from 2015? https://chevereto.com/community/threads/minimum-s3-policy-for-chevereto.6052/post-33189

EDIT:
This one from ChatGPT works and resolved the images (not sure if it's correct or valid, as GPT can make mistakes):
JSON:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowUploadsFromWebsite",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "AllowCDNAndWebsiteToRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        }
    ]
}
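
One caution on both policies above: "Principal": "*" on s3:PutObject makes the bucket world-writable, meaning anyone on the internet can upload objects to it. A tighter sketch, assuming uploads come from a dedicated IAM user whose keys are configured in Chevereto (the account ID and user name below are placeholders):

Bash:
# keep public reads for serving images, but restrict writes to the
# hypothetical IAM user whose keys Chevereto uses
cat > policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "CheveretoWriteOnly",
            "Effect": "Allow",
            "Principal": { "AWS": "arn:aws:iam::<account-id>:user/<chevereto-user>" },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        }
    ]
}
EOF
aws s3api put-bucket-policy --bucket <bucket-name> --policy file://policy.json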
 