
Uploading or downloading large files not working when using FTP/S3/Wasabi/Backblaze storage

Problem

When using an FTP, S3, Backblaze or Wasabi server for storage, uploads or downloads of large files fail, while smaller files work fine.

Fix

This can be caused by two things:

1) Memory issues - PHP may not have enough memory available to process the request. To fix this, add the following line at the end of your _config.inc.php file (in the script root folder):

ini_set('memory_limit', '4096M');

You should set this limit to at least the maximum filesize you allow on upload. Note: 2048M may be the maximum on a 32-bit system.
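Note that some related PHP settings cannot be changed at runtime with ini_set() and must be set in php.ini (or your PHP-FPM pool config) instead. A sketch, with illustrative values you should match to your own maximum upload size:

```ini
; php.ini - illustrative values; match them to your maximum permitted upload size
memory_limit = 4096M
upload_max_filesize = 4096M   ; upload_max_filesize and post_max_size cannot be
post_max_size = 4096M         ; set via ini_set(), so they must go in php.ini
```

After changing php.ini, restart your web server or PHP-FPM for the values to take effect.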

2) Timeout issues on large files - File uploads to off-site storage (apart from "direct") work as follows:

- User uploads file to your site
- Site receives file into temp storage
- When it reaches 100%, it's then transferred into external storage while the user waits
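To see why the final step can hit a timeout, a rough back-of-envelope calculation helps. The figures below (a 4GB file, 20MB/s transfer speed between your server and the storage host) are illustrative assumptions, not measurements from YetiShare:

```python
# Estimate how long the post-upload transfer to external storage takes.
# All numbers here are illustrative assumptions.
file_size_mb = 4 * 1024       # a 4GB upload
transfer_speed = 20           # MB/s from your server to the storage host

transfer_seconds = file_size_mb / transfer_speed
print(round(transfer_seconds))  # ~205 seconds, well past a typical 30-60s HTTP timeout
```

So even on a reasonably fast link, the user's request must stay open for several minutes after the upload itself completes.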

For large files this last process can take a long time. If it takes longer than your server's HTTP timeout settings allow, the request will fail with an error. To fix this, raise your PHP timeout settings. Your server error logs should confirm whether this is the cause.
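The relevant PHP timeout directives can be raised in php.ini. The values below are a sketch with assumed limits; tune them to the largest transfer you expect:

```ini
; php.ini - illustrative values for long transfers to external storage
max_execution_time = 3600   ; seconds a script may run
max_input_time = 3600       ; seconds PHP may spend parsing request input
```

If PHP sits behind a web server or FastCGI layer, that layer has its own timeouts too (for example Apache's Timeout directive or PHP-FPM's request_terminate_timeout), which should be raised to match.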

Also note that if you're using a proxy like Cloudflare, it enforces its own HTTP timeout (around 100 seconds on Cloudflare's standard plans), so we'd recommend disabling Cloudflare for upload traffic.

Notes:

- These types of storage methods are best suited to smaller files (< 1GB) and low-traffic sites. If you want to support very large filesizes and high traffic, you're best off using "local" or "direct" storage.