You’ve built a slick file upload feature in Flask. It works perfectly when you test with small files—images, PDFs, whatever. Then someone tries uploading a 20MB video and your app returns a cryptic 413 Request Entity Too Large error. Or worse, the upload just hangs forever and times out. What’s going on?
File uploads in Flask can be deceptively tricky. Between Flask’s own size limits, web server configurations, reverse proxy settings, and memory constraints, there are at least five different places where large uploads can fail. Let’s walk through each one and fix them all.
TLDR: Quick Fix for Flask File Upload Errors
Most Common Cause: MAX_CONTENT_LENGTH set lower than your uploads. Flask’s default is actually None (no limit at all), but the 16MB value from the Flask docs example gets copied everywhere, so a 16MB cap is what most apps ship with.
❌ Before (rejects files >16MB):
from flask import Flask, request

app = Flask(__name__)
# 16MB cap, copied from the Flask docs example
app.config['MAX_CONTENT_LENGTH'] = 16 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    file.save(f'/uploads/{file.filename}')
    return {'status': 'success'}
✅ After (accepts larger files):
from flask import Flask, request
from werkzeug.utils import secure_filename

app = Flask(__name__)
# Allow uploads up to 100MB
app.config['MAX_CONTENT_LENGTH'] = 100 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    if 'file' not in request.files:
        return {'error': 'No file provided'}, 400
    file = request.files['file']
    if file.filename == '':
        return {'error': 'Empty filename'}, 400
    file.save(f'/uploads/{secure_filename(file.filename)}')
    return {'status': 'success'}
Quick Prevention Tips:
- Set MAX_CONTENT_LENGTH to your actual limit
- Configure your reverse proxy (nginx, Apache) limits
- Use streaming for very large files (>100MB)
- Validate file types and sizes on the client side first
Symptom Description
When Flask rejects file uploads, you’ll typically see one of these errors:
413 Request Entity Too Large:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>413 Request Entity Too Large</title>
<h1>Request Entity Too Large</h1>
<p>The data value transmitted exceeds the capacity limit.</p>
RequestEntityTooLarge Exception:
werkzeug.exceptions.RequestEntityTooLarge: 413 Request Entity Too Large:
The data value transmitted exceeds the capacity limit.
Connection Reset or Timeout: Sometimes the upload doesn’t even get an error—it just hangs and eventually times out. This usually happens when the reverse proxy (nginx, Apache) blocks the upload before Flask even sees it.
Memory Error (for massive files):
MemoryError: Unable to allocate array with shape...
This happens when you try loading huge files entirely into memory instead of streaming them.
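The streaming fix is covered in detail under Cause #4, but the core pattern is worth seeing in isolation. A minimal sketch, using in-memory streams so you can run it anywhere:

```python
import io

def save_stream(src, dst, chunk_size=8192):
    """Copy src to dst in fixed-size chunks; peak memory stays near chunk_size."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

# Demo: in-memory streams stand in for the upload and the target file
src = io.BytesIO(b'x' * 100_000)
dst = io.BytesIO()
save_stream(src, dst)
print(len(dst.getvalue()))  # 100000
```

The same loop works unchanged when `src` is `file.stream` and `dst` is an open file on disk.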
Diagnostic Steps
Before we start fixing things, let’s figure out exactly where your upload is failing.
Step 1: Check Flask’s Debug Output
Enable Flask debug mode to see detailed error messages:
from flask import Flask

app = Flask(__name__)
app.config['DEBUG'] = True  # Only in development!
Now when an upload fails, you’ll get a full stack trace showing whether it’s Flask blocking the upload or something else.
Step 2: Test with curl
Use curl to test uploads directly, bypassing any frontend JavaScript:
# Test with a small file
curl -X POST -F "file=@small_file.jpg" http://localhost:5000/upload

# Test with a large file, printing the HTTP status code
# (curl exits 0 even on a 413, so check the status code, not $?)
curl -X POST -F "file=@large_file.mp4" -w "%{http_code}\n" http://localhost:5000/upload
If curl succeeds but your web form fails, the problem is in your JavaScript. If curl also fails, it’s a server-side issue.
Step 3: Check All Size Limits
There are multiple size limits that can block uploads. Check each one:
from flask import Flask, request

app = Flask(__name__)

# Check Flask's limit
print(f"Flask MAX_CONTENT_LENGTH: {app.config.get('MAX_CONTENT_LENGTH', 'Not set')}")

# In your route, check the actual request size
@app.route('/upload', methods=['POST'])
def upload_file():
    content_length = request.content_length
    print(f"Request size: {content_length / (1024*1024):.2f} MB")
    # ... rest of your code
Step 4: Monitor Your Server Logs
Check logs for your web server (nginx, Apache, Gunicorn):
# Nginx logs
tail -f /var/log/nginx/error.log
# Gunicorn logs
tail -f /var/log/gunicorn/error.log
# Flask logs
tail -f /var/log/flask/app.log
Look for error messages mentioning “client intended to send too large body” or similar.
Cause #1: Flask’s MAX_CONTENT_LENGTH Too Low
Flask’s MAX_CONTENT_LENGTH setting blocks requests above the configured size. The default is None (no limit), but the 16MB value from the Flask docs example ends up in most codebases, and 16MB is pretty conservative for real uploads.
The Problem:
from flask import Flask, request

app = Flask(__name__)
# 16MB cap, copied from the Flask docs example
app.config['MAX_CONTENT_LENGTH'] = 16 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    # This fails for files > 16MB
    file = request.files['file']
    file.save(f'/uploads/{file.filename}')
    return {'status': 'success'}
The Fix:
Set MAX_CONTENT_LENGTH to match your needs:
from flask import Flask, request

app = Flask(__name__)
# Allow uploads up to 100MB
app.config['MAX_CONTENT_LENGTH'] = 100 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    file.save(f'/uploads/{file.filename}')
    return {'status': 'success'}
Better Approach - Handle the Exception:
Don’t let Flask return a generic 413 error. Catch it and return a helpful message:
from flask import Flask, request, jsonify
from werkzeug.exceptions import RequestEntityTooLarge

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 100 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    try:
        file = request.files['file']
        file.save(f'/uploads/{file.filename}')
        return jsonify({'status': 'success'})
    except RequestEntityTooLarge:
        return jsonify({
            'error': 'File too large',
            'max_size': '100MB'
        }), 413
Production Pattern - Global Error Handler:
from flask import Flask, request, jsonify
from werkzeug.exceptions import RequestEntityTooLarge

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 100 * 1024 * 1024

@app.errorhandler(RequestEntityTooLarge)
def handle_file_too_large(e):
    max_size = app.config['MAX_CONTENT_LENGTH'] / (1024 * 1024)
    return jsonify({
        'error': 'File exceeds maximum size',
        'max_size_mb': max_size,
        'message': f'Please upload files smaller than {max_size:.0f}MB'
    }), 413

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    file.save(f'/uploads/{file.filename}')
    return jsonify({'status': 'success'})
Now every route automatically gets nice error messages for oversized uploads.
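You can verify the 413 behavior without a running server by using Flask’s built-in test client. A minimal sketch with a deliberately tiny 1KB cap (the tiny limit is just for demonstration):

```python
import io
from flask import Flask, request

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 1024  # 1KB, just for this demo

@app.route('/upload', methods=['POST'])
def upload_file():
    request.files.get('file')  # touching the body triggers the size check
    return {'status': 'success'}

client = app.test_client()
small = client.post('/upload', data={'file': (io.BytesIO(b'x' * 100), 'a.bin')})
big = client.post('/upload', data={'file': (io.BytesIO(b'x' * 5000), 'b.bin')})
print(small.status_code, big.status_code)  # 200 413
```

This makes a nice unit test for your real upload route too: post a payload just over your configured limit and assert on the 413 response body.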
Cause #2: Nginx Client Body Size Limit
If you’re running Flask behind nginx (and you should be in production), nginx has its own size limit that defaults to 1MB. Yeah, just 1MB. This catches everyone.
The Problem:
Your nginx config doesn’t specify a client body size:
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        # No client_max_body_size set - defaults to 1MB!
    }
}
The Fix:
Add client_max_body_size to your nginx config:
server {
    listen 80;
    server_name example.com;

    # Allow uploads up to 100MB
    client_max_body_size 100M;

    location / {
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # Also increase timeouts for large uploads
        proxy_connect_timeout 300;
        proxy_send_timeout 300;
        proxy_read_timeout 300;
        send_timeout 300;
    }
}
Important: After changing nginx config, reload nginx:
sudo nginx -t # Test config first
sudo systemctl reload nginx
Route-Specific Limits:
You can set different limits for different routes:
server {
    listen 80;
    server_name example.com;

    # Global limit
    client_max_body_size 10M;

    # Higher limit for upload endpoint
    location /upload {
        client_max_body_size 100M;
        proxy_pass http://127.0.0.1:5000;
    }

    # Standard limit for other routes
    location / {
        proxy_pass http://127.0.0.1:5000;
    }
}
Cause #3: Apache Request Size Limit
If you’re using Apache instead of nginx, it has a similar limit called LimitRequestBody.
The Problem:
Apache config with default limits:
<VirtualHost *:80>
    ServerName example.com

    WSGIDaemonProcess myapp user=www-data group=www-data threads=5
    WSGIScriptAlias / /var/www/myapp/app.wsgi

    # No LimitRequestBody set - unlimited before Apache 2.4.53,
    # 1GB from 2.4.53 on, and individual modules may add their own limits
</VirtualHost>
The Fix:
Set LimitRequestBody explicitly:
<VirtualHost *:80>
    ServerName example.com

    # Allow up to 100MB uploads (value in bytes)
    LimitRequestBody 104857600

    # Apply to specific location
    <Location "/upload">
        LimitRequestBody 104857600
    </Location>

    WSGIDaemonProcess myapp user=www-data group=www-data threads=5
    WSGIScriptAlias / /var/www/myapp/app.wsgi

    # Increase timeout for large uploads
    Timeout 300
</VirtualHost>
Reload Apache after changes:
sudo apache2ctl configtest # Test config
sudo systemctl reload apache2
Cause #4: Memory Issues with Large Files
Even if you’ve configured all the size limits correctly, loading massive files into memory can crash your app.
The Problem:
from flask import Flask, request

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024  # 500MB

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    # This loads the ENTIRE file into memory!
    file_data = file.read()
    # Do something with file_data
    # ...but your server just ran out of memory
The Fix - Stream Instead of Loading:
from flask import Flask, request
import os

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024
UPLOAD_FOLDER = '/uploads'

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    # Stream the file to disk in chunks
    file_path = os.path.join(UPLOAD_FOLDER, file.filename)
    # This reads and writes in chunks, not all at once
    file.save(file_path)
    return {'status': 'success', 'path': file_path}
Advanced - Manual Chunked Upload:
For even more control:
from flask import Flask, request
import os

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024
UPLOAD_FOLDER = '/uploads'
CHUNK_SIZE = 4096  # 4KB chunks

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    file_path = os.path.join(UPLOAD_FOLDER, file.filename)

    # Write file in chunks to control memory usage
    with open(file_path, 'wb') as f:
        while True:
            chunk = file.stream.read(CHUNK_SIZE)
            if not chunk:
                break
            f.write(chunk)
    return {'status': 'success', 'path': file_path}
Production Pattern - With Progress Tracking:
from flask import Flask, request, jsonify
import os
import hashlib

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024
UPLOAD_FOLDER = '/uploads'
CHUNK_SIZE = 8192  # 8KB chunks

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    file_path = os.path.join(UPLOAD_FOLDER, file.filename)

    # Track progress and calculate checksum
    file_size = 0
    file_hash = hashlib.sha256()
    with open(file_path, 'wb') as f:
        while True:
            chunk = file.stream.read(CHUNK_SIZE)
            if not chunk:
                break
            f.write(chunk)
            file_hash.update(chunk)
            file_size += len(chunk)

    return jsonify({
        'status': 'success',
        'path': file_path,
        'size': file_size,
        'checksum': file_hash.hexdigest()
    })
Cause #5: Gunicorn/uWSGI Worker Timeout
When running Flask with Gunicorn or uWSGI in production, workers have timeouts. Large file uploads can exceed these timeouts.
The Problem with Gunicorn:
# Default Gunicorn command
gunicorn -w 4 -b 0.0.0.0:5000 app:app
# Workers timeout after 30 seconds by default
The Fix:
Increase the timeout:
# Allow 5 minutes for requests to complete
gunicorn -w 4 -b 0.0.0.0:5000 --timeout 300 app:app
Better - Use a Config File:
Create gunicorn.conf.py:
import multiprocessing
# Worker configuration
workers = multiprocessing.cpu_count() * 2 + 1
worker_class = 'sync' # or 'gevent' for async
worker_connections = 1000
# Timeout configuration (5 minutes)
timeout = 300
keepalive = 2
# Logging
accesslog = '/var/log/gunicorn/access.log'
errorlog = '/var/log/gunicorn/error.log'
loglevel = 'info'
# For large file uploads
limit_request_line = 8190
limit_request_fields = 100
limit_request_field_size = 8190
Run with:
gunicorn -c gunicorn.conf.py app:app
The Problem with uWSGI:
[uwsgi]
module = app:app
master = true
processes = 4
socket = /tmp/myapp.sock
# No harakiri set - hung requests are never killed
The Fix:
[uwsgi]
module = app:app
master = true
processes = 4
socket = /tmp/myapp.sock

# Kill requests that run longer than 5 minutes
harakiri = 300

# Increase buffer sizes
buffer-size = 65535
post-buffering = 8192

# Handle large requests (100MB)
limit-post = 104857600
Cause #6: Client-Side JavaScript Issues
Sometimes the upload fails not because of server limits, but because your JavaScript isn’t configured properly.
The Problem:
// Frontend upload code
async function uploadFile(file) {
    const formData = new FormData();
    formData.append('file', file);

    // fetch offers no upload progress and no portable timeout control
    const response = await fetch('/upload', {
        method: 'POST',
        body: formData
    });
    return response.json();
}
The Fix - Add Timeout and Progress:
async function uploadFile(file, onProgress) {
    const formData = new FormData();
    formData.append('file', file);

    return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();

        // Track upload progress
        xhr.upload.addEventListener('progress', (e) => {
            if (e.lengthComputable) {
                const percentComplete = (e.loaded / e.total) * 100;
                onProgress(percentComplete);
            }
        });

        // Handle completion
        xhr.addEventListener('load', () => {
            if (xhr.status === 200) {
                resolve(JSON.parse(xhr.responseText));
            } else {
                reject(new Error(`Upload failed: ${xhr.statusText}`));
            }
        });

        // Handle errors
        xhr.addEventListener('error', () => {
            reject(new Error('Network error during upload'));
        });
        xhr.addEventListener('timeout', () => {
            reject(new Error('Upload timeout'));
        });

        // Send the request with a long timeout (5 minutes)
        xhr.open('POST', '/upload');
        xhr.timeout = 300000;
        xhr.send(formData);
    });
}

// Usage
const fileInput = document.getElementById('file-input');
const progressBar = document.getElementById('progress');

fileInput.addEventListener('change', async (e) => {
    const file = e.target.files[0];
    try {
        const result = await uploadFile(file, (percent) => {
            progressBar.style.width = `${percent}%`;
            progressBar.textContent = `${percent.toFixed(0)}%`;
        });
        console.log('Upload successful:', result);
    } catch (error) {
        console.error('Upload failed:', error);
    }
});
Cause #7: Insufficient Disk Space
This one’s obvious but often overlooked—your server’s disk is full.
The Problem:
from flask import Flask, request

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    # Raises OSError (ENOSPC) if the disk is full - surfaces as a generic 500
    file.save(f'/uploads/{file.filename}')
    return {'status': 'success'}
The Fix - Check Available Space:
from flask import Flask, request, jsonify
import os
import shutil

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024
UPLOAD_FOLDER = '/uploads'

def get_free_space(path):
    """Get free disk space in bytes"""
    stat = shutil.disk_usage(path)
    return stat.free

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']

    # Check if file will fit
    content_length = request.content_length
    free_space = get_free_space(UPLOAD_FOLDER)
    if content_length > free_space:
        return jsonify({
            'error': 'Insufficient disk space',
            'required': content_length,
            'available': free_space
        }), 507  # Insufficient Storage

    file_path = os.path.join(UPLOAD_FOLDER, file.filename)
    file.save(file_path)
    return jsonify({
        'status': 'success',
        'path': file_path
    })
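Raw byte counts in the 507 payload are hard for users to read. A small helper makes them friendlier; `human_size` is a hypothetical name, not part of Flask, but you could apply it to the 'required' and 'available' fields above:

```python
def human_size(n: float) -> str:
    """Format a byte count as a human-readable string."""
    for unit in ('B', 'KB', 'MB', 'GB'):
        if n < 1024:
            return f'{n:.1f} {unit}'
        n /= 1024
    return f'{n:.1f} TB'

print(human_size(104857600))  # 100.0 MB
print(human_size(512))        # 512.0 B
```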
Production Pattern - Monitor Disk Usage:
from flask import Flask, request, jsonify
import shutil
import logging

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 500 * 1024 * 1024
UPLOAD_FOLDER = '/uploads'
MIN_FREE_SPACE = 1 * 1024 * 1024 * 1024  # 1GB minimum

@app.before_request
def check_disk_space():
    """Check disk space before accepting uploads"""
    # Note: touching request.files here parses the request body
    if request.method == 'POST' and 'file' in request.files:
        free_space = shutil.disk_usage(UPLOAD_FOLDER).free
        if free_space < MIN_FREE_SPACE:
            logging.critical(f"Low disk space: {free_space / (1024**3):.2f}GB remaining")
            return jsonify({
                'error': 'Server storage nearly full',
                'message': 'Please try again later'
            }), 507
Still Not Working?
If you’ve checked all the common causes and uploads still fail, here are some edge cases.
Edge Case 1: File Permission Issues
Your upload folder might not be writable:
# Check permissions
ls -la /uploads
# Fix permissions
sudo chown www-data:www-data /uploads
sudo chmod 755 /uploads
Edge Case 2: SELinux Blocking Writes
On CentOS/RHEL with SELinux:
# Check SELinux status
getenforce
# Allow httpd to write to uploads directory
sudo chcon -R -t httpd_sys_rw_content_t /uploads
# Or set permanently
sudo semanage fcontext -a -t httpd_sys_rw_content_t "/uploads(/.*)?"
sudo restorecon -Rv /uploads
Edge Case 3: Cloud Provider Limits
AWS, Google Cloud, and Azure have their own limits:
AWS: API Gateway caps request payloads at 10MB, and an Application Load Balancer with Lambda targets caps them at 1MB (neither limit can be raised)
Solution: Use S3 pre-signed URLs for direct uploads:
from flask import Flask, request, jsonify
import boto3

app = Flask(__name__)
s3_client = boto3.client('s3')

@app.route('/upload-url', methods=['GET'])
def get_upload_url():
    """Generate pre-signed URL for direct S3 upload"""
    bucket_name = 'my-bucket'
    object_key = f'uploads/{request.args.get("filename")}'

    # Generate upload URL (valid for 1 hour)
    presigned_url = s3_client.generate_presigned_url(
        'put_object',
        Params={'Bucket': bucket_name, 'Key': object_key},
        ExpiresIn=3600
    )
    return jsonify({
        'upload_url': presigned_url,
        'key': object_key
    })
Client-side:
async function uploadToS3(file) {
    // Get pre-signed URL
    const response = await fetch(`/upload-url?filename=${file.name}`);
    const { upload_url, key } = await response.json();

    // Upload directly to S3
    await fetch(upload_url, {
        method: 'PUT',
        body: file
    });
    return { key };
}
Summary Checklist
When debugging file upload errors, work through this list:
✓ Flask Configuration:
- [ ] Set MAX_CONTENT_LENGTH appropriately
- [ ] Add error handler for RequestEntityTooLarge
- [ ] Use streaming for large files, not .read()
✓ Web Server (nginx/Apache):
- [ ] Set client_max_body_size (nginx) or LimitRequestBody (Apache)
- [ ] Increase proxy timeouts
- [ ] Reload config after changes
✓ Application Server (Gunicorn/uWSGI):
- [ ] Increase worker timeout (300+ seconds)
- [ ] Increase buffer sizes
- [ ] Set appropriate limit-post value
✓ Client-Side:
- [ ] Use XMLHttpRequest with progress tracking
- [ ] Set long timeout (5+ minutes)
- [ ] Show upload progress to users
✓ Infrastructure:
- [ ] Check available disk space
- [ ] Verify upload folder permissions
- [ ] Check SELinux/AppArmor policies
- [ ] Consider cloud provider limits
Prevention Best Practices
Here’s how to build robust file upload handling from the start.
1. Validate Files Before Upload
Don’t waste bandwidth on invalid files:
from flask import Flask, request, jsonify

app = Flask(__name__)
app.config['MAX_CONTENT_LENGTH'] = 100 * 1024 * 1024
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif', 'pdf', 'mp4'}

def allowed_file(filename):
    return '.' in filename and \
        filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS

@app.route('/upload', methods=['POST'])
def upload_file():
    if 'file' not in request.files:
        return jsonify({'error': 'No file provided'}), 400
    file = request.files['file']
    if file.filename == '':
        return jsonify({'error': 'Empty filename'}), 400
    if not allowed_file(file.filename):
        return jsonify({
            'error': 'Invalid file type',
            'allowed': list(ALLOWED_EXTENSIONS)
        }), 400

    # Safe to process the file
    file.save(f'/uploads/{file.filename}')
    return jsonify({'status': 'success'})
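Extension checks are easy to spoof by renaming a file. A light check of the first few bytes catches mislabeled uploads; a minimal sketch covering three common formats (extend MAGIC_BYTES for the types you accept):

```python
# First-byte signatures for a few common formats
MAGIC_BYTES = {
    b'\x89PNG\r\n\x1a\n': 'png',
    b'\xff\xd8\xff': 'jpg',
    b'%PDF-': 'pdf',
}

def detect_type(first_bytes):
    """Return the detected extension, or None if no signature matches."""
    for magic, ext in MAGIC_BYTES.items():
        if first_bytes.startswith(magic):
            return ext
    return None

print(detect_type(b'\x89PNG\r\n\x1a\n' + b'\x00' * 8))  # png
print(detect_type(b'totally not an image'))             # None
```

In a Flask route, read the first bytes with `file.stream.read(8)`, then call `file.stream.seek(0)` before saving so the full file still gets written.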
2. Use Secure Filenames
Never trust user-provided filenames:
from flask import Flask, request
from werkzeug.utils import secure_filename
import os
import uuid

app = Flask(__name__)
UPLOAD_FOLDER = '/uploads'

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']

    # Generate safe, unique filename
    ext = os.path.splitext(secure_filename(file.filename))[1]
    unique_filename = f"{uuid.uuid4()}{ext}"
    file_path = os.path.join(UPLOAD_FOLDER, unique_filename)
    file.save(file_path)
    return {
        'status': 'success',
        'filename': unique_filename
    }
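To see why secure_filename matters, feed it a hostile name:

```python
from werkzeug.utils import secure_filename

# A path traversal attempt is flattened to a harmless name
print(secure_filename('../../etc/passwd'))  # etc_passwd

# Spaces and path separators are normalized too
print(secure_filename('my upload/file.pdf'))
```

Without this, a filename like `../../etc/passwd` combined with a naive `f'/uploads/{file.filename}'` path would let an attacker write outside your upload folder.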
3. Implement Chunked Uploads for Large Files
For files over 100MB, use chunked uploads:
from flask import Flask, request, jsonify
import os

app = Flask(__name__)
UPLOAD_FOLDER = '/uploads/temp'

@app.route('/upload/chunk', methods=['POST'])
def upload_chunk():
    chunk = request.files['chunk']
    chunk_number = int(request.form['chunkNumber'])
    total_chunks = int(request.form['totalChunks'])
    file_id = request.form['fileId']

    # Save chunk
    chunk_path = os.path.join(UPLOAD_FOLDER, f"{file_id}_chunk_{chunk_number}")
    chunk.save(chunk_path)

    # If all chunks received, merge them
    # (assumes chunks arrive in order; for parallel uploads,
    # track which chunk numbers have actually arrived)
    if chunk_number == total_chunks - 1:
        final_path = os.path.join(UPLOAD_FOLDER, f"{file_id}_complete")
        with open(final_path, 'wb') as final_file:
            for i in range(total_chunks):
                chunk_path = os.path.join(UPLOAD_FOLDER, f"{file_id}_chunk_{i}")
                with open(chunk_path, 'rb') as chunk_file:
                    final_file.write(chunk_file.read())
                os.remove(chunk_path)
        return jsonify({
            'status': 'complete',
            'path': final_path
        })
    return jsonify({
        'status': 'chunk_received',
        'chunk': chunk_number
    })
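The merge step above assumes the highest-numbered chunk arrives last. If the client uploads chunks in parallel, track what has actually arrived instead; a minimal sketch (in-memory, single process, so it would need Redis or similar behind multiple workers):

```python
received = {}  # file_id -> set of received chunk numbers

def record_chunk(file_id, chunk_number, total_chunks):
    """Record a chunk's arrival; return True once every chunk is in."""
    chunks = received.setdefault(file_id, set())
    chunks.add(chunk_number)
    return len(chunks) == total_chunks

# Chunks can arrive in any order; merging is safe only when this returns True
print(record_chunk('abc123', 2, 3))  # False
print(record_chunk('abc123', 0, 3))  # False
print(record_chunk('abc123', 1, 3))  # True
```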
4. Add Upload Progress API
Let users check upload status:
from flask import Flask, request, jsonify

app = Flask(__name__)

# Store upload progress
# (an in-memory dict only works with a single process;
# use Redis or similar when running multiple workers)
upload_progress = {}

@app.route('/upload', methods=['POST'])
def upload_file():
    file = request.files['file']
    upload_id = request.form.get('upload_id')
    file_path = f'/uploads/{file.filename}'
    total_size = request.content_length
    upload_progress[upload_id] = {'uploaded': 0, 'total': total_size}

    # Stream and track progress
    # (note: the body has already been received by this point,
    # so this tracks the server-side copy to disk)
    with open(file_path, 'wb') as f:
        while True:
            chunk = file.stream.read(8192)
            if not chunk:
                break
            f.write(chunk)
            upload_progress[upload_id]['uploaded'] += len(chunk)

    upload_progress[upload_id]['status'] = 'complete'
    return jsonify({'status': 'success'})

@app.route('/upload/progress/<upload_id>')
def get_progress(upload_id):
    progress = upload_progress.get(upload_id, {})
    if not progress:
        return jsonify({'error': 'Upload not found'}), 404
    return jsonify({
        'uploaded': progress.get('uploaded', 0),
        'total': progress.get('total', 0),
        'percent': (progress.get('uploaded', 0) / progress.get('total', 1)) * 100,
        'status': progress.get('status', 'uploading')
    })
Related Posts
Want to learn more about Flask error handling? Check out:
- How to Fix Flask 400 Bad Request Error: Complete Guide
- How to Fix Flask 500 Internal Server Error: Complete Guide
- Why Does Flask Throw ‘Working Outside Request Context’?
Debugging Upload Errors? Use Debugly’s trace formatter to quickly parse and analyze Python tracebacks when Flask throws exceptions during file uploads. It’ll show you exactly where the upload pipeline failed.