File Upload Best Practices for Web Applications
File uploads are deceptively simple. On the surface, you're just accepting a file and storing it somewhere. In practice, you're dealing with security vulnerabilities, performance bottlenecks, and user experience challenges that can sink your app.
I've built file upload systems for everything from profile picture uploads to document management platforms. Along the way, I've made most of the mistakes so you don't have to. Here's what I've learned.
The Basics: Client-Side Validation
Start with client-side validation. Not because it's secure (it isn't—users can bypass it), but because it provides immediate feedback and saves bandwidth.
// React example (assumes: import { useState } from 'react')
function FileUploader() {
  const MAX_SIZE = 10 * 1024 * 1024 // 10MB
  const ALLOWED_TYPES = ['image/jpeg', 'image/png', 'image/webp', 'application/pdf']
  const [error, setError] = useState<string | null>(null)

  function validateFile(file: File): string | null {
    if (!ALLOWED_TYPES.includes(file.type)) {
      return `File type ${file.type} is not allowed`
    }
    if (file.size > MAX_SIZE) {
      return `File is too large. Maximum size is ${MAX_SIZE / 1024 / 1024}MB`
    }
    return null
  }

  function handleFileSelect(event: React.ChangeEvent<HTMLInputElement>) {
    const file = event.target.files?.[0]
    if (!file) return
    const validationError = validateFile(file)
    if (validationError) {
      setError(validationError)
      return
    }
    setError(null)
    uploadFile(file) // your upload function, defined elsewhere
  }

  return (
    <div>
      <input
        type="file"
        accept={ALLOWED_TYPES.join(',')}
        onChange={handleFileSelect}
      />
      {error && <p className="text-red-500 mt-2">{error}</p>}
    </div>
  )
}

The accept attribute provides a hint to the browser about which files to show in the picker. It's not a security measure—just UX polish.
Server-Side Validation (The Real Security)
Never trust the client. Ever. Your server must validate everything again.
// Express + multer example
import express from 'express'
import multer from 'multer'
import fileType from 'file-type' // v16 API; v17+ is ESM-only and exports fileTypeFromBuffer instead

const MAX_SIZE = 10 * 1024 * 1024 // 10MB
const ALLOWED_MIME_TYPES = ['image/jpeg', 'image/png', 'image/webp', 'application/pdf']

const app = express()

const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: MAX_SIZE },
  fileFilter: (req, file, cb) => {
    // First check: Content-Type header
    if (!ALLOWED_MIME_TYPES.includes(file.mimetype)) {
      cb(null, false) // Reject the file
      return
    }
    cb(null, true)
  },
})

app.post('/api/upload', upload.single('file'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No file provided or invalid file type' })
  }

  // Second check: Actual file content (magic bytes)
  const detected = await fileType.fromBuffer(req.file.buffer)
  if (!detected || !ALLOWED_MIME_TYPES.includes(detected.mime)) {
    return res.status(400).json({ error: 'Invalid file type' })
  }

  // File is validated, proceed with storage
  // ...
})

Here's the big takeaway: check the actual file content, not just the Content-Type header. A malicious user can upload a PHP script with a .jpg extension and image/jpeg content type. The file-type library checks the file's magic bytes—the actual binary signature that identifies the format.
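To make the magic-byte idea concrete, here's a minimal hand-rolled check. The signature table is an assumption covering only JPEG and PNG; in real code, stick with a maintained library like file-type, which knows far more formats and their edge cases.

```typescript
// Minimal magic-byte sniffing sketch: illustrative only, prefer file-type in production.
const SIGNATURES: { mime: string; bytes: number[] }[] = [
  { mime: 'image/jpeg', bytes: [0xff, 0xd8, 0xff] }, // JPEG files start with FF D8 FF
  { mime: 'image/png', bytes: [0x89, 0x50, 0x4e, 0x47] }, // PNG files start with \x89PNG
]

function detectMime(buffer: Uint8Array): string | null {
  for (const { mime, bytes } of SIGNATURES) {
    if (buffer.length >= bytes.length && bytes.every((b, i) => buffer[i] === b)) {
      return mime
    }
  }
  return null // Unknown signature: reject the upload
}
```

Notice that a PHP script renamed to .jpg fails this check, because its bytes don't start with a JPEG signature no matter what the filename or Content-Type header claims.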
Filename Sanitization
User-provided filenames are a security risk. They can contain:
- Path traversal attacks (../../../etc/passwd)
- Special characters that break filesystems
- Unicode exploits
- Executable extensions disguised as images
The safest approach? Don't use the original filename at all.
import { randomUUID } from 'crypto'

function generateSafeFilename(originalName: string, mimeType: string): string {
  const uuid = randomUUID()
  // Get extension from detected MIME type, not original filename
  const extensions: Record<string, string> = {
    'image/jpeg': '.jpg',
    'image/png': '.png',
    'image/webp': '.webp',
    'application/pdf': '.pdf',
  }
  const ext = extensions[mimeType] || ''
  return `${uuid}${ext}`
}

If you need to preserve the original filename for display purposes, store it in your database—never in the filesystem path.
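One way to wire that up is a per-upload record that pairs the generated name with the user-facing one. The field names here are illustrative, not a prescribed schema:

```typescript
// Illustrative record shape; adapt names and types to your own schema.
interface UploadRecord {
  id: string           // UUID, reused as the stored filename
  storedName: string   // safe generated name on disk or in the bucket
  originalName: string // shown in the UI, never used in paths
  mimeType: string     // the detected type, not the client-supplied one
  sizeBytes: number
  ownerId: string      // lets you enforce per-user access control later
}

function buildUploadRecord(
  uuid: string,
  ext: string,
  originalName: string,
  mimeType: string,
  sizeBytes: number,
  ownerId: string,
): UploadRecord {
  return { id: uuid, storedName: `${uuid}${ext}`, originalName, mimeType, sizeBytes, ownerId }
}
```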
Upload Progress and UX
Large file uploads need progress indicators. Without them, users don't know if the upload is working or stuck.
// assumes: import { useRef, useState } from 'react'
function useUploadWithProgress() {
  const [progress, setProgress] = useState(0)
  const [uploading, setUploading] = useState(false)
  const abortController = useRef<AbortController | null>(null)

  function cancel() {
    abortController.current?.abort()
  }

  async function upload(file: File): Promise<string> {
    setUploading(true)
    setProgress(0)
    abortController.current = new AbortController()

    return new Promise((resolve, reject) => {
      const xhr = new XMLHttpRequest()

      // Wire the AbortController to the XHR so cancel() actually stops the request
      abortController.current?.signal.addEventListener('abort', () => xhr.abort())

      xhr.upload.addEventListener('progress', (event) => {
        if (event.lengthComputable) {
          const percent = Math.round((event.loaded / event.total) * 100)
          setProgress(percent)
        }
      })

      xhr.addEventListener('load', () => {
        setUploading(false)
        if (xhr.status >= 200 && xhr.status < 300) {
          const { url } = JSON.parse(xhr.responseText)
          resolve(url)
        } else {
          reject(new Error('Upload failed'))
        }
      })

      xhr.addEventListener('error', () => {
        setUploading(false)
        reject(new Error('Upload failed'))
      })

      xhr.addEventListener('abort', () => {
        setUploading(false)
        reject(new Error('Upload cancelled'))
      })

      const formData = new FormData()
      formData.append('file', file)
      xhr.open('POST', '/api/upload')
      xhr.send(formData)
    })
  }

  return { upload, progress, uploading, cancel }
}

For the UI, even a simple progress bar makes a huge difference:
function UploadProgress({ progress }: { progress: number }) {
  return (
    <div className="w-full bg-gray-200 rounded-full h-2">
      <div
        className="bg-blue-600 h-2 rounded-full transition-all duration-150"
        style={{ width: `${progress}%` }}
      />
    </div>
  )
}

Direct-to-CDN Uploads
For production applications, routing uploads through your own server has real costs: the server becomes a bandwidth bottleneck, and you pay for transfer twice (user → server → CDN).
Many CDN and storage providers offer presigned URLs or direct upload endpoints that let clients upload directly. Check your provider's documentation for this feature—it can significantly reduce server load and improve upload speeds for large files.
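To illustrate the mechanism, here's a provider-agnostic sketch of how a server might mint a short-lived signed upload URL. The URL shape, parameter names, and HMAC scheme are all assumptions for illustration; real providers each have their own signing protocol, so use their SDKs rather than rolling your own.

```typescript
import { createHmac } from 'crypto'

// Sketch: sign the object key, content type, and expiry with a shared secret.
// The storage service recomputes the same HMAC on upload and rejects mismatches.
function presignUpload(opts: {
  key: string
  contentType: string
  expiresInSeconds: number
  secret: string
  baseUrl: string
  nowMs?: number // injectable clock, handy for testing
}): string {
  const expires = Math.floor((opts.nowMs ?? Date.now()) / 1000) + opts.expiresInSeconds
  const payload = `${opts.key}\n${opts.contentType}\n${expires}`
  const signature = createHmac('sha256', opts.secret).update(payload).digest('hex')
  const params = new URLSearchParams({
    contentType: opts.contentType,
    expires: String(expires),
    signature,
  })
  return `${opts.baseUrl}/${opts.key}?${params.toString()}`
}
```

The client then PUTs the file straight to that URL, so the bytes never touch your application server.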
Handling Large Files: Chunked Uploads
For files over 100MB, even presigned URLs can be unreliable. Network hiccups, browser timeouts, and memory constraints all become issues.
Chunked uploads solve this by splitting files into smaller pieces:
const CHUNK_SIZE = 5 * 1024 * 1024 // 5MB chunks

async function uploadChunked(file: File) {
  const chunks = Math.ceil(file.size / CHUNK_SIZE)

  // These functions depend on your CDN/storage provider's API
  // (e.g., AWS S3, easyCDN multipart upload endpoints)
  const uploadId = await initiateMultipartUpload(file.name, file.type)
  const parts: { partNumber: number; etag: string }[] = []

  for (let i = 0; i < chunks; i++) {
    const start = i * CHUNK_SIZE
    const end = Math.min(start + CHUNK_SIZE, file.size)
    const chunk = file.slice(start, end)

    const { etag } = await uploadPart(uploadId, i + 1, chunk)
    parts.push({ partNumber: i + 1, etag })

    // Update progress
    const progress = Math.round(((i + 1) / chunks) * 100)
    console.log(`Upload progress: ${progress}%`)
  }

  // Complete the upload
  const url = await completeMultipartUpload(uploadId, parts)
  return url
}

The advantage? If a chunk fails, you only retry that chunk—not the entire file. Users with flaky connections can still upload large files reliably.
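A small generic retry helper makes that concrete. The attempt count and backoff delays below are illustrative defaults, not tuned values:

```typescript
// Retry an async operation with exponential backoff.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      if (attempt < attempts - 1) {
        // Exponential backoff: 500ms, 1s, 2s, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt))
      }
    }
  }
  throw lastError
}

// Inside the chunk loop, wrap just the part upload:
// const { etag } = await withRetry(() => uploadPart(uploadId, i + 1, chunk))
```

Because each chunk is retried independently, one dropped packet costs you 5MB of re-upload instead of the whole file.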
Image-Specific Handling
Images deserve special treatment. Users often upload photos directly from their phones—10MB JPEGs that could be 200KB with proper compression.
Process images server-side before storage:
import sharp from 'sharp'

async function processImage(buffer: Buffer, options: {
  maxWidth?: number
  maxHeight?: number
  quality?: number
  format?: 'jpeg' | 'webp' | 'png'
}) {
  const { maxWidth = 1920, maxHeight = 1080, quality = 80, format = 'webp' } = options

  let pipeline = sharp(buffer)
    .rotate() // Auto-rotate based on EXIF; must come before resize so orientation is applied first
    .resize(maxWidth, maxHeight, {
      fit: 'inside',
      withoutEnlargement: true,
    })

  if (format === 'webp') {
    pipeline = pipeline.webp({ quality })
  } else if (format === 'jpeg') {
    pipeline = pipeline.jpeg({ quality, mozjpeg: true })
  } else {
    pipeline = pipeline.png({ quality })
  }

  return pipeline.toBuffer()
}

This typically reduces file sizes by 70-90% while maintaining visual quality. Your users get faster uploads, and you pay less for storage and bandwidth.
Security Checklist
We've covered most of these in detail above, but here's a quick reference to ensure you don't miss anything:
- Server-side validation - Never trust client data
- File type verification - Check magic bytes, not just extensions
- Size limits - Both per-file and total upload limits
- Filename sanitization - Generate safe names, store originals separately
- Rate limiting - Prevent abuse (consider Express middleware like express-rate-limit)
- Virus scanning - For applications handling documents (consider integrating ClamAV or a third-party API)
- Access control - Ensure users can only access their own files
- HTTPS only - Never accept uploads over HTTP
- Separate domain - Serve uploads from a different domain to prevent XSS
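As an example of the rate-limiting item, here's a minimal fixed-window limiter. In production you'd reach for a maintained middleware such as express-rate-limit, backed by a shared store if you run multiple instances; this sketch just shows the core bookkeeping:

```typescript
// Fixed-window rate limiter sketch: at most `limit` uploads per key per window.
// The injectable `now` parameter makes it deterministic to test.
function createRateLimiter(limit: number, windowMs: number) {
  const hits = new Map<string, { count: number; windowStart: number }>()
  return function allow(key: string, now = Date.now()): boolean {
    const entry = hits.get(key)
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now }) // start a new window for this key
      return true
    }
    entry.count++
    return entry.count <= limit
  }
}
```

Keyed by IP or user ID, this slots in before the upload handler; anything over the limit gets a 429.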
Wrapping Up
File uploads touch every part of your stack—frontend UX, backend security, infrastructure costs, and user privacy. Getting it right requires attention to detail at every layer.
To recap: secure file uploads boil down to server-side validation, safe filename handling, and leveraging tools like CDNs for scalability. Nail these, and you'll save yourself a lot of headaches.
If you're looking for a way to simplify file uploads, a CDN like easyCDN can save you time with built-in features for secure storage and delivery. We've baked these best practices into our SDKs, so you get secure, fast uploads without the boilerplate. Check it out if you're curious—the free tier handles most side projects.
Got questions about file uploads? Hit me up on Twitter—I love nerding out about this stuff.
