How to Serve User-Generated Content Securely

Lukas Gisder-Dubé
7 min read
cdn, security, user-uploads, file-validation, best-practices


User uploads are a minefield. Last year, I audited a community app where users could upload profile pictures. Sounds harmless, right? Except someone had uploaded a PHP file with a .jpg extension that executed when accessed directly. The server was compromised within hours of discovery.

Handling user-generated content safely isn't optional—it's survival. Let me show you the patterns that keep your app secure while still letting users upload freely.

The Threat Model

When you accept user uploads, you're accepting risk. Here's what can go wrong:

Malicious files. Executables disguised as images, scripts that run on access, files that exploit parser vulnerabilities.

XSS through filenames. A filename like <script>alert('xss')</script>.jpg can execute if you display it unsanitized. XSS (cross-site scripting) lets attackers inject malicious scripts into your pages.

Storage exhaustion. One user uploading thousands of large files can fill your storage and run your bill through the roof.

Hotlinking abuse. Your CDN bill explodes because someone embedded your user content on a high-traffic site.

Privacy leaks. EXIF data in photos reveals GPS coordinates, device info, and timestamps.

Rule 1: Never Trust the File Extension

File extensions mean nothing. A file named cute-cat.jpg can contain executable code. Always verify content by reading the actual bytes—specifically, the "magic bytes" (unique binary signatures that identify file formats).

ts
import { fileTypeFromBuffer } from 'file-type'

async function validateFileType(buffer: Buffer): Promise<string | null> {
  const detected = await fileTypeFromBuffer(buffer)

  if (!detected) return null

  const allowedTypes = [
    'image/jpeg',
    'image/png',
    'image/webp',
    'image/gif',
    'application/pdf',
  ]

  return allowedTypes.includes(detected.mime) ? detected.mime : null
}

// Usage — readFile comes from 'fs/promises'; uploadedFile is the temp file path
const buffer = await readFile(uploadedFile)
const mimeType = await validateFileType(buffer)

if (!mimeType) {
  throw new Error('Invalid or disallowed file type')
}

The file-type library reads those magic bytes for you, across dozens of formats. One caveat: polyglot files can carry a valid image signature and a script payload at the same time, so treat this check as one layer of defense, not the whole thing. The later rules (safe filenames, separate domain, nosniff) close the rest of the gap.
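To make the idea concrete, here's what a hand-rolled check for the two most common signatures looks like. This is illustration only — the library covers far more formats, including ones whose signatures sit at an offset:

```typescript
// Minimal magic-byte sniffing for JPEG and PNG (illustration only;
// use file-type in production).
function sniffImageType(buffer: Buffer): string | null {
  // JPEG files start with FF D8 FF
  if (buffer.length >= 3 && buffer[0] === 0xff && buffer[1] === 0xd8 && buffer[2] === 0xff) {
    return 'image/jpeg'
  }
  // PNG files start with the fixed 8-byte signature 89 50 4E 47 0D 0A 1A 0A
  const pngSig = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a])
  if (buffer.length >= 8 && buffer.subarray(0, 8).equals(pngSig)) {
    return 'image/png'
  }
  return null
}
```

A PHP payload renamed to .jpg fails this check immediately, because its first bytes are `<?php`, not `FF D8 FF`.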

Rule 2: Generate Safe Filenames

Never use user-provided filenames for storage. They can contain:

  • Path traversal attacks (../../../etc/passwd—attempts to access files outside your upload directory)
  • Special characters that break filesystems
  • Unicode exploits
  • XSS payloads

Generate a random filename server-side:

ts
import { randomUUID } from 'crypto'
import mime from 'mime'

function generateSafeFilename(detectedMimeType: string): string {
  const uuid = randomUUID()
  const extension = mime.getExtension(detectedMimeType) || 'bin'
  return `${uuid}.${extension}`
}

// Result: "a1b2c3d4-e5f6-7890-abcd-ef1234567890.jpg"

Store the original filename in your database if you need to display it, but always sanitize before rendering:

ts
function sanitizeDisplayName(filename: string): string {
  return filename
    .replace(/[<>:"\/\\|?*]/g, '') // Remove dangerous chars
    .slice(0, 255) // Limit length
}
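Note that sanitizeDisplayName is a filesystem-oriented cleanup, not an HTML encoder. When the name goes into a page, escape it as well — most view layers (React, EJS, and friends) do this automatically, so the sketch below is only for raw string templating:

```typescript
// Escape the five HTML-significant characters before interpolating
// a filename into markup. Only needed when building HTML by hand;
// templating frameworks typically escape for you.
function escapeHtml(text: string): string {
  const map: Record<string, string> = {
    '&': '&amp;',
    '<': '&lt;',
    '>': '&gt;',
    '"': '&quot;',
    "'": '&#39;',
  }
  return text.replace(/[&<>"']/g, (ch) => map[ch])
}
```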

Rule 3: Strip Metadata

Photos often contain EXIF metadata with sensitive information:

  • GPS coordinates (where the photo was taken)
  • Device model and serial number
  • Date and time
  • Sometimes even the photographer's name

Strip this data before storing:

ts
import sharp from 'sharp'

async function stripMetadata(buffer: Buffer): Promise<Buffer> {
  return sharp(buffer)
    .rotate() // Preserve orientation
    .toBuffer()
}

Sharp strips EXIF data by default when it re-encodes; it's only preserved if you explicitly call .withMetadata(). The .rotate() call applies the EXIF orientation before the tag is dropped, so images still display right side up. This protects your users' privacy.
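If you want a quick regression test that stripping actually worked, you can scan the re-encoded JPEG for an APP1 "Exif" segment by hand. A rough sketch — a real EXIF parser walks the segment structure properly:

```typescript
// Rough check for tests: does this JPEG buffer still contain an
// APP1 segment tagged "Exif"? After sharp re-encodes without
// .withMetadata(), this should return false.
function hasExif(jpeg: Buffer): boolean {
  for (let i = 2; i + 8 <= jpeg.length; i++) {
    // APP1 marker (FF E1), 2-byte length, then the "Exif" tag
    if (
      jpeg[i] === 0xff &&
      jpeg[i + 1] === 0xe1 &&
      jpeg.subarray(i + 4, i + 8).toString('ascii') === 'Exif'
    ) {
      return true
    }
  }
  return false
}
```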

Rule 4: Serve from a Separate Domain

Never serve user content from your main domain. If someone manages to upload something malicious, it shouldn't have access to your session cookies.

text
Your app: app.example.com
User content: cdn.example.com (separate origin)

A subdomain gives you a separate origin, which is what matters for script isolation. Note, though, that cookies set with Domain=.example.com still flow to cdn.example.com; for full cookie isolation, use a separate registrable domain (the same reason GitHub serves user content from githubusercontent.com). With a CDN like easyCDN, this separation is built-in: your content is automatically served from a distinct domain, reducing the risk of cookie theft or CSRF attacks.

This prevents:

  • Cookie theft (cookies are domain-scoped)
  • CSRF attacks using uploaded files
  • Same-origin policy bypasses

Rule 5: Set Proper Content-Type Headers

When serving user content, set headers that prevent browser interpretation:

http
Content-Type: image/jpeg
Content-Disposition: inline
X-Content-Type-Options: nosniff

The nosniff header tells browsers to trust the Content-Type and not try to guess. Without it, some browsers might interpret a file as HTML and execute embedded scripts.
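The rule is small enough to centralize in one helper. A sketch, where `mime` is the type you detected and stored at upload time:

```typescript
// Headers to attach whenever a user-uploaded file is served.
// `mime` must be the type detected from magic bytes at upload,
// never a value echoed back from the client.
function userContentHeaders(mime: string): Record<string, string> {
  return {
    'Content-Type': mime,
    'Content-Disposition': 'inline',
    // Forbid MIME sniffing so a crafted "image" is never run as HTML
    'X-Content-Type-Options': 'nosniff',
  }
}
```

In an Express handler you'd apply it with `Object.entries(userContentHeaders(mime)).forEach(([k, v]) => res.setHeader(k, v))`.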

Complete Upload Flow

Here's a secure upload implementation that combines all these principles. These examples use Node.js with TypeScript—install the dependencies with npm install @easycdn/server file-type sharp express multer:

ts
import { createClient } from '@easycdn/server'
import { fileTypeFromBuffer } from 'file-type'
import sharp from 'sharp'
import { randomUUID } from 'crypto'

const cdn = createClient({
  secretKey: process.env.EASYCDN_SECRET_KEY!,
})

const ALLOWED_TYPES = ['image/jpeg', 'image/png', 'image/webp', 'image/gif']
const MAX_SIZE = 10 * 1024 * 1024 // 10MB

interface UploadResult {
  url: string
  originalName: string
}

async function secureUpload(
  buffer: Buffer,
  originalFilename: string,
  userId: string
): Promise<UploadResult> {
  // 1. Check size
  if (buffer.length > MAX_SIZE) {
    throw new Error('File too large')
  }

  // 2. Validate file type by content
  const detected = await fileTypeFromBuffer(buffer)
  if (!detected || !ALLOWED_TYPES.includes(detected.mime)) {
    throw new Error('Invalid file type')
  }

  // 3. Process image (strips metadata, normalizes format)
  const processedBuffer = await sharp(buffer)
    .rotate() // Auto-orient based on EXIF
    .resize(2000, 2000, { fit: 'inside', withoutEnlargement: true })
    .webp({ quality: 85 })
    .toBuffer()

  // 4. Generate safe filename
  const safeFilename = `${userId}/${randomUUID()}.webp`

  // 5. Upload to CDN
  const result = await cdn.upload(processedBuffer, {
    fileName: safeFilename,
  })

  return {
    url: result.asset.url,
    originalName: originalFilename.slice(0, 255),
  }
}

Express API Route Example

Here's how to use this in an Express API:

ts
import express from 'express'
import multer from 'multer'

const app = express()
const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 10 * 1024 * 1024 },
})

app.post('/api/upload', upload.single('file'), async (req, res) => {
  if (!req.file) {
    return res.status(400).json({ error: 'No file provided' })
  }

  // Get user ID from session/auth (assumes your auth middleware
  // augments Express's Request type with a `user` property)
  const userId = req.user?.id
  if (!userId) {
    return res.status(401).json({ error: 'Unauthorized' })
  }

  try {
    const result = await secureUpload(
      req.file.buffer,
      req.file.originalname,
      userId
    )

    res.json(result)
  } catch (error) {
    if (error instanceof Error) {
      return res.status(400).json({ error: error.message })
    }
    res.status(500).json({ error: 'Upload failed' })
  }
})

Rate Limiting and Quotas

Protect against abuse with upload limits:

ts
import rateLimit from 'express-rate-limit'

const uploadLimiter = rateLimit({
  windowMs: 60 * 1000, // 1 minute
  max: 10, // 10 uploads per minute
  message: { error: 'Too many uploads, try again later' },
})

app.post('/api/upload', uploadLimiter, upload.single('file'), ...)

Also track per-user storage:

ts
// `db` is your data layer — getUserStorageUsage sums the sizes of a user's stored files
async function checkUserQuota(userId: string, newFileSize: number): Promise<boolean> {
  const currentUsage = await db.getUserStorageUsage(userId)
  const quota = 100 * 1024 * 1024 // 100MB per user

  return currentUsage + newFileSize <= quota
}
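A minimal sketch of wiring that check into the upload path, with the database query injected so the guard is easy to test (`getUsage` is a stand-in for db.getUserStorageUsage):

```typescript
const USER_QUOTA = 100 * 1024 * 1024 // 100MB per user

// Runs before secureUpload(): throws if the new file would push
// the user past their quota, so no bytes ever reach the CDN.
// `getUsage` stands in for your database query.
async function enforceQuota(
  getUsage: (userId: string) => Promise<number>,
  userId: string,
  newFileSize: number
): Promise<void> {
  const used = await getUsage(userId)
  if (used + newFileSize > USER_QUOTA) {
    throw new Error('Storage quota exceeded')
  }
}
```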

Content Moderation

For public-facing content, consider automated moderation:

  • Image scanning APIs can detect inappropriate content
  • Hash matching identifies known problematic images
  • Manual review queues for flagged content

This is outside CDN functionality but important for community apps.
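Of the three, hash matching is the simplest to sketch: compare a digest of each upload against a blocklist of known-bad hashes. Production systems typically use perceptual hashes rather than cryptographic ones, since a cryptographic digest changes if a single pixel does; the set below is a hypothetical stand-in for an industry-shared list:

```typescript
import { createHash } from 'crypto'

// Known-bad SHA-256 digests. In practice this would be populated
// from a hash-sharing feed, not hardcoded.
const blockedHashes = new Set<string>()

// Returns true if the uploaded buffer exactly matches a known-bad file.
function isBlocked(buffer: Buffer): boolean {
  const digest = createHash('sha256').update(buffer).digest('hex')
  return blockedHashes.has(digest)
}
```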

Security Checklist

Before launching user uploads, verify:

  • File type validated by content (magic bytes), not extension
  • Filenames generated server-side
  • EXIF metadata stripped from images
  • Content served from separate domain
  • Proper Content-Type headers set
  • Rate limiting in place
  • Per-user storage quotas implemented
  • Upload size limits enforced

Wrapping Up

User-generated content is tricky because you're accepting untrusted input and serving it back to users. One mistake can compromise your entire platform.

The good news? Most risks are preventable with proper validation and separation. Validate content type by bytes, generate safe filenames, strip metadata, serve from a CDN domain, and set proper headers. These steps aren't complicated—they just need to be consistent.

A CDN helps by providing that domain separation automatically and handling the infrastructure concerns while you focus on your application logic.

Looking for a simpler way to handle user uploads? easyCDN provides secure storage with automatic domain separation—your user content is served from CDN edge locations, completely separate from your application. The free tier is enough to validate your upload flow before scaling up.
