Client-Side Image Compression with browser-image-compression

A hands-on tutorial for browser-image-compression 2.0.2 covering file input handling, maxSizeMB targeting, Web Worker offloading, WebP output, progress tracking, EXIF orientation, and initial quality control for upload-heavy web apps.

Sources & References

Tested with browser-image-compression v2.0.2

Introduction

browser-image-compression is a zero-dependency-ish JavaScript library that shrinks JPEG, PNG, WebP, and BMP files directly inside the user's browser before they ever leave the device. It runs on the Canvas API and offloads the heavy lifting to a Web Worker, so compressing a 10 MB phone photo down to 700 KB takes roughly a second on a mid-range laptop without blocking the main thread. The only runtime dependency is uzip, a small deflate implementation used when encoding PNG output.

Client-side compression matters for three reasons. First, bandwidth: uploading a 5 MB image over a 4G connection costs the user time and data, while a pre-compressed 500 KB equivalent feels instant. Second, server costs: S3 egress, CDN storage, and database rows shrink when images arrive pre-optimized. Third, user experience: progressive web apps and mobile-first sites can accept images that would otherwise time out a multipart form, and the browser UI can show accurate progress during compression rather than during a slow upload.

Common use cases include profile photo uploads, multi-image product listings on e-commerce forms, PWA offline queues that batch uploads when connectivity returns, chat apps that need to display a preview before the server responds, and form wizards that must keep payload sizes within serverless function limits (typically 4-6 MB). This tutorial targets browser-image-compression 2.0.2 and assumes a modern evergreen browser.

Installation and Setup

The library ships on npm with TypeScript definitions included and on jsDelivr for no-build environments. Pick whichever matches the project.

npm install browser-image-compression
# or
yarn add browser-image-compression
# or
pnpm add browser-image-compression

For a plain HTML page without a bundler, use the CDN build. The UMD bundle attaches imageCompression to the global scope.

<script src="https://cdn.jsdelivr.net/npm/browser-image-compression@2.0.2/dist/browser-image-compression.js"></script>
<script>
  // window.imageCompression is now available
</script>

ES module consumers import the default export. TypeScript projects pick up types automatically from dist/browser-image-compression.d.ts.

import imageCompression from 'browser-image-compression';
import type { Options } from 'browser-image-compression';

const options: Options = {
  maxSizeMB: 1,
  maxWidthOrHeight: 1920,
  useWebWorker: true
};

No build configuration is required for webpack, Vite, Rollup, or Next.js. When a strict Content Security Policy is in place, allow blob: in worker-src (and in script-src, which older browsers fall back to) so the Web Worker can bootstrap from a Blob URL.
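For example, a policy along these lines keeps both the Blob-URL worker and the CDN script working. This is an illustrative header, not a drop-in value; tighten the origins to match the deployment:

```http
Content-Security-Policy: script-src 'self' https://cdn.jsdelivr.net blob:; worker-src 'self' blob:
```

When the library is self-hosted (see the libURL option later), the jsDelivr origin can be dropped entirely.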

Core Features

1. Basic Compression with File Input

The minimal usage wires an <input type="file"> to the compressor and logs the before/after sizes. The function returns a File object that can be uploaded via FormData or previewed through URL.createObjectURL().

import imageCompression from 'browser-image-compression';

const input = document.querySelector('input[type=file]');

input.addEventListener('change', async (event) => {
  const file = event.target.files[0];
  if (!file) return;

  console.log(`original: ${(file.size / 1024 / 1024).toFixed(2)} MB`);

  const options = {
    maxSizeMB: 1,
    maxWidthOrHeight: 1920,
    useWebWorker: true
  };

  try {
    const compressed = await imageCompression(file, options);
    console.log(`compressed: ${(compressed.size / 1024 / 1024).toFixed(2)} MB`);

    const preview = document.querySelector('#preview');
    preview.src = URL.createObjectURL(compressed);

    const form = new FormData();
    form.append('image', compressed, compressed.name);
    await fetch('/api/upload', { method: 'POST', body: form });
  } catch (error) {
    console.error('compression failed', error);
  }
});

The returned File preserves the original filename and lastModified timestamp, so downstream code that inspects file.name continues to work. Revoke object URLs with URL.revokeObjectURL() once the preview is no longer needed to free memory.

2. maxSizeMB and maxWidthOrHeight

The two most important options are maxSizeMB and maxWidthOrHeight. The former tells the compressor to iterate quality until the output fits under a target size. The latter scales down the longest edge while preserving aspect ratio. Combine both for a predictable upper bound on bytes and pixels.

const thumbnail = await imageCompression(file, {
  maxSizeMB: 0.2,          // target 200 KB
  maxWidthOrHeight: 640,   // cap longest edge at 640px
  useWebWorker: true,
  maxIteration: 10         // default, cap on re-encode attempts
});

const hero = await imageCompression(file, {
  maxSizeMB: 1.5,          // target 1.5 MB
  maxWidthOrHeight: 2560,  // cap at 2560px for retina displays
  useWebWorker: true
});

Internally the library runs an iterative search, re-encoding the canvas at progressively lower quality (and, when needed, smaller dimensions) on each pass until the output is smaller than maxSizeMB or maxIteration is exhausted. Setting maxSizeMB without maxWidthOrHeight still works but may produce soft, low-quality JPEGs when the source is huge (think 12-megapixel phone photos forced under 200 KB).
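As a rough illustration of that loop, here is a toy sketch. It is not the library's actual code; findQuality, the 0.85 back-off ratio, and the linear size model are all invented for the example:

```javascript
// Toy sketch of a size-targeting loop: back off quality step by step until
// the estimated output fits under the byte budget or the iteration cap hits.
// NOT the library's real implementation -- names and ratios are invented.
function findQuality(encodeSize, maxBytes, maxIteration = 10) {
  let quality = 1;
  for (let i = 0; i < maxIteration; i++) {
    if (encodeSize(quality) <= maxBytes) return quality;
    quality *= 0.85; // reduce and re-encode
  }
  return quality; // best effort after maxIteration attempts
}

// Pretend output size scales linearly with quality: 5 MB at quality 1.
const estimate = (q) => Math.round(q * 5_000_000);
const q = findQuality(estimate, 1_000_000); // target 1 MB
console.log(q.toFixed(2)); // quality landed near 0.2
```

The real library re-encodes an actual canvas on every pass, which is why maxIteration exists: each attempt costs a full encode, so capping attempts bounds worst-case CPU time.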

3. Web Worker Usage

Setting useWebWorker: true moves canvas operations off the main thread, which keeps the UI responsive during compression. The worker is spawned from a Blob URL, so it inherits no application JavaScript. When the main thread is idle-sensitive (drag-and-drop uploaders, canvas editors), always enable the worker.

const options = {
  maxSizeMB: 1,
  maxWidthOrHeight: 1920,
  useWebWorker: true,
  libURL: 'https://cdn.jsdelivr.net/npm/browser-image-compression@2.0.2/dist/browser-image-compression.js'
};

const compressed = await imageCompression(file, options);

The libURL option tells the worker where to fetch its own copy of the library. The default points at jsDelivr, which is fine for public apps. Self-host the script and set libURL to a same-origin path when the CSP forbids third-party origins, or when offline PWAs must keep working without a network.

When useWebWorker is false or unsupported, the library falls back to main-thread compression. Browsers without OffscreenCanvas (older Safari) also fall back silently. Feature-detect in advance if UI feedback differs by mode.
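A simple capability check before building the options object lets the UI react appropriately (a sketch; the exact fallback behavior is up to the app):

```javascript
// If Worker or OffscreenCanvas is missing, compression will run on the main
// thread; surface that so the UI can show a blocking spinner instead.
const workerCapable =
  typeof Worker !== 'undefined' && typeof OffscreenCanvas !== 'undefined';

const options = {
  maxSizeMB: 1,
  maxWidthOrHeight: 1920,
  useWebWorker: workerCapable
};

console.log(typeof workerCapable); // 'boolean'
```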

4. Custom fileType (WebP Output)

By default the compressor preserves the input MIME type. Override with fileType to convert JPEG or PNG input into WebP, which typically cuts file size 25-35 percent at equivalent perceived quality. The receiving server should set Content-Type: image/webp accordingly.

const webpFile = await imageCompression(file, {
  maxSizeMB: 0.8,
  maxWidthOrHeight: 1600,
  useWebWorker: true,
  fileType: 'image/webp',
  initialQuality: 0.8
});

console.log(webpFile.type);  // 'image/webp'
console.log(webpFile.name);  // original name, extension unchanged

// Replace extension manually if needed
const renamed = new File(
  [webpFile],
  webpFile.name.replace(/\.(jpe?g|png)$/i, '.webp'),
  { type: 'image/webp' }
);

WebP encoding relies on the browser's native canvas support; on browsers whose canvas cannot encode WebP (older Safari), the HTML spec makes toBlob fall back to PNG encoding, so verify the output type when WebP matters. The file extension on the returned File is not rewritten, so rename manually before uploading if the backend routes by extension rather than MIME type. Supported fileType values are image/jpeg, image/png, and image/webp.

5. Progress Callback

Pass onProgress to receive a 0-100 number as compression advances. The callback fires from the Web Worker thread but is proxied back to the main thread, so DOM updates are safe inside it.

const progressBar = document.querySelector('#progress');
const progressLabel = document.querySelector('#progress-label');

const compressed = await imageCompression(file, {
  maxSizeMB: 1,
  maxWidthOrHeight: 1920,
  useWebWorker: true,
  onProgress: (percent) => {
    progressBar.value = percent;
    progressLabel.textContent = `${percent}%`;
  }
});

progressLabel.textContent = 'done';

Progress values are not strictly linear. They track iteration count rather than wall-clock time, so the bar may jump from 40 to 80 when the search converges quickly. Pair the callback with a spinner for reassuring UX during the final encode step.
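One way to absorb the jumpiness in the UI is to map the raw number to a coarser label. progressLabelText here is our own helper, not part of the library:

```javascript
// Map the raw 0-100 progress number to a friendlier label: show percent
// while iterating, a holding message near the end, and 'done' at 100.
function progressLabelText(percent) {
  if (percent >= 100) return 'done';
  if (percent >= 95) return 'finalizing…';
  return `${Math.round(percent)}%`;
}

console.log(progressLabelText(42));  // '42%'
console.log(progressLabelText(97));  // 'finalizing…'
console.log(progressLabelText(100)); // 'done'
```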

6. EXIF Orientation Handling

Phone cameras record landscape pixels with an EXIF orientation flag instead of rotating the bitmap. Browsers vary in whether they honor that flag on canvas draws, so the library auto-detects orientation and rotates the canvas to match. Override with exifOrientation when a specific transform is required, or disable rotation by passing 1 (no change).

import imageCompression from 'browser-image-compression';

// getExifOrientation is exposed as a static method on the default export
const orientation = await imageCompression.getExifOrientation(file);
console.log('exif orientation:', orientation);

const compressed = await imageCompression(file, {
  maxSizeMB: 1,
  maxWidthOrHeight: 1920,
  useWebWorker: true,
  exifOrientation: orientation,  // 1-8, per EXIF spec
  preserveExif: false            // default strips EXIF from output
});

EXIF orientation values run 1 through 8. Value 1 is no rotation, 3 is 180 degrees, 6 is 90 clockwise, 8 is 90 counter-clockwise. When preserveExif: true, the library copies the original EXIF block into the compressed JPEG (not WebP) so downstream tools that read camera make, model, or GPS tags still find them.
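For reference, the full mapping can be written out as a small lookup, expressing each value as a clockwise rotation followed by an optional horizontal flip. orientationToTransform is our own illustrative helper, not a library API:

```javascript
// EXIF orientation (1-8) as a clockwise rotation plus an optional horizontal
// flip, applied in that order to display the pixels upright.
function orientationToTransform(orientation) {
  switch (orientation) {
    case 2: return { rotate: 0,   flipX: true  }; // mirror
    case 3: return { rotate: 180, flipX: false };
    case 4: return { rotate: 180, flipX: true  }; // vertical mirror
    case 5: return { rotate: 90,  flipX: true  }; // transpose
    case 6: return { rotate: 90,  flipX: false };
    case 7: return { rotate: 270, flipX: true  }; // transverse
    case 8: return { rotate: 270, flipX: false };
    default: return { rotate: 0,  flipX: false }; // 1 or unknown
  }
}

console.log(orientationToTransform(6)); // { rotate: 90, flipX: false }
```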

7. Initial Quality Control

The initialQuality option seeds the quality search at a specific value (0-1 range) instead of starting at 1. When target sizes are aggressive, starting lower saves CPU cycles because fewer iterations are needed. Use alwaysKeepResolution: true to forbid dimensional downscaling entirely, which is useful for images that must match exact pixel specs (badges, avatars, fixed-grid galleries).

// Aggressive: seed low, cap iterations tight
const thumb = await imageCompression(file, {
  maxSizeMB: 0.1,
  maxWidthOrHeight: 400,
  initialQuality: 0.6,
  maxIteration: 6,
  useWebWorker: true
});

// Keep 800x800 avatar dimensions, only reduce quality
const avatar = await imageCompression(file, {
  maxSizeMB: 0.3,
  alwaysKeepResolution: true,
  initialQuality: 0.85,
  useWebWorker: true
});

When alwaysKeepResolution is on, the library cannot meet maxSizeMB by resizing, so very large source images may exceed the target even at minimum quality. Combine with a pre-resize step or raise maxSizeMB to be safe.

Common Pitfalls

Web Worker fallback on older browsers: Safari versions before 16.4 lack OffscreenCanvas, so the library runs on the main thread even with useWebWorker: true. Do not rely on the worker to keep the UI smooth for very large files (20 MB+) without also chunking or showing a loading state.

Memory pressure on huge images: Compressing an 8000x6000 pixel photo allocates roughly 192 MB for the canvas buffer alone (width * height * 4 bytes RGBA). Mobile Safari may kill the tab. Use maxWidthOrHeight to cap the canvas, or sanity-check file.size before invoking the compressor and reject inputs above a threshold.

iOS Safari canvas size limit: iOS Safari caps canvas area at 16,777,216 total pixels (4096 x 4096), with lower limits on older low-memory devices. Images that exceed the cap come back blank. The library downscales automatically to stay under the limit, but always test on real iOS hardware with full-resolution camera uploads.
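Both limits from the two pitfalls above can be checked before invoking the compressor. These guard helpers are hypothetical, and the 256 MiB memory budget is an assumed number to tune per target device:

```javascript
// Estimate the RGBA canvas buffer (width * height * 4 bytes) and check the
// commonly cited iOS Safari cap of 16,777,216 total canvas pixels.
const canvasBytes = (width, height) => width * height * 4;
const IOS_MAX_PIXELS = 16_777_216; // 4096 * 4096

const fitsMemoryBudget = (w, h, budgetBytes = 256 * 1024 * 1024) =>
  canvasBytes(w, h) <= budgetBytes;
const fitsIOSCanvas = (w, h) => w * h <= IOS_MAX_PIXELS;

console.log(canvasBytes(8000, 6000));   // 192000000 bytes, ~192 MB
console.log(fitsIOSCanvas(8000, 6000)); // false: 48 MP exceeds the cap
console.log(fitsIOSCanvas(4096, 4096)); // true
```

Reading the source dimensions to feed these checks requires decoding the image header first (e.g. via createImageBitmap); checking file.size alone is a cruder but cheaper proxy.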

Output MIME type mismatch: When fileType is not set, the output inherits the input MIME type. PNG input therefore produces PNG output, which compresses poorly compared to JPEG or WebP for photographic content. Force fileType: 'image/jpeg' or 'image/webp' when targeting size reductions on screenshot uploads.
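A small routing helper makes that policy explicit. pickOutputType is our own function, and the mapping is one reasonable choice, not the only one:

```javascript
// Choose an output MIME type: PNG/BMP input is re-encoded to a lossy format
// for real size wins; JPEG and WebP pass through unchanged.
function pickOutputType(inputType) {
  if (inputType === 'image/png') return 'image/webp';
  if (inputType === 'image/bmp') return 'image/jpeg';
  return inputType;
}

console.log(pickOutputType('image/png'));  // 'image/webp'
console.log(pickOutputType('image/jpeg')); // 'image/jpeg'
```

Pass the result as the fileType option; remember that PNG screenshots with sharp text can look worse as JPEG, so WebP is usually the safer lossy target.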

AbortController cancellation: Long compressions can be cancelled with signal from an AbortController. Always wire one up to component unmount in React/Vue to prevent setState-after-unmount warnings and unnecessary CPU burn.

const controller = new AbortController();

// Schedule the cancellation BEFORE awaiting, or it can never fire in time
const timer = setTimeout(() => controller.abort(), 3000);

try {
  const compressed = await imageCompression(file, {
    maxSizeMB: 1,
    signal: controller.signal
  });
} catch (err) {
  console.log('compression aborted or failed:', err);
} finally {
  clearTimeout(timer);
}

Alternatives Comparison

browser-image-compression vs Compressor.js: Compressor.js (by Fengyuan Chen) has a similar API surface and comparable output quality. The main difference is worker support: browser-image-compression includes a built-in Web Worker path while Compressor.js runs on the main thread only. For forms where users drag many files at once, the worker offload matters. For single-file profile uploads, either library works. Compressor.js has a smaller bundle (around 12 KB vs 40 KB) when worker support is not needed.

browser-image-compression vs native Canvas API: Rolling your own with HTMLCanvasElement.toBlob() works for simple cases but misses the iterative quality search, EXIF rotation, and automatic canvas downscaling. Expect to write 100-200 lines of code to match even the basic features, and then discover iOS Safari canvas quirks the hard way. Use the library unless bundle size is truly critical, in which case copy only the size-targeting loop.

browser-image-compression vs server-side compression (Sharp, ImageMagick): Server-side processing produces better quality output because it can use libvips or mozjpeg with unlimited CPU, but it happens after upload. Client-side compression wins on upload bandwidth and server storage costs. Many production apps layer both: compress on the client to shrink the upload, then re-encode on the server for canonical thumbnail variants.

References

The official sources at the top of this article track every release. The GitHub README includes a live demo page with a file picker for experimenting with option combinations, and the examples/ directory shows framework-specific integrations for React, Vue, and vanilla HTML. For advanced use cases (custom encoders, EXIF round-tripping), read the TypeScript definitions in dist/browser-image-compression.d.ts which expose helpers like getDataUrlFromFile, loadImage, and drawImageInCanvas.
