How to Build a Resumable File Uploader in React with Node.js and resumable.js


📦 Why Resumable Uploads?
Uploading large files comes with risks: unstable connections, browser crashes, or poor bandwidth can cause a full upload to fail. Instead of starting over every time, resumable uploads let you:
✅ Upload files in small chunks
✅ Resume if the connection breaks
✅ Prevent re-uploads of completed files
✅ Efficiently handle large files (GBs+)
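To get a feel for the chunk math: the number of chunks is simply the file size divided by the chunk size, rounded up. A quick sketch (`chunkCount` is an illustrative helper, not part of resumable.js):

```javascript
// Number of chunks for a given file, assuming the 1 MB chunk size
// we configure in resumable.js later in this tutorial.
const CHUNK_SIZE = 1 * 1024 * 1024; // 1 MB

function chunkCount(fileSizeBytes, chunkSize = CHUNK_SIZE) {
  // Even an empty file is sent as one (empty) chunk
  return Math.max(1, Math.ceil(fileSizeBytes / chunkSize));
}

// A 2.5 GB file becomes 2560 chunks of ~1 MB each:
console.log(chunkCount(2.5 * 1024 * 1024 * 1024)); // 2560
```

Losing the connection at 99% now costs you at most one chunk, not 2.5 GB.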
In this tutorial, we’ll build a fully working Resumable File Uploader using React (Frontend) and Node.js (Backend), powered by the resumable.js library.
🧰 What You’ll Learn
Chunked uploads with resumable.js
Handling file chunks in Node.js with multer
Merging chunks into a complete file
Preventing duplicate uploads using localStorage
Deleting chunks after merge for clean storage

⚙️ Project Setup

🔹 Backend: Express + Multer

Let’s create a simple Express server to handle chunk uploads.

```bash
mkdir resumable-uploader-backend
cd resumable-uploader-backend
npm init -y
npm install express multer cors
```

Create server.js with:

```javascript
// ✅ server.js
const express = require("express");
const cors = require("cors");
const fs = require("fs");
const path = require("path");
const multer = require("multer");

const app = express();
app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

const UPLOAD_DIR = path.join(__dirname, "uploads");
if (!fs.existsSync(UPLOAD_DIR)) fs.mkdirSync(UPLOAD_DIR);
```

🧩 Handling Chunk Uploads and Merge

```javascript
// Check if a chunk exists (resumable.js uses this to skip duplicates)
app.get("/upload", (req, res) => {
  const { resumableIdentifier, resumableChunkNumber } = req.query;
  const chunkFile = path.join(UPLOAD_DIR, `${resumableIdentifier}.${resumableChunkNumber}`);
  // 200 tells resumable.js the chunk is already here; anything else means "send it"
  fs.existsSync(chunkFile) ? res.status(200).send("Found") : res.status(204).end();
});
```
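Conceptually, these test requests let the client work out which chunks still need uploading after an interruption. A sketch of that resume logic (`missingChunks` is an illustrative helper, not part of resumable.js):

```javascript
// Given the chunk files already on disk (named `${identifier}.${n}`)
// and the expected total, return the chunk numbers still to upload.
function missingChunks(storedNames, identifier, totalChunks) {
  const stored = new Set(storedNames);
  const missing = [];
  for (let n = 1; n <= totalChunks; n++) {
    if (!stored.has(`${identifier}.${n}`)) missing.push(n);
  }
  return missing;
}

// Chunks 1 and 3 survived a dropped connection; only 2 and 4 re-upload:
console.log(missingChunks(["abc.1", "abc.3"], "abc", 4)); // [ 2, 4 ]
```

resumable.js performs this check lazily, one GET per chunk, rather than computing the whole set up front, but the effect is the same.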

```javascript
// Upload a chunk with Multer
const storage = multer.diskStorage({
  destination: (req, file, cb) => cb(null, UPLOAD_DIR),
  filename: (req, file, cb) => {
    const { resumableIdentifier, resumableChunkNumber } = req.body;
    cb(null, `${resumableIdentifier}.${resumableChunkNumber}`);
  },
});
const upload = multer({ storage });

app.post("/upload", upload.single("file"), (req, res) => {
  res.status(200).send("Chunk uploaded");
});
```
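For reference, each chunk POST from resumable.js carries metadata fields alongside the binary data; the handler above reads two of them from req.body. A sketch with illustrative values (the identifier shown follows resumable.js's default scheme of file size plus a cleaned-up filename):

```javascript
// The fields the storage callback above relies on (values illustrative):
const chunkFields = {
  resumableChunkNumber: 3,                    // 1-based index of this chunk
  resumableTotalChunks: 100,                  // how many chunks the file was split into
  resumableIdentifier: "104857600-videomp4",  // stable per-file id
  resumableFilename: "video.mp4",             // original file name
};

// The diskStorage filename callback stores each chunk as `${identifier}.${chunkNumber}`:
const storedName = `${chunkFields.resumableIdentifier}.${chunkFields.resumableChunkNumber}`;
console.log(storedName); // "104857600-videomp4.3"
```

Because the identifier is stable across sessions, a re-selected file maps to the same chunk names, which is what makes the existence check on GET /upload work.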

🧹 Merging Chunks + Deleting After Use

```javascript
app.post("/merge", (req, res) => {
  const { filename, totalChunks, identifier } = req.body;
  const finalPath = path.join(UPLOAD_DIR, filename);
  const writeStream = fs.createWriteStream(finalPath);

  let index = 1;
  const appendChunk = () => {
    const chunkPath = path.join(UPLOAD_DIR, `${identifier}.${index}`);
    fs.createReadStream(chunkPath)
      .on("error", () => res.status(500).send("Chunk read error"))
      .on("end", () => {
        // Delete the chunk once its bytes are in the final file
        fs.unlink(chunkPath, () => {});
        if (++index <= totalChunks) appendChunk();
        else writeStream.end(() => res.send("File merged successfully"));
      })
      .pipe(writeStream, { end: false });
  };

  appendChunk();
});

app.listen(5000, () => console.log("✅ Server running on http://localhost:5000"));
```

🔹 Frontend: React + Resumable.js

Install in your React app:

```bash
npm install resumablejs
```

📁 ResumableUploader.js

```javascript
import React, { useEffect, useRef, useState } from "react";
import Resumable from "resumablejs";

const ResumableUploader = () => {
  const browseRef = useRef(null);
  const [uploadProgress, setUploadProgress] = useState(0);
  const [status, setStatus] = useState("");
  const resumableRef = useRef(null);

  const handleFileAdded = (file) => {
    const uploaded = JSON.parse(localStorage.getItem("uploaded") || "[]");
    if (uploaded.includes(file.uniqueIdentifier)) {
      setStatus("File already uploaded.");
      return;
    }
    setStatus("Uploading…");
    resumableRef.current.upload();
  };

  const handleFileSuccess = async (file) => {
    const uploaded = JSON.parse(localStorage.getItem("uploaded") || "[]");
    uploaded.push(file.uniqueIdentifier);
    localStorage.setItem("uploaded", JSON.stringify(uploaded));

    await fetch("http://localhost:5000/merge", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        filename: file.fileName,
        totalChunks: file.chunks.length,
        identifier: file.uniqueIdentifier,
      }),
    });

    resumableRef.current.removeFile(file);
    setStatus("Upload complete and merged.");
  };

  useEffect(() => {
    const r = new Resumable({
      target: "http://localhost:5000/upload",
      chunkSize: 1 * 1024 * 1024, // 1MB
      fileParameterName: "file",
      testChunks: true,
      throttleProgressCallbacks: 1,
    });

    resumableRef.current = r;
    r.assignBrowse(browseRef.current);

    r.on("fileAdded", handleFileAdded);
    r.on("fileProgress", (file) => setUploadProgress(Math.floor(file.progress() * 100)));
    r.on("fileSuccess", handleFileSuccess);
    r.on("fileError", () => setStatus("Upload failed."));
  }, []);

  return (
    <div>
      <h3>Resumable File Uploader</h3>
      <button ref={browseRef}>Choose File</button>
      {uploadProgress > 0 && (
        <>
          <progress value={uploadProgress} max="100" />
          <span>{uploadProgress}%</span>
        </>
      )}
      {status && <p>Status: {status}</p>}
    </div>
  );
};

export default ResumableUploader;
```
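The localStorage bookkeeping in handleFileAdded and handleFileSuccess can be factored into a small helper. A sketch with the storage object injected so it also works outside the browser (`createUploadLedger` is an illustrative name, not part of any library):

```javascript
// Minimal upload-dedupe bookkeeping around a localStorage-like store.
function createUploadLedger(storage, key = "uploaded") {
  const read = () => JSON.parse(storage.getItem(key) || "[]");
  return {
    // Has this file identifier already been fully uploaded?
    has: (id) => read().includes(id),
    // Record a completed upload (idempotent)
    add: (id) => {
      const ids = read();
      if (!ids.includes(id)) storage.setItem(key, JSON.stringify([...ids, id]));
    },
  };
}

// In the browser: const ledger = createUploadLedger(window.localStorage);
// then ledger.has(file.uniqueIdentifier) / ledger.add(file.uniqueIdentifier)
```

Injecting the storage keeps the logic unit-testable with a plain in-memory mock, and makes it easy to swap localStorage for sessionStorage or a server-side check later.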

✅ Final Result

Upload large files chunk-by-chunk

Resume even after refreshing

See real-time progress

Auto-merge on success

Clean up disk space by deleting chunks

🧠 Final Thoughts

Chunked uploads are essential for large files and unstable connections. With resumable.js, React, and Node.js, you get full control and reliability.

This setup can easily be extended to:

🔒 Secure uploads via JWT tokens

☁️ Upload to AWS S3 or Google Cloud

🧼 Auto-clean expired chunks

📂 Organize files per user or project
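For the JWT idea, resumable.js accepts a headers option that is attached to chunk uploads, so adding an Authorization header is a small config change. A sketch (`buildUploadOptions` is an illustrative helper; the token plumbing is up to your app):

```javascript
// Build the options object for `new Resumable(...)` with an auth header.
function buildUploadOptions(token) {
  return {
    target: "http://localhost:5000/upload",
    chunkSize: 1 * 1024 * 1024,
    testChunks: true,
    // Sent with chunk requests; the server can verify it per chunk
    headers: { Authorization: `Bearer ${token}` },
  };
}

// Usage in the component: const r = new Resumable(buildUploadOptions(myJwt));
```

On the Express side, a verification middleware on the /upload and /merge routes would then reject chunks from unauthenticated clients.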
