How to upload large files in JavaScript using chunking
Learn how to chunk files on the frontend, and handle them on the backend
Uploading large files is a difficult task for both developers and users: network connections can be slow or unreliable, uploads get interrupted, and servers impose size limits. Yet large files are now commonplace; better and more accessible cameras routinely produce high-resolution photos and videos, and users expect a simple, intuitive upload experience regardless of how stable their connection is. As with anything in modern web development, making something appear simple to the user often requires substantial effort from the developer.
Thankfully, chunking provides a robust solution to these challenges by breaking a large file into smaller, manageable pieces and uploading them individually. In this article, we'll explore how to implement chunked file uploads in JavaScript, leveraging modern web technologies and practices. This comprehensive guide will cover the basics, provide detailed code examples, and include links to relevant documentation for further reading.
If you're interested in learning more about this topic, you should check out our article on Uploading large files using chunking.
Understanding File Chunking
File chunking involves splitting a large file into smaller parts, or "chunks", and uploading these parts individually. This approach has several benefits:
If a chunk fails to upload, only that chunk needs to be re-uploaded.
Multiple chunks can be uploaded simultaneously to speed up the process.
Servers and clients can handle smaller chunks more efficiently.
Example Use Cases
Uploading large video or image files.
Handling uploads over unstable internet connections, a common situation on mobile devices.
Implementing resumable uploads.
Setting Up the Environment
Before we dive into the code, ensure you have the following prerequisites:
Node.js: For setting up a simple server.
Express.js: A minimal and flexible Node.js web application framework.
Frontend Tools: HTML5, JavaScript (ES6+), and Fetch API.
Although JavaScript is required for the frontend no matter what approach we take, you can easily switch out the backend for your own stack. We've used Node.js and Express.js here, but you can just as easily use PHP and Laravel, or any other common stack.
Creating the Frontend with HTML and JavaScript
Let's start by creating a simple HTML interface for file uploads.
HTML Interface
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Chunked File Upload</title>
</head>
<body>
  <h1>Upload Large Files with Chunking</h1>
  <input type="file" id="fileInput" />
  <button id="uploadButton">Upload</button>
  <progress id="uploadProgress" value="0" max="100"></progress>
  <script src="upload.js"></script>
</body>
</html>
JavaScript Code for Chunking and Uploading
Next, we'll write the JavaScript code to handle the file input, chunking, and uploading.
document.getElementById('uploadButton').addEventListener('click', uploadFile);

const CHUNK_SIZE = 5 * 1024 * 1024; // 5MB. You may want to lower this if you anticipate poor connections

async function uploadFile() {
  const fileInput = document.getElementById('fileInput');
  const file = fileInput.files[0];
  if (!file) return alert('Please select a file.');

  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  let currentChunk = 0;

  const uploadProgress = document.getElementById('uploadProgress');
  uploadProgress.value = 0;
  uploadProgress.max = totalChunks;

  while (currentChunk < totalChunks) {
    // Slice the file into a CHUNK_SIZE piece; the last chunk may be smaller
    const start = currentChunk * CHUNK_SIZE;
    const end = Math.min(start + CHUNK_SIZE, file.size);
    const chunk = file.slice(start, end);

    await uploadChunk(chunk, currentChunk, totalChunks, file.name);

    currentChunk++;
    uploadProgress.value = currentChunk;
  }

  alert('Upload complete!');
}

async function uploadChunk(chunk, chunkNumber, totalChunks, fileName) {
  const formData = new FormData();
  formData.append('chunk', chunk);
  formData.append('chunkNumber', chunkNumber);
  formData.append('totalChunks', totalChunks);
  formData.append('fileName', fileName);

  try {
    const response = await fetch('/upload', {
      method: 'POST',
      body: formData
    });
    if (!response.ok) throw new Error('Upload failed');
  } catch (error) {
    console.error('Error uploading chunk:', error);
    throw error; // re-throw so the upload loop stops instead of silently skipping the chunk
  }
}
In this script, we read the file from the input, split it into chunks, and upload each chunk sequentially using the Fetch API.
Backend Implementation
Now, let's set up the backend to handle the file upload. We'll use Node.js with Express for simplicity.
Setting Up Express Server
First, install the necessary packages:
npm init -y
npm install express multer
Create a new file server.js and set up the Express server:
const express = require('express');
const multer = require('multer');
const fs = require('fs');
const path = require('path');

const app = express();
const upload = multer({ dest: 'uploads/' });

app.post('/upload', upload.single('chunk'), (req, res) => {
  const { chunkNumber, totalChunks, fileName } = req.body;
  const chunk = req.file;

  // Each file gets its own directory of chunks; the recursive option avoids an
  // exists/create race when chunks arrive in parallel.
  // Note: in production, validate fileName to prevent path traversal.
  const chunkDir = path.join(__dirname, 'uploads', fileName);
  fs.mkdirSync(chunkDir, { recursive: true });

  // Move the chunk from multer's temp location to its numbered slot
  const chunkPath = path.join(chunkDir, `${fileName}-${chunkNumber}`);
  fs.renameSync(chunk.path, chunkPath);

  res.sendStatus(200);
});

app.listen(3000, () => {
  console.log('Server is running on port 3000');
});
In this code, we use the multer middleware to handle file uploads. Each chunk is saved in a directory named after the file, with the chunk number appended to the filename.
Handling Errors and Retries
When dealing with network operations, errors are inevitable. Let's enhance our JavaScript code to handle retries.
async function uploadChunk(chunk, chunkNumber, totalChunks, fileName) {
  const formData = new FormData();
  formData.append('chunk', chunk);
  formData.append('chunkNumber', chunkNumber);
  formData.append('totalChunks', totalChunks);
  formData.append('fileName', fileName);

  const MAX_RETRIES = 3;
  let retries = 0;
  let success = false;

  while (retries < MAX_RETRIES && !success) {
    try {
      const response = await fetch('/upload', {
        method: 'POST',
        body: formData
      });
      if (!response.ok) throw new Error('Upload failed');
      success = true;
    } catch (error) {
      retries++;
      console.error(`Retrying upload of chunk ${chunkNumber} (${retries}/${MAX_RETRIES})`);
      await new Promise(resolve => setTimeout(resolve, 1000)); // wait before retrying
    }
  }

  if (!success) throw new Error(`Failed to upload chunk ${chunkNumber} after ${MAX_RETRIES} attempts`);
}
This code will retry uploading a chunk up to three times if it fails, with a delay between attempts.
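A fixed one-second delay works, but repeated failures often mean the network needs more breathing room. A common refinement is exponential backoff, doubling the wait after each failed attempt. Here is a minimal sketch, with backoffDelay as a hypothetical helper:

// Exponential backoff: wait 1s, 2s, 4s, ... as failures accumulate
function backoffDelay(attempt) {
  return 1000 * 2 ** attempt; // attempt is 0-based
}

// In the retry loop above, replace the fixed delay with:
// await new Promise(resolve => setTimeout(resolve, backoffDelay(retries - 1)));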
Assembling Chunks on the Server
After all chunks have been uploaded, we need to assemble them into a single file on the server.
Backend Code for Assembling Chunks
Update the server.js file with a new endpoint to assemble the chunks:
app.post('/assemble', express.json(), (req, res) => {
  const { fileName, totalChunks } = req.body;
  const chunkDir = path.join(__dirname, 'uploads', fileName);
  // The final file cannot share a path with the chunk directory,
  // so write it under a distinct name
  const finalPath = path.join(__dirname, 'uploads', `assembled-${fileName}`);
  const writeStream = fs.createWriteStream(finalPath);

  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(chunkDir, `${fileName}-${i}`);
    const data = fs.readFileSync(chunkPath);
    writeStream.write(data);
    fs.unlinkSync(chunkPath); // remove chunk file after appending
  }

  writeStream.end(() => {
    fs.rmdirSync(chunkDir); // remove the (now empty) directory after assembling
    res.sendStatus(200);
  });
});
This endpoint reads each chunk, writes it to the final file, and cleans up the temporary files and directories.
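One caveat: readFileSync loads each chunk fully into memory. For very large chunks you may prefer to stream them into the final file instead. Here is a sketch under the same conventions as above; the /assemble-streamed route name is only for illustration:

app.post('/assemble-streamed', express.json(), async (req, res) => {
  const { fileName, totalChunks } = req.body;
  const chunkDir = path.join(__dirname, 'uploads', fileName);
  const finalPath = path.join(__dirname, 'uploads', `assembled-${fileName}`);
  const writeStream = fs.createWriteStream(finalPath);
  try {
    for (let i = 0; i < totalChunks; i++) {
      const chunkPath = path.join(chunkDir, `${fileName}-${i}`);
      // Stream the chunk into the final file without buffering it in memory;
      // end: false keeps the destination open for the next chunk
      await new Promise((resolve, reject) => {
        const readStream = fs.createReadStream(chunkPath);
        readStream.on('error', reject);
        readStream.on('close', resolve);
        readStream.pipe(writeStream, { end: false });
      });
      fs.unlinkSync(chunkPath); // remove chunk file after appending
    }
    writeStream.end(() => {
      fs.rmdirSync(chunkDir);
      res.sendStatus(200);
    });
  } catch (error) {
    writeStream.destroy();
    res.sendStatus(500);
  }
});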
Triggering Assembly from the Frontend
After all chunks are uploaded, trigger the assembly process from the frontend:
async function uploadFile() {
  // (previous code omitted for brevity)

  alert('Upload complete! Assembling file...');

  try {
    const response = await fetch('/assemble', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        fileName: file.name,
        totalChunks
      })
    });
    if (!response.ok) throw new Error('Assembly failed');
    alert('File assembled successfully!');
  } catch (error) {
    console.error('Error assembling file:', error);
  }
}
This ensures that once all chunks are uploaded, the server assembles them into the final file.
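Storing chunks individually also opens the door to resumable uploads, one of the use cases mentioned earlier: before uploading, the client asks the server which chunks it already has and skips them. A minimal sketch, assuming a hypothetical /status endpoint:

// Server: report which chunk numbers already exist for a file
app.get('/status/:fileName', (req, res) => {
  const chunkDir = path.join(__dirname, 'uploads', req.params.fileName);
  if (!fs.existsSync(chunkDir)) return res.json({ uploaded: [] });
  // Chunk files are named `${fileName}-${chunkNumber}`, so the number is the last segment
  const uploaded = fs.readdirSync(chunkDir).map(name => parseInt(name.split('-').pop(), 10));
  res.json({ uploaded });
});

// Client (inside uploadFile, before the upload loop):
// const { uploaded } = await (await fetch(`/status/${encodeURIComponent(file.name)}`)).json();
// const alreadyUploaded = new Set(uploaded);
// ...then skip any chunk number in alreadyUploaded when looping.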
Improvements and Best Practices
Implementing chunked uploads is effective, but there are several improvements and best practices to consider:
Parallel Chunk Uploads
Instead of uploading chunks sequentially, you can upload multiple chunks in parallel to speed up the process.
async function uploadFile() {
  const fileInput = document.getElementById('fileInput');
  const file = fileInput.files[0];
  if (!file) return alert('Please select a file.');

  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  const uploadProgress = document.getElementById('uploadProgress');
  uploadProgress.value = 0;
  uploadProgress.max = totalChunks;

  const uploadPromises = [];
  for (let currentChunk = 0; currentChunk < totalChunks; currentChunk++) {
    const start = currentChunk * CHUNK_SIZE;
    const end = Math.min(start + CHUNK_SIZE, file.size);
    const chunk = file.slice(start, end);
    uploadPromises.push(
      uploadChunk(chunk, currentChunk, totalChunks, file.name)
        .then(() => { uploadProgress.value++; }) // advance the bar as each chunk completes
    );
  }

  await Promise.all(uploadPromises);
  alert('Upload complete! Assembling file...');
  // (trigger assembly as before)
}
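Keep in mind that browsers cap concurrent requests per host (six is a common limit for HTTP/1.1), and firing every chunk at once can overwhelm slow connections. A small worker-pool sketch that caps concurrency, reusing the same uploadChunk function; uploadChunksWithLimit and MAX_PARALLEL_UPLOADS are hypothetical names:

const MAX_PARALLEL_UPLOADS = 4; // tune for your users' connections

async function uploadChunksWithLimit(file, totalChunks) {
  let nextChunk = 0;
  // Each worker claims the next unclaimed chunk until none remain;
  // JavaScript's single-threaded execution makes nextChunk++ safe here
  async function worker() {
    while (nextChunk < totalChunks) {
      const current = nextChunk++;
      const start = current * CHUNK_SIZE;
      const end = Math.min(start + CHUNK_SIZE, file.size);
      await uploadChunk(file.slice(start, end), current, totalChunks, file.name);
    }
  }
  const workers = Array.from({ length: Math.min(MAX_PARALLEL_UPLOADS, totalChunks) }, worker);
  await Promise.all(workers);
}

Note that Promise.all rejects as soon as any worker's chunk exhausts its retries; if you'd rather let the remaining chunks finish first, Promise.allSettled is a drop-in alternative.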
Progress Indicators
Provide detailed progress indicators to keep users informed about the upload status.
function updateProgress(currentChunk, totalChunks) {
  const uploadProgress = document.getElementById('uploadProgress');
  uploadProgress.value = currentChunk;
  const percentage = Math.round((currentChunk / totalChunks) * 100);
  // Note: textContent on a <progress> element is only rendered by browsers
  // that don't support <progress>; use a separate label element if you want
  // the percentage visible everywhere
  uploadProgress.textContent = `Uploading: ${percentage}%`;
}

async function uploadFile() {
  // (previous code omitted for brevity)

  uploadProgress.value = 0;
  uploadProgress.max = totalChunks;

  for (let currentChunk = 0; currentChunk < totalChunks; currentChunk++) {
    // (upload code omitted for brevity)
    updateProgress(currentChunk + 1, totalChunks);
  }

  // (assembly trigger omitted for brevity)
}
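Chunk counts are a coarse measure, and the Fetch API does not expose upload progress events within a single request (XMLHttpRequest's upload.onprogress does), so per-chunk completion is the natural granularity here. Since every chunk except the last is exactly CHUNK_SIZE bytes, you can still translate chunk counts into bytes for a friendlier display. A small sketch, with bytesProgress as a hypothetical helper:

// Convert completed chunk count into uploaded bytes and a percentage
function bytesProgress(completedChunks, file) {
  const uploadedBytes = Math.min(completedChunks * CHUNK_SIZE, file.size);
  const percentage = Math.round((uploadedBytes / file.size) * 100);
  return { uploadedBytes, percentage };
}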
Client-Side Integrity Checks
Ensure the integrity of the uploaded chunks by hashing each chunk on the client and sending the hash along with it, so the server can verify what it received.
async function uploadChunk(chunk, chunkNumber, totalChunks, fileName) {
  const chunkHash = await calculateHash(chunk);

  const formData = new FormData();
  formData.append('chunk', chunk);
  formData.append('chunkNumber', chunkNumber);
  formData.append('totalChunks', totalChunks);
  formData.append('fileName', fileName);
  formData.append('chunkHash', chunkHash);

  // (upload code omitted for brevity)
}

async function calculateHash(chunk) {
  // crypto.subtle is only available in secure contexts (HTTPS or localhost)
  const arrayBuffer = await chunk.arrayBuffer();
  const hashBuffer = await crypto.subtle.digest('SHA-256', arrayBuffer);
  // Convert the ArrayBuffer digest to a hex string
  return Array.from(new Uint8Array(hashBuffer)).map(b => b.toString(16).padStart(2, '0')).join('');
}
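The hash is only useful if the server checks it. Since multer has already written the chunk to disk by the time our handler runs, the server can hash the stored file and compare. A minimal sketch using Node's built-in crypto module; verifyChunkHash is a hypothetical helper:

// Node's built-in crypto module (add near the top of server.js)
const crypto = require('crypto');

// Hash the stored chunk and compare it with the hash the client sent
function verifyChunkHash(chunkFilePath, expectedHash) {
  const data = fs.readFileSync(chunkFilePath);
  const actualHash = crypto.createHash('sha256').update(data).digest('hex');
  return actualHash === expectedHash;
}

// Inside the /upload handler, after multer has saved the chunk:
// if (!verifyChunkHash(req.file.path, req.body.chunkHash)) {
//   fs.unlinkSync(req.file.path);
//   return res.sendStatus(400); // the client's retry logic will resend the chunk
// }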
Further Reading
Uploading large files with chunking in JavaScript provides a reliable and efficient way to handle file uploads, especially in scenarios involving large files and unstable networks. By implementing chunked uploads, handling errors gracefully, and optimizing the process with parallel uploads and integrity checks, we can significantly improve the user experience.
This guide covered the basics of chunked file uploads, provided a detailed walkthrough of the frontend and backend implementation, and offered best practices for further improvements. For more information, consider exploring the following resources:
Blob.slice() on MDN: https://developer.mozilla.org/en-US/docs/Web/API/Blob/slice
Fetch API on MDN: https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API
SubtleCrypto.digest() on MDN: https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypto/digest
Express documentation: https://expressjs.com/
multer on GitHub: https://github.com/expressjs/multer