Download file in Node.js

how to download a file in node js

You can download a file in Node.js using the built-in https or http modules, or external libraries such as axios or node-fetch.

Here are two clear examples using native modules and Axios:


✅ Using Built-in Modules (https / fs):

javascript
const https = require('https');
const fs = require('fs');

const url = 'https://example.com/file.zip';
const filePath = 'file.zip';

const file = fs.createWriteStream(filePath);

https.get(url, (response) => {
  if (response.statusCode !== 200) {
    console.error(`Download failed. Status Code: ${response.statusCode}`);
    response.resume(); // consume response data to free up memory
    file.close();
    fs.unlink(filePath, () => {}); // delete the empty file
    return;
  }

  response.pipe(file);

  file.on('finish', () => {
    file.close();
    console.log('Download completed.');
  });
}).on('error', (err) => {
  fs.unlink(filePath, () => {}); // delete incomplete file
  console.error(`Error: ${err.message}`);
});

✅ Using Axios (Recommended for simplicity):

First, install Axios:

bash
npm install axios

Then use the following code:

javascript
const axios = require('axios');
const fs = require('fs');

const url = 'https://example.com/file.zip';
const filePath = 'file.zip';

async function downloadFile(url, path) {
  const response = await axios({
    method: 'GET',
    url: url,
    responseType: 'stream'
  });

  const writer = fs.createWriteStream(path);
  response.data.pipe(writer);

  return new Promise((resolve, reject) => {
    writer.on('finish', () => {
      console.log('Download completed.');
      resolve();
    });
    writer.on('error', (err) => {
      fs.unlink(path, () => {});
      console.error(`Error: ${err.message}`);
      reject(err);
    });
  });
}

downloadFile(url, filePath);

📌 Summary of Steps:

  1. Choose between built-in modules (https) or libraries (axios).

  2. Use a stream (fs.createWriteStream) to write the file.

  3. Handle success/error events properly.

This ensures efficient, robust downloading of files in Node.js.
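
On Node 18+, the built-in fetch API is a third option, with no extra dependency. A minimal sketch (the URL is a placeholder):

javascript
const fs = require('fs');
const { Readable } = require('stream');
const { pipeline } = require('stream/promises');

// Minimal sketch using the global fetch available in Node 18+
async function downloadWithFetch(url, filePath) {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Download failed. Status: ${response.status}`);
  // Convert the web ReadableStream to a Node stream and pipe it to disk
  await pipeline(Readable.fromWeb(response.body), fs.createWriteStream(filePath));
  console.log('Download completed.');
}

downloadWithFetch('https://example.com/file.zip', 'file.zip').catch(console.error);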

how to download it within a range

To download a specific byte range of a file in Node.js, set the HTTP Range header. This is commonly used for downloading large files in segments or resuming interrupted downloads.

Here's how you can achieve this easily:


🚀 Using Built-in Modules (https)

javascript
const https = require('https');
const fs = require('fs');

const url = 'https://example.com/file.zip';
const filePath = 'partial-file.zip';

// Define the byte range (for example, the first 1 MB of the file)
const startByte = 0;
const endByte = 1048575; // 1 MB = 1024 * 1024 bytes, so the last byte index is 1048575

const options = {
  headers: {
    'Range': `bytes=${startByte}-${endByte}`
  }
};

https.get(url, options, (res) => {
  if (res.statusCode !== 206) { // 206 Partial Content
    console.error(`Unexpected status code: ${res.statusCode}`);
    res.resume(); // consume response data to free up memory
    return;
  }

  const fileStream = fs.createWriteStream(filePath);
  res.pipe(fileStream);

  fileStream.on('finish', () => {
    fileStream.close();
    console.log('Partial download completed.');
  });
}).on('error', (err) => {
  fs.unlink(filePath, () => {});
  console.error(`Download failed: ${err.message}`);
});

🚀 Using Axios

javascript
const axios = require('axios');
const fs = require('fs');

const url = 'https://example.com/file.zip';
const filePath = 'partial-file.zip';

// Byte range (e.g., bytes 0 to 1MB-1)
const startByte = 0;
const endByte = 1048575;

async function downloadRange(url, path, start, end) {
  const response = await axios.get(url, {
    headers: { 'Range': `bytes=${start}-${end}` },
    responseType: 'stream'
  });

  if (response.status !== 206) {
    throw new Error(`Unexpected status code: ${response.status}`);
  }

  const writer = fs.createWriteStream(path);
  response.data.pipe(writer);

  return new Promise((resolve, reject) => {
    writer.on('finish', () => {
      console.log('Partial download completed.');
      resolve();
    });
    writer.on('error', (err) => {
      fs.unlink(path, () => {});
      console.error(`Error: ${err.message}`);
      reject(err);
    });
  });
}

downloadRange(url, filePath, startByte, endByte);

📌 Important Notes:

  • When requesting byte ranges, the server must support HTTP Range requests (most servers do).

  • If the range request succeeds, you'll get an HTTP status code 206 Partial Content (see the raw exchange below).

  • Adjust startByte and endByte values as per your requirements.

  • Useful for segmented downloads, resuming downloads, or parallel downloading.

That's how you can download a specific range of bytes from a file in Node.js!

download 8 threads at the same time

To download a file in 8 parallel threads (segments) using Node.js, you can:

  1. Get the file size.

  2. Split it into 8 ranges.

  3. Download each range concurrently using HTTP Range header.

  4. Merge the parts into one file.

Here’s a complete working example using axios, fs, and Promise.all:


✅ Step-by-Step Code

Install required package first:

bash
npm install axios

Then use this code:

javascript
const fs = require('fs');
const axios = require('axios');
const path = require('path');
const os = require('os');

const FILE_URL = 'https://example.com/largefile.zip';
const OUTPUT_FILE = 'final.zip';
const THREADS = 8;

async function getFileSize(url) {
  const res = await axios.head(url);
  const size = parseInt(res.headers['content-length'], 10);
  if (isNaN(size)) throw new Error('Could not get file size');
  return size;
}

async function downloadPart(url, start, end, index) {
  const response = await axios.get(url, {
    headers: { Range: `bytes=${start}-${end}` },
    responseType: 'stream'
  });
  const tempPath = path.join(os.tmpdir(), `part-${index}`);
  const writer = fs.createWriteStream(tempPath);
  response.data.pipe(writer);
  return new Promise((resolve, reject) => {
    writer.on('finish', () => resolve(tempPath));
    writer.on('error', reject);
  });
}

async function mergeParts(tempPaths, outputPath) {
  const writer = fs.createWriteStream(outputPath);
  for (const tempPath of tempPaths) {
    const data = fs.readFileSync(tempPath);
    writer.write(data);
    fs.unlinkSync(tempPath); // delete part after merge
  }
  writer.end();
  console.log('Merge completed:', outputPath);
}

(async () => {
  const size = await getFileSize(FILE_URL);
  const partSize = Math.ceil(size / THREADS);
  const downloadPromises = [];
  for (let i = 0; i < THREADS; i++) {
    const start = i * partSize;
    const end = i === THREADS - 1 ? size - 1 : (start + partSize - 1);
    downloadPromises.push(downloadPart(FILE_URL, start, end, i));
  }
  const tempPaths = await Promise.all(downloadPromises);
  await mergeParts(tempPaths, OUTPUT_FILE);
})();

βš™οΈ What This Does:

  • Detects the total file size.

  • Splits the download into 8 byte ranges.

  • Downloads each part concurrently.

  • Merges them in order into a final file.

  • Deletes temporary part files.


⚠️ Notes:

  • Works only if the server supports HTTP range requests (status 206).

  • Use os.tmpdir() or any other temp path to store parts safely.

  • Optional: Add retry/failure handling for robustness.

Do you want a version that shows download progress or retry on failure too?

add resume and pause functionality if supported by the server

Adding pause/resume functionality to a multi-threaded downloader in Node.js is possible if the server supports Range requests, and if you track the downloaded byte ranges.

Here’s how to do it:


✅ Strategy for Pause/Resume in Multi-threaded Downloader

🧠 Core Concepts:

  • Each thread writes to a temp file, and we track the bytes downloaded per thread.

  • We use a progress.json file to save state (start, end, downloaded).

  • On resume, we read this file and continue unfinished ranges.


🛠 Updated Code with Pause/Resume Support

javascript
const fs = require('fs');
const axios = require('axios');
const path = require('path');
const os = require('os');

const FILE_URL = 'https://example.com/largefile.zip';
const OUTPUT_FILE = 'final.zip';
const THREADS = 8;
const PROGRESS_FILE = 'progress.json';

async function getFileSize(url) {
  const res = await axios.head(url);
  const size = parseInt(res.headers['content-length'], 10);
  if (isNaN(size)) throw new Error('Could not get file size');
  return size;
}

function saveProgress(progress) {
  fs.writeFileSync(PROGRESS_FILE, JSON.stringify(progress, null, 2));
}

function loadProgress() {
  if (fs.existsSync(PROGRESS_FILE)) {
    return JSON.parse(fs.readFileSync(PROGRESS_FILE));
  }
  return null;
}

async function downloadPart(url, part, index) {
  const tempPath = path.join(os.tmpdir(), `part-${index}`);
  const writer = fs.createWriteStream(tempPath, { flags: 'a' }); // append if resuming
  const headers = { Range: `bytes=${part.downloaded}-${part.end}` };
  const response = await axios.get(url, { headers, responseType: 'stream' });

  return new Promise((resolve, reject) => {
    response.data.on('data', chunk => {
      writer.write(chunk);
      part.downloaded += chunk.length;
      saveProgress(progress); // persist progress on each chunk
    });
    response.data.on('end', () => {
      writer.end(() => resolve(tempPath)); // flush buffered data before resolving
    });
    response.data.on('error', reject);
  });
}

async function mergeParts(progress, outputPath) {
  const writer = fs.createWriteStream(outputPath);
  for (let i = 0; i < progress.length; i++) {
    const tempPath = path.join(os.tmpdir(), `part-${i}`);
    const data = fs.readFileSync(tempPath);
    writer.write(data);
    fs.unlinkSync(tempPath);
  }
  fs.unlinkSync(PROGRESS_FILE);
  writer.end();
  console.log('Download complete:', outputPath);
}

let progress;

(async () => {
  const size = await getFileSize(FILE_URL);
  const partSize = Math.ceil(size / THREADS);
  progress = loadProgress();
  if (!progress) {
    // initialize progress if not resuming
    progress = Array.from({ length: THREADS }).map((_, i) => {
      const start = i * partSize;
      const end = i === THREADS - 1 ? size - 1 : start + partSize - 1;
      return { start, end, downloaded: start };
    });
    saveProgress(progress);
  }
  const downloadPromises = progress.map((part, i) => {
    if (part.downloaded > part.end) return Promise.resolve(); // already complete
    return downloadPart(FILE_URL, part, i);
  });
  await Promise.all(downloadPromises);
  await mergeParts(progress, OUTPUT_FILE);
})();

✅ Features Supported:

  • 🟢 Pause = Just stop the script (e.g. CTRL+C).

  • 🟢 Resume = Run the script again; it continues where it left off.

  • 🟢 Each thread stores partial data + tracked progress.

  • 🟢 Automatically skips finished parts.


✅ Optional Enhancements:

  • Add CLI commands: pause, resume, clear.

  • Show a download progress bar per thread.

  • Add retry logic if a thread fails.

Would you like me to add a progress bar or retry mechanism to this version?

check if the server supports streaming by http headers

To check if a server supports streaming and range requests, you can examine its HTTP response headers — especially:

  • Accept-Ranges → indicates support for byte-range requests.

  • Content-Length → needed for resuming logic.

  • Transfer-Encoding: chunked → suggests streaming (chunked transfer).

  • Content-Type → helps infer if the content is streamable (e.g., video/audio).


✅ Example: Check Server Streaming Support with Axios

javascript
const axios = require('axios');

const url = 'https://example.com/largefile.zip';

async function checkStreamingSupport(url) {
  try {
    const res = await axios.head(url);
    const acceptRanges = res.headers['accept-ranges'];
    const contentLength = res.headers['content-length'];
    const transferEncoding = res.headers['transfer-encoding'];
    const contentType = res.headers['content-type'];

    console.log('--- Server Headers ---');
    console.log('Accept-Ranges:', acceptRanges);
    console.log('Content-Length:', contentLength);
    console.log('Transfer-Encoding:', transferEncoding);
    console.log('Content-Type:', contentType);

    if (acceptRanges === 'bytes') {
      console.log('✅ Server supports partial content (Range requests).');
    } else {
      console.log('❌ Server does NOT support Range requests.');
    }

    if (transferEncoding === 'chunked') {
      console.log('✅ Server supports streaming via chunked transfer.');
    } else if (contentLength) {
      console.log('✅ Server provides Content-Length (can download in parts).');
    } else {
      console.log('❌ No clear support for streaming or partial download.');
    }
  } catch (err) {
    console.error('Error checking server:', err.message);
  }
}

checkStreamingSupport(url);

📌 Sample Output:

--- Server Headers ---
Accept-Ranges: bytes
Content-Length: 10485760
Transfer-Encoding: undefined
Content-Type: application/zip
✅ Server supports partial content (Range requests).
✅ Server provides Content-Length (can download in parts).

✅ Summary:

Header                     | Meaning                                  | Use Case
Accept-Ranges: bytes       | Server allows ranged (partial) downloads | Needed for resume/pause
Transfer-Encoding: chunked | Server sends data in chunks (streaming)  | Good for stream processing
Content-Length             | Total size in bytes                      | Required to split ranges

Would you like this wrapped as a reusable CLI or function module?

use it inside the download manager to download the file sequentially if streaming is not supported

Great idea. Here's a version of the Node.js multi-threaded downloader that:

  • βœ”οΈ Detects if the server supports range requests (streaming/partial download).

  • βœ”οΈ If supported, downloads using 8 parallel threads.

  • ❌ If not supported, falls back to a single-threaded sequential download.


✅ Unified Download Manager with Fallback Logic

javascript
const fs = require('fs');
const axios = require('axios');
const path = require('path');
const os = require('os');

const FILE_URL = 'https://example.com/largefile.zip';
const OUTPUT_FILE = 'final.zip';
const THREADS = 8;
const PROGRESS_FILE = 'progress.json';

async function getFileHeaders(url) {
  const res = await axios.head(url);
  return {
    acceptRanges: res.headers['accept-ranges'],
    contentLength: parseInt(res.headers['content-length'], 10),
    contentType: res.headers['content-type'],
    transferEncoding: res.headers['transfer-encoding'],
  };
}

async function downloadPart(url, part, index) {
  const tempPath = path.join(os.tmpdir(), `part-${index}`);
  const writer = fs.createWriteStream(tempPath, { flags: 'a' });
  const headers = { Range: `bytes=${part.downloaded}-${part.end}` };
  const response = await axios.get(url, { headers, responseType: 'stream' });

  return new Promise((resolve, reject) => {
    response.data.on('data', chunk => {
      writer.write(chunk);
      part.downloaded += chunk.length;
      saveProgress(progress);
    });
    response.data.on('end', () => {
      writer.end(() => resolve(tempPath)); // flush buffered data before resolving
    });
    response.data.on('error', reject);
  });
}

function saveProgress(progress) {
  fs.writeFileSync(PROGRESS_FILE, JSON.stringify(progress, null, 2));
}

function loadProgress() {
  if (fs.existsSync(PROGRESS_FILE)) {
    return JSON.parse(fs.readFileSync(PROGRESS_FILE));
  }
  return null;
}

async function mergeParts(progress, outputPath) {
  const writer = fs.createWriteStream(outputPath);
  for (let i = 0; i < progress.length; i++) {
    const tempPath = path.join(os.tmpdir(), `part-${i}`);
    const data = fs.readFileSync(tempPath);
    writer.write(data);
    fs.unlinkSync(tempPath);
  }
  fs.unlinkSync(PROGRESS_FILE);
  writer.end();
  console.log('✅ Download complete:', outputPath);
}

async function downloadSequential(url, outputPath) {
  console.log('⚠️ Server does not support multi-threaded download. Falling back to sequential mode...');
  const response = await axios.get(url, { responseType: 'stream' });
  const writer = fs.createWriteStream(outputPath);
  return new Promise((resolve, reject) => {
    response.data.pipe(writer);
    writer.on('finish', () => {
      console.log('✅ Download complete:', outputPath);
      resolve();
    });
    writer.on('error', reject);
  });
}

let progress;

(async () => {
  const headers = await getFileHeaders(FILE_URL);
  const supportsRange = headers.acceptRanges === 'bytes' && headers.contentLength;
  const fileSize = headers.contentLength;

  if (!supportsRange) {
    await downloadSequential(FILE_URL, OUTPUT_FILE);
    return;
  }

  const partSize = Math.ceil(fileSize / THREADS);
  progress = loadProgress();
  if (!progress) {
    progress = Array.from({ length: THREADS }).map((_, i) => {
      const start = i * partSize;
      const end = i === THREADS - 1 ? fileSize - 1 : start + partSize - 1;
      return { start, end, downloaded: start };
    });
    saveProgress(progress);
  }

  const downloadPromises = progress.map((part, i) => {
    if (part.downloaded > part.end) return Promise.resolve();
    return downloadPart(FILE_URL, part, i);
  });

  await Promise.all(downloadPromises);
  await mergeParts(progress, OUTPUT_FILE);
})();

πŸ” What This Does

Feature                      | Behavior
Server supports range        | Download in 8 threads concurrently
Server doesn't support range | Download whole file in one stream (sequential fallback)
Pause                        | Just stop the script (e.g., CTRL+C)
Resume                       | Run script again; continues where it left off

📦 Optional Add-Ons

  • ✅ Progress bar with cli-progress

  • 🔁 Retry failed threads

  • 🔄 CLI options (pause, resume, status)

Would you like me to integrate a progress bar or command-line control next?

implement the cli-progress and retry logic

Perfect. Here's an enhanced version of your Node.js download manager that includes:


✅ New Features:

  1. 📊 Progress bar per thread using cli-progress

  2. 🔁 Retry logic on download failure (up to 3 retries)

  3. 🧠 Auto pause/resume from a progress.json file

  4. 🔄 Automatic fallback to sequential download if server doesn't support range


🛠 1. Install dependencies:

bash
npm install axios cli-progress

✅ 2. Enhanced Downloader Code:

javascript
const fs = require('fs');
const axios = require('axios');
const path = require('path');
const os = require('os');
const cliProgress = require('cli-progress');

const FILE_URL = 'https://example.com/largefile.zip';
const OUTPUT_FILE = 'final.zip';
const THREADS = 8;
const MAX_RETRIES = 3;
const PROGRESS_FILE = 'progress.json';

// 🧠 Get server headers
async function getFileHeaders(url) {
  const res = await axios.head(url);
  return {
    acceptRanges: res.headers['accept-ranges'],
    contentLength: parseInt(res.headers['content-length'], 10),
    transferEncoding: res.headers['transfer-encoding'],
  };
}

// 💾 Save/load progress
function saveProgress(progress) {
  fs.writeFileSync(PROGRESS_FILE, JSON.stringify(progress, null, 2));
}

function loadProgress() {
  return fs.existsSync(PROGRESS_FILE) ? JSON.parse(fs.readFileSync(PROGRESS_FILE)) : null;
}

// 📤 Download single part with retry and progress
async function downloadPart(url, part, index, bar) {
  const tempPath = path.join(os.tmpdir(), `part-${index}`);
  const writer = fs.createWriteStream(tempPath, { flags: 'a' });

  const download = async (attempt = 1) => {
    try {
      const headers = { Range: `bytes=${part.downloaded}-${part.end}` };
      const response = await axios.get(url, { headers, responseType: 'stream' });

      return new Promise((resolve, reject) => {
        response.data.on('data', (chunk) => {
          writer.write(chunk);
          part.downloaded += chunk.length;
          bar.update(part.downloaded - part.start);
          saveProgress(progress);
        });
        response.data.on('end', () => {
          writer.end(() => resolve(tempPath)); // flush buffered data before resolving
        });
        response.data.on('error', reject);
      });
    } catch (err) {
      if (attempt < MAX_RETRIES) {
        console.warn(`🔁 Retry [${index}] (attempt ${attempt})`);
        return download(attempt + 1);
      } else {
        throw new Error(`❌ Failed to download part ${index}: ${err.message}`);
      }
    }
  };

  return download();
}

// 🔗 Merge all parts into a single file
async function mergeParts(progress, outputPath) {
  const writer = fs.createWriteStream(outputPath);
  for (let i = 0; i < progress.length; i++) {
    const tempPath = path.join(os.tmpdir(), `part-${i}`);
    const data = fs.readFileSync(tempPath);
    writer.write(data);
    fs.unlinkSync(tempPath);
  }
  fs.unlinkSync(PROGRESS_FILE);
  writer.end();
  console.log('✅ Download complete:', outputPath);
}

// 🧱 Fallback: download full file sequentially
async function downloadSequential(url, outputPath) {
  console.log('⚠️ Server does not support partial downloads. Fallback to sequential...');
  const response = await axios.get(url, { responseType: 'stream' });
  const writer = fs.createWriteStream(outputPath);
  return new Promise((resolve, reject) => {
    response.data.pipe(writer);
    writer.on('finish', () => {
      console.log('✅ Download complete:', outputPath);
      resolve();
    });
    writer.on('error', reject);
  });
}

let progress;

(async () => {
  const headers = await getFileHeaders(FILE_URL);
  const supportsRange = headers.acceptRanges === 'bytes' && headers.contentLength;
  const fileSize = headers.contentLength;

  if (!supportsRange) {
    await downloadSequential(FILE_URL, OUTPUT_FILE);
    return;
  }

  // Initialize or resume progress
  const partSize = Math.ceil(fileSize / THREADS);
  progress = loadProgress();
  if (!progress) {
    progress = Array.from({ length: THREADS }).map((_, i) => {
      const start = i * partSize;
      const end = i === THREADS - 1 ? fileSize - 1 : start + partSize - 1;
      return { start, end, downloaded: start };
    });
    saveProgress(progress);
  }

  // Setup progress bars
  const multibar = new cliProgress.MultiBar({
    clearOnComplete: false,
    hideCursor: true,
    format: '[{bar}] {percentage}% | {value}/{total} bytes | Thread {thread}',
  }, cliProgress.Presets.shades_classic);

  const bars = progress.map((part, i) => {
    return multibar.create(part.end - part.start + 1, part.downloaded - part.start, {
      thread: i + 1,
    });
  });

  const downloadPromises = progress.map((part, i) => {
    if (part.downloaded > part.end) return Promise.resolve();
    return downloadPart(FILE_URL, part, i, bars[i]);
  });

  await Promise.all(downloadPromises);
  multibar.stop();
  await mergeParts(progress, OUTPUT_FILE);
})();

📦 Features Recap

Feature                  | Implemented?
Multi-threaded download  | ✅
Resume on rerun          | ✅
Retry failed downloads   | ✅ (3x)
Per-thread progress bars | ✅
Fallback to sequential   | ✅

Would you like to also:

  • Convert this into a CLI tool (npx download-manager)?

  • Add download speed info?

Let me know and I’ll help you modularize or extend it!

add download speed info

Awesome — let's add live download speed per thread to the progress bars. Here's how we'll do it:


✅ What's Added:

  • Tracks download speed per thread in KB/s or MB/s.

  • Updated cli-progress bar to display:

    [#####...] 85% | 800KB/s | 1.3MB/5.0MB | Thread 3

🧠 How Speed Tracking Works:

  • On every data chunk, we:

    • Track the number of bytes received.

    • Calculate speed over time.

  • Use setInterval to update the displayed speed every second.


🛠 UPDATED CODE SNIPPET

Replace the downloadPart() function in the previous code with this version:

javascript
async function downloadPart(url, part, index, bar) {
  const tempPath = path.join(os.tmpdir(), `part-${index}`);
  const writer = fs.createWriteStream(tempPath, { flags: 'a' });

  const download = async (attempt = 1) => {
    try {
      const headers = { Range: `bytes=${part.downloaded}-${part.end}` };
      const response = await axios.get(url, {
        headers,
        responseType: 'stream',
      });

      let received = 0;
      let lastUpdateTime = Date.now();
      let speed = 0;

      // Speed updater: recompute bytes/sec once per second
      const speedInterval = setInterval(() => {
        const now = Date.now();
        const elapsed = (now - lastUpdateTime) / 1000; // in seconds
        speed = received / elapsed; // bytes/sec
        bar.update(part.downloaded - part.start, {
          speed: formatSpeed(speed),
        });
        received = 0;
        lastUpdateTime = now;
      }, 1000);

      return new Promise((resolve, reject) => {
        response.data.on('data', (chunk) => {
          writer.write(chunk);
          part.downloaded += chunk.length;
          received += chunk.length;
        });
        response.data.on('end', () => {
          clearInterval(speedInterval);
          bar.update(part.end - part.start + 1, { speed: 'Done' }); // full range downloaded
          writer.end(() => resolve(tempPath)); // flush buffered data before resolving
        });
        response.data.on('error', (err) => {
          clearInterval(speedInterval);
          reject(err);
        });
      });
    } catch (err) {
      if (attempt < MAX_RETRIES) {
        console.warn(`🔁 Retry [${index}] (attempt ${attempt})`);
        return download(attempt + 1);
      } else {
        throw new Error(`❌ Failed to download part ${index}: ${err.message}`);
      }
    }
  };

  return download();
}

✨ Helper Function: formatSpeed()

Add this above or below your main function:

javascript
function formatSpeed(bytesPerSec) {
  const kb = bytesPerSec / 1024;
  const mb = kb / 1024;
  if (mb >= 1) return `${mb.toFixed(2)} MB/s`;
  return `${kb.toFixed(1)} KB/s`;
}

🧪 Also Update Progress Bar Template

Update the MultiBar config in your setup:

js
const multibar = new cliProgress.MultiBar({
  clearOnComplete: false,
  hideCursor: true,
  format: '[{bar}] {percentage}% | {speed} | {value}/{total} bytes | Thread {thread}',
}, cliProgress.Presets.shades_classic);

✅ Sample Output

[#####-----] 52% | 730.5 KB/s | 2.6MB/5.0MB | Thread 2
[##########] 100% | Done | 5.0MB/5.0MB | Thread 1

Would you like a summary speed indicator across all threads too?

yes

Great! Let's now add a global speed indicator to show total download speed across all threads — live and updated in real time — while keeping the per-thread progress bars.


✅ Final Features Overview

Feature                   | Status
Per-thread download speed | ✅
Global download speed     | ✅
Live progress bars        | ✅
Pause/resume              | ✅
Retry on failure          | ✅
Fallback to sequential    | ✅

🛠 Implementation Plan

  1. Track total downloaded bytes across all threads.

  2. Set an interval to update global speed every second.

  3. Print speed to console (either in the same line or as a separate log above/below the bars).


πŸ” Add to Your Script:

1. 🔧 At the top, define shared counters:

js
let totalDownloaded = 0;
let lastTotalDownloaded = 0;

2. 🧠 Reuse the formatSpeed() helper from earlier (repeated here for completeness):

js
function formatSpeed(bytesPerSec) {
  const kb = bytesPerSec / 1024;
  const mb = kb / 1024;
  if (mb >= 1) return `${mb.toFixed(2)} MB/s`;
  return `${kb.toFixed(1)} KB/s`;
}

3. πŸ• Global speed tracking β€” insert before await Promise.all(...):

js
const globalSpeedInterval = setInterval(() => {
  const speed = totalDownloaded - lastTotalDownloaded; // bytes over the last second
  lastTotalDownloaded = totalDownloaded;
  process.stdout.write(`\r🌐 Global speed: ${formatSpeed(speed)}`);
}, 1000);

4. ✅ Stop it after downloads complete, right before merging:

js
clearInterval(globalSpeedInterval);
console.log(); // for line break

5. 🔄 Update the downloadPart() function → inside .on('data', ...) add:

js
totalDownloaded += chunk.length;

So that section becomes:

js
response.data.on('data', (chunk) => {
  writer.write(chunk);
  part.downloaded += chunk.length;
  received += chunk.length;
  totalDownloaded += chunk.length;
});

✅ Sample Output in Terminal:

🌐 Global speed: 3.25 MB/s
[########--] 86% | 420 KB/s | 4.3MB/5.0MB | Thread 2
[####------] 40% | 1.2 MB/s | 2.0MB/5.0MB | Thread 3
...

This gives you a clean, professional-looking CLI download manager experience.


Would you like a feature to limit global speed (e.g., max 5 MB/s) or throttle individual threads?

implement both

Perfect. Let’s now enhance the downloader with bandwidth throttling, adding two features:


✅ New Features to Implement:

Feature                        | Description
1. Global speed limit          | Restricts total download speed (e.g., 5 MB/s across all threads)
2. Per-thread speed throttling | Ensures no thread exceeds a specified rate (e.g., 1 MB/s per thread)

We’ll implement this by pausing the stream between data chunks (a sleep() built on setTimeout() plus stream.pause()/resume()) to delay reads when needed.


βš™οΈ Setup Config Values (define at top of your file)

js
// Limits (in bytes per second)
const GLOBAL_SPEED_LIMIT = 5 * 1024 * 1024; // 5 MB/s
const THREAD_SPEED_LIMIT = 1 * 1024 * 1024; // 1 MB/s

🧠 Throttle Utility Function

Add this anywhere before downloadPart:

js
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

πŸ” Modify downloadPart() β†’ Insert throttling logic

Replace the .on('data', ...) block in downloadPart() with the following:

js
response.data.on('data', async (chunk) => {
  writer.write(chunk);
  part.downloaded += chunk.length;
  received += chunk.length;
  totalDownloaded += chunk.length;

  // Per-thread throttling: pause the stream so no new chunks
  // arrive while we wait, then resume after the delay
  const delayPerChunk = (chunk.length / THREAD_SPEED_LIMIT) * 1000; // ms
  if (delayPerChunk > 5) { // skip tiny delays
    response.data.pause();
    await sleep(delayPerChunk);
    response.data.resume();
  }
});

🌍 Global Throttling (shared control)

We add a download token bucket to enforce the global limit. Add this at the top of your file:

js
let globalBucket = GLOBAL_SPEED_LIMIT;

Then insert this token refill logic before starting downloads:

js
setInterval(() => {
  globalBucket = GLOBAL_SPEED_LIMIT; // refill each second
}, 1000);

Now modify the .on('data', ...) section again to respect global speed:

js
response.data.on('data', async (chunk) => {
  response.data.pause(); // stop new chunks while we throttle this one

  // Throttle globally: wait until the shared bucket has enough tokens
  while (globalBucket < chunk.length) {
    await sleep(10); // wait a bit and retry
  }
  globalBucket -= chunk.length;

  // Proceed with write and tracking
  writer.write(chunk);
  part.downloaded += chunk.length;
  received += chunk.length;
  totalDownloaded += chunk.length;

  // Per-thread throttling
  const delayPerChunk = (chunk.length / THREAD_SPEED_LIMIT) * 1000;
  if (delayPerChunk > 5) await sleep(delayPerChunk);

  response.data.resume();
});

✅ Optional (Verbose Logging)

You can log when throttling kicks in for debugging:

js
if (delayPerChunk > 5) console.log(`⏳ Throttling Thread ${index + 1}`);

🧪 Summary

Feature                | Status | Method
Global speed limit     | ✅     | Token bucket with refill per second
Per-thread speed limit | ✅     | Pauses the stream after each chunk per thread
Efficiency             | ✅     | Non-blocking, respects concurrency

Would you like to extract all these into a clean CLI tool (aia-downloader) or an npm package?

explain the throttling logic

Absolutely! Let's break down the throttling logic step by step so it's easy to understand and customize.


🧠 Goal of Throttling

Throttle = "slow down" the download speed.

We’re controlling:

  • βœ… Per-thread speed (each thread shouldn't exceed a certain speed)

  • βœ… Global speed (all threads combined shouldn’t go above a limit)

This helps:

  • Prevent network overload

  • Stay within API rate limits

  • Avoid consuming all available bandwidth


🔄 How It Works – Conceptually

1. Per-Thread Throttling

Purpose: Make each thread obey a max speed (e.g., 1MB/s).

Logic:

  • After writing a chunk, calculate:

    delay = chunkSize / threadLimitPerSecond
  • Example:

    • Chunk = 100KB

    • Limit = 1MB/s = 1024KB/s

    • Delay = 100 / 1024 = 0.0976 seconds (~97ms)

  • We wait that long using await sleep().

Code:

js
const delayPerChunk = (chunk.length / THREAD_SPEED_LIMIT) * 1000;
if (delayPerChunk > 5) { // skip tiny delays
  response.data.pause();
  await sleep(delayPerChunk);
  response.data.resume();
}

2. Global Speed Throttling

Purpose: All threads combined must not exceed total bandwidth (e.g., 5MB/s).

Logic:

  • Use a "token bucket" (shared value globalBucket)

  • Each second, refill it to the allowed global bandwidth.

  • Before writing a chunk:

    • Check if there are enough "tokens" (bytes) left in globalBucket.

    • If yes β†’ proceed and subtract the bytes.

    • If no β†’ wait and retry.

Code:

js
// refill every second
setInterval(() => {
  globalBucket = GLOBAL_SPEED_LIMIT;
}, 1000);

// in the download data handler:
while (globalBucket < chunk.length) {
  await sleep(10); // wait a bit
}
globalBucket -= chunk.length;

🎯 Why Use Both?

Throttle Type | Controls          | Use Case
Per-thread    | Fairness per task | Don't let one thread hog speed
Global        | Total bandwidth   | Don't exceed total network cap

This two-layer system gives you fine-grained control over download behavior.


📌 Visual Summary

┌─────────────┐   ┌─────────────┐
│  Thread 1   │   │  Thread 2   │
│ Limit: 1MB/s│   │ Limit: 1MB/s│
└─────┬───────┘   └─────┬───────┘
      │                 │
      └────┬─────┬──────┘
           ▼     ▼
 Global Token Bucket (e.g. 5MB/s total)
           ▼
        Network

Would you like to visualize this in a diagram or refactor it into a reusable download module?

how is using delays the same as restricting network bandwidth usage?

Great question — and you're absolutely right to dig deeper. Let's clarify:


✅ Short Answer

Delays don't restrict actual network bandwidth at the TCP level.
Instead, they control how fast your app reads and writes data, which indirectly limits how much data flows through.

So:

⏱ Delaying read/write operations ≠ true bandwidth shaping,
but ✅ it limits your effective bandwidth usage from the app side.


🧠 Here's What Actually Happens

🎯 Your Node.js app is pulling data from a stream (e.g., HTTP response):

  • The OS/network stack buffers incoming data.

  • Your code decides when to read it.

  • If you pause reading (via await sleep()), the TCP window shrinks, and the server eventually slows down sending.

Effect: You "throttle" how fast you pull data β†’ server adapts β†’ traffic slows down.
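
To make this concrete, here's a minimal sketch of read-side throttling that actually pauses the stream so backpressure kicks in (the 1 MB/s limit and function name are illustrative):

js
const LIMIT_BYTES_PER_SEC = 1024 * 1024; // 1 MB/s (illustrative)

// Pause the readable stream after each chunk for roughly the time
// that chunk "costs" at the target rate, then resume reading.
function throttleToWriter(readable, writer) {
  readable.on('data', (chunk) => {
    writer.write(chunk);
    readable.pause(); // stop reading; buffers fill, TCP window shrinks
    const delayMs = (chunk.length / LIMIT_BYTES_PER_SEC) * 1000;
    setTimeout(() => readable.resume(), delayMs);
  });
}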


📉 Example:

Let’s say:

  • You delay 100ms after reading each 100KB

  • That’s ~1MB/s effective pull rate.

Even though:

  • The server could send at 20MB/s

  • You’re voluntarily reading at just 1MB/s

➑️ So the total bandwidth used stays low, though the physical NIC is still capable of more.


βš™οΈ Analogy

Imagine a high-speed water tap and a small cup:

  • The tap is fast (server can send a lot).

  • You fill the cup, wait a bit, then drink.

  • You’re not limiting the tap β€” just how fast you take sips.


❌ What It’s Not

  • It’s not real QoS (Quality of Service).

  • It doesn’t prevent other apps from using the bandwidth.

  • It can’t cap traffic outside your app.


✅ But It Works Well When…

  • You’re writing a CLI downloader or API scraper.

  • You need basic control to avoid server bans.

  • You’re working on multiple concurrent requests and want fairness.


🔧 If You Need True Bandwidth Control...

Then consider using lower-level tools:

Tool/Method                      | Description
tc (Linux Traffic Control)       | Kernel-level bandwidth control
trickle                          | User-level bandwidth limiter wrapper
Docker network config            | Limit container bandwidth
Proxy with shaping (e.g., Squid) | Centralized control
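
For instance, trickle can wrap the Node process without any code changes. A sketch (trickle's -d flag takes a download rate in KB/s):

bash
# Cap the whole Node process at ~5 MB/s download, at the user level
trickle -d 5120 node downloader.js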

✅ TL;DR

Method                    | Controls Network?      | App-Level Control | OS-Level Bandwidth Cap
await sleep() after chunk | ❌ (not true throttle) | ✅ Yes            | ❌ No
tc, trickle               | ✅ Yes                 | ❌ External       | ✅ Yes

So — you're not forcing the network to slow down,
but you’re slowing your own usage, which results in the same outcome from the server’s perspective.


Do you want a version that uses OS-level throttling or test the download manager's effectiveness with speed charts?