Let's say I want to create a video streaming server
from scratch using Node.js. How do I do that?
Creating a video streaming server from scratch using Node.js involves setting up an HTTP or WebSocket server to serve video files efficiently. Here's a step-by-step guide:
There are two main methods for streaming videos:
Progressive Streaming (HTTP Streaming) – sends video chunks via HTTP (like YouTube).
Adaptive Streaming (HLS, DASH) – uses a playlist with multiple bitrates for a better experience.
For simplicity, let's start with progressive streaming.
You'll need `express` for routing and `fs` to read video files.

```sh
mkdir video-streaming-server
cd video-streaming-server
npm init -y
npm install express
```
Create an `index.js` file and add this code:

```javascript
const express = require("express");
const fs = require("fs");
const path = require("path");

const app = express();
const PORT = 3000;

// Serve the HTML player
app.get("/", (req, res) => {
  res.sendFile(path.join(__dirname, "index.html"));
});

// Stream video endpoint
app.get("/video", (req, res) => {
  const videoPath = "video.mp4"; // Change this to your video file
  const videoSize = fs.statSync(videoPath).size;
  const range = req.headers.range;

  if (!range) {
    return res.status(400).send("Requires Range header");
  }

  const CHUNK_SIZE = 10 ** 6; // 1MB
  // Note: this simple parse only handles open-ended ranges like "bytes=1000-",
  // which is the form browsers typically send for video playback.
  const start = Number(range.replace(/\D/g, ""));
  const end = Math.min(start + CHUNK_SIZE, videoSize - 1);
  const contentLength = end - start + 1;

  const headers = {
    "Content-Range": `bytes ${start}-${end}/${videoSize}`,
    "Accept-Ranges": "bytes",
    "Content-Length": contentLength,
    "Content-Type": "video/mp4",
  };

  res.writeHead(206, headers);
  const videoStream = fs.createReadStream(videoPath, { start, end });
  videoStream.pipe(res);
});

app.listen(PORT, () => {
  console.log(`Server is running on http://localhost:${PORT}`);
});
```
Create a file named `index.html` in the same folder:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Video Streaming</title>
</head>
<body>
  <h1>Video Streaming</h1>
  <video controls width="640" height="360">
    <source src="/video" type="video/mp4">
    Your browser does not support the video tag.
  </video>
</body>
</html>
```
Place a sample video (`video.mp4`) in the project folder and start the server:

```sh
node index.js
```
Visit `http://localhost:3000/` in your browser to watch the streamed video.
Use HLS Streaming (for adaptive bitrate streaming)
Deploy the server (e.g., AWS, DigitalOcean, or a CDN)
Optimize video delivery (e.g., cache, CDN, compression)
Would you like to explore HLS streaming next?
The Range header is used in HTTP requests to enable partial content retrieval. This is essential for video streaming because it allows the client (e.g., a browser or media player) to request only specific portions of a file, rather than downloading the entire video at once.
Why is the `Range` header important?
Efficient Streaming – the client requests only small chunks of the video, improving performance.
Seamless Seeking – lets users skip to different parts of the video without reloading the entire file.
Bandwidth Optimization – reduces data usage, especially for large videos.
Resuming Downloads – supports interrupted downloads by fetching only the missing parts.
When a browser plays a video, it sends an HTTP request with a `Range` header, asking for a specific byte range.

```http
GET /video HTTP/1.1
Range: bytes=1000000-
```
This means:
The client wants the video starting from byte 1,000,000 to the end.
The server does not return the entire file. Instead, it responds with:
```http
HTTP/1.1 206 Partial Content
Content-Range: bytes 1000000-2000000/5000000
Content-Length: 1000001
Content-Type: video/mp4
```
The 206 Partial Content status code tells the client that it's getting only a part of the file.
The `Content-Range` header indicates which portion of the file is being sent.

Handling `Range` in Node.js
In the streaming server, we use:
```javascript
const range = req.headers.range;
if (!range) {
  return res.status(400).send("Requires Range header");
}
```
If the client doesn't send a `Range` header, we reject the request because streaming requires it.
Then, we extract the byte range:
```javascript
const start = Number(range.replace(/\D/g, "")); // Extracts the starting byte
const end = Math.min(start + CHUNK_SIZE, videoSize - 1);
```
This allows us to stream only the requested chunk of the video.
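One caveat: `range.replace(/\D/g, "")` only works for open-ended ranges like `bytes=1000000-`; a request such as `bytes=0-499` would collapse to `0499`. A more robust parse might look like this (the `parseRange` helper is our illustration, not part of the server above):

```javascript
// Parse a Range header like "bytes=1000000-" or "bytes=0-499".
// Returns { start, end } (end is null for open-ended ranges),
// or null if the header is missing or malformed.
function parseRange(header) {
  const match = /^bytes=(\d+)-(\d*)$/.exec(header || "");
  if (!match) return null;
  const start = Number(match[1]);
  const end = match[2] === "" ? null : Number(match[2]);
  return { start, end };
}

console.log(parseRange("bytes=1000000-")); // { start: 1000000, end: null }
console.log(parseRange("bytes=0-499"));    // { start: 0, end: 499 }
console.log(parseRange("not-a-range"));    // null
```

Browsers usually send the open-ended form when seeking, so the simple version works in practice, but the stricter parse also rejects garbage instead of silently producing a wrong start byte.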
What happens without the `Range` header?
If you remove it, the entire video would be sent at once:
```javascript
fs.createReadStream(videoPath).pipe(res);
```
This wastes bandwidth and makes seeking slow, as the browser has to download the whole file before skipping.
The `Range` header makes progressive video streaming possible by letting the client request only small chunks of a video file instead of downloading everything at once.
Let me know if you want to dive deeper into advanced streaming techniques like HLS (HTTP Live Streaming)!
Let's break this down step by step:
`range.replace(/\D/g, "")`

```javascript
const start = Number(range.replace(/\D/g, ""));
```
`range` is a string received from the request headers, like:

```js
"bytes=1000000-"
```
The regular expression `/\D/g` means:
- `\D` – matches any non-digit character (anything except `0-9`).
- `g` – global flag (removes all non-digits in the string).
How it works:

```js
"bytes=1000000-".replace(/\D/g, ""); // Result: "1000000" (only digits remain)
```
`Number("1000000")` converts the string to the number `1000000`.
Final result: `start` is `1000000`, which is the byte position where the client wants to start streaming.
`Math.min(start + CHUNK_SIZE, videoSize - 1)`

```javascript
const end = Math.min(start + CHUNK_SIZE, videoSize - 1);
```
We define the end byte to send as the minimum of:
- `start + CHUNK_SIZE` – the end of the next chunk (e.g., `start + 1MB`).
- `videoSize - 1` – the last byte of the file (to prevent reading beyond the file size).
Example calculation. Let's say:
- `start = 1000000` (from the `Range` header).
- `CHUNK_SIZE = 1MB = 10^6 = 1000000`.
- `videoSize = 5000000` (a 5MB file).

Then:

```js
end = Math.min(1000000 + 1000000, 5000000 - 1);
end = Math.min(2000000, 4999999);
end = 2000000; // Chosen value
```
This ensures we don't read beyond the actual video file size.
```javascript
const start = Number(range.replace(/\D/g, "")); // Extracts the requested start byte.
const end = Math.min(start + CHUNK_SIZE, videoSize - 1); // Ensures we don't read beyond the file size.
```
This is a key part of progressive video streaming, ensuring efficient and controlled chunk-based delivery of video content.
Let me know if you need more clarity!
Good question! The reason we use `videoSize - 1` instead of just `videoSize` is that byte ranges in HTTP are zero-based. Let's break this down:
In computing, byte positions start from 0.
If a video file has `videoSize = 5000000` bytes, the last byte is at position `4999999`, not `5000000`.
Example:
- Video file size: 5,000,000 bytes
- First byte position: 0
- Last byte position: 4,999,999 (`videoSize - 1`)
So, when we calculate the end byte, we make sure it doesn't exceed the valid range.
```javascript
const end = Math.min(start + CHUNK_SIZE, videoSize - 1);
```
If `videoSize` is `5000000` but we set `end = 5000000`, we'd be requesting a byte that doesn't exist.
Using `videoSize - 1` ensures we don't go out of bounds when reading the file.
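To make the zero-based indexing concrete, here is a tiny runnable sketch using an in-memory buffer as a stand-in for a file:

```javascript
// A 5-byte "file": valid byte positions are 0 through 4, not 1 through 5.
const data = Buffer.from([10, 20, 30, 40, 50]);

const size = data.length;   // 5 bytes total
const lastIndex = size - 1; // 4 -- the last valid byte position

console.log(size, lastIndex);      // 5 4
console.log(data[lastIndex]);      // 50 -- the last byte is still readable
console.log(data[size]);           // undefined -- position 5 doesn't exist
```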
Imagine a client requests:
```http
Range: bytes=4000000-
```
And our file size is 5,000,000 bytes.
Without `videoSize - 1`:

```javascript
end = Math.min(4000000 + 1000000, 5000000);
end = 5000000; // ❌ out of range!
```
With `videoSize - 1`:

```javascript
end = Math.min(4000000 + 1000000, 4999999);
end = 4999999; // ✅ correct
```
We use `videoSize - 1` to ensure:
✅ Byte positions stay within the valid range (0 to `videoSize - 1`).
✅ No out-of-bounds byte ranges when reading the file.
Let me know if you need a deeper explanation!
No, removing the last bit does not cause an error; in fact, it's necessary to prevent an out-of-bounds byte range. Let's break this down carefully.
Files are zero-indexed, meaning:
- The first byte is at position `0`.
- If the total file size is 5,000,000 bytes, the last valid byte is at position `4999999` (`videoSize - 1`).
How `videoSize - 1` prevents errors
Imagine we don't use `videoSize - 1`:
```javascript
const end = Math.min(start + CHUNK_SIZE, videoSize);
```
If `videoSize = 5000000`, then `end = 5000000`. But byte position `5000000` does not exist because the last byte is at `4999999`.
❌ In practice, `fs.createReadStream` simply stops at the end of the file rather than throwing, but the response becomes wrong: the `Content-Range` header would advertise a byte position that doesn't exist, and `Content-Length` would overstate the bytes actually sent, which can make clients stall or reject the response.
So we must do:
```javascript
const end = Math.min(start + CHUNK_SIZE, videoSize - 1);
```
✅ This ensures we never exceed the file's actual byte range.
No! The last byte is still included.
For example:
- If `videoSize = 5000000`, the last byte is at `4999999`.
- If `start = 4999000`, then `end = 4999999`, so the final chunk, including the last byte, is served.
With `videoSize - 1` (correct):

| File Size | Requested Start | Computed End (`Math.min(...)`) | Bytes Sent |
|---|---|---|---|
| 5,000,000 | 4,999,000 | 4,999,999 | ✅ 1,000 bytes |
| 5,000,000 | 4,999,500 | 4,999,999 | ✅ 500 bytes |
| 5,000,000 | 4,999,999 | 4,999,999 | ✅ 1 byte |
With `videoSize` instead of `videoSize - 1` (incorrect):

| File Size | Requested Start | Computed End (`Math.min(...)`) | Bytes Sent |
|---|---|---|---|
| 5,000,000 | 4,999,000 | 5,000,000 ❌ | ❌ Invalid range (byte 5,000,000 doesn't exist) |
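The correct table can be reproduced with a short sketch (the `computeEnd` and `bytesSent` helper names are ours, for illustration):

```javascript
const CHUNK_SIZE = 10 ** 6; // 1MB, as in the server code

// Inclusive end byte for a chunked range response, clamped to the file.
function computeEnd(start, videoSize) {
  return Math.min(start + CHUNK_SIZE, videoSize - 1);
}

// Number of bytes in the inclusive range [start, end].
function bytesSent(start, videoSize) {
  return computeEnd(start, videoSize) - start + 1;
}

const videoSize = 5_000_000;
console.log(computeEnd(4_999_000, videoSize), bytesSent(4_999_000, videoSize)); // 4999999 1000
console.log(computeEnd(4_999_500, videoSize), bytesSent(4_999_500, videoSize)); // 4999999 500
console.log(computeEnd(4_999_999, videoSize), bytesSent(4_999_999, videoSize)); // 4999999 1
```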
✅ Using `videoSize - 1` prevents out-of-bounds byte ranges.
✅ The last byte is still included.
❌ Not using `videoSize - 1` produces an invalid range that can make clients stall or error when the response claims bytes beyond the file limit.
So removing the last bit? No. Preventing an error? Yes!
Let me know if you need more details!
To adjust video quality dynamically based on the user's internet speed, you need to implement adaptive bitrate streaming (ABR). This allows the server to deliver different video resolutions (e.g., 240p, 480p, 720p, 1080p) depending on the user's bandwidth. Here's how to do it in Node.js:
You need a streaming format that supports multiple video qualities:
HLS (HTTP Live Streaming) – best for web & mobile (natively supported on Apple platforms, widely used by streaming services).
DASH (Dynamic Adaptive Streaming over HTTP) – an alternative to HLS, used by YouTube.
For simplicity, let's go with HLS.
Use FFmpeg to create multiple versions of the video.
```bash
ffmpeg -i input.mp4 \
  -vf "scale=640:360"   -c:v libx264 -b:v 800k  -hls_segment_filename 360p_%03d.ts  360p.m3u8 \
  -vf "scale=1280:720"  -c:v libx264 -b:v 2500k -hls_segment_filename 720p_%03d.ts  720p.m3u8 \
  -vf "scale=1920:1080" -c:v libx264 -b:v 5000k -hls_segment_filename 1080p_%03d.ts 1080p.m3u8
```
This generates:
- `360p.m3u8` (playlist for 360p)
- `720p.m3u8` (playlist for 720p)
- `1080p.m3u8` (playlist for 1080p)
- `.ts` files (video chunks)
A master `.m3u8` file tells the video player which qualities are available. Create `master.m3u8`:
```m3u8
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p.m3u8
```
Now, set up a Node.js Express server to serve the `.m3u8` files.
```javascript
const express = require("express");
const path = require("path");

const app = express();
app.use(express.static("videos")); // Serve HLS files

app.get("/video", (req, res) => {
  res.sendFile(path.join(__dirname, "videos", "master.m3u8"));
});

app.listen(3000, () => console.log("Streaming server running on port 3000"));
```
Use HLS.js (for browsers that don't support HLS natively):
```html
<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById("video");
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource("/video");
    hls.attachMedia(video);
  } else {
    video.src = "/video"; // Works in Safari (native HLS support)
  }
</script>
```
The video player monitors the user's internet speed.
If the connection is slow, it automatically switches to a lower resolution.
If the connection improves, it upgrades to a higher resolution.
Result: smooth streaming without buffering! ✅
Would you like help setting up a full-stack implementation?
Great question! The HTML5 `<video>` tag itself does not handle adaptive streaming. Instead, the video player (e.g., HLS.js, DASH.js, or native HLS in Safari) manages it internally using these mechanisms:
Most modern video players use buffering and download speed monitoring to decide which resolution to play. Here's how:
Videos are broken into small segments (e.g., 4-10 seconds each).
The player measures how fast it downloads a segment.
If a segment downloads too slowly, it switches to a lower resolution.
If a segment downloads quickly, it switches to a higher resolution.
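That throughput heuristic can be sketched as follows (the bitrate ladder, safety margin, and helper names here are illustrative assumptions, not HLS.js internals):

```javascript
// Levels ordered low -> high, with the bitrate (bits/s) each one needs.
const levels = [
  { name: "360p",  bitrate: 800_000 },
  { name: "720p",  bitrate: 2_500_000 },
  { name: "1080p", bitrate: 5_000_000 },
];

// Estimate throughput from the last segment download:
// bits downloaded / seconds taken.
function estimateThroughput(segmentBytes, downloadSeconds) {
  return (segmentBytes * 8) / downloadSeconds; // bits per second
}

// Pick the highest level whose bitrate fits within a safety margin
// (only budget 80% of measured throughput, leaving headroom).
function pickLevel(throughputBps, safety = 0.8) {
  const budget = throughputBps * safety;
  let chosen = levels[0]; // always fall back to the lowest level
  for (const level of levels) {
    if (level.bitrate <= budget) chosen = level;
  }
  return chosen;
}

// A 1MB segment downloaded in 2s -> 4 Mbps -> 3.2 Mbps budget -> 720p.
const bps = estimateThroughput(1_000_000, 2);
console.log(pickLevel(bps).name); // "720p"
```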
Example:
- Fast internet: a 4K segment loads quickly – the player keeps the high resolution.
- Slow internet: a 4K segment loads too slowly – the player switches to 1080p or lower.
The player tracks the buffer size (how much video is preloaded).
If the buffer starts running low, it reduces the quality.
If the buffer stays full, it increases the quality.
Example:
Watching a video on WiFi – smooth streaming at 1080p.
Moving to 4G – buffering detected – the player drops to 720p.
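The buffer rule can be sketched the same way (the watermark thresholds and function name are illustrative assumptions, not any real player's values):

```javascript
// Illustrative buffer-based quality rule.
// bufferSeconds = how much video is downloaded ahead of the playhead.
function adjustForBuffer(currentLevel, bufferSeconds, maxLevel = 2) {
  const LOW_WATERMARK = 5;   // below this we risk a stall: step down
  const HIGH_WATERMARK = 20; // above this we have headroom: step up
  if (bufferSeconds < LOW_WATERMARK) return Math.max(currentLevel - 1, 0);
  if (bufferSeconds > HIGH_WATERMARK) return Math.min(currentLevel + 1, maxLevel);
  return currentLevel; // buffer is healthy: keep the current quality
}

console.log(adjustForBuffer(2, 3));  // 1 -- buffer low, step down
console.log(adjustForBuffer(1, 30)); // 2 -- buffer full, step up
console.log(adjustForBuffer(1, 10)); // 1 -- stable, no change
```

Real players combine both signals: throughput picks a candidate level, and buffer occupancy vetoes upgrades that would risk a stall.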
Does the `<video>` tag handle it?

No, the `<video>` tag alone does not handle adaptive streaming.
However, some browsers (Safari, iOS) support HLS natively, meaning the built-in player does handle adaptive streaming.
For other browsers (Chrome, Firefox, Edge), you need a JavaScript player like:
- HLS.js (for `.m3u8` HLS streaming)
- DASH.js (for MPEG-DASH streaming)
If you're using HLS.js, it automatically monitors bandwidth and switches resolutions.
```html
<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.getElementById("video");
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource("/video");
    hls.attachMedia(video);

    // Listen for quality changes
    hls.on(Hls.Events.LEVEL_SWITCHED, function (event, data) {
      console.log("Switched to quality level:", data.level);
    });
  } else {
    video.src = "/video"; // Safari (native HLS support)
  }
</script>
```
If you want to manually set or monitor the quality, you can:
Get a list of available quality levels:
```javascript
console.log(hls.levels);
```
Force a specific quality (e.g., 720p):
```javascript
hls.nextLevel = 1; // 0 = 360p, 1 = 720p, 2 = 1080p
```
Want to test adaptive streaming? Use Chrome DevTools:
1. Open DevTools (F12) and go to the Network tab.
2. Click "No Throttling" and choose:
   - Fast 3G (slow internet)
   - Slow 3G (very slow internet)
3. Watch the video drop in quality automatically!
✅ The `<video>` tag itself doesn't handle adaptive streaming.
✅ The player (e.g., HLS.js, Safari's built-in player) monitors download speed + buffer size.
✅ The player automatically switches resolution based on network conditions.
✅ You can manually control quality using JavaScript.
Let me know if you want an example with DASH or more customization!
If you want to manually calculate internet speed and adjust video quality yourself, you need to measure how fast the user can download video chunks. Let's break it down:
We'll calculate the user's download speed (Mbps) by:
1. Downloading a small file (e.g., a video chunk or test file).
2. Measuring the time it takes to download.
3. Calculating speed using: Speed (Mbps) = File Size (bits) / (Download Time (seconds) × 10^6)
4. Switching video quality based on speed.
Here's a simple way to measure download speed using JavaScript:
```javascript
function measureSpeed(callback) {
  const fileUrl = "/test-file.mp4"; // Small test file (~1MB)
  const startTime = Date.now();

  fetch(fileUrl)
    .then(response => response.blob())
    .then(blob => {
      const endTime = Date.now();
      const duration = (endTime - startTime) / 1000; // Convert ms to seconds
      const fileSize = blob.size * 8; // Convert bytes to bits
      const speedMbps = fileSize / (duration * 1_000_000); // Convert bits to Mbps
      callback(speedMbps);
    });
}

// Example usage
measureSpeed(speed => {
  console.log(`Your internet speed: ${speed.toFixed(2)} Mbps`);
});
```
1. Fetches a small test file (`/test-file.mp4`).
2. Measures start & end time to calculate the download duration.
3. Converts the file size (bytes → bits) and calculates speed in Mbps.
4. Calls `callback(speedMbps)` with the result.
Once we have the user's speed, we can dynamically set the video resolution.
```javascript
function selectVideoQuality(speed) {
  if (speed > 5) {
    return "1080p.m3u8"; // High quality
  } else if (speed > 2) {
    return "720p.m3u8"; // Medium quality
  } else {
    return "360p.m3u8"; // Low quality
  }
}

measureSpeed(speed => {
  const videoSrc = selectVideoQuality(speed);
  console.log(`Loading video: ${videoSrc}`);

  var video = document.getElementById("video");
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource(videoSrc);
    hls.attachMedia(video);
  } else {
    video.src = videoSrc; // For Safari (native HLS support)
  }
});
```
1. Calls `measureSpeed()`.
2. Selects the best video resolution based on speed:
   - >5 Mbps → 1080p
   - 2-5 Mbps → 720p
   - <2 Mbps → 360p
3. Loads the correct `.m3u8` playlist.
1. Open Chrome DevTools (F12) and go to the Network tab.
2. Under "Throttling", select:
   - Fast 3G (~1.5 Mbps) – should load 360p.
   - Slow 3G (~500 Kbps) – should also load 360p (the lowest tier we defined).
   - No Throttling (~100 Mbps) – should load 1080p.
3. Refresh the page and check the console logs (F12 → Console).
Instead of measuring speed only once, you can check it every X seconds to adapt dynamically:
```javascript
function autoAdjustQuality() {
  setInterval(() => {
    measureSpeed(speed => {
      const newSrc = selectVideoQuality(speed);
      // Note: video.src is reported as an absolute URL, so in practice
      // track the last source you set rather than comparing video.src directly.
      if (video.src !== newSrc) {
        console.log(`Switching to: ${newSrc}`);
        video.src = newSrc; // Reloads video with new quality
      }
    });
  }, 10_000); // Check every 10 seconds
}

autoAdjustQuality();
```
✅ We manually measure download speed using a test file.
✅ We choose video quality dynamically based on internet speed.
✅ We use HLS.js to switch streams in real time.
✅ We periodically check speed to adapt to changing network conditions.
Now you have full control over video quality switching!
Would you like to add a smooth transition between qualities instead of reloading the video?
Yes! You can dynamically transform each chunk of an `fs.createReadStream` in Node.js before sending it to the client. This lets you adjust the video quality on the fly based on the user's internet speed.
Instead of switching entire video files, we can:
Read video chunks using `fs.createReadStream()`.
Transform each chunk (e.g., compress, re-encode, resize bitrate).
Stream the transformed chunk to the client.
Basic streaming with `fs.createReadStream`

Here's how a basic streaming server works:
```javascript
const fs = require("fs");
const http = require("http");

http.createServer((req, res) => {
  const filePath = "video.mp4"; // Original video
  const stat = fs.statSync(filePath);
  const fileSize = stat.size;

  // Get the range header (if requested)
  const range = req.headers.range;
  if (!range) {
    res.writeHead(416, { "Content-Range": `bytes */${fileSize}` });
    return res.end();
  }

  const CHUNK_SIZE = 10 ** 6; // 1MB chunks
  const start = Number(range.replace(/\D/g, ""));
  const end = Math.min(start + CHUNK_SIZE, fileSize - 1);

  // Set response headers
  res.writeHead(206, {
    "Content-Range": `bytes ${start}-${end}/${fileSize}`,
    "Accept-Ranges": "bytes",
    "Content-Length": end - start + 1,
    "Content-Type": "video/mp4",
  });

  // Stream the chunk
  const stream = fs.createReadStream(filePath, { start, end });
  stream.pipe(res);
}).listen(3000, () => console.log("Server running on http://localhost:3000"));
```
1. Reads chunks of 1MB (`CHUNK_SIZE`).
2. Uses `range` headers for partial streaming.
3. Streams the video directly to the browser.
If you want to modify each chunk before sending it, use Node.js streams and FFmpeg.
```javascript
const { spawn } = require("child_process");
const fs = require("fs");
const http = require("http");

http.createServer((req, res) => {
  const filePath = "video.mp4";
  const stat = fs.statSync(filePath);
  const fileSize = stat.size;

  const range = req.headers.range;
  if (!range) {
    res.writeHead(416, { "Content-Range": `bytes */${fileSize}` });
    return res.end();
  }

  const CHUNK_SIZE = 10 ** 6; // 1MB
  const start = Number(range.replace(/\D/g, ""));
  const end = Math.min(start + CHUNK_SIZE, fileSize - 1);

  // Note: after transcoding, the output size no longer matches end - start + 1,
  // so we omit Content-Length and let the response stream instead.
  res.writeHead(206, {
    "Content-Range": `bytes ${start}-${end}/${fileSize}`,
    "Accept-Ranges": "bytes",
    "Content-Type": "video/mp4",
  });

  // Read the video chunk.
  // Caveat: an arbitrary byte slice of an MP4 is not a self-contained file,
  // so FFmpeg can generally only decode slices starting at the beginning of
  // the file -- one reason pre-encoded HLS is more practical.
  const videoStream = fs.createReadStream(filePath, { start, end });

  // Use FFmpeg to compress each chunk.
  const ffmpegProcess = spawn("ffmpeg", [
    "-i", "pipe:0",                          // Read from stdin
    "-b:v", "500k",                          // Reduce bitrate (adjustable)
    "-movflags", "frag_keyframe+empty_moov", // Fragmented MP4: pipeable output
    "-f", "mp4",                             // Output format
    "pipe:1",                                // Write to stdout
  ]);

  videoStream.pipe(ffmpegProcess.stdin);  // Send the chunk to FFmpeg
  ffmpegProcess.stdout.pipe(res);         // Send transformed chunk to client
}).listen(3000, () => console.log("Server running on http://localhost:3000"));
```
1. The server reads only the requested chunk (`fs.createReadStream`).
2. The chunk is piped into FFmpeg, which compresses/transcodes it.
3. The transformed chunk is streamed directly to the client.
If you want to change quality dynamically, you can measure the user's speed and adjust the FFmpeg settings.
```javascript
function getBitrateForSpeed(speed) {
  if (speed > 5) return "2000k"; // >5 Mbps → 1080p
  if (speed > 2) return "1000k"; // 2-5 Mbps → 720p
  return "500k";                 // <2 Mbps → 480p
}

// Note: measureSpeed() must be a server-side helper here (e.g., reading a
// client-reported speed from a query parameter); the browser-based version
// shown earlier can't run inside the server.
http.createServer((req, res) => {
  measureSpeed(speed => {
    const bitrate = getBitrateForSpeed(speed);
    console.log(`Adjusting bitrate to: ${bitrate}`);

    const filePath = "video.mp4";
    const stat = fs.statSync(filePath);
    const fileSize = stat.size;

    const range = req.headers.range;
    if (!range) {
      res.writeHead(416, { "Content-Range": `bytes */${fileSize}` });
      return res.end();
    }

    const CHUNK_SIZE = 10 ** 6;
    const start = Number(range.replace(/\D/g, ""));
    const end = Math.min(start + CHUNK_SIZE, fileSize - 1);

    // Content-Length is omitted: the transcoded output size differs from
    // the raw byte range.
    res.writeHead(206, {
      "Content-Range": `bytes ${start}-${end}/${fileSize}`,
      "Accept-Ranges": "bytes",
      "Content-Type": "video/mp4",
    });

    const videoStream = fs.createReadStream(filePath, { start, end });

    // Adjust bitrate dynamically
    const ffmpegProcess = spawn("ffmpeg", [
      "-i", "pipe:0",
      "-b:v", bitrate,                         // Dynamically selected
      "-movflags", "frag_keyframe+empty_moov", // Fragmented MP4: pipeable output
      "-f", "mp4",
      "pipe:1",
    ]);

    videoStream.pipe(ffmpegProcess.stdin);
    ffmpegProcess.stdout.pipe(res);
  });
}).listen(3000, () => console.log("Adaptive streaming server running!"));
```
1. Uses `measureSpeed(speed => { ... })` to detect network speed.
2. Chooses the bitrate dynamically (`500k`, `1000k`, `2000k`).
3. Streams video chunks with an optimized bitrate for each user.
Instead of modifying each chunk manually, another approach is:
- Pre-encode multiple quality levels (`1080p`, `720p`, `480p`).
- Use an `.m3u8` HLS playlist to let the player switch dynamically.
This is easier and more efficient than on-the-fly transformation.

Would you like an HLS setup guide as well?
✅ You can transform each chunk of an `fs.createReadStream`.
✅ Using FFmpeg, you can compress, resize, or change the bitrate dynamically.
✅ By measuring speed, you can adjust video quality in real time.
✅ An HLS playlist might be a better alternative for smoother switching.
Would you like help setting up HLS with Node.js?