NodeJS Server explained
In this article I'll explain how a NodeJS server works: the basics every NodeJS developer should know to efficiently build and debug any kind of Node application.
Nowadays software developers are used to working with frameworks for building server-side applications, like Express, Koa or Nest. These frameworks are really good and help us build large, scalable applications, but they also abstract a lot of interesting things away. Using frameworks creates a magic effect, but we don't believe in magic, do we?
Let's start from scratch and create a simple web server using the Node docs. The snippet below creates a server that responds with a `Hello World` string when you access `localhost:3000`.
import { createServer } from 'node:http'

const hostname = '127.0.0.1'
const port = 3000

const server = createServer((req, res) => {
  res.statusCode = 200
  res.setHeader('Content-Type', 'text/plain')
  res.end('Hello World')
})

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`)
})
What does the `createServer` function do? It creates an instance of the `http.Server` class, which inherits from the `net.Server` class. This, in turn, extends the `EventEmitter` class. As you may know, an EventEmitter can emit various events. If you're unfamiliar with this concept, check out the article: [[Node.js EventEmitter explained]]. However, at this point, we're only interested in the `request` event, which is emitted whenever an HTTP request is received. This event provides both the request and response objects.
The code above is just a wrapper for:
import http from 'node:http'

const server = new http.Server()
server.on('request', (req, res) => {
  res.end('Hello!')
})
The `http.Server` instance gives you access to methods and events like:
- `server.listen(...)`
- `server.close()`
- `server.on('request', ...)`
- `server.on('connection', ...)`
- `server.timeout`
- `server.maxHeadersCount`
- And more…

Because it inherits from `EventEmitter`, you can listen to low-level events.
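The inheritance chain is easy to verify directly. A minimal sketch:

```javascript
import http from 'node:http'
import net from 'node:net'
import { EventEmitter } from 'node:events'

const server = new http.Server()

// http.Server -> net.Server -> EventEmitter
console.log(server instanceof net.Server) // true
console.log(server instanceof EventEmitter) // true
```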
Now that we understand what happens when the server receives an HTTP request, let's take a closer look at the `request` and `response` objects.
Both `request` and `response` are stream objects: `request` is a Readable stream, while `response` is Writable. This allows us to leverage the full power of NodeJS streams when working with them. To learn more about streams, check the [[NodeJS Streams explained]] article.
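To see both streams in action, here is a minimal echo server sketch that pipes the request (Readable) straight into the response (Writable); the handler and content type are illustrative:

```javascript
import { createServer } from 'node:http'

// Echo server: the request body flows chunk by chunk into the response
const echoServer = createServer((req, res) => {
  res.statusCode = 200
  res.setHeader('Content-Type', 'application/octet-stream')
  req.pipe(res) // Readable piped into Writable, no full-body buffering
})

// echoServer.listen(3000)
```

POSTing a body to this server returns the same bytes back, without the server ever holding the whole body in memory.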
Basic HTTP Server with Routing
We're already able to handle incoming requests, but now we need to figure out how to differentiate and process them individually. The `request` object provides all the necessary metadata for this, such as:
- `req.method` - HTTP method like `GET`, `POST`, etc.
- `req.url` - the full URL (path + query string)
- `req.headers` - an object with all headers
- `req.httpVersion` - HTTP version string
- `req.socket` - the underlying TCP socket
To implement basic routing, we can extract the path using `req.url` and access additional information through the `req.headers` property.
const path = new URL(req.url, `http://${req.headers.host}`).pathname
Then, we can route the requests based on the extracted path.
switch (path) {
  case '/': {
    res.setHeader('Content-Type', 'text/html')
    res.end('<h1>Hello World</h1>')
    break
  }
  case '/upload': {
    return uploadHandler(req, res)
  }
  case '/posts': {
    return postsHandler(req, res)
  }
  default: {
    // Without a default branch, requests to unknown paths would hang
    res.statusCode = 404
    res.end('Not Found')
  }
}
Inside each handler we can process the request based on its method using the `req.method` property.
async function postsHandler(req, res) {
  switch (req.method) {
    case 'POST':
      res.statusCode = 200
      res.setHeader('Content-Type', 'application/json')
      res.end(JSON.stringify({ status: 'ok' }))
      break
    case 'GET':
      res.statusCode = 200
      res.setHeader('Content-Type', 'application/json')
      res.end(JSON.stringify({ status: 'ok' }))
      break
  }
}
How to get path params?
To get path params (dynamic values embedded directly in the URL path that are used to identify a specific resource) from the request object, we need to manually parse a URL created from the `req.url` property.
const pathname = new URL(req.url, `http://${req.headers.host}`).pathname // e.g. /users/42/posts/7
const segments = pathname.split('/').filter(Boolean)

if (segments[0] === 'users' && segments[2] === 'posts') {
  const userId = segments[1]
  const postId = segments[3]
  res.writeHead(200, { 'Content-Type': 'application/json' })
  res.end(JSON.stringify({ userId, postId }))
} else {
  res.writeHead(404)
  res.end('Not Found')
}
How to get query params?
Use the `searchParams` property of the URL interface to retrieve query parameters (key-value pairs appended to the end of a URL to filter, sort, search, paginate, or otherwise modify the request). This property is read-only and returns a `URLSearchParams` object, which allows us to decode the query parameters contained in the URL.
/** /posts?ids=1,2,3 */
const parsedUrl = new URL(req.url, `http://${req.headers.host}`)
const searchParams = parsedUrl.searchParams
const ids = searchParams.get('ids') // '1,2,3'
Here we get the query parameter string, which can then be parsed and validated.
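For example, the `ids` value above can be split and validated into numbers. A minimal sketch (`parseIds` is a hypothetical helper):

```javascript
// Turns an 'ids=1,2,3' query parameter into [1, 2, 3],
// dropping anything that isn't an integer
function parseIds(searchParams) {
  const raw = searchParams.get('ids')
  if (!raw) return []
  return raw.split(',').map(Number).filter(Number.isInteger)
}

const url = new URL('http://localhost/posts?ids=1,2,3')
console.log(parseIds(url.searchParams)) // [ 1, 2, 3 ]
```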
How to retrieve JSON data from a request Stream?
Streams transmit data in small chunks. To obtain the complete data, we need to collect and concatenate all of these chunks.
There are several ways to achieve this:
- Listen to the `data` event (the default event of a Readable stream)
const body = []
req.on('data', (chunk) => {
  body.push(chunk)
})
Here we listen to the `data` event and collect the binary data chunks in an array.
req.on('end', () => {
  const fullBody = Buffer.concat(body)
  const bodyStr = new TextDecoder().decode(fullBody)
  const bodyJSON = JSON.parse(bodyStr) // parsed request body, ready to use
  res.statusCode = 200
  res.setHeader('Content-Type', 'application/json')
  res.end(JSON.stringify({ status: 'ok' }))
})
The `end` event is emitted once all chunks have been received. To retrieve the full request body, we concatenate the chunks and decode them into a string. After that, we can parse the string as JSON if needed.
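The same concat, decode, parse sequence can be seen in isolation with hand-made chunks:

```javascript
// Simulated chunks, as they might arrive over the wire
const chunks = [Buffer.from('{"status"'), Buffer.from(':"ok"}')]

const fullBody = Buffer.concat(chunks)
const bodyStr = new TextDecoder().decode(fullBody) // '{"status":"ok"}'
const bodyJSON = JSON.parse(bodyStr)

console.log(bodyJSON.status) // ok
```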
Note: If the expected body is a string (e.g., JSON payload), you can directly concatenate the chunks as strings.
let body = ''
req.on('data', (chunk) => {
  body += chunk // same as body = body + chunk.toString()
  // or decode each chunk explicitly: new TextDecoder().decode(chunk)
})
- Use an async iterator (`for await...of`)
const chunks = []
for await (const chunk of req) {
chunks.push(chunk)
}
const fullBody = Buffer.concat(chunks)
This syntax is cleaner and more readable.
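`Readable.from` lets us try the `for await...of` pattern without a real HTTP request (`readAll` is a hypothetical helper name):

```javascript
import { Readable } from 'node:stream'

// Collects every chunk of any Readable stream into a single Buffer
async function readAll(stream) {
  const chunks = []
  for await (const chunk of stream) {
    chunks.push(Buffer.from(chunk))
  }
  return Buffer.concat(chunks)
}

const fullBody = await readAll(Readable.from(['Hello, ', 'World']))
console.log(fullBody.toString()) // Hello, World
```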
- Use `stream/consumers.text()` (Node.js 16.7+)
import { text } from 'stream/consumers'
const body = await text(req)
console.log('Body:', body)
Also available: `buffer()`, `json()`, and `arrayBuffer()`.
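`json()` works the same way; here it is fed a fake request via `Readable.from`:

```javascript
import { json } from 'node:stream/consumers'
import { Readable } from 'node:stream'

// Any Readable works here, including a real http request object
const data = await json(Readable.from(['{"status":', '"ok"}']))
console.log(data.status) // ok
```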
With the popular Express framework, the same can be done like this:
app.use(express.json())
app.post('/', (req, res) => {
  console.log(req.body) // already parsed!
})
How to retrieve binary data from a request Stream?
There is no difference between retrieving string or binary data, as all data chunks from the `request` are always raw binary data (Buffer objects). The final result depends solely on how we choose to decode (interpret) that raw data.
- Is it text? Decode it with `Buffer.toString()` or `TextDecoder`.
- Is it an image? Save or process it as binary.
- Is it JSON? Decode as UTF-8 and then `JSON.parse()`.
- Is it a file upload? Parse it using `multipart/form-data` logic.
import fs from 'node:fs'
import { Readable } from 'node:stream'

async function uploadHandler(req, res) {
  switch (req.method) {
    case 'POST': {
      const chunks = []
      for await (const chunk of req) {
        chunks.push(chunk)
      }
      const fullBody = Buffer.concat(chunks)
      try {
        await fs.promises.writeFile('./uploads/image.jpg', fullBody)
      } catch (e) {
        throw new Error(`Error writing file: ${e.message}`)
      }
      res.statusCode = 200
      res.setHeader('Content-Type', 'application/json')
      res.end(JSON.stringify({ status: 'ok' }))
      break
    }
    case 'GET': {
      res.statusCode = 200
      res.setHeader('Content-Type', 'application/json')
      const respData = Readable.from([JSON.stringify({ status: 'ok' })])
      respData.pipe(res)
      break
    }
  }
}
We can also handle this more efficiently by using streams.
const fileStream = fs.createWriteStream('./uploads/image.jpg')
req.pipe(fileStream)
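Note that plain `pipe()` doesn't propagate errors between the two streams; `stream.pipeline` (here its promise variant) does. A sketch writing to a temporary file, with `saveUpload` as a hypothetical helper:

```javascript
import { pipeline } from 'node:stream/promises'
import { Readable } from 'node:stream'
import fs from 'node:fs'
import os from 'node:os'
import path from 'node:path'

// pipeline forwards errors and cleans up both streams on failure
async function saveUpload(req, destination) {
  await pipeline(req, fs.createWriteStream(destination))
}

// Demo with a fake "request" stream written to a temp file
const dest = path.join(os.tmpdir(), 'upload-demo.bin')
await saveUpload(Readable.from(['binary ', 'payload']), dest)
console.log(fs.readFileSync(dest, 'utf8')) // binary payload
```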
Congratulations! You now have the knowledge and understanding needed to build any type of NodeJS server.
While using pure Node.js is a great learning experience, it’s not the most convenient approach for building large or complex projects. Still, having a solid grasp of the native NodeJS APIs is essential for every developer.