Introduction
Node.js is a runtime environment that allows JavaScript to run outside the browser. Traditionally, JavaScript was mainly used to make web pages interactive, but Node.js changed that by enabling developers to build servers, command-line tools, APIs, automation scripts, and real-time applications using the same language on both the frontend and backend. It exists to solve a practical problem: developers wanted a fast, scalable, and efficient way to handle many connections at once without creating heavy server processes for every request. Node.js uses Google Chrome's V8 engine to execute JavaScript quickly, and it is designed around an event-driven, non-blocking I/O model, which makes it especially useful for network applications and data-heavy systems that wait on files, databases, or external services.
In real life, Node.js is used in chat applications, streaming platforms, REST APIs, dashboards, e-commerce systems, serverless functions, and build tooling. Tools like npm, Vite, Next.js development servers, and many automation workflows rely on Node.js. One reason it became so popular is that frontend and backend teams can share JavaScript knowledge, libraries, and even validation logic. Another reason is speed of development: Node.js has a huge package ecosystem and makes it easy to prototype and ship features quickly.
At its core, Node.js includes a runtime, built-in modules, and an event loop. The runtime executes JavaScript. Built-in modules such as fs, http, and path provide file handling, web server capabilities, and path utilities. The event loop helps Node.js manage asynchronous tasks like reading files or waiting for HTTP responses without blocking the whole application. This is why Node.js is often described as lightweight and efficient. It does not mean it is magically faster for every task; instead, it performs very well when applications spend a lot of time waiting for input and output operations.
Step-by-Step Explanation
To start with Node.js, first install it from the official website. Then verify the installation using node -v and npm -v in a terminal. Create a file such as app.js, write JavaScript inside it, and run it with node app.js. A Node program can print output, read files, create servers, or use installed packages. The syntax is standard JavaScript, but Node.js also gives access to server-side features. For example, instead of interacting with the browser DOM, you often work with files, environment variables, processes, and HTTP requests.
Comprehensive Code Examples
Basic example
console.log('Hello from Node Js');
Real-world example
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Welcome to my Node Js server');
});

server.listen(3000, () => {
  console.log('Server running on http://localhost:3000');
});
Advanced usage
const fs = require('fs');

fs.readFile('notes.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err.message);
    return;
  }
  console.log('File contents:', data);
});

console.log('Reading file in the background...');
Common Mistakes
- Mistake: Expecting browser features like document or window in Node.js. Fix: Remember Node.js is server-side and does not include the browser DOM.
- Mistake: Forgetting to install dependencies before using them. Fix: Run npm install package-name and check package.json.
- Mistake: Using blocking operations carelessly. Fix: Prefer asynchronous APIs when building servers and scalable apps.
Best Practices
- Use clear file and folder names for maintainability.
- Start with small scripts, then move toward modular applications.
- Handle errors explicitly in callbacks, promises, and async functions.
- Keep Node.js updated to benefit from performance and security improvements.
Practice Exercises
- Create a file named intro.js that prints your name and a short message about learning Node.js.
- Write a Node.js script that displays the current year in the terminal.
- Build a simple server that responds with plain text on port 3000.
Mini Project / Task
Create a small welcome server in Node.js that listens on port 3000 and returns a different text message for the homepage and an /about route.
Challenge (Optional)
Build a Node.js script that reads a text file asynchronously and prints the number of words in that file to the terminal.
How Node.js Works
Node.js is an open-source, cross-platform, JavaScript runtime environment that executes JavaScript code outside a web browser. It was created by Ryan Dahl in 2009 and is built on Chrome's V8 JavaScript engine. The primary reason for Node.js's existence was to enable developers to use JavaScript for server-side programming, bridging the gap between front-end and back-end development with a single language. This unification simplifies development, reduces context-switching, and allows for better code sharing between client and server. Node.js is widely used for building fast, scalable network applications, including real-time applications like chat servers, streaming services, and APIs, as well as complex single-page applications and microservices. Its non-blocking, event-driven architecture makes it particularly effective for I/O-intensive tasks.
At its core, Node.js operates on a single-threaded, event-driven, non-blocking I/O model. This is a fundamental concept that differentiates it from many traditional server-side technologies. While JavaScript itself is single-threaded, Node.js achieves concurrency not through creating multiple threads for each request, but through an event loop. When Node.js performs an I/O operation (like reading a file from the disk or making a network request), instead of waiting for that operation to complete, it registers a callback function and immediately moves on to process other requests. Once the I/O operation finishes, it places the callback function in an event queue. The event loop continuously monitors this queue and executes callbacks when the main thread is free. This design makes Node.js highly efficient for tasks that involve a lot of waiting for external resources, as it can handle many concurrent connections without needing to spawn a new thread for each, thus saving memory and CPU resources.
Step-by-Step Explanation
Let's break down the execution flow:
- V8 JavaScript Engine: Node.js uses Google's V8 engine, the same engine that powers Google Chrome. V8 compiles JavaScript code directly into machine code, making it very fast.
- Event Loop: This is the heart of Node.js's concurrency model. The event loop is a single-threaded process that constantly checks if the call stack is empty. If it is, it looks into the event queue for pending callbacks and pushes them onto the call stack for execution.
- Non-blocking I/O: When an asynchronous I/O operation is initiated (e.g., reading a file, database query, network request), Node.js offloads the operation to the system kernel (or a thread pool managed by libuv) and continues executing other JavaScript code. It doesn't wait for the I/O operation to complete.
- Callbacks: Once the I/O operation finishes, it triggers a callback function that was originally provided. This callback is then placed into the event queue.
- libuv: Node.js uses a library called libuv, which provides cross-platform asynchronous I/O. libuv handles the underlying operating system's asynchronous capabilities and maintains a thread pool for operations that the OS doesn't support asynchronously (like DNS lookups or file I/O on some systems).
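The ordering described in the steps above can be observed directly with a tiny script. This is a minimal sketch (the messages and the `order` array are illustrative, not part of any API): synchronous code always finishes first, then queued callbacks run.

```javascript
// Event-loop ordering demo: asynchronous callbacks run only after
// the currently executing synchronous code has finished.
const order = [];

setTimeout(() => order.push('timeout'), 0);          // callback goes to the event queue
Promise.resolve().then(() => order.push('promise')); // microtask, runs before timers

order.push('sync'); // synchronous code runs first

setTimeout(() => {
  console.log(order); // [ 'sync', 'promise', 'timeout' ]
}, 10);
```

Even with a delay of 0 milliseconds, the setTimeout callback cannot jump the queue: the call stack must empty first, and promise microtasks drain before timer callbacks.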
Comprehensive Code Examples
Basic Example: Simple HTTP Server
This example demonstrates a basic Node.js HTTP server that responds to requests with 'Hello, Node.js!'. It shows the non-blocking nature as it handles multiple requests without waiting.
const http = require('http');

const hostname = '127.0.0.1';
const port = 3000;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello, Node.js!\n');
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
});
Real-world Example: Asynchronous File Read
This code reads a file asynchronously. Node.js won't block while reading the file; it will continue executing other code and call the provided callback when the file is ready.
const fs = require('fs');

console.log('Start reading file...');

fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  console.log('File content:', data);
});

console.log('Finished initiating file read (program continues executing).');
// This line will execute before the file content is logged, demonstrating non-blocking I/O.
Advanced Usage: Handling Multiple Concurrent Operations with Promises
Using Promises (or async/await) for better management of asynchronous operations, especially when dealing with multiple I/O tasks.
const fs = require('fs').promises;

async function processFiles() {
  try {
    console.log('Starting multiple file reads...');
    const [file1Content, file2Content] = await Promise.all([
      fs.readFile('file1.txt', 'utf8'),
      fs.readFile('file2.txt', 'utf8')
    ]);
    console.log('Content of file1:', file1Content.trim());
    console.log('Content of file2:', file2Content.trim());
    console.log('All files processed.');
  } catch (error) {
    console.error('An error occurred:', error);
  }
}

// Create dummy files for the example
fs.writeFile('file1.txt', 'Hello from File 1!')
  .then(() => fs.writeFile('file2.txt', 'Hello from File 2!'))
  .then(processFiles)
  .catch(err => console.error('Setup error:', err));
Common Mistakes
- Blocking the Event Loop: Performing long-running synchronous operations (e.g., complex calculations without breaking them up, synchronous file I/O in a web server request handler) will block the single event loop, making your application unresponsive. Fix: Always favor asynchronous APIs for I/O operations and break down CPU-intensive tasks into smaller, non-blocking units or offload them to worker threads if necessary.
- Callback Hell (Pyramid of Doom): Nesting callbacks too deeply can make code hard to read and maintain. Fix: Use Promises, async/await, or modularize your code to flatten nested callbacks.
- Ignoring Error Handling in Callbacks: Forgetting to handle errors in asynchronous callbacks can lead to silent failures or unhandled promise rejections. Fix: Always check for an `err` object in the first argument of callbacks or use `try...catch` with `async/await`.
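The first mistake above can be demonstrated in a few lines. This sketch blocks the event loop on purpose (the ~200 ms busy-wait is an arbitrary stand-in for any CPU-bound synchronous work) so a timer scheduled for 0 ms fires much later than requested:

```javascript
// A timer scheduled for 0 ms still cannot run while synchronous code holds the thread.
const scheduledAt = Date.now();
let timerDelay; // filled in when the callback finally runs

setTimeout(() => {
  timerDelay = Date.now() - scheduledAt;
  console.log(`Timer fired after ${timerDelay} ms instead of ~0 ms`);
}, 0);

// Simulate a CPU-bound synchronous task that blocks the event loop for ~200 ms.
const start = Date.now();
while (Date.now() - start < 200) {
  // Busy-wait: no timers, I/O callbacks, or incoming requests can be handled here.
}
console.log('Synchronous work finished; only now can the timer callback run.');
```

In a real server, every connected client would experience that same stall, which is why CPU-heavy work belongs in worker threads or separate processes.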
Best Practices
- Embrace Asynchronous Programming: Understand and leverage Node.js's non-blocking nature. Use `async/await` for cleaner asynchronous code.
- Use Environment Variables: Store configuration settings (like port numbers, database credentials) in environment variables instead of hardcoding them.
- Modularize Your Code: Break down your application into smaller, manageable modules. This improves readability, reusability, and testability.
- Implement Robust Error Handling: Always handle errors gracefully, especially in asynchronous operations. Use `try...catch` and proper error propagation.
- Monitor and Log: Use logging libraries (e.g., Winston, Pino) to gain insight into your application's behavior and performance.
Practice Exercises
- Exercise 1: Simple Timer. Write a Node.js script that prints 'Hello after 3 seconds!' to the console after a 3-second delay, without blocking the main thread.
- Exercise 2: Asynchronous Greeting. Create two functions: `greetUser(username, callback)` and `sayGoodbye(username, callback)`. `greetUser` should simulate an asynchronous operation (e.g., with `setTimeout`) and then call `sayGoodbye` in its callback. `sayGoodbye` should also be asynchronous.
- Exercise 3: Read and Uppercase File. Write a Node.js program that reads the content of a file named `input.txt` asynchronously, converts its content to uppercase, and then prints it to the console.
Mini Project / Task
Build a simple command-line utility that takes a file path as an argument. The utility should read the file, count the number of words in it, and then print the word count to the console. Ensure that the file reading is asynchronous.
Challenge (Optional)
Extend the mini-project. If the file contains JSON data, the utility should parse it and pretty-print it to the console instead of counting words. If it's a plain text file, it should count words. Handle potential errors for file not found or invalid JSON dynamically.
Installation
Node.js is an open-source, cross-platform JavaScript runtime environment that executes JavaScript code outside a web browser. It was created by Ryan Dahl in 2009 and has since grown to become a cornerstone technology for backend development, real-time applications, and microservices. Its event-driven, non-blocking I/O model makes it highly efficient and scalable, perfect for data-intensive real-time applications. From building robust APIs to developing command-line tools and even desktop applications with frameworks like Electron, Node.js has a vast array of use cases. Companies like Netflix, PayPal, and LinkedIn heavily rely on Node.js for their backend infrastructure, demonstrating its real-world applicability and performance capabilities.
The installation process for Node.js is straightforward, offering several methods to suit different operating systems and developer preferences. The primary way to install Node.js is by downloading the official installer from its website, which provides Long Term Support (LTS) versions and current releases. LTS versions are recommended for most users as they are stable and receive long-term maintenance. Current releases, on the other hand, include the latest features and are ideal for those who want to experiment with new functionalities. Beyond direct installers, version managers such as nvm (Node Version Manager) and system package managers such as Homebrew for macOS and apt for Debian/Ubuntu-based Linux distributions offer more flexible installation and version management options; npm (Node Package Manager) is bundled with Node.js itself and manages JavaScript packages rather than Node.js installations.
Step-by-Step Explanation
Installing Node.js can be done through various methods, but the most common and recommended approach for beginners is using the official installer or a version manager.
Method 1: Using the Official Node.js Installer
1. Open your web browser and navigate to the official Node.js website: https://nodejs.org/.
2. On the homepage, you will typically see two download options: 'LTS' (Long Term Support) and 'Current'. For most production environments and general development, the LTS version is highly recommended due to its stability and extended support.
3. Click on the 'LTS' button. This will download the appropriate installer for your operating system (Windows, macOS, or Linux).
4. Once the download is complete, locate the installer file (e.g., .msi for Windows, .pkg for macOS) and run it.
5. Follow the prompts in the installation wizard. Generally, accepting the default settings is sufficient. Click 'Next', agree to the license terms, choose the installation destination, and select the components to install (Node.js runtime, npm package manager, documentation, and optionally, Chocolatey for Windows).
6. Click 'Install' to begin the installation. You might be prompted for administrator privileges.
7. Once the installation is finished, click 'Finish'.
8. To verify the installation, open your terminal or command prompt and type:
node -v
This command should display the installed Node.js version. Then, type:
npm -v
This will show the installed npm version. If both commands return version numbers, Node.js and npm are successfully installed.
Method 2: Using Node Version Manager (nvm)
nvm is a popular tool for managing multiple Node.js versions on a single machine, which is incredibly useful for projects requiring different Node.js environments.
For macOS/Linux:
1. Open your terminal.
2. Install nvm using the curl command:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
(Note: The version number 'v0.39.1' might change; always check the nvm GitHub page for the latest installation script.)
3. After installation, close and reopen your terminal, or source your shell configuration file (e.g., source ~/.bashrc or source ~/.zshrc) for nvm to be available.
4. Verify the nvm installation:
nvm -v
5. Install the latest LTS version of Node.js:
nvm install --lts
6. Use the installed version:
nvm use --lts
7. Verify Node.js and npm:
node -v
npm -v
For Windows:
Use nvm-windows, which is a different project. Download the installer from its GitHub releases page: https://github.com/coreybutler/nvm-windows/releases. Follow the installation steps and then use commands similar to the macOS/Linux version (e.g., nvm install latest, nvm use [version]).
Comprehensive Code Examples
Basic Example: Verifying Installation
After installing Node.js, the first thing you'll do is verify it. This isn't strictly 'code' but a command-line interaction.
# Open your terminal or command prompt
node -v
npm -v
Expected Output (version numbers may vary):
v18.18.0
9.8.1
Real-world Example: Running a Simple Node.js Script
Let's create a simple 'Hello, Node.js!' script. This demonstrates that your Node.js runtime is operational.
1. Create a file named app.js in a directory of your choice.
2. Add the following content to app.js:
// app.js
console.log('Hello, Node.js!');
console.log('Node.js is running successfully!');
3. Save the file.
4. Open your terminal or command prompt, navigate to the directory where you saved app.js, and run:
node app.js
Expected Output:
Hello, Node.js!
Node.js is running successfully!
Advanced Usage: Managing Node.js Versions with nvm
If you need to work on multiple projects that require different Node.js versions, nvm is indispensable.
# Install a specific LTS version
nvm install 16.20.2
# Install the latest stable version
nvm install 20.9.0
# List all installed Node.js versions
nvm ls
# Switch to a specific version
nvm use 16.20.2
# Switch back to another version
nvm use 20.9.0
# Set a default version to use when opening new terminals
nvm alias default 20.9.0
# Uninstall a version
nvm uninstall 16.20.2
Common Mistakes
1. Not restarting the terminal/command prompt: After installing Node.js (especially with an installer or nvm), the system's PATH variable might not be updated immediately. Always close and reopen your terminal or command prompt to ensure the new environment variables are loaded. If not, commands like node -v will fail.
2. Confusing Node.js and npm versions: Node.js and npm are separate entities that are typically bundled together. Running node -v gives the Node.js runtime version, while npm -v gives the package manager version. They don't always have matching major versions, and that's perfectly normal.
3. Permissions issues during global package installation: When installing npm packages globally (e.g., npm install -g <package-name>), users on macOS/Linux sometimes encounter permission errors. A common (but generally discouraged) fix is using sudo. A better solution is to configure npm to install global packages to a user-owned directory, or even better, use nvm, which handles permissions correctly.
Best Practices
1. Use LTS versions for production: Always prefer the Long Term Support (LTS) versions for production applications due to their stability, reliability, and extended maintenance.
2. Utilize a Node Version Manager (nvm): For development, especially if you work on multiple projects, use nvm (or nvm-windows) to easily switch between different Node.js versions. This prevents version conflicts and simplifies project setup.
3. Keep Node.js and npm updated: Regularly update your Node.js and npm installations to benefit from performance improvements, security patches, and new features. With nvm, this is as simple as nvm install --lts and then nvm use --lts.
4. Understand global vs. local npm packages: Install project-specific dependencies locally (npm install <package-name>) and only install tools meant to be used across multiple projects globally (npm install -g <package-name>). This ensures project isolation and avoids dependency conflicts.
Practice Exercises
1. Verify Your Setup: Open your terminal and confirm both Node.js and npm are installed and report their versions. If not, troubleshoot the installation steps.
2. Run a Basic Script: Create a new file called hello.js. Inside it, write console.log('My Node.js environment is ready!');. Save the file and then execute it using Node.js from your terminal.
3. Create a Version Log: If you have nvm installed, list all Node.js versions available on your system. If you only have one, install an older LTS version (e.g., nvm install 16), then list them again.
Mini Project / Task
Create a simple Node.js script that takes your name as an argument from the command line and prints a personalized greeting. For example, if you run node greet.js John, it should output Hello, John! Welcome to Node.js!.
Challenge (Optional)
Write a Node.js script that checks if a specific command-line argument (e.g., --version) is provided. If it is, print the current Node.js version. Otherwise, print a default message like 'No specific command requested.' (Hint: Look into process.argv for accessing command-line arguments).
Node REPL
Node REPL stands for Read-Eval-Print Loop. It is an interactive command-line environment that lets you write JavaScript one line at a time and immediately see the result. It exists to make experimentation fast. Instead of creating a file, saving it, and running node app.js, you can open the REPL with node in the terminal and test ideas instantly. In real life, developers use the REPL to verify syntax, inspect objects, try built-in Node APIs, debug logic, and learn JavaScript behavior quickly. It is especially useful when exploring strings, arrays, math, JSON, timers, modules, and small utility functions.
The REPL follows a simple cycle. First, you type input. Next, Node evaluates the JavaScript. Then it prints the result. Finally, it waits for the next command. Common uses include evaluating expressions, declaring variables, defining functions, and loading modules like fs or path. Node REPL also supports special commands such as .help, .exit, .clear, and .save. These commands are not standard JavaScript. They are REPL features that improve workflow. Another helpful idea is the underscore variable _, which stores the last evaluated result in many REPL sessions, making quick calculations easier.
Step-by-Step Explanation
Start by opening a terminal and typing node. You will see a prompt like >. This means the REPL is ready. Type a JavaScript expression such as 2 + 3 and press Enter. Node evaluates it and prints 5. You can declare variables with const or let, for example const name = 'Sam'. Then type name to inspect the value. You can define a function, call it, and immediately verify the output. To work with Node features, import a module using const os = require('os') and then inspect data such as os.platform(). Use .help to list commands. Use .exit or press Ctrl + C twice to leave the REPL.
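A first session following those steps might look like the transcript below (the > prompt is printed by the REPL, each result appears on the line beneath the input, and the exact os.platform() value depends on your system):

```
$ node
> 2 + 3
5
> const name = 'Sam'
undefined
> name
'Sam'
> const os = require('os')
undefined
> os.platform()
'linux'
> .exit
```

Note that declarations evaluate to undefined; the variable still exists and can be inspected on the next line.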
Comprehensive Code Examples
Basic example
2 + 2
const language = 'Node Js'
language.toUpperCase()
Real-world example
const path = require('path')
path.basename('/users/dev/project/app.js')
path.extname('/users/dev/project/app.js')
Advanced usage
const users = [{ name: 'Ava', score: 80 }, { name: 'Leo', score: 95 }]
const topUser = users.find(user => user.score > 90)
topUser
_
Common Mistakes
- Forgetting REPL commands are not JavaScript: commands like .help only work inside the REPL, not in script files.
- Using browser-only APIs: objects like window or document do not exist in the Node REPL. Use Node globals and modules instead.
- Expecting saved state after exit: variables disappear when the REPL closes unless you save commands manually with .save.
- Confusing syntax errors with runtime behavior: type one expression carefully and read the error output before retrying.
Best Practices
- Use the REPL for quick experiments, not for storing production code.
- Test one idea at a time so results are easy to understand.
- Import built-in modules to explore Node capabilities safely.
- Use .help often until REPL commands become familiar.
- Move useful working code into a .js file once it becomes larger than a few lines.
Practice Exercises
- Open the Node REPL and evaluate five JavaScript expressions using numbers, strings, and booleans.
- Create a variable called userName, assign your name, and display it in uppercase.
- Load the os module and inspect your platform and CPU architecture.
Mini Project / Task
Use the Node REPL to create a tiny score checker. Define an array of student scores, calculate the average, and test whether the class average is above 70.
Challenge (Optional)
In the REPL, load the path module and experiment with at least three different file paths. Determine the filename, extension, and directory name for each input path.
NPM Basics
NPM stands for Node Package Manager. It exists to help developers install, share, update, and manage JavaScript packages for Node.js projects. In real-life development, npm is used when you need external tools such as web frameworks, testing libraries, linters, database clients, or build utilities. It also helps manage project metadata through the package.json file, which acts like a project manifest. When working on backend apps, command-line tools, or full-stack projects, npm is usually one of the first tools you use after installing Node.js.
At its core, npm works with packages and dependencies. A package is reusable code published to the npm registry or stored locally. Dependencies are packages your app needs to run, while devDependencies are packages needed only during development, such as testing or formatting tools. Important npm concepts include package.json, which stores project information and scripts, and package-lock.json, which locks exact package versions for consistent installs. You will also commonly use local installs for project-specific tools and global installs for command-line utilities you want available system-wide.
Step-by-Step Explanation
To start, create a project folder and open a terminal inside it. Run npm init -y to generate a default package.json file quickly. If you want to answer prompts manually, use npm init. To install a package for your app, run npm install express. This adds the package to dependencies. To install a development-only package, run npm install --save-dev nodemon. You can remove packages with npm uninstall package-name.
Npm also supports scripts. Inside package.json, you can define commands such as start or dev. Then run them using npm run script-name. If the script is named start, you may also run npm start. To install all dependencies listed in an existing project, simply run npm install. To check outdated packages, use npm outdated, and to update packages, use npm update.
Comprehensive Code Examples
// Basic example: initialize a project
// terminal
npm init -y
npm install lodash

// Real-world example: package.json with scripts
{
  "name": "notes-api",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "start": "node index.js",
    "dev": "nodemon index.js"
  },
  "dependencies": {
    "express": "^4.19.2"
  },
  "devDependencies": {
    "nodemon": "^3.1.0"
  }
}

// Advanced usage: custom script workflow
{
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js",
    "test": "node test.js",
    "build": "node build.js"
  }
}

// terminal usage
npm run dev
npm run test
Common Mistakes
- Installing everything globally: Beginners often use global installs for packages that should belong to one project. Fix: install app dependencies locally unless a tool truly needs global access.
- Editing dependencies manually without installing: Changing package.json alone can create mismatches. Fix: use npm install commands so both the manifest and lock file stay consistent.
- Deleting package-lock.json carelessly: This can cause version inconsistency across machines. Fix: keep it committed for team projects.
- Forgetting --save-dev for dev tools: This mixes runtime and development packages. Fix: place testing, linting, and watcher tools in devDependencies.
Best Practices
- Run npm init -y at the start of every new Node.js project.
- Use clear script names like start, dev, and test.
- Commit both package.json and package-lock.json to version control.
- Prefer exact team workflows by using local dependencies and npm scripts.
- Review packages before installing to avoid unnecessary or untrusted dependencies.
Practice Exercises
- Create a new folder, initialize npm, and inspect the generated package.json file.
- Install one runtime package and one development package, then identify where each appears in package.json.
- Add start and dev scripts to a project and run both from the terminal.
Mini Project / Task
Set up a simple Node.js project called task-runner. Initialize npm, install one utility package and nodemon, create a basic index.js file, and add scripts for normal execution and development mode.
Challenge (Optional)
Create a small project with at least three npm scripts such as start, dev, and test. Then uninstall and reinstall one dependency to observe how package.json and package-lock.json change.
Package.json
package.json is the central configuration file for a Node.js project. It describes your application or package using JSON and tells Node tools important information such as the project name, version, entry file, scripts, dependencies, license, and metadata. It exists so developers and tools can work with a project in a consistent way. In real life, package.json is used in almost every Node.js application, from simple scripts to enterprise APIs, because it defines how a project starts, installs libraries, runs tests, and is shared with other developers. Common fields include name, version, description, main, scripts, dependencies, devDependencies, type, and engines. The dependencies section stores packages needed in production, while devDependencies stores tools used during development, such as testing or formatting libraries. The scripts section lets you define commands like start, test, and build so your team uses the same workflow.
Step-by-Step Explanation
You can create package.json by running npm init or npm init -y. The file must contain valid JSON, which means keys and string values use double quotes and there are no trailing commas. A simple file starts with name and version. The main field points to the main file, often index.js. Under scripts, each key becomes a command you can run with npm run commandName, except special commands like npm start and npm test. When you install a package using npm install express, npm automatically adds it to dependencies. If you install a development tool with npm install -D nodemon, it goes to devDependencies. If your project uses ECMAScript modules, add "type": "module" so Node treats .js files as ES modules.
Comprehensive Code Examples
{
  "name": "my-node-app",
  "version": "1.0.0",
  "main": "index.js"
}

{
  "name": "notes-api",
  "version": "1.0.0",
  "description": "A simple backend API",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "dev": "nodemon server.js",
    "test": "node test.js"
  },
  "dependencies": {
    "express": "^4.19.2"
  },
  "devDependencies": {
    "nodemon": "^3.1.0"
  }
}

{
  "name": "modern-service",
  "version": "2.0.0",
  "type": "module",
  "private": true,
  "engines": {
    "node": ">=18"
  },
  "scripts": {
    "start": "node src/index.js",
    "dev": "node --watch src/index.js",
    "lint": "eslint ."
  },
  "dependencies": {
    "dotenv": "^16.4.5"
  },
  "devDependencies": {
    "eslint": "^9.0.0"
  }
}
Common Mistakes
- Invalid JSON syntax: Using single quotes, comments, or trailing commas breaks the file. Fix it by using strict JSON formatting.
- Putting all packages in dependencies: Tools like nodemon and eslint should usually be in devDependencies.
- Wrong main or script paths: If server.js does not exist, npm start will fail. Verify file names and folders carefully.
- Forgetting "type": "module": Using import syntax without setting type can cause module errors.
Best Practices
- Use meaningful project names and keep versions updated with semantic versioning.
- Add useful scripts such as start, dev, and test for consistent team workflows.
- Separate runtime packages from development tools.
- Set private to true if the project should not be published accidentally.
- Specify supported Node versions with engines for better compatibility.
Practice Exercises
- Create a new Node Js project with npm init -y and edit the generated package.json to add a description and an author field.
- Add a start script that runs index.js and a dev script for a development workflow.
- Install one production package and one development package, then identify where each appears in package.json.
Mini Project / Task
Set up package.json for a small Express server with scripts for start and development mode, a correct main entry, one runtime dependency, and one development dependency.
Challenge (Optional)
Design a package.json for a private Node Js API project that uses ES modules, enforces Node 18 or higher, and includes scripts for start, dev, and lint.
Modules
Modules are one of the most important features in Node Js because they let you split your application into smaller, reusable files. Instead of writing all your code in one giant file, you can place related logic into separate modules such as authentication, database access, math helpers, logging, or configuration. This makes applications easier to read, test, debug, and maintain. In real projects, modules are used everywhere: an e-commerce app may have separate modules for products, orders, payments, and users; an API may have route modules, controller modules, and utility modules.
Node Js supports two major module systems: CommonJS and ES Modules. CommonJS uses require() to import code and module.exports or exports to share code. This has been the traditional Node.js approach for many years. ES Modules use import and export, which follow modern JavaScript standards and are common in frontend and backend development. Node also includes many built-in modules such as fs for files, path for file paths, http for servers, and os for system information. You can also create your own custom modules or install third-party modules from npm.
Understanding modules is essential because they improve structure and help teams collaborate. A well-designed module should do one job clearly and expose only what other parts of the program need.
Step-by-Step Explanation
In CommonJS, you create reusable code in one file and export it. Another file can load it with require(). For example, if a file contains a function, you export that function using module.exports = myFunction. Then another file imports it with const myFunction = require('./fileName').
In ES Modules, you export values using export or export default, and import them with import. To use ES Modules in Node Js, you usually set "type": "module" in package.json or use the .mjs extension.
There are several common kinds of modules in Node Js: built-in modules provided by Node, custom modules you create, and third-party modules installed from npm. A beginner should first learn how to create custom modules, then use built-in modules, and finally combine many modules into a larger application.
Comprehensive Code Examples
Basic example
// math.js
function add(a, b) {
return a + b;
}
module.exports = add;
// app.js
const add = require('./math');
console.log(add(2, 3));
Real-world example
// logger.js
function logMessage(message) {
console.log(`[LOG] ${message}`);
}
module.exports = { logMessage };
// server.js
const http = require('http');
const { logMessage } = require('./logger');
const server = http.createServer((req, res) => {
logMessage(`Request received: ${req.url}`);
res.end('Hello from Node Js');
});
server.listen(3000);
Advanced usage
// utils.mjs
export function formatCurrency(value) {
return `$${value.toFixed(2)}`;
}
export function applyDiscount(price, percent) {
return price - (price * percent / 100);
}
// app.mjs
import { formatCurrency, applyDiscount } from './utils.mjs';
const finalPrice = applyDiscount(100, 15);
console.log(formatCurrency(finalPrice));
Common Mistakes
- Using the wrong path: Writing require('math') instead of require('./math') for local files. Use ./ for files in the same folder.
- Mixing CommonJS and ES Module syntax incorrectly: Do not use import in a file configured for CommonJS unless your project supports ES Modules.
- Forgetting to export: If you define a function but do not export it, other files cannot use it. Always check your export statement.
- Exporting incorrectly with exports: Reassigning exports directly can break the link. Prefer module.exports when exporting a single value.
Best Practices
- Create small modules with one clear responsibility.
- Use descriptive file names like auth.js, db.js, or emailService.js.
- Group related modules into folders for better project structure.
- Prefer named exports when sharing multiple utilities.
- Keep sensitive values such as API keys out of modules and use environment variables instead.
- Document what each module exports so other developers can use it correctly.
Practice Exercises
- Create a custom module named greetings.js that exports a function to greet a user by name. Import it into another file and run it.
- Use the built-in os module to print your system platform and CPU architecture.
- Create a module that exports two math functions, then import and use both in a separate file.
Mini Project / Task
Build a small utility-based Node Js app with three modules: one for logging messages, one for formatting dates, and one main file that uses both modules to print a timestamped log message.
Challenge (Optional)
Create a modular command-line expense tracker where one module handles data storage, one handles calculations, and one handles user output. Organize the code so each module has a single purpose.
System Modules
System modules are the built-in modules that come with Node Js. They exist so developers can perform common backend tasks without installing external libraries. In real projects, these modules are used to read and write files, work with folders, inspect the operating system, manage environment variables, build command-line tools, create events, and process data streams efficiently. Because they are part of Node Js itself, they are stable, fast, and widely used in production applications. Common system modules include fs for file handling, path for working with file paths, os for system information, events for event-driven programming, and process for runtime control. Another important distinction is how modules are imported: CommonJS uses require(), while modern Node Js projects may use ES Modules with import. Understanding built-in modules helps beginners write practical programs immediately and also teaches how Node Js interacts with the computer underneath the application layer.
Step-by-Step Explanation
To use a system module, first import it into your file. In CommonJS, write const fs = require('fs'). In ES Modules, write import fs from 'fs'. After importing, call its methods. For example, the fs module has synchronous and asynchronous methods. Synchronous methods block execution until they finish, while asynchronous methods allow the program to continue and are preferred in most backend applications. The path module helps create safe paths across Windows, macOS, and Linux using methods like join() and basename(). The os module returns values such as CPU architecture, free memory, and platform. The events module allows you to define custom events and listeners. The process object is global and gives access to command-line arguments, environment variables, and application exit handling. A beginner should start by importing one module, calling one method, printing the result, and then gradually combining modules to build useful scripts.
Comprehensive Code Examples
Basic example
const os = require('os');
console.log('Platform:', os.platform());
console.log('Home Directory:', os.homedir());
Real-world example
const fs = require('fs');
const path = require('path');
const filePath = path.join(__dirname, 'notes.txt');
fs.writeFile(filePath, 'Node Js system modules are useful.', (err) => {
if (err) return console.error('Write error:', err);
fs.readFile(filePath, 'utf8', (err, data) => {
if (err) return console.error('Read error:', err);
console.log('File content:', data);
});
});
Advanced usage
const EventEmitter = require('events');
class AppLogger extends EventEmitter {}
const logger = new AppLogger();
logger.on('log', (message) => {
console.log('LOG:', message);
});
process.on('exit', () => {
console.log('Application is shutting down');
});
logger.emit('log', 'Server started');
console.log('Args:', process.argv.slice(2));
Common Mistakes
- Using the wrong import style: Mixing require and import incorrectly causes module errors. Use the style supported by your project configuration.
- Ignoring async behavior: Beginners often expect asynchronous fs methods to return data immediately. Use callbacks, promises, or async/await properly.
- Hardcoding file paths: Writing paths as plain strings can break across operating systems. Use path.join().
- Forgetting error handling: File and process operations can fail. Always check errors in callbacks or catch promise rejections.
Best Practices
- Prefer asynchronous methods for file and network-related work.
- Use built-in modules before adding third-party dependencies.
- Keep module usage focused and organized by responsibility.
- Use path utilities for cross-platform compatibility.
- Validate command-line input when using process.argv.
Practice Exercises
- Create a script that uses the os module to print the platform, CPU architecture, and total memory.
- Use the fs and path modules to create a text file and then read its contents.
- Build a small event system using EventEmitter that prints a message when a custom event is triggered.
Mini Project / Task
Build a Node Js system info tool that prints the current operating system, current working directory, command-line arguments, and saves the results into a log file using built-in modules only.
Challenge (Optional)
Create a command-line script that accepts a filename as an argument, checks whether it exists, reads its content if present, and emits a custom success or error event depending on the result.
File System
The Node Js File System module, usually imported with require('fs'), gives your application the ability to interact with files and folders on the operating system. It exists because backend programs often need to persist data outside memory, load configuration files, store logs, process uploaded content, and generate reports. In real life, a web server may save user-uploaded images, a billing system may export invoices as text or CSV files, and a monitoring tool may append events to log files. Node Js supports both synchronous and asynchronous file operations. Synchronous methods block execution until work finishes, which can be acceptable for small scripts or startup tasks. Asynchronous methods are preferred in servers because they avoid blocking the event loop and let the program continue handling other work. The module also includes promise-based access through fs.promises, which is often the cleanest approach with async/await. Common operations include reading files with readFile, writing with writeFile, appending with appendFile, deleting with unlink, creating folders with mkdir, and listing directory contents with readdir. You will also commonly pair fs with path so file paths work correctly across operating systems.
Step-by-Step Explanation
Start by importing the module: const fs = require('fs') or const fs = require('fs').promises for promise-based methods. To read text, call readFile(path, 'utf8', callback) or await the promise version. The path is the file location, the encoding converts bytes into readable text, and the callback receives an error and data. To write text, use writeFile; this creates the file if it does not exist and overwrites it if it does. If you want to add content without replacing existing data, use appendFile. For directories, mkdir creates a folder and can create nested folders with recursive: true. To inspect a folder, use readdir. To delete a file, use unlink. Always handle errors because missing files, permission problems, or invalid paths are common. In professional code, prefer absolute or joined paths using path.join(__dirname, ...) rather than hard-coded separators.
Comprehensive Code Examples
const fs = require('fs');
fs.readFile('notes.txt', 'utf8', (err, data) => {
if (err) {
console.error(err.message);
return;
}
console.log(data);
});

const fs = require('fs').promises;
const path = require('path');
async function saveLog() {
const filePath = path.join(__dirname, 'logs', 'app.log');
await fs.mkdir(path.join(__dirname, 'logs'), { recursive: true });
await fs.appendFile(filePath, 'Server started\n');
console.log('Log saved');
}
saveLog().catch(err => console.error(err.message));

const fs = require('fs').promises;
const path = require('path');
async function backupConfig() {
const source = path.join(__dirname, 'config.json');
const target = path.join(__dirname, 'backup', 'config-backup.json');
await fs.mkdir(path.join(__dirname, 'backup'), { recursive: true });
const content = await fs.readFile(source, 'utf8');
const config = JSON.parse(content);
config.backupCreatedAt = new Date().toISOString();
await fs.writeFile(target, JSON.stringify(config, null, 2));
console.log('Backup complete');
}
backupConfig().catch(err => console.error(err.message));
Common Mistakes
- Using synchronous methods in a busy server: Replace readFileSync and writeFileSync with asynchronous or promise-based methods.
- Forgetting encoding when reading text: Without 'utf8', you may get a Buffer instead of a string.
- Ignoring errors: Always check errors in callbacks and wrap awaited calls in try/catch.
- Hard-coding file paths: Use path.join for cross-platform compatibility.
Best Practices
- Prefer fs.promises with async/await for readable code.
- Create directories safely with recursive: true before writing files into them.
- Validate user-provided file names to avoid unsafe path access.
- Use append operations for logs instead of overwriting files.
- Keep file handling logic modular so it can be tested easily.
Practice Exercises
- Create a script that reads a text file named message.txt and prints its contents to the console.
- Write a program that creates a folder named reports and saves a file called daily.txt inside it.
- Build a script that appends the current date and time to a file named activity.log each time it runs.
Mini Project / Task
Build a simple notes manager that can create a notes folder, save a note to a text file, read an existing note, and delete a note by file name.
Challenge (Optional)
Create a directory scanner that lists all files in a folder and writes their names, sizes, and last modified dates into a summary report file.
Path Module
The path module is a built-in Node Js module used to work with file and folder paths safely and consistently. In real applications, paths are used when reading files, serving static assets, saving uploads, generating logs, and building project directory structures. Instead of manually joining strings like "src/" + "config/" + "app.json", the path module handles separators correctly for different operating systems such as Windows and Linux. This matters because Windows uses backslashes while many other systems use forward slashes.
The module exists to make path operations reliable, readable, and cross-platform. Common methods include join(), resolve(), basename(), dirname(), extname(), and parse(). You will also often use special values like __dirname together with the path module to locate files relative to your current script. A beginner should understand that the path module does not check whether a file really exists. It only helps you create, normalize, and inspect path strings.
There are several important path ideas. Absolute paths start from the root of the system, while relative paths depend on the current working directory. path.join() combines parts into a clean path. path.resolve() creates an absolute path, often based on the current working directory. path.basename() returns the last part of a path, path.dirname() returns the folder path, and path.extname() gives the file extension. path.parse() breaks a path into an object, and path.format() builds a path from an object.
Step-by-Step Explanation
First, import the module using require('path'). Then call its methods based on your need. Use join() when combining folder and file names. Use resolve() when you need a full absolute path. Use basename(), dirname(), and extname() when extracting parts from a path.
Syntax basics:
const path = require('path');
path.join('folder', 'file.txt');
path.resolve('folder', 'file.txt');
path.basename('/users/docs/report.pdf');
path.dirname('/users/docs/report.pdf');
path.extname('/users/docs/report.pdf');
Comprehensive Code Examples
const path = require('path');
const filePath = path.join('documents', 'notes', 'todo.txt');
console.log(filePath);

const path = require('path');
const configPath = path.join(__dirname, 'config', 'app.json');
console.log('Config file path:', configPath);

const path = require('path');
const fullPath = '/projects/node-app/public/images/logo.png';
console.log(path.basename(fullPath));
console.log(path.dirname(fullPath));
console.log(path.extname(fullPath));
console.log(path.parse(fullPath));
const rebuilt = path.format({
dir: '/projects/node-app/public/images',
name: 'logo',
ext: '.png'
});
console.log(rebuilt);
Common Mistakes
- Manually concatenating paths: Writing folder names with + can break on different operating systems. Use path.join().
- Confusing relative and absolute paths: A relative path may fail depending on where the command is run. Use __dirname with path.join() for stable file locations.
- Thinking path checks file existence: The path module only handles strings. Use the fs module if you need to check whether a file exists.
- Using resolve() and expecting simple joining: resolve() returns an absolute path and processes from right to left. Use join() when simple combination is enough.
Best Practices
- Use path.join(__dirname, ...) for project files accessed from code.
- Prefer built-in path methods over hardcoded separators like / or \.
- Use parse() when you need multiple path parts at once.
- Keep paths readable by storing repeated base directories in variables.
- Be careful with user-provided path input in backend apps to avoid unsafe file access patterns.
Practice Exercises
- Create a script that joins assets, images, and photo.jpg into one path and prints it.
- Write a program that takes a file path string and prints its directory name, base name, and extension.
- Build a script that creates an absolute path to a logs/app.log file using __dirname.
Mini Project / Task
Create a file path utility script for a small Node Js project that builds paths for config, public, and uploads folders, then prints file details for a chosen filename.
Challenge (Optional)
Write a script that accepts several path strings in an array, parses each one, and groups the files by extension such as .txt, .js, and .json.
OS Module
The Node Js os module provides information about the operating system where your application is running. It exists so developers can inspect system details such as CPU architecture, memory, hostname, home directory, platform, temporary folders, network interfaces, and system uptime without relying on external packages. In real projects, this is useful for diagnostics tools, deployment-aware scripts, monitoring utilities, CLI apps, logging, and environment-specific behavior. For example, a backend service may log memory usage and hostname for troubleshooting, while a setup script may store files in the correct temp directory depending on whether the app runs on Windows, Linux, or macOS.
The module is built into Node Js, so no installation is needed. You load it with require('os'). Common methods include os.platform(), os.arch(), os.cpus(), os.totalmem(), os.freemem(), os.hostname(), os.homedir(), os.tmpdir(), os.uptime(), and os.networkInterfaces(). Some return simple strings or numbers, while others return arrays or objects. A key idea for beginners is that the os module mostly gives system information; it does not manage processes like the process module and does not manipulate files like the fs module.
Step-by-Step Explanation
First, import the module using const os = require('os');. Then call a method based on the type of data you need. If you want the operating system name, use os.platform(). If you want RAM values, use os.totalmem() and os.freemem(). These memory values are returned in bytes, so divide by 1024 * 1024 * 1024 to convert to gigabytes. For CPU details, os.cpus() returns an array of objects, one per core. For network details, os.networkInterfaces() returns a nested object, so you often loop through it. Always read the returned type before using it, because some methods return strings and others return complex structures.
Comprehensive Code Examples
const os = require('os');
console.log('Platform:', os.platform());
console.log('Architecture:', os.arch());
console.log('Hostname:', os.hostname());
console.log('Home Directory:', os.homedir());

const os = require('os');
const totalMemory = (os.totalmem() / 1024 / 1024 / 1024).toFixed(2);
const freeMemory = (os.freemem() / 1024 / 1024 / 1024).toFixed(2);
console.log(`Total RAM: ${totalMemory} GB`);
console.log(`Free RAM: ${freeMemory} GB`);
console.log(`System Uptime: ${os.uptime()} seconds`);

const os = require('os');
const interfaces = os.networkInterfaces();
for (const name in interfaces) {
for (const details of interfaces[name]) {
if (details.family === 'IPv4' && !details.internal) {
console.log(`${name}: ${details.address}`);
}
}
}

const os = require('os');
function getSystemReport() {
return {
hostname: os.hostname(),
platform: os.platform(),
cpuCores: os.cpus().length,
totalMemoryGB: (os.totalmem() / 1024 / 1024 / 1024).toFixed(2),
freeMemoryGB: (os.freemem() / 1024 / 1024 / 1024).toFixed(2),
tempDirectory: os.tmpdir()
};
}
console.log(getSystemReport());
Common Mistakes
- Forgetting to import the module: Fix by adding const os = require('os'); before calling methods.
- Misreading memory values: os.totalmem() and os.freemem() return bytes, not MB or GB. Convert the values properly.
- Assuming one network format: os.networkInterfaces() differs across systems. Loop carefully and check family and internal.
Best Practices
- Use the os module for environment-aware logic, but keep platform-specific code minimal.
- Convert raw byte values into readable units before displaying them to users.
- Wrap system reporting in helper functions so the rest of your application stays clean.
- Use system data mainly for logging, monitoring, and configuration instead of business logic.
Practice Exercises
- Create a script that prints the hostname, platform, and architecture of your machine.
- Write a program that shows total memory and free memory in GB with two decimal places.
- Build a script that lists all non-internal IPv4 addresses from available network interfaces.
Mini Project / Task
Build a small "system info reporter" CLI tool that prints hostname, OS platform, CPU core count, free memory, total memory, uptime, and temporary directory in a readable format.
Challenge (Optional)
Create a reusable function that returns a formatted health summary and warns when free memory drops below a percentage threshold you define.
Events
Events are one of the most important ideas in Node Js because the platform is built around an event-driven architecture. An event is simply a signal that something happened. For example, a file finished loading, a user made a request, a timer completed, or data arrived from a stream. Instead of constantly checking whether something has happened, Node Js lets your program listen for events and react when they occur. This makes applications more efficient, especially for servers that handle many tasks at the same time. In real life, events are used in HTTP servers, file streams, sockets, chat apps, logging systems, background jobs, and custom application workflows. The main tool for working with events in Node Js is the EventEmitter class from the events module. It allows you to create named events, attach listener functions, and trigger those listeners when needed.
The core concepts are simple but powerful. A listener is a function that runs when an event occurs. The emit method triggers an event. The on method registers a listener that runs every time the event is emitted. The once method registers a listener that runs only the first time. The off or removeListener method removes a listener. Events can also pass data to listeners, which is how different parts of an application communicate. Node Js itself uses events heavily. For example, streams emit events like data, end, and error. HTTP request and response objects also depend on event handling. Understanding this topic helps you read Node Js APIs more confidently and design your own modular systems.
Step-by-Step Explanation
First, import EventEmitter from the built-in events module. Next, create an emitter object, either directly with new EventEmitter() or by extending the class. Then register listeners using on or once. Each listener is connected to an event name such as login or orderCreated. Finally, trigger the event with emit. If needed, pass extra values after the event name, and Node Js will send them as arguments to the listener function. This pattern separates the part that announces an action from the part that responds to it, which keeps code cleaner and easier to maintain.
Comprehensive Code Examples
const EventEmitter = require('events');
const emitter = new EventEmitter();
emitter.on('greet', () => {
console.log('Hello from an event listener');
});
emitter.emit('greet');

const EventEmitter = require('events');
const emitter = new EventEmitter();
emitter.on('userRegistered', (username, email) => {
console.log(`New user: ${username}, email: ${email}`);
console.log('Send welcome email');
});
emitter.emit('userRegistered', 'Amina', '[email protected]');

const EventEmitter = require('events');
class OrderSystem extends EventEmitter {}
const orders = new OrderSystem();
orders.once('orderPlaced', (id) => {
console.log(`First order received: ${id}`);
});
const logOrder = (id, total) => {
console.log(`Order ${id} total: $${total}`);
};
orders.on('orderPlaced', logOrder);
orders.emit('orderPlaced', 101, 49.99);
orders.emit('orderPlaced', 102, 79.50);
orders.off('orderPlaced', logOrder);
orders.emit('orderPlaced', 103, 19.00);
Common Mistakes
- Forgetting to listen before emitting: If you call emit before adding listeners, nothing will happen. Register listeners first.
- Ignoring error events: Some emitters use an error event. If you do not handle it, the app may crash. Add an error listener when needed.
- Adding too many listeners: Repeatedly attaching listeners inside loops or requests can cause memory warnings. Reuse listeners or remove them when finished.
- Using the wrong event name: Event names are exact strings. A typo means the listener will never run.
Best Practices
- Use clear event names such as userCreated or paymentFailed.
- Keep listeners focused on one responsibility.
- Use once for one-time actions like setup confirmations.
- Pass meaningful data when emitting events so listeners have context.
- Document custom events in larger applications to help teams understand system behavior.
Practice Exercises
- Create an emitter with an event named start that prints a message when triggered.
- Create an event named message that accepts a text value and logs it.
- Use once to build an event that runs only on the first button-like trigger in your script.
Mini Project / Task
Build a simple order notification system where placing an order emits an event, logs the order details, and triggers a second listener that simulates sending a confirmation message.
Challenge (Optional)
Create a custom class that extends EventEmitter for a quiz app. Emit events for correctAnswer, wrongAnswer, and quizCompleted, and attach separate listeners for score tracking and final reporting.
EventEmitter
EventEmitter is a built-in Node.js class used to create and handle custom events. It exists because many backend systems are event-driven: something happens, and other parts of the program react to it. For example, a server may emit an event when a user connects, a payment service may emit an event when a transaction succeeds, or a logger may emit an event when an error occurs. Instead of tightly connecting one function to another, EventEmitter allows a cleaner publish-and-listen pattern. In real applications, it is used in streams, HTTP servers, websockets, job queues, notifications, and custom application workflows. The main idea is simple: an object emits an event name, and one or more listeners run when that event occurs.
The most common concepts are the emitter, event names, listeners, and listener management methods. An emitter is an instance of the EventEmitter class. Event names are usually strings such as 'login' or 'orderCreated'. Listeners are callback functions attached with methods like on() and once(). The on() method runs every time the event is emitted, while once() runs only the first time. You can trigger an event using emit(), remove listeners with off() or removeListener(), and inspect listeners if needed. Node also treats the 'error' event specially, so it should usually have a handler to avoid crashes.
Step-by-Step Explanation
First, import the class from the built-in events module. Then create an instance. Next, register a listener using on(eventName, callback). Finally, trigger the event with emit(eventName, data). The callback receives any extra values passed to emit(). If you want a listener to run only one time, use once() instead of on(). If your application no longer needs a listener, remove it to keep memory usage clean. Event names should be descriptive and consistent so larger applications remain understandable.
Comprehensive Code Examples
Basic example
const EventEmitter = require('events');
const emitter = new EventEmitter();
emitter.on('greet', (name) => {
console.log(`Hello, ${name}!`);
});
emitter.emit('greet', 'Ava');
Real-world example
const EventEmitter = require('events');
const orderSystem = new EventEmitter();
orderSystem.on('orderPlaced', (order) => {
console.log(`Order received: ${order.id}`);
});
orderSystem.on('orderPlaced', (order) => {
console.log(`Send confirmation email to ${order.email}`);
});
orderSystem.emit('orderPlaced', { id: 101, email: '[email protected]' });
Advanced usage
const EventEmitter = require('events');
class ChatRoom extends EventEmitter {
join(user) {
this.emit('join', user);
}
message(user, text) {
this.emit('message', { user, text, time: new Date() });
}
}
const room = new ChatRoom();
room.once('join', (user) => {
console.log(`${user} joined for the first time`);
});
room.on('message', (data) => {
console.log(`[${data.time.toISOString()}] ${data.user}: ${data.text}`);
});
room.join('Lina');
room.join('Lina');
room.message('Lina', 'Hello everyone');
Common Mistakes
- Forgetting to import from events: Always require or import EventEmitter before using it.
- Using different event names: Emitting 'userLogin' while listening for 'login' means nothing runs. Keep names consistent.
- Ignoring the 'error' event: Add an error listener when your emitter may emit errors.
- Adding too many listeners: Remove unused listeners to avoid memory warnings and unexpected repeated execution.
Best Practices
- Use clear event names like 'fileUploaded' or 'paymentFailed'.
- Keep listeners small and focused on one responsibility.
- Use once() for one-time setup, initialization, or first-connection logic.
- Document what data is passed with each event so other developers can use it safely.
- Handle errors explicitly when events represent risky operations.
Practice Exercises
- Create an emitter with an event named 'welcome' that prints a username when triggered.
- Build an emitter that listens for 'taskComplete' and logs the completed task title.
- Use once() to create an event called 'start' that only prints a message the first time it is emitted.
Mini Project / Task
Build a simple notification center using EventEmitter. Create events such as 'newUser', 'newMessage', and 'systemAlert', then attach listeners that print different notification messages to the console.
Challenge (Optional)
Create a custom class that extends EventEmitter for a ticket booking system. Emit events for booking created, booking canceled, and booking failed, and make sure each event sends structured data to multiple listeners.
Streams
Streams in Node Js are objects used to read, write, or transform data piece by piece instead of loading everything into memory at once. This exists because many backend tasks deal with large files, network responses, logs, uploads, videos, or database exports that may be too big or inefficient to handle as one giant block. In real applications, streams are used when sending files to browsers, processing CSV files, compressing data, proxying API responses, or reading logs continuously. Node Js provides four main stream types: Readable streams for consuming data, Writable streams for sending data somewhere, Duplex streams that can read and write, and Transform streams that modify data while passing it along. A readable stream emits chunks, and a writable stream accepts chunks. This chunk-based design improves memory usage and often increases performance. Streams also support backpressure, which is the mechanism that prevents fast producers from overwhelming slow consumers. That is one of the biggest reasons streams are important in production systems.
Step-by-Step Explanation
To use streams, you usually import fs or stream. A readable stream can be created with fs.createReadStream(), and a writable stream with fs.createWriteStream(). Readable streams commonly emit data, end, and error events. Writable streams use methods such as write() and end(). A simple flow is: create a readable stream, listen for chunks, and send those chunks to a writable stream. A better approach is often pipe(), which automatically moves data from a readable stream into a writable stream. For advanced work, use the pipeline() utility because it handles cleanup and errors more safely. Transform streams sit between input and output and change the data, such as converting text to uppercase or compressing content.
Comprehensive Code Examples
const fs = require('fs');
const reader = fs.createReadStream('input.txt', { encoding: 'utf8' });
reader.on('data', chunk => {
console.log('Chunk:', chunk);
});
reader.on('end', () => {
console.log('Finished reading file');
});
reader.on('error', err => {
console.error(err.message);
});
const fs = require('fs');
const reader = fs.createReadStream('large-video.mp4');
const writer = fs.createWriteStream('copy-video.mp4');
reader.pipe(writer);
writer.on('finish', () => {
console.log('File copied successfully');
});
const { Transform, pipeline } = require('stream');
const fs = require('fs');
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
callback(null, chunk.toString().toUpperCase());
}
});
pipeline(
fs.createReadStream('notes.txt'),
upperCaseTransform,
fs.createWriteStream('notes-upper.txt'),
err => {
if (err) console.error('Pipeline failed:', err.message);
else console.log('Pipeline completed');
}
);
Common Mistakes
- Reading huge files with readFile() instead of streams: use streams for large files to avoid high memory usage.
- Ignoring error events: always handle error or use pipeline() for safer stream chaining.
- Manually writing chunks without ending the stream: call end() when no more data will be written.
- Not understanding chunk behavior: a chunk may contain partial data, so do not assume one chunk equals one line or one object.
Best Practices
- Use pipe() for simple readable-to-writable flows.
- Prefer pipeline() for production code because it manages failures better.
- Set encoding only when you need strings; otherwise work with buffers for binary data.
- Use streams for uploads, downloads, logs, compression, and large data processing.
- Keep transformations small and focused when building custom transform streams.
Practice Exercises
- Create a readable stream that prints the contents of a text file chunk by chunk.
- Copy one file into another using createReadStream() and createWriteStream().
- Build a transform stream that converts all incoming text to lowercase and saves it to a new file.
Mini Project / Task
Build a small log processor that reads a server log file as a stream, converts every line to uppercase using a transform stream, and writes the result into a new output file.
Challenge (Optional)
Create a Node Js script that reads a large CSV file as a stream, counts how many rows it contains without loading the full file into memory, and prints the final count.
Buffers
Buffers in Node Js are objects designed to store raw binary data directly in memory. They exist because JavaScript was originally built for text and browser interaction, not low-level byte manipulation. In backend systems, however, applications often work with files, images, video streams, TCP packets, compressed data, and cryptographic values. Buffers solve this gap by giving Node Js a reliable way to handle bytes efficiently. In real life, you use Buffers when reading a file from disk, receiving data from a network socket, converting text into bytes, or processing uploaded images. A Buffer stores fixed-size sequences of bytes, where each byte is a number from 0 to 255. Common operations include creating buffers, reading and writing values, converting between strings and binary data, slicing data, and combining chunks. Node Js provides methods such as Buffer.from(), Buffer.alloc(), and Buffer.concat().
Step-by-Step Explanation
Use Buffer.from() when you want to create a buffer from a string, array, or existing binary source. Use Buffer.alloc(size) to create a safe buffer filled with zeros. This is preferred when you know the required size. Each position in a buffer is called an index, and you can access it like an array using buffer[0]. To convert text to bytes, pass a string and optional encoding like utf8 or hex. To convert bytes back to readable text, call buffer.toString(). You can also join multiple buffers with Buffer.concat(), which is very useful when collecting streamed data chunk by chunk.
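The conversions described above can be sketched with a quick encoding round-trip; the strings here are arbitrary examples:

```javascript
// String -> bytes, with an explicit encoding.
const utf8Buf = Buffer.from('Node Js', 'utf8');

// The same bytes rendered as different textual representations.
console.log(utf8Buf.toString('hex'));    // hexadecimal view of the bytes
console.log(utf8Buf.toString('base64')); // base64 view of the bytes

// Round-trip: decode the hex string back into the original text.
const hex = utf8Buf.toString('hex');
const decoded = Buffer.from(hex, 'hex').toString('utf8');
console.log(decoded); // 'Node Js'
```

Passing the wrong encoding to Buffer.from() or toString() produces garbage rather than an error, which is why the encoding must match the data source.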
Comprehensive Code Examples
const buf = Buffer.from('Hello');
console.log(buf);
console.log(buf.toString());
console.log(buf[0]);
const fs = require('fs');
fs.readFile('notes.txt', (err, data) => {
if (err) throw err;
console.log('Raw bytes length:', data.length);
console.log('Text:', data.toString('utf8'));
});
const part1 = Buffer.from('Node ');
const part2 = Buffer.from('Js');
const combined = Buffer.concat([part1, part2]);
console.log(combined.toString());
const fixed = Buffer.alloc(4);
fixed[0] = 65;
fixed[1] = 66;
fixed[2] = 67;
fixed[3] = 68;
console.log(fixed.toString());
The first example creates a buffer from a string. The second shows a real-world case where file content arrives as a buffer. The third demonstrates combining chunks and manually writing byte values into allocated memory.
Common Mistakes
- Mistake: Treating a buffer like plain text without conversion. Fix: Use toString() with the correct encoding.
- Mistake: Using the wrong encoding such as reading UTF-8 data as hex. Fix: Match the encoding to the data source.
- Mistake: Forgetting that buffer size is fixed. Fix: Allocate the needed size first or create a new buffer when necessary.
- Mistake: Assuming streamed chunks arrive as one complete message. Fix: Collect chunks and merge them with Buffer.concat().
Best Practices
- Use Buffer.alloc() for predictable, safe initialization.
- Convert buffers to strings only when needed to avoid unnecessary processing.
- Be explicit about encoding when exchanging text data.
- Use buffers for binary work such as file uploads, streams, hashes, and protocols.
- Validate buffer length before reading specific indexes or ranges.
Practice Exercises
- Create a buffer from the text 'Backend' and print its string value and byte length.
- Allocate a buffer of size 5, store byte values for the letters A to E, and display the final text.
- Create two separate buffers containing 'Node' and 'Buffer', combine them, and print the result.
Mini Project / Task
Build a small script that reads a text file as a buffer, prints the total number of bytes, then converts and displays the content as readable text.
Challenge (Optional)
Write a script that receives several text chunks in an array, converts each chunk into a buffer, combines them into one final buffer, and prints both the byte length and final message.
Asynchronous Programming
Asynchronous programming in Node Js is the way Node handles tasks that take time without blocking the entire application. Instead of waiting for one operation to finish before moving to the next, Node can start a task such as reading a file, calling an API, querying a database, or waiting on a timer, and continue running other code in the meantime. This matters because backend applications often deal with slow external operations. In real life, asynchronous programming is used in chat apps, payment systems, file uploads, web APIs, background jobs, and dashboards that gather data from multiple services.
In Node Js, common asynchronous styles include callbacks, Promises, and async/await. A callback is a function passed into another function to run later when the task finishes. Promises provide a cleaner way to represent future success or failure using .then() and .catch(). Async/await is built on Promises and makes asynchronous code look more like normal step-by-step code. You will also see event-driven behavior in Node, where code reacts when something happens, such as a request arriving or a stream receiving data. Understanding these patterns is essential because most Node Js APIs are asynchronous by design.
Step-by-Step Explanation
Start by thinking of asynchronous code as work that is scheduled. When you call setTimeout(), Node does not pause the whole program. It registers the timer and continues. When the time is complete, the callback is placed in a queue and executed when the call stack is free.
A callback example usually looks like function task(callback) { ... callback(result) }. For Promises, you create one with new Promise((resolve, reject) => { ... }). Call resolve(value) for success and reject(error) for failure. To consume it, use promise.then(...).catch(...).
With async/await, mark a function using async. Inside it, use await before a Promise. This pauses only that async function, not the whole Node process. Wrap risky operations in try...catch so errors are handled properly.
Comprehensive Code Examples
console.log('Start');
setTimeout(() => {
console.log('Runs later');
}, 1000);
console.log('End');
const fs = require('fs');
fs.readFile('message.txt', 'utf8', (err, data) => {
if (err) {
console.error('Read failed:', err.message);
return;
}
console.log('File content:', data);
});
console.log('Reading file...');
function getUser(id) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (id === 1) resolve({ id: 1, name: 'Asha' });
else reject(new Error('User not found'));
}, 800);
});
}
async function showUser() {
try {
const user = await getUser(1);
console.log(user);
} catch (error) {
console.error(error.message);
}
}
showUser();
function fetchOrders() {
return Promise.resolve(['order1', 'order2']);
}
function fetchProfile() {
return Promise.resolve({ name: 'Mina' });
}
async function loadDashboard() {
try {
const [orders, profile] = await Promise.all([fetchOrders(), fetchProfile()]);
console.log('Orders:', orders);
console.log('Profile:', profile);
} catch (error) {
console.error('Dashboard failed:', error.message);
}
}
loadDashboard();
Common Mistakes
- Forgetting to handle errors: Always use callback error checks, .catch(), or try...catch.
- Mixing styles carelessly: Avoid combining callbacks, Promises, and async/await in confusing ways unless necessary.
- Assuming async code runs immediately in order: Code after an async call often runs first, so log carefully and test execution flow.
- Missing await: Without it, you may log a Promise object instead of the real result.
Best Practices
- Prefer async/await for readability in modern Node Js projects.
- Use Promise.all() for independent tasks that can run together.
- Keep asynchronous functions small and focused.
- Handle failures at every external boundary such as files, APIs, and databases.
- Write meaningful log messages so debugging async flow is easier.
Practice Exercises
- Create a script that prints 'Start', waits 2 seconds, then prints 'Done'.
- Read a text file asynchronously and print its contents to the console.
- Create a Promise that resolves with a product name after 1 second, then display it using async/await.
Mini Project / Task
Build a small Node Js script that loads a user's profile and recent messages asynchronously, then prints both results in a clean summary.
Challenge (Optional)
Create a function that requests data from three asynchronous sources and returns the fastest successful result while still handling failures safely.
Callbacks
Callbacks are functions passed as arguments to other functions so they can be executed later. In Node Js, they exist because many operations take time, such as reading files, calling APIs, waiting for timers, or querying databases. Instead of blocking the program until a task finishes, Node Js continues running other code and calls the callback when the task is complete. This makes applications fast and scalable, especially for backend services handling many users at once.
In real life, callbacks are used in file operations with the fs module, HTTP request handling, event-driven code, and database libraries. The most common Node Js callback style is the error-first callback, where the first parameter represents an error and the second represents successful data. A typical pattern looks like function(err, result). If err is not null, something failed. Otherwise, the result can be used.
Callbacks can be synchronous or asynchronous. A synchronous callback runs immediately, such as when using array methods like forEach. An asynchronous callback runs later, such as inside setTimeout or fs.readFile. Understanding this difference is important because the order of execution changes. Beginners often expect asynchronous callbacks to run instantly, which causes confusion when logs appear in an unexpected sequence.
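The ordering difference described above is easy to see side by side:

```javascript
// Synchronous callback: forEach invokes the callback immediately,
// before moving to the next statement.
[1, 2, 3].forEach(n => console.log('sync:', n));

// Asynchronous callback: setTimeout schedules the callback for later,
// even with a delay of 0 milliseconds.
setTimeout(() => console.log('async: runs last'), 0);

console.log('this line runs before the async callback');
```

The synchronous logs and the final console.log appear first; the setTimeout callback only runs once the current call stack is empty.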
Step-by-Step Explanation
First, define a function that accepts another function as a parameter. Second, perform some work inside the main function. Third, execute the callback at the correct time. In asynchronous cases, the callback is called after the delayed or external task completes.
Syntax idea: create a function like doTask(callback), then call callback() when ready. In Node Js error-first style, use callback(error, data). Always check whether an error exists before using the returned data.
Comprehensive Code Examples
Basic example
function greet(name, callback) {
console.log('Hello, ' + name);
callback();
}
greet('Ava', function () {
console.log('Greeting finished.');
});
Real-world example
const fs = require('fs');
fs.readFile('notes.txt', 'utf8', function (err, data) {
if (err) {
console.log('Error reading file:', err.message);
return;
}
console.log('File content:', data);
});
console.log('Reading file...');
Advanced usage
function getUser(id, callback) {
setTimeout(function () {
if (!id) {
callback(new Error('User ID is required'));
return;
}
callback(null, { id: id, name: 'Mina' });
}, 500);
}
function getOrders(user, callback) {
setTimeout(function () {
callback(null, ['Book', 'Keyboard']);
}, 500);
}
getUser(101, function (err, user) {
if (err) {
console.log(err.message);
return;
}
getOrders(user, function (err, orders) {
if (err) {
console.log(err.message);
return;
}
console.log(user.name + ' ordered:', orders);
});
});
Common Mistakes
- Forgetting error handling: Always check the first callback argument before using data.
- Assuming async code runs in order: Code after an async call executes before the callback finishes, so place dependent logic inside the callback.
- Calling a callback multiple times: Use return after handling an error or finishing a branch.
- Deep nesting: Too many nested callbacks become hard to read; split logic into named functions when possible.
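The deep-nesting fix can be sketched by pulling each step into a named function so the flow reads top to bottom. This reuses the shape of the earlier getUser/getOrders example with shortened delays:

```javascript
function getUser(id, callback) {
  setTimeout(() => callback(null, { id, name: 'Mina' }), 100);
}

function getOrders(user, callback) {
  setTimeout(() => callback(null, ['Book', 'Keyboard']), 100);
}

// Named steps instead of anonymous nested callbacks.
function handleUser(err, user) {
  if (err) return console.log(err.message); // return stops further execution
  getOrders(user, (err, orders) => handleOrders(err, user, orders));
}

function handleOrders(err, user, orders) {
  if (err) return console.log(err.message);
  console.log(user.name + ' ordered:', orders);
}

getUser(101, handleUser);
```

The behavior is identical to the nested version, but each step can now be read, tested, and reused on its own.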
Best Practices
- Use clear function names for callbacks to improve readability.
- Follow the Node Js error-first convention consistently.
- Keep callback functions short and focused on one task.
- Return immediately after errors to avoid accidental extra execution.
- Refactor repeated callback logic into reusable helper functions.
Practice Exercises
- Create a function that accepts a number and a callback, prints the number, and then runs the callback.
- Use setTimeout with a callback to display a message after 2 seconds.
- Read a text file with fs.readFile and print either the content or the error message using an error-first callback.
Mini Project / Task
Build a small Node Js script that reads a user profile file, then reads an orders file, and finally prints a summary using nested callbacks with proper error handling.
Challenge (Optional)
Create three asynchronous functions using setTimeout and execute them in sequence with callbacks, ensuring that each step starts only after the previous one finishes.
Promises in Node
Promises in Node.js are a fundamental concept for handling asynchronous operations more elegantly and efficiently than traditional callback functions. At its core, a Promise is an object representing the eventual completion (or failure) of an asynchronous operation and its resulting value. Before Promises, handling multiple asynchronous tasks often led to 'callback hell': deeply nested callback functions that made code hard to read, maintain, and debug. Promises provide a structured way to deal with asynchronous code, allowing you to chain operations and handle errors in a more linear, synchronous-looking fashion. They are crucial for modern Node.js development, especially when interacting with databases, making API calls, reading files, or any I/O-bound operation where you don't want to block the main thread. Promises are widely used in Node.js core modules (like the `fs/promises` module) and popular libraries (like Axios for HTTP requests, Mongoose for MongoDB, etc.), making them an indispensable tool for any Node.js developer.
A Promise can be in one of three states:
- Pending: The initial state; neither fulfilled nor rejected. The asynchronous operation has not yet completed.
- Fulfilled (Resolved): The operation completed successfully, and the Promise has a resulting value.
- Rejected: The operation failed, and the Promise has a reason for the failure (an error object).
Once a Promise is fulfilled or rejected, it is considered 'settled' and its state cannot change again. This immutability is a key advantage, as it guarantees that a Promise will resolve or reject only once, preventing race conditions or unexpected behavior from multiple resolutions.
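This settle-once guarantee can be verified directly: once a Promise is fulfilled, any later resolve or reject calls are simply ignored.

```javascript
const once = new Promise((resolve, reject) => {
  resolve('first');
  resolve('second');         // ignored: the promise is already fulfilled
  reject(new Error('late')); // also ignored
});

once.then(value => console.log(value)); // 'first'
```

This is why a Promise can be safely handed to multiple consumers: every .then() attached to it will observe the same single outcome.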
Step-by-Step Explanation
To understand Promises, let's break down their syntax and usage:
1. Creating a Promise:
A Promise is created using the new Promise() constructor, which takes an executor function as an argument. The executor function itself takes two arguments: resolve and reject. These are functions that you call to change the state of the Promise.
const myPromise = new Promise((resolve, reject) => {
// Perform an asynchronous operation here
const success = Math.random() > 0.5;
if (success) {
resolve('Operation successful!'); // Call resolve on success
} else {
reject('Operation failed!'); // Call reject on failure
}
});
2. Consuming a Promise:
You consume a Promise using .then(), .catch(), and .finally() methods.
- .then(onFulfilled, onRejected): This method is called when the Promise is fulfilled or rejected. It takes up to two callback functions as arguments: the first for success (onFulfilled) and the second for failure (onRejected). The onFulfilled callback receives the resolved value, and onRejected receives the rejection reason.
- .catch(onRejected): This is a shorthand for .then(null, onRejected). It's used specifically for handling errors/rejections.
- .finally(onFinally): This method is called when the Promise is settled (either fulfilled or rejected). It takes a callback function that will be executed regardless of the Promise's outcome. It's useful for cleanup operations.
myPromise
.then((data) => {
console.log('Success:', data); // Executed if resolve is called
})
.catch((error) => {
console.error('Error:', error); // Executed if reject is called
})
.finally(() => {
console.log('Promise settled (finished)'); // Always executed
});
3. Chaining Promises:
One of the most powerful features of Promises is chaining. The .then() and .catch() methods themselves return a new Promise, allowing you to chain multiple asynchronous operations sequentially.
function fetchData(id) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (id < 5) {
resolve(`Data for ID: ${id}`);
} else {
reject(`No data for ID: ${id}`);
}
}, 1000);
});
}
fetchData(3)
.then(data => {
console.log(data); // 'Data for ID: 3'
return fetchData(4); // Return a new promise for chaining
})
.then(data => {
console.log(data); // 'Data for ID: 4'
return fetchData(6); // This will reject
})
.then(data => {
console.log(data); // This won't be called
})
.catch(error => {
console.error('Caught an error in the chain:', error); // 'No data for ID: 6'
});
4. Promise.all(), Promise.race(), Promise.allSettled(), Promise.any():
These static methods are used to manage multiple Promises concurrently.
- Promise.all(iterable): Takes an iterable of Promises and returns a single Promise. This returned Promise fulfills when all of the input Promises have fulfilled, or rejects as soon as any of the input Promises rejects.
- Promise.race(iterable): Returns a Promise that fulfills or rejects as soon as one of the Promises in the iterable fulfills or rejects, with the value or reason from that Promise.
- Promise.allSettled(iterable): Returns a Promise that fulfills after all of the given Promises have either fulfilled or rejected, with an array of objects describing each Promise's outcome.
- Promise.any(iterable): Returns a Promise that fulfills as soon as any of the input Promises fulfills. If all of the input Promises reject, the returned Promise rejects with an AggregateError.
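Of these, Promise.allSettled() is the safest choice when every outcome matters, because one rejection does not discard the other results. A quick sketch with made-up task values:

```javascript
const tasks = [
  Promise.resolve('saved'),
  Promise.reject(new Error('disk full')),
  Promise.resolve('emailed')
];

Promise.allSettled(tasks).then(results => {
  results.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      console.log(`Task ${i}: ok ->`, result.value);
    } else {
      console.log(`Task ${i}: failed ->`, result.reason.message);
    }
  });
});
```

Each result object has a status of 'fulfilled' (with a value property) or 'rejected' (with a reason property), so no .catch() is needed on the combined Promise.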
Comprehensive Code Examples
Basic Example: Simulating an Async Operation
function simulateAsyncTask(shouldSucceed) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (shouldSucceed) {
resolve('Data fetched successfully!');
} else {
reject(new Error('Failed to fetch data.'));
}
}, 1500); // Simulate network delay
});
}
// Call with success
simulateAsyncTask(true)
.then(message => console.log('Success:', message))
.catch(error => console.error('Error:', error.message));
// Call with failure
simulateAsyncTask(false)
.then(message => console.log('Success:', message))
.catch(error => console.error('Error:', error.message));
Real-World Example: Fetching User Data from an API
// In a real Node.js app, you'd use a library like 'node-fetch'
// const fetch = require('node-fetch'); // npm install node-fetch@2
// For demonstration, we'll simulate fetch
function mockFetch(url) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (url === 'https://api.example.com/users/1') {
resolve({
json: () => Promise.resolve({ id: 1, name: 'Alice', email: '[email protected]' })
});
} else if (url === 'https://api.example.com/users/2') {
resolve({
json: () => Promise.resolve({ id: 2, name: 'Bob', email: '[email protected]' })
});
} else {
reject(new Error(`User not found for URL: ${url}`));
}
}, 500);
});
}
function getUserData(userId) {
return mockFetch(`https://api.example.com/users/${userId}`)
.then(response => response.json())
.then(user => {
console.log(`Fetched user: ${user.name}, Email: ${user.email}`);
return user; // Pass the user data to the next .then()
})
.catch(error => {
console.error(`Failed to get user ${userId}:`, error.message);
throw error; // Re-throw to propagate the error down the chain
});
}
getUserData(1)
.then(user => console.log('User 1 processing complete:', user.name))
.catch(err => console.log('Final catch for user 1:', err.message));
getUserData(99)
.then(user => console.log('User 99 processing complete:', user.name))
.catch(err => console.log('Final catch for user 99:', err.message));
Advanced Usage: Concurrent Operations with Promise.all
function downloadFile(fileName, delay) {
return new Promise((resolve, reject) => {
setTimeout(() => {
if (fileName.includes('error')) {
reject(new Error(`Failed to download ${fileName}`));
} else {
resolve(`${fileName} downloaded successfully!`);
}
}, delay);
});
}
const filesToDownload = [
downloadFile('report.pdf', 2000),
downloadFile('image.jpg', 1000),
downloadFile('data.csv', 3000)
];
console.log('Starting parallel downloads...');
Promise.all(filesToDownload)
.then(results => {
console.log('All files downloaded:');
results.forEach(result => console.log(`- ${result}`));
})
.catch(error => {
console.error('One or more downloads failed:', error.message);
})
.finally(() => {
console.log('Download operation finished.');
});
// Example with a failing promise
const failingDownloads = [
downloadFile('good-doc.docx', 1000),
downloadFile('error-log.txt', 500), // This will reject
downloadFile('another-file.zip', 2000)
];
console.log('\nStarting parallel downloads with potential error...');
Promise.all(failingDownloads)
.then(results => {
console.log('All files downloaded (this should not happen for failingDownloads):', results);
})
.catch(error => {
console.error('Caught error in failingDownloads:', error.message);
});
Common Mistakes
- Forgetting to return a Promise in .then(): If you perform an asynchronous operation inside a .then() block but don't return the new Promise, the next .then() in the chain will receive undefined instead of the result of the async operation. Fix: Always return the new Promise if you want to chain its result.
- Not catching errors: If a Promise rejects and there's no .catch() handler in its chain, the rejection is unhandled; in modern Node.js versions an unhandled rejection crashes the process by default. Fix: Always include a .catch() at the end of your Promise chains, or use a global unhandled rejection handler.
- Mixing callbacks and Promises without proper conversion: Trying to treat a callback-based API as if it returns a Promise directly. Fix: Use util.promisify for Node.js callback-based functions or manually wrap callback APIs in a new Promise() constructor.
Best Practices
- Always return Promises: When defining functions that perform asynchronous operations, always make them return a Promise. This allows for consistent chaining and error handling.
- Chain .then() and .catch(): Avoid nesting .then() calls (callback hell with Promises). Instead, chain them for better readability. Place a single .catch() at the end of the chain to handle errors from any preceding Promise.
- Handle errors gracefully: Use .catch() to prevent unhandled promise rejections. For critical applications, consider using process.on('unhandledRejection', ...) for global error logging.
- Use async/await for cleaner code: While not strictly Promises themselves, async/await is syntactic sugar built on top of Promises, making asynchronous code look and behave more like synchronous code, greatly improving readability and maintainability.
- Avoid the Promise constructor anti-pattern: Don't wrap a function that already returns a Promise with new Promise() (e.g., new Promise(resolve => resolve(somePromiseFunction()))). Just return the existing Promise directly.
Practice Exercises
1. Simple Delayed Greeting: Create a function delayedGreeting(name, delay) that returns a Promise. This Promise should resolve with the string 'Hello, [name]!' after the specified delay in milliseconds. If the name is empty, it should reject with an error message. Use .then() and .catch() to consume it.
2. Random Number Promise: Write a function getRandomNumberPromise() that returns a Promise. This Promise should resolve with a random number between 1 and 10 after 2 seconds. If the random number is greater than 7, it should resolve. Otherwise, it should reject with the message 'Number too small!'.
3. Sequential File Reading Simulation: Simulate reading two files sequentially. Create two functions: readFile1() and readFile2(), both returning Promises. readFile1() resolves after 1 second with 'Content of File 1'. readFile2() resolves after 1.5 seconds with 'Content of File 2'. Use Promise chaining to first read file 1, then file 2, and log both contents in order.
Mini Project / Task
User Profile Loader:
Build a simple Node.js script that simulates fetching user details and their associated posts from two separate (simulated) API endpoints. Implement two functions, fetchUserDetails(userId) and fetchUserPosts(userId), both returning Promises. fetchUserDetails should resolve with an object { id, name, email } after 1 second. fetchUserPosts should resolve with an array of post titles ['Post A', 'Post B'] after 1.5 seconds. Use Promise.all() to fetch both user details and posts concurrently for a given userId, then combine and log the full user profile (details + posts) once both are available. Handle any potential errors during the fetching process.
Challenge (Optional)
Retry Mechanism for Failed Promises:
Create a function retryPromise(promiseFn, retries, delay). This function should take a function promiseFn (which returns a Promise), a number of retries, and a delay in milliseconds. If promiseFn rejects, retryPromise should attempt to execute promiseFn again up to the specified number of retries, waiting for the delay between each attempt. If all retries fail, the original rejection reason should be propagated. If it succeeds at any point, it should resolve with the successful value. Use setTimeout for the delay and recursion or a loop for retries.
Async and Await
Async and Await are modern JavaScript features that make asynchronous code easier to read and write. In Node Js, many operations take time, such as reading files, calling APIs, querying databases, or waiting for timers. Instead of blocking the entire program, Node Js handles these tasks asynchronously. Earlier, developers often used callbacks, then Promises became common. Async and Await were introduced to simplify Promise-based code so it looks closer to normal step-by-step logic. In real projects, you will use Async and Await in REST APIs, authentication flows, payment handling, background jobs, and file processing tasks. The main idea is simple: an async function always returns a Promise, and the await keyword pauses execution inside that function until a Promise settles. This makes complex workflows easier to understand. Async and Await are not a replacement for Promises; they are built on top of them. You should still understand that awaited values usually come from Promise-returning functions. Common usage patterns include waiting for a single result, running multiple tasks with Promise.all(), and handling failures with try...catch.
Step-by-Step Explanation
To use Async and Await, first mark a function with async. This tells JavaScript that the function will return a Promise. Inside that function, use await before a Promise-returning expression. When JavaScript reaches await, it waits for the Promise to settle, then continues. If the Promise resolves, the await expression evaluates to the resolved value. If it rejects, an error is thrown, which you should handle with try...catch. Important rule: await can only be used inside an async function, unless your environment supports top-level await in modules. Also remember that awaiting tasks one by one runs them sequentially, which is sometimes correct but sometimes slower than running them together with Promise.all().
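The sequential-versus-parallel point above can be sketched with a timed comparison; the 300 ms delays are arbitrary:

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function sequential() {
  const start = Date.now();
  await delay(300); // the second task only starts after the first finishes
  await delay(300);
  console.log('sequential took about', Date.now() - start, 'ms'); // ~600
}

async function parallel() {
  const start = Date.now();
  await Promise.all([delay(300), delay(300)]); // both timers run together
  console.log('parallel took about', Date.now() - start, 'ms'); // ~300
}

sequential().then(parallel);
```

Sequential awaits are correct when step two needs step one's result; when the tasks are independent, Promise.all() roughly halves the wait here.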
Comprehensive Code Examples
function delay(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
async function basicExample() {
await delay(1000);
console.log('Done after 1 second');
}
basicExample();
async function getUser() {
return { id: 1, name: 'Asha' };
}
async function realWorldExample() {
try {
const user = await getUser();
console.log(`Welcome ${user.name}`);
} catch (error) {
console.error('Failed to load user:', error.message);
}
}
realWorldExample();
function fetchPosts() {
return Promise.resolve(['Post 1', 'Post 2']);
}
function fetchComments() {
return Promise.resolve(['Comment A', 'Comment B']);
}
async function advancedExample() {
try {
const [posts, comments] = await Promise.all([fetchPosts(), fetchComments()]);
console.log(posts);
console.log(comments);
} catch (error) {
console.error('Data loading error:', error.message);
}
}
advancedExample();
Common Mistakes
- Using await outside an async function: add the async keyword to the function.
- Forgetting error handling: wrap awaited code in try...catch when failure is possible.
- Running independent tasks one by one: use Promise.all() for better performance when tasks do not depend on each other.
- Assuming async makes code synchronous everywhere: only the async function flow becomes easier to read; the operation is still asynchronous.
Best Practices
- Use async and await for clarity when multiple asynchronous steps depend on each other.
- Always handle rejected Promises with try...catch or a caller-level catch.
- Use meaningful function names such as fetchUserData or saveOrder.
- Prefer Promise.all() for parallel operations to improve speed.
- Keep async functions small and focused for easier testing and debugging.
Practice Exercises
- Create an async function that waits 2 seconds, then prints a message to the console.
- Write an async function that returns a user object and display the user name using await.
- Create two Promise-returning functions and use Promise.all() with await to print both results together.
Mini Project / Task
Build a small data loader that simulates fetching a user profile, user posts, and notifications. Use Async and Await to display the data in order, and add error handling for one possible failure case.
Challenge (Optional)
Create a function that retries an asynchronous task up to three times before throwing an error. Use Async and Await to control the retry flow cleanly.
Creating a Server
Creating a server in Node Js means building a program that waits for requests from clients such as browsers, mobile apps, or other services, then sends back a response. This exists because modern applications need a backend layer to deliver pages, provide API data, process forms, handle authentication, and connect to databases. In real life, a Node Js server may power an e-commerce site, a chat system, a company dashboard, or a REST API for a mobile app. The most common beginner approach is using the built-in http module, which lets you create a server without installing extra packages. A server usually has a few key ideas: it listens on a port, receives a request object, sends a response object, and stays running so it can handle many requests over time. You may create simple text responses, return JSON for APIs, or respond differently based on the URL and method. Understanding this topic is important because almost every backend application starts with a server that can accept and process network traffic.
There are a few common ways to create servers in Node Js. The basic type is a plain HTTP server using the core module. Another variation is a routed server that checks the request path like /, /about, or /api and sends different outputs. A more advanced style returns structured JSON and proper status codes, which is common in API development. While frameworks like Express simplify this later, learning the built-in approach helps beginners understand what is happening underneath.
Step-by-Step Explanation
First, import the built-in HTTP module using require('http'). Next, call http.createServer() and pass a callback function. That callback receives two objects: req for the incoming request and res for the outgoing response. Then choose what to send back. You can set a response header with res.writeHead() to define the status code and content type. Finally, end the response with res.end(). After defining the server, call server.listen(port) so Node Js starts waiting for requests on that port. Once it is running, visiting http://localhost:3000 in a browser will trigger the callback.
Comprehensive Code Examples
Basic example
const http = require('http');
const server = http.createServer((req, res) => {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Hello from Node Js server');
});
server.listen(3000, () => {
console.log('Server running at http://localhost:3000');
});
Real-world example
const http = require('http');
const server = http.createServer((req, res) => {
if (req.url === '/') {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Welcome to the homepage');
} else if (req.url === '/about') {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('About this service');
} else {
res.writeHead(404, { 'Content-Type': 'text/plain' });
res.end('Page not found');
}
});
server.listen(3000);
Advanced usage
const http = require('http');
const server = http.createServer((req, res) => {
if (req.url === '/api/users' && req.method === 'GET') {
const users = [{ id: 1, name: 'Ava' }, { id: 2, name: 'Noah' }];
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify(users));
} else {
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ message: 'Route not found' }));
}
});
server.listen(3000, () => console.log('API server started'));
Common Mistakes
Forgetting to call res.end(). Fix: always finish the response so the browser does not hang.
Using the wrong content type. Fix: use text/plain for plain text and application/json for JSON.
Trying to use a port already in use. Fix: change the port or stop the existing process.
Not checking req.url or req.method. Fix: handle routes and methods clearly to avoid incorrect responses.
Best Practices
Set proper status codes such as 200, 404, and 500 so clients understand the result.
Keep route logic organized, especially as the server grows.
Return JSON for API endpoints because it is standard and easy to consume.
Log the server URL when listening starts to simplify testing.
Use environment variables for ports in larger applications instead of hardcoding values.
Practice Exercises
Create a server on port 4000 that returns the text My first Node Js server.
Add two routes: / for a welcome message and /contact for a contact message.
Build a route called /api/product that returns a JSON object with a product name and price.
Mini Project / Task
Build a small server for a portfolio website with three routes: home, about, and projects. Each route should return a different response, and any unknown route should return a 404 message.
Challenge (Optional)
Create a server that responds differently based on both URL and request method, such as returning a message for GET /api/tasks and another for POST /api/tasks.
HTTP Module
The Node.js HTTP module is a fundamental component for building network applications, especially web servers. It allows Node.js to transfer data over the HyperText Transfer Protocol (HTTP), which is the foundation of data communication for the World Wide Web. Essentially, it enables your Node.js application to act as both an HTTP client (making requests to other servers) and, more commonly, an HTTP server (listening for and responding to client requests). This module is built into Node.js, meaning you don't need to install any external packages; you can simply require it to start using its functionalities.
In real-life scenarios, the HTTP module is the backbone of almost every web application built with Node.js. When you visit a website, send data through a form, or interact with an API, there's an HTTP server on the other end handling your request. Node.js, through its HTTP module, allows developers to craft these servers to serve web pages, provide data for single-page applications (SPAs), build RESTful APIs, and handle various other network communications. For instance, a simple blog application might use the HTTP module to serve HTML files for blog posts, handle form submissions for comments, and deliver JSON data for an administrative dashboard. Understanding this module is crucial for anyone looking to build robust and scalable backend services with Node.js.
While the core HTTP module provides low-level functionalities, it forms the basis for higher-level frameworks like Express.js, Koa.js, and Hapi.js. These frameworks abstract away many of the complexities of the raw HTTP module, offering more streamlined ways to handle routing, middleware, and request/response cycles. However, even when using these frameworks, the underlying HTTP module is still at work, handling the actual network communication.
Step-by-Step Explanation
The HTTP module provides objects like http.Server, http.ClientRequest, and http.ServerResponse. To create a basic HTTP server, you typically use the http.createServer() method. This method takes a callback function that will be executed every time a request is made to your server. This callback function receives two arguments: req (an instance of http.IncomingMessage, representing the incoming request) and res (an instance of http.ServerResponse, representing the response that will be sent back to the client).
The req object contains information about the client's request, such as the URL, HTTP method (GET, POST, etc.), headers, and body data. The res object is used to construct and send the response back to the client. You can set response headers using res.writeHead(), write response body data using res.write(), and finally, end the response using res.end(). The res.end() method is crucial as it signals that all response headers and body have been sent, and the server should consider this response complete.
After creating the server, you need to tell it to listen for incoming connections on a specific port and optionally a host. This is done using the server.listen() method. For example, server.listen(3000) will make your server listen on port 3000 on all available network interfaces. Once listening, your server will be ready to accept requests from clients.
Comprehensive Code Examples
Basic example
This example demonstrates how to create a simple HTTP server that responds with "Hello, World!" to every request.
const http = require('http');
const server = http.createServer((req, res) => {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Hello, World!\n');
});
const PORT = 3000;
server.listen(PORT, () => {
console.log(`Server running at http://localhost:${PORT}/`);
});
Real-world example
This example creates a server that handles different routes and serves JSON data for an API endpoint.
const http = require('http');
const server = http.createServer((req, res) => {
if (req.url === '/') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end('<h1>Welcome to the Home Page!</h1>');
} else if (req.url === '/api/users' && req.method === 'GET') {
const users = [
{ id: 1, name: 'Alice' },
{ id: 2, name: 'Bob' }
];
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify(users));
} else {
res.writeHead(404, { 'Content-Type': 'text/plain' });
res.end('404 Not Found');
}
});
const PORT = 5000;
server.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
Advanced usage
This example demonstrates handling POST requests to receive data from a client.
const http = require('http');
const server = http.createServer((req, res) => {
if (req.method === 'POST' && req.url === '/submit') {
let body = '';
req.on('data', (chunk) => {
body += chunk.toString(); // Convert buffer to string
});
req.on('end', () => {
console.log('Received data:', body);
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ message: 'Data received successfully!', data: body }));
});
} else if (req.method === 'GET' && req.url === '/') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end(`<h1>Submit Data</h1>
<form method="POST" action="/submit"><input name="message" /><button type="submit">Send</button></form>`);
} else {
res.writeHead(404, { 'Content-Type': 'text/plain' });
res.end('Not Found');
}
});
const PORT = 4000;
server.listen(PORT, () => {
console.log(`Server running on http://localhost:${PORT}`);
});
Common Mistakes
- Forgetting res.end(): If you don't call res.end(), the client's request will hang, and the browser will keep waiting for a response that never fully arrives. Always ensure you call res.end() to terminate the response.
- Not setting the Content-Type header: Without the correct Content-Type header, browsers might misinterpret the response. For example, sending JSON without 'Content-Type': 'application/json' might cause the browser to display it as plain text.
- Blocking the event loop with synchronous operations: Node.js is asynchronous. Performing CPU-intensive synchronous tasks inside the request handler will block the entire server, preventing it from handling other requests. Always use asynchronous operations for I/O or heavy computations.
Best Practices
- Use a routing mechanism: For anything beyond a very simple server, manually checking req.url and req.method quickly becomes unmanageable. Consider using a dedicated routing library or a framework like Express.js for cleaner route handling.
- Error handling: Implement robust error handling. For server errors (e.g., database connection issues), send appropriate HTTP status codes (e.g., 500 Internal Server Error) and log the error details server-side.
- Asynchronous operations: Embrace Node.js's asynchronous nature. Use callbacks, Promises, or async/await for I/O operations (file system, database, network requests) to keep the server non-blocking.
- Modularity: Separate your server logic into smaller, manageable modules (e.g., routes, controllers, services) to improve maintainability and readability.
- Security headers: Implement security best practices by setting appropriate HTTP security headers (e.g., X-Content-Type-Options, X-Frame-Options, Content-Security-Policy) to protect against common web vulnerabilities.
Practice Exercises
- Create an HTTP server that listens on port 8000 and responds with the current date and time in plain text when accessed.
- Modify the server to respond with a personalized greeting, e.g., "Hello, [Name]!" if the URL is /greet/John, otherwise "Hello, Guest!".
- Build a server that serves a simple HTML page containing a list of your favorite fruits when the root URL (/) is accessed.
Mini Project / Task
Build a simple API server using the HTTP module that manages a list of tasks. Implement two routes:
GET /tasks: Returns a JSON array of all tasks.
POST /tasks: Accepts a JSON object with a taskName and adds it to the list, then returns the updated list.
Challenge (Optional)
Extend your task API server to include a DELETE /tasks/:id route that removes a task by its ID. Additionally, implement a PUT /tasks/:id route to update an existing task's name. Ensure proper error handling for tasks not found.
Express Introduction
Express is a lightweight web framework for Node Js that helps developers build servers and web applications faster than using the built-in HTTP module alone. It exists to reduce repetitive work such as routing requests, reading parameters, sending responses, and organizing backend logic. In real-life projects, Express is used to create REST APIs, backend services for web and mobile apps, authentication systems, dashboards, and full-stack applications. For example, an e-commerce site may use Express to serve product data, process orders, and connect the frontend to a database.
At its core, Express sits on top of Node Js and gives you a cleaner way to handle requests and responses. Common concepts include an application object created with express(), routes such as app.get() and app.post(), middleware for processing requests before the final handler, and response helpers like res.send() and res.json(). You will also commonly see route parameters, query strings, and JSON body parsing. Express supports many backend patterns, from very small apps to large modular API services.
To begin, install Express in a Node Js project with npm init -y and npm install express. Then import it, create an app, define a route, and start a server on a port. When a browser or API client sends a request to that route, Express matches it and runs the related function.
Step-by-Step Explanation
First, import Express using require('express'). Second, create the app with const app = express(). Third, define routes using methods based on HTTP verbs such as app.get() for reading data and app.post() for creating data. Each route usually takes a path and a callback with req and res. The req object contains request details, while res sends data back. Finally, call app.listen(port) to start the server.
Middleware is another key idea. Middleware functions run in order and can log requests, parse JSON, check authentication, or handle errors. A common built-in middleware is express.json(), which allows Express to read JSON request bodies.
Comprehensive Code Examples
const express = require('express');
const app = express();
app.get('/', (req, res) => {
res.send('Welcome to Express');
});
app.listen(3000, () => {
console.log('Server running on port 3000');
});
const express = require('express');
const app = express();
app.use(express.json());
app.get('/products', (req, res) => {
res.json([{ id: 1, name: 'Laptop' }, { id: 2, name: 'Mouse' }]);
});
app.post('/products', (req, res) => {
const product = req.body;
res.status(201).json({ message: 'Product created', product });
});
app.listen(3000);
const express = require('express');
const app = express();
app.use(express.json());
app.use((req, res, next) => {
console.log(`${req.method} ${req.url}`);
next();
});
app.get('/users/:id', (req, res) => {
const userId = req.params.id;
const active = req.query.active;
res.json({ userId, active });
});
app.listen(3000, () => console.log('Advanced app running'));
Common Mistakes
- Forgetting to install Express: run npm install express before starting the app.
- Not starting the server: without app.listen(), routes will never be reachable.
- Using req.body without JSON middleware: add app.use(express.json()) before POST routes.
- Misspelling route paths: make sure the requested URL exactly matches the defined route.
Best Practices
- Use clear route names such as /users and /products.
- Return JSON for APIs using res.json() instead of plain text where appropriate.
- Keep middleware near the top so it applies consistently.
- Use proper status codes like 200, 201, 404, and 500.
- Organize larger apps by moving routes into separate files.
Practice Exercises
- Create an Express app with a home route that sends a welcome message.
- Add a /about route that returns JSON with your course name and topic.
- Create a route /hello/:name that responds with a personalized greeting using a route parameter.
Mini Project / Task
Build a small Express server for a bookstore with three routes: / for a welcome message, /books to return a JSON list of books, and /books/:id to return the selected book id.
Challenge (Optional)
Create an Express app that logs every request, accepts JSON data on a POST route, and returns a custom error message when a route does not exist.
Routing
Routing is a fundamental concept in web development, especially crucial in backend frameworks like Node.js with Express. It refers to how an application responds to a client request to a particular endpoint, which is a URI (or path) and a specific HTTP request method (GET, POST, PUT, DELETE, etc.). In essence, routing determines what code gets executed when a user navigates to a certain URL. Without routing, a web server wouldn't know how to differentiate between requests for different resources, like fetching a user profile versus submitting a form. It exists to provide structure and logic to web applications, allowing developers to define specific actions for different URL patterns and HTTP methods, leading to organized, maintainable, and scalable codebases. In real-life applications, routing is everywhere. When you visit
example.com/products, routing directs your request to the code that fetches and displays a list of products. When you fill out a login form and click submit, the data is sent to an endpoint like example.com/login via a POST request, and routing handles the authentication logic.
The core concept of routing in Node.js, particularly with the Express framework, revolves around defining routes. A route consists of a path, an HTTP method, and one or more handler functions. When a request comes in, Express matches the incoming request URL and method against the defined routes. If a match is found, the associated handler functions are executed. There aren't strictly 'sub-types' of routing but rather different ways to organize and implement routes. These include:
- Basic Routing: Direct mapping of a URL path and HTTP method to a single handler function.
- Route Parameters: Capturing dynamic values from the URL path, often used for identifying specific resources (e.g., /users/:id).
- Query Parameters: Handling data passed in the URL after a question mark (e.g., /search?q=nodejs), which are typically accessed via req.query.
- Middleware Functions: Functions that have access to the request object (req), the response object (res), and the next middleware function in the application's request-response cycle. They can execute any code, make changes to the request and the response objects, end the request-response cycle, or call the next middleware in the stack. Middleware is often used for authentication, logging, parsing request bodies, etc.
- Router-Level Middleware: Using express.Router() to create modular, mountable route handlers. This helps in organizing routes for different parts of an application (e.g., /api/users, /api/products).
Step-by-Step Explanation
To implement routing in Node.js with Express, you typically follow these steps:
1. Initialize Express: First, you need to import Express and create an application instance.
const express = require('express');
const app = express();
2. Define a Route: Use app.METHOD(PATH, HANDLER) to define a route. METHOD is an HTTP verb (e.g., get, post), PATH is the URL path, and HANDLER is a function that takes req (request) and res (response) objects.
app.get('/', (req, res) => {
res.send('Hello World!');
});
3. Start the Server: Listen for incoming requests on a specific port.
const PORT = 3000;
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
4. Handle Route Parameters: To capture dynamic parts of the URL, use colons (:) in the path.
app.get('/users/:id', (req, res) => {
const userId = req.params.id;
res.send(`User ID: ${userId}`);
});
5. Modular Routing with express.Router(): For larger applications, create separate router modules.
In users.js:
const express = require('express');
const router = express.Router();
router.get('/', (req, res) => {
res.send('List of all users');
});
router.get('/:id', (req, res) => {
res.send(`User details for ID: ${req.params.id}`);
});
module.exports = router;
In app.js:
const express = require('express');
const app = express();
const usersRouter = require('./users');
app.use('/users', usersRouter); // Mount the router at /users prefix
app.listen(3000, () => console.log('Server running on port 3000'));
Comprehensive Code Examples
Basic example
const express = require('express');
const app = express();
const PORT = 3000;
// Route for the home page
app.get('/', (req, res) => {
res.send('Welcome to the Home Page!');
});
// Route for an 'about' page
app.get('/about', (req, res) => {
res.send('This is the About page.');
});
// Start the server
app.listen(PORT, () => {
console.log(`Server is running on http://localhost:${PORT}`);
});
Real-world example (Simple API for products)
const express = require('express');
const app = express();
const PORT = 3000;
// Middleware to parse JSON request bodies
app.use(express.json());
let products = [
{ id: 1, name: 'Laptop', price: 1200 },
{ id: 2, name: 'Mouse', price: 25 },
{ id: 3, name: 'Keyboard', price: 75 }
];
// GET all products
app.get('/api/products', (req, res) => {
res.json(products);
});
// GET a single product by ID
app.get('/api/products/:id', (req, res) => {
const productId = parseInt(req.params.id);
const product = products.find(p => p.id === productId);
if (product) {
res.json(product);
} else {
res.status(404).send('Product not found');
}
});
// POST a new product
app.post('/api/products', (req, res) => {
const newProduct = {
id: products.length > 0 ? Math.max(...products.map(p => p.id)) + 1 : 1,
name: req.body.name,
price: req.body.price
};
if (!newProduct.name || !newProduct.price) {
return res.status(400).send('Name and price are required.');
}
products.push(newProduct);
res.status(201).json(newProduct);
});
// Start the server
app.listen(PORT, () => {
console.log(`Product API running on http://localhost:${PORT}`);
});
Advanced usage (Modular routing with middleware)
// app.js
const express = require('express');
const app = express();
const PORT = 3000;
// Import routers
const authRouter = require('./routes/auth');
const productsRouter = require('./routes/products');
// Global middleware for logging requests
app.use((req, res, next) => {
console.log(`${new Date().toISOString()} - ${req.method} ${req.url}`);
next();
});
// Middleware to parse JSON bodies
app.use(express.json());
// Mount routers
app.use('/auth', authRouter); // All /auth routes handled by authRouter
app.use('/api/products', productsRouter); // All /api/products routes handled by productsRouter
// Default route
app.get('/', (req, res) => {
res.send('Welcome to the Advanced Node.js App!');
});
// Handle 404 Not Found
app.use((req, res, next) => {
res.status(404).send("Sorry, can't find that!");
});
// Start the server
app.listen(PORT, () => {
console.log(`Server running on http://localhost:${PORT}`);
});
// routes/auth.js
const express = require('express');
const router = express.Router();
// Middleware specific to auth routes
router.use((req, res, next) => {
console.log('Auth middleware triggered.');
next();
});
router.post('/login', (req, res) => {
const { username, password } = req.body;
if (username === 'admin' && password === 'password') {
res.send('Login successful!');
} else {
res.status(401).send('Invalid credentials.');
}
});
router.post('/register', (req, res) => {
res.send('User registered successfully!');
});
module.exports = router;
// routes/products.js
const express = require('express');
const router = express.Router();
let products = [
{ id: 1, name: 'Advanced Laptop', price: 1500 },
{ id: 2, name: 'Wireless Mouse', price: 30 }
];
// Middleware to check if user is authenticated (dummy)
const isAuthenticated = (req, res, next) => {
const authHeader = req.headers['authorization'];
if (authHeader && authHeader === 'Bearer mysecrettoken') {
next();
} else {
res.status(403).send('Access denied. Please authenticate.');
}
};
router.get('/', (req, res) => {
res.json(products);
});
// This route requires authentication
router.post('/', isAuthenticated, (req, res) => {
const newProduct = {
id: products.length > 0 ? Math.max(...products.map(p => p.id)) + 1 : 1,
name: req.body.name,
price: req.body.price
};
if (!newProduct.name || !newProduct.price) {
return res.status(400).send('Name and price are required.');
}
products.push(newProduct);
res.status(201).json(newProduct);
});
module.exports = router;
Common Mistakes
- Incorrect Order of Routes/Middleware: Express processes routes and middleware in the order they are defined. If a general pattern like app.get('/users/:id') is defined before a more specific one like app.get('/users/create'), the general one will catch requests intended for the specific route. Always define more specific routes before more general ones.
- Forgetting next() in Middleware: Middleware functions must either terminate the request-response cycle (e.g., by calling res.send()) or pass control to the next middleware function by calling next(). Forgetting next() will cause the request to hang indefinitely.
- Not Handling Different HTTP Methods: A common mistake is only defining GET routes and then wondering why POST requests to the same path don't work. Remember to define routes for all HTTP methods your application needs (e.g., app.post(), app.put(), app.delete()).
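Route order can be illustrated without Express at all. The sketch below uses a tiny hypothetical matcher (get and dispatch are made-up names) that scans routes in registration order, the way Express does, and shows the mistake in action:

```javascript
// Routes are tried in the order they were registered, like in Express.
const routes = [];
const get = (path, handler) => routes.push({ path, handler });

// Hypothetical matcher: a ':param' segment matches any single URL segment.
function dispatch(url) {
  for (const { path, handler } of routes) {
    const p = path.split('/');
    const u = url.split('/');
    if (p.length === u.length && p.every((seg, i) => seg.startsWith(':') || seg === u[i])) {
      return handler();
    }
  }
  return '404';
}

get('/users/:id', () => 'user page');      // general route registered FIRST (the mistake)
get('/users/create', () => 'create form'); // specific route registered second

console.log(dispatch('/users/create')); // 'user page' -- the wildcard swallowed it
console.log(dispatch('/users/7'));      // 'user page'
```

Registering /users/create before /users/:id would let the specific route win, which is exactly the ordering rule above.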
Best Practices
- Modularize Routes: For larger applications, organize routes into separate files using express.Router(). This keeps your app.js clean and makes routes easier to manage and test.
- Use Specificity: Define your routes from most specific to least specific. For example, /users/create should come before /users/:id.
- Error Handling: Implement robust error handling middleware at the end of your route definitions to catch unhandled errors and send appropriate responses.
- Meaningful Route Paths: Design intuitive and RESTful API endpoints. For example, /api/products for all products, /api/products/:id for a specific product.
- Middleware for Common Tasks: Leverage middleware for tasks like authentication, logging, body parsing, and validation to keep your route handlers focused on business logic.
Practice Exercises
- Exercise 1 (Basic Route): Create a simple Express application with two routes: /hello that responds with 'Hello there!' and /goodbye that responds with 'See you later!'.
- Exercise 2 (Route Parameters): Modify the application to include a route /greet/:name that takes a name as a parameter and responds with 'Hello, [name]!'.
- Exercise 3 (Query Parameters): Add a route /search that accepts a query parameter keyword. If keyword is provided, respond with 'Searching for: [keyword]'. Otherwise, respond with 'Please provide a search keyword.'.
Mini Project / Task
Build a simple 'To-Do List API' using Express. It should have the following routes:
GET /todos: Returns a list of all to-do items.
POST /todos: Adds a new to-do item (accepts name in the request body).
GET /todos/:id: Returns a specific to-do item by its ID.
Challenge (Optional)
Extend the 'To-Do List API' mini-project. Implement a PUT /todos/:id route to update an existing to-do item (e.g., mark it as complete or change its name). Also, add a DELETE /todos/:id route to remove a to-do item. Ensure proper error handling for non-existent IDs for all routes.
Route Parameters
Route parameters are dynamic values embedded directly inside a URL path. In Node Js applications, especially when using Express, they let you capture parts of a route such as an ID, username, slug, or category name. Instead of creating separate routes like /user/1, /user/2, and /user/3, you define one flexible route such as /user/:id. This makes APIs cleaner, easier to maintain, and much more scalable.
In real projects, route parameters are used in product pages, blog posts, profile pages, order tracking, and REST APIs. For example, an e-commerce app may use /products/:productId, while a social platform may use /users/:username. Route parameters are different from query strings. A query string looks like /products?category=books, while a route parameter looks like /products/:category. Both are useful, but route parameters usually identify a specific resource or path segment.
In Express, route parameters are declared with a colon, such as :id. When a request matches the route, Express stores the value inside req.params. This object contains key-value pairs where the key is the parameter name and the value is the part extracted from the URL. You can define one parameter or multiple parameters in the same route, such as /users/:userId/orders/:orderId. This is common in nested resources.
Step-by-Step Explanation
First, create a route path and place a colon before the dynamic part. Example: /books/:id.
Second, when a client visits a matching URL like /books/42, Express captures 42.
Third, access the value using req.params.id inside the route handler.
Fourth, use that value to fetch data, validate input, or build a response.
Fifth, if you have multiple parameters, access each one by its name from req.params.
Always remember that route parameters are strings by default. If you need a number, convert it before performing numeric logic.
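The string-to-number conversion mentioned above can be sketched as a small helper. parseId is a hypothetical name, not part of Express:

```javascript
// Hypothetical helper for converting a route parameter (always a string)
// into a validated positive integer id.
function parseId(raw) {
  const id = Number(raw);
  // Reject non-integers, zero, and negatives before any lookup logic.
  if (!Number.isInteger(id) || id <= 0) return null;
  return id;
}

console.log(parseId('42'));  // 42
console.log(parseId('abc')); // null
```

Inside a handler you would call parseId(req.params.id) and return a 400 response when it yields null.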
Comprehensive Code Examples
Basic example
const express = require('express');
const app = express();
app.get('/users/:id', (req, res) => {
res.send(`Requested user ID: ${req.params.id}`);
});
app.listen(3000);
Real-world example
const express = require('express');
const app = express();
const products = [
{ id: 1, name: 'Keyboard' },
{ id: 2, name: 'Mouse' }
];
app.get('/products/:productId', (req, res) => {
const productId = Number(req.params.productId);
const product = products.find(p => p.id === productId);
if (!product) {
return res.status(404).send('Product not found');
}
res.json(product);
});
app.listen(3000);
Advanced usage
const express = require('express');
const app = express();
app.get('/users/:userId/orders/:orderId', (req, res) => {
const { userId, orderId } = req.params;
res.json({
message: 'Nested route parameters received',
userId,
orderId
});
});
app.listen(3000);
Common Mistakes
- Forgetting the colon: Writing /users/id instead of /users/:id. Fix: add : before the parameter name.
- Using the wrong object: Trying to read req.query.id instead of req.params.id. Fix: use req.params for path values.
- Not converting data types: Comparing string IDs to numbers may fail. Fix: use Number(req.params.id) when needed.
- Ignoring validation: Accepting any parameter value can cause bugs. Fix: check format, length, or allowed values before using it.
Best Practices
- Use clear parameter names such as :userId and :postSlug.
- Validate route parameter values before querying a database.
- Keep route design consistent across the app.
- Use nested parameters only when the relationship is meaningful.
- Return proper HTTP status codes like 404 for missing resources and 400 for invalid input.
Practice Exercises
- Create a route /books/:title that returns the requested book title.
- Create a route /students/:studentId and respond with a sentence containing the ID.
- Create a nested route /courses/:courseId/lessons/:lessonId and return both values in JSON format.
Mini Project / Task
Build a small product lookup API with a route /items/:id. Store at least five items in an array, find the matching item by ID, and return it as JSON. If the item does not exist, send a 404 response.
Challenge (Optional)
Create a route like /users/:username/posts/:postId that validates whether postId is a number and returns a custom error message if the URL contains invalid data.
Middleware
Middleware in Node Js is a function that runs between receiving a request and sending a response. It exists to let developers add reusable processing steps such as logging, authentication, parsing JSON, validating input, handling errors, and attaching shared data to the request object. In real applications, middleware is used in APIs, dashboards, e-commerce systems, admin tools, and authentication pipelines. Instead of writing the same logic inside every route, you place it in middleware and apply it globally, to a router, or to a single endpoint.
In Express-based Node Js apps, middleware commonly receives req, res, and next. The req object represents the incoming request, res is used to send the response, and next() passes control to the next middleware in the chain. There are several common categories. Application middleware runs for many routes. Router middleware runs inside grouped routes. Built-in middleware includes helpers like express.json(). Error-handling middleware has four parameters: err, req, res, next. Third-party middleware includes packages such as CORS, Helmet, and Morgan.
Step-by-Step Explanation
First, create an Express app and define a middleware function. A middleware can do one of three things: change the request, change the response, or end the request-response cycle. If it does not end the cycle, it must call next().
Second, register middleware with app.use() for global usage, or attach it to a route like app.get('/path', middlewareFn, handler). Order matters. Middleware executes from top to bottom, so a logger placed before routes runs early, while an error handler should be placed near the end.
Third, use middleware carefully. If you forget next() and also do not send a response, the request will hang. If you send a response and then call next(), later code may try to send another response and cause an error.
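The chain behaviour described in these steps can be simulated without Express. This is a simplified sketch of the dispatch idea, not Express's actual implementation:

```javascript
// Minimal sketch of middleware dispatch: each function gets (req, res, next)
// and the chain only advances when next() is called.
function runChain(middlewares, req, res) {
  function dispatch(i) {
    if (i >= middlewares.length) return;
    middlewares[i](req, res, () => dispatch(i + 1));
  }
  dispatch(0);
}

const order = [];
runChain([
  (req, res, next) => { order.push('logger'); next(); },
  (req, res, next) => { order.push('auth'); next(); },
  (req, res) => { order.push('handler'); } // ends the chain: no next()
], {}, {});
console.log(order.join(' -> ')); // logger -> auth -> handler
```

Registration order is exactly the array order here, which is why a logger belongs near the top and an error handler near the end.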
Comprehensive Code Examples
const express = require('express');
const app = express();
function logger(req, res, next) {
console.log(`${req.method} ${req.url}`);
next();
}
app.use(logger);
app.get('/', (req, res) => {
res.send('Home page');
});
app.listen(3000);
const express = require('express');
const app = express();
app.use(express.json());
function validateProduct(req, res, next) {
const { name, price } = req.body;
if (!name || typeof price !== 'number') {
return res.status(400).json({ error: 'Invalid product data' });
}
next();
}
app.post('/products', validateProduct, (req, res) => {
res.status(201).json({ message: 'Product created', data: req.body });
});
app.listen(3000);
const express = require('express');
const app = express();
function auth(req, res, next) {
const token = req.headers.authorization;
if (token !== 'Bearer secret123') {
return res.status(401).json({ error: 'Unauthorized' });
}
req.user = { role: 'admin' };
next();
}
app.get('/admin', auth, (req, res) => {
res.json({ message: 'Welcome', user: req.user });
});
app.use((err, req, res, next) => {
res.status(500).json({ error: 'Server error' });
});
app.listen(3000);
Common Mistakes
- Forgetting next(): If middleware does not end the response, always call next().
- Wrong order: Putting JSON parsing or auth after routes means routes cannot use them correctly. Register important middleware first.
- Sending multiple responses: Use return res.status(...) when ending early so execution stops.
- Using normal middleware for errors: Error handlers must use four parameters.
Best Practices
- Keep middleware small: One clear responsibility per function.
- Name middleware clearly: Use names like logger, authenticateUser, and validateOrder.
- Apply middleware at the correct scope: Global for shared logic, route-level for specific checks.
- Handle errors consistently: Centralize unexpected failures in one error-handling middleware.
- Avoid heavy work inside middleware: Keep request flow fast and predictable.
Practice Exercises
- Create a middleware that prints the current request method and path for every request.
- Build a middleware that checks whether a query parameter named apiKey exists before allowing access.
- Write a route-specific middleware that blocks access to /admin unless a header called role equals admin.
Mini Project / Task
Build a small API with routes for /books. Add one middleware for logging, one for JSON body parsing, and one validation middleware that ensures every new book has a title and author before saving.
Challenge (Optional)
Create a middleware chain for a protected route that logs the request, validates a token, attaches user data to req, and then allows the final handler to return a personalized JSON response.
Error Handling
Error handling in Node Js is the process of detecting, responding to, and recovering from problems in your application. Errors happen in real systems all the time: a file may not exist, a database connection may fail, user input may be invalid, or an external API may return unexpected data. Without proper error handling, your app can crash, leak sensitive details, or behave unpredictably. In backend development, this matters because users expect reliable systems, and developers need clear logs to diagnose failures.
In Node Js, errors appear in several forms. Synchronous errors happen during normal blocking code and can usually be caught with try/catch. Callback-based APIs often return errors as the first argument, commonly called the error-first callback pattern. Promise-based code uses .catch() for failures, and async/await lets you handle those failures with try/catch in a cleaner style. Node Js also allows custom error objects so you can attach codes, messages, and context. Finally, process-level handlers such as uncaughtException and unhandledRejection can catch unhandled failures, though they should be used carefully as safety nets rather than normal control flow.
Step-by-Step Explanation
Start with synchronous code. If a statement may throw an error, place it inside try. If an error occurs, execution jumps to catch. This is useful for parsing JSON or manually throwing validation errors.
For callback APIs, check the first parameter inside the callback. If it contains an error, handle it immediately and return early.
For Promises, attach .catch() to handle rejected values. With async/await, wrap awaited code inside try/catch.
When needed, create custom errors using class MyError extends Error. This helps separate validation issues from database or network failures. Good error handling means logging enough context for developers while returning safe, simple messages to users.
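Handling different error classes differently can be sketched with an instanceof check that maps error types to safe HTTP responses. toHttpError is an illustrative helper, not a standard API:

```javascript
// Assumed custom error class, following the class MyError extends Error pattern.
class ValidationError extends Error {
  constructor(message) {
    super(message);
    this.name = 'ValidationError';
  }
}

// Map known error types to safe, user-facing responses.
function toHttpError(error) {
  if (error instanceof ValidationError) {
    return { status: 400, message: error.message }; // client mistake: safe to echo
  }
  return { status: 500, message: 'Internal server error' }; // hide internals
}

console.log(toHttpError(new ValidationError('Email format is invalid')));
console.log(toHttpError(new Error('db connection lost')));
```

Note how the generic branch never leaks the original message to the user; it would be logged internally instead.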
Comprehensive Code Examples
try {
const user = JSON.parse('{"name":"Ava"}');
console.log(user.name);
} catch (error) {
console.error('Invalid JSON:', error.message);
}
const fs = require('fs');
fs.readFile('config.json', 'utf8', (error, data) => {
if (error) {
console.error('Could not read file:', error.message);
return;
}
console.log('File loaded:', data);
});
async function getUserData(id) {
try {
if (!id) {
throw new Error('User id is required');
}
const response = await fetch(`https://api.example.com/users/${id}`);
if (!response.ok) {
throw new Error(`Request failed with status ${response.status}`);
}
const data = await response.json();
return data;
} catch (error) {
console.error('Failed to get user data:', error.message);
throw error;
}
}
class ValidationError extends Error {
constructor(message) {
super(message);
this.name = 'ValidationError';
}
}
function registerUser(email) {
if (!email.includes('@')) {
throw new ValidationError('Email format is invalid');
}
return 'User registered';
}
Common Mistakes
- Using try/catch around asynchronous callbacks: it will not catch errors thrown later inside many callback operations. Use callback error parameters, Promise .catch(), or async/await.
- Ignoring returned errors: beginners often log success without checking error first in callbacks. Always handle the error path before using the result.
- Exposing internal details to users: sending stack traces to clients is unsafe. Log technical details internally and return friendly messages externally.
Best Practices
- Fail early by validating inputs before expensive operations.
- Use custom error classes for clearer error categories.
- Return or rethrow after handling an error to avoid continuing in a broken state.
- Log useful context such as request id, file name, or user action.
- Treat global process handlers as emergency protection, not routine logic.
Practice Exercises
- Create a function that parses a JSON string and prints a friendly message if parsing fails.
- Read a text file with fs.readFile() and handle the case where the file does not exist.
- Write an async function that throws an error when an argument is missing and catches it where the function is called.
Mini Project / Task
Build a small Node Js script called safe-config-loader that reads a JSON configuration file, validates that it contains a port value, and prints clear error messages when the file is missing, malformed, or incomplete.
Challenge (Optional)
Create a custom error system with at least two classes, such as ValidationError and DatabaseError, then write one function that throws each type and another function that handles them differently.
Serving Static Files
Serving static files means sending files such as HTML, CSS, JavaScript, images, fonts, PDFs, and videos directly from the server to the browser without generating them dynamically for each request. In Node Js, this is a common requirement because every website needs assets for layout, styling, client-side behavior, and media. For example, when a browser loads a page, it may request /index.html, /styles.css, /app.js, and several image files. A Node Js server must correctly find those files, read them, set the correct content type, and return them safely.
In real projects, static file serving is used in company websites, dashboards, admin panels, documentation sites, and frontend builds from frameworks. The main ideas include file paths, MIME types, route mapping, and security.
You can serve static files manually with Node's built-in modules such as http, fs, and path, or use a framework like Express with express.static(). Manual serving helps beginners understand what happens under the hood, while Express is the professional choice for most applications because it is shorter, cleaner, and less error-prone.
Step-by-Step Explanation
First, create a public folder such as public/ and place files inside it, for example index.html, style.css, and logo.png. Second, create a server using the http module or Express. Third, map the incoming URL to a file path. If the user requests /, you usually return index.html. Fourth, detect the file extension and send the proper content type such as text/html, text/css, or image/png. Fifth, read the file and send it in the response. If the file does not exist, return a 404 response. Finally, protect your application from path traversal attacks by resolving paths carefully and limiting access to the intended directory only.
Comprehensive Code Examples
Basic example with Node Js core modules
const http = require('http');
const fs = require('fs');
const path = require('path');
const server = http.createServer((req, res) => {
let filePath = req.url === '/' ? '/index.html' : req.url;
filePath = path.join(__dirname, 'public', filePath);
const ext = path.extname(filePath);
const types = { '.html': 'text/html', '.css': 'text/css', '.js': 'application/javascript', '.png': 'image/png', '.jpg': 'image/jpeg' };
const contentType = types[ext] || 'application/octet-stream';
fs.readFile(filePath, (err, data) => {
if (err) {
res.writeHead(404, { 'Content-Type': 'text/plain' });
return res.end('File not found');
}
res.writeHead(200, { 'Content-Type': contentType });
res.end(data);
});
});
server.listen(3000);
Real-world example with Express
const express = require('express');
const path = require('path');
const app = express();
app.use('/static', express.static(path.join(__dirname, 'public')));
app.listen(3000, () => {
console.log('Server running on port 3000');
});
Advanced usage with cache headers
const express = require('express');
const path = require('path');
const app = express();
app.use(express.static(path.join(__dirname, 'public'), {
maxAge: '1d',
index: 'index.html'
}));
app.listen(3000);
Common Mistakes
- Using wrong file paths: Use path.join(__dirname, 'public') instead of hardcoded strings.
- Ignoring content types: Browsers may misread files if Content-Type is missing or wrong.
- Not handling missing files: Always return a 404 response when the file does not exist.
- Exposing unsafe paths: Never allow user input to access files outside the static directory.
Best Practices
- Store public assets in a dedicated folder like public.
- Use Express static middleware for most projects because it is reliable and simple.
- Set proper cache rules for performance, especially for images and frontend bundles.
- Keep file names predictable and organized by type, such as css/, js/, and images/.
Practice Exercises
- Create a Node Js server that serves an index.html file when the browser opens the root URL.
- Add support for serving a CSS file and an image from a public folder.
- Build a 404 response for any file that does not exist in the static directory.
Mini Project / Task
Build a small portfolio website server in Node Js that serves index.html, a stylesheet, a JavaScript file, and at least two images from a static assets folder.
Challenge (Optional)
Create your own static file server with Node's core modules that blocks path traversal attempts and automatically serves index.html when a folder route is requested.
Working with JSON
JSON, or JavaScript Object Notation, is a lightweight format for storing and exchanging data. It exists because applications need a simple, readable way to move structured information between systems. In real life, JSON is used in REST APIs, configuration files, database exports, frontend-backend communication, and third-party integrations. In Node Js, JSON is especially important because JavaScript objects and JSON look similar, making it easy to convert data between program memory and text sent over networks or saved to files.
JSON supports key-value pairs, arrays, numbers, strings, booleans, null, and nested objects. However, JSON is not identical to a JavaScript object. For example, JSON keys and string values must use double quotes, and JSON cannot contain functions, comments, or undefined values. In Node Js, the two most common operations are converting an object into JSON text with JSON.stringify() and converting JSON text into a JavaScript object with JSON.parse(). You may also load static JSON files in some workflows, but understanding parsing and serialization is the real foundation.
Step-by-Step Explanation
To turn a JavaScript object into JSON, use JSON.stringify(value). This is useful before sending data in an API response or writing it to a file. To turn JSON text back into a usable JavaScript object, use JSON.parse(text). Beginners should remember that stringify creates a string, while parse reads a string and returns an object or array.
JSON can represent objects and arrays. Objects store named properties, while arrays store ordered lists. In backend development, you often receive JSON from requests, validate it, process it, and return JSON in responses. When working with files, Node Js commonly uses the fs module to read a JSON file as text and then parse it.
Comprehensive Code Examples
const user = { name: "Ava", age: 25, isAdmin: false };
const jsonText = JSON.stringify(user);
console.log(jsonText);
const parsedUser = JSON.parse(jsonText);
console.log(parsedUser.name);
const fs = require('fs');
const settings = { theme: "dark", notifications: true, version: 1 };
fs.writeFileSync('settings.json', JSON.stringify(settings, null, 2));
const fileData = fs.readFileSync('settings.json', 'utf8');
const savedSettings = JSON.parse(fileData);
console.log(savedSettings.theme);
const http = require('http');
const server = http.createServer((req, res) => {
const products = [
{ id: 1, name: "Laptop", price: 1200 },
{ id: 2, name: "Mouse", price: 25 }
];
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify(products));
});
server.listen(3000, () => console.log('Server running on port 3000'));
The first example shows basic conversion. The second shows file storage, which is common for config data or simple local persistence. The third shows advanced usage in an HTTP server, where JSON is returned to clients such as browsers or mobile apps.
Common Mistakes
- Parsing an object instead of a string: JSON.parse() only accepts valid JSON text. Fix it by passing a string, not a JavaScript object.
- Using single quotes in JSON text: JSON requires double quotes for keys and strings. Fix invalid JSON formatting before parsing.
- Ignoring parse errors: bad JSON will crash your code. Fix it by wrapping JSON.parse() inside try...catch when input may be unsafe.
- Forgetting stringify before saving or sending: writing raw objects to files or responses can fail or behave unexpectedly. Convert them first.
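The try...catch fix for parse errors is often wrapped in a small helper; safeParse is a hypothetical name:

```javascript
// Parse JSON text without crashing: return a fallback on malformed input.
function safeParse(text, fallback = null) {
  try {
    return JSON.parse(text);
  } catch (error) {
    return fallback;
  }
}

console.log(safeParse('{"theme":"dark"}'));   // { theme: 'dark' }
console.log(safeParse('not valid json', {})); // {}
```

This pattern is useful anywhere the input may be unsafe: request bodies, file content, or external API responses.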
Best Practices
- Use JSON.stringify(data, null, 2) for readable file output during development.
- Validate incoming JSON data before trusting or storing it.
- Use try...catch around parsing for request bodies, file content, or external API responses.
- Keep JSON data clean and predictable with consistent property names.
- Set the response header to application/json when sending JSON from servers.
Practice Exercises
- Create a JavaScript object for a book with title, author, and price, then convert it to JSON and print it.
- Write a script that saves an array of student names to a students.json file and reads it back.
- Create a JSON string representing a product, parse it, and print only the product name and price.
Mini Project / Task
Build a small Node Js script that stores a to-do list in a JSON file. It should create an array of tasks, save it to disk, read it back, parse it, and print all task titles to the console.
Challenge (Optional)
Create a script that reads a JSON file containing users, filters only active users, and writes the filtered result into a new JSON file with pretty formatting.
Handling Forms
Handling forms in Node Js means receiving data submitted by users from a browser, processing it on the server, validating it, and sending back a response. Forms are everywhere in real applications: login pages, registration screens, search boxes, checkout pages, feedback forms, and profile editors. A browser usually sends form data to the server using HTTP methods such as GET and POST. In Node Js, form handling is commonly done with the built-in http module for learning basics, or with frameworks like Express in production. The two main submission patterns are query-string based GET requests and body-based POST requests. With GET, form values appear in the URL and are useful for search forms. With POST, values are sent in the request body and are better for passwords, profile data, and larger payloads. A complete form workflow includes rendering an HTML form, reading incoming data, parsing it, validating required fields, handling errors, and returning a useful message or saving data to a database.
Step-by-Step Explanation
First, create a server that can respond to browser requests. Next, send an HTML page containing a form using inputs and a submit button. Then set the form method and action. When the user submits the form, the browser sends data to the route in action. For a GET form, read values from the URL using the url module. For a POST form, listen to request chunks using req.on('data') and finish with req.on('end'). After collecting the raw body, parse it with URLSearchParams if the form uses standard URL-encoded data. Then validate input values. For example, make sure a name is not empty, an email contains expected characters, or a password meets length rules. Finally, return a clear response and never trust browser input without checking it on the server.
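The parsing step above can be tried on its own; the body string below is a stand-in for what the data and end events would accumulate:

```javascript
// A raw URL-encoded body as the browser would send it for a simple form.
const body = 'username=Ava&email=ava%40example.com';

// URLSearchParams handles splitting on & and percent-decoding values.
const data = new URLSearchParams(body);
console.log(data.get('username')); // Ava
console.log(data.get('email'));    // ava@example.com
```

Missing fields come back as null from get(), which is a convenient hook for the validation step.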
Comprehensive Code Examples
const http = require('http');
http.createServer((req, res) => {
if (req.url === '/' && req.method === 'GET') {
res.writeHead(200, { 'Content-Type': 'text/html' });
res.end(`
<form method="POST" action="/submit">
<input name="username" placeholder="Your name">
<button type="submit">Send</button>
</form>
`);
} else if (req.url === '/submit' && req.method === 'POST') {
let body = '';
req.on('data', chunk => body += chunk);
req.on('end', () => {
const data = new URLSearchParams(body);
const username = data.get('username');
res.end(`Hello, ${username}`);
});
}
}).listen(3000);
const http = require('http');
const url = require('url');
http.createServer((req, res) => {
if (req.url.startsWith('/search') && req.method === 'GET') {
const parsed = url.parse(req.url, true);
const term = parsed.query.term || 'nothing';
res.end(`Searching for: ${term}`);
}
}).listen(3000);
const http = require('http');
http.createServer((req, res) => {
if (req.url === '/register' && req.method === 'POST') {
let body = '';
req.on('data', chunk => body += chunk);
req.on('end', () => {
const data = new URLSearchParams(body);
const email = data.get('email');
const password = data.get('password');
if (!email || !password || password.length < 6) {
res.writeHead(400);
return res.end('Invalid input');
}
res.end('Registration successful');
});
}
}).listen(3000);
Common Mistakes
- Using GET for sensitive data like passwords. Fix: use POST for private form submissions.
- Forgetting to collect the full request body before parsing. Fix: wait for the end event.
- Trusting user input without validation. Fix: check required fields, length, and format on the server.
Best Practices
- Use server-side validation even if the browser also validates fields.
- Return helpful status codes such as 200, 400, and 404.
- Keep form routes clear, such as /contact and /register.
- Sanitize input before storing or displaying it to reduce security risks.
Practice Exercises
- Create a form that accepts a user name and displays a greeting after submission.
- Build a search form using GET and display the search term from the URL.
- Create a registration form with email and password, then reject submissions with empty fields.
Mini Project / Task
Build a simple contact form in Node Js that accepts name, email, and message, validates all three fields, and returns either an error message or a success response.
Challenge (Optional)
Extend your form handler so that one route supports both rendering the form with GET and processing submitted data with POST, while showing different responses for valid and invalid input.
REST API Basics
REST API basics introduce the standard way applications exchange data over the web. REST stands for Representational State Transfer, a style for designing web services where clients send HTTP requests and servers return structured responses, often in JSON. In real life, REST APIs power mobile apps fetching user profiles, e-commerce sites loading products, and dashboards updating analytics. In Node Js, REST APIs are commonly built with the built-in http module or frameworks like Express because backend systems need organized endpoints, clear request methods, and predictable responses.
A REST API usually revolves around resources such as users, products, orders, or tasks. Each resource gets a route like /users or /tasks/1. HTTP methods describe the action: GET reads data, POST creates data, PUT replaces data, PATCH partially updates data, and DELETE removes data. Status codes add meaning to responses, such as 200 for success, 201 for created, 400 for bad requests, 404 for not found, and 500 for server errors. Good APIs also use nouns in routes, return JSON consistently, and avoid mixing routing logic with unrelated business logic.
Step-by-Step Explanation
Start by creating a server and defining routes. A route is a URL path paired with an HTTP method. When a request arrives, your server checks the method and path, then sends a response. For example, GET /tasks might return all tasks, while GET /tasks/1 returns one task. When returning JSON, set the Content-Type header to application/json. For requests that send data, such as POST, the server reads the request body, parses JSON, validates input, then stores or processes it. Beginners should think of the flow as: receive request, identify route, read data if needed, perform logic, send JSON response with the correct status code.
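The flow of "receive request, identify route, send JSON response" can be sketched as a plain lookup table; the routes and handler shapes here are illustrative assumptions, not a framework API:

```javascript
// Route table keyed by "METHOD path"; each handler returns a status and body.
const routes = {
  'GET /tasks': () => ({ status: 200, body: [{ id: 1, title: 'Learn REST' }] }),
  'POST /tasks': (data) => ({ status: 201, body: { id: 2, title: data.title } })
};

function handle(method, path, data) {
  const handler = routes[`${method} ${path}`];
  if (!handler) return { status: 404, body: { error: 'Route not found' } };
  return handler(data);
}

console.log(handle('GET', '/tasks').status);                            // 200
console.log(handle('POST', '/tasks', { title: 'Write tests' }).status); // 201
console.log(handle('GET', '/missing').status);                          // 404
```

Express does the same method-plus-path matching for you, but seeing it as a table makes the status code choices easier to reason about.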
Comprehensive Code Examples
Basic example
const http = require('http');
const server = http.createServer((req, res) => {
if (req.method === 'GET' && req.url === '/api/hello') {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ message: 'Hello from API' }));
} else {
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Route not found' }));
}
});
server.listen(3000);
Real-world example
const express = require('express');
const app = express();
app.use(express.json());
let tasks = [{ id: 1, title: 'Learn REST' }];
app.get('/api/tasks', (req, res) => res.json(tasks));
app.post('/api/tasks', (req, res) => {
const newTask = { id: tasks.length + 1, title: req.body.title };
tasks.push(newTask);
res.status(201).json(newTask);
});
app.listen(3000);
Advanced usage
app.patch('/api/tasks/:id', (req, res) => {
const task = tasks.find(t => t.id === Number(req.params.id));
if (!task) return res.status(404).json({ error: 'Task not found' });
if (req.body.title !== undefined) task.title = req.body.title;
res.json({ message: 'Task updated', task });
});
app.delete('/api/tasks/:id', (req, res) => {
const index = tasks.findIndex(t => t.id === Number(req.params.id));
if (index === -1) return res.status(404).json({ error: 'Task not found' });
tasks.splice(index, 1);
res.json({ message: 'Task deleted' });
});
Common Mistakes
- Using wrong HTTP methods: Do not use GET to create data. Use POST for creation and DELETE for removal.
- Forgetting JSON middleware: In Express, add app.use(express.json()) before reading req.body.
- Ignoring status codes: Always return meaningful codes like 201 after creation or 404 when a resource does not exist.
Best Practices
- Use plural resource names like /users and /orders.
- Return consistent JSON shapes for success and error responses.
- Validate incoming data before saving or updating resources.
- Keep routes simple and move complex logic into separate functions or services.
Practice Exercises
- Create a GET /api/books route that returns an array of three book objects in JSON.
- Build a POST /api/books route that accepts a title and author and returns the created book with status 201.
- Add a DELETE /api/books/:id route that removes a book from an in-memory array and returns a message.
Mini Project / Task
Build a small REST API for a notes app with routes to create, list, update, and delete notes using JSON responses and correct HTTP status codes.
Challenge (Optional)
Extend your notes API by adding query filtering, such as returning only notes that match a keyword in the title or content.
CRUD API
A CRUD API (Create, Read, Update, Delete Application Programming Interface) is a fundamental component of almost any modern web application. It defines a set of operations that allow users or other applications to interact with a persistent data store, typically a database. In the context of Node.js, building a CRUD API involves setting up a server that listens for incoming HTTP requests, processes them, interacts with a database, and sends back appropriate responses. This pattern is ubiquitous because it mirrors the basic operations needed to manage data in any system. For instance, in a social media application, creating a post (Create), viewing your feed (Read), editing a post (Update), and deleting a post (Delete) all rely on CRUD operations. Similarly, an e-commerce site uses CRUD for managing products, users, and orders. Understanding and implementing a CRUD API is a cornerstone of backend development with Node.js, enabling you to build dynamic and data-driven applications.
The primary purpose of a CRUD API is to provide an interface for interacting with data resources. Each letter in CRUD corresponds to a standard HTTP method: 'Create' typically maps to POST, 'Read' to GET, 'Update' to PUT or PATCH, and 'Delete' to DELETE. This RESTful approach makes the API predictable and easy to understand. Node.js, with its asynchronous, event-driven architecture, is particularly well-suited for building fast and scalable CRUD APIs. Frameworks like Express.js simplify the routing and middleware aspects, making development efficient.
Core Concepts & Sub-types
While CRUD itself is a set of operations, its implementation often involves several core concepts:
- RESTful API Design: Adhering to Representational State Transfer (REST) principles is common. This involves using standard HTTP methods, stateless communication, and resource-based URLs (e.g., /users, /products/:id).
- HTTP Methods: Each CRUD operation maps to an HTTP method. POST for creating new resources, GET for retrieving resources, PUT/PATCH for updating existing ones, and DELETE for removing them.
- Statelessness: Each request from a client to a server must contain all the information needed to understand the request. The server should not store any client context between requests.
- Resources: Data entities are treated as resources, uniquely identified by URLs.
- JSON (JavaScript Object Notation): The most common data format for sending and receiving data in Node.js APIs due to its lightweight nature and native JavaScript compatibility.
- Database Interaction: APIs need to interact with a database (e.g., MongoDB, PostgreSQL, MySQL) to store and retrieve data. This often involves ORMs (Object-Relational Mappers) or ODMs (Object-Document Mappers) like Mongoose for MongoDB.
- Routing: Defining different endpoints (URLs) and associating them with specific handler functions to process requests.
- Middleware: Functions that have access to the request object (req), the response object (res), and the next middleware function in the application's request-response cycle. Used for tasks like authentication, logging, and parsing request bodies.
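Middleware is simple enough to see in a few lines. The sketch below assumes Express's standard (req, res, next) signature; the requestLogger name and the stub objects are illustrative, not part of any library:

```javascript
// A request logger written against Express's (req, res, next) middleware
// signature: it records the method and URL, then passes control along.
function requestLogger(req, res, next) {
  console.log(`${req.method} ${req.url}`);
  next();
}

// Middleware is plain enough to exercise without a server, using stubs:
const calls = [];
requestLogger(
  { method: 'GET', url: '/items' }, // stub request
  {},                               // stub response (unused here)
  () => calls.push('next called')   // stub next()
);
```

In a real app you would register it with `app.use(requestLogger)` before your routes.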
Step-by-Step Explanation
Let's break down how to build a basic CRUD API using Node.js and Express.js, with a focus on a simple in-memory data store for clarity, before moving to a database.
1. Initialize Project: Start by creating a new Node.js project and installing Express.
npm init -y
npm install express

2. Create an Express App: Set up your main server file (e.g., app.js or server.js).

const express = require('express');
const app = express();
const PORT = process.env.PORT || 3000;
// Middleware to parse JSON bodies
app.use(express.json());
// Start the server
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});

3. Define Data Storage: For simplicity, we'll use an array of objects as our 'database'. In a real application, this would be a database connection.
let items = [
{ id: '1', name: 'Item A', description: 'Description for Item A' },
{ id: '2', name: 'Item B', description: 'Description for Item B' }
];

4. Implement CRUD Endpoints:
- CREATE (POST /items): Add a new item. Generate a unique ID.
- READ (GET /items): Get all items.
- READ (GET /items/:id): Get a single item by ID.
- UPDATE (PUT /items/:id): Update an existing item.
- DELETE (DELETE /items/:id): Remove an item.
Comprehensive Code Examples
Basic Example (In-memory CRUD)
This example demonstrates a full CRUD API for 'items' using an in-memory array.
const express = require('express');
const app = express();
const PORT = process.env.PORT || 3000;
app.use(express.json()); // Middleware to parse JSON request bodies
let items = [
{ id: '1', name: 'Laptop', description: 'High performance laptop' },
{ id: '2', name: 'Mouse', description: 'Wireless optical mouse' }
];
let nextId = 3; // Simple counter for new IDs
// GET all items
app.get('/items', (req, res) => {
res.status(200).json(items);
});
// GET item by ID
app.get('/items/:id', (req, res) => {
const { id } = req.params;
const item = items.find(item => item.id === id);
if (item) {
res.status(200).json(item);
} else {
res.status(404).json({ message: 'Item not found' });
}
});
// POST a new item (CREATE)
app.post('/items', (req, res) => {
const { name, description } = req.body;
if (!name || !description) {
return res.status(400).json({ message: 'Name and description are required' });
}
const newItem = { id: String(nextId++), name, description };
items.push(newItem);
res.status(201).json(newItem); // 201 Created
});
// PUT update an item by ID (UPDATE)
app.put('/items/:id', (req, res) => {
const { id } = req.params;
const { name, description } = req.body;
const itemIndex = items.findIndex(item => item.id === id);
if (itemIndex > -1) {
items[itemIndex] = { ...items[itemIndex], name, description };
res.status(200).json(items[itemIndex]);
} else {
res.status(404).json({ message: 'Item not found' });
}
});
// DELETE an item by ID (DELETE)
app.delete('/items/:id', (req, res) => {
const { id } = req.params;
const initialLength = items.length;
items = items.filter(item => item.id !== id);
if (items.length < initialLength) {
res.status(204).send(); // 204 No Content
} else {
res.status(404).json({ message: 'Item not found' });
}
});
app.listen(PORT, () => {
console.log(`Server running on http://localhost:${PORT}`);
});

Real-world Example (MongoDB Integration with Mongoose)
This example shows how to integrate a MongoDB database using Mongoose. First, install Mongoose:
npm install mongoose

const express = require('express');
const mongoose = require('mongoose');
const app = express();
const PORT = process.env.PORT || 3000;
app.use(express.json());
// Connect to MongoDB
mongoose.connect('mongodb://localhost:27017/mycruddb');
// Note: the useNewUrlParser and useUnifiedTopology options seen in older
// tutorials are no-ops since Mongoose 6 and can be omitted.
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', () => {
console.log('Connected to MongoDB');
});
// Define a Schema and Model
const itemSchema = new mongoose.Schema({
name: { type: String, required: true },
description: String,
createdAt: { type: Date, default: Date.now }
});
const Item = mongoose.model('Item', itemSchema);
// Routes for CRUD operations
// GET all items
app.get('/items', async (req, res) => {
try {
const items = await Item.find();
res.status(200).json(items);
} catch (err) {
res.status(500).json({ message: err.message });
}
});
// GET single item
app.get('/items/:id', async (req, res) => {
try {
const item = await Item.findById(req.params.id);
if (item == null) {
return res.status(404).json({ message: 'Cannot find item' });
}
res.status(200).json(item);
} catch (err) {
res.status(500).json({ message: err.message });
}
});
// POST create a new item
app.post('/items', async (req, res) => {
const item = new Item({
name: req.body.name,
description: req.body.description
});
try {
const newItem = await item.save();
res.status(201).json(newItem);
} catch (err) {
res.status(400).json({ message: err.message }); // 400 Bad Request
}
});
// PATCH update an item (using PATCH for partial updates, PUT for full replacement)
app.patch('/items/:id', async (req, res) => {
try {
const item = await Item.findById(req.params.id);
if (item == null) {
return res.status(404).json({ message: 'Cannot find item' });
}
if (req.body.name != null) {
item.name = req.body.name;
}
if (req.body.description != null) {
item.description = req.body.description;
}
const updatedItem = await item.save();
res.status(200).json(updatedItem);
} catch (err) {
res.status(400).json({ message: err.message });
}
});
// DELETE an item
app.delete('/items/:id', async (req, res) => {
try {
const item = await Item.findById(req.params.id);
if (item == null) {
return res.status(404).json({ message: 'Cannot find item' });
}
await item.deleteOne(); // Mongoose 6+ uses deleteOne() or deleteMany()
res.status(204).send(); // 204 No Content
} catch (err) {
res.status(500).json({ message: err.message });
}
});
app.listen(PORT, () => {
console.log(`Server running on http://localhost:${PORT}`);
});

Advanced Usage (Error Handling and Validation)
This demonstrates more robust error handling and basic input validation, crucial for production APIs.
const express = require('express');
const mongoose = require('mongoose');
const app = express();
const PORT = process.env.PORT || 3000;
app.use(express.json());
mongoose.connect('mongodb://localhost:27017/myadvancedcruddb');
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'connection error:'));
db.once('open', () => {
console.log('Connected to MongoDB for advanced example');
});
const productSchema = new mongoose.Schema({
name: { type: String, required: true, minlength: 3, maxlength: 50 },
price: { type: Number, required: true, min: 0 },
category: { type: String, enum: ['Electronics', 'Books', 'Clothing', 'Food'], default: 'Electronics' },
inStock: { type: Boolean, default: true }
});
const Product = mongoose.model('Product', productSchema);
// Middleware to get product by ID and handle 404
async function getProduct(req, res, next) {
let product;
try {
product = await Product.findById(req.params.id);
if (product == null) {
return res.status(404).json({ message: 'Cannot find product' });
}
} catch (err) {
return res.status(500).json({ message: err.message });
}
res.product = product;
next();
}
// GET all products
app.get('/products', async (req, res) => {
try {
const products = await Product.find();
res.status(200).json(products);
} catch (err) {
res.status(500).json({ message: err.message });
}
});
// GET single product
app.get('/products/:id', getProduct, (req, res) => {
res.status(200).json(res.product);
});
// POST create a new product
app.post('/products', async (req, res) => {
const { name, price, category, inStock } = req.body;
const product = new Product({ name, price, category, inStock });
try {
const newProduct = await product.save();
res.status(201).json(newProduct);
} catch (err) {
// Mongoose validation errors have specific structure
if (err.name === 'ValidationError') {
let errors = {};
for (const field in err.errors) { // const avoids leaking an implicit global
errors[field] = err.errors[field].message;
}
return res.status(400).json({ message: 'Validation failed', errors });
}
res.status(500).json({ message: err.message });
}
});
// PUT update a product (full replacement)
app.put('/products/:id', getProduct, async (req, res) => {
const { name, price, category, inStock } = req.body;
res.product.name = name ?? res.product.name;
res.product.price = price ?? res.product.price; // ?? (not ||) keeps valid falsy values like 0
res.product.category = category ?? res.product.category;
res.product.inStock = inStock ?? res.product.inStock;
try {
const updatedProduct = await res.product.save();
res.status(200).json(updatedProduct);
} catch (err) {
if (err.name === 'ValidationError') {
let errors = {};
for (const field in err.errors) { // const avoids leaking an implicit global
errors[field] = err.errors[field].message;
}
return res.status(400).json({ message: 'Validation failed', errors });
}
res.status(400).json({ message: err.message });
}
});
// DELETE a product
app.delete('/products/:id', getProduct, async (req, res) => {
try {
await res.product.deleteOne();
res.status(204).send();
} catch (err) {
res.status(500).json({ message: err.message });
}
});
// Generic Error Handling Middleware
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(500).send('Something broke!');
});
app.listen(PORT, () => {
console.log(`Advanced Server running on http://localhost:${PORT}`);
});

Common Mistakes
- Missing app.use(express.json()): Forgetting to include this middleware means req.body will be undefined when trying to read JSON data from POST/PUT/PATCH requests. Fix: Always include app.use(express.json()); early in your Express app setup.
- Incorrect HTTP Methods: Using GET for creating data or POST for retrieving specific resources. This violates REST principles and can lead to unexpected behavior or security issues. Fix: Adhere to standard HTTP methods: POST for Create, GET for Read, PUT/PATCH for Update, DELETE for Delete.
- Lack of Error Handling: Not wrapping database operations in try...catch blocks or not sending appropriate HTTP status codes (e.g., 404 for not found, 400 for bad request, 500 for server errors). Fix: Implement robust try...catch for all async operations and use res.status().json() or res.status().send() to communicate outcomes.
- Insecure ID Generation: In the basic example, we used a simple counter for IDs. In a real-world scenario, this is not production-ready and can lead to collisions. Fix: Use universally unique identifiers (UUIDs) or rely on database-generated IDs (like MongoDB's _id).
Best Practices
- Use a Router: For larger applications, organize your routes using express.Router() to keep your code modular and maintainable. Define separate route files for different resources (e.g., users.js, products.js).
- Input Validation: Always validate incoming request data to ensure it meets your application's requirements and to prevent malicious input. Libraries like Joi or express-validator are excellent for this. Mongoose schemas also provide built-in validation.
- Error Handling Middleware: Implement a centralized error handling middleware at the end of your Express chain to catch and process all errors gracefully, preventing your server from crashing and providing consistent error responses.
- Asynchronous Operations with async/await: Use async/await for all asynchronous operations (especially database interactions) to write cleaner, more readable code than traditional callbacks or .then().catch() chains.
- Pagination and Filtering for Read Operations: For endpoints that return multiple resources (e.g., GET /items), implement pagination, sorting, and filtering to handle large datasets efficiently and improve client-side performance.
- Authentication and Authorization: Secure your API endpoints. Use JWTs (JSON Web Tokens) or session-based authentication to verify user identity and ensure users have the necessary permissions to perform actions.
- Environment Variables: Store sensitive information like database connection strings and API keys in environment variables (e.g., using dotenv) rather than hardcoding them.
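The pagination point above can be sketched without a database: read page and limit from the query string, bound them, and slice the result set. The parameter names page and limit are a common convention, not a standard, and the caps chosen here are illustrative:

```javascript
// Compute one page of results from `page` and `limit` query parameters.
// Defaults and bounds keep malformed input from producing a bad query.
function paginate(items, query) {
  const page = Math.max(1, parseInt(query.page, 10) || 1);
  const limit = Math.min(100, Math.max(1, parseInt(query.limit, 10) || 10));
  const start = (page - 1) * limit;
  return {
    page,
    limit,
    total: items.length,
    data: items.slice(start, start + limit)
  };
}

const all = Array.from({ length: 25 }, (_, i) => ({ id: i + 1 }));
const result = paginate(all, { page: '2', limit: '10' });
console.log(result.data[0].id); // 11
```

In an Express handler the same idea becomes `paginate(items, req.query)`; with a database you would instead translate page and limit into skip/offset and limit clauses.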
Practice Exercises
1. Build a Simple Task Manager API (In-Memory):
Create a Node.js Express application that manages a list of tasks. Each task should have an id, title, and completed (boolean) status. Implement the following endpoints:
- GET /tasks: Get all tasks.
- GET /tasks/:id: Get a single task by ID.
- POST /tasks: Create a new task.
- PUT /tasks/:id: Update a task (e.g., change title or completed status).
- DELETE /tasks/:id: Delete a task.
2. Add Basic Validation to Task Manager:
Enhance the Task Manager API from Exercise 1. When creating or updating a task, ensure that the title is provided and is a non-empty string. Return a 400 status code with an appropriate error message if validation fails.
3. Integrate with a Database:
Modify the Task Manager API to use MongoDB with Mongoose instead of an in-memory array. Define a Mongoose schema for the Task model, including title (required, string) and completed (boolean, default false). Ensure all CRUD operations interact with the MongoDB database.
Mini Project / Task
Build a Recipe API:
Create a Node.js Express application that functions as a simple Recipe API. It should allow users to manage recipes, storing them in a MongoDB database using Mongoose.
Each recipe should have at least the following fields:
- name (String, required)
- ingredients (Array of Strings, required)
- instructions (String, required)
- cuisine (String, optional)
- prepTimeMinutes (Number, optional)
Implement the following API endpoints:
- POST /recipes: Create a new recipe. Include validation for required fields.
- GET /recipes: Retrieve all recipes.
- GET /recipes/:id: Retrieve a single recipe by its ID.
- PUT /recipes/:id: Update an existing recipe by ID.
- DELETE /recipes/:id: Delete a recipe by ID.
Challenge (Optional)
Enhance the Recipe API with Search and Filtering:
Extend your Recipe API (Mini Project) to include advanced search and filtering capabilities for the GET /recipes endpoint. Allow users to:
- Search recipes by name (case-insensitive partial match).
- Filter recipes by cuisine.
- Filter recipes by ingredients (e.g., find recipes that contain 'chicken' AND 'rice').
Example requests:
- GET /recipes?name=pasta
- GET /recipes?cuisine=Italian
- GET /recipes?ingredients=chicken,rice
- GET /recipes?name=soup&cuisine=Asian
Consider how to handle multiple query parameters and combine them effectively in your Mongoose queries.
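One way to sketch that combination step is a small helper that translates req.query into a MongoDB filter document, using the standard $regex, $options, and $all query operators. The buildRecipeFilter name is hypothetical, not part of any library:

```javascript
// Translate query-string parameters into a MongoDB filter document.
function buildRecipeFilter(query) {
  const filter = {};
  if (query.name) {
    // case-insensitive partial match on the recipe name
    filter.name = { $regex: query.name, $options: 'i' };
  }
  if (query.cuisine) {
    filter.cuisine = query.cuisine; // exact match
  }
  if (query.ingredients) {
    // 'chicken,rice' -> recipe must contain BOTH ingredients
    filter.ingredients = { $all: query.ingredients.split(',') };
  }
  return filter;
}

// In a route handler you might then call: Recipe.find(buildRecipeFilter(req.query))
const filter = buildRecipeFilter({ name: 'soup', ingredients: 'chicken,rice' });
console.log(filter);
```

Because absent parameters simply add nothing to the filter, the same helper handles any combination of the example requests above.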
MongoDB Introduction
MongoDB is a NoSQL, document-oriented database designed to store data in flexible, JSON-like documents. Instead of saving data in rigid tables with fixed rows and columns like a traditional relational database, MongoDB stores records as documents inside collections. This makes it a strong fit for Node Js applications because JavaScript objects used in code closely match the BSON documents MongoDB stores internally. In real-world backend development, MongoDB is commonly used for user accounts, product catalogs, blog systems, chat apps, analytics platforms, and APIs that need to evolve quickly as requirements change.
MongoDB exists to solve problems where flexible data models, rapid development, and horizontal scaling are important. For example, an e-commerce app may store products with different attributes such as size, color, storage, warranty, or brand-specific metadata. In a relational database, this can require many related tables or schema changes. In MongoDB, each product document can contain fields that make sense for that product while still belonging to the same collection.
The main building blocks are databases, collections, documents, fields, and ObjectId values. A database contains collections, a collection contains documents, and each document is made of key-value pairs. MongoDB also supports arrays, nested objects, indexes, and powerful queries. While it is called schema-flexible, that does not mean structure should be random; good design still matters. In Node Js projects, MongoDB is often accessed using the official MongoDB driver or an ODM like Mongoose, though understanding the database itself comes first.
Step-by-Step Explanation
To begin, think of a MongoDB document as a JavaScript object. A collection is similar to a group of related objects, such as users or orders. A simple user document might include a name, email, age, and an array of skills. MongoDB assigns a unique _id field to each document unless you provide one. Queries are written as objects too, which feels natural in Node Js. For example, finding users older than 18 uses a filter object with a comparison operator.
When learning MongoDB, start with four actions: insert documents, find documents, update documents, and delete documents. These are often called CRUD operations: Create, Read, Update, Delete. You should also understand that MongoDB stores data as BSON, which extends JSON by supporting additional types such as dates and ObjectId. This matters because backend systems often need timestamps and unique identifiers.
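For example, the "users older than 18" filter mentioned above is just an ordinary JavaScript object built with MongoDB's $gt comparison operator:

```javascript
// A MongoDB filter is a plain object; $gt means 'greater than'
// ($gte would mean 'greater than or equal').
const filter = { age: { $gt: 18 } };

// Passed to a collection, e.g.: db.collection('users').find(filter)
console.log(filter.age.$gt); // 18
```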
Comprehensive Code Examples
Basic example
const { MongoClient } = require('mongodb');
async function run() {
const client = new MongoClient('mongodb://127.0.0.1:27017');
await client.connect();
const db = client.db('school');
const students = db.collection('students');
await students.insertOne({ name: 'Ava', age: 20, course: 'Node Js' });
const result = await students.find().toArray();
console.log(result);
await client.close();
}
run();

Real-world example
const product = {
name: 'Wireless Mouse',
price: 25.99,
inStock: true,
tags: ['electronics', 'accessory'],
details: { brand: 'LogiTech', color: 'Black' }
};
await db.collection('products').insertOne(product);
const found = await db.collection('products').find({ inStock: true }).toArray();
console.log(found);

Advanced usage
await db.collection('users').updateOne(
{ email: '[email protected]' },
{
$set: { lastLogin: new Date() },
$push: { roles: 'member' }
},
{ upsert: true }
);
const adults = await db.collection('users').find({ age: { $gte: 18 } }).toArray();
console.log(adults);

Common Mistakes
- Assuming MongoDB has no structure: Flexible does not mean messy. Define consistent field names and document shapes.
- Forgetting to close database connections: Always close the client when your script finishes, or manage connection reuse in apps.
- Using strings instead of proper types: Store numbers as numbers and dates as dates, not plain text.
- Ignoring the _id field: Learn how MongoDB uniquely identifies documents for updates and lookups.
Best Practices
- Design documents around how your application reads data most often.
- Use indexes for frequently searched fields such as email or product name.
- Validate incoming data before inserting it into the database.
- Keep naming consistent across collections and documents.
- Use environment variables for connection strings instead of hardcoding them.
Practice Exercises
- Create a collection named books and insert three book documents with title, author, and year fields.
- Write a query that returns all students whose age is greater than or equal to 18.
- Update one product document by changing its price and adding a new tag field.
Mini Project / Task
Build a small Node Js script for a movie collection. Insert at least five movies, display all movies in one genre, update one movie rating, and delete one old record.
Challenge (Optional)
Design a MongoDB collection for a course platform where each course contains a title, instructor, lessons array, and student count. Then write queries to find courses with more than 100 students and update one lesson title inside a document.
Mongoose Setup
Mongoose is an Object Data Modeling library for Node Js that helps developers work with MongoDB using structured schemas and models instead of writing raw database logic everywhere. It exists to make data handling more organized, readable, and safer by adding validation, defaults, middleware, and model-based queries. In real projects such as e-commerce apps, blogs, dashboards, and booking platforms, Mongoose is often used to manage users, products, orders, comments, and other stored records. The setup process usually includes installing the package, connecting your Node Js application to MongoDB, creating a schema, building a model, and testing read and write operations. There are also common setup styles: direct local MongoDB connection, MongoDB Atlas cloud connection, and environment-variable-based configuration for secure production apps. Understanding this setup is important because nearly every database-driven Node Js backend starts with a reliable connection layer and clear models.
Step-by-Step Explanation
First, install Mongoose with npm using npm install mongoose. Next, import it into your project with require('mongoose') or ES module syntax. Then call mongoose.connect() and pass your MongoDB connection string. This string may point to a local database such as mongodb://127.0.0.1:27017/schoolDB or a cloud database from MongoDB Atlas. After connecting, define a schema using new mongoose.Schema({...}). A schema describes fields like name, email, age, or createdAt, and can include data types, required rules, default values, and validation. After that, create a model with mongoose.model('User', userSchema). The model becomes the main tool for creating, querying, updating, and deleting documents in a collection. Finally, test the connection by saving a document and reading it back. In larger applications, developers usually place connection logic in a dedicated file, keep secrets in environment variables, and start the server only after the database connection succeeds.
Comprehensive Code Examples
const mongoose = require('mongoose');
mongoose.connect('mongodb://127.0.0.1:27017/testdb')
.then(() => console.log('MongoDB connected'))
.catch((err) => console.error('Connection error:', err));

const mongoose = require('mongoose');
mongoose.connect('mongodb://127.0.0.1:27017/blogdb');
const postSchema = new mongoose.Schema({
title: { type: String, required: true },
author: { type: String, required: true },
published: { type: Boolean, default: false }
});
const Post = mongoose.model('Post', postSchema);
async function run() {
const post = await Post.create({ title: 'Getting Started', author: 'Ava' });
const posts = await Post.find();
console.log(post);
console.log(posts);
}
run();

const mongoose = require('mongoose');
async function connectDB() {
try {
await mongoose.connect(process.env.MONGO_URI, {
dbName: 'shopdb'
});
console.log('Database connected');
} catch (error) {
console.error('Database connection failed:', error.message);
process.exit(1);
}
}
const productSchema = new mongoose.Schema({
name: { type: String, required: true, trim: true },
price: { type: Number, required: true, min: 0 },
inStock: { type: Boolean, default: true }
}, { timestamps: true });
const Product = mongoose.model('Product', productSchema);
module.exports = { connectDB, Product };

Common Mistakes
- Using the wrong connection string: Double-check the database name, host, and credentials.
- Starting database queries before connection completes: Use await or .then() before running model operations.
- Forgetting required schema fields: Add the needed properties when creating documents or Mongoose validation will fail.
- Hardcoding secrets: Store Atlas credentials in environment variables instead of source code.
Best Practices
- Keep connection logic in a separate file for clean project structure.
- Use schema validation to prevent invalid data from entering the database.
- Prefer async/await for clearer error handling and readable setup code.
- Use environment variables for database URLs and sensitive configuration.
- Add timestamps in schemas when records need created and updated tracking.
Practice Exercises
- Install Mongoose and connect a Node Js file to a local MongoDB database named studentdb.
- Create a schema for a Student model with name, age, and course fields, then save one document.
- Create a model for Book with a required title and default availability value, then fetch all books.
Mini Project / Task
Build a small Node Js database setup for a task manager. Connect Mongoose to MongoDB, create a Task schema with title, completed, and dueDate fields, then insert and list tasks from the database.
Challenge (Optional)
Refactor your setup so the application loads the MongoDB connection string from an environment variable, connects through a reusable function, and prevents the server from starting if the database connection fails.
Schemas and Models
In Node Js applications that use MongoDB, schemas and models are most commonly created with Mongoose to give structure to otherwise flexible JSON-like documents. A schema defines the shape of data: what fields exist, what types they use, whether they are required, and what validation rules apply. A model is the usable class built from that schema, allowing your application to create, read, update, and delete documents in a collection. In real life, this is used in systems like user accounts, product catalogs, blog posts, bookings, and payment records, where data consistency matters. Without schemas, teams often store messy and inconsistent data. With schemas and models, you can enforce business rules, reduce bugs, and keep your code easier to understand.
Important schema features include field types such as String, Number, Boolean, Date, Array, ObjectId, and nested objects. You can also define defaults, enums, custom validators, timestamps, and references between collections. A model is then created from the schema and acts like a constructor plus query interface. For example, a User schema may define name, email, and password fields, while the User model lets you save new users or search existing ones. This pattern separates data design from data operations, which is essential in backend development.
Step-by-Step Explanation
First, install and import Mongoose. Then connect your Node Js app to MongoDB. Next, create a schema using new mongoose.Schema(). Inside it, define each field and its rules. After that, create a model using mongoose.model('ModelName', schema). Finally, use the model to create and query documents.
Syntax breakdown: const schema = new mongoose.Schema({ name: { type: String, required: true } }, { timestamps: true }). Here, name is the field, type sets its data type, and required: true makes the field mandatory. The second object holds schema options, such as automatic createdAt and updatedAt fields. Then const User = mongoose.model('User', schema) creates the model. By convention, Mongoose maps the User model to a users collection.
Comprehensive Code Examples
const mongoose = require('mongoose');
const userSchema = new mongoose.Schema({
name: { type: String, required: true },
age: { type: Number, min: 0 },
isActive: { type: Boolean, default: true }
});
const User = mongoose.model('User', userSchema);

const productSchema = new mongoose.Schema({
title: { type: String, required: true, trim: true },
price: { type: Number, required: true, min: 0 },
category: { type: String, enum: ['books', 'tech', 'fashion'] },
stock: { type: Number, default: 0 }
}, { timestamps: true });
const Product = mongoose.model('Product', productSchema);
async function createProduct() {
const product = await Product.create({
title: 'Node Js Guide',
price: 29.99,
category: 'books'
});
console.log(product);
}

const orderSchema = new mongoose.Schema({
user: { type: mongoose.Schema.Types.ObjectId, ref: 'User', required: true },
items: [{
name: { type: String, required: true },
qty: { type: Number, required: true, min: 1 }
}],
total: {
type: Number,
validate: {
validator: value => value >= 0,
message: 'Total cannot be negative'
}
}
}, { timestamps: true });
const Order = mongoose.model('Order', orderSchema);

Common Mistakes
- Using the wrong type: Storing numbers as strings causes sorting and calculation problems. Use the proper schema type.
- Forgetting validation: If required fields or enums are missing, bad data enters the database. Add validation rules early.
- Confusing schema and model: The schema defines structure, but the model performs operations. Always create the model before querying.
- Ignoring defaults: Beginners often manually set values that should be automatic. Use default for cleaner code.
Best Practices
- Keep schemas focused and readable by grouping related fields logically.
- Use trim, lowercase, and validation rules for clean user input.
- Enable timestamps in most production schemas.
- Use references with ObjectId when data belongs in another collection.
- Name models clearly and keep one model per file in larger projects.
Practice Exercises
- Create a Student schema with name, age, course, and enrolled fields.
- Create a Book model where price cannot be negative and genre must be one of three allowed values.
- Create a Comment schema with message, author, and created date using a default value.
Mini Project / Task
Build a simple Task Manager model with title, description, status, priority, and dueDate fields. Add validation so title is required and status only allows specific values such as pending, in-progress, and completed.
Challenge (Optional)
Design an Employee schema that includes a nested address object, a department enum, a salary validator, and automatic timestamps. Then think about which fields should be required and why.
Database CRUD
Database CRUD stands for Create, Read, Update, and Delete, the four basic operations used to manage data in an application. In Node.js, CRUD is commonly used with databases such as MongoDB, PostgreSQL, and MySQL to store users, products, orders, comments, and many other records. In real life, when a user signs up, data is created; when a profile page loads, data is read; when account details change, data is updated; and when an account is removed, data is deleted. CRUD exists because almost every backend system needs a standard way to handle persistent information safely and efficiently.
In Node.js, CRUD is often implemented inside API routes using frameworks like Express. The most common pattern is mapping HTTP methods to database actions: POST for create, GET for read, PUT or PATCH for update, and DELETE for delete. You may work directly with a database driver or use an ORM/ODM such as Mongoose, Prisma, or Sequelize. The underlying idea stays the same: accept a request, validate input, perform a database action, and return a clear response.
For beginners, the key concepts are records, queries, IDs, validation, and async behavior. A record is one saved item, such as one user. A query is the instruction sent to the database. IDs help locate a specific record. Validation checks whether input is safe and complete. Because database tasks take time, Node.js usually handles them with async/await.
Step-by-Step Explanation
A basic CRUD flow in Node.js starts by connecting your application to a database. Then you define a model or table structure, such as a user with a name and email. Next, create route handlers. A create handler reads request data from req.body and inserts it into the database. A read handler fetches all records or one record by ID. An update handler finds a record by ID and changes selected fields. A delete handler removes the matching record.
The usual syntax pattern is: receive request, extract parameters, run an awaited database method, handle errors, and send JSON. For example, creating a user often looks like: get name and email from the request, call a model method like create(), then respond with status 201 and the saved object. Reading by ID often uses findById() or an equivalent query. Updating commonly uses findByIdAndUpdate() with validation enabled. Deleting commonly uses findByIdAndDelete().
Comprehensive Code Examples
const express = require('express');
const app = express();
app.use(express.json());

// In-memory example: create and read without a database
let users = [];

app.post('/users', (req, res) => {
  const user = { id: Date.now().toString(), name: req.body.name };
  users.push(user);
  res.status(201).json(user);
});

app.get('/users', (req, res) => {
  res.json(users);
});

// Mongoose example: read, update, and delete by ID
const mongoose = require('mongoose');
const userSchema = new mongoose.Schema({ name: String, email: String });
const User = mongoose.model('User', userSchema);

app.get('/users/:id', async (req, res) => {
  const user = await User.findById(req.params.id);
  if (!user) return res.status(404).json({ message: 'User not found' });
  res.json(user);
});

app.put('/users/:id', async (req, res) => {
  const updatedUser = await User.findByIdAndUpdate(req.params.id, req.body, {
    new: true,
    runValidators: true
  });
  res.json(updatedUser);
});

app.delete('/users/:id', async (req, res) => {
  try {
    const deletedUser = await User.findByIdAndDelete(req.params.id);
    if (!deletedUser) return res.status(404).json({ message: 'User not found' });
    res.json({ message: 'User deleted successfully' });
  } catch (error) {
    res.status(500).json({ message: error.message });
  }
});

Common Mistakes
- Forgetting await on database calls, which can send incomplete or incorrect responses. Fix: use async/await consistently.
- Not checking whether a record exists before updating or deleting. Fix: return 404 when no matching item is found.
- Trusting request data without validation. Fix: validate required fields, formats, and allowed values before saving.
Best Practices
- Use proper HTTP status codes like 201, 200, 404, and 500.
- Wrap async database logic in try/catch blocks to handle failures gracefully.
- Keep route handlers clean by moving database logic into service or controller files in larger projects.
- Return clear JSON messages so frontend developers can use your API easily.
Practice Exercises
- Create a /products POST route that saves a product with a name and price.
- Build a GET route that returns all tasks from a database collection.
- Add an update route that changes a student record by ID and returns the updated result.
Mini Project / Task
Build a simple notes API with routes to create a note, list all notes, update one note by ID, and delete one note by ID.
Challenge (Optional)
Extend your CRUD API by adding search and filtering, such as returning only users whose name matches a query string.
Authentication
Authentication is the process of verifying the identity of a user, service, or application. In the context of web development, it's how a server confirms that a client (like a web browser or mobile app) is who it claims to be before granting access to protected resources or functionalities. It exists because not all information or actions should be accessible to everyone; certain operations require specific permissions or knowledge. For instance, you wouldn't want anyone to access your online banking account or modify your personal profile without proving their identity first.
In real-life Node.js applications, authentication is a cornerstone of security. It's used in virtually every application that deals with user data, from e-commerce sites where users log in to make purchases, to social media platforms where users manage their profiles, to enterprise applications where employees access sensitive company data. Without robust authentication, an application is vulnerable to unauthorized access, data breaches, and a loss of user trust.
There are several core concepts and sub-types of authentication commonly used in Node.js applications.
- Session-Based Authentication: This is a traditional method where, after a user logs in, the server creates a session (a small piece of data stored on the server) and sends a session ID (often stored in a cookie) back to the client. For subsequent requests, the client sends this session ID, and the server uses it to retrieve the user's session data and verify their identity.
- Token-Based Authentication (e.g., JWT): In this modern approach, after successful login, the server issues a cryptographically signed token (like a JSON Web Token - JWT) to the client. The client stores this token (e.g., in local storage or a cookie) and sends it with each subsequent request. The server verifies the token's signature to ensure its authenticity and integrity, without needing to store session data on the server. This is stateless and scalable, making it popular for APIs and microservices.
- OAuth (Open Authorization): OAuth is an open standard for access delegation, commonly used for third-party applications to access user information on other services (e.g., "Login with Google" or "Login with Facebook"). It doesn't handle authentication itself but rather authorization, allowing a user to grant limited access to their resources without sharing their credentials.
- Multi-Factor Authentication (MFA): This adds an extra layer of security by requiring users to provide two or more verification factors to gain access. Common factors include something they know (password), something they have (phone, hardware token), or something they are (fingerprint, facial scan).
Step-by-Step Explanation
Let's focus on JWT-based authentication, a very common pattern in Node.js.
1. User Registration/Login: A user provides credentials (username/email and password).
2. Credential Verification: The Node.js server receives these credentials and verifies them against a database (e.g., by hashing the provided password and comparing it to the stored hash).
3. Token Generation: If credentials are valid, the server generates a JWT. This token contains a header (type of token, hashing algorithm), a payload (user data like user ID, roles, expiration time), and a signature (created by encoding the header and payload with a secret key).
4. Token Issuance: The server sends this JWT back to the client.
5. Client Storage: The client stores the JWT (e.g., in local storage or an HTTP-only cookie).
6. Subsequent Requests: For every subsequent request to a protected route, the client includes the JWT, typically in the
Authorization header as a Bearer token (e.g., Authorization: Bearer ).7. Token Verification: The Node.js server intercepts these requests, extracts the JWT, and verifies its signature using the same secret key. It also checks the token's expiration and other claims. If valid, the request is allowed to proceed; otherwise, an unauthorized error is returned.
Comprehensive Code Examples
Basic Example: JWT Generation and Verification
This example uses the jsonwebtoken library.

const jwt = require('jsonwebtoken');
const SECRET_KEY = 'your_super_secret_key'; // Keep this secret and in environment variables!
// 1. User logs in, credentials verified
const user = { id: 1, username: 'testuser', role: 'admin' };
// 2. Generate a JWT
const token = jwt.sign(user, SECRET_KEY, { expiresIn: '1h' });
console.log('Generated Token:', token);
// 3. Verify a JWT (e.g., on a subsequent request)
try {
const decoded = jwt.verify(token, SECRET_KEY);
console.log('Decoded Token:', decoded);
} catch (err) {
console.error('Token verification failed:', err.message);
}
Real-world Example: Express.js Authentication Middleware
Integrating JWT verification into an Express.js application.
const express = require('express');
const jwt = require('jsonwebtoken');
const bodyParser = require('body-parser');
const app = express();
const PORT = 3000;
const SECRET_KEY = 'your_super_secret_key';
app.use(bodyParser.json());
// Middleware to protect routes
function authenticateToken(req, res, next) {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1]; // Bearer TOKEN
if (token == null) return res.sendStatus(401); // No token provided
jwt.verify(token, SECRET_KEY, (err, user) => {
if (err) return res.sendStatus(403); // Token invalid or expired
req.user = user; // Attach user payload to request
next();
});
}
// Login route (generates token)
app.post('/login', (req, res) => {
// In a real app, you'd verify req.body.username and req.body.password against a DB
const user = { id: 1, username: req.body.username || 'demo', role: 'user' };
if (req.body.username === 'user' && req.body.password === 'pass') {
const accessToken = jwt.sign(user, SECRET_KEY, { expiresIn: '1h' });
res.json({ accessToken: accessToken });
} else {
res.status(401).send('Invalid credentials');
}
});
// Protected route
app.get('/profile', authenticateToken, (req, res) => {
res.json({ message: `Welcome ${req.user.username}!`, user: req.user });
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
Advanced Usage: Refresh Tokens
For better security and user experience, short-lived access tokens are often paired with long-lived refresh tokens. Access tokens expire quickly, reducing the window for compromise if stolen. When an access token expires, the client uses the refresh token to request a new access token without requiring the user to re-authenticate.
const jwt = require('jsonwebtoken');
const ACCESS_TOKEN_SECRET = 'access_secret';
const REFRESH_TOKEN_SECRET = 'refresh_secret';
let refreshTokens = []; // In a real app, store in DB
function generateAccessToken(user) {
return jwt.sign(user, ACCESS_TOKEN_SECRET, { expiresIn: '15m' });
}
function generateRefreshToken(user) {
const refreshToken = jwt.sign(user, REFRESH_TOKEN_SECRET, { expiresIn: '7d' });
refreshTokens.push(refreshToken); // Store the refresh token
return refreshToken;
}
// Simulate login
const user = { id: 1, username: 'john_doe' };
const accessToken = generateAccessToken(user);
const refreshToken = generateRefreshToken(user);
console.log('Access Token:', accessToken);
console.log('Refresh Token:', refreshToken);
// Simulate refreshing an expired access token
// In a real app, this would be an API endpoint /token
const requestRefreshToken = refreshToken; // Client sends this
if (!refreshTokens.includes(requestRefreshToken)) {
console.log('Invalid refresh token');
} else {
jwt.verify(requestRefreshToken, REFRESH_TOKEN_SECRET, (err, user) => {
if (err) {
console.log('Refresh token expired or invalid');
} else {
const newAccessToken = generateAccessToken({ id: user.id, username: user.username });
console.log('New Access Token:', newAccessToken);
}
});
}
Common Mistakes
1. Storing Secret Keys in Code: Hardcoding SECRET_KEY directly in your source code is a major security vulnerability. Anyone with access to your code can generate or verify tokens. Fix: Always store secret keys in environment variables (e.g., using the dotenv package) and never commit them to version control.
2. Not Validating Token Expiration: Failing to check the expiration (and other claims like nbf - not before) of a JWT can allow stale or revoked tokens to grant access. Fix: The jsonwebtoken verify() method automatically handles expiration checks if expiresIn was set during signing. Ensure you catch errors from verify().
3. Using HTTP for Sensitive Communication: Sending tokens (especially initial login credentials) over unencrypted HTTP connections exposes them to eavesdropping. Fix: Always use HTTPS (SSL/TLS) for all communication involving authentication, even in development environments if possible.
Best Practices
- Use Strong, Unique Secret Keys: Generate long, complex, and cryptographically secure secret keys for signing your JWTs. Never reuse keys.
- Set Appropriate Token Expiration: Access tokens should be relatively short-lived (e.g., 15 minutes to a few hours) to minimize the impact of token compromise. Use refresh tokens for longer sessions.
- Store Refresh Tokens Securely: Refresh tokens should be stored securely on the server (e.g., in a database, hashed) and associated with a user. They should also have an expiration.
- Blacklist/Revoke Compromised Tokens: Implement a mechanism to invalidate or blacklist tokens (especially refresh tokens) if a user logs out, changes their password, or if a token is suspected to be compromised.
- Use HTTP-Only Cookies for Tokens: If storing tokens in cookies, use the HttpOnly flag to prevent client-side JavaScript from accessing them, mitigating XSS attacks. Also, use the Secure flag to ensure cookies are only sent over HTTPS.
- Hash Passwords: Never store raw passwords. Always hash them using a strong, one-way hashing algorithm like bcrypt before storing them in the database.
Practice Exercises
1. Basic Login Simulation: Write a Node.js script that simulates a user login. If the username is 'admin' and password is 'password123', generate a JWT. Otherwise, print 'Invalid credentials'.
2. Protected Route Check: Create a simple Express.js application with two routes: /login (which issues a JWT) and /data. Implement an authentication middleware for /data that verifies the JWT from the Authorization header. If valid, send 'Secret data!'; otherwise, send 'Unauthorized'.
3. Token Expiration Handling: Modify your JWT generation to have a very short expiration (e.g., 5 seconds). Try verifying it immediately, then wait for 10 seconds and try verifying it again. Observe the error handling.
Mini Project / Task
Build a simple API with Node.js and Express.js that has two endpoints: /signup and /dashboard. The /signup endpoint should accept a username and password, hash the password (you can use a dummy hash for this task, no real DB needed), and respond with a success message. The /dashboard endpoint should be protected by JWT authentication. Only users who have a valid JWT (which you can manually generate for testing after a 'signup') should be able to access the dashboard and receive a personalized welcome message including their username.
Challenge (Optional)
Extend the mini project by implementing a /login endpoint that takes username/password, verifies them (again, dummy check is fine), and then issues a JWT. Also, add a /logout endpoint that simulates token invalidation (e.g., by adding the token's ID to a blacklist array in memory). Subsequent requests with a blacklisted token should be rejected.
JWT Authentication
JWT authentication is a way to verify who a user is after login by issuing a signed token called a JSON Web Token. Instead of storing session data on the server, the server creates a token containing user-related claims such as an id, role, and expiration time. The client sends this token with later requests, usually in the Authorization header as a Bearer token. JWT exists because modern applications often need stateless authentication that works well across web apps, mobile apps, APIs, and microservices. In real life, it is used in login systems, protected dashboards, e-commerce admin panels, and single-page applications where the frontend and backend are separate.
A JWT has three parts: header, payload, and signature. The header defines the token type and signing algorithm. The payload contains claims such as user id and expiry. The signature is generated from the header, payload, and a secret key or private key, which helps detect tampering. Common claim types include registered claims like exp, iat, and sub, public claims such as role, and custom claims such as department or permissions. JWTs are commonly used as access tokens. Some systems also issue refresh tokens, which live longer and are used to request new access tokens without forcing the user to log in again.
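The three-part structure is easy to inspect: split a token on the dots and base64url-decode the first two segments. Note that decoding is not verification — anyone can read the payload — and the token below is hand-built with a placeholder signature just for illustration:

```javascript
// Build a sample token by hand ('base64url' requires Node 15.7+).
const token = [
  Buffer.from(JSON.stringify({ alg: 'HS256', typ: 'JWT' })).toString('base64url'),
  Buffer.from(JSON.stringify({ sub: 101, role: 'editor', iat: 1700000000 })).toString('base64url'),
  'fake-signature' // placeholder; a real signature is an HMAC or RSA output
].join('.');

// Decode (NOT verify) the header and payload.
const [headerB64, payloadB64] = token.split('.');
const header = JSON.parse(Buffer.from(headerB64, 'base64url').toString());
const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString());

console.log(header.alg);  // 'HS256'
console.log(payload.sub); // 101
```

This is why sensitive data must never go in the payload: only the signature is protected, not the readability of the claims.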
Step-by-Step Explanation
In Node Js, JWT authentication usually starts after verifying a user's email and password. First, install packages such as express, jsonwebtoken, and often bcrypt for password hashing. Next, define a secret key in environment variables. When the user logs in successfully, call jwt.sign() with a payload object, the secret, and options like expiresIn. Then return the token to the client. For protected routes, create middleware that reads the authorization header, extracts the token, and calls jwt.verify(). If verification succeeds, attach the decoded user data to req.user and continue. If verification fails, return 401 Unauthorized or 403 Forbidden depending on the case.
The typical syntax is simple: generate with jwt.sign(payload, secret, options) and verify with jwt.verify(token, secret). The payload should stay small and should not contain sensitive data like passwords. Tokens can be stored in memory, secure cookies, or local storage, but secure cookies are often preferred for browser apps because they reduce JavaScript access. Always send JWTs over HTTPS in production.
Comprehensive Code Examples
const jwt = require('jsonwebtoken');
const secret = 'mysecret';
const token = jwt.sign({ userId: 101 }, secret, { expiresIn: '1h' });
console.log(token);
const decoded = jwt.verify(token, secret);
console.log(decoded);

const express = require('express');
const jwt = require('jsonwebtoken');
const app = express();
app.use(express.json());
const SECRET = process.env.JWT_SECRET || 'supersecret';
app.post('/login', (req, res) => {
const { email, password } = req.body;
if (email === '[email protected]' && password === '123456') {
const token = jwt.sign({ id: 1, role: 'admin' }, SECRET, { expiresIn: '15m' });
return res.json({ token });
}
res.status(401).json({ message: 'Invalid credentials' });
});
function auth(req, res, next) {
const header = req.headers.authorization;
if (!header || !header.startsWith('Bearer ')) {
return res.status(401).json({ message: 'Token missing' });
}
const token = header.split(' ')[1];
try {
req.user = jwt.verify(token, SECRET);
next();
} catch (err) {
res.status(401).json({ message: 'Invalid token' });
}
}
app.get('/dashboard', auth, (req, res) => {
res.json({ message: 'Welcome', user: req.user });
});

function authorizeRole(role) {
return (req, res, next) => {
if (req.user.role !== role) {
return res.status(403).json({ message: 'Access denied' });
}
next();
};
}
app.get('/admin', auth, authorizeRole('admin'), (req, res) => {
res.json({ message: 'Admin area' });
});

Common Mistakes
Storing passwords inside the token. Fix: store only safe claims like id and role.
Using a hardcoded secret in production. Fix: load secrets from environment variables.
Forgetting token expiration. Fix: always set expiresIn and handle expiry errors.
Not checking the Bearer prefix. Fix: validate header format before verifying.
Best Practices
Keep JWT payloads minimal and never place sensitive information inside them.
Use short-lived access tokens and, when needed, pair them with refresh tokens.
Protect secrets with environment variables and rotate them when necessary.
Use HTTPS and consider secure, httpOnly cookies for browser-based apps.
Separate authentication middleware from authorization middleware for cleaner design.
Practice Exercises
Create a login route that returns a JWT containing a user id and expires in 10 minutes.
Build middleware that reads a bearer token and blocks access when the token is missing or invalid.
Add a protected route called /profile that returns the decoded token data.
Mini Project / Task
Build a small Express API with three routes: /login, /profile, and /admin. Issue a JWT on login, protect the profile route for any authenticated user, and allow the admin route only when the token contains the admin role.
Challenge (Optional)
Extend your authentication system by adding refresh tokens and a /refresh-token endpoint that issues a new access token when the old one expires.
Bcrypt Hashing
Bcrypt is a password-hashing algorithm used to store passwords safely in applications. Instead of saving a user's plain-text password such as mySecret123, bcrypt converts it into a one-way hash that is extremely difficult to reverse. This exists because storing raw passwords is dangerous: if a database is leaked, attackers can immediately read every password. In real applications such as banking apps, e-commerce sites, admin dashboards, and social platforms, bcrypt helps protect user accounts even when data exposure happens. In Node Js, bcrypt is commonly used during user registration to hash a password before saving it, and during login to compare the entered password with the stored hash.
Bcrypt includes important ideas beginners should understand. First, it uses a salt, which is random data added before hashing. This prevents identical passwords from producing identical hashes. Second, it uses a cost factor or salt rounds, which controls how slow the hashing process is. Slower hashing makes brute-force attacks harder. In Node Js, you will usually see two main operations: hash() to create a hash and compare() to verify a password. You may also encounter bcrypt and bcryptjs. Both provide similar APIs, but bcrypt is the more common production choice when native bindings are acceptable.
Step-by-Step Explanation
Install the package with npm install bcrypt. Import it using const bcrypt = require('bcrypt'). To hash a password, call await bcrypt.hash(password, saltRounds). The first argument is the plain password, and the second is the number of salt rounds, often 10 to 12 for many apps. To verify a password, use await bcrypt.compare(plainPassword, hashedPassword). This returns true if they match. Beginners should not try to manually compare strings because bcrypt hashes include a salt and will differ each time even for the same password.
Comprehensive Code Examples
const bcrypt = require('bcrypt');
async function basicHash() {
const password = 'mySecret123';
const hashed = await bcrypt.hash(password, 10);
console.log(hashed);
}
basicHash();

const bcrypt = require('bcrypt');
async function loginExample() {
const enteredPassword = 'mySecret123';
const storedHash = await bcrypt.hash('mySecret123', 10);
const isMatch = await bcrypt.compare(enteredPassword, storedHash);
console.log(isMatch ? 'Login success' : 'Invalid credentials');
}
loginExample();

const bcrypt = require('bcrypt');
async function registerUser(user) {
const rounds = 12;
const hashedPassword = await bcrypt.hash(user.password, rounds);
const savedUser = {
email: user.email,
password: hashedPassword
};
return savedUser;
}
async function verifyLogin(user, enteredPassword) {
return await bcrypt.compare(enteredPassword, user.password);
}
(async () => {
const user = await registerUser({ email: '[email protected]', password: 'Admin@123' });
console.log(await verifyLogin(user, 'Admin@123'));
})();

Common Mistakes
- Saving plain passwords: Always hash passwords before storing them in a database.
- Comparing hashes manually: Use bcrypt.compare() instead of string equality.
- Using very low salt rounds: Avoid weak settings like 1 or 2; use a practical value such as 10 or more.
- Hashing on every request unnecessarily: Only hash during registration or password updates, not for unrelated actions.
Best Practices
- Use async functions so hashing does not block application flow unnecessarily.
- Store only the bcrypt hash, never the original password or reversible encryption.
- Combine bcrypt with strong password rules and rate-limited login attempts.
- Choose salt rounds based on security needs and server performance testing.
- Re-hash passwords when users change them, not during every login.
Practice Exercises
- Create a script that hashes a password entered in code and prints the result.
- Write a login check function that compares a plain password with a stored bcrypt hash.
- Build a small registration function that returns a user object with an email and hashed password.
Mini Project / Task
Build a simple Node Js authentication demo with two functions: one for registering a user with a hashed password and one for logging in by verifying the entered password against the stored hash.
Challenge (Optional)
Create a small command-line app that lets a user register once, stores the hash in memory, and then asks for a login password to verify access securely using bcrypt.
Authorization
Authorization is the process of deciding what an authenticated user is allowed to do. In Node Js applications, authentication answers "Who are you?" while authorization answers "What can you access?" This exists because not every user should have the same permissions. For example, in an e-commerce app, customers can place orders, support staff can view tickets, and administrators can manage users and products. In APIs, authorization is commonly used to protect routes, limit actions by role, and enforce business rules.
The most common authorization styles are:
- Role-based access control, where permissions depend on roles such as admin or editor.
- Permission-based access control, where access depends on specific actions like create:post or delete:user.
- Resource-based authorization, where users can act only on resources they own, such as editing their own profile.
In Node Js, authorization is usually implemented with middleware in frameworks like Express. Middleware checks the user information added after authentication, then either allows the request to continue or returns an error such as 403 Forbidden. This separation keeps code organized and secure.
Step-by-Step Explanation
First, authenticate the user and attach their identity to the request, often as req.user. This object may contain fields like id, role, or permissions. Second, create authorization middleware. A basic role middleware receives allowed roles and checks whether req.user.role is included. If yes, call next(). If not, return a 403 response. Third, apply the middleware to protected routes. For example, an admin dashboard route should run authentication first, then authorization. Fourth, when ownership matters, compare the logged-in user id with the resource owner id. If they match, allow access; otherwise deny it. Fifth, keep checks small and reusable so the same authorization rules can be used across many routes. This approach is beginner-friendly because each middleware has a single job: authenticate, authorize, then run controller logic.
Comprehensive Code Examples
const express = require('express');
const app = express();
function mockAuth(req, res, next) {
req.user = { id: 'u1', role: 'admin', permissions: ['read:reports'] };
next();
}
function allowRoles(...roles) {
return (req, res, next) => {
if (!req.user) return res.status(401).json({ message: 'Not authenticated' });
if (!roles.includes(req.user.role)) {
return res.status(403).json({ message: 'Forbidden' });
}
next();
};
}
app.get('/admin', mockAuth, allowRoles('admin'), (req, res) => {
res.json({ message: 'Welcome admin' });
});

function allowPermissions(...requiredPermissions) {
return (req, res, next) => {
if (!req.user) return res.status(401).json({ message: 'Not authenticated' });
const hasAll = requiredPermissions.every(p => req.user.permissions.includes(p));
if (!hasAll) return res.status(403).json({ message: 'Missing permission' });
next();
};
}
app.get('/reports', mockAuth, allowPermissions('read:reports'), (req, res) => {
res.json({ report: 'Sales report data' });
});

const posts = [
{ id: 'p1', title: 'My Post', ownerId: 'u1' },
{ id: 'p2', title: 'Other Post', ownerId: 'u2' }
];
function allowOwnerOrAdmin(req, res, next) {
const post = posts.find(p => p.id === req.params.id);
if (!post) return res.status(404).json({ message: 'Post not found' });
const isOwner = req.user.id === post.ownerId;
const isAdmin = req.user.role === 'admin';
if (!isOwner && !isAdmin) {
return res.status(403).json({ message: 'You cannot modify this post' });
}
req.post = post;
next();
}
app.put('/posts/:id', mockAuth, allowOwnerOrAdmin, (req, res) => {
res.json({ message: 'Post updated', post: req.post });
});

Common Mistakes
- Confusing authentication with authorization
Fix: Verify identity first, then check permissions separately.
- Trusting client-sent roles
Fix: Read roles and permissions from a verified token or database, not from request body values.
- Protecting only the frontend
Fix: Always enforce authorization on backend routes because frontend checks can be bypassed.
- Returning the wrong status code
Fix: Use 401 for unauthenticated requests and 403 for authenticated users without access.
Best Practices
Use reusable middleware for roles, permissions, and ownership checks.
Follow least privilege: give users only the access they need.
Store authorization rules centrally to avoid inconsistent behavior.
Log denied access attempts for auditing and debugging.
Combine authorization with input validation and secure authentication tokens.
Practice Exercises
Create a middleware that allows only users with the role admin to access /manage-users.
Build a permission middleware that checks for create:product before allowing a POST request.
Create an ownership check so users can delete only their own comments.
Mini Project / Task
Build a small Express API for a blog with three routes: one public route, one admin-only route, and one route where a user can edit only their own post.
Challenge (Optional)
Design a middleware system that supports both role-based and permission-based authorization, then apply it to different routes without duplicating logic.
Environment Variables
Environment variables are dynamic named values that affect how running processes behave on a computer. They are part of the operating system environment in which a process runs. In Node.js, environment variables are crucial for configuring applications based on their deployment environment (development, testing, production). They allow you to store configuration details like database connection strings, API keys, and port numbers outside of your codebase. This separation is vital for security, maintainability, and portability.
For instance, a development database might run on your local machine, while a production database resides on a remote server. Instead of hardcoding these different connection strings into your application, you use environment variables. When the application starts, it reads these variables to determine which database to connect to. This prevents sensitive information from being committed to version control systems like Git, enhancing security significantly. Environment variables are widely used in containerized environments (like Docker) and on cloud platforms (like AWS, Heroku, Netlify) where configurations often need to change without modifying the application code itself.
The primary reason for their existence is to ensure that your application can be deployed to various environments without code changes. Imagine an application that needs to connect to a database. In development, you might use a local SQLite database. In production, you'd use a robust PostgreSQL or MongoDB instance. Hardcoding these details would necessitate code changes and redeployment for each environment. Environment variables solve this by providing a mechanism to inject configuration at runtime. This practice aligns with the Twelve-Factor App methodology, specifically the 'Config' factor, which advocates for storing configuration in the environment.
In real-life scenarios, environment variables are indispensable. Consider an e-commerce application. It might have different API keys for payment gateways in its staging and production environments. It might also use different logging levels, feature flags, or even different external service endpoints. All these configurations are perfect candidates for environment variables. When the application runs, it checks `process.env.NODE_ENV` to determine if it's in development or production, and then adjusts its behavior accordingly. This flexibility makes applications robust and adaptable to various deployment strategies.
Step-by-Step Explanation
In Node.js, environment variables are accessed via the global `process.env` object. This object is a simple JavaScript object that contains all the environment variables available to the Node.js process at the time it starts. When you launch a Node.js application, the operating system passes its environment variables to the Node.js process, which then populates `process.env`.
To access an environment variable, you simply reference it as a property of `process.env`. For example, if you have an environment variable named `PORT`, you can access it using `process.env.PORT`. It's important to note that all values in `process.env` are strings. If you expect a number (like a port number) or a boolean, you'll need to explicitly convert it.
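Since every process.env value arrives as a string, a quick conversion sketch (the variable names are examples, set in-process here only for the demo):

```javascript
// Simulate externally set variables; normally the shell or platform sets these.
process.env.PORT = '8080';
process.env.ENABLE_CACHE = 'true';

// Convert explicitly: Number() for numerics, string comparison for booleans.
const port = Number(process.env.PORT) || 3000;       // "8080" -> 8080
const cacheOn = process.env.ENABLE_CACHE === 'true'; // "true" -> true

console.log(typeof process.env.PORT); // 'string'
console.log(typeof port);             // 'number'
console.log(cacheOn);                 // true
```

Forgetting this conversion is a classic bug: `process.env.PORT + 1` yields the string "80801", not 8081.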
Setting environment variables can be done in several ways:
- Command Line: You can set variables directly when running your Node.js script. This is common for temporary settings or local development. Example: `PORT=3000 node app.js` (on Linux/macOS) or `set PORT=3000 && node app.js` (on Windows).
- `.env` files: For local development, it's common to use a `.env` file and a package like `dotenv`. This allows you to define variables in a file that is typically excluded from version control.
- Deployment Platforms: Cloud providers like Heroku, AWS, Netlify, Vercel, etc., provide interfaces to manage environment variables for your deployed applications.
- Shell Configuration: You can set variables in your shell's configuration file (e.g., `.bashrc`, `.zshrc`, `config.fish`) to make them persistently available for all processes launched from that shell.
When accessing these variables in your code, always provide a fallback or default value. This ensures your application doesn't crash if an expected environment variable is not set. For instance, `const port = process.env.PORT || 3000;` will use the `PORT` environment variable if it exists, otherwise it will default to `3000`.
Comprehensive Code Examples
Basic example
// app.js
const port = process.env.PORT || 3000;
const environment = process.env.NODE_ENV || 'development';
console.log(`Application running on port: ${port}`);
console.log(`Current environment: ${environment}`);
// To run this:
// On Linux/macOS:
// PORT=8080 NODE_ENV=production node app.js
// Output:
// Application running on port: 8080
// Current environment: production
// Without variables set:
// node app.js
// Output:
// Application running on port: 3000
// Current environment: development
Real-world example (using `dotenv`)
// .env file (DO NOT commit this to Git!)
DB_HOST=localhost
DB_USER=root
DB_PASS=mysecretpassword
API_KEY=your_super_secret_api_key_123
NODE_ENV=development
// server.js
require('dotenv').config(); // Load environment variables from .env file
const express = require('express');
const app = express();
const dbHost = process.env.DB_HOST;
const dbUser = process.env.DB_USER;
const dbPass = process.env.DB_PASS;
const apiKey = process.env.API_KEY;
const nodeEnv = process.env.NODE_ENV;
const port = process.env.PORT || 3000;
console.log(`Database Host: ${dbHost}`);
console.log(`Database User: ${dbUser}`);
console.log(`API Key: ${apiKey ? 'Loaded' : 'Not Found!'}`); // Don't log actual key
console.log(`Node Environment: ${nodeEnv}`);
app.get('/', (req, res) => {
res.send(`Hello from ${nodeEnv} environment!`);
});
app.listen(port, () => {
console.log(`Server running on http://localhost:${port}`);
});
// To run this:
// 1. npm install dotenv express
// 2. Create a .env file as shown above
// 3. node server.js
// Output (example):
// Database Host: localhost
// Database User: root
// API Key: Loaded
// Node Environment: development
// Server running on http://localhost:3000
Advanced usage (conditional configuration)
// config.js
const isProduction = process.env.NODE_ENV === 'production';
const config = {
port: process.env.PORT || 3000,
databaseUrl: isProduction ? process.env.PROD_DB_URL : process.env.DEV_DB_URL || 'mongodb://localhost:27017/devdb',
logLevel: isProduction ? 'info' : 'debug',
enableCaching: isProduction
};
module.exports = config;
// app.js
require('dotenv').config();
const config = require('./config');
const express = require('express');
const app = express();
console.log(`App running on port: ${config.port}`);
console.log(`Using database: ${config.databaseUrl}`);
console.log(`Log Level: ${config.logLevel}`);
console.log(`Caching Enabled: ${config.enableCaching}`);
app.get('/status', (req, res) => {
res.json({
environment: process.env.NODE_ENV || 'development',
database: config.databaseUrl,
caching: config.enableCaching
});
});
app.listen(config.port, () => {
console.log(`Server started on port ${config.port}`);
});
// To run in development (with .env having DEV_DB_URL=mongodb://localhost/testdb):
// node app.js
// To run in production:
// PROD_DB_URL=mongodb://prod-server/prod-db NODE_ENV=production node app.js
Common Mistakes
- Hardcoding sensitive information: Developers often mistakenly embed API keys, database credentials, or secret keys directly in their code. This is a major security vulnerability, especially when committing to public repositories. Always use environment variables for sensitive data.
Fix: Move all sensitive data to environment variables and access them via `process.env`. Use `dotenv` for local development.
- Not providing default values: Assuming an environment variable will always be set can lead to `undefined` errors if the variable is missing in a particular environment. This makes applications brittle.
Fix: Always provide a fallback or default value using the `||` operator or a dedicated configuration module. E.g., `const port = process.env.PORT || 3000;`.
- Committing `.env` files to version control: The `.env` file contains environment-specific and often sensitive data. Committing it to Git exposes this information. It should be ignored by version control.
Fix: Add `.env` to your `.gitignore` file. Provide a `.env.example` file with placeholder values to guide other developers.
Best Practices
- Use `dotenv` for local development: The `dotenv` package is a simple and effective way to manage environment variables in a `.env` file during local development. Make sure `.env` is in `.gitignore`.
- Prefix custom environment variables: To avoid conflicts with system environment variables, consider prefixing your application-specific variables (e.g., `APP_PORT`, `MYAPP_API_KEY`).
- Validate environment variables: Before starting your application, validate that all required environment variables are present and correctly formatted. You can write a small script or use a library (e.g., `joi` or `envalid`) for this.
- Separate configuration from code: Adhere to the Twelve-Factor App methodology principle of storing configuration in the environment. This makes your application portable and scalable.
- Convert types explicitly: Remember that `process.env` values are always strings. Convert them to numbers, booleans, or other types as needed (e.g., `parseInt(process.env.PORT, 10)`, `process.env.ENABLE_FEATURE === 'true'`).
- Document required variables: Maintain a `README.md` or a `.env.example` file that clearly lists all expected environment variables and their purpose.
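The validation practice above can be sketched by hand; the `REQUIRED_VARS` list and `validateEnv` helper are hypothetical names, and libraries such as `envalid` provide a more complete version of the same idea:

```javascript
// Hypothetical helper: checks that required variables are present before startup.
const REQUIRED_VARS = ['NODE_ENV', 'PORT'];

function validateEnv(env) {
  // Returns the names of any required variables that are missing or empty.
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Simulate a partially configured environment:
const missing = validateEnv({ NODE_ENV: 'production' });
console.log(missing); // [ 'PORT' ]
if (missing.length > 0) {
  console.error(`Missing required env vars: ${missing.join(', ')}`);
  // In a real app: process.exit(1);
}
```

At real startup you would call `validateEnv(process.env)` before `app.listen`.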
Practice Exercises
- Create a Node.js script that reads an environment variable named `GREETING_MESSAGE`. If it's not set, it should default to "Hello, World!". Print the final message to the console.
- Modify the previous script to also read a `USER_NAME` environment variable. If `USER_NAME` is set, combine it with `GREETING_MESSAGE` (e.g., "Hello, John Doe!"). Otherwise, use the default "Hello, World!".
- Write a Node.js application that uses `dotenv`. Create a `.env` file with `APP_MODE=development` and `DATABASE_URL=mongodb://localhost:27017/mydevdb`. Your application should print these values to the console. Ensure `.env` is ignored by Git in your practice repository.
Mini Project / Task
Build a simple Node.js Express server that serves different content based on the `NODE_ENV` environment variable. If `NODE_ENV` is 'production', the server should listen on port `80` and respond with "Welcome to Production!". If `NODE_ENV` is 'development', it should listen on port `3000` and respond with "Development server running...". Use default values if `NODE_ENV` or `PORT` are not set.
Challenge (Optional)
Extend the mini-project. Implement a configuration module (`config.js`) that exports an object containing environment-specific settings (e.g., database connection strings, API keys, logging levels). Your Express server should import and use these settings. Additionally, add a validation step at the application startup that checks if all critical environment variables (e.g., `NODE_ENV`, `PORT`) are set, and if not, log an error and exit the process gracefully.
File Uploads
File uploads are a fundamental feature in many web applications, allowing users to submit various types of files (images, documents, videos, etc.) to the server. In Node.js, handling file uploads involves receiving binary data from a client, processing it, and then often storing it in a persistent location, such as a local file system, a database, or cloud storage like AWS S3. This capability is crucial for applications ranging from social media platforms where users upload profile pictures, to document management systems, and e-commerce sites needing product images. Without robust file upload mechanisms, many interactive and data-rich web experiences would be impossible. The core challenge lies in securely and efficiently handling large binary data streams, preventing malicious uploads, and ensuring data integrity.
When a user uploads a file, their browser typically sends a POST request with the 'Content-Type' header set to 'multipart/form-data'. This special content type is designed to encapsulate multiple parts within a single request body, where each part can represent a form field or a file. Node.js applications need specific middleware or libraries to parse this 'multipart/form-data' payload, as the built-in Node.js HTTP module doesn't handle it directly. Popular choices include 'multer', 'formidable', and 'busboy', each offering different levels of abstraction and features for stream processing and file handling.
Step-by-Step Explanation
To handle file uploads in Node.js, especially with Express, the most common approach involves using the 'multer' middleware. Here's a breakdown of the steps:
1. Install Multer: First, you need to install 'multer' in your project.
npm install multer
2. Import Multer: In your Express application file, require 'multer'.
3. Configure Storage: Define how and where files will be stored. Multer provides `diskStorage` for saving to the local filesystem and `memoryStorage` for keeping files in memory (useful for small files or immediate processing before sending to cloud storage).
- destination: A function to determine the folder where the uploaded files should be stored.
- filename: A function to determine the name of the file inside the destination folder. It's crucial to give unique names to avoid overwriting.
4. Initialize Multer: Pass your storage configuration to `multer()`.
5. Use as Middleware: Apply the Multer instance as middleware to your route handler.
- upload.single('fieldname'): For a single file upload, where 'fieldname' is the name attribute of the file input in your HTML form.
- upload.array('fieldname', maxCount): For multiple files with the same field name.
- upload.fields([{ name: 'avatar', maxCount: 1 }, { name: 'gallery', maxCount: 8 }]): For multiple files with different field names.
- upload.none(): If you only want to parse text fields of a multipart form.
6. Access File Data: After the middleware processes the request, the uploaded file information can be accessed via req.file (for single uploads) or req.files (for multiple uploads) in your route handler. This object contains properties like `filename`, `path`, `mimetype`, `size`, etc.
Comprehensive Code Examples
Basic Example (Single File Upload)
This example demonstrates a simple Express server that accepts a single image file upload and saves it to a 'uploads/' directory.
const express = require('express');
const multer = require('multer');
const path = require('path');
const app = express();
const port = 3000;
// Set up storage for uploaded files
const storage = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, 'uploads/'); // Files will be saved in the 'uploads/' directory
},
filename: function (req, file, cb) {
// Use the original filename with a timestamp to avoid name collisions
cb(null, file.fieldname + '-' + Date.now() + path.extname(file.originalname));
}
});
// Initialize multer with the storage configuration
const upload = multer({ storage: storage });
// Serve a basic HTML form for file upload
app.get('/', (req, res) => {
res.send(`<h1>Single File Upload</h1>
    <form action="/upload-single" method="POST" enctype="multipart/form-data">
      <input type="file" name="myImage" />
      <button type="submit">Upload</button>
    </form>`);
});
// Route to handle single file upload
app.post('/upload-single', upload.single('myImage'), (req, res) => {
if (!req.file) {
return res.status(400).send('No file uploaded.');
}
res.send(`File uploaded successfully: ${req.file.filename}`);
});
// Start the server
app.listen(port, () => {
console.log(`Server listening at http://localhost:${port}`);
});
/*
To run this:
1. Create a directory named 'uploads' in the same folder as your script.
2. Visit http://localhost:3000 in your browser, select a file, and upload.
*/
Real-world Example (Multiple Files with Validation)
This example shows how to upload multiple images, limit file size, and filter by file type.
const express = require('express');
const multer = require('multer');
const path = require('path');
const app = express();
const port = 3000;
// Set up storage for uploaded files
const storage = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, 'uploads/images/'); // Store images in a subfolder
},
filename: function (req, file, cb) {
cb(null, file.fieldname + '-' + Date.now() + path.extname(file.originalname));
}
});
// File filter to accept only images (jpeg, jpg, png, gif)
const fileFilter = (req, file, cb) => {
const allowedTypes = /jpeg|jpg|png|gif/;
const mimetype = allowedTypes.test(file.mimetype);
const extname = allowedTypes.test(path.extname(file.originalname).toLowerCase());
if (mimetype && extname) {
return cb(null, true);
}
cb(new Error('Only image files (JPEG, JPG, PNG, GIF) are allowed!'), false);
};
// Initialize multer with storage, file filter, and file size limits
const upload = multer({
storage: storage,
limits: { fileSize: 1024 * 1024 * 5 }, // 5 MB file size limit
fileFilter: fileFilter
});
// Serve HTML form for multiple file uploads
app.get('/multiple', (req, res) => {
res.send(`<h1>Multiple File Upload</h1>
    <form action="/upload-multiple" method="POST" enctype="multipart/form-data">
      <input type="file" name="myImages" multiple />
      <button type="submit">Upload</button>
    </form>`);
});
// Route to handle multiple file uploads
app.post('/upload-multiple', upload.array('myImages', 5), (req, res) => {
if (!req.files || req.files.length === 0) {
return res.status(400).send('No files uploaded.');
}
const uploadedFileNames = req.files.map(file => file.filename).join(', ');
res.send(`Files uploaded successfully: ${uploadedFileNames}`);
});
// Error handling middleware for Multer
app.use((err, req, res, next) => {
if (err instanceof multer.MulterError) {
if (err.code === 'LIMIT_FILE_SIZE') {
return res.status(400).send('File size too large. Max 5MB allowed.');
} else if (err.code === 'LIMIT_UNEXPECTED_FILE') {
return res.status(400).send('Too many files uploaded or wrong field name.');
} else {
return res.status(400).send(err.message);
}
} else if (err) {
return res.status(400).send(err.message); // Custom error from fileFilter
}
next();
});
app.listen(port, () => {
console.log(`Server listening at http://localhost:${port}`);
});
/*
To run this:
1. Create a directory named 'uploads/images' in the same folder as your script.
2. Visit http://localhost:3000/multiple, select multiple image files, and upload.
Try uploading non-image files or files larger than 5MB to see error handling.
*/
Advanced Usage (Cloud Storage Integration - S3 Example Concept)
While a full S3 integration requires AWS SDK setup and credentials, this conceptual example shows how you might use Multer's `memoryStorage` to get file buffers before sending them to a cloud service. This avoids saving files locally first.
const express = require('express');
const multer = require('multer');
// const AWS = require('aws-sdk'); // Uncomment and configure for actual S3 use
const app = express();
const port = 3000;
// Configure AWS S3 (conceptual - replace with actual config)
// AWS.config.update({
// accessKeyId: process.env.AWS_ACCESS_KEY_ID,
// secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
// region: process.env.AWS_REGION
// });
// const s3 = new AWS.S3();
// const S3_BUCKET = process.env.S3_BUCKET_NAME;
// Use memoryStorage to get file buffer directly
const uploadToMemory = multer({
storage: multer.memoryStorage(),
limits: { fileSize: 1024 * 1024 * 10 } // 10 MB limit for memory storage
});
app.get('/s3-upload', (req, res) => {
res.send(`<h1>Upload to Cloud (S3 Concept)</h1>
    <form action="/upload-s3" method="POST" enctype="multipart/form-data">
      <input type="file" name="cloudFile" />
      <button type="submit">Upload</button>
    </form>`);
});
app.post('/upload-s3', uploadToMemory.single('cloudFile'), async (req, res) => {
if (!req.file) {
return res.status(400).send('No file uploaded.');
}
// In a real application, you would send req.file.buffer to S3
// along with req.file.mimetype and a unique key.
console.log('File received in memory:', req.file.originalname);
console.log('File size:', req.file.size, 'bytes');
console.log('File mimetype:', req.file.mimetype);
// Example S3 upload logic (conceptual)
/*
const params = {
Bucket: S3_BUCKET,
Key: `uploads/${Date.now()}-${req.file.originalname}`, // Unique key
Body: req.file.buffer,
ContentType: req.file.mimetype,
ACL: 'public-read' // Or appropriate ACL
};
try {
const s3Data = await s3.upload(params).promise();
res.send(`File uploaded to S3 successfully. URL: ${s3Data.Location}`);
} catch (error) {
console.error('S3 Upload Error:', error);
res.status(500).send('Failed to upload file to S3.');
}
*/
res.send(`File '${req.file.originalname}' processed for cloud upload. (S3 integration not active in this example.)`);
});
// Error handling middleware for Multer (similar to previous example)
app.use((err, req, res, next) => {
if (err instanceof multer.MulterError) {
if (err.code === 'LIMIT_FILE_SIZE') {
return res.status(400).send('File size too large for memory upload. Max 10MB allowed.');
} else {
return res.status(400).send(err.message);
}
} else if (err) {
return res.status(400).send(err.message);
}
next();
});
app.listen(port, () => {
console.log(`Server listening at http://localhost:${port}`);
});
Common Mistakes
- Missing enctype="multipart/form-data" in HTML Form: Without this attribute, the browser will not send the file data correctly, and Multer (or any file upload parser) will not find any files.
Fix: Always include enctype="multipart/form-data" on the <form> element that submits files.
- Incorrect Field Name: The string passed to upload.single() or upload.array() must exactly match the name attribute of your <input type="file"> element in the HTML form.
Fix: Double-check that <input type="file" name="myFile"> corresponds to upload.single('myFile').
- No Destination Folder or Permissions Issues: If the specified destination directory doesn't exist or the Node.js process lacks write permissions, Multer will throw an error and fail to save files.
Fix: Manually create the directory (e.g., mkdir uploads) and ensure your Node.js application has the necessary read/write permissions for that folder.
- Inadequate Error Handling: Forgetting to implement middleware to catch Multer-specific errors (like file size limits or type restrictions) can lead to unhandled exceptions and poor user experience.
Fix: Always include a dedicated error-handling middleware after your Multer upload middleware, as shown in the real-world example.
Best Practices
- Validate File Types and Sizes: Always implement server-side validation for file types (using fileFilter) and file sizes (using limits). Never trust client-side validation alone, as it can be easily bypassed. This prevents malicious uploads and server overload.
- Generate Unique Filenames: Do not rely on original filenames, as they can cause collisions (overwriting existing files) or path traversal attacks. Use a combination of timestamps, UUIDs, or hashing to create unique filenames. Date.now() + path.extname(file.originalname) is a common pattern.
- Store Files Outside Public Directory: For security, store uploaded files in a directory that is not directly served by your web server (e.g., not inside 'public/'). If you need to serve them, use a dedicated route to fetch and stream them, allowing for access control.
- Sanitize User Input: If you use any part of the original filename or user-provided metadata in your storage path or database entries, ensure it's properly sanitized to prevent injection attacks.
- Consider Cloud Storage: For production applications, especially those requiring scalability, durability, and content delivery networks (CDNs), storing files in cloud services like AWS S3, Google Cloud Storage, or Azure Blob Storage is highly recommended over local disk storage.
- Implement Error Handling: Provide clear and helpful error messages to the user if an upload fails due to size limits, incorrect file types, or server issues.
Practice Exercises
- Beginner-friendly: Create an Express application that allows a user to upload a single text file (.txt). Save the file with a unique name in a 'documents/' folder and send back a success message displaying the saved filename.
- Intermediate: Modify the multi-file upload example to accept up to 3 PDF files only, with a maximum size of 2MB per file. If any file fails validation, return an appropriate error message.
- Advanced: Extend the single image upload example to also receive a text field 'description' along with the image. Store the image and log both the image filename and the description to the console upon successful upload.
Mini Project / Task
Build a simple 'Profile Picture Uploader' application. It should have a single HTML page with a file input and a submit button. When a user uploads an image, the server should save it to a 'profile_pics/' directory, ensuring the filename is unique. After successful upload, redirect the user to a page that displays their newly uploaded profile picture (you'll need to serve static files for this). Implement basic image type (JPG, PNG) and size (max 2MB) validation.
Challenge (Optional)
Enhance the 'Profile Picture Uploader' mini-project. After a successful upload, instead of just saving the original image, use a library like 'sharp' or 'jimp' (you'll need to install one) to resize the uploaded image to a thumbnail (e.g., 150x150 pixels) and save both the original and the thumbnail version. Ensure the server stores both files with unique names and can display both versions on a success page.
Logging and Debugging
Logging and debugging are essential skills for every Node Js developer because backend applications often run without a visual interface. When something goes wrong, logs help you understand what happened, when it happened, and why it happened. Debugging helps you pause execution, inspect variables, trace control flow, and identify bugs more precisely. In real-world systems, logging is used in web servers, authentication systems, payment flows, background jobs, and API integrations. For example, a team may log incoming requests, failed database queries, and user login attempts to monitor application health and troubleshoot issues quickly.
In Node Js, logging can be as simple as console.log(), but production systems usually need structured and meaningful logs such as info, warning, error, and debug messages. Debugging can be done with the built-in Node inspector, browser DevTools, or an editor like VS Code. Common logging types include application logs, error logs, access logs, and audit logs. Common debugging approaches include printing variable values, using breakpoints, stepping through code, inspecting stack traces, and enabling debug flags. Together, logging and debugging reduce downtime, speed up maintenance, and improve code quality.
Step-by-Step Explanation
Start with basic logging using console.log() to print values. Use console.error() for errors and console.warn() for warnings. Keep messages clear and include useful context such as request IDs, user IDs, or function names. Next, use try...catch to capture runtime errors in asynchronous code with await. Then learn the Node debugger by starting an app with node --inspect app.js or node inspect app.js. Add breakpoints in your editor and pause execution where the bug may occur. Inspect variables, step into functions, and follow stack traces carefully. For long-running apps, prefer structured logs so they are easier to search and analyze. Also, avoid logging secrets such as passwords, tokens, and private keys. In development, verbose debug logs are helpful; in production, keep logs useful, safe, and performance-aware.
Comprehensive Code Examples
// Basic example: simple logging
console.log('Server starting...');
const port = 3000;
console.log('Listening on port:', port);
console.warn('This is a warning message');
console.error('This is an error message');
// Real-world example: logging inside an API handler
const http = require('http');
const server = http.createServer((req, res) => {
console.log(`[INFO] ${req.method} ${req.url}`);
if (req.url === '/users') {
res.writeHead(200, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ message: 'User list loaded' }));
} else {
console.error(`[ERROR] Route not found: ${req.url}`);
res.writeHead(404);
res.end('Not Found');
}
});
server.listen(3000, () => console.log('[INFO] Server running on port 3000'));
// Advanced example: async debugging and error tracing
async function fetchUserData(userId) {
try {
console.log('[DEBUG] Fetching user:', userId);
if (!userId) {
throw new Error('User ID is required');
}
const user = { id: userId, name: 'Ava' };
console.log('[INFO] User fetched successfully', user);
return user;
} catch (error) {
console.error('[ERROR]', error.message);
console.error(error.stack);
}
}
fetchUserData();
To debug this file, run node --inspect app.js and open the debugger in Chrome or VS Code. Set a breakpoint inside fetchUserData and inspect the value of userId before the error is thrown.
Common Mistakes
Using only console.log() for everything: Use console.warn(), console.error(), and meaningful labels so logs are easier to read.
Logging sensitive data: Never print passwords, tokens, API keys, or personal user data.
Ignoring stack traces: When errors happen, inspect error.stack to locate the source faster.
Too many useless logs: Remove noisy messages and keep logs focused on useful events.
Best Practices
Write short, clear, contextual log messages.
Separate info, warning, debug, and error levels.
Use timestamps and request identifiers in larger applications.
Prefer structured logs in production systems.
Use breakpoints for complex bugs instead of guessing.
Test both success and failure paths while debugging.
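As a sketch of these practices, here is a tiny leveled logger that emits one JSON line per event; the exact field names are an assumption, not a standard:

```javascript
// Minimal structured logger: one JSON line per event, with timestamp and level.
function log(level, message, context = {}) {
  const entry = { time: new Date().toISOString(), level, message, ...context };
  const line = JSON.stringify(entry);
  if (level === 'error') console.error(line); // errors go to stderr
  else console.log(line);
  return entry;
}

const entry = log('info', 'Server started', { port: 3000 });
log('error', 'Database connection failed', { requestId: 'abc123' });
```

Because each line is valid JSON, production log tools can filter by level, time, or request identifier without fragile string matching.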
Practice Exercises
Create a Node Js script that logs a startup message, a warning, and an error.
Write an async function that throws an error when an input is missing, then catch and log the message and stack trace.
Build a simple HTTP server that logs each request method and URL.
Mini Project / Task
Build a small request logger server that listens on port 3000, logs every incoming route, returns JSON for one valid route, and logs an error for unknown routes.
Challenge (Optional)
Create a reusable logger function that accepts a log level and message, then formats output like [INFO] Application started or [ERROR] Database connection failed.
Testing Basics
Testing in Node Js means writing code that checks whether your application behaves the way you expect. Instead of manually clicking through features every time you make a change, automated tests run quickly and repeatedly to confirm that functions, routes, and services still work. In real projects, testing is used for API endpoints, validation logic, database-related behavior, authentication flows, and utility functions. It exists to reduce bugs, improve confidence during refactoring, and make collaboration safer for teams.
The most common testing categories beginners should know are unit tests, integration tests, and end-to-end style checks. A unit test focuses on one small piece, such as a function that calculates tax. An integration test checks how multiple parts work together, such as an Express route calling a service and returning JSON. End-to-end tests simulate full user behavior across the app. In Node Js, popular tools include the built-in test runner, Jest, Mocha, Chai, and Supertest for HTTP testing. A good beginner workflow is simple: arrange the test data, act by calling the code, and assert the expected result. This is often called the Arrange-Act-Assert pattern.
You should also understand test cases, assertions, mocks, and test isolation. Assertions verify results. Mocks and stubs replace real dependencies like databases or external APIs so tests stay fast and predictable. Test isolation means one test should not depend on another. If a test only passes when run after a different test, your suite becomes fragile. Good testing habits help teams deploy more often and with fewer surprises.
Step-by-Step Explanation
Start by creating a small file to test, such as a math utility. Then create a matching test file. In Node Js, tests usually use a naming pattern like sum.test.js or are placed inside a test folder. Each test should describe one behavior clearly. First, import the function being tested. Second, provide input values. Third, run the function. Fourth, compare the actual result with the expected result using an assertion. For asynchronous code, return a promise or use async/await so the test runner knows when the test finishes. For API testing, create your Express app and send requests to it using a tool like Supertest. Then assert the status code and response body. Keep tests small, readable, and independent.
Comprehensive Code Examples
Basic example
// sum.js
function sum(a, b) {
return a + b;
}
module.exports = sum;
// sum.test.js with Jest
const sum = require('./sum');
test('adds two numbers', () => {
expect(sum(2, 3)).toBe(5);
});
Real-world example
// validator.js
function isValidEmail(email) {
return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}
module.exports = { isValidEmail };
// validator.test.js
const { isValidEmail } = require('./validator');
test('accepts a valid email', () => {
expect(isValidEmail('[email protected]')).toBe(true);
});
test('rejects an invalid email', () => {
expect(isValidEmail('userexample.com')).toBe(false);
});
Advanced usage
// app.js
const express = require('express');
const app = express();
app.get('/health', (req, res) => {
res.status(200).json({ status: 'ok' });
});
module.exports = app;
// app.test.js
const request = require('supertest');
const app = require('./app');
test('GET /health returns status ok', async () => {
const response = await request(app).get('/health');
expect(response.status).toBe(200);
expect(response.body.status).toBe('ok');
});
Common Mistakes
- Not testing edge cases: beginners test only normal input. Also test empty values, invalid data, and unexpected types.
- Forgetting async handling: if you do not use await or return the promise, tests may pass incorrectly. Always wait for asynchronous work.
- Writing dependent tests: one test should not rely on data created by another. Reset state before each test when needed.
Best Practices
- Use clear test names that describe behavior, not implementation details.
- Prefer small focused tests over one large test that checks many things.
- Mock slow or external services to keep tests fast and reliable.
- Test business logic first because it provides the highest value.
- Run tests automatically before pushing code or deploying changes.
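To illustrate the mocking advice without any framework, a hand-rolled stub can stand in for a slow dependency; the `makeGreetingService` factory and its stub are hypothetical names for this sketch:

```javascript
// The real getUser would query a database; injecting it as a parameter lets a
// test swap in a stub that returns canned data instantly.
function makeGreetingService(getUser) {
  return {
    async greet(id) {
      const user = await getUser(id);
      return `Hello, ${user.name}!`;
    },
  };
}

// Stub used in place of the real database call:
const stubGetUser = async (id) => ({ id, name: 'Ava' });
const service = makeGreetingService(stubGetUser);

service.greet(1).then((msg) => console.log(msg)); // Hello, Ava!
```

With Jest, `jest.fn().mockResolvedValue({ id: 1, name: 'Ava' })` plays the same role as the hand-written stub.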
Practice Exercises
- Write a unit test for a function named multiply that returns the product of two numbers.
- Create tests for a function that checks whether a password has at least 8 characters.
- Build a simple Express route /ping and write a test that confirms it returns status 200 and the text pong.
Mini Project / Task
Create a small Node Js utility module for order totals with functions for subtotal, tax, and final total, then write tests for valid inputs, zero values, and invalid numbers.
Challenge (Optional)
Build a tested Express endpoint that accepts a user email in JSON, validates it, and returns either a success response or a 400 error with a helpful message.
Deployment
Deployment is the process of taking your Node Js application from your local machine and making it available in a live environment so real users, teammates, or client systems can access it. In real life, deployment is used when launching APIs, dashboards, e-commerce backends, chat servers, and automation services. A local app may work perfectly on your laptop, but deployment adds production concerns such as environment variables, process management, ports, logging, security, scalability, and monitoring. In Node Js, deployment commonly happens on virtual servers, container platforms, platform-as-a-service providers, or serverless systems. Common deployment styles include traditional VPS hosting, Docker-based deployment, cloud app platforms such as Render or Railway, and serverless functions for lightweight API endpoints. The goal is not only to make the app run, but to make it stable, restartable, secure, and maintainable.
Before deployment, your app should have a clear start script, dependency list, and configuration strategy. You should know the difference between development and production environments. In development, you may use tools like nodemon and verbose logs. In production, you should use optimized settings, real environment variables, and a process manager or hosting runtime. You must also make sure your app listens on the correct port, usually with process.env.PORT, because cloud providers assign ports dynamically.
Step-by-Step Explanation
First, prepare your application. In package.json, add a production start script such as node server.js. Second, configure your server to read the port from the environment. Third, move secrets like database URLs and API keys into environment variables instead of hardcoding them. Fourth, install only required dependencies and commit your code to a Git repository. Fifth, choose a deployment target. A platform service is easiest for beginners because it handles build and runtime setup. A VPS gives more control but requires server administration. Containers help standardize the environment. Sixth, deploy and test the live URL. Finally, add logging, health checks, and restart behavior.
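Steps two and three above can be sketched as a small config module; the property names and DATABASE_URL variable are illustrative, not required by any platform:

```javascript
// Hypothetical config module: centralizes environment reading (steps two and three above)
const config = {
  port: Number(process.env.PORT) || 3000,          // cloud providers assign this dynamically
  nodeEnv: process.env.NODE_ENV || 'development',  // switches production behavior
  databaseUrl: process.env.DATABASE_URL,           // secret: set on the host, never hardcoded
};

module.exports = config;
```

Reading all environment values in one place makes it obvious which settings a deployment must provide.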
Typical production syntax includes defining scripts in package.json, loading environment variables, and handling startup cleanly. For Express apps, always use const PORT = process.env.PORT || 3000; and bind using app.listen(PORT).
Comprehensive Code Examples
// Basic example: deployment-ready Express server
const express = require('express');
const app = express();
const PORT = process.env.PORT || 3000;
app.get('/', (req, res) => {
res.send('Node Js app is live');
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
// Real-world example: health route and environment-based config
const express = require('express');
const app = express();
const PORT = process.env.PORT || 3000;
const NODE_ENV = process.env.NODE_ENV || 'development';
app.get('/health', (req, res) => {
res.json({ status: 'ok', environment: NODE_ENV });
});
app.get('/api/message', (req, res) => {
res.json({ message: 'Production-ready endpoint' });
});
app.listen(PORT, () => {
console.log(`App started in ${NODE_ENV} mode on ${PORT}`);
});
// Advanced usage: graceful shutdown for production
const express = require('express');
const app = express();
const server = app.listen(process.env.PORT || 3000);
process.on('SIGTERM', () => {
console.log('SIGTERM received');
server.close(() => {
console.log('Server closed gracefully');
process.exit(0);
});
});
Common Mistakes
- Hardcoding the port: Beginners often use only 3000. Fix it by using process.env.PORT || 3000.
- Committing secrets: API keys and database passwords should not be stored in source files. Fix it by using environment variables and a secure secret manager.
- Using development tools in production: Running nodemon or debug-heavy configs in production wastes resources. Fix it by using a proper start script and production settings.
- Missing error handling: Apps may crash silently. Fix it by logging startup errors and adding monitoring.
Best Practices
- Use npm start with a clean production script.
- Set NODE_ENV=production when deploying.
- Store secrets in environment variables, never in code.
- Add a health-check endpoint like /health.
- Use a process manager such as PM2 on VPS deployments.
- Enable graceful shutdown for reliable restarts.
- Check logs after every deployment and verify key routes.
Practice Exercises
- Create a simple Express server that uses process.env.PORT and returns a live message on the home route.
- Add a /health endpoint that returns JSON showing app status and environment name.
- Update a Node Js app so all secrets are read from environment variables instead of hardcoded values.
Mini Project / Task
Deploy a small Express API with two routes, one public route and one health-check route, then verify it works on a live URL using environment-based port configuration.
Challenge (Optional)
Prepare the same Node Js application for two deployment styles: one on a platform service and one using Docker, then compare what configuration changes are required.
Process Management
Process management in Node Js means understanding and controlling the running application process. Every time you start a Node program with node app.js, Node creates a process that uses memory, CPU time, environment variables, and operating system resources. This matters because real applications must start cleanly, respond to shutdown requests, report failures, and release resources like database connections or open servers. In real life, process management is used in APIs, background workers, CLI tools, schedulers, and containerized apps running in Docker or cloud platforms. Important ideas include the global process object, process IDs, current working directory, environment variables, command-line arguments, exit codes, and OS signals such as SIGINT and SIGTERM. You will also see related approaches like graceful shutdown, child processes for running separate commands, and process managers such as PM2 that keep apps alive in production. Good process management improves reliability, observability, and safe deployment behavior.
Step-by-Step Explanation
The built-in process object is available everywhere in Node Js. Use process.pid to get the process ID, process.cwd() for the current folder, process.argv for command-line arguments, and process.env for environment variables. Use process.exit(code) to end the program, where 0 usually means success and non-zero values indicate errors. You can listen for events like exit, beforeExit, uncaughtException, and unhandledRejection. For shutdown control, listen for signals using process.on('SIGINT') and process.on('SIGTERM'). When a signal arrives, stop accepting new work, close servers, save state if needed, and then exit. For advanced scenarios, Node also supports creating other processes with modules like child_process, but the main beginner goal is to manage the current process safely and predictably.
Comprehensive Code Examples
// Basic example: inspecting the current process
console.log('PID:', process.pid);
console.log('Node version:', process.version);
console.log('Folder:', process.cwd());
console.log('Args:', process.argv.slice(2));
console.log('ENV MODE:', process.env.NODE_ENV || 'development');
// Real-world example: HTTP server with graceful shutdown on signals
const http = require('http');
const server = http.createServer((req, res) => {
res.end('Server is running');
});
server.listen(3000, () => {
console.log('Listening on port 3000');
});
function shutdown(signal) {
console.log('Received', signal);
server.close(() => {
console.log('HTTP server closed');
process.exit(0);
});
}
process.on('SIGINT', () => shutdown('SIGINT'));
process.on('SIGINT', () => shutdown('SIGINT'));
process.on('SIGTERM', () => shutdown('SIGTERM'));
// Advanced usage: running a command in a child process with exec
const { exec } = require('child_process');
console.log('Parent PID:', process.pid);
exec('node -v', (error, stdout, stderr) => {
if (error) {
console.error('Command failed:', error.message);
process.exit(1);
}
if (stderr) {
console.error('stderr:', stderr);
}
console.log('Child output:', stdout.trim());
console.log('Memory usage:', process.memoryUsage().heapUsed);
});
Common Mistakes
- Calling process.exit() too early: This can stop logs, requests, or database cleanup. Fix: close resources first, then exit.
- Ignoring shutdown signals: Beginners often stop apps with forced termination only. Fix: handle SIGINT and SIGTERM for graceful shutdown.
- Storing secrets directly in code: Fix: use process.env and environment configuration.
- Depending only on uncaughtException: Fix: log the error, clean up if possible, and restart the app with a supervisor.
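The last point can be sketched as below; this is one common pattern for last-resort handlers, not the only safe one:

```javascript
// Sketch: last-resort handlers that log before the process goes down
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
  process.exitCode = 1; // mark failure but let pending work finish
});

process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err.message);
  process.exit(1); // state is unknown; exit and let a supervisor (PM2, Docker) restart the app
});
```

These handlers exist to capture diagnostic information; recovery should come from the supervisor restarting the process, not from continuing in an unknown state.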
Best Practices
- Use environment variables for ports, secrets, and environment mode.
- Return meaningful exit codes so automation and CI tools can detect failures.
- Log process start time, PID, and shutdown reason for easier debugging.
- Implement graceful shutdown for HTTP servers, queues, and database connections.
- Use PM2, Docker restart policies, or orchestration tools instead of writing your own restart loop.
- Monitor memory usage and uptime to detect leaks or unhealthy behavior.
Practice Exercises
- Create a script that prints the process ID, Node version, current working directory, and two command-line arguments.
- Build a small server that reads the port from process.env.PORT and falls back to 3000.
- Write a script that listens for SIGINT, prints a shutdown message, waits one second, and exits with code 0.
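The third exercise can be sketched as follows; the final process.kill line only simulates pressing Ctrl + C so the script can be observed end to end, and would be removed when running interactively:

```javascript
// Sketch for the third exercise: graceful SIGINT handling with a short delay
process.once('SIGINT', () => {
  console.log('Shutting down gracefully...');
  setTimeout(() => process.exit(0), 1000); // wait one second, then exit with success code 0
});

console.log('Running. Press Ctrl + C to stop. PID:', process.pid);

// Simulate Ctrl + C after a moment (remove this line when running interactively)
setTimeout(() => process.kill(process.pid, 'SIGINT'), 200);
```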
Mini Project / Task
Build a small health-check server that exposes one endpoint, logs its PID and uptime on startup, reads configuration from environment variables, and shuts down gracefully when the user presses Ctrl + C.
Challenge (Optional)
Create a CLI tool that accepts a shell command from process.argv, runs it with child_process, prints the output, and exits with a non-zero code when the command fails.
Final Project
The final project is where you combine the major ideas from a Node Js course into one complete backend application. It exists to move you from isolated exercises into real development, where you must design routes, organize files, validate input, handle errors, and deliver a working server that solves a business problem. In real life, Node Js final projects often become REST APIs for task managers, inventory systems, booking platforms, note apps, or admin dashboards. The goal is not only to make something that runs, but to build something maintainable, testable, and understandable.
For this project, think of your work as a small production-ready backend. A good beginner-friendly choice is a Task Manager API with features such as creating tasks, listing tasks, updating status, deleting tasks, and filtering by completion state. You can also extend it with users, authentication, or categories if your course has covered those topics. The project naturally includes several parts: project setup, folder organization, environment variables, routing, controllers, data storage, middleware, validation, and deployment preparation. These are the main sub-areas of a backend project, and understanding how they fit together is more important than memorizing individual commands.
Step-by-Step Explanation
Start by defining the project scope. Decide what the app does, who uses it, and which endpoints are required. Next, initialize the project using npm init -y and install needed packages such as Express and dotenv. Create a clean structure like src/routes, src/controllers, src/middleware, and src/data. Then create an app.js or server.js file to start the server.
After setup, build the Express app. Add middleware for JSON parsing using express.json(). Then create routes for each resource, such as GET /tasks, POST /tasks, PATCH /tasks/:id, and DELETE /tasks/:id. Move route logic into controller functions so files stay readable.
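The controller split described above can be sketched like this; the file name and in-memory array are assumptions for illustration:

```javascript
// Hypothetical src/controllers/taskController.js: route logic lives here, not in route definitions
let tasks = [];

function listTasks(req, res) {
  res.json(tasks);
}

function createTask(req, res) {
  const { title } = req.body;
  if (!title) {
    return res.status(400).json({ error: 'Title is required' });
  }
  const task = { id: Date.now(), title, completed: false };
  tasks.push(task);
  res.status(201).json(task);
}

module.exports = { listTasks, createTask };
```

A routes file then only wires paths to these functions, for example router.get('/tasks', listTasks) and router.post('/tasks', createTask), which keeps each file short and readable.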
Next, decide how data is stored. For a beginner final project, an in-memory array or JSON file is acceptable. If your course covered databases, use MongoDB or PostgreSQL instead. Add validation so bad input does not crash the server. For example, require a title when creating a task. Then add error handling middleware to send consistent responses.
Finally, test all endpoints using Postman or Thunder Client. Check success cases and failure cases. Clean up naming, improve messages, and document how to run the project. A final project is complete when another developer can clone it, install dependencies, add environment variables, and run it without confusion.
Comprehensive Code Examples
// Basic example: minimal Express app for the Task Manager API
const express = require('express');
const app = express();
app.use(express.json());
app.get('/', (req, res) => {
res.json({ message: 'Task Manager API is running' });
});
app.listen(3000, () => {
console.log('Server running on port 3000');
});
// Real-world example: GET and POST routes with an in-memory task list
const express = require('express');
const app = express();
app.use(express.json());
let tasks = [
{ id: 1, title: 'Learn Node Js', completed: false },
{ id: 2, title: 'Build final project', completed: true }
];
app.get('/tasks', (req, res) => {
res.json(tasks);
});
app.post('/tasks', (req, res) => {
const { title } = req.body;
if (!title) {
return res.status(400).json({ error: 'Title is required' });
}
const newTask = { id: Date.now(), title, completed: false };
tasks.push(newTask);
res.status(201).json(newTask);
});
app.listen(3000);
// Advanced usage: filtering, updating, deleting, and error-handling middleware
const express = require('express');
const app = express();
app.use(express.json());
let tasks = [];
app.get('/tasks', (req, res) => {
const { completed } = req.query;
if (completed === 'true') return res.json(tasks.filter(t => t.completed));
if (completed === 'false') return res.json(tasks.filter(t => !t.completed));
res.json(tasks);
});
app.patch('/tasks/:id', (req, res) => {
const task = tasks.find(t => t.id === Number(req.params.id));
if (!task) return res.status(404).json({ error: 'Task not found' });
if (typeof req.body.completed === 'boolean') task.completed = req.body.completed;
if (req.body.title) task.title = req.body.title;
res.json(task);
});
app.delete('/tasks/:id', (req, res) => {
const id = Number(req.params.id);
tasks = tasks.filter(t => t.id !== id);
res.json({ message: 'Task deleted' });
});
app.use((err, req, res, next) => {
res.status(500).json({ error: 'Internal server error' });
});
Common Mistakes
- Putting all code in one file: Split routes, controllers, and helpers into separate files.
- No validation: Always check request body fields before creating or updating data.
- Ignoring error responses: Return proper status codes like 400, 404, and 500.
- Hardcoding configuration: Use environment variables for ports and secrets.
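The validation fix above can be written as reusable middleware; the function name is illustrative:

```javascript
// Hypothetical validation middleware: rejects requests without a usable title
function requireTitle(req, res, next) {
  const title = req.body && req.body.title;
  if (typeof title !== 'string' || title.trim() === '') {
    return res.status(400).json({ error: 'Title is required' });
  }
  next();
}

module.exports = requireTitle;
```

Mounted per route, for example app.post('/tasks', requireTitle, createTask), it keeps controllers free of repeated checks.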
Best Practices
- Choose a small, clear project scope and finish core features first.
- Use meaningful names for routes, files, variables, and controller functions.
- Keep business logic out of route definitions when possible.
- Test every endpoint manually before calling the project complete.
- Add a README with setup steps, scripts, and endpoint documentation.
Practice Exercises
- Create a basic Express server for a project idea of your choice and return a welcome JSON message.
- Build GET and POST routes for a resource such as tasks, books, or notes.
- Add validation that prevents empty titles and returns a clear error message.
Mini Project / Task
Build a Task Manager API with routes to create, list, update, and delete tasks. Include validation, error handling, and a clear folder structure.
Challenge (Optional)
Extend the final project by adding user accounts and a protected route so each user can only access their own tasks.