Introduction To Docker Containers


Reading time: 1 min
#Cloud #DevOps #Containers #Docker #Containerization #Microservices



Introduction to Docker Containers: A Practical Guide for Beginners

With the rise of cloud computing and microservices, containerization has become a pivotal technology, enabling developers to build, ship, and run applications efficiently. Docker is the leading containerization platform, offering a simple, consistent environment from development through production. This post aims to demystify Docker containers with practical examples, helping you get started quickly.

Ever wondered how developers deploy apps that run exactly the same way on your laptop and on servers? Meet Docker containers: the magic behind seamless application consistency.


What Are Docker Containers?

At its core, a Docker container is a lightweight, portable package that includes an application and everything it needs to run: code, runtime, system tools, libraries, and settings. Unlike virtual machines that simulate entire operating systems (which can be heavy), containers share the host OS kernel but run isolated processes. This makes them fast and resource-efficient.

Think of containers like shipping containers: no matter what's inside or where they are shipped, the container acts as a standard wrapper that works everywhere.

Why Use Docker Containers?

  • Consistency: “It works on my machine” becomes true everywhere. Containers eliminate environment differences.
  • Portability: Containers run identically across development machines, testing servers, and production.
  • Lightweight: Share OS resources without the overhead of full virtual machines.
  • Scalability: Easily scale apps horizontally by running multiple container instances.
  • Simplified Deployment: Deploy applications and dependencies as one package.

How to Get Started With Docker Containers

Step 1: Install Docker

First things first—install Docker on your machine.

  • For Windows/Mac, download Docker Desktop.
  • For Linux, follow your distribution’s instructions in the official Docker Engine install documentation.

After installation completes successfully, verify by running:

docker --version

You should see something like:

Docker version 24.0.2, build dabd88b
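For a fuller end-to-end check, you can also run Docker's own hello-world test image, which exercises the CLI, the daemon, an image pull, and a container run in one go. The guard below just makes the snippet safe to paste on a machine where Docker isn't set up yet:

```shell
# Verify the full pipeline: CLI -> daemon -> image pull -> container run.
if command -v docker >/dev/null 2>&1; then
  out=$(docker run --rm hello-world 2>&1) || out="Docker CLI found, but the daemon is not reachable"
else
  out="docker command not found; install Docker first"
fi
echo "$out"
```

On success you'll see a short greeting explaining the steps Docker just performed.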

Step 2: Run Your First Container

Let's run a simple container with an Ubuntu image:

docker run -it ubuntu bash

Explanation:

  • docker run tells Docker to start a new container.
  • -i keeps standard input open and -t allocates a terminal, so you can interact with the shell.
  • ubuntu is the image name (pulled from Docker Hub if it isn’t already local).
  • bash is the command to run inside the container.

You’ll now be inside a shell in an Ubuntu container!

Try running Linux commands like:

apt update && apt install -y curl
curl https://www.google.com

To exit the container’s shell, type:

exit
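One thing worth knowing before moving on: a container's filesystem changes are ephemeral. Once the container is removed, anything you installed or created inside it is gone. A quick demonstration, assuming Docker is running (--rm deletes the container the moment it exits):

```shell
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  # Create a file in a throwaway container, which is removed on exit
  docker run --rm ubuntu sh -c 'echo hello > /tmp/note && cat /tmp/note'
  # A fresh container from the same image starts clean: the file is gone
  seen=$(docker run --rm ubuntu sh -c 'ls /tmp/note 2>/dev/null || echo missing')
  echo "$seen"
else
  seen="docker not available"
fi
```

Persisting data across containers is what volumes are for; more on that at the end.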

Step 3: Understanding Images vs Containers

An image is like a blueprint—a read-only template that defines what goes inside your container (OS files, software dependencies). A container is a running instance of this image.

Try listing downloaded images:

docker images

And list active containers:

docker ps

Or list all containers (running or stopped):

docker ps -a
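To see the image/container distinction in action, start two containers from the same image and list them. A sketch, assuming Docker is running (sleep-a and sleep-b are just illustrative names):

```shell
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  # Two independent containers, both created from the single ubuntu image
  docker run -d --name sleep-a ubuntu sleep 300 >/dev/null
  docker run -d --name sleep-b ubuntu sleep 300 >/dev/null
  running=$(docker ps --format '{{.Names}}' --filter name=sleep-)
  echo "$running"                          # both names appear: one image, two containers
  docker rm -f sleep-a sleep-b >/dev/null  # clean up
else
  running="docker not available"
fi
```

Each container gets its own writable layer and process tree; the read-only image underneath is shared.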

Step 4: Create Your Own Docker Image

You don’t always have to pull ready-made images—you can build your own!

Create a folder named myapp with this simple Node.js app:

index.js

const http = require('http');
const port = 3000;

const server = http.createServer((req,res) => {
    res.statusCode = 200;
    res.setHeader('Content-Type', 'text/plain');
    res.end('Hello from my first Docker app!\n');
});

server.listen(port, () => {
    console.log(`Server running at http://localhost:${port}/`);
});

Create a package.json file:

{
  "name": "myapp",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {}
}

Finally, create a file named Dockerfile (no extension) containing:

# Use the official Node.js base image from Docker Hub
FROM node:18-alpine

# Set the working directory inside the container filesystem
WORKDIR /usr/src/app

# Copy files from the local folder into the working directory
COPY . .

# Document the port the app listens on (optional metadata)
EXPOSE 3000

# Define the command to run when the container launches
CMD ["node", "index.js"]
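An optional companion to the Dockerfile is a .dockerignore file, which keeps local clutter out of the build context that gets sent to Docker. This example isn't required for the tutorial, but it's a common habit for Node.js projects:

```text
# .dockerignore: files excluded from the build context
node_modules
npm-debug.log
.git
```

Excluding node_modules matters most when dependencies are installed inside the image rather than copied from the host.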

Build the image

Run this command inside myapp folder where Dockerfile lives:

docker build -t mynodeapp .

This tells Docker to build an image named mynodeapp using the current folder (.).

Run your app in a container

Launch the app while mapping port 3000 from container to your computer’s port 3000:

docker run -p 3000:3000 mynodeapp

Open http://localhost:3000 in your browser—you should see:

Hello from my first Docker app!
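Running in the foreground ties up your terminal. A common variation is detached mode with a named container, which you can then inspect and stop by name. A sketch, assuming Docker is running and the mynodeapp image was built above (myapp-test is just an illustrative name):

```shell
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  # -d detaches; Docker prints the new container's ID and returns immediately
  docker run -d -p 3000:3000 --name myapp-test mynodeapp || echo "build the mynodeapp image first"
  docker logs myapp-test 2>/dev/null    # the server's startup message, once booted
  docker rm -f myapp-test 2>/dev/null   # stop and remove in one step
  status="checked"
else
  status="docker not available"
fi
echo "$status"
```

Naming containers beats copying IDs around, especially once you're juggling more than one.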

Step 5: Stopping and Cleaning Up Containers

When you’re done testing, press Ctrl+C to stop the running container.

To list all containers again:

docker ps -a

Remove stopped containers via their ID or name:

docker rm CONTAINER_ID_OR_NAME

To free disk space, remove dangling (untagged, unused) images:

docker image prune 
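If many stopped containers and old images pile up, Docker also ships bulk-cleanup commands. Use them with care, since they delete every stopped container on the machine, not just this tutorial's (the -f flag skips the confirmation prompt):

```shell
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker container prune -f      # remove ALL stopped containers
  docker image prune -f          # remove dangling (untagged) images
  summary=$(docker system df)    # disk usage overview: images, containers, volumes
  echo "$summary"
else
  summary="docker not available"
fi
```

docker system df is handy on its own as a quick check of how much space Docker is using before you prune anything.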

Summary

Congratulations! You’ve now learned what Docker containers are, why they're powerful tools for developers and sysadmins alike, and how to create and run simple ones yourself with practical commands.

If you take away just one thing, let it be this: containers wrap apps and their environment into lightweight packages that behave identically wherever they go.


What Next?

Here are some ideas for further exploration in the Docker world:

  • Learn about volumes for persistent data storage.
  • Explore networks between multiple containers for microservices.
  • Automate builds & publish images with Docker Hub.
  • Integrate Docker into CI/CD pipelines.
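As a taste of the first item, a named volume survives after the containers that used it are gone. A minimal sketch, assuming Docker is running (mydata is an arbitrary volume name):

```shell
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker volume create mydata >/dev/null
  # Write a file into the volume from one short-lived container...
  docker run --rm -v mydata:/data ubuntu sh -c 'echo persisted > /data/note.txt'
  # ...then read it back from a brand-new container
  note=$(docker run --rm -v mydata:/data ubuntu cat /data/note.txt)
  echo "$note"                         # the file outlived the first container
  docker volume rm mydata >/dev/null   # clean up
else
  note="docker not available"
fi
```

This is the standard way to keep databases, uploads, and other state out of the ephemeral container layer.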

Let me know in the comments what you’d like me to cover next!

Happy Dockering! 🚢🐳

