
Getting Started with Docker: A Beginner’s Guide to Containerization

Docker is a containerization platform that allows developers to package applications and their dependencies into containers, which can run consistently across different environments. Here's a roadmap for learning Docker, starting from the basics and progressing to more advanced topics:

1. Introduction to Docker

What is Docker?

Docker is a platform used for developing, shipping, and running applications inside containers. Containers package an application and all its dependencies, so it works consistently regardless of the environment.

Why Docker?

Portability: Containers can run on any machine, ensuring that the app will run the same across different environments (development, staging, production).

Efficiency: Containers share the host OS kernel and are lightweight compared to virtual machines.

Core Concepts

  • Containers: Encapsulated environments that run applications and services.
  • Images: Read-only templates used to create containers.
  • Dockerfile: A script that contains instructions on how to build a Docker image.
  • Docker Hub: A cloud-based registry where you can store and share Docker images.
  • Docker Daemon: The background process that manages Docker containers.
  • Docker CLI: Command-line interface used to interact with Docker.

2. Setting Up Docker

Installation:

  • Install Docker on your machine: Docker Desktop on Windows or macOS, Docker Engine on Linux.
  • Once installed, familiarize yourself with basic commands like:

    docker --version: To check the installed version.

    docker info: To see system information about Docker.

  • Running your First Container:

    Run a simple "Hello World" Docker container:

    docker run hello-world
    

    This will pull the hello-world image from Docker Hub and run it in a container.

3. Basic Docker Commands

Working with Containers:

  • docker run: Create and start a container.
  • docker ps: List running containers.
  • docker ps -a: List all containers (including stopped ones).
  • docker stop <container_id>: Stop a running container.
  • docker rm <container_id>: Remove a container.

Working with Images:

  • docker pull <image_name>: Download an image from Docker Hub.
  • docker build: Build an image from a Dockerfile.
  • docker images: List all available images on your machine.
  • docker rmi <image_id>: Remove an image.

Interacting with Containers:

  • docker exec -it <container_id> bash: Access the container's shell for debugging.
  • docker logs <container_id>: View the logs of a container.
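
For practice, you can run a throwaway container from a public image (nginx here) and walk it through its full lifecycle with the commands above:

docker run -d --name web-test -p 8080:80 nginx   # start nginx in the background
docker ps                                        # confirm it is running
docker logs web-test                             # view its output
docker exec -it web-test bash                    # open a shell inside it (type exit to leave)
docker stop web-test                             # stop the container
docker rm web-test                               # remove it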

 

4. Dockerfile and Image Creation

Writing a Dockerfile:

A Dockerfile is a simple text file that contains instructions to create an image.

Basic syntax:

FROM <base_image>
RUN <commands>
COPY <src> <dest>
CMD ["executable", "param1", "param2"]

Building an Image:

Create a Dockerfile in your project directory.

Example Dockerfile:

FROM python:3.8-slim
WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt
CMD ["python", "app.py"]

Build the Docker image:

docker build -t my-python-app .

This command builds a Docker image from the Dockerfile in the current directory. Breaking it down:

  • docker: Invokes the Docker CLI (Command Line Interface).
  • build: Tells Docker you want to build an image.
  • -t my-python-app: -t stands for tag. It assigns a name (and optionally a version) to the image. In this case, the image will be named my-python-app.
  • . (dot): Specifies the build context, usually the directory containing your Dockerfile and all necessary files. It tells Docker, “Look here for the Dockerfile and application code.”

Tagging and Pushing to Docker Hub:

Tag your image before pushing:

docker tag my-python-app mydockerhubusername/my-python-app:1.0

Push your image to Docker Hub:

docker push mydockerhubusername/my-python-app:1.0
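
Note that pushing requires you to be authenticated with Docker Hub first. If you are not already logged in, run:

docker login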

5. Networking in Docker

Docker Networking Basics:

Docker provides several networking modes for containers:

  • bridge (default): Containers on the same host can communicate via virtual bridge networks.
  • host: Containers share the host’s network stack.
  • none: No networking for the container.

Creating a Custom Network:

You can create custom networks for better control and isolation between containers.

docker network create my_network

Linking Containers:

Docker’s legacy --link option still exists but is deprecated; prefer user-defined networks, which give containers automatic name resolution.

docker run -d --name db-container --network my_network -e MYSQL_ROOT_PASSWORD=rootpassword mysql
docker run -d --name app-container --network my_network my-python-app
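
On a user-defined network such as my_network, containers can reach each other by container name through Docker’s built-in DNS, so the app can connect to the database using the hostname db-container. As an optional quick check (using a temporary busybox container), you can verify name resolution like this:

docker run --rm --network my_network busybox ping -c 2 db-container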

6. Docker Compose

What is Docker Compose?

Docker Compose is a tool for defining and running multi-container Docker applications. You define your application's services (containers), networks, and volumes in a docker-compose.yml file.

Basic docker-compose.yml Example:

version: '3'
services:
  web:
    image: my-python-app
    ports:
      - "5000:5000"
  db:
    image: mysql
    environment:
      MYSQL_ROOT_PASSWORD: rootpassword

Running Containers with Compose:

To start your application:

docker-compose up

To stop your application:

docker-compose down
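
A few other day-to-day Compose commands are worth knowing (on recent Docker versions the same commands are also available as docker compose via the Compose V2 plugin):

docker-compose up -d          # start the services in the background
docker-compose ps             # list the services and their status
docker-compose logs -f web    # follow the logs of the web service
docker-compose down -v        # stop and also remove the associated volumes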

 

7. Volumes and Data Persistence

What are Volumes?

Volumes provide persistent storage outside of containers. Containers are ephemeral: when a container is removed, any data written inside it is lost unless it was stored in a volume.

Working with Volumes:

Create a volume:

docker volume create my_volume

Mount a volume in a container:

docker run -v my_volume:/data my-python-app
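
You can list and inspect volumes to see where Docker keeps the data on the host; for local development, a bind mount of a host directory is a common alternative (shown here with the my-python-app image as an example):

docker volume ls                               # list volumes
docker volume inspect my_volume                # show details, including the host path
docker run -v $(pwd)/data:/data my-python-app  # bind-mount ./data from the host instead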

 

8. Docker Swarm and Orchestration

What is Docker Swarm?

Docker Swarm is Docker's native clustering and orchestration tool. It allows you to manage a cluster of Docker hosts and scale applications across multiple nodes.

Setting up Swarm:

Initialize the Swarm:

docker swarm init

Join other nodes to the swarm using the provided token.

Deploying Services in Swarm:

Deploy a service:

docker service create --name my_service -p 80:80 nginx

Scale services:

docker service scale my_service=3
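
To check the state of the cluster and the service, a few useful commands are:

docker node ls                # list the nodes in the swarm (run on a manager)
docker service ls             # list services and their replica counts
docker service ps my_service  # show which nodes the replicas are running on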

 

9. Docker Security

Best Practices:

  • Least Privilege: Run containers with the least privileges necessary.
  • Avoid Root User: Don’t run containers as root unless absolutely necessary (see the Dockerfile sketch after this list).
  • Image Scanning: Use image scanning tools like Clair or Trivy to check for vulnerabilities in your Docker images.
  • Use Trusted Base Images: Always use official or trusted base images to reduce risks.
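
As a minimal sketch of the "avoid root user" practice, a Dockerfile can create an unprivileged user and switch to it before the application starts (the user name appuser is just an example):

FROM python:3.8-slim
WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt
# Create an unprivileged user and run the app as that user
RUN useradd --create-home appuser
USER appuser
CMD ["python", "app.py"]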

Docker Content Trust (DCT):

  • Docker Content Trust (DCT) ensures that images are signed and verified, preventing the use of tampered images.
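
Content trust is enabled per shell session via an environment variable; once set, docker pull and docker push will only work with signed images:

export DOCKER_CONTENT_TRUST=1
docker pull nginx:latest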

 

10. Advanced Docker Topics

Multi-Stage Builds:

Multi-stage builds allow you to minimize the size of your Docker images by using multiple FROM statements in a single Dockerfile. Each stage can be used to perform specific tasks like building the app, testing it, and preparing the production image.
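
As a sketch building on the earlier Python example (the size savings are larger for compiled languages, but the pattern is the same), the first stage installs the dependencies and the final stage copies only what is needed into a clean base image:

# Stage 1: install dependencies into an isolated prefix
FROM python:3.8-slim AS builder
WORKDIR /app
COPY requirements.txt .
RUN pip install --prefix=/install -r requirements.txt

# Stage 2: copy only the installed packages and the app code
FROM python:3.8-slim
WORKDIR /app
COPY --from=builder /install /usr/local
COPY . /app
CMD ["python", "app.py"]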

CI/CD Pipelines with Docker:

Learn how to integrate Docker with CI/CD tools like Jenkins, GitLab CI, or GitHub Actions to automate your build and deployment process.
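
As a minimal sketch of such a pipeline (the workflow file name and image tag here are just examples), a GitHub Actions workflow that builds the image on every push could look like this:

# .github/workflows/docker-build.yml
name: docker-build
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the Docker image
        run: docker build -t my-python-app:${{ github.sha }} .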

Monitoring Docker Containers:

Use tools like Prometheus, Grafana, and cAdvisor to monitor your containers.
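
cAdvisor, for example, can itself run as a container; an invocation along the lines of the one in its documentation (the mounts may need adjusting for your host) is:

docker run -d --name cadvisor -p 8080:8080 \
  -v /:/rootfs:ro \
  -v /var/run:/var/run:ro \
  -v /sys:/sys:ro \
  -v /var/lib/docker/:/var/lib/docker:ro \
  gcr.io/cadvisor/cadvisor

You can then open http://localhost:8080 to browse per-container CPU and memory usage.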

Docker and Kubernetes:

Once you're comfortable with Docker, you can expand your knowledge to Kubernetes, which is a powerful container orchestration tool used to deploy, manage, and scale containerized applications.

 

Resources for Further Learning

  • Official documentation: https://docs.docker.com
  • Books:
    • "Docker Deep Dive" by Nigel Poulton
    • "Docker Up & Running" by Karl Matthias and Sean P. Kane

 

 

Connect to the Docker database

You have 3 options to access the database:

1. Connect using psql from your host machine

If you have psql installed locally, run:

psql -h localhost -p 5454 -U postgres
  • Default user is usually postgres
  • It will ask for a password (check your docker-compose.yml or environment variable POSTGRES_PASSWORD)
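
The host port 5454 above is just an example; it assumes your docker-compose.yml maps that host port to Postgres's default port 5432 inside the container, along these lines:

services:
  db:
    image: postgres
    environment:
      POSTGRES_PASSWORD: yourpassword
    ports:
      - "5454:5432"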

2. Connect inside the container

docker exec -it db_container_name psql -U postgres

If you want to connect to a specific database (e.g., test):

docker exec -it db_container_name psql -U postgres -d test

 

3. Use a GUI client like pgAdmin or DBeaver

  • Host: localhost
  • Port: 5454
  • Username: postgres
  • Password: (same as set in env variable or docker-compose file)
  • Database: postgres by default; to connect to a different database, enter its name (e.g., test)

 

Import a database into the Docker container

Step 1: Create the database in the container (in this example, a database named school)
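
If the database does not exist yet, you can create it from the host (this assumes the default postgres superuser; adjust the user and container name to your setup):

docker exec -it db_container_name psql -U postgres -c "CREATE DATABASE school;"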

Step 2: Copy the SQL file into the container (if it’s not already there)

The school database now exists in your Postgres container, so you just need to restore/import test.sql into it. From your host machine, copy the file in:

docker cp test.sql db_container_name:/test.sql

Step 3: Run psql to import into school

docker exec -i db_container_name psql -U user_name -d school < test.sql

Explanation:

  • -U user_name → connects as the database user that owns your databases
  • -d school → the target database
  • < test.sql → reads your dump file from the host
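
Note that < test.sql reads the dump from your host's current directory. If you prefer to use the copy placed inside the container in Step 2, point psql at that file instead:

docker exec -it db_container_name psql -U user_name -d school -f /test.sql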

Step 4: Verify the import

After import, check inside container:

docker exec -it db_container_name psql -U user_name -d school

(replace school with your database name)

Then list tables:

\dt

You should see the tables from your test.sql.

Note:

If your test.sql contains CREATE DATABASE test;, you’ll need to remove that line before restoring into school. Otherwise, it will try to create a new database instead of restoring into the existing one.

 

 

 

Restoring the Odoo filestore

Background

  • Odoo stores binary files (images, attachments, reports, etc.) not in the DB, but in the filestore directory.
  • By default, this is /var/lib/odoo/filestore/<dbname>/
  • When you restored only the database (.sql), the filestore did not come with it.

You need to also restore the filestore of the original DB.

If you have access to the original server

1. Locate the filestore folder:

/var/lib/odoo/.local/share/Odoo/filestore/<dbname>

(sometimes /var/lib/odoo/filestore/<dbname> depending on setup)

2. Zip it (use the parent directory you found in step 1):

tar -czf test_filestore.tar.gz -C /var/lib/odoo/filestore <dbname>

3. Copy it to your new server/container:

docker cp test_filestore.tar.gz web_container_name:/var/lib/odoo/filestore/

4. Extract it inside the container:

docker exec -it web_container_name bash
cd /var/lib/odoo/filestore
tar -xzf test_filestore.tar.gz

 

If you don’t have the filestore

  • Odoo will throw errors whenever it tries to load missing files.
  • Workarounds:
    • Delete broken attachments

      DELETE FROM ir_attachment WHERE store_fname IS NOT NULL;

      (removes all attachments, logos, documents, etc.)

    • Or selectively remove only missing files.

 

 

If you already have the filestore folder

You just need to place it in the location where your Odoo container expects it.

Where Odoo looks for the filestore

Inside your running container (web_container_name), Odoo looks for:

/var/lib/odoo/filestore/<dbname>/

 

Steps to restore filestore into Docker

1. Copy the filestore from your host to the container
Assuming your filestore/<db_name> folder is on your local machine:

docker cp /path/to/filestore/db_name web_container_name:/var/lib/odoo/filestore/

This will create:

/var/lib/odoo/filestore/<db_name>

2. Check inside the container:

docker exec -it web_container_name ls /var/lib/odoo/filestore/<db_name>

You should see folders like b8/, 4d/, etc. (hashed storage).

3. Restart the Odoo container:

docker restart web_container_name
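
If Odoo still reports missing attachments after the restart, the copied files may be owned by root (docker cp copies files into a container as root by default). Assuming the standard Odoo image, which runs as the odoo user, you can fix the ownership with:

docker exec -u root web_container_name chown -R odoo:odoo /var/lib/odoo/filestore/<db_name>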

 

 


About the author


Amrit Panta

Full-stack developer, content creator


