Add Docker To Jenkins


Reading time: 1 min
#DevOps #CI/CD #Automation #Jenkins #Docker #Containers

Seamlessly Integrate Docker into Jenkins Pipelines for Immutable Build Environments

Software delivery fails when environments drift. CI builds that pass locally but fail in automation waste time and erode trust. Docker integration with Jenkins neutralizes these variables by enforcing containerized build and test steps, effectively standardizing execution from the developer laptop through to production.


Problem: Build Inconsistency and Environment Drift

The immortal phrase: "works on my machine." Jenkins alone can't guarantee build reproducibility. Subtle host differences (OS packages, network conditions, leftover build cache) introduce hard-to-debug failures. Attempting to align agent environments with custom scripts often leads to unmanageable snowflake servers.

Docker solves this at the source. Run all build, test, package, and even deployment steps inside containers. Jenkins becomes the orchestrator; Docker images define the runtime.


Methods to Run Docker in Jenkins Pipelines

Two production-grade approaches, each with distinct trade-offs:

1. Shelling Out to Docker on Host Agents

Agents must have Docker Engine (>= 20.10) installed and the Jenkins user added to the docker group (to avoid permission-denied errors). Pipelines then execute standard Docker CLI commands; a quick verification sketch follows the list below.

  • Maximum flexibility (custom images, Docker-in-Docker if necessary)
  • Security consideration: Jenkins gains host-level Docker privileges
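
Before committing to this pattern, a minimal smoke-test pipeline (a sketch, assuming the docker-agent label used in the examples below) confirms the agent can actually reach the Docker daemon:

// Sanity check: fails fast if Docker is missing or group membership is wrong.
pipeline {
    agent { label 'docker-agent' }
    stages {
        stage('Verify Docker Access') {
            steps {
                sh 'docker version'
                sh 'docker info --format "{{.ServerVersion}}"'
            }
        }
    }
}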

2. Using the Docker Pipeline Plugin

The Docker Pipeline plugin (docker-workflow), maintained by the Jenkins project, enables container orchestration directly in the pipeline DSL, with no custom shell scripting required. The plugin is installed on the Jenkins controller; it does not replace the need for Docker on the agent nodes (see the sketch after the list below).

  • Abstracts container use in pipelines (docker.image(...).inside)
  • Handles workspace mounting, log propagation, and cleanup automatically
  • Version: plugin >= 572.v950f58993843 recommended for bugfixes
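
A minimal scripted sketch of the plugin API referenced above; the myapp image name and ./ci/run-tests.sh script mirror the baseline example below and are placeholders:

// Scripted sketch using the plugin's global `docker` variable.
// "myapp" and ./ci/run-tests.sh are illustrative placeholders.
node('docker-agent') {
    checkout scm
    def image = docker.build("myapp:${env.BUILD_NUMBER}")   // wraps `docker build -t ... .`
    image.inside {
        sh './ci/run-tests.sh'                               // runs inside the freshly built image
    }
}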

Baseline Pipeline Example (Agent with Docker Installed)

Assume:

  • Jenkins agent: Ubuntu 22.04 LTS
  • Docker Engine: 24.0.2
  • Jenkins user in docker group (id jenkins returns e.g. uid=1001(jenkins) gid=1001(jenkins) groups=1001(jenkins),999(docker))
  • Node labeled docker-agent

pipeline {
    agent { label 'docker-agent' }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Docker Build & Test') {
            steps {
                sh '''
                  docker build --pull --cache-from=myapp:buildcache -t myapp:ci .
                  docker run --rm -e CI=true -w /workspace -v "$PWD":/workspace myapp:ci ./ci/run-tests.sh
                '''
            }
        }
        stage('Docker Cleanup') {
            steps {
                sh 'docker image prune -f || true'
            }
        }
        stage('Deploy') {
            steps {
                // Most teams push image here, e.g.:
                // sh 'docker push myapp:ci'
                echo 'Deploy step not shown'
            }
        }
    }
}

Note: Always include image pruning or periodic cleanup; otherwise, disk usage grows unbounded.
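
A dedicated cleanup stage only runs if the stages before it succeed; a declarative post block (a sketch using the same prune command) guarantees cleanup even when the build fails:

// Prune dangling images whether or not earlier stages succeed.
pipeline {
    agent { label 'docker-agent' }
    stages {
        stage('Build') {
            steps {
                sh 'docker build -t myapp:ci .'
            }
        }
    }
    post {
        always {
            sh 'docker image prune -f || true'
        }
    }
}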


Declarative: The Docker Pipeline Plugin Approach

One practical pattern—build and test Java with Maven inside an official container.

pipeline {
    agent any
    environment {
        MAVEN_OPTS = '-Dmaven.repo.local=.m2/repository'
    }
    stages {
        stage('Checkout') {
            steps { checkout scm }
        }
        stage('Build/Test in Docker') {
            steps {
                script {
                    docker.image('maven:3.9.4-eclipse-temurin-17-alpine').inside('-v $HOME/.m2:/root/.m2:ro') {
                        sh 'mvn clean verify'
                    }
                }
            }
        }
    }
}

Trade-off: .inside volumes may not play well with certain host filesystems (Windows hosts, SELinux, NFS).
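
One workaround when the host filesystem is the problem: drop the host mount and let the workspace-local repository (already configured by MAVEN_OPTS above) serve as the cache, since .inside bind-mounts the workspace itself. A sketch:

// No host volume: the Maven repo lives in the job workspace, which .inside
// already mounts into the container. Slower on a cold cache, but avoids
// host-filesystem quirks.
script {
    docker.image('maven:3.9.4-eclipse-temurin-17-alpine').inside {
        sh 'mvn -Dmaven.repo.local=.m2/repository clean verify'
    }
}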


Known Issues and Gotchas

  • Jenkins on Docker: If Jenkins itself runs in Docker, bind-mount the host Docker socket (/var/run/docker.sock) or use "daemonless" toolkits like Kaniko, though these require further configuration.
  • Docker-in-Docker (DinD): Useful for full isolation, but has security and performance trade-offs; requires privileged mode.
  • Layer Caching: For typical build pipelines which fetch dependencies on every run, structuring Dockerfile steps to maximize reuse is critical. Example: Move COPY package.json . above COPY . . in Node.js builds.
  • Credential Management: Use the Jenkins Credentials Binding Plugin for Docker Hub or private registry secrets; hardcoding tokens is a recurring anti-pattern (see the sketch after this list).
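
A minimal sketch of the credentials pattern, assuming a username/password credential already stored in Jenkins under the illustrative ID dockerhub-creds and an illustrative image name:

// Log in with credentials managed by Jenkins instead of hardcoded tokens.
// 'dockerhub-creds' and 'myorg/myapp:ci' are illustrative placeholders.
withCredentials([usernamePassword(credentialsId: 'dockerhub-creds',
                                  usernameVariable: 'REG_USER',
                                  passwordVariable: 'REG_PASS')]) {
    sh 'echo "$REG_PASS" | docker login -u "$REG_USER" --password-stdin'
    sh 'docker push myorg/myapp:ci'
    sh 'docker logout'
}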

Decision Table: Pipeline Models

Pattern                    Flexibility   Security   Complexity   Ideal For
Shelling out to Docker     High          Low        Medium       Custom pipelines
Docker Pipeline Plugin     Medium        Medium     Low          Standard workflows

Non-Obvious Tip

When running large test suites in containers, mapping /tmp into the workspace can mitigate "No space left on device" errors in bind mount environments (seen occasionally on Google Cloud Build and self-hosted LXC agents).

// Assumes pytest is available in the image; python:3.11-slim does not bundle it,
// so in practice use a project image built on top of it.
docker.image('python:3.11-slim').inside('-v /tmp:/workspace/tmp') {
    sh 'pytest --basetemp=/workspace/tmp'
}

Conclusion

Direct Docker integration with Jenkins ensures that every build, test, and deployment is executed within a controlled, reproducible environment. This minimizes discrepancies, accelerates onboarding, and provides a robust foundation for secure automation. While several valid approaches exist, tailor your integration point based on workflow needs, environment complexity, and risk appetite. Introducing containers into CI rarely solves everything—but it immediately eliminates a large class of build instability.


Note: Most advanced teams eventually transition from static agents toward dynamic, container-native execution platforms (e.g., Kubernetes with the Kubernetes Plugin). For legacy agents or hybrid footprints, the techniques above remain effective and battle-proven.
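
For reference, a minimal declarative sketch of a pod-based agent with the Kubernetes plugin; the pod spec and image are illustrative:

// Requires the Kubernetes plugin and a configured Kubernetes cloud.
pipeline {
    agent {
        kubernetes {
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: maven
    image: maven:3.9.4-eclipse-temurin-17
    command: ["sleep"]
    args: ["infinity"]
'''
        }
    }
    stages {
        stage('Build') {
            steps {
                container('maven') {
                    sh 'mvn -B clean verify'
                }
            }
        }
    }
}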