Building an End-to-End CI/CD Pipeline with Jenkins, Docker, and Kubernetes – Explained Step by Step

End-to-end automation is at the heart of modern DevOps practices.
This article walks through a real-world Jenkins pipeline that brings together GitHub, Composer, Docker, AWS, and Kubernetes into a streamlined CI/CD workflow. It covers:
- Code checkout and dependency installation
- Docker image build and push to Docker Hub
- Secure AWS authentication and EKS access
- Automated deployments to dev, staging, and production using kubectl
- Manual approval gating for production deployment
Each stage is explained line by line to help you understand how these tools integrate in a practical setup.
Let’s get started.
Jenkins Pipeline – Step by Step
pipeline {
Marks the beginning of a Jenkins Declarative Pipeline block.
agent {
label 'worker'
}
Specifies that all steps should run on a Jenkins node (agent) labeled ‘worker’.
options {
buildDiscarder(logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '7', numToKeepStr: '2'))
}
This configures the pipeline to keep only:
- 7 days of builds
- 2 most recent builds
Great for disk cleanup and avoiding clutter in Jenkins.
environment {
dockercred = credentials('docker-hub')
awscred = credentials('aws-key')
}
Declares environment variables that fetch credentials from Jenkins’ credential store:
- dockercred → Docker Hub credentials (docker-hub)
- awscred → AWS access credentials (aws-key)
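A useful detail: when credentials() binds a username/password credential, Jenkins also injects two companion variables with _USR and _PSW suffixes (here dockercred_USR and dockercred_PSW). That is what makes a non-interactive Docker login possible later in the pipeline. A minimal sketch:

```groovy
environment {
    // Binds the 'docker-hub' credential; Jenkins also injects
    // dockercred_USR (username) and dockercred_PSW (password).
    dockercred = credentials('docker-hub')
}
// Later, inside a stage, the suffixed variables allow logging in
// without putting the raw secret on the command line:
// sh 'echo $dockercred_PSW | docker login -u $dockercred_USR --password-stdin'
```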
Stages
Stage: CheckOut
stages {
stage('CheckOut') {
steps {
checkout scmGit(branches: [[name: '*/master']], extensions: [], userRemoteConfigs: [[url: 'Github Link']])
}
}
Clones the master branch from GitHub into the Jenkins workspace.
Stage: php composer test
stage('php composer test') {
steps {
sh 'composer install'
sh 'ls'
}
}
Installs PHP dependencies using Composer and lists files to confirm contents.
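Note that despite its name, this stage only installs dependencies. If the project used PHPUnit (an assumption – the test runner is not shown in the original pipeline), the stage could actually exercise the test suite as well, for example:

```shell
# Install dependencies reproducibly, then run the (assumed) PHPUnit suite.
composer install --no-interaction --prefer-dist
vendor/bin/phpunit
```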
Stage: Docker build
stage('Docker build'){
steps{
sh 'docker -v'
sh 'docker build -t ...'
sh 'docker login ...'
sh 'docker push ...'
}
}
- Builds a Docker image from your project.
- Logs in to Docker Hub using secure credentials.
- Pushes the image to your Docker Hub repo.
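The elided commands might be filled in along these lines, where myuser/myapp is a hypothetical Docker Hub repository and BUILD_NUMBER is Jenkins’ built-in build counter, used here as the image tag:

```shell
docker -v
docker build -t myuser/myapp:$BUILD_NUMBER .
# --password-stdin keeps the secret out of the process list and shell history.
echo $dockercred_PSW | docker login -u $dockercred_USR --password-stdin
docker push myuser/myapp:$BUILD_NUMBER
```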
Stage: aws test
stage('aws test') {
steps {
sh 'aws --version'
sh 'aws sts get-caller-identity'
}
}
Validates that AWS CLI is installed and credentials work correctly.
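When the credentials are valid, aws sts get-caller-identity returns a small JSON document identifying the AWS account and IAM principal Jenkins is acting as (the values below are placeholders):

```json
{
    "UserId": "AIDAEXAMPLEID",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/jenkins"
}
```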
Stage: kube config setup
stage('kube config setup') {
steps {
sh 'aws eks update-kubeconfig --region us-east-1 --name devops-working'
}
}
Updates the local kubeconfig file to allow kubectl to connect to your AWS EKS cluster.
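An optional sanity check after this step (not in the original pipeline) confirms that kubectl is pointed at the right cluster and can actually reach it:

```shell
# Show which cluster context is now active, then verify connectivity.
kubectl config current-context
kubectl get nodes
```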
Stage: kubectl deployment – dev
stage('kubectl deployment - dev'){
steps{
sh 'kubectl apply -f deployment-dev.yaml -n dev'
}
}
Deploys your application to the dev namespace on your Kubernetes cluster.
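One thing to keep in mind: kubectl apply returns as soon as the API server accepts the manifest, not when the pods are actually running. To make the build fail on a bad rollout before promoting to staging, you could follow the apply with a rollout status check (myapp is a hypothetical Deployment name from the manifest):

```shell
# Block until the rollout succeeds; fails the build if it doesn't finish in time.
kubectl rollout status deployment/myapp -n dev --timeout=120s
```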
Stage: kubectl deployment – staging
stage('kubectl deployment - staging'){
steps{
sh 'kubectl apply -f deployment-staging.yaml -n staging'
}
}
Deploys to the staging namespace.
Stage: kubectl deployment – prod
stage('kubectl deployment - prod'){
steps{
script{
def approval = input id: 'Deployment', message: 'Do you want to deploy to production?', submitter: 'admin'
}
sh 'kubectl apply -f deployment-prod.yaml -n prod'
}
}
}
}
- Manual approval required to proceed (for safety).
- Only an ‘admin’ user can approve deployment to production.
- Once approved, it applies the production manifest to the prod namespace.
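One common refinement (not in the original pipeline) is wrapping the input step in a timeout, so an unapproved build eventually aborts instead of holding the agent forever. A sketch:

```groovy
stage('kubectl deployment - prod') {
    steps {
        // Abort automatically if nobody approves within 30 minutes.
        timeout(time: 30, unit: 'MINUTES') {
            input message: 'Do you want to deploy to production?', submitter: 'admin'
        }
        sh 'kubectl apply -f deployment-prod.yaml -n prod'
    }
}
```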
Final Thoughts
This pipeline automates a full CI/CD workflow, from code checkout through Docker image build to multi-environment Kubernetes deployments. What makes it powerful is its combination of security and flexibility:
- It uses Jenkins credentials securely.
- Supports manual approvals for production.
- Integrates Docker, AWS CLI, and kubectl.
If you’re learning DevOps, I highly recommend understanding each part of your pipeline like this. It’s a great way to master CI/CD tools and cloud-native deployment strategies.
Feel free to check out the full version here 👉 GitHub Link