Mastering CI/CD Automation for Java Applications with Jenkins, AWS, Docker, Kubernetes, SonarQube, and Nexus: A Step-by-Step Guide

Introduction
In this guide, we’ll explore how to set up a Jenkins pipeline to automate your CI/CD process for a Java application. This pipeline integrates key DevOps tools, including AWS, Docker, Kubernetes (K8s), SonarQube, and Nexus, to streamline the software development lifecycle.
We’ll cover:
- A detailed explanation of the Jenkins pipeline code.
- How to install necessary plugins for managing Docker and AWS credentials in Jenkins.
- How to create and manage secrets for AWS and Docker in Jenkins.
Whether you’re a beginner or an experienced DevOps practitioner, this guide will help you set up a reliable, automated CI/CD pipeline.
In a future post, I’ll also explain how to automate the installation and configuration of Jenkins, SonarQube, Docker, Kubernetes, and Nexus using an Ansible Playbook, making it even easier to establish your DevOps infrastructure.
Let’s dive in!
1. The Jenkins Pipeline
Here’s a sample Jenkins pipeline script that automates the following steps:
- Code checkout from GitHub.
- Static code analysis using SonarQube.
- Building and packaging the application using Maven.
- Uploading artifacts to a Nexus repository.
- Building Docker images and pushing them to AWS Elastic Container Registry (ECR).
- Deploying the application to a Kubernetes cluster.
Pipeline Code
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '30', numToKeepStr: '2')
    }
    tools {
        maven 'Maven'
    }
    environment {
        AWS_CREDENTIALS = credentials('aws-key')
    }
    stages {
        stage('Checkout Code') {
            steps {
                checkout scmGit(branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/example-org/example-repo']])
            }
        }
        stage('SonarQube Analysis') {
            steps {
                script {
                    def mvn = tool 'Maven'
                    withSonarQubeEnv(installationName: 'sonarqube-server') {
                        sh "${mvn}/bin/mvn clean verify sonar:sonar -Dsonar.projectKey=example-project -Dsonar.projectName='Example Project'"
                    }
                }
            }
        }
        stage('Build and Package') {
            steps {
                sh 'mvn package'
            }
        }
        stage('Publish to Nexus') {
            steps {
                nexusArtifactUploader(
                    nexusVersion: 'nexus3',
                    protocol: 'http',
                    nexusUrl: 'nexus.example.com:8081',
                    groupId: 'com.example',
                    version: '1.0-SNAPSHOT',
                    repository: 'maven-snapshots',
                    credentialsId: 'nexus-credentials',
                    artifacts: [
                        [artifactId: 'ExampleApp',
                         classifier: '',
                         file: 'target/example-app-1.0.war',
                         type: 'war']
                    ]
                )
            }
        }
        stage('Build Docker Image') {
            steps {
                sh 'docker build -t example-app .'
            }
        }
        stage('Push Docker Image to ECR') {
            steps {
                sh """
                    aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
                    docker tag example-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/example-app:latest
                    docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/example-app:latest
                """
            }
        }
        stage('Deploy to Kubernetes') {
            steps {
                sh """
                    aws eks update-kubeconfig --region us-east-1 --name example-cluster
                    kubectl apply -f deployment.yaml
                """
            }
        }
    }
    post {
        always {
            echo "Pipeline execution completed."
        }
        success {
            echo "Pipeline executed successfully!"
        }
        failure {
            echo "Pipeline execution failed."
        }
    }
}
2. Jenkins Pipeline Code: Line-by-Line Explanation
Pipeline Declaration
pipeline {
- What It Does: Marks the beginning of a Jenkins declarative pipeline.
Agent Declaration
agent any
- What It Does: Specifies that the pipeline can run on any available Jenkins agent. If there are multiple agents in your Jenkins setup, Jenkins will assign one dynamically.
Pipeline Options
options {
    buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '30', numToKeepStr: '2')
}
- What It Does: Configures build retention via buildDiscarder:
- daysToKeepStr: '30': Retains build records for 30 days.
- numToKeepStr: '2': Keeps only the last 2 builds.
- Purpose: Prevents Jenkins from consuming unnecessary disk space.
Tools Configuration
tools {
    maven 'Maven'
}
- What It Does: Selects the Maven installation named 'Maven' and adds it to the PATH for this pipeline.
- Where to Configure: Maven must be pre-configured in Manage Jenkins > Global Tool Configuration, under exactly that name. A quick verification snippet is sketched below.
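If you are unsure whether the tool name in your Jenkinsfile matches what is configured, a minimal sanity-check stage like the following fails fast with a clear message (the stage name is arbitrary):
stage('Verify Maven Tool') {
    steps {
        script {
            // 'tool' returns the home directory of the installation named
            // 'Maven' in Global Tool Configuration; the name must match exactly.
            def mvnHome = tool 'Maven'
            sh "${mvnHome}/bin/mvn -version"
        }
    }
}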
Environment Variables
environment {
    AWS_CREDENTIALS = credentials('aws-key')
}
- What It Does: Binds the AWS credentials stored in Jenkins under the ID aws-key to the AWS_CREDENTIALS environment variable for the duration of the pipeline.
- Purpose: Avoids hardcoding sensitive information in the pipeline script. An alternative, more tightly scoped binding is sketched after this list.
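If you prefer to expose the keys only inside the steps that actually need them, the CloudBees AWS Credentials plugin (see section 3) also provides an aws binding for withCredentials. A minimal sketch, assuming the same aws-key credential ID used above:
stage('Push Docker Image to ECR') {
    steps {
        // Scope the AWS keys to this stage only instead of the whole pipeline.
        withCredentials([aws(credentialsId: 'aws-key',
                             accessKeyVariable: 'AWS_ACCESS_KEY_ID',
                             secretKeyVariable: 'AWS_SECRET_ACCESS_KEY')]) {
            sh 'aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com'
        }
    }
}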
Stages Section
stages {
- What It Does: Defines the sequence of steps or tasks that the pipeline will execute. Each task is encapsulated in a stage.
Stage: Checkout Code
stage('Checkout Code') {
    steps {
        checkout scmGit(branches: [[name: '*/main']], extensions: [], userRemoteConfigs: [[url: 'https://github.com/example-org/example-repo']])
    }
}
- What It Does: Pulls the latest code from the main branch of the specified GitHub repository.
- Key Parameters:
- branches: The branch to pull (*/main).
- url: The repository URL.
- Purpose: Ensures the pipeline works with the most recent codebase. A shorter equivalent using the git step is sketched below.
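For a simple single-branch checkout like this one, the built-in git step is a more compact alternative; a sketch using the same example repository:
stage('Checkout Code') {
    steps {
        // Clones the repository and checks out the main branch.
        git branch: 'main', url: 'https://github.com/example-org/example-repo'
    }
}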
Stage: SonarQube Analysis
stage('SonarQube Analysis') {
    steps {
        script {
            def mvn = tool 'Maven'
            withSonarQubeEnv(installationName: 'sonarqube-server') {
                sh "${mvn}/bin/mvn clean verify sonar:sonar -Dsonar.projectKey=example-project -Dsonar.projectName='Example Project'"
            }
        }
    }
}
- What It Does: Runs static code analysis with SonarQube to check for bugs, vulnerabilities, and code quality issues. The withSonarQubeEnv block injects the server URL and token configured for 'sonarqube-server' in Jenkins.
- Command Breakdown:
- clean: Deletes previous build artifacts.
- verify: Compiles the code and runs all tests.
- sonar:sonar: Sends analysis results to the SonarQube server.
- -Dsonar.projectKey: A unique identifier for the project in SonarQube.
- -Dsonar.projectName: A human-readable name for the project.
- Purpose: Ensures code quality before proceeding to the next stages. To actually block the pipeline on a failed quality gate, see the sketch after this list.
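The stage above submits results but does not fail the build if the quality gate fails. The SonarQube Scanner for Jenkins plugin provides a waitForQualityGate step for that; a minimal sketch, assuming a webhook is configured in SonarQube pointing back to your Jenkins URL:
stage('Quality Gate') {
    steps {
        // Pause until SonarQube reports the quality gate status via webhook,
        // and abort the pipeline if the gate fails.
        timeout(time: 10, unit: 'MINUTES') {
            waitForQualityGate abortPipeline: true
        }
    }
}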
Stage: Build and Package
stage('Build and Package') {
    steps {
        sh 'mvn package'
    }
}
- What It Does: Compiles the Java code and packages it into a .war file (web application archive).
- Command: mvn package: Builds the application and generates the deployable artifact under target/. An optional refinement is sketched below.
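Because the SonarQube stage already ran the full test suite via mvn clean verify, you may want to avoid running the tests a second time and keep the artifact attached to the build record. A small sketch of both optional refinements:
stage('Build and Package') {
    steps {
        // Tests already ran in the SonarQube stage, so skip them here.
        sh 'mvn package -DskipTests'
        // Keep the generated .war with the Jenkins build for traceability.
        archiveArtifacts artifacts: 'target/*.war', fingerprint: true
    }
}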
Stage: Publish to Nexus
stage('Publish to Nexus') {
    steps {
        nexusArtifactUploader(
            nexusVersion: 'nexus3',
            protocol: 'http',
            nexusUrl: 'nexus.example.com:8081',
            groupId: 'com.example',
            version: '1.0-SNAPSHOT',
            repository: 'maven-snapshots',
            credentialsId: 'nexus-credentials',
            artifacts: [
                [artifactId: 'ExampleApp',
                 classifier: '',
                 file: 'target/example-app-1.0.war',
                 type: 'war']
            ]
        )
    }
}
- What It Does: Uploads the .war file to a Nexus repository for centralized storage and versioning.
- Key Parameters:
- nexusUrl: Host and port of the Nexus server (the scheme is supplied separately via the protocol parameter).
- groupId: Logical grouping for artifacts.
- version: Artifact version.
- credentialsId: Jenkins credential ID for authenticating with Nexus. A sketch that avoids hardcoding the version follows this list.
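Hardcoding the version and file name means editing the Jenkinsfile on every release. If the Pipeline Utility Steps plugin is available, a sketch that reads the coordinates from pom.xml instead (same example values as above):
stage('Publish to Nexus') {
    steps {
        script {
            // readMavenPom is provided by the Pipeline Utility Steps plugin.
            def pom = readMavenPom file: 'pom.xml'
            nexusArtifactUploader(
                nexusVersion: 'nexus3',
                protocol: 'http',
                nexusUrl: 'nexus.example.com:8081',
                groupId: pom.groupId,
                version: pom.version,
                repository: 'maven-snapshots',
                credentialsId: 'nexus-credentials',
                artifacts: [
                    [artifactId: pom.artifactId,
                     classifier: '',
                     file: "target/${pom.artifactId}-${pom.version}.war",
                     type: 'war']
                ]
            )
        }
    }
}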
Stage: Build Docker Image
stage('Build Docker Image') {
    steps {
        sh 'docker build -t example-app .'
    }
}
- What It Does: Builds a Docker image named example-app using the Dockerfile in the project root.
- Command: docker build -t example-app .: Creates the container image and tags it example-app (implicitly example-app:latest). A variant with traceable tags is sketched below.
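Relying only on the latest tag makes it hard to tell which build produced which image. A common refinement is to tag each image with the Jenkins build number as well; a minimal sketch:
stage('Build Docker Image') {
    steps {
        // env.BUILD_NUMBER is set by Jenkins for every build, so each image
        // gets an immutable, traceable tag alongside 'latest'.
        sh "docker build -t example-app:${env.BUILD_NUMBER} -t example-app:latest ."
    }
}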
Stage: Push Docker Image to ECR
stage('Push Docker Image to ECR') {
    steps {
        sh """
            aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
            docker tag example-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/example-app:latest
            docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/example-app:latest
        """
    }
}
- What It Does: Logs in to AWS ECR, then tags and pushes the Docker image to the example-app repository.
- Commands:
- aws ecr get-login-password: Retrieves an authentication token for ECR.
- docker login: Authenticates the Docker client with the ECR registry.
- docker tag: Re-tags the local image with the full ECR repository URI.
- docker push: Uploads the image to the specified ECR repository, which must already exist; see the sketch below.
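The push fails if the example-app repository has not yet been created in ECR. A small sketch that creates it on the fly when it is missing, using the same region and repository name as above:
stage('Ensure ECR Repository') {
    steps {
        // describe-repositories exits non-zero if the repository is missing,
        // in which case we create it.
        sh '''
            aws ecr describe-repositories --repository-names example-app --region us-east-1 \
              || aws ecr create-repository --repository-name example-app --region us-east-1
        '''
    }
}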
Stage: Deploy to Kubernetes
stage('Deploy to Kubernetes') {
    steps {
        sh """
            aws eks update-kubeconfig --region us-east-1 --name example-cluster
            kubectl apply -f deployment.yaml
        """
    }
}
- What It Does: Points kubectl at the correct EKS cluster and deploys the application using the Kubernetes manifest (deployment.yaml).
- Commands:
- aws eks update-kubeconfig: Updates the kubeconfig file to target the example-cluster EKS cluster.
- kubectl apply -f deployment.yaml: Creates or updates the resources defined in deployment.yaml. A sketch that also waits for the rollout to finish follows below.
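Note that kubectl apply returns as soon as the manifest is accepted, not when the new pods are actually running. A small sketch that also waits for the rollout, assuming the Deployment in deployment.yaml is named example-app:
stage('Deploy to Kubernetes') {
    steps {
        sh """
            aws eks update-kubeconfig --region us-east-1 --name example-cluster
            kubectl apply -f deployment.yaml
            # Fail the build if the rollout does not complete within 2 minutes.
            kubectl rollout status deployment/example-app --timeout=120s
        """
    }
}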
Post Actions
post {
    always {
        echo "Pipeline execution completed."
    }
    success {
        echo "Pipeline executed successfully!"
    }
    failure {
        echo "Pipeline execution failed."
    }
}
- What It Does: Executes post-build actions:
- always: Runs regardless of the build's outcome.
- success: Runs only when the pipeline succeeds.
- failure: Runs only when the pipeline fails. A notification sketch is shown below.
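Plain echo messages are easy to miss. If the Mailer plugin is configured with an SMTP server, a minimal sketch that emails the team on failure (the recipient address is a placeholder):
post {
    failure {
        // Notify the team when the pipeline fails; adjust the recipient.
        mail to: 'devops@example.com',
             subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "See ${env.BUILD_URL} for details."
    }
}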
3. Installing Plugins for AWS and Docker in Jenkins
Step 1: Installing the AWS Plugin
- Go to Manage Jenkins > Plugin Manager.
- Navigate to the Available Plugins tab.
- Search for the AWS plugin you need; for the aws-key credential type used in this pipeline, that is the CloudBees AWS Credentials plugin.
- Select the plugin and click Install without Restart.
- Once installed, restart Jenkins if required.
Step 2: Installing the Docker Pipeline Plugin
- Go to Manage Jenkins > Plugin Manager.
- Search for Docker Pipeline Plugin under the Available Plugins tab.
- Select the plugin and click Install without Restart, then restart Jenkins if required. A short usage sketch follows.
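Once the Docker Pipeline plugin is installed, the build-and-push stages can also be expressed through its Groovy API instead of raw shell commands. A sketch using the same example image and registry; the ecr:us-east-1:aws-key credential syntax additionally assumes the Amazon ECR plugin is installed:
stage('Build and Push with Docker Pipeline') {
    steps {
        script {
            // docker.build wraps 'docker build' and returns an image object.
            def image = docker.build('example-app')
            // withRegistry handles the ECR login for the enclosed block.
            docker.withRegistry('https://123456789012.dkr.ecr.us-east-1.amazonaws.com',
                                'ecr:us-east-1:aws-key') {
                image.push('latest')
            }
        }
    }
}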
4. Creating Secrets for AWS
For AWS Credentials
- Navigate to Manage Jenkins > Manage Credentials.
- Click Global or the desired scope.
- Add credentials of type AWS Credentials (provided by the CloudBees AWS Credentials plugin installed in step 1).
- Provide your Access Key ID, Secret Access Key, and a unique ID (e.g., aws-key).
Why Use Jenkins?
Jenkins is a powerful, flexible, and open-source CI/CD tool that serves as the backbone for modern DevOps pipelines. Here’s why Jenkins stands out:
- Automation: Jenkins automates repetitive tasks like building, testing, and deploying code.
- Extensibility: With over 1,800 plugins, Jenkins integrates seamlessly with tools like Docker, AWS, Kubernetes, SonarQube, and Nexus.
- Scalability: Jenkins supports distributed builds using multiple agents, making it suitable for projects of any size.
- Community Support: Jenkins has a robust community that ensures consistent updates, security patches, and plugin availability.
By using Jenkins, teams can ensure faster deployments, improved code quality, and enhanced collaboration.
Conclusion
This guide demonstrates how to:
- Automate your CI/CD workflows using a Jenkins pipeline.
- Manage secrets securely in Jenkins for AWS and Docker.
- Install plugins to enhance Jenkins’ capabilities.
- Push your DevOps skills further by integrating tools like Kubernetes, Nexus, and SonarQube.
Stay tuned to learn how to simplify DevOps infrastructure setup with automation tools!