Jenkins

Using AWS SDK Docker image inside a Jenkins pipeline

Jenkins is a leading automation server widely used for building, testing, and deploying software. It lets developers streamline tasks, automate repetitive processes, and improve software quality. One of the main benefits of Jenkins is that it integrates with numerous tools, including AWS SDK Docker images. This blog post examines how to use AWS SDK Docker images in a Jenkins pipeline.

What is Docker?

Before we dive into how to use AWS SDK Docker images in a Jenkins pipeline, let’s understand what Docker is and how it works. Docker is an open-source technology that allows developers to build, ship, and run applications in containers. A container is a lightweight and standalone executable package that includes everything the application needs to run, including code, libraries, dependencies, and configurations. Docker provides an isolated environment for running applications, making them more secure and portable.
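
For example, with nothing but Docker installed on the host, the command below (a minimal illustration) pulls the official amazon/aws-cli image from Docker Hub and prints the bundled CLI version; nothing else has to be installed on the machine.

# Runs the AWS CLI from a container; --rm removes the container when the command exits.
# The image is pulled automatically on first use.
docker run --rm amazon/aws-cli --version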

AWS SDK Docker Images

AWS SDK Docker images are Docker images that come pre-installed with the AWS SDK. The AWS SDK allows developers to interact with AWS resources programmatically, making it easier to automate tasks, build applications, and integrate with other tools. These images can be used to run AWS CLI commands or execute AWS SDK calls from within Jenkins pipeline steps (or anywhere Docker is installed).
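
As a quick illustration outside of Jenkins, the same image can run any AWS CLI command straight from a shell. This is a minimal sketch: the bucket name and region are placeholders, and it assumes AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are already set in the calling shell.

# Lists the contents of a bucket using the containerized AWS CLI.
# The -e flags forward the credentials from the host environment into the container;
# the bucket name and region are placeholders.
docker run --rm \
  -e AWS_ACCESS_KEY_ID \
  -e AWS_SECRET_ACCESS_KEY \
  -e AWS_DEFAULT_REGION=us-east-1 \
  amazon/aws-cli s3 ls s3://my-s3-url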

Benefits of using the AWS SDK Docker image

The main advantage of using the AWS SDK Docker image is that it is easy to update versions and to use the SDK without installing it on every build machine, which is especially useful if the Jenkins setup uses on-demand agents.
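
For example, upgrading or rolling back the CLI is just a matter of changing the image tag, and pinning a tag keeps every agent on the same version. The tag below is purely illustrative; check Docker Hub for the tags actually published for amazon/aws-cli.

# Unpinned: uses whatever "latest" happens to be cached or pulled on the agent.
docker run --rm amazon/aws-cli --version

# Pinned: every agent runs exactly this CLI version (illustrative tag).
docker run --rm amazon/aws-cli:2.15.0 --version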

Below is an example of uploading a file to an S3 bucket. Before you can run it, you will need a Jenkins server, Docker installed on the agent, an S3 bucket, an AWS access key, and that access key added to Jenkins as a username/password credential (access key ID as the username, secret access key as the password).

pipeline {

    // NOTE: may need to target an agent with Docker installed,
    // e.g. agent { label 'docker' }
    agent any

    stages {

        stage("Upload to S3") {

            steps {

                // Creates a test file to upload.
                sh "echo test >> test.txt"

                // NOTE: requires a username/password credential called "aws-s3"
                // (access key ID as the username, secret access key as the password).
                withCredentials([usernamePassword(credentialsId: 'aws-s3', passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]) {

                    sh """ docker run --rm \\
                    -v \$(pwd):/aws \\
                    -e AWS_ACCESS_KEY_ID=$USERNAME \\
                    -e AWS_SECRET_ACCESS_KEY=$PASSWORD \\
                    -t amazon/aws-cli s3 cp test.txt s3://my-s3-url/test.txt
                    """

                }

            }

        }

    }

}

What’s Happening


sh """ docker run --rm \\
    -v \$(pwd):/aws \\ // Mounting the current directory to the aws directory.
    -e AWS_ACCESS_KEY_ID=$USERNAME  \\ // Passes in the token name of the access key
    -e AWS_SECRET_ACCESS_KEY=$PASSWORD \\ // Passes in the access key secret
    -t amazon/aws-cli s3 cp test.txt s3://my-s3-url/test.txt // Pull the image and calls the cp (copy) command to the "my-s3-url" bucket.
"""

By using AWS SDK Docker images in the Jenkins pipeline, developers can simplify their build and deployment process. They no longer need to install and configure the AWS SDK on the Jenkins server or on the build agents. Instead, they can pull the AWS SDK Docker image from Docker Hub and run it in a container on demand. This makes it easier to test and deploy code changes across different environments.

In conclusion, AWS SDK Docker images can be a valuable addition to a Jenkins pipeline. They help to automate AWS tasks, build applications, and integrate with other tools in a secure and portable way. By following the simple steps outlined above, developers can start using AWS SDK Docker images in their Jenkins pipelines and streamline their build and deployment process.
