For several months now, I’ve stopped creating Freestyle Jenkins jobs in favor of using Jenkins Pipeline to manage all of my CI/CD processes, for a number of reasons.
One of the first things that I needed to figure out was how to securely pull the credentials stored in my Jenkins server into my Pipeline jobs. Specifically, how could I get things such as Ansible Vault passwords, AWS credentials and other sensitive files such as private keys injected into my workspace?
Given that it took a bit of tinkering to figure out how to pull it off, I thought I’d share some snippets to get some of my fellow engineers in the community bootstrapped.
First, for those who are new to Jenkins Pipeline, I'd like to start by explaining some of the benefits of using Pipeline jobs; then we'll dig into how we can leverage our credentials.
Top-level items covered:

- Benefits of Jenkins Pipeline
- Creating a basic Pipeline job driven by a Jenkinsfile
- Accessing AWS credentials from a Pipeline
- Retrieving secret text, such as an Ansible Vault password, and cleaning up afterwards

Here are some of the key things I appreciate from Pipeline:

- Your build, test and deploy logic lives in a Jenkinsfile alongside your code, so it's version-controlled and reviewable
- A single Pipeline job can replace a chain of Freestyle jobs
- The Stage View gives you an at-a-glance breakdown of which stage a build succeeded or failed in
- The post section makes it easy to enforce cleanup of sensitive files on every run
So first, you'll want to click the 'New Item' link in the top-left corner of your Jenkins console.
Provide a name for your Pipeline job, select the 'Pipeline' job type and click 'OK'.
Once you've created your job, you can configure the repository that you'd like Jenkins to pull. This repository should contain your Jenkinsfile, which is where your Pipeline's operations are specified.
If your Jenkinsfile is located in the root of your repository, the default 'Script Path' of Jenkinsfile in the job configuration will pick it up as-is.
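Before adding real build logic, you can sanity-check the SCM wiring with a minimal declarative Jenkinsfile; this is a bare sketch, not tied to any particular project:

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout test') {
            steps {
                // If this prints, Jenkins found and ran the Jenkinsfile from your repository
                echo 'Jenkinsfile was found and executed from SCM'
            }
        }
    }
}
```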
Each of the columns in the Stage View is a 'stage' in your Pipeline.
Here’s a basic example of a Pipeline Jenkinsfile that builds a Go binary, archives it and cleans up after it’s done.
```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Build code here'
                sh "go build -o application"
                sh "zip application.zip application"
                // Archive the build artifact for downloading via Jenkins project screen
                archiveArtifacts artifacts: 'application.zip', fingerprint: true
            }
        }
        stage('Deploy') {
            steps {
                echo 'Handling deploy logic here.'
            }
        }
    }
    post {
        // This block always runs
        always {
            // You can clean up key files, etc. here.
            sh 'rm ./key.pem'
        }
    }
}
```
So now that we've detailed some of the perks of using Pipeline and how to create a basic project that leverages a Jenkinsfile, let's dig into credentials.
Prerequisites:

- Your secrets are already stored in Jenkins (Manage Jenkins → Credentials)
- The Credentials Binding plugin is installed (it provides the withCredentials step)
- For the AWS example: the CloudBees AWS Credentials plugin, plus the AWS CLI on the build agent
Here’s an example Pipeline stage that demonstrates how one can access AWS credentials stored as Jenkins credentials in our build.
Note: the double quotes around the shell command cause Groovy to interpolate the credential variables into the command before it runs. Single quotes also work, since withCredentials exports the variables into the environment and the shell can expand them itself, which keeps the secrets out of the Groovy string.
```groovy
stage('Build') {
    steps {
        // Example AWS credentials
        withCredentials([[
            $class: 'AmazonWebServicesCredentialsBinding',
            accessKeyVariable: 'AWS_ACCESS_KEY_ID',
            credentialsId: 'aws-dev-credentials', // ID of credentials in Jenkins
            secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
        ]]) {
            echo "Listing contents of an S3 bucket."
            sh "AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID} \
                AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY} \
                AWS_REGION=us-east-1 \
                aws s3 ls clouductivity-demo"
        }
    }
}
```
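Since the bound credentials are also exported as environment variables for the duration of the block, a single-quoted variant is worth knowing: the shell expands the variables itself, so the secret values are never interpolated into the Groovy script. A sketch, reusing the same 'aws-dev-credentials' ID from above:

```groovy
withCredentials([[
    $class: 'AmazonWebServicesCredentialsBinding',
    credentialsId: 'aws-dev-credentials',
    accessKeyVariable: 'AWS_ACCESS_KEY_ID',
    secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
]]) {
    // Single quotes: the AWS CLI reads the credentials from the environment,
    // so they never pass through Groovy string interpolation.
    sh 'AWS_REGION=us-east-1 aws s3 ls clouductivity-demo'
}
```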
Here's how you might go about retrieving secret text, such as your Ansible Vault password, in a Pipeline job. Don't forget to add the post { always { } } block to ensure sensitive information is cleaned up afterwards.
Replace AnsibleVault with the credential ID you find in the credentials management screen within Jenkins.
```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Example secure string retrieval for Ansible vault password
                withCredentials([string(credentialsId: 'AnsibleVault', variable: 'vaultPass')]) {
                    // Create virtualenv
                    sh "virtualenv ."
                    // Install Ansible module
                    sh "./bin/pip install ansible"
                    // Create our vaultpass file with restrictive permissions
                    sh "touch ./.vaultpass"
                    sh "chown jenkins:jenkins ./.vaultpass"
                    sh "chmod 640 ./.vaultpass"
                    sh "echo '${vaultPass}' > ./.vaultpass"
                    sh "bin/ansible-vault decrypt secret-file.pem --vault-password-file ./.vaultpass"
                }
            }
        }
    }
    post {
        always {
            // Clean up the decrypted secret and the vault password file
            sh 'rm -f ./secret-file.pem ./.vaultpass'
        }
    }
}
```
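As an alternative to echoing the password through a shell command, where it can briefly surface in process listings, Pipeline's built-in writeFile step can create the vault password file directly. A sketch, reusing the same 'AnsibleVault' credential ID:

```groovy
withCredentials([string(credentialsId: 'AnsibleVault', variable: 'vaultPass')]) {
    // writeFile runs inside Jenkins itself, so the secret never
    // appears on a shell command line.
    writeFile file: '.vaultpass', text: vaultPass
    sh 'chmod 600 ./.vaultpass'
    sh 'bin/ansible-vault decrypt secret-file.pem --vault-password-file ./.vaultpass'
}
```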
I hope you’ve found this article informative, and I’d like to invite you to check out our product, Clouductivity Navigator – a Chrome Extension to improve your productivity in AWS. It helps you get to the documentation and AWS service pages you need, without clicking through the console. Download your free trial today!