Development and operations are no longer exclusively separate roles within the Information Technology space. Today they are merging into one cohesive practice that is reshaping the way IT teams operate.
Most would define DevOps as a movement, practice, or culture that aims to bring IT professionals and software developers together so that they can streamline infrastructure changes and software delivery more efficiently. Essentially, it is rooted in the idea that building, testing and releasing software runs more smoothly and more automatically when the right team of professionals works together.
Today we will walk through the setup and deployment of your continuous integration (CI) and delivery pipeline, using Docker, GitHub and Jenkins to configure, test and deploy a simple LAMP (Linux, Apache, MySQL, PHP) stack.
Three Part Process
The CI workflow described in this article is composed of three steps. The developer first pushes a commit to GitHub, which in turn uses a webhook to notify Jenkins of the update. Jenkins then pulls the GitHub repository, builds the Docker container that contains our stack, and runs the tests. If the tests pass, Jenkins pushes the code to the master branch.

Now let's explain what each of these tools is, and how they will give us a platform we can work from to deliver highly scalable applications on the internet.

What is Docker?
Innovation in today's organizations comes from software: every company is becoming a software company and needs to empower its developers to deliver new customer experiences quickly. Innovation can come in many different application formats, from traditional, monolithic applications to cloud-native and 12-factor applications.
These applications must also be able to run across hybrid and multi-cloud environments and out to the edge. Docker enables organizations to achieve these goals by providing an end-to-end (desktop to data center) experience for developing and scaling distributed applications while leveraging the processes, people and tools they already have in place. In addition to building and running applications, the Docker platform provides end-to-end security at scale, with automated governance and compliance throughout the application life-cycle, without slowing down innovation.
The Docker platform is built on industry-standard, open source technologies including Docker and Kubernetes. Used by millions of developers and IT professionals worldwide, Docker includes the world’s leading container content library and ecosystem with more than 100,000 container images from major software vendors, open-source projects and the community.

What is Jenkins?
You know how it goes: individuals on the team tend to work independently. Coding solo, engineers regularly create large segments of code outside of version control. Once a developer is “done,” they add their work to the codebase. Then another team manually runs tests to verify the build.
When multiple developers separately commit large changes to version control, they create complex bugs, multiply time-intensive fixes, and increase the time it takes to do more manual testing. Everything slows down.
Jenkins is a self-contained, open source automation server which can be used to automate all sorts of tasks related to building, testing, and delivering or deploying software. Jenkins can be installed through native system packages, Docker, or run standalone on any machine with a Java Runtime Environment (JRE) installed.
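For example, assuming Docker is already installed, a throwaway Jenkins instance could be started with a single command; the image tag below simply matches the one used later in this article:
docker run -d -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins:2.60.3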

What is GitHub?
To understand GitHub, you must first have an understanding of Git. Git is an open-source version control system that was started by Linus Torvalds, the same person who created Linux. At a high level, GitHub is a website and cloud-based service that helps developers store and manage their code, as well as track and control changes to their code.
When developers create something (an app, for example), they make constant changes to the code, releasing new versions up to and after the first official (non-beta) release. Version control systems keep these revisions straight, storing the modifications in a central repository. This allows developers to easily collaborate, as they can download a new version of the software, make changes, and upload the newest revision. Every developer can see these new changes, download them, and contribute.
We've established that Git is a version control system, similar to but in many ways better than the many alternatives available. So, what makes GitHub so special? Git is a command-line tool, but the center around which all things involving Git revolve is the hub, GitHub, where developers store their projects and network with like-minded people.
Ready? …Set …Go!
Now that we have a fundamental understanding of the systems we will be using, let's crack on and get them installed and configured on your machine. I am using Microsoft Windows 10 for this setup. Please note that, due to the features required for this configuration, you will need a Professional edition license if you are using Microsoft Windows.
Our goal is to ensure our pipeline works well after each code push. The processes we want to auto-manage are as follows (a sketch of a matching pipeline skeleton follows the list):
- Code checkout
- Run tests
- Compile the code
- Run SonarQube analysis on the code
- Push the image to Docker Hub
- Pull and run the image
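As a rough, illustrative sketch (the stage names below are my own; the real Jenkinsfile is reviewed later in this article), a scripted pipeline automating these steps could be laid out like this:
node {
    stage('Code checkout'){ /* check out the GitHub repository */ }
    stage('Run tests'){ /* e.g. mvn test */ }
    stage('Compile the code'){ /* e.g. mvn package */ }
    stage('Run SonarQube analysis'){ /* e.g. mvn sonar:sonar */ }
    stage('Push the image to Docker Hub'){ /* docker build, login and push */ }
    stage('Pull and run the image'){ /* docker pull and docker run */ }
}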
Spinning up Services
Since one of the goals is to obtain the SonarQube report of our project, we need to be able to access SonarQube from the Jenkins service. docker-compose is a good choice for running services that work together.
docker-compose uses YAML, which targets many of the same use cases as Extensible Markup Language (XML) but with a minimal syntax that intentionally differs from SGML. YAML uses Python-style indentation to indicate nesting, and it also offers a more compact flow style that uses [] for lists and {} for maps, making it a superset of JSON.
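As a small, made-up illustration (the keys below are not part of our setup), the same data can be written in either style:
# Block style: nesting expressed with indentation
jenkins:
  ports:
    - 8080
    - 50000
# Flow style: the same data in the compact, JSON-like form
jenkins_flow: { ports: [8080, 50000] }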
We configure our application services in a YAML file, as follows:
version: '3.2' # docker-compose.yml
services:
  sonarqube:
    build:
      context: sonarqube/
    ports:
      - 9000:9000
      - 9092:9092
    container_name: sonarqube
  jenkins:
    build:
      context: jenkins/
    privileged: true
    user: root
    ports:
      - 8080:8080
      - 50000:50000
    container_name: jenkins
    volumes:
      # NOTE: The 'tmp' directory is designed to be wiped on system reboot.
      - /tmp/jenkins:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock
    depends_on:
      - sonarqube
The paths to the containers' Dockerfiles are specified in the context attribute of the docker-compose file. The content of these files is as follows:
# sonarqube/Dockerfile
FROM sonarqube:6.7-alpine

# jenkins/Dockerfile
FROM jenkins:2.60.3
If we run the following command in the same directory as the docker-compose.yml file, the SonarQube and Jenkins containers will be up and running.
docker-compose -f docker-compose.yml up --build
docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
87105432d655 pipeline_jenkins "/bin/tini -- /usr..." About a minute ago Up About a minute 0.0.0.0:8080->8080/tcp, 0.0.0.0:50000->50000/tcp jenkins
f5bed5ba3266 pipeline_sonarqube "./bin/run.sh" About a minute ago Up About a minute 0.0.0.0:9000->9000/tcp, 0.0.0.0:9092->9092/tcp sonarqube
GitHub configuration
To trigger the pipeline, we'll define a service on GitHub that calls the Jenkins GitHub webhook. To do this, go to Settings -> Integrations & services. The Jenkins Github plugin should be shown in the list of available services. We add a new service by entering the URL of the dockerized Jenkins container followed by the /github-webhook/ path, for example http://<your-jenkins-host>:8080/github-webhook/.
The next step is to create an SSH key for the Jenkins user and define its public key as a Deploy key on our GitHub repository. One way of generating the key pair is sketched below; once the public key has been added as a deploy key, the connection test that follows should return a success.
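The sketch below assumes we generate the key inside the Jenkins container; the key path and comment are illustrative:
docker exec -it jenkins sh
# inside the container:
mkdir -p /var/jenkins_home/.ssh
ssh-keygen -t rsa -b 4096 -N "" -C "jenkins" -f /var/jenkins_home/.ssh/id_rsa
cat /var/jenkins_home/.ssh/id_rsa.pub   # add this public key as the Deploy key on GitHub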
ssh git@github.com
PTY allocation request failed on channel 0
Hi <your github username>/<repository name>! You've successfully authenticated, but GitHub does not provide shell access.
Connection to github.com closed.
Jenkins Configuration
We have configured Jenkins in the docker-compose file to run on port 8080, so if we visit http://localhost:8080 we will be greeted with the initial setup screen.
We need the admin password to proceed with the installation. It is stored in the /var/jenkins_home/secrets/initialAdminPassword file, and it is also written to the console output when Jenkins starts:
jenkins | *************************************************************
jenkins |
jenkins | Jenkins initial setup is required. An admin user has been created and a password generated.
jenkins | Please use the following password to proceed to installation:
jenkins |
jenkins | 45638c79cecd4f43962da2933980197e
jenkins |
jenkins | This may also be found at: /var/jenkins_home/secrets/initialAdminPassword
jenkins |
jenkins | *************************************************************
To access the password from inside the container:
docker exec -it jenkins sh
/ $ cat /var/jenkins_home/secrets/initialAdminPassword
After entering the password, we install the recommended plugins and define an admin user.
After clicking the Save and Finish and Start using Jenkins buttons, we should see the Jenkins homepage. One of the goals listed above is to be able to build an image inside the dockerized Jenkins. Take a look at the volume definitions of the Jenkins service in the compose file.
- /var/run/docker.sock:/var/run/docker.sock
The purpose is to let the Docker Daemon and the Docker Client (which we will install on Jenkins) communicate over this socket. Besides the Docker client, we also need Maven to compile the application. To install these tools, we perform the Maven and Docker Client configurations under the Manage Jenkins -> Global Tool Configuration menu.
We add the Maven and Docker installers and check the Install automatically checkbox. These tools are installed by Jenkins the first time our script runs. We name the tools myMaven and myDocker; we will refer to them by these names in the script file.
Since we will perform operations such as checking out the codebase and pushing an image to Docker Hub, we need to define the Docker Hub credentials. Keep in mind that if we are using a private repository, we must also define GitHub credentials. These definitions are made under the Jenkins Home Page -> Credentials -> Global credentials (unrestricted) -> Add Credentials menu.
We use the value entered in the ID field for the Docker login in the script file. Now we define the pipeline under the Jenkins Home Page -> New Item menu.
In this step, we select the GitHub hook trigger for GITScm polling option so that the pipeline runs automatically when the GitHub hook is called.
Also, in the Pipeline section, we select Pipeline script from SCM as the Definition, define the GitHub repository and the branch name, and specify the script location.
After that, when a push is made to the remote repository, or when you manually trigger the pipeline with the Build Now option, the steps described in the Jenkinsfile will be executed.
Reviewing the important points of the Jenkinsfile
stage('Initialize'){
    def dockerHome = tool 'myDocker'
    def mavenHome = tool 'myMaven'
    env.PATH = "${dockerHome}/bin:${mavenHome}/bin:${env.PATH}"
}
The Maven and Docker client tools we defined in Jenkins under the Global Tool Configuration menu are added to the PATH environment variable so that they can be used with the sh command.
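As an illustration (this stage is not part of the article's Jenkinsfile), a compile stage could then call the tools through sh like this:
stage('Compile'){
    // mvn is resolved through the PATH set up in the Initialize stage
    sh "mvn clean install"
}
Next, let's look at the stage that pushes the built image to Docker Hub: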
stage('Push to Docker Registry'){
    withCredentials([usernamePassword(credentialsId: 'dockerHubAccount', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
        pushToImage(CONTAINER_NAME, CONTAINER_TAG, USERNAME, PASSWORD)
    }
}
withCredentials is provided by the Jenkins Credentials Binding Plugin and binds credentials to variables. We pass the dockerHubAccount value with the credentialsId parameter. Remember that dockerHubAccount is the Docker Hub credentials ID we defined under the Jenkins Home Page -> Credentials -> Global credentials (unrestricted) -> Add Credentials menu. In this way, we can access the username and password of the account for the login.
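The pushToImage function itself is defined elsewhere in the Jenkinsfile and is not shown in this article; a minimal sketch of what such a helper might do (log in, tag and push the image via the docker client) could look like this:
def pushToImage(containerName, tag, dockerUser, dockerPassword){
    // Sketch only: log in to Docker Hub, tag the local image and push it
    sh "docker login -u ${dockerUser} -p ${dockerPassword}"
    sh "docker tag ${containerName}:${tag} ${dockerUser}/${containerName}:${tag}"
    sh "docker push ${dockerUser}/${containerName}:${tag}"
}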
SonarQube configuration
For SonarQube, we have made the following definitions in the pom.xml file of the project:
<sonar.host.url>http://sonarqube:9000</sonar.host.url>
...
<dependencies>
  ...
  <dependency>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>sonar-maven-plugin</artifactId>
    <version>2.7.1</version>
    <type>maven-plugin</type>
  </dependency>
  ...
</dependencies>
In the docker-compose file we named the SonarQube service sonarqube; this is why the sonar URL in the pom.xml file is defined as http://sonarqube:9000.
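Because the sonar-maven-plugin and sonar.host.url are declared in the pom.xml, the analysis step in the pipeline only needs to invoke the corresponding Maven goal. A minimal sketch of such a stage (the stage name is illustrative) would be:
stage('Run SonarQube analysis'){
    // The scanner reads sonar.host.url from pom.xml and reports to the sonarqube container
    sh "mvn sonar:sonar"
}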
Conclusion
In this article, I have tried to share an end-to-end tutorial on using Docker, GitHub and Jenkins together as a pipeline for scalable application development. I hope it helps with your own projects.