With Conveyor now able to clone a git repo and run Maven builds, it’s time to try it out on a sample application and build it on Jenkins.
<….record scratch…>
That’s right, after all my whining, we’re going to build on Jenkins. The build itself is now handled by Conveyor and can easily be run locally. In practice you still want a build server like Jenkins to provide a centralised place to monitor repositories and trigger builds when commits are pushed.
The major difference now is that the Jenkins server can be mostly barebones; no need to install/update/maintain/configure any plugins other than Git. C’mon, let’s try it out! It’ll only take about ten minutes.
I’m going to use Docker to run the Jenkins LTS image.
docker run -p 8080:8080 jenkins/jenkins:lts-jdk11
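If you’d like the Jenkins config to survive container restarts, a small tweak is to mount a volume over the image’s home directory (the jenkins_home volume name here is just my choice, call it whatever you like):

docker run -p 8080:8080 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts-jdk11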
Keep an eye on the console output for your admin password; you’ll need it in a moment. It’ll look something like the snippet below, but with a different password, obviously.
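From memory, the relevant chunk of the log reads roughly like this (the placeholder stands in for your generated password):

Jenkins initial setup is required. An admin user has been created and a password generated.
Please use the following password to proceed to installation:

<your-generated-password>

This may also be found at: /var/jenkins_home/secrets/initialAdminPassword

If it scrolls past before you catch it, you can pull it straight out of the running container (swap in your container id from docker ps):

docker exec <container-id> cat /var/jenkins_home/secrets/initialAdminPassword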
After a few minutes the container and Jenkins will be running, so head to http://localhost:8080 and log in with your shiny new admin password. You’ll land in the Setup Wizard, where you should click “Select plugins to install”.
Next, it’ll display a list of plugins with all the recommended ones ticked. We don’t need no stinkin’ recommended plugins, so firmly press the “None” option at the top to deselect everything. Then remember you need just one plugin, Git, to actually check your application out, so grudgingly give that a tick and click “Install” down the bottom.
Once Git’s installed, Jenkins asks you to do some additional setup, like creating an admin user, but you can confidently ignore all that and click “Skip”, “Not now” or “Yeah fine, whatever” as needed.
Finally you will be presented with a pristine Dashboard waiting for you to create some Jobs, because the government, despite their boasts, are simply unable to. It’s up to you, brave programmer.
Click “Create a job”, fill in a name, select the only option, “Freestyle project”, and click “OK”.
On the configure screen, scroll down to “Source Code Management”, select the “Git” option and enter the URL of our sample application’s repo: https://github.com/Davetron/sample_multi_module.git
Don’t forget to set the Branch Specifier to blank; an empty specifier tells the Git plugin to build any branch.
Next, head further down to “Build”, click “Add build step”, select “Execute shell” and enter the following line:
./pipeline.sh
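One small aside: if the build step complains about permissions because the executable bit on pipeline.sh didn’t survive the trip through git on your setup, you can sidestep it by invoking the script through sh instead:

sh ./pipeline.sh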
Save the config, click “Build Now”, then sit back, take a smug sip of your beverage of choice and enjoy your first encapsulated build.
I’ll go into more detail on how it works in the next episode. If you can’t bear to wait until then, take a look at the pipeline script in the sample project and follow your nose; it’s relatively straightforward.
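In fact, the whole Jenkins job above boils down to roughly this on any machine with git and a JDK installed:

git clone https://github.com/Davetron/sample_multi_module.git
cd sample_multi_module
./pipeline.sh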
One thing to consider, now that we can run a build anywhere with a JDK and git installed: do we even need Jenkins? The way I see it, Jenkins does two main things: it runs builds and it provides a central place for people to check progress and results. The thing is, I’m only interested in MY build and results. Just send me an email with my results when the build completes or fails.
Maybe it’s throwing the baby out with the bathwater, but I think it would be cool to have something serverless like AWS Lambda that triggers when you push to your repo, spins up an instance, runs your build, then disappears. That would take care of actually running your builds and presumably reduce your running costs. You can do something similar with Jenkins using a Kubernetes cluster for slaves, but I haven’t played around with it yet.
Given the size of logs and the need to share build results, it still makes sense to centralise some of the pipeline, but perhaps you can just publish everything to ELK and do whatever you want with the data: dashboards, graphs, log exploration, go wild. That sounds pretty flexible and powerful to me.
For previous entries in this series see:
1 – Breaking Down Barriers
2 – Pipelines as code…not text
3 – Pipelines as code (part 2)…the API
4 – Conveyor (part 3): Encapsulated Builds