Jenkins is not a modern CI. It's a historical milestone, but it has largely been replaced by other tools, and I don't recommend considering Jenkins for new projects. It's fast to set up but extremely hard to support, and full of old bugs and legacy code. Writing Groovy pipelines is much harder than writing pipelines in GitLab/GitHub/Forgejo/etc. The tens of plugins you have to use even for simple tasks have inconsistent syntax; many of them are buggy, and they often become unsupported or deprecated. All of this consumes a lot of resources: I maintain an instance that eats ~4 GB of RAM while idle.
The feeling of reading through those crazy JVM stack traces with classes named “hudson” from the Jenkins prototype… I shudder! Well done for pushing through it all!
Jenkins was still called Hudson long after it stopped being a prototype. We were already using it productively back then.
I have a soft spot for Jenkins because it was the first integration tool I ever used fresh out of college.
But today I want to stab the server because a job started failing randomly with a permission error when trying to copy a file.
You know what’s a hard pill to swallow for Jenkins haters? It’s likely older than your career, and is going to outlive you too. Like bash, and C, and gnu-utils.
Want to appear godlike in any org? Learn a tiny amount of groovy and read the pipelines pages - https://www.jenkins.io/doc/book/pipeline/
Jenkins is battle tested, Jenkins is likely already in your org, and replacing it with anything else is almost never worth the time from a strategic perspective. But it isn't perfect; testing it in particular is a pain in the ass.
So here’s the best tip: skinny Jenkinsfiles. When you use a sh step, have it run a Makefile target or your build tool's command. Keep them short, single-line things. Don’t rely on massive ENVs. Use Dockerfiles for most stuff. Dynamic container agents in the cloud are actually good. Learn to use archiveArtifacts and integrate with the test report plugins. Learn about parallel pipelines.
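To make that concrete, here's a minimal sketch of what such a skinny Jenkinsfile could look like. The stage names, Makefile targets, and report/artifact paths are all illustrative assumptions, not from any particular project:

```groovy
// Sketch of a "skinny" declarative Jenkinsfile: every sh step is a
// one-liner that delegates to the Makefile. Targets and paths are
// assumptions for illustration.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make build'   // all real build logic lives in the Makefile
            }
        }
        stage('Test') {
            parallel {            // run independent checks concurrently
                stage('Unit') {
                    steps { sh 'make test' }
                }
                stage('Lint') {
                    steps { sh 'make lint' }
                }
            }
        }
    }
    post {
        always {
            junit 'reports/**/*.xml'   // hypothetical test report path
            archiveArtifacts artifacts: 'dist/**', allowEmptyArchive: true
        }
    }
}
```

The point is that the pipeline file stays boring: anything that actually does work can also be run locally with `make build` or `make test`, without Jenkins in the loop.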
Why is that a hard pill to swallow? Longevity doesn’t imply goodness, especially in software. Same for Bash, C and gnu utils.
It also doesn’t mean it’s a good idea to use it. I would strongly recommend… basically anything else over Jenkins.
Maybe I missed your point there.
why do you recommend other tools over things which are tested and will last way longer than whatever the current fad is? The best part of Jenkins is its ubiquity: writing code that will run forever is not to be sniffed at
That’s one of the reasons I like Java. It definitely has problems, but it’s been around so long that there are an insane number of libraries to work with. And you can practically guarantee that your project will run on a given computer with minimal fuss.
Can you give some examples of other tools for the job you’d rather use, which can be self hosted?
Yeah, this is something I stressed at my place. Your Jenkinsfile should set up environment variables and authentication-related stuff, then call out to some build tool to build the project. The Jenkinsfile should also be configured to run the build inside a Docker container. In projects at my place, that's a Dockerfile that sets up and installs all the tools and dependencies required for a valid build environment, checked in alongside the Jenkinsfile.
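The checked-in-Dockerfile setup described above might be wired up roughly like this (the filename and credential id are assumptions for illustration):

```groovy
// Sketch: build inside a container built from a Dockerfile that lives
// next to the Jenkinsfile. 'Dockerfile.build' and the credential id
// 'artifact-repo' are hypothetical names.
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.build'   // image carries all build deps
        }
    }
    environment {
        // exposes ARTIFACT_REPO_USR / ARTIFACT_REPO_PSW inside the build
        ARTIFACT_REPO = credentials('artifact-repo')
    }
    stages {
        stage('Build') {
            steps { sh 'make build' }     // delegate to the build tool
        }
    }
}
```

Because the Dockerfile is versioned with the code, a change to the build environment goes through review just like any other change, and developers can rebuild the same image locally.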
I like Jenkins a lot - but I’d love it if somebody could give it a bit of a facelift. It looks and feels very aged at this point.
I have to say, pretty much every CI/CD tool, build automation or whatever you want to call it, sucks.
Somehow they all manage not to offer boilerplate actions since that would be too restrictive, yet they also stand in your way if you want to have advanced features.
I often end up writing pipeline steps/jobs thinking “how is that not already built-in?”.
And my absolutely biggest pain point: why the fuck is there not a single tool that lets me execute pipelines locally? Why do I have to have 200 commits all saying a variation of “pipeline test”?
I agree so, so much; and I’ve been saying similar things for years.
But I recognize that it’s probably a hard problem. For one thing, auth is almost never going to work the same way in CI as it does locally.
…still, though, I feel like there could be some much nicer tooling here.
It can’t be that hard to create a test harness, so I can plug my own variables into the runner context.
Even just an 80% solution would speed up development tremendously.
Well, for GitHub Actions, there’s this: https://github.com/nektos/act
That actually looks quite promising. You don’t happen to know of a similar project for Gitlab runners?
I do not. I’m sorry.
The sentiment of the first half of your comment is the cause of the problem you describe in the second half. Why /should/ the CI tool have any “steps” built in? Use a task runner, or script in your repo for any task you expect CI to do. Configure CI to run the same command you would run locally.
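A sketch of that approach, with a task-runner entry point shared by CI and local runs (target names and commands are hypothetical):

```makefile
# Hypothetical Makefile: CI is configured to run `make ci`, and
# developers run exactly the same command locally.
.PHONY: ci lint test build

ci: lint test build

lint:
	flake8 src/

test:
	pytest tests/

build:
	python -m build
```

With this layout, the CI config shrinks to "check out, run `make ci`", so there is far less pipeline logic that can only be exercised by pushing commits.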
It’s not the same thing. Sometimes there are rules/conditions in the pipeline you need to test. Sometimes there are child pipelines. There really should be a way to simulate a pipeline run locally.
I totally agree, with some caveats:
- It can make the UI much nicer (though I agree there are probably better ways to do it, e.g. native support for letting the program output determine the UI).
- Sometimes you need to run different things on different machines/OSes, and it’s easier if you have separate steps there.
haven’t tried it myself, but I’ve heard https://earthly.dev/ is supposed to address the execute locally problem
In the last few releases they made a lot of changes to the UI to give it a much more modern look.
I’ll have to check that out!
What about CruiseControl? I used that CI in 2009, maybe earlier. Jenkins was a milestone, fair point, but it wasn't the first CI.