One of the first steps I took in my team's PowerShell journey, and the topic of this post, was addressing the need for a Continuous Integration and Delivery (CI/CD) pipeline for our PowerShell modules. Since the modules are built, published, and consumed by our internal company infrastructure, there are some nuances I had to account for when designing the pipeline. As PowerShell develops into a key tool in our automations and processes, having a well-defined, rock-solid CI/CD system becomes a must-have.
Currently, the PowerShell pipeline has the following goals:
- Install the module dependencies
- Validate PowerShell code syntax
- Run a static code analyzer
- Run tests in order of complexity: unit tests, then component tests, etc.
- Package the module to be published
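To make the syntax-validation goal concrete, here is a minimal sketch that leans on the PowerShell parser itself; the `src` folder name is an assumption, so adjust it to your module layout:

```powershell
# Validate the syntax of every PowerShell file using the language parser.
# The 'src' folder is only an example; point this at your module sources.
$files = Get-ChildItem -Path .\src -Include *.ps1, *.psm1 -Recurse

foreach ($file in $files) {
    $tokens = $null
    $errors = $null
    [System.Management.Automation.Language.Parser]::ParseFile(
        $file.FullName, [ref]$tokens, [ref]$errors) | Out-Null

    # Any parse error fails the build before we even get to the analyzer or tests.
    if ($errors.Count -gt 0) {
        throw "Syntax errors found in $($file.FullName): $($errors[0].Message)"
    }
}
```

Failing fast here keeps the later, more expensive stages (static analysis, tests) from running against code that does not even parse.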
Before we begin
Some of the concepts and techniques that we are going to cover in this post are not at a beginner's level; it is intended for readers already familiar with PowerShell.
The implementation that we discuss below builds mainly on existing work and knowledge shared by Kevin Marquette, Warren Frame, and Mark Kraus. If you need to cover some ground on PowerShell modules and pipelines, you should first check out the following posts:
- A PowerShell Module Release Pipeline
- Powershell: Let's build the CI/CD pipeline for a new module
- Write The FAQ ‘n Manual
You're back! Awesome, let's first cover some of the dependencies that we use to build the pipeline with.
Modules in use
To run the pipeline, we use several modules from the community, as well as some developed internally, that help us achieve the goals outlined for our system earlier in this post.
As I mentioned in a previous post about PowerShell, these modules are among the building blocks for any serious endeavor in PowerShell, well, at least for me :).
- InvokeBuild: Build automation in PowerShell
- PSDepend: Dependencies management
- PSScriptAnalyzer: Syntax checker
- Pester: Tests for PowerShell
- BuildHelpers: Utility module for CI/CD scenarios
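As an illustration of how the dependency goal is met, modules like these can be declared in a `requirements.psd1` file that PSDepend resolves; the versions below are placeholders, not the ones we actually pin:

```powershell
# requirements.psd1 - module dependencies resolved by PSDepend.
# Version values here are illustrative placeholders.
@{
    PSDependOptions  = @{
        Target = 'CurrentUser'   # install scope for the resolved modules
    }
    InvokeBuild      = 'latest'
    PSScriptAnalyzer = 'latest'
    Pester           = '4.10.1'
    BuildHelpers     = 'latest'
}
```

A single `Invoke-PSDepend -Path .\requirements.psd1 -Install -Force` then satisfies the first goal of the pipeline, on a contributor's box and on the build server alike.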
Besides these excellent PowerShell modules, we use an in-house module to generate a .nupkg file for the current module in the pipeline, so it can be published to Artifactory, which acts as our internal PowerShell gallery.
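To give an idea of the publishing side, registering Artifactory as a NuGet-style PowerShell repository and pushing a packaged module could look like the following; the repository name, URL, module path, and API-key handling are all hypothetical:

```powershell
# One-time setup: register Artifactory as a PowerShell repository.
# The URL is a made-up example of an internal Artifactory NuGet endpoint.
Register-PSRepository -Name 'Artifactory' `
    -SourceLocation 'https://artifactory.example.com/api/nuget/psgallery' `
    -PublishLocation 'https://artifactory.example.com/api/nuget/psgallery' `
    -InstallationPolicy Trusted

# Publish the built module; Artifactory honors the NuGet API-key convention.
Publish-Module -Path .\output\MyModule `
    -Repository 'Artifactory' `
    -NuGetApiKey $env:ARTIFACTORY_API_KEY
```

Once the repository is registered, consumers can likewise `Install-Module -Repository Artifactory` to pull internal modules.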
The pipeline is composed of two main components: a PowerShell implementation that handles everything from validating the code and running the tests to packaging the module; and the Jenkins component, which oversees the whole CI/CD flow, starting from Git, passing through the PowerShell pipeline, and ending with the deployment to Artifactory.
This separation was drawn by our infrastructure itself, which only allows publishing to Artifactory repositories from our internal Jenkins servers. Enforcing this separation lets contributors run the majority of the pipeline on their local boxes, thus improving the feedback cycle, quality, and productivity.
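The locally runnable part of that separation can be pictured as an InvokeBuild task file; the task names and paths below are illustrative, not our actual build script:

```powershell
# build.ps1 - illustrative InvokeBuild task file for the local pipeline.

task Analyze {
    # Fail the build on any static analysis finding.
    $results = Invoke-ScriptAnalyzer -Path .\src -Recurse
    if ($results) {
        $results | Format-Table
        throw 'PSScriptAnalyzer found issues.'
    }
}

task Test Analyze, {
    # Run tests in order of complexity: unit first, then component.
    Invoke-Pester -Path .\tests\unit -EnableExit
    Invoke-Pester -Path .\tests\component -EnableExit
}

# Default task chain: everything a contributor can run locally.
task . Test
```

A contributor then simply runs `Invoke-Build`, while Jenkins invokes the same entry point before its Artifactory deployment stage, so both environments execute identical logic.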
As always in software, there is still plenty of room for improvement. On the PowerShell side, the focus is on leveling up the security of the pipeline, by introducing detection mechanisms for security issues such as the InjectionHunter rules, and on automating the generation of Markdown documentation from the cmdlets' comment-based help using PlatyPS.
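As a sketch of where this is headed (the paths and module lookup are assumptions), the InjectionHunter rules plug into PSScriptAnalyzer as custom rules, and PlatyPS turns comment-based help into Markdown:

```powershell
# Run the InjectionHunter security rules through PSScriptAnalyzer.
# Locating the rule module via Get-Module is one possible approach.
$rules = (Get-Module -ListAvailable InjectionHunter).Path
Invoke-ScriptAnalyzer -Path .\src -Recurse -CustomRulePath $rules

# Generate Markdown docs from the module's comment-based help with PlatyPS.
# 'MyModule' and the output folder are placeholder names.
Import-Module .\output\MyModule
New-MarkdownHelp -Module MyModule -OutputFolder .\docs -Force
```

Both steps would slot naturally into the existing InvokeBuild stages, right alongside the current PSScriptAnalyzer pass.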