1. Overview

“Pipeline-as-code” is the idea of allowing everyone involved in DevOps to create and maintain Jenkins pipelines. There are two ways of putting this principle into practice: Scripted and Declarative pipelines. “Pipeline-as-code” lets Jenkins treat pipelines as regular files, so people involved in DevOps can store, share, and version them with SCM.

2. Declarative Pipelines and Their Benefits

Declarative pipelines are a more recent approach to the “pipeline-as-code” principle, and they’re quite easy to write and understand. The structure may seem a bit complex at first, but overall, it contains only a couple of basic sections. The “pipeline” block is the main block and contains the entire declaration of a pipeline. In this example, we’ll consider only the “agent”, “stages”, and “steps” sections:

  • pipeline – contains the whole pipeline
    • agent – defines the machine that will handle this pipeline
    • stages – declares the stages of the pipeline
      • steps – small operations inside a particular stage

Let’s check what this structure will look like in our declarative pipeline:

pipeline {
    agent any
    stages {
        stage('Hello World') {
            steps {
                sh 'echo Hello World'
            }
        }
    }
}

Another important part of declarative pipelines is directives. Directives are a convenient way to include additional logic: they can bring tools into a pipeline and set triggers, environment variables, and parameters. Some directives can even prompt users for additional input. There’s an easy-to-use generator that can help with creating these directives.
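As an illustration, here’s a minimal sketch of a declarative pipeline combining the “triggers”, “environment”, and “parameters” directives. The schedule, variable names, and default values are only examples:

```groovy
pipeline {
    agent any
    // Poll SCM for changes every 15 minutes (example schedule)
    triggers {
        pollSCM('H/15 * * * *')
    }
    // Environment variables visible to every stage
    environment {
        APP_ENV = 'staging'
    }
    // A parameter the user can set when starting the build
    parameters {
        string(name: 'VERSION', defaultValue: '1.0.0', description: 'Version to build')
    }
    stages {
        stage('Print') {
            steps {
                // Both the parameter and the environment variable are
                // available to shell steps
                sh 'echo Building version $VERSION for $APP_ENV'
            }
        }
    }
}
```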

In the previous example, “steps” is a directive that contains the logic that will be executed in the stage. There’s a dedicated snippet generator that provides a convenient way of creating steps.

The power of declarative pipelines comes mostly from directives. Declarative pipelines can leverage the power of scripted pipelines by using the “script” directive. This directive will execute the lines inside as a scripted pipeline.
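For example, a “script” block lets us use plain Groovy, such as loops and local variables, right inside a declarative stage. The list of names here is purely illustrative:

```groovy
pipeline {
    agent any
    stages {
        stage('Greetings') {
            steps {
                // Everything inside "script" is executed as scripted-pipeline Groovy
                script {
                    def names = ['Alice', 'Bob']
                    for (name in names) {
                        echo "Hello, ${name}"
                    }
                }
            }
        }
    }
}
```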

3. Scripted Pipelines and Their Benefits

Scripted pipelines were the first implementation of the “pipeline-as-code” principle. They were designed as a DSL built on Groovy and provide an outstanding level of power and flexibility. However, this also requires some basic knowledge of Groovy, which sometimes isn’t desirable.

These kinds of pipelines have fewer restrictions on the structure. Also, they have only two basic blocks: “node” and “stage”. A “node” block specifies the machine that executes a particular pipeline, whereas the “stage” blocks are used to group steps that, when taken together, represent a separate operation. The lack of additional rules and blocks makes these pipelines quite simple to understand:

node {
    stage('Hello world') {
        sh 'echo Hello World'
    }
}

Think of scripted pipelines as declarative pipelines with only stages. The “node” block in this case plays the role of both the “pipeline” block and the “agent” directive from declarative pipelines.

As mentioned above, steps for the scripted pipelines can be generated with the same snippet generator. Because this type of pipeline doesn’t contain directives, steps contain all the logic. For very simple pipelines, this can reduce the overall code.

However, they may require additional code for boilerplate setups that declarative directives handle out of the box. More complex logic in such pipelines is usually implemented in Groovy.
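As a sketch of that idea, this scripted pipeline uses ordinary Groovy control flow where a declarative pipeline would need a “when” directive. The branch check is just an example, and BRANCH_NAME may be unset outside multibranch jobs, hence the fallback:

```groovy
node {
    stage('Conditional deploy') {
        // Plain Groovy control flow, no "when" directive needed
        def branch = env.BRANCH_NAME ?: 'main'
        if (branch == 'main') {
            sh 'echo Deploying'
        } else {
            echo "Skipping deploy on branch ${branch}"
        }
    }
}
```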

4. Comparison of Scripted and Declarative Pipelines

Let’s check a three-stage pipeline that pulls a project from Git, then tests, packages, and deploys it:

pipeline {
    agent any
    tools {
        maven 'maven' 
    }
    stages {
        stage('Test') {
            steps {
                git 'https://github.com/user/project.git'
                sh 'mvn test'
                archiveArtifacts artifacts: 'target/surefire-reports/**'
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean package -DskipTests' 
                archiveArtifacts artifacts: 'target/*.jar'
            }
        }
        stage('Deploy') {
            steps {
                sh 'echo Deploy'
            }
        }
    }
}

As we can see, all the logic resides inside the “steps” sections. Thus, if we translate this declarative pipeline to a scripted one, those parts remain unchanged:

node {
    stage('Test') {
        git 'https://github.com/user/project.git'
        sh 'mvn test'
        archiveArtifacts artifacts: 'target/surefire-reports/**'
    }
    stage('Build') {
        sh 'mvn clean package -DskipTests' 
        archiveArtifacts artifacts: 'target/*.jar'
    }
    stage('Deploy') {
        sh 'echo Deploy'
    }
}

A scripted pipeline for the same functionality looks denser than its declarative counterpart. However, since it lacks the “tools” directive, we must ensure that Maven is set up correctly on the agent. Moreover, if several Maven versions are installed, we need to select one directly in the pipeline, either through a concrete path or an environment variable.

There’s also a “withEnv” step that can be useful in scripted pipelines. With declarative pipelines, on the other hand, it’s quite easy to change the version of the tools in Jenkins configurations.
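For instance, “withEnv” can point a scripted pipeline at a specific Maven installation for a few steps only. The path below is an assumed install location:

```groovy
node {
    stage('Build') {
        // Prepend a specific Maven installation to PATH for the enclosed
        // steps only; "/opt/maven-3.9/bin" is a hypothetical location
        withEnv(['PATH+MAVEN=/opt/maven-3.9/bin']) {
            sh 'mvn --version'
        }
    }
}
```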

The previous example shows that for simple day-to-day tasks, there’s almost no difference in these approaches. If steps can cover all the basic needs for pipelines, these two approaches will be almost identical. Declarative pipelines are still preferred as they can simplify some common logic.

5. Conclusion

Scripted and declarative pipelines pursue the same goal and use the same pipeline subsystem under the hood. The major differences between them are flexibility and syntax. They’re just two different tools for solving the same problem, so we can and should use them interchangeably.

The succinct syntax of declarative pipelines will ensure a faster and smoother entrance to this field. At the same time, scripted pipelines may provide more power to more experienced users. In order to get the best from both worlds, we can leverage declarative pipelines with script directives.
