jenkinsci / pipeline-as-yaml-plugin

Jenkins Pipeline As Yaml Plugin

Home Page: https://plugins.jenkins.io/pipeline-as-yaml/

License: MIT License

Java 99.82% HTML 0.18%
jenkins jenkins-pipeline jenkins-plugin pipeline yaml pipeline-as-yaml pipeline-as-code multibranch-pipeline

pipeline-as-yaml-plugin's Introduction

Pipeline As Yaml Plugin for Jenkins


This plugin enables defining Jenkins Pipelines in YAML Format for Pipeline and MultiBranch Pipeline Jobs.

Important

Currently this plugin is in the incubation stage. It will evolve further to become more aligned with the Pipeline ecosystem, and some breaking changes are plausible. You are welcome to try out this plugin and to provide your feedback. Contributions are welcome!

Description

Jenkins enables defining pipelines with a specific DSL. With this plugin, Jenkins pipelines can be defined in YAML format.

The YAML definition is converted to Jenkins Declarative Pipeline syntax at runtime.

Any existing steps from the Snippet Generator or the Declarative Directive Generator can be used in step or script blocks.

Jenkins Declarative Pipeline Syntax rules must be followed.

Please see below for usage examples.

Usage

Pipeline

To use Pipeline As YAML in your Pipeline job, select one of the following options.

Editor

Define the Pipeline As YAML with the embedded editor.

pipelineAsScript

SCM

Retrieve the Pipeline As YAML from an SCM definition.

pipelineAsScm

MultiBranch Pipeline

To use Pipeline As YAML in your MultiBranch Pipeline, select `by Jenkinsfile As Yaml` in `Build Configuration`.

Build Configuration

Pipeline As Yaml Syntax

The pipeline definition must start with the `pipeline` key.

For detailed usage examples, please check here.

pipeline:
  agent: any
  ...
  ...

Agent

Example agent definition is shown below. Agent definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  agent:
    node:
      label: 'label'
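Since agent definitions can also appear under a stage, a hypothetical sketch of that placement (the stage name and `linux` label are placeholders, following the stage syntax shown later in this document) could look like:

```yaml
pipeline:
  agent:
    none:
  stages:
    - stage: "Build"
      # Assumed placement: a stage-level agent, mirroring the top-level syntax
      agent:
        node:
          label: 'linux'
      steps:
        - echo "building on a labeled node"
```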

Environment

Example definition is shown below. Environment definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  environment:
    KEY1: "VAL1"

Options

Example definition is shown below. Options definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  options:
    - "timeout(time: 1, unit: 'HOURS')"
    # Or any other 'options' directive which is generated by Declarative Directive Generator

Post

Example definition is shown below. Post definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  post:
    always:
      - echo "Test"
    changed:
      - echo "Test"
    # Or any other 'post' directive which is generated by Declarative Directive Generator 

Tools

Example definition is shown below. Tools definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline: 
  tools:
    maven: "maven"
    # Or any other 'tools' directive which is generated by Declarative Directive Generator

When

Example definition is shown below. When definitions can be used under stage definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  stages:
    - stage: "WhenTest"
      when:
        - "branch 'production'"
      # Or any other 'when' directive which is generated by Declarative Directive Generator

Parameters

Example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  parameters:
    - "string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')"
    # Or any other 'parameters' directive which is generated by Declarative Directive Generator

Triggers

Example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  triggers:
    - cron('H */4 * * 1-5')
    # Or any other 'triggers' directive which is generated by Declarative Directive Generator

Library

An example definition is shown below.

Before using the Library feature, please read here.

For further supported definition syntax, please check the documentation.

pipeline:
  library: "library@master"
  agent:
    any:
  stages:
    - stage: "Stage Library"
      steps:
        script:
          - "myCustomStepInLibrary"

Stages

Example definition is shown below.

For further supported definition syntax, please check the documentation.

pipeline:
  agent:
    none:
  stages:
    - stage: "Stage1"
      steps:
        - echo "1"
    - stage: "Stage2"
      steps:
        - echo "2"

Stages can also be nested:

pipeline:
  agent:
    none:
  stages:
    - stage: "Stage1"
      stages:
        - stage: "Inner Stage1"
          steps:
            - echo "1" 

Stages can also run in parallel:

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        - echo "1"
    - stage: "Parallel"
      parallel:
        - stage: "Parallel1"
          steps:
            - echo "P1"
        - stage: "Parallel2"
          steps:
            - echo "P2"

Steps

Example definition is shown below.

Any other step generated by the Snippet Generator can be used in steps definitions.

For further supported definition syntax, please check the documentation.

pipeline:
  stages:
    - stage: "Stage"
      steps:
        - echo env.WORKSPACE # Or any other 'step' which is generated by Snippet Generator

Any other step generated by the Snippet Generator, or a Groovy script, can be used in steps definitions.

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        script:
          - echo "1" # Or any other 'step' which is generated by Snippet Generator or Groovy script

For implementing complex scripts or steps:

pipeline:
  stages:
    - stage: "Stage1"
      steps:
        script: |
          echo "1"
          echo "2"
          echo "3"

Special Steps With Code Blocks

Some steps have their own code blocks, for example `withAnt`, `withEnv`, `withCredentials`, and `dir`, as well as any custom step definition that has its own code block.

These steps can also be defined in YAML.

Example definition is shown below.

pipeline:
  stages:
    - stage: "Stage"
      steps:
        script:
          - withAnt:
            script:
              - echo "No values"
          - withEnv: "['KEY=VAL']"
            script:
              - echo $KEY
          - withCredentials: "[usernamePassword(credentialsId: 'eedc7820-a4e0-4d87-a66d-b5b65ee42ad9', passwordVariable: 'PASSWORD', usernameVariable: 'USERNAME')]"
            script:
              - echo $USERNAME
          - withCredentials: "[string(credentialsId: '',variable: 'CRED')]"
            script:
              - echo $CRED

These steps can also be nested within each other's blocks.

pipeline:
  stages:
    - stage: "WithEnv Intertwined"
      steps:
        script:
          - withEnv: "['KEY1=VAL1']"
            script:
              - echo env.KEY1
              - withEnv: "['KEY2=VAL2']"
                script:
                  - echo env.KEY2

Custom steps can be converted to YAML format as shown below.

myCustomStep([customVariable: '']) {
    echo "some code"
}
pipeline:
  stages:
    - stage: "Stage"
      steps:
        script:
          - myCustomStep: "[customVariable: '']"
            script:
              - echo "some code"

Conversion and Validation

Before running Pipeline As YAML, you can convert it to a Declarative script and validate the pipeline. This way, errors can be caught before the pipeline runs.

To use this functionality, click the Pipeline Syntax page link shown in the job menu.

Pipeline Syntax

Click the "Pipeline As YAML Converter" link.

Pipeline As YAML Converter

Paste your Pipeline As YAML into the first text area and click the "Convert To Pipeline Declarative Script" button as shown below.

Paste Pipeline

After a successful conversion, the second text area will be filled with the Pipeline Declarative Script. For validation, click the "Validate" button as shown below.

Validate

Validation or error messages will be shown below the button.

Reporting Issues

Please create an issue in this repository.

Create Issue

Thank You!

If you feel generous today, you can buy me a coffee :)
Or you can star the project. Thanks.

pipeline-as-yaml-plugin's People

Contributors

aytuncbeken, danktec, dependabot[bot], jonesbusy, kennyg, lengyf, oleg-nenashev, strangelookingnerd


pipeline-as-yaml-plugin's Issues

Pipeline as YAML Parser classes do not propagate exception causes

Describe the bug
Pipeline as YAML parsers do not retain the cause exceptions. In such cases, some diagnostic info may be lost, and it may become harder for users to diagnose failure causes.

Sample code:

        catch (Exception e) {
            throw new PipelineAsYamlRuntimeException(e.getLocalizedMessage());
        }

Sample error from Jenkinsfile Runner:

[2020-07-17T20:27:45.806Z] 2020-07-17 20:27:45.752+0000 [id=112]	WARNING	i.j.j.runner.JenkinsEmbedder#before: Jenkins.theInstance was not cleared by a previous test, doing that now

[2020-07-17T20:27:46.618Z] 2020-07-17 20:27:46.513+0000 [id=133]	WARNING	o.j.p.w.flow.FlowExecutionList#unregister: Owner[job/1:job #1] was not in the list to begin with: []

[2020-07-17T20:27:46.618Z] Started

[2020-07-17T20:27:46.618Z] org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.exceptions.PipelineAsYamlRuntimeException: java.util.ArrayList cannot be cast to java.lang.String

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:53)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsScmFlowDefinition.create(PipelineCpsScmFlowDefinition.java:50)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:74)

[2020-07-17T20:27:46.618Z] 	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)

[2020-07-17T20:27:46.618Z] 	at hudson.model.ResourceController.execute(ResourceController.java:97)

[2020-07-17T20:27:46.618Z] 	at hudson.model.Executor.run(Executor.java:428)

[2020-07-17T20:27:46.618Z] Finished: FAILURE

To Reproduce

See the test in jenkinsci/jenkinsfile-runner#316

Expected behavior
All exceptions are properly propagated
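A minimal, hypothetical sketch of the expected fix (the exception class here is a stand-in for the plugin's PipelineAsYamlRuntimeException, reduced to what matters): passing the caught exception as the cause preserves the original exception and its stack trace for callers and logs.

```java
// Hedged sketch, not the plugin's actual code: wrapping the caught exception
// as the cause keeps the original diagnostics available.
public class CausePropagationDemo {

    // Stand-in for PipelineAsYamlRuntimeException with a cause-taking constructor.
    static class PipelineAsYamlRuntimeException extends RuntimeException {
        PipelineAsYamlRuntimeException(String message, Throwable cause) {
            super(message, cause); // the key change: pass 'cause' up
        }
    }

    static PipelineAsYamlRuntimeException wrapWithCause() {
        try {
            throw new ClassCastException("java.util.ArrayList cannot be cast to java.lang.String");
        } catch (Exception e) {
            // Instead of 'new PipelineAsYamlRuntimeException(e.getLocalizedMessage())',
            // keep the original exception attached:
            return new PipelineAsYamlRuntimeException(e.getLocalizedMessage(), e);
        }
    }

    public static void main(String[] args) {
        Throwable t = wrapWithCause();
        // The cause (and its stack trace) is now available to callers and logs.
        System.out.println("cause: " + t.getCause().getClass().getName());
    }
}
```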

Pipeline as YAML fails to run if user trigger it with Replay option

I have created a Pipeline in YAML as shown below.

pipeline:
    agent:
      any:
    stages:
     - stage: "Checkout"
       steps:
         script: 
           - git 'https://github.com/username/API.git'
     - stage: "Build Multi stage Docker Image"
       steps:
         script: 
           - sh "docker build -t username/webserver:v$BUILD_NUMBER ."

The pipeline was successful when I built it, but it fails when I replay it, with the error below.

Replayed #40
java.lang.ClassCastException: java.lang.String cannot be cast to java.util.LinkedHashMap
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:34)
Caused: org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.exceptions.PipelineAsYamlRuntimeException: java.lang.String cannot be cast to java.util.LinkedHashMap
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.parsers.PipelineParser.parse(PipelineParser.java:53)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsFlowDefinition.create(PipelineCpsFlowDefinition.java:41)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScriptFlowDefinition.create(PipelineAsYamlScriptFlowDefinition.java:56)
	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)
Finished: FAILURE

To Reproduce
To reproduce this issue please create a Pipeline using YAML and try to run via replay.

Expected behavior
The pipeline should be successful.

Pipeline Converter UI v1

Is your feature request related to a problem? Please describe.

  • A UI for converting YAML to Declarative
  • UI must be in the Pipeline Syntax Page for easy access

Require a working example of executing multiline shell script

Describe your use-case which is not covered by existing documentation.

Require a best-practice way to execute a multiline shell script.
The script can include multiple variable declarations and if/else statements.

I see that an individual shell command can be executed using
sh [command]
However, it would be great if we had a standard way of executing multiline shell scripts.

If it already exists, I would appreciate if someone can point me to the documentation or provide example for reference here.
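For what it's worth, combining the README's `script: |` literal-block example with a Groovy triple-quoted `sh` step suggests one untested sketch (the variables and commands are placeholders, and this has not been verified against the plugin):

```yaml
pipeline:
  agent: any
  stages:
    - stage: "Multiline Shell"
      steps:
        script: |
          sh '''
            GREETING="hello"
            if [ "$GREETING" = "hello" ]; then
              echo "greeting found"
            else
              echo "no greeting"
            fi
          '''
```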

Reference any relevant documentation, other materials or issues/pull requests that can be used for inspiration.

No response

Are you interested in contributing to the documentation?

No response

Converting `environment` with `credentials` method is broken?

Describe the bug
The environment section parser has a bug: the special helper method credentials is parsed incorrectly. It adds single quotes around the method call, which then causes errors in the pipeline.

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'Pipeline As YAML Converter'
  2. Paste the code from Additional context
  3. Press 'Convert to Pipeline'
  4. See how the environment section is converted
    4.1 Press 'Validate' and see the validation errors

Expected behavior
As expected by docs of environment syntax.
Convert this

environment:
  COVERALLS_SECRET_TOKEN: credentials('COVERALLS_SECRET_TOKEN')

to this (without quotes around credentials method):

environment {
  COVERALLS_SECRET_TOKEN = credentials('COVERALLS_SECRET_TOKEN')
}

Desktop (please complete the following information):

  • Ubuntu 18.04 on server, Linux Mint 20 on desktop
  • Chrome 84 on desktop

Additional context
I have code like this:

pipeline:
  agent:
    label: 'master'
  stages:
    - stage: Tests
      stages:
        - stage: Start tests
          environment:
            COVERALLS_SECRET_TOKEN: credentials('COVERALLS_SECRET_TOKEN')
          steps:
            - echo "====++++executing Start tests++++===="
            - echo "COVERALLS_SECRET_TOKEN = ${COVERALLS_SECRET_TOKEN}"
            - sh 'npm install'
            - sh 'npm test'
          post:
            always:
              - echo "====++++always++++===="
            success:
              - echo "====++++Start tests executed successfully++++===="
              - sh 'npm run coverage'
            failure:
              - echo "====++++Start tests execution failed++++===="

and this is parsing result:

pipeline {
  agent {
    node {
      label 'master'
    }
  }
  stages {
    stage('Tests') {
      stages {
        stage('Start tests') {
          environment {
            COVERALLS_SECRET_TOKEN = 'credentials('COVERALLS_SECRET_TOKEN')'
          }
          steps {
            echo "====++++executing Start tests++++===="
            echo "COVERALLS_SECRET_TOKEN = ${COVERALLS_SECRET_TOKEN}"
            sh 'npm install'
            sh 'npm test'
          }

Validation says that:

startup failed:
WorkflowScript: 18: Environment variable values must either be single quoted, double quoted, or function calls. @ line 18, column 50.
LS_SECRET_TOKEN = 'credentials('COVERALLS_
^

WorkflowScript: 17: No variables specified for environment @ line 17, column 11.
environment {
^

2 errors

Pipeline Support

Requirements:

  • Pipeline Job support needs to be added to plugin
  • Options
    • Jenkins file as YAML From SCM
    • Jenkins file as Yaml (Editor)

ERROR: ‘checkout scm’ is only available when using “Multibranch Pipeline” or “Pipeline script from SCM”

Jenkins and plugins versions report

Environment
Jenkins: 2.462.2
OS: Linux - 6.8.8-3-pve
Java: 17.0.12 - Eclipse Adoptium (OpenJDK 64-Bit Server VM)
---
antisamy-markup-formatter:162.v0e6ec0fcfcf6
apache-httpcomponents-client-4-api:4.5.14-208.v438351942757
asm-api:9.7-33.v4d23ef79fcc8
authentication-tokens:1.119.v50285141b_7e1
blueocean:1.27.16
blueocean-bitbucket-pipeline:1.27.16
blueocean-commons:1.27.16
blueocean-config:1.27.16
blueocean-core-js:1.27.16
blueocean-dashboard:1.27.16
blueocean-display-url:2.4.3
blueocean-events:1.27.16
blueocean-git-pipeline:1.27.16
blueocean-github-pipeline:1.27.16
blueocean-i18n:1.27.16
blueocean-jwt:1.27.16
blueocean-personalization:1.27.16
blueocean-pipeline-api-impl:1.27.16
blueocean-pipeline-editor:1.27.16
blueocean-pipeline-scm-api:1.27.16
blueocean-rest:1.27.16
blueocean-rest-impl:1.27.16
blueocean-web:1.27.16
bootstrap5-api:5.3.3-1
bouncycastle-api:2.30.1.78.1-248.ve27176eb_46cb_
branch-api:2.1178.v969d9eb_c728e
caffeine-api:3.1.8-133.v17b_1ff2e0599
checks-api:2.2.1
cloudbees-bitbucket-branch-source:888.v8e6d479a_1730
cloudbees-folder:6.951.v5f91d88d76b_b_
commons-lang3-api:3.17.0-84.vb_b_938040b_078
commons-text-api:1.12.0-129.v99a_50df237f7
configuration-as-code:1850.va_a_8c31d3158b_
credentials:1378.v81ef4269d764
credentials-binding:681.vf91669a_32e45
display-url-api:2.204.vf6fddd8a_8b_e9
docker-commons:443.v921729d5611d
docker-workflow:580.vc0c340686b_54
durable-task:577.v2a_8a_4b_7c0247
echarts-api:5.5.1-1
eddsa-api:0.3.0-4.v84c6f0f4969e
favorite:2.221.v19ca_666b_62f5
font-awesome-api:6.6.0-2
git:5.5.1
git-client:5.0.0
github:1.40.0
github-api:1.321-468.v6a_9f5f2d5a_7e
github-branch-source:1797.v86fdb_4d57d43
gson-api:2.11.0-41.v019fcf6125dc
handy-uri-templates-2-api:2.1.8-30.v7e777411b_148
htmlpublisher:1.36
instance-identity:185.v303dc7c645f9
ionicons-api:74.v93d5eb_813d5f
jackson2-api:2.17.0-379.v02de8ec9f64c
jakarta-activation-api:2.1.3-1
jakarta-mail-api:2.1.3-1
javax-activation-api:1.2.0-7
jaxb:2.3.9-1
jenkins-design-language:1.27.16
jjwt-api:0.11.5-112.ve82dfb_224b_a_d
jobConfigHistory:1268.v75ce751da_911
joda-time-api:2.13.0-85.vb_64d1c2921f1
jquery3-api:3.7.1-2
json-api:20240303-41.v94e11e6de726
json-path-api:2.9.0-58.v62e3e85b_a_655
junit:1300.v03d9d8a_cf1fb_
mailer:472.vf7c289a_4b_420
matrix-auth:3.2.2
matrix-project:832.va_66e270d2946
mina-sshd-api-common:2.13.2-125.v200281b_61d59
mina-sshd-api-core:2.13.2-125.v200281b_61d59
okhttp-api:4.11.0-172.vda_da_1feeb_c6e
pipeline-as-yaml:192.vc72d50cb_c258
pipeline-build-step:540.vb_e8849e1a_b_d8
pipeline-graph-analysis:216.vfd8b_ece330ca_
pipeline-groovy-lib:730.ve57b_34648c63
pipeline-input-step:495.ve9c153f6067b_
pipeline-milestone-step:119.vdfdc43fc3b_9a_
pipeline-model-api:2.2214.vb_b_34b_2ea_9b_83
pipeline-model-definition:2.2214.vb_b_34b_2ea_9b_83
pipeline-model-extensions:2.2214.vb_b_34b_2ea_9b_83
pipeline-stage-step:312.v8cd10304c27a_
pipeline-stage-tags-metadata:2.2214.vb_b_34b_2ea_9b_83
plain-credentials:183.va_de8f1dd5a_2b_
plugin-util-api:5.1.0
prism-api:1.29.0-17
pubsub-light:1.18
role-strategy:743.v142ea_b_d5f1d3
scm-api:696.v778d637b_a_762
script-security:1362.v67dc1f0e1b_b_3
snakeyaml-api:2.3-123.v13484c65210a_
sse-gateway:1.27
ssh-credentials:343.v884f71d78167
sshd:3.330.vc866a_8389b_58
structs:338.v848422169819
swarm:3.47
token-macro:400.v35420b_922dcb_
variant:60.v7290fc0eb_b_cd
workflow-api:1336.vee415d95c521
workflow-basic-steps:1058.vcb_fc1e3a_21a_9
workflow-cps:3964.v0767b_4b_a_0b_fa_
workflow-durable-task-step:1371.vb_7cec8f3b_95e
workflow-job:1436.vfa_244484591f
workflow-multibranch:795.ve0cb_1f45ca_9a_
workflow-scm-step:427.v4ca_6512e7df1
workflow-step-api:678.v3ee58b_469476
workflow-support:926.v9f4f9b_b_98c19

When creating a Jenkinsfile.yaml and running it from SCM, I get the following error:
ERROR: ‘checkout scm’ is only available when using “Multibranch Pipeline” or “Pipeline script from SCM”

This is the pipeline:

pipeline:
  agent: any
  stages:
    - stage: "Update1"
      agent:
        docker:
          image: 'python:3'
          label: "update1"
      steps:
        script:
          - checkout scm
          - sh 'pip list'
          - sh 'ls -la'

Running the equivalent Jenkinsfile works.

This is the cause of the exception: SCMVars#L83

What Operating System are you using (both controller, and any agents involved in the problem)?

Docker on debian

Reproduction steps

  1. Create Jenkinsfile.yaml
  2. Create jenkins job with "Pipeline as Yaml from SCM"
  3. Run the job

Expected Results

Code is checked out as expected

Actual Results

Jobs fails

Anything else?

The same job, converted to Declarative works as expected

Are you interested in contributing a fix?

No response

Matrix support

Is your feature request related to a problem? Please describe.
I need Jenkins matrix support for some of my projects. I don't see any way to describe it in YAML.

Describe the solution you'd like
I would like to be able to describe matrices in YAML.

Describe alternatives you've considered
N/A

Additional context
I am pretty new to jenkins pipelines so maybe I am missing something about how to describe matrices in a pipeline.

Status of this project?

Hi everyone,

I just wanted to know what the status of this project is.

We (the company) will be moving to Jenkins in the near future and the devs have been looking at how to best migrate our existing pipelines (in Azure DevOps and TeamCity).

I understand that this project is currently in incubation but I haven't seen much activity on the repository (esp. Releases).

Pipeline as YAML should not depend on Pipeline Aggregator

Describe the bug
Pipeline Aggregator is a meta-plugin which includes A LOT of Pipeline plugins: https://github.com/jenkinsci/workflow-aggregator-plugin . It bloats the dependency scope of the plugin and prevents Pipeline as YAML from being included in the default distribution due to circular dependencies.

Example of a dependency conflict caused by an old plugin version in the aggregator:

Require upper bound dependencies error for com.google.guava:guava:11.0.1 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-com.google.guava:guava:11.0.1
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.kohsuke.stapler:stapler-jrebel:1.259
      +-org.kohsuke.stapler:stapler:1.259
        +-com.google.guava:guava:11.0.1
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.jenkins-ci.plugins.workflow:workflow-aggregator:2.6
      +-org.jenkinsci.plugins:pipeline-model-definition:1.3.2
        +-org.jenkinsci.plugins:pipeline-model-api:1.3.2
          +-com.github.fge:json-schema-validator:2.0.4
            +-com.github.fge:json-schema-core:1.0.4
              +-com.google.guava:guava:11.0.1 (managed) <-- com.google.guava:guava:13.0.1

To Reproduce
Steps to reproduce the behavior:

  1. See jenkinsci/jenkinsfile-runner#316

Expected behavior
The Pipeline as YAML plugin declares dependencies only on the Pipeline components it needs. https://github.com/jenkinsci/bom can ideally be used to simplify this process and further management.

When replaying a pipeline the code shown is the transformed declarative code rather than the original YAML

Describe the bug
I created a pipeline as YAML. Then ran it, so build n.1 completed. Clicked on the build, then hit the "replay" button, it showed the code of the pipeline, not as YAML anymore, but instead as its corresponding declarative code.

To Reproduce
Steps to reproduce the behaviour:

  1. Create a simple "hello world" pipeline as YAML and save it.
  2. Build the pipeline, wait until build completes.
  3. Click on the build, then, click on "replay"
  4. See the code is not YAML anymore. Instead, it is the declarative version of the previous YAML code.

Expected behaviour
Replaying a build from a pipeline defined as YAML should show the original YAML code.

Example of YAML pipeline

pipeline:
  agent:
    label: 'master'
  stages:
   - stage: Setup
     steps: echo "hello world"

Declarative pipeline
When replaying the code of the pipeline above, this is shown

pipeline {
  agent {
    node {
      label 'master'
    }
  }
  stages {
    stage('Setup') {
      steps {
        echo "hello world"
      }
    }
  }
}

Proposal: Change the package name for the code and tests

Currently the code uses the org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline package. A few concerns there:

  • org.jenkinsci.plugins.workflow is an old root for Pipeline plugins. Maybe it makes sense to use io.jenkins.plugins.pipeline
  • multibranch does not seem to be needed, the code does not really depend on MultiBranch logic

If we do the change, it needs to happen before 1.0 release. It will be a breaking change, and I doubt it makes sense to spend time on data migration logic to prevent that.

scenarioInput stage missing close brace

Describe the bug
scenarioInput stage missing close brace

Additional context
pipeline {
  agent none
  stages {
    stage('Stage1') {
      input {
        message "message"
        id "id"
        ok "ok"
        submitter "submitter"
        submitterParameter "submitterParameter"
        parameters {
          string(name: 'PERSON', defaultValue: 'Mr Jenkins', description: 'Who should I say hello to?')
        }
      // note: the generated 'input' block above is never closed
      steps {
        echo "1"
      }
    }
  }
}

Schema Validation

Requirements:

  • Schema of Yaml file should be validated before trying to parse.

Pipeline YAML may get processed as Groovy DSL when CpsFlowFactoryAction2 is present

Describe the bug
I hit this issue in the simple demo for jenkinsci/jenkinsfile-runner#316. Jenkinsfile Runner uses the SCM source and the virtual FilesystemSCM, and assumptions in the Pipeline as YAML code may lead to incorrect behavior if a CpsFlowFactoryAction2 is present in the created Pipeline.

THIS IS NOT A BUG IN PIPELINE AS YAML, but some code hardening may make sense

Step 0. Jenkinsfile Runner adds the SetJenkinsfileLocation action which implements CpsFlowFactoryAction2. https://github.com/jenkinsci/jenkinsfile-runner/blob/9f41f51b6dc320b9dd5c0fa6d81f179518597d37/payload/src/main/java/io/jenkins/jenkinsfile/runner/SetJenkinsfileLocation.java

Step 1. PipelineCpsScmFlowDefinition converts the YAML to Groovy DSL and then calls the CpsFlowDefinition constructor

@Override
public CpsFlowExecution create(FlowExecutionOwner owner, TaskListener listener, List<? extends Action> actions) throws Exception {
    CpsFlowExecution cpsFlowExecution = super.create(owner, listener, actions);
    String yamlJenkinsFileContent = cpsFlowExecution.getScript();

    ....

    String jenkinsFileContent = pipelineModel.get().toPrettyGroovy();
    return new CpsFlowDefinition(jenkinsFileContent, cpsFlowExecution.isSandbox()).create(owner, listener, actions);
}

Step 2. The CpsFlowDefinition flow execution creator consults the actions passed as arguments. One of these actions is SetJenkinsfileLocation, which makes the method return ((CpsFlowFactoryAction2) a).create(this, owner, actions); instead of taking the default path coded below. The create() method actually invoked builds the execution from scratch and ignores the converted DSL.

    @Override
    @SuppressWarnings("deprecation")
    public CpsFlowExecution create(FlowExecutionOwner owner, TaskListener listener, List<? extends Action> actions) throws IOException {
        for (Action a : actions) {
            if (a instanceof CpsFlowFactoryAction) {
                CpsFlowFactoryAction fa = (CpsFlowFactoryAction) a;
                return fa.create(this,owner,actions);
            } else if (a instanceof CpsFlowFactoryAction2) {
                return ((CpsFlowFactoryAction2) a).create(this, owner, actions);
            }
        }
        Queue.Executable exec = owner.getExecutable();
        FlowDurabilityHint hint = (exec instanceof Run) ? DurabilityHintProvider.suggestedFor(((Run)exec).getParent()) : GlobalDefaultFlowDurabilityLevel.getDefaultDurabilityHint();
        return new CpsFlowExecution(sandbox ? script : ScriptApproval.get().using(script, GroovyLanguage.get()), sandbox, owner, hint);
    }

Step 3. A standard Groovy Converter is called. Execution fails with...

org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 5: expecting EOF, found ':' @ line 5, column 10.
     - stage: "Print Hello"
            ^

1 error

        at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
        at org.codehaus.groovy.control.ErrorCollector.addFatalError(ErrorCollector.java:150)
        at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:120)
        at org.codehaus.groovy.control.ErrorCollector.addError(ErrorCollector.java:132)
        at org.codehaus.groovy.control.SourceUnit.addError(SourceUnit.java:350)
        at org.codehaus.groovy.antlr.AntlrParserPlugin.transformCSTIntoAST(AntlrParserPlugin.java:144)
        at org.codehaus.groovy.antlr.AntlrParserPlugin.parseCST(AntlrParserPlugin.java:110)
        at org.codehaus.groovy.control.SourceUnit.parse(SourceUnit.java:234)
        at org.codehaus.groovy.control.CompilationUnit$1.call(CompilationUnit.java:168)
        at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:943)
        at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:605)
        at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
        at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
        at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)

To Reproduce
Run a demo from jenkinsci/jenkinsfile-runner#316

Expected behavior
Pipeline as YAML code is more robust against custom CpsFlowFactoryAction2 implementations. Additional coverage for Pipeline replay/restart functionality might be needed

Additional context
For me the resolution will be clearly on the Jenkinsfile Runner side. This is rather code hardening, not a bug

Parity with standard Jenkinsfile for Docker

Is your feature request related to a problem? Please describe.

Since the standard Jenkinsfile doesn't require an explicit clone, Pipeline as YAML shouldn't either.

Describe the solution you'd like

When it is YAML from SCM, the code should automatically be cloned (as per the configuration) to be consistent with what happens for Jenkinsfile from SCM.

Describe alternatives you've considered

The workaround is to add an explicit checkout

This works:

pipeline {
    agent { 
        docker { 
            image 'maven:3.3.3' 
            reuseNode true
        }
    }
    stages {
        stage('build') {
            steps {
                sh 'mvn clean package'
            }
        }
    }
}

But this doesn't:

pipeline:
  agent:
    docker:
      image: maven:3.3.3
      # 2. Reusing the node (double-check that this doesn't happen automatically)
      reuseNode: 'true'
  stages:
    - stage: "build"
      steps:
        - sh "mvn clean package"

Unless you add a clone step before the Maven build:

        - "checkout([$class: 'GitSCM', branches: [[name: '*/jenkins-poc']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'example-cred', url: 'https://github.com/example-org/example-repo.git']]])"
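
Put together, the workaround looks roughly like this (a sketch, untested; `checkout scm` is the shorthand step and only resolves when the job is configured with an SCM definition, as it is for 'Pipeline As Yaml from SCM'):

```yaml
pipeline:
  agent:
    docker:
      image: "maven:3.3.3"
      reuseNode: "true"
  stages:
    - stage: "build"
      steps:
        # Explicit checkout, until the plugin clones automatically
        - "checkout scm"
        - sh "mvn clean package"
```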

Can't save custom script path

Describe the bug
When job Pipeline settings is set to 'Pipeline As Yaml from SCM' and a custom path to the Jenkinsfile.yaml is given, the location is not saved even after hitting the 'Save' button.

Steps to reproduce the behavior:

  1. Create a new job
  2. Setup the pipeline section with 'Pipeline As Yaml from SCM' and feed it with SCM/repo/credentials/branch and a script path, if Jenkinsfile.yaml is in a subdirectory of a (git) project.
  3. Click Save
  4. If you check the job configuration again, everything is fine except that the specified path has been replaced by "Jenkinsfile.yaml"

Expected behavior
The specified script path location is taken into account.

Versions:

  • Jenkins Version : 2.249.2
  • plugin version : 0.12-rc

stash and unstash

Can you please provide examples of stash and unstash usage? I tried the snippet below, but there seems to be no way to declare them.

  agent:
    label: "Master"
  steps:
    - deleteDir()
    - unstash 'ucdPackage'
    - echo '1'
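
For reference, a sketch of how this might be declared (untested): steps need to sit under a stage, and steps with named parameters can be quoted as a whole string, matching the plugin's quoted-step style elsewhere in this document. The stage names and the 'includes' pattern below are hypothetical.

```yaml
pipeline:
  agent:
    label: "Master"
  stages:
    - stage: "Package"
      steps:
        # Whole step quoted as one string so YAML doesn't split the parameters
        - "stash name: 'ucdPackage', includes: 'target/**'"
    - stage: "Deploy"
      steps:
        - deleteDir()
        - unstash 'ucdPackage'
        - echo '1'
```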

SnakeYAML method not found

SnakeYAML API Plugin 2.2-111.vc6598e30cc65
java.lang.NoSuchMethodError: org.yaml.snakeyaml.representer.Representer: method 'void <init>()' not found
at io.jenkins.plugins.pipeline.parsers.AbstractParser.<init>(AbstractParser.java:25)
at io.jenkins.plugins.pipeline.parsers.PipelineParser.<init>(PipelineParser.java:26)
at io.jenkins.plugins.pipeline.cps.PipelineCpsScmFlowDefinition.create(PipelineCpsScmFlowDefinition.java:50)
at io.jenkins.plugins.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:74)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:311)
at hudson.model.ResourceController.execute(ResourceController.java:101)
at hudson.model.Executor.run(Executor.java:442)

Add support for Kubernetes plugin

Hi,
It would be great if we could use pipeline-as-yaml-plugin to describe a pipeline which uses the kubernetes agent. At the moment it doesn't seem to be possible.
Something like:

pipeline:
  agent:
    kubernetes:
      cloud: mystack
      yaml: >
        apiVersion: v1
        kind: Pod
        spec:
            imagePullSecrets:
            - name: my-creds
            containers:
            - name: ubuntu
              image: myimage:1.1
              command: ['sleep', 'infinity']
              tty: true
              imagePullPolicy: Always
  stages:
    - stage: Test
      steps: echo "Hello world"

I've tried to run the pipeline above through the conversion tool provided by the plugin and it almost worked. The problem is that the "yaml" key's value gets converted into a single-quoted string, which doesn't work with multiline strings such as the pod definition above.
I reckon that if the plugin wrapped the value of the yaml key in a multiline string (e.g. '''string''') it would work.
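
In other words, the desired conversion output would look roughly like this (a sketch of the expected Declarative result, not what the plugin currently emits):

```groovy
pipeline {
    agent {
        kubernetes {
            cloud 'mystack'
            // Pod spec wrapped in a Groovy triple-quoted string,
            // so line breaks in the embedded YAML are preserved
            yaml '''
apiVersion: v1
kind: Pod
spec:
  imagePullSecrets:
  - name: my-creds
  containers:
  - name: ubuntu
    image: myimage:1.1
    command: ['sleep', 'infinity']
    tty: true
    imagePullPolicy: Always
'''
        }
    }
    stages {
        stage('Test') {
            steps {
                echo 'Hello world'
            }
        }
    }
}
```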

NPE when running a job with 'Pipeline As Yaml from SCM'

The problem only occurs while running a job with 'Pipeline As Yaml from SCM'. What follows is the Output console content :

java.lang.NullPointerException
	at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.<init>(CpsScmFlowDefinition.java:78)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.cps.PipelineCpsScmFlowDefinition.<init>(PipelineCpsScmFlowDefinition.java:30)
	at org.jenkinsci.plugins.workflow.multibranch.yaml.pipeline.PipelineAsYamlScmFlowDefinition.create(PipelineAsYamlScmFlowDefinition.java:73)
	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:309)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:428)
Finished: FAILURE

Steps to reproduce the behavior:

  1. Create a new job
  2. Setup the pipeline section with 'Pipeline As Yaml from SCM' and feed it with SCM/repo/credentials/branch.
  3. Click Save
  4. Running the job leads to a failure with an NPE

Expected behavior
A healthy job execution.

Versions:

  • Jenkins Version : 2.249.2
  • Plugin Version : 0.12-rc

Context:
It may be related to this issue : #34

Pipeline as YAML defines dependency on a higher slf4j-api version than the Jenkins core

Jenkins core includes slf4j-api, and current core versions use 1.7.26 while the plugin uses 1.7.30. This causes upper-bound dependency errors for components that use Maven Enforcer:

Require upper bound dependencies error for org.slf4j:jcl-over-slf4j:1.7.26 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:jcl-over-slf4j:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:jcl-over-slf4j:1.7.26 (managed) <-- org.slf4j:jcl-over-slf4j:1.7.30
,
Require upper bound dependencies error for org.slf4j:slf4j-api:1.7.26 paths to dependency are:
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:slf4j-api:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:jcl-over-slf4j:1.7.26
      +-org.slf4j:slf4j-api:1.7.26 (managed) <-- org.slf4j:slf4j-api:1.7.30
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-org.jenkins-ci.main:jenkins-core:2.235.2
    +-org.slf4j:log4j-over-slf4j:1.7.26
      +-org.slf4j:slf4j-api:1.7.26
and
+-io.jenkins.jenkinsfile-runner:payload-dependencies:1.0-beta-13-SNAPSHOT
  +-io.jenkins.plugins:pipeline-as-yaml:0.9-rc
    +-org.slf4j:slf4j-jdk14:1.7.26 (managed) <-- org.slf4j:slf4j-jdk14:1.7.30
      +-org.slf4j:slf4j-api:1.7.26 (managed) <-- org.slf4j:slf4j-api:1.7.30

To Reproduce
jenkinsci/jenkinsfile-runner#316

Expected behavior
Pipeline as YAML should use the same library versions as Jenkins core. The recommendation is to update the plugin parent POM to 4.x and to use the dependency versions provided by the Jenkins plugin Bill of Materials (BOM).
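
A minimal sketch of what that POM change could look like (the parent and BOM artifact/version values below are placeholders to be matched to the plugin's actual Jenkins baseline, not tested against this plugin):

```xml
<!-- Parent POM 4.x enables BOM-based dependency management -->
<parent>
  <groupId>org.jenkins-ci.plugins</groupId>
  <artifactId>plugin</artifactId>
  <version>4.x</version>
</parent>

<dependencyManagement>
  <dependencies>
    <!-- Import the Jenkins BOM line matching the core baseline, e.g. 2.235.x -->
    <dependency>
      <groupId>io.jenkins.tools.bom</groupId>
      <artifactId>bom-2.235.x</artifactId>
      <version>LATEST_BOM_RELEASE</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With the BOM imported, the plugin should drop its explicit slf4j version overrides and inherit the versions pinned by Jenkins core.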
