Releases: SAP-archive/cloud-s4-sdk-pipeline
Version 45
v45
⚠️ Deprecation of SAP Cloud SDK Pipeline
This is the last planned release of the SAP Cloud SDK Pipeline. Over the last months, the SAP Cloud SDK Pipeline and its features were migrated into the General Purpose Pipeline of project "Piper", which replaces the SAP Cloud SDK Pipeline and should be used instead. The reasoning, as well as further information on how to adopt the General Purpose Pipeline, is described in the guide.
Note: For SAP-internal teams there is an additional guide which covers some SAP-specific aspects. This guide can be found in the SAP-internal extension repository.
⚠️ Breaking changes
checkGatling renamed to gatlingExecuteTests
The step `checkGatling` has been migrated into project "Piper" as `gatlingExecuteTests`, adopting the naming convention for steps. This step is executed by the Cloud SDK Pipeline in the stage `performanceTests`, but only if it is enabled via step configuration. This step configuration has to be adapted in your `.pipeline/config.yml` as shown in the diff below:
 steps:
-  checkGatling:
+  gatlingExecuteTests:
     enabled: true
Version 44
v44
⚠️ Breaking changes
Backend & Frontend Integration Tests
Due to alignments with project "Piper", the stages `backendIntegrationTests` and `frontendIntegrationTests` have been merged into the project "Piper" stage `integration`. Please move any existing configuration for the stages `backendIntegrationTests` and `frontendIntegrationTests` to the configuration of the stage `integration`.
For example:
 stages:
-  backendIntegrationTests:
+  integration:
     retry: 2
     credentials:
       - alias: 'ERP'
         credentialId: 'erp-credentials'
       - alias: 'SF'
         credentialId: 'successfactors-credentials'
Improvements
Conditional Execution of Stages
It is possible to consistently enable or disable all conditional stages with the config key `runInAllBranches`. The stages `productionDeployment`, `artifactDeployment`, `compliance`, and `security` are disabled by default for non-productive branches. They can be enabled also for non-productive branches by configuring the respective stage with `runInAllBranches`. Example `.pipeline/config.yml` file snippet to enable `security` also in non-productive branches:
stages:
  security:
    runInAllBranches: true
Similarly, if there are stages which by default also run in your non-productive branches but which you do not want there, you can disable them like this:
stages:
  endToEndTests:
    runInAllBranches: false
This would then deviate from the default behavior and run the End to End Tests stage only for the productive branch.
Disable Usage of Deprecated Jenkins Plugins
The `checksPublishResults` step uses some Jenkins plugins which have been deprecated in favor of `warnings-ng`.
When replacing the SAP Cloud SDK Pipeline specific build stage with the more generic build stage of project "Piper", those plugins became a requirement of SAP Cloud SDK Pipeline, which was not intended.
Due to backwards compatibility concerns in project "Piper" general purpose pipeline, the old plugins are still available, but they have been disabled by default in SAP Cloud SDK Pipeline so that having those plugins installed is not required anymore.
If you need any of the old plugins, see the docs of the `checksPublishResults` step on how to enable them in your pipeline config file.
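As a sketch, re-enabling one of the legacy publishers could look as follows; the `pmd` publisher and its `active` flag are taken from the project "Piper" `checksPublishResults` documentation, but please verify the exact keys for your plugin against that documentation:

```yaml
steps:
  checksPublishResults:
    pmd:
      active: true # re-enables the deprecated PMD publisher
```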
Version 43
⚠️ Breaking changes
Quality Checks stage
Due to alignments with project "Piper", the `s4SdkQualityChecks` stage has been removed. Please remove any configuration for the stage `s4SdkQualityChecks` from your pipeline configuration or custom default configuration.
Additional tools stage
The Third-party Checks stage has been aligned with project "Piper", and the `additionalTools` stage was not migrated. In case you have a custom extension for the `additionalTools` stage, please migrate it to be an extension of the stage `security` instead. The `security` stage is documented in project "Piper". In addition, this stage was used for running code analysis tools, e.g., Vulas, for internal projects; this usage has been removed as well.
Checkmarx Scan stage
Similarly, the `checkmarxScan` stage has been merged into the project "Piper" stage `security`. The configuration of Checkmarx needs to be moved to the step configuration of the `checkmarxExecuteScan` step. For example:
-stages:
-  checkmarxScan:
+steps:
+  checkmarxExecuteScan:
     groupId: <Checkmarx GroupID>
     vulnerabilityThresholdMedium: 5
     checkMarxProjectName: 'My_Application'
     vulnerabilityThresholdLow: 999999
     filterPattern: '!**/*.log, !**/*.lock, !**/*.json, !**/*.html, !**/Cx*, **/*.js, **/*.java, **/*.ts'
     fullScansScheduled: false
     generatePdfReport: true
     incremental: true
     preset: '36'
     checkmarxCredentialsId: CHECKMARX-SCAN
     checkmarxServerUrl: http://localhost:8089
NPM dependency audit stage
Due to alignments with project "Piper", the `npmAudit` stage has been removed. Please remove any configuration for the stage `npmAudit` from your pipeline configuration or custom default configuration.
Lint stage
The `lint` stage has been aligned with project "Piper", and the `checkUi5BestPractices` step was not migrated, since the tool it used is deprecated. In addition, linting will now be executed as part of the `build` stage instead of in the dedicated `lint` stage. Thus, the configuration for the `lint` stage has to be removed, as it no longer has any effect. Instead, please configure the step `npmExecuteLint` in the steps section of your project configuration, as described in the documentation. For example:
 stages:
-  lint:
-    ui5BestPractices:
-      esLanguageLevel: es2020
-      failThreshold:
-        error: 3
-        warning: 5
-        info: 7
 steps:
+  npmExecuteLint:
+    failOnError: true
Static code checks stage
The `staticCodeChecks` stage has been aligned with project "Piper". The static code checks will now be executed as part of the `build` stage instead of in the dedicated `staticCodeChecks` stage.
Frontend unit tests stage
The `frontendUnitTests` stage has been aligned with project "Piper" in version v43. The stage has been renamed to `additionalUnitTests`. The `additionalUnitTests` stage is documented in project "Piper". Please move any existing stage configuration for the stage `frontendUnitTests` to the stage `additionalUnitTests`. For example:
 stages:
-  frontendUnitTests:
+  additionalUnitTests:
     dockerImage: 'myDockerImage'
Renaming of sonarQubeScan stage
Continuing with the alignment efforts, the execution of the step `sonarExecuteScan` has been integrated into the project "Piper" stage Compliance, and the Cloud SDK Pipeline executes that stage instead. To activate this stage, the step `sonarExecuteScan` needs to be configured in your `.pipeline/config.yml` as described in the documentation. By default, the pipeline will run the stage only for the productive branch, as before, but you can run it in all branches by configuring the option `runInAllBranches: true` for the stage `compliance`. Also note that the parameter `sonarProperties` has been renamed to `options`.
If you use a custom extension for this stage, please rename the extension file to `compliance.groovy`.
The following diff shows the necessary migration of the configuration:
 steps:
+  sonarExecuteScan:
+    projectKey: "my-project"
+    instance: "MySonar"
+    dockerImage: "myDockerImage"
+    options:
+      - "sonar.sources=./application"
 stages:
-  sonarQubeScan:
+  compliance:              # The stage config is only necessary,
     runInAllBranches: true # if you need to activate 'runInAllBranches'.
-    projectKey: "my-project"
-    instance: "MySonar"
-    dockerImage: "myDockerImage"
-    sonarProperties:
-      - "sonar.jacoco.reportPaths=s4hana_pipeline/reports/coverage-reports/unit-tests.exec,s4hana_pipeline/reports/coverage-reports/integration-tests.exec"
-      - "sonar.sources=./application"
Specifying `sonar.jacoco.reportPaths` as previously documented is no longer necessary.
Recent versions of the SonarQube plugin (8.x) no longer support coverage reports in .exec binary format.
They only support .xml reports generated from the JaCoCo maven plugin.
As of now, it is a known issue that importing code coverage into the SonarQube service does not work in the Cloud SDK Pipeline out of the box.
If you need this, please open an issue on GitHub.
Migration to whitesourceExecuteScan step
The stage `whitesourceScan` has been replaced with the project "Piper" stage `security`. Now the step `whitesourceExecuteScan` will be executed, and the stage is activated if the step `whitesourceExecuteScan` is configured in your `.pipeline/config.yml` file. The existing configuration for the stage `whitesourceScan` has to be moved to the step `whitesourceExecuteScan` with some modifications:
 steps:
+  whitesourceExecuteScan:
+    productName: 'THE PRODUCT NAME AS IN WHITESOURCE'
+    orgAdminUserTokenCredentialsId: 'Jenkins-credentials-id-org-token'
+    userTokenCredentialsId: 'Jenkins-credentials-id-user-token'
+    productVersion: 'current' # replaces staticVersion
+    cvssSeverityLimit: '5'    # optional
 stages:
-  whitesourceScan:
-    product: 'THE PRODUCT NAME AS IN WHITESOURCE'
-    credentialsId: 'Jenkins-credentials-id-org-token'
-    whitesourceUserTokenCredentialsId: 'Jenkins-credentials-id-user-token'
-    staticVersion: true
Note that the step will now fail the pipeline if the scan finds security vulnerabilities in any module that exceed the defined severity limit.
This can be controlled with the new parameter `cvssSeverityLimit`. For more information about the step `whitesourceExecuteScan`, please refer to its project "Piper" documentation.
With the new step implementation, there is also a potential change in the naming of the WhiteSource projects with regard to the version. The naming scheme for each WhiteSource project that is part of a scan is `<module name> - <version>`. The version part is now guaranteed to be consistent across a single scan. If the parameter `productVersion` (formerly `staticVersion`) is configured, the version is taken from there. Otherwise, it is taken from the main build descriptor file (i.e., `mta.yaml`) in accordance with the step parameter `versioningModel`, which defaults to `major`. The version in the build descriptor files of the individual modules is ignored.
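As a sketch of the versioning model: with a version `1.2.3` in the main build descriptor, the default `versioningModel: 'major'` would yield the WhiteSource version `1`. Assuming `major-minor` is among the supported values of the project "Piper" step (please verify against the `whitesourceExecuteScan` documentation), the model could be changed like this:

```yaml
steps:
  whitesourceExecuteScan:
    versioningModel: 'major-minor' # e.g. descriptor version 1.2.3 becomes '1.2'
```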
Send notification post action
Due to alignments with project "Piper", the `sendNotification` post action has been removed. Please remove any configuration for the post action `sendNotification` from your pipeline configuration or custom default configuration, and use the project "Piper" step `mailSendNotification` instead.
Post Pipeline Hook Extension
We moved some of the post actions into the default "Post Pipeline Hook" stage implementation. If you implemented an extension for the "Post Pipeline Hook" stage in a file named `postPipelineHook.groovy`, please make sure to always run `parameters.originalStage.call()`. Otherwise, some default post actions might be skipped.
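A minimal `postPipelineHook.groovy` extension might look as follows; the surrounding `echo` statements are illustrative, the essential part is the call to `parameters.originalStage.call()`, which keeps the default post actions running:

```groovy
void call(Map parameters) {
    // custom logic before the default post actions (illustrative)
    echo "Entering custom post pipeline hook"

    // always invoke the original stage so default post actions are not skipped
    parameters.originalStage.call()

    // custom logic after the default post actions (illustrative)
    echo "Leaving custom post pipeline hook"
}
return this
```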
Version 42
⚠️ Breaking changes
createHdiContainer step removal
The `backendIntegrationTests` stage has been aligned with project "Piper" in version v42, and activation of the `createHdiContainer` step was not migrated. If you still need the functionality provided by the `createHdiContainer` step, please open an issue in our pipeline repository. In the meantime, it is also possible to implement an extension for the `backendIntegrationTests` stage using the extensibility concept explained in the documentation. The step `createHdiContainer` is still available for use in extensions; configuration options must be passed via parameters only.
Backend integration tests stage (only for JS integration tests)
The name `ci-integration-test` for the npm script which is executed as part of the backend integration tests stage is deprecated. From v42 onwards it is required to change the name of the script in your `package.json` files to the new name `ci-it-backend`, since the script `ci-integration-test` will no longer be executed as part of the backend integration tests.
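The rename only affects the script name, not its content. A sketch of the change in `package.json` (the test command itself is just a placeholder, not taken from the release notes):

```diff
 "scripts": {
-  "ci-integration-test": "jest --config=jest.integration.config.js"
+  "ci-it-backend": "jest --config=jest.integration.config.js"
 }
```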
Frontend unit tests stage
The name `ci-test` for the npm script which is executed as part of the frontend unit tests stage is deprecated. From v42 onwards it is required to change the name of the script in your `package.json` files to the new name `ci-frontend-unit-test`, since the script `ci-test` will no longer be executed as part of the frontend unit tests.
Renaming of keys in runStage configuration map
Due to further alignment efforts with project "Piper", the keys (identifiers for the different stages) used in the `runStage` map have changed. This is a breaking change for users that use a custom pipeline or overwrite the `runStage` map or its entries as part of an extension. In particular, the keys have been changed from the upper-case notation used before to the respective stage names in camel case. For example:
- script.commonPipelineEnvironment.configuration.runStage.BACKEND_INTEGRATION_TESTS = false
+ script.commonPipelineEnvironment.configuration.runStage.backendIntegrationTests = false
New Features
Fixes
- In versions v40 and v41 of the Cloud SDK Pipeline, the Lint stage could fail while trying to record issues if the version of the Jenkins plugin `warnings-ng` was older than 8.4.0.
- In versions v39, v40 and v41 of the Cloud SDK Pipeline, the `productionDeployment` stage did not enable zero downtime deployments by default. For the affected versions it is possible to fix this problem by adding `enableZeroDowntimeDeployment: true` to the `productionDeployment` stage configuration, or by updating to v42 of the Cloud SDK Pipeline.
Improvements
- The results of tests when viewed via the Jenkins Blue Ocean interface are now separated by the stage where the tests have been performed.
In previous versions of the SAP Cloud SDK Pipeline, the same test results could be listed under multiple stages.
Version 41
v41
⚠️ Breaking changes
Configuration option for SAP NPM Registry
The configuration option `sapNpmRegistry` was removed due to the migration of all packages from the SAP NPM registry to the default public registry at npmjs.org. Thus, no separate configuration of the SAP NPM registry is required anymore. Any configuration for the parameter `sapNpmRegistry` will be ignored by the pipeline. If your project requires a custom registry configuration, use the `defaultNpmRegistry` parameter instead. For example:
 npmExecuteScripts:
   defaultNpmRegistry: 'https://registry.npmjs.org/'
-  sapNpmRegistry: 'https://...'
Fixes
The frontend integration tests were not run in versions v39 and v40 of the pipeline, because the technical name of the stage was not correctly passed to the new Go-implemented steps introduced in v39. The stage Frontend Integration Tests was nevertheless shown as successful.
Version 40
v40
New Features
Synopsys Detect Scan (formerly BlackDuck) (Beta)
A new 3rd-party stage was introduced which allows executing Detect scans using the project "Piper" step `detectExecuteScan`. Please note that the step is currently only available in an early version. It might not support all variants of projects out of the box. Furthermore, you might have to configure a Docker image which provides the build tools, e.g. Maven, that you want to use during the scan. The scan can be activated by configuring the step `detectExecuteScan`, for example:
steps:
  detectExecuteScan:
    detectTokenCredentialsId: 'detect-token'
    projectName: 'My Example'
    projectVersion: '1'
    serverUrl: 'https://xyz.blackducksoftware.com'
    dockerImage: 'docker-image'
Fixes
- Some stages, such as `backendIntegrationTests`, can be configured to run with an optional sidecar image. This was broken for a number of releases if the Download Cache was enabled and any steps within the stage made use of it (such as `mavenExecute`). For the time being, Docker containers can be connected to one network only, which means the Download Cache has to be disabled for stages with a sidecar image defined.
- In maven-based MTA projects created from the `scp-cf-spring` archetype, the pipeline installed the wrong `application`-module artifact, which broke the ability to run integration tests. This was fixed in the library.
Improvements
- The reports from a Checkmarx scan in PDF and XML format are archived for a pipeline run.
- For a step which is expected to run with an optional sidecar image, the image may now also be defined in that step's configuration only. This improves performance compared to configuring the sidecar image in the stage, since it avoids running the other steps within the stage with that sidecar as well. ⚠️ If the Download Cache is enabled, sidecar images have to be configured in the stage as before. The Download Cache is enabled by default on CX-Server-based Jenkins instances, except when running Jenkins in a Kubernetes environment.
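As a sketch, a sidecar image could now be configured on the step level instead of the stage level; the parameter names `sidecarImage` and `sidecarName` are taken from the general project "Piper" step configuration, and the step name and values are placeholders:

```yaml
steps:
  npmExecuteScripts:
    sidecarImage: 'postgres:12' # sidecar only started for this step
    sidecarName: 'postgres'
```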
Version 39
v39
⚠️ Breaking changes
In the process of aligning the pipeline configuration with the concepts of other project "Piper" pipelines to deliver a common user experience, the configuration of the SAP Cloud Platform Transport Management upload was moved from the stage `productionDeployment` to the `tmsUpload` step:
 steps:
+  tmsUpload:
+    nodeName: 'TEST'
+    credentialsId: 'TMS-UPLOAD'
+    customDescription: 'A custom description for the node upload'
 stages:
-  productionDeployment:
-    tmsUpload:
-      nodeName: 'TEST'
-      credentialsId: 'TMS-UPLOAD'
-      customDescription: 'A custom description for the node upload'
Stages endToEndTests and productionDeployment
- The `appUrls` property of the stages `endToEndTests` and `productionDeployment` must not be a string anymore. Instead, `appUrls` is required to be a list of maps, where for each `appUrl` the mandatory property `url` is defined. Example:
endToEndTests:
-  appUrls: 'https://my-app-url.com'
+  appUrls:
+    - url: 'https://my-app-url.com'
- In addition, the optional property `parameters`, which can be defined for each `appUrl`, must not be a string anymore. Instead, it is required to be a list of strings, where each string corresponds to one element of the parameters. For example:
endToEndTests:
  appUrls:
    - url: 'https://my-app-url.com'
-     parameters: '--tag scenario1'
+     parameters: ['--tag', 'scenario1']
Version 38
v38
New Features
The pipeline now sets the `branchName` property in non-productive, non-PR branches in Sonar. This allows usage of Sonar in multi-branch setups, as Sonar is made aware of which branch a check refers to. Please note that this is not available in all versions of Sonar.
Fixes
In previous versions, a faulty pipeline configuration could lead to a crash of the project "Piper" Go CLI. In cases where, e.g., a boolean value is expected as a parameter but a map could also be provided, the program attempted to cast the boolean to a map. This specific case is handled now, and the program does not crash anymore.
Version 37
v37
⚠️ Breaking changes
- Support for SourceClear scans was removed from the pipeline. Please use other 3rd-party security scanning tools, such as Fortify, Checkmarx, WhiteSource or SonarQube, instead.
- We reimplemented the mechanism by which custom defaults are retrieved, and implemented some improvements as explained below. Please note that the old parameter `extensionRepository` cannot be used anymore. Instead, please use the option `globalExtensionsRepository`. The option `globalExtensionsRepository` does not allow specifying a version with the option `-b` anymore. Instead, please use the parameter `globalExtensionsVersion` to configure a version. Please note that you can also configure these values as part of your custom defaults / shared configuration. The precedence has not changed. Example:
 general:
-  extensionRepository: 'https://my.git.example/extensions.git -b v35'
+  globalExtensionsRepository: 'https://my.git.example/extensions.git'
+  globalExtensionsVersion: 'v35'
Automatic versioning
Automatic versioning is solely configured via the new step `artifactPrepareVersion`, as documented here. For internal use cases, the pipeline configures this step to also push a tag for each generated version. This makes it necessary to provide the two parameters `gitHttpsCredentialsId` and `gitUserName`. For external use cases, the default is not to push tags. Three types of versioning are supported via the parameter `versioningType`: `cloud`, `cloud_noTag`, and `library`. To disable automatic versioning, set the value to `library`. The pipeline will then pick up the artifact version configured in the build descriptor file, but not generate a new version. If you previously turned off automatic versioning via the parameter `automaticVersioning`, this diff shows the necessary migration of the config file:
 general:
-  automaticVersioning: false
 steps:
+  artifactPrepareVersion:
+    versioningType: 'library'
If you previously configured pushing tags for each new version, this is how the configuration can be migrated:
 steps:
-  artifactSetVersion:
-    commitVersion: true
-    gitCredentialsId: 'Jenkins secret'
-    gitSshUrl: 'repo-URL'
-    gitUserEMail: 'email'
-    gitUserName: 'username'
+  artifactPrepareVersion:
+    versioningType: 'cloud'
+    gitHttpsCredentialsId: 'Jenkins secret'
The repository URL for the project in Jenkins needs to be configured with the `https://` scheme. It will also be used for pushing the tag.
For Maven projects, the step `mavenExecute` is no longer used to set the version. Instead, this is done directly by `artifactPrepareVersion`, which avoids starting new Docker containers and improves performance. If you have defined a project settings file for `mavenExecute` before, you must move this configuration into the general section as follows:
 general:
+  maven:
+    projectSettingsFile: 'settings.xml'
 steps:
-  mavenExecute:
-    projectSettingsFile: 'settings.xml'
The same applies to other options defined for `mavenExecute`.
Improvements
Authenticated Access for Custom Defaults and Global Pipeline Extensions
We updated the download mechanism for custom defaults and global pipeline extensions so that both can also be stored in locations which are secured by basic authentication. If you specify the option `customDefaultsCredentialsId` in the `general` section of the configuration, the username and password will be used for accessing all custom defaults URLs defined in the section `customDefaults`. Please find further details here: https://sap.github.io/jenkins-library/steps/setupCommonPipelineEnvironment/ If you specify the option `globalExtensionsRepositoryCredentialsId` in the `general` section of the configuration, the username and password will be used for cloning the extension repository.
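A configuration sketch combining both options; the credentials ids and the URL are placeholders for your own values, and the top-level `customDefaults` list follows the project "Piper" configuration layout:

```yaml
general:
  customDefaultsCredentialsId: 'my-custom-defaults-credentials'       # Jenkins credentials id (placeholder)
  globalExtensionsRepositoryCredentialsId: 'my-extensions-credentials' # Jenkins credentials id (placeholder)
customDefaults:
  - 'https://my.server.example/shared-config.yml' # URL secured by basic authentication
```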
Specify Global Pipeline Extensions in Shared Config
Now it is possible to configure all configuration options regarding the global extensions as part of your custom defaults / shared configuration.
Lint Stage
The pipeline can be configured to fail based on linting findings using the `failOnError` configuration option. By default, the pipeline does not fail based on lint findings. This option is available when providing a custom linting script or when relying on the default linting of the pipeline. It is not available when using the SAPUI5 best practices linter, which uses thresholds instead.
 steps:
+  npmExecuteLint:
+    failOnError: true
Jenkinsfile
We updated our bootstrapping Jenkinsfile so that it loads the pipeline directly from the library and not from the `cloud-s4-sdk-pipeline` repository anymore. This change improves the efficiency of the pipeline because it decreases the number of repositories checked out and the number of executors used during the pipeline run. The new version can be found here: https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/archetype-resources/Jenkinsfile
To benefit from the improved efficiency, please update your Jenkinsfile like this:
-node {
- deleteDir()
- sh "git clone --depth 1 https://github.com/SAP/cloud-s4-sdk-pipeline.git -b ${pipelineVersion} pipelines"
- load './pipelines/s4sdk-pipeline.groovy'
-}
+library "s4sdk-pipeline-library@${pipelineVersion}"
+cloudSdkPipeline(script: this)
Merged "build" stage with Project "Piper" general purpose pipeline
In an effort to reduce differences between project "Piper" general purpose pipeline and SAP Cloud SDK pipeline, both use the same stage for "Build and Unit Test" now.
This is a change under the hood and it should not require any changes in your project in most cases.
A notable exception is the JavaScript pipeline.
First, the new pipeline does not only run ci scripts in the top-level `package.json` file, but also in sub-directories. This only applies if the `package.json` file implements the respective scripts. Older pipeline versions required you to orchestrate the build inside nested `package.json` files. If your project has a build setup where the top-level `package.json` file takes care of building sub-modules, please take care that this is not done anymore. This is the list of scripts that are automatically executed by the pipeline, if they are implemented:
- `ci-build`
- `ci-backend-unit-test`
- `ci-package`
This might in particular be an issue if any of the mentioned scripts uses lerna. If you still want lerna to orchestrate the execution, make sure to use the mentioned names in the `package.json` file in the root of your project, and to use different names in the sub-modules. You can try this out locally by running `piper npmExecuteScripts --install --runScripts=ci-build,ci-backend-unit-test,ci-package` with the project "Piper" CLI.
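For illustration, a root `package.json` where lerna orchestrates the execution could keep the pipeline's script names at the root while the sub-modules use different names; the lerna invocations and sub-module script names are assumptions, not taken from the release notes:

```json
{
  "scripts": {
    "ci-build": "lerna run build",
    "ci-backend-unit-test": "lerna run backend-unit-test",
    "ci-package": "lerna run package"
  }
}
```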
Second, the new pipeline does not run `ci-package` in an isolated file system anymore. Therefore, make sure that `ci-package` only changes the deployment directory. If your JavaScript/TypeScript project was generated from an older template project, you might have to adjust the `ci-package` command like so:
"scripts": {
- "ci-package": "mkdir -p deployment/ && npm prune --production && cp -r node_modules dist package.json package-lock.json frontend index.html deployment/",
+ "ci-package": "sap-cloud-sdk package --include=dist/**/*,package.json,package-lock.json,frontend/**/*,index.html",
},
"devDependencies": {
+ "@sap-cloud-sdk/cli": "^0.1.9",
}
If you notice any regressions, please let us know by opening an issue.
Version 36
Release v36
This is a maintenance release without exciting new features.
Changes
- Revert to curl for downloading the project "Piper" binary
  - We made use of the HTTP Request Jenkins plugin before, which caused issues in some setups, so we've reverted to using curl as we already did in older versions
- Configure `nodeLabel` in the init stage
  - This might be useful in a distributed setup where you use Jenkins agents to run builds.