Automate the development, testing and execution of SAS (CAS) code in SAS Viya 3.5 with Git and Jenkins. Learn how to create a Jenkins pipeline for a simple end-to-end scenario: load files in CAS, create a star schema as a CAS view, test the star schema and, finally, clean up.
In a previous post, DevOps Applied to SAS Viya 3.5: Run a SAS Program with a Jenkins Pipeline, we covered the basics of Jenkins, Git and SAS Viya. GitLab is used in this post as the Git management software.
The CAS programs we want to run in the Jenkins pipeline are the .sas scripts referenced in the pipeline stages below: load data in CAS, create the star schema, run a functional test and run a technical test.
The process to push the CAS files to Git (GitLab) was covered in DevOps Applied to SAS Viya 3.5: Top Git Commands with Examples.
The Jenkins file is stored in Git (GitLab). Its location is stored in the Jenkins pipeline configuration.
Jenkins builds the pipeline according to the Jenkins file.
Jenkins builds run on the SAS Viya machine defined in the agent label; therefore, you need to define an agent in Jenkins first. Please read DevOps Applied to SAS Viya 3.5: Run a SAS Program with a Jenkins Pipeline for more details.
The Jenkins file has several stages:
pipeline {
    agent { label 'intviya01.race.sas.com' }
    stages {
        stage('Clone GIT on SAS Viya') {
            steps {
                sh 'echo "Hello " `logname`'
            }
        }
        stage('Copy source files') {
            steps {
                sh 'cp -n /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/source_data/* /gelcontent/demo/DM/data/'
            }
        }
        stage('Load in CAS') {
            steps {
                sh '/opt/sas/spre/home/SASFoundation/sas -autoexec "/opt/sas/viya/config/etc/workspaceserver/default/autoexec_deployment.sas" /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/scripts/080_load_CAS.sas -log /tmp/080_load_CAS.log'
            }
        }
        stage('Create Star Schema in CAS') {
            steps {
                sh '/opt/sas/spre/home/SASFoundation/sas -autoexec "/opt/sas/viya/config/etc/workspaceserver/default/autoexec_deployment.sas" /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/scripts/100_create_star_schema.sas -log /tmp/100_create_star_schema.log'
            }
        }
        stage('Functional Test') {
            steps {
                sh '/opt/sas/spre/home/SASFoundation/sas -autoexec "/opt/sas/viya/config/etc/workspaceserver/default/autoexec_deployment.sas" /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/scripts/200_functional_test.sas -log /tmp/200_functional_test.log'
            }
        }
        stage('Technical Test') {
            steps {
                sh '/opt/sas/spre/home/SASFoundation/sas -autoexec "/opt/sas/viya/config/etc/workspaceserver/default/autoexec_deployment.sas" /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/scripts/300_technical_test.sas -log /tmp/300_technical_test.log'
            }
        }
    }
    post {
        always {
            echo 'Job Done!'
        }
    }
}
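The SAS programs called by the sh steps live in the Git repository and are part of the course material. For context, here is a minimal sketch of what a load script such as 080_load_CAS.sas could contain; it is an assumption based on the file names and the Public caslib used elsewhere in this post, not the actual course script:

/* Hypothetical sketch of a CAS load script (the real 080_load_CAS.sas may differ).   */
/* The autoexec passed on the command line is expected to define the CAS host/port.   */
cas casauto;

/* Load the CSV files into the Public caslib and promote them to global scope,        */
/* so that the SAS sessions started by the next Jenkins stages can see them.          */
proc casutil outcaslib="Public";
   load file="/gelcontent/demo/DM/data/customers.csv" casout="customers" promote;
   load file="/gelcontent/demo/DM/data/products.csv"  casout="products"  promote;
   load file="/gelcontent/demo/DM/data/mailorder.csv" casout="mailorder" promote;
   load file="/gelcontent/demo/DM/data/catcode.csv"   casout="catcode"   promote;
quit;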
The Jenkins pipeline from the previous post is reused as is: the pipeline itself does not change. What changes is the Jenkins file, that is, the pipeline definition. You might choose to store the Jenkins file in GitLab (or any version control system); the advantage is that you can reuse the same pipeline over and over.
I am using Blue Ocean, a Jenkins plug-in, to run and visualize the pipelines.
See it in action:
If everything was set up correctly, you will see the result in the latest run.
The Jenkins file has been converted to a pipeline.
You can now consult the individual stages, look at their status, consult the logs, etc.
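For instance, the 'Create Star Schema in CAS' stage runs 100_create_star_schema.sas. The actual script ships with the course material; as a rough, hypothetical sketch, a star schema view could be created in CAS with FedSQL and then promoted so that later stages, which start new SAS sessions, can query it. The column and join key names below are invented for illustration:

/* Hypothetical sketch of 100_create_star_schema.sas; columns and keys are assumptions. */
cas casauto;

proc fedsql sessref=casauto;
   create view Public.star_schema as
      select m.order_id, m.quantity,
             c.customer_name,
             p.product_name,
             k.catcode_description
      from Public.mailorder m
           join Public.customers c on m.customer_id = c.customer_id
           join Public.products  p on m.product_id  = p.product_id
           join Public.catcode   k on p.catcode     = k.catcode;
quit;

/* Promote the view to global scope so the test stages can see it. */
proc casutil incaslib="Public" outcaslib="Public";
   promote casdata="star_schema";
quit;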
With a well-designed automated test, you do not need to perform a visual check. However, let us assume you want to validate the results yourself. The following program lists the files and tables in the target caslib:
cas casauto;
caslib _all_ assign;

* target caslib;
proc casutil incaslib="Public";
   list files;
   list tables;
quit;
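A script such as 200_functional_test.sas can automate this kind of check and make the corresponding Jenkins stage fail when the expected output is missing. A minimal sketch, assuming the star schema view is named star_schema and is promoted to the Public caslib (the real course script may test differently):

/* Hypothetical sketch of a functional test; the view name star_schema is an assumption. */
cas casauto;

proc cas;
   /* exists: 0 = not found, 1 = table, 2 = view */
   table.tableExists result=r / caslib="Public" name="star_schema";
   symputx("star_schema_exists", r.exists);
quit;

%macro check_star_schema;
   %if &star_schema_exists = 0 %then %do;
      %put ERROR: star_schema was not found in caslib Public.;
      /* Abend so the batch SAS session returns a non-zero exit code */
      /* and Jenkins marks the stage as failed.                      */
      %abort abend;
   %end;
   %else %put NOTE: star_schema exists (type code &star_schema_exists).;
%mend check_star_schema;
%check_star_schema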
After validating the test, you could add stages to remove the created objects. Insert the following in the Jenkins file above, inside the stages block, just before its closing brace (that is, before the post section):
        stage('Cleanup CAS tables') {
            steps {
                sh '/opt/sas/spre/home/SASFoundation/sas -autoexec "/opt/sas/viya/config/etc/workspaceserver/default/autoexec_deployment.sas" /opt/sas/devops/workspace/{userid}-PSGEL250-devops-applied-to-sas-viya-3.5/Data-Management/scripts/900_cleanup.sas -log /tmp/900_cleanup.log'
            }
        }
        stage('Cleanup files') {
            steps {
                sh '''
                    rm -f /gelcontent/demo/DM/data/mailorder.csv
                    rm -f /gelcontent/demo/DM/data/customers.csv
                    rm -f /gelcontent/demo/DM/data/products.csv
                    rm -f /gelcontent/demo/DM/data/catcode.csv
                    rm -f /tmp/080_load_CAS.log
                    rm -f /tmp/100_create_star_schema.log
                    rm -f /tmp/200_functional_test.log
                    rm -f /tmp/300_technical_test.log
                '''
            }
        }
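The 900_cleanup.sas script referenced in the 'Cleanup CAS tables' stage is, again, part of the course material. In essence, it only needs to drop the promoted objects; a minimal sketch, assuming the table and view names used earlier in this post:

/* Hypothetical sketch of 900_cleanup.sas: drop the promoted view and tables from Public. */
cas casauto;

proc casutil incaslib="Public";
   droptable casdata="star_schema" quiet;
   droptable casdata="mailorder"   quiet;
   droptable casdata="customers"   quiet;
   droptable casdata="products"    quiet;
   droptable casdata="catcode"     quiet;
quit;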
We automated a simple end-to-end scenario with SAS Viya 3.5, Jenkins and Git (GitLab): load files in CAS, create a star schema as a CAS view, test the star schema and then clean up. A Jenkins pipeline was defined, with a SAS Viya machine as the agent. The Jenkins file containing the pipeline syntax was stored in Git (GitLab), as were the CAS programs. Finally, we ran the Jenkins pipeline, analyzed the results and visually confirmed the results in CAS.
More will follow: how to work with parallel stages in Jenkins, how to surface detailed logs, how to import SAS content, such as SAS Data Studio Plans or SAS Visual Analytics reports. Stay tuned.
Thanks to Mark Thomas, Rob Collum and Stephen Foerster.
Thank you for your time reading this post. Please comment and share your experience with Jenkins, Git and SAS Viya.