If you have ever wondered how to run a SAS Studio Flow in batch, this article provides the approach and the scripts to do it.
Gerry Nelson provides more context in his post Keeping the SAS Administration Command-Line interfaces up-to-date. His post talks about the sas-admin executable, which has been replaced by sas-viya for SAS Viya release 2020.1 and later. Replace sas-admin with sas-viya in his instructions and you should be fine.
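As an illustration of that substitution, the same authentication call looks like this (a sketch only; the exact flags can vary by release, and myprofile is a placeholder profile name):
# sas-admin, as used in Gerry's post (pre-2020.1)
./sas-admin --profile myprofile auth login -user sasadm -password *****
# sas-viya, SAS Viya 2020.1 and later: same call, different executable
./sas-viya --profile myprofile auth login -user sasadm -password *****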
To perform the batch commands, you must have the SAS Viya CLI installed on the machine where you will execute the commands.
The installation instructions can be found here: SAS® Viya®: Using the Command-Line Interface.
The top-level command sas-viya is used to initialize, authenticate the user, and execute the job plug-in.
For a complete plug-in list, see Command-Line Interface: Plug-Ins.
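A call always takes the same shape: global options first, then the plug-in, then the command and its options. For example, a sketch of listing job requests (myenv is a placeholder profile name, and you must authenticate first, as shown below):
# [global options] [plug-in] [command] [command options]
./sas-viya --profile myenv --output json job requests list --limit 5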
Back to the SAS Studio flows:
What is a flow? A flow is a sequence of operations on data. Flows are available in SAS Studio as of SAS Viya release 2020.1.
If you want to know more, please see SAS® Studio with SAS® and SAS® Viya® Programming Documentation / SAS® Studio Flows.
My colleagues will post more content about flows and how to use them in the coming weeks and months.
In the following example, the flow queries a data set, creates a result table, and loads it into a CASLIB.
You can schedule the flow from the interface. Pick a time in the future and choose just one occurrence. At this stage, the schedule transforms the flow into a job request.
Afterwards, you can check the job request and its id in SAS Environment Manager > Jobs and Flows > Scheduling. This id is key to running the flow in batch.
While you can run and schedule the flow from the interface, I will focus on how to run a SAS Studio Flow using scripts and using the SAS Viya CLI mentioned earlier.
The job request contains the SAS Studio flow code. Log in to a terminal as a Kubernetes administrator and run:
# get namespace and host, adapt the value below to your environment
current_namespace=gelenv
INGRESS_SUFFIX=$(hostname -f)
INGRESS_URL=https://${current_namespace}.${INGRESS_SUFFIX}
echo ${INGRESS_URL}
# get certificates (TLS support)
## run just once
mkdir -p ~/.certs
kubectl cp $(kubectl get pod -l app=sas-logon-app -o=jsonpath='{.items[0].metadata.name}'):security/trustedcerts.pem ~/.certs/${current_namespace}_trustedcerts.pem
## run as many times as you authenticate
export SSL_CERT_FILE=~/.certs/${current_namespace}_trustedcerts.pem
export REQUESTS_CA_BUNDLE=${SSL_CERT_FILE}
# go to sas-viya CLI folder
clidir=/opt/sas/viya/home/bin
cd $clidir
# create a profile
export SAS_CLI_PROFILE=${current_namespace}
./sas-viya --profile ${SAS_CLI_PROFILE} profile set-endpoint "${INGRESS_URL}"
./sas-viya --profile ${SAS_CLI_PROFILE} profile toggle-color off
./sas-viya --profile ${SAS_CLI_PROFILE} profile set-output fulljson
# Login and get a token, change the password
./sas-viya --profile ${SAS_CLI_PROFILE} auth login -user sasadm -password *****
Output example:
There is a more elegant and secure approach to log in and get a token; I describe it at the end of this article.
The approach is to filter the job requests by name, then retrieve the id into a variable, jid.
# List the existing job requests
/opt/sas/viya/home/bin/sas-viya --output text job requests list --limit 25
# Read the id of the created job, filter by name
jid=$( /opt/sas/viya/home/bin/sas-viya --output json job requests list --filter 'eq(name,"CarMakeFlow.flw")' | jq -r '.items[]["id"]')
echo $jid
/opt/sas/viya/home/bin/sas-viya job requests execute --id $jid
BINGO, the SAS Studio flow is executed. You can check the last execution update time in SAS Environment Manager > Jobs and Flows > Scheduling.
What happens if you change something in the SAS Studio flow and save it? Will the job request update itself with the new flow code? The answer is no. You will have to re-schedule the flow as a new job request. Now you will have two jobs with the same name, but different ids.
You have to delete the old job, assuming you no longer need that version, and execute the new one.
# Delete previous job using old jid
/opt/sas/viya/home/bin/sas-viya --output text job requests delete --id $jid
## Relist the jobs
/opt/sas/viya/home/bin/sas-viya --output text job requests list --limit 25
# Read the id of the new job, filter by name
jid=$( /opt/sas/viya/home/bin/sas-viya --output json job requests list --filter 'eq(name,"CarMakeFlow.flw")' | jq -r '.items[]["id"]')
echo $jid
# Execute the job request based on the id of the job
/opt/sas/viya/home/bin/sas-viya job requests execute --id $jid
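If you prefer not to delete the old job request right away, another option is to pick the most recently created one. The sketch below assumes each item in the JSON output exposes a creationTimeStamp field; verify that against the output in your environment before relying on it:
# Pick the id of the most recently created job request with that name
jid=$( /opt/sas/viya/home/bin/sas-viya --output json job requests list --filter 'eq(name,"CarMakeFlow.flw")' | jq -r '.items | sort_by(.creationTimeStamp) | last | .id' )
echo $jid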
A more secure way is to avoid hard-coding credentials. You can use an .authinfo file in conjunction with the pyviyatools. See Gerry's post: Introducing the GEL pyviyatools.
# Instead of
# ./sas-viya --profile ${SAS_CLI_PROFILE} auth login -user sasadm -password *****
# Create .authinfo - replace ***** with real credentials
tee ~/.authinfo > /dev/null << EOF
default user sasadm password *****
EOF
chmod 600 ~/.authinfo
# Browse to the pyviyatools folder and use the loginviauthinfo command for authentication
cd ~/admin/pyviyatools
./loginviauthinfo.py -f ~/.authinfo
The pyviyatools installation instructions are found here: INSTALL.md.
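To tie the pieces together, here is a minimal sketch of a reusable script that executes a flow by the name of its job request. It assumes the CLI is installed in /opt/sas/viya/home/bin, the profile is configured, you are already authenticated (auth login or loginviauthinfo.py), jq is available, and exactly one job request matches the name:
#!/bin/bash
# run_flow.sh <job-request-name> : execute a scheduled SAS Studio flow by name
set -euo pipefail
cli=/opt/sas/viya/home/bin/sas-viya
name=${1:?Usage: run_flow.sh <job-request-name>}
# Look up the job request id by name
jid=$( $cli --output json job requests list --filter "eq(name,\"${name}\")" | jq -r '.items[0].id' )
if [ -z "$jid" ] || [ "$jid" = "null" ]; then
  echo "No job request found with name ${name}" >&2
  exit 1
fi
# Execute the job request by id
$cli job requests execute --id "$jid"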
To execute SAS Studio Flows in batch in SAS Viya 2020.1 or later, consider using the sas-viya CLI.
Many thanks to: Gerry Nelson and Mary Kathryn Queen.
Thank you for taking the time to read this article. If you liked it, give it a thumbs up. Please comment and tell us what you think about running flows in batch.
Find more articles from SAS Global Enablement and Learning here.
Thanks. This seems like a lot, just to batch submit a flow. You mention “While you can run and schedule the flow from the interface, I will focus on how to run a SAS Studio Flow using scripts and using the SAS Viya CLI mentioned earlier.” What are the benefits to using the CLI to execute a flow rather than the Studio interface? On a BI server, between SAS Management Console and LSF Flow Manager, I could do everything I wanted for scheduling and executing flows through reasonably friendly visual interfaces. Is the Studio interface more limited, thus requiring use of CLI? I’m still on EG, which never had a batch submit feature. But I heard studio has better batch submit buttons built in??
Hi Quentin,
The architecture in SAS Viya 2020.1 and later is very different from SAS 9.4: the authentication, the security, and so on.
If you need to execute SAS Studio flows in batch, in a CI/CD process, or schedule the execution from outside SAS, then the SAS Viya CLI facilitates your interaction with the Kubernetes cluster where SAS Viya is running.
You could also use REST APIs, but there's more code to write. The SAS Viya CLI is a wrapper on top of the REST APIs.
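For completeness, a raw REST call looks roughly like the sketch below. It assumes the jobExecution service exposes job requests under /jobExecution/jobRequests, that an existing job request can be submitted via /jobExecution/jobs?jobRequestId=..., and that ACCESS_TOKEN already holds a valid token; check the SAS Viya REST API documentation for the exact endpoints before relying on it:
# List job request ids filtered by name (roughly what 'sas-viya job requests list' wraps)
curl -s --cacert ${SSL_CERT_FILE} "${INGRESS_URL}/jobExecution/jobRequests?filter=eq(name,%22CarMakeFlow.flw%22)" -H "Authorization: Bearer ${ACCESS_TOKEN}" -H "Accept: application/json" | jq -r '.items[].id'
# Submit an existing job request for execution (assumed endpoint, verify in the documentation)
curl -s --cacert ${SSL_CERT_FILE} -X POST "${INGRESS_URL}/jobExecution/jobs?jobRequestId=<id>" -H "Authorization: Bearer ${ACCESS_TOKEN}"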
It may look like much code, but it can be scripted. If it can be scripted, it can be automated. There is nothing more annoying than a button that you need to keep pushing.
If you ever need to integrate flows from SAS Viya in a broader enterprise flow, then the SAS Viya CLI is the way to do it.
Gotcha, thanks. Hopefully I'll be playing in Viya before too long. I do hope there is a batch submit button in Studio, as this was one of the top-voted EG enhancement ballot requests for years. As I'm not worried about incorporating my flows in other non-SAS flows, as a SAS developer (not admin) I'm mostly focused on SAS giving me an easy way to batch submit (so that I can make sure my job runs in a clean session, and I get a log file), and a reasonable way to schedule jobs. On non-BI linux server, batch submitting was of course easy, and scheduling was just crontab. On BI server, both of these became a bit more of a headache, to schedule a SAS program to run I had to write the program, then put it in a DI job, then deploy the job for scheduling, blah blah. I'm just hoping the Viya interfaces for batch submit / scheduling won't be much more complex than EBI. From the "Jobs and Flows" screenshot in your article, looks similar.
What is the best way to diagnose/catch errors and review logs from the batch job and/or programs run as part of the flow? Would the information be in SAS Environment Manager > Jobs and Flows? What roles would be necessary to access? I'm assuming the logs wouldn't be accessible on disk?
@kp21 from the linked article, https://communities.sas.com/t5/SAS-Communities-Library/How-to-Run-SAS-Programs-in-Batch-in-the-New-S... , it looks like you can retrieve the logs from a batch submit. If I'm reading correctly. (If you couldn't get the logs, could you really call it a batch submit? : )