Migrating content in SAS Viya can be straightforward when resources are stored in folders: you simply use SAS Environment Manager or the transfer plugin of the command-line interface (CLI). However, not all resources live in folders. Job flows, along with many of their dependent resources, are managed in the Jobs and Flows area of SAS Environment Manager and are not stored in folders. This makes their migration more complex: you must use the sas-viya CLI (or the sas-admin CLI in Viya 3 environments) and understand and navigate the dependent resources that make up a flow.
In this post, we revisit the topic of job flow migration (originally covered for SAS Viya 3) and extend it to both Viya 3 and Viya 4. More importantly, we introduce a new Python-based tool (exportjobflow.py) that automates the manual steps, making job flow migration faster, more reliable, and less error-prone.
A SAS Viya job flow, created and managed in the Jobs and Flows page of SAS Environment Manager, is a structured sequence of jobs that execute in a defined order, often with dependencies and conditional logic. It organizes multiple individual jobs into a workflow, ensuring tasks run in the correct sequence and handling dependencies or branching logic based on the success or failure of each task. Job flows are managed through SAS Viya’s Job Execution Service. They can be triggered manually or scheduled to run at specific times, making them ideal for automating recurring processes.
A job flow consists of many dependent resources. When migrating a job flow, you must migrate the flow and all its dependent resources.
These resources include:
job flows (/jobFlowScheduling/flows): a flow can contain other job flows
job actions (/jobFlowScheduling/jobs): a job action is a node in a job flow. Job actions are created when you add a job request to a flow.
job requests (/jobExecution/jobRequests): a job request adds runtime information to a job definition. The job request either includes the job definition to be executed or links to it (via its URI).
job definitions (/jobDefinitions/definitions): contain metadata about a job, including the code, the type of job, and any parameters.
Of these resources, only job definitions can be stored in folders, although there are cases where job definitions are created but not surfaced in folders.
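Putting these together, a flow's dependencies form a simple chain. Here is a rough sketch of how the resources relate, based on the list above (remember that a flow can also nest other flows):

/jobFlowScheduling/flows                        (the job flow)
  └─ /jobFlowScheduling/jobs                    (one job action per node in the flow)
       └─ /jobExecution/jobRequests             (the job request the action points to)
            └─ /jobDefinitions/definitions      (the job definition the request points to)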
A key step in exporting is finding the URIs of the resources you wish to export. In SAS Viya, a Uniform Resource Identifier (URI) identifies a unique resource. There are different types of URIs; for migration, we need to be aware of content URIs. The format of the URI is the service name, followed by the endpoint, and then a unique ID. Here are three example URIs: a folder, a report, and a job flow.
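(The folder and report IDs below are placeholders for illustration; the job flow URI is the example flow used later in this post.)

/folders/folders/0f1e2d3c-4b5a-6978-8796-a5b4c3d2e1f0
/reports/reports/1a2b3c4d-5e6f-7081-92a3-b4c5d6e7f809
/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad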
URIs and IDs (the last part of the URI) are important for content migration because the export command of the transfer plugin uses the URI to identify what to export. The URI can be passed on the command line, or you can pass a list of URIs in a JSON request file; a sketch of both approaches follows. To export a job flow, we need to create a JSON request file that contains a list of all the URIs for the resources that comprise the flow (other flows, job actions, job requests, job definitions, and so on). The out-of-the-box process for doing this is manual: you use the CLI to determine the URIs of all related resources and build the request JSON file by hand. I cover this process in the section "Export a flow manually" below. In the next section, I introduce a new tool, exportjobflow.py, that simplifies and automates the process.
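Before diving into the tool, here is a quick sketch of the two ways the transfer plugin accepts input. The folder ID is a placeholder, and option names can vary slightly between CLI releases, so confirm them with sas-viya transfer export --help:

# Export a folder (and its contents) by passing its content URI on the command line
sas-viya --output text transfer export --resource-uri /folders/folders/0f1e2d3c-4b5a-6978-8796-a5b4c3d2e1f0

# Export a job flow and its dependent resources by passing a JSON request file
sas-viya --output text transfer export --request @/tmp/request_jobflow.json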
exportjobflow.py
pyviyatools is a collection of Python-based command-line utilities developed by SAS to help administrators and developers interact with SAS Viya environments. These tools utilize the SAS Viya REST APIs to automate common administrative tasks, facilitating easier management and migration of Viya resources. The tools are available from the SAS GitHub repository. A new tool, exportjobflow.py, automates the export of one or more SAS Viya job flows. The tool:
navigates the dependencies of the job flows
builds a JSON request file for each flow
exports each flow to a SAS Viya package
The tool accepts either a unique flow name or a JSON file with a list of flows. It then loops through the flows and exports each one to a SAS Viya package. The JSON file can be easily created using the job plugin of the sas-viya CLI. In the first example, we export a single flow, the same flow used in the manual process documented below. The flow can be exported with one command.
python exportjobflow.py -fn HRAnalysysProject_Job_Flow_001 -d /tmp/viyapackages -t --debug
The tool:
builds a JSON request file for the flow and its dependent resources
creates a SAS Viya transfer package based on the content of the JSON request file
The output from the command displays the content of the JSON request file that is built. These are the resources that make up the SAS job flow. The Viya package is created in the specified directory, with the same name as the job flow.
{
"version": 1,
"name": "HRAnalysysProject_Job_Flow_001",
"description": "Created from pyviyatools flow name is:HRAnalysysProject_Job_Flow_001",
"items": [
"/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad",
"/jobFlowScheduling/jobs/836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"/jobExecution/jobRequests/b8160013-e27f-4295-8fc3-68ad5a379ef1",
"/jobDefinitions/definitions/9edb4f3b-09e6-4242-9354-31716d27c89c",
"/jobFlowScheduling/jobs/ec467035-4e23-4fef-8ea1-3fc5f592782b",
"/jobExecution/jobRequests/5238efec-8288-4ce6-a2c9-c90da11cfcab",
"/jobDefinitions/definitions/e999e40a-1a44-453b-ab67-4ce38325d0ce",
"/jobFlowScheduling/jobs/311caef0-ecb4-4572-8a85-5e91984ce0f2",
"/jobExecution/jobRequests/7af85970-bea3-4348-9fac-4110271e02c9",
"/jobDefinitions/definitions/8f31b9df-959a-4937-ba37-8c4f1a8fe77a",
"/jobFlowScheduling/jobs/54efb035-5baa-4f8c-9e3d-11821bc7ef41",
"/jobExecution/jobRequests/08d82c49-98d3-4b70-83ab-7c38c9d466f7",
"/jobDefinitions/definitions/396cfff1-a97c-4c57-b91c-4eb720679181"
]
}
NOTE: Viya Job Flow and dependent objects HRAnalysysProject_Job_Flow_001 exported to json file /tmp/viyapackages/HRAnalysysProject_Job_Flow_001.json
To export a series of flows, use the job plugin of the sas-viya CLI to create a list of job flows in JSON format.
sas-viya --output json job flows list --filter 'startsWith(name,"HR")' > /tmp/HRjoblist.json
The list of flows is output to a JSON file.
{
"items": [
{
"description": "HR Analysis Job FLow",
"id": "672ecd0c-d967-4650-83c4-d975d66fe1ad",
"name": "HRAnalysysProject_Job_Flow_001",
"version": 1
},
{
"description": "Analyze employee survey data",
"id": "6b7fb10d-360b-4ddf-9d82-1ab03e0b798d",
"name": "HREmployeeSatisfactionFlow",
"version": 1
}
]
}
NOTE: When creating a file with a list of flows to pass to the tool, you must use the --output json option on the CLI command.
Pass the JSON file containing the list of flows to exportjobflow.py to export each flow to a package file.
python exportjobflow.py -ff /tmp/HRjoblist.json -d /tmp/viyapackages -t --debug
The output will show the number of flows processed.
.......
"/jobExecution/jobRequests/36228cbc-9194-40ae-a990-0ec4a375c11d",
"/jobDefinitions/definitions/f93627e5-0fcf-4177-9f27-feeb7e33d40f"
]
}
NOTE: Viya Job Flow HREmployeeSatisfactionFlow and dependent objects exported to json file /tmp/viyapackages/HREmployeeSatisfactionFlow.json
NOTE : total processed flows = 2
The tool works in both Viya 3 and Viya 4. One caveat: if your job flow contains a sub-flow, the sub-flow must be imported before the parent flow.
Read on to see the manual process of exporting a job flow.
Export a flow manually
To export the flow and all its dependent resources, you need to create a request file that lists all the resources the flow requires. The request file contains the Uniform Resource Identifiers (URIs) of the resources to export. For job flows, this involves using the job plugin of the sas-viya CLI to find the URIs and then adding them to the request file.
Get the flow URI
Firstly, let's get the URI of the flow itself. Use the flows command of the job plugin of the CLI to list flows. Pass a filter that makes sense for your environment, based on the flow name.
sas-viya --output text job flows list --filter 'startsWith(name,"HRAn")'
Id Name Version Description
672ecd0c-d967-4650-83c4-d975d66fe1ad HRAnalysysProject_Job_Flow_001 1 HR Analysis Job FLow
You can also get the ID from the SAS Environment Manager Jobs and Flows page by editing the flow.
Using the ID, we can retrieve the flow’s URI along with detailed information about the flow, including its associated resources. The flow URI starts with /jobFlowScheduling/flows.
sas-viya --output json job flows show --id 672ecd0c-d967-4650-83c4-d975d66fe1ad
The output shows all the properties of the flow. In the "self" entry in the links section of the output, the full URI of the flow is listed. It is /jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad
"links": [
{
"href": "/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad",
"method": "GET",
"rel": "self",
"type": "",
"uri": "/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad"
},
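If you only want the URI, you can filter the links array with jq. A quick sketch (assuming jq is installed):

sas-viya --output json job flows show --id 672ecd0c-d967-4650-83c4-d975d66fe1ad | jq -r '.links[] | select(.rel=="self") | .uri'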
Create a JSON request file and add this URI. The rest of the process will add other required URIs to the requests file.
{
"version": 1,
"name": "HRAnalysysProject_Job_Flow_001",
"description": "Building a request file for transfer",
"items": [
"/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad"
}
Get Job Actions
To continue building the request JSON file, navigate through the dependent resources of the flow. A flow contains a set of job resources in its jobs list. The list can contain sub-flows, as well as job actions. The command below shows the details of the flow.
sas-viya --output json job flows show --id 672ecd0c-d967-4650-83c4-d975d66fe1ad
The output shows all the properties of the flow. The links output is removed for clarity.
{
"createdBy": "geladm",
"creationTimestamp": "2025-12-12T17:44:00.816Z",
"defaultJobProperties": {},
"dependencies": [
{
"event": {
"condition": "all",
"events": [
{
"expression": "success('3_LoadFormatsInSAS')",
"type": "jobevent"
},
{
"expression": "success('2_CreateDataInSAS')",
"type": "jobevent"
}
],
"type": "gate"
},
"target": "4_LoadDataInCAS"
},
{
"event": {
"expression": "success('1_CreateFormatsInSAS')",
"type": "jobevent"
},
"target": "2_CreateDataInSAS"
},
{
"event": {
"expression": "success('1_CreateFormatsInSAS')",
"type": "jobevent"
},
"target": "3_LoadFormatsInSAS"
}
],
"description": "HR Analysis Job FLow",
"flowProperties": {
"completionLevel": "default",
"endBehavior": "stop",
"exitcodeStrategy": "last"
},
"id": "672ecd0c-d967-4650-83c4-d975d66fe1ad",
"jobs": [
"/jobFlowScheduling/jobs/836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"/jobFlowScheduling/jobs/ec467035-4e23-4fef-8ea1-3fc5f592782b",
"/jobFlowScheduling/jobs/311caef0-ecb4-4572-8a85-5e91984ce0f2",
"/jobFlowScheduling/jobs/54efb035-5baa-4f8c-9e3d-11821bc7ef41"
],
"modifiedBy": "geladm",
"modifiedTimestamp": "2025-12-12T17:44:00.830Z",
"name": "HRAnalysysProject_Job_Flow_001",
"schedulerId": "166ed8bc-4ea9-407a-b7e6-b22dfb516720",
"triggerCondition": "any",
"triggerType": "event",
"triggers": [
{
"active": true,
"event": {
"duration": 100,
"hours": "08",
"maxOccurrence": 3,
"minutes": "15",
"name": "Runs every day effective 11/26/25 at 08:15(America/New_York)",
"recurrence": {
"dayOfMonth": 0,
"skipCount": 1,
"startDate": "2025-11-26",
"type": "daily"
},
"timeZone": "America/New_York"
},
"name": "Runs every day effective 11/26/25 at 08:15(America/New_York)",
"type": "timeevent"
}
],
"version": 1
}
The command below passes the output to a jq query to return just the list of job action resources.
sas-viya --output json job flows show --id 672ecd0c-d967-4650-83c4-d975d66fe1ad | jq '.jobs'
Expected output:
[
"/jobFlowScheduling/jobs/836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"/jobFlowScheduling/jobs/ec467035-4e23-4fef-8ea1-3fc5f592782b",
"/jobFlowScheduling/jobs/311caef0-ecb4-4572-8a85-5e91984ce0f2",
"/jobFlowScheduling/jobs/54efb035-5baa-4f8c-9e3d-11821bc7ef41"
]
NOTE: The jobs list can contain flows that are also members of this flow. If it does, you will need to follow this process to migrate that flow as well.
Add the four items in the jobs list to the request file. Each URI in the jobs list starts with /jobFlowScheduling/jobs/, which indicates that the flow contains four job actions.
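At this stage, the request file would look like the sketch below; the job request and job definition URIs for each action are added in the next step.

{
"version": 1,
"name": "HRAnalysysProject_Job_Flow_001",
"description": "Building a request file for transfer",
"items": [
"/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad",
"/jobFlowScheduling/jobs/836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"/jobFlowScheduling/jobs/ec467035-4e23-4fef-8ea1-3fc5f592782b",
"/jobFlowScheduling/jobs/311caef0-ecb4-4572-8a85-5e91984ce0f2",
"/jobFlowScheduling/jobs/54efb035-5baa-4f8c-9e3d-11821bc7ef41"
]
}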
Get the Job Request and Job Definition
The job action can be linked to a Job request and/or Job definition. If these exist, we must retrieve them and add them to the requests JSON file. This process must be repeated for each job action.
Firstly, use the actions command of the job plugin to get the URI of the jobRequest for the action.
sas-viya --output json job actions show --id 836b3fd7-820b-4761-b15d-1b4dc7ffa7b4
Expected output (partial):
{
"createdBy": "geladm",
"creationTimestamp": "2025-12-12T17:44:00.490Z",
"id": "836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"jobRequestUri": "/jobExecution/jobRequests/b8160013-e27f-4295-8fc3-68ad5a379ef1",
"links": [ ... ],
"modifiedBy": "geladm",
"modifiedTimestamp": "2025-12-12T17:44:00.490Z",
"name": "1_CreateFormatsInSAS",
"priority": "none",
"version": 1
}
You can use a jq query to return just the job request URI.
sas-viya --output json job actions show --id 836b3fd7-820b-4761-b15d-1b4dc7ffa7b4 | jq '.jobRequestUri'
For each job request there will be a job definition. Use the requests command of the job plugin to show the job request; its jobDefinitionUri property identifies the job definition. Add that URI to the JSON request file.
sas-viya --output json job requests show --id b8160013-e27f-4295-8fc3-68ad5a379ef1
Expected output (partial):
{
"arguments": {
"_ODSOPTIONS": "OPTIONS(BITMAP_MODE='INLINE' SVG_MODE='INLINE' CSS_PREFIX='.ods_d2b292b3-ec7a-45ad-bea2-81d01c0e71b4' BODY_ID='div_d2b292b3-ec7a-45ad-bea2-81d01c0e71b4')",
"_URL": "",
"_addJesBeginEndMacros": "true",
"_contextName": "SAS Studio compute context"
},
"createdBy": "geladm",
"creationTimestamp": "2025-12-12T17:44:00.163Z",
"expiresAfter": "PT96H",
"id": "b8160013-e27f-4295-8fc3-68ad5a379ef1",
"jobDefinitionUri": "/jobDefinitions/definitions/9edb4f3b-09e6-4242-9354-31716d27c89c",
"modifiedBy": "geladm",
"modifiedTimestamp": "2025-12-12T17:44:00.164Z",
"name": "1_CreateFormatsInSAS",
"version": 3
}
You can use a jq query to return just the job definition URI.
sas-viya --output json job requests show --id b8160013-e27f-4295-8fc3-68ad5a379ef1 | jq '.jobDefinitionUri'
"/jobDefinitions/definitions/9edb4f3b-09e6-4242-9354-31716d27c89c"
Repeat the process of finding the job request URI and job definition URI for each job action, and add each URI to the request file. A small shell loop that automates this repetition is sketched below.
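This is a minimal sketch that reuses the commands above for the four job action IDs in this example flow; it assumes jq is installed and that every job action has a job request, as is the case here.

for actionid in 836b3fd7-820b-4761-b15d-1b4dc7ffa7b4 \
                ec467035-4e23-4fef-8ea1-3fc5f592782b \
                311caef0-ecb4-4572-8a85-5e91984ce0f2 \
                54efb035-5baa-4f8c-9e3d-11821bc7ef41; do
   # URI of the job action itself
   echo "/jobFlowScheduling/jobs/${actionid}"
   # job request linked to the action
   requesturi=$(sas-viya --output json job actions show --id "${actionid}" | jq -r '.jobRequestUri')
   echo "${requesturi}"
   # job definition linked to the request (the request ID is the last segment of the request URI)
   definitionuri=$(sas-viya --output json job requests show --id "${requesturi##*/}" | jq -r '.jobDefinitionUri')
   echo "${definitionuri}"
done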
Export Job Flow and its Components
Here is the request file that this process builds. Now we will use the request file to export the job flow.
tee /tmp/request_jobflow.json > /dev/null <<EOF
{
"version": 1,
"name": "HRAnalysysProject_Job_Flow_001",
"description": "Request to export flow name :HRAnalysysProject_Job_Flow_001",
"items": [
"/jobFlowScheduling/flows/672ecd0c-d967-4650-83c4-d975d66fe1ad",
"/jobFlowScheduling/jobs/836b3fd7-820b-4761-b15d-1b4dc7ffa7b4",
"/jobExecution/jobRequests/b8160013-e27f-4295-8fc3-68ad5a379ef1",
"/jobDefinitions/definitions/9edb4f3b-09e6-4242-9354-31716d27c89c",
"/jobFlowScheduling/jobs/ec467035-4e23-4fef-8ea1-3fc5f592782b",
"/jobExecution/jobRequests/5238efec-8288-4ce6-a2c9-c90da11cfcab",
"/jobDefinitions/definitions/e999e40a-1a44-453b-ab67-4ce38325d0ce",
"/jobFlowScheduling/jobs/311caef0-ecb4-4572-8a85-5e91984ce0f2",
"/jobExecution/jobRequests/7af85970-bea3-4348-9fac-4110271e02c9",
"/jobDefinitions/definitions/8f31b9df-959a-4937-ba37-8c4f1a8fe77a",
"/jobFlowScheduling/jobs/54efb035-5baa-4f8c-9e3d-11821bc7ef41",
"/jobExecution/jobRequests/08d82c49-98d3-4b70-83ab-7c38c9d466f7",
"/jobDefinitions/definitions/396cfff1-a97c-4c57-b91c-4eb720679181"
]
}
EOF
Now you can use the sas-viya transfer export command with the --request parameter to create an export package containing the objects defined in `/tmp/request_jobflow.json`.
sas-viya --output text transfer export --request @/tmp/request_jobflow.json
Store the package ID in a variable and download the package file.
packageid=$(sas-viya --output json transfer list --name HRAnalysysProject_Job_Flow_001 | jq -r '.items[]["id"]')
sas-viya transfer download --file /tmp/viyapackages/jobflow.json --id $packageid
Importing
In your target environment, you can import the packages with SAS Environment Manager or the sas-viya CLI, as sketched below. pyviyatools also includes importpackages.py, a tool that loops through a set of packages in a directory and imports them.
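With the CLI, the typical pattern is to upload each package file and then import it by ID. A minimal sketch; exact option names can vary by release, so confirm them with sas-viya transfer upload --help and sas-viya transfer import --help:

# Upload the downloaded package file to the target environment; the output includes the new package ID
sas-viya --output json transfer upload --file /tmp/viyapackages/jobflow.json

# Import the uploaded package by its ID (substitute the ID returned by the upload step)
sas-viya transfer import --id <package-id>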
Wrap Up
Migrating SAS Viya job flows requires careful handling of multiple dependent resources, including flows, job actions, job requests, and job definitions. While the manual process involves identifying URIs for each component and building a JSON request file, the new exportjobflow.py tool from the pyviyatools collection significantly streamlines this task. Automating dependency discovery and package creation reduces complexity and saves time, whether you are migrating a single flow or multiple flows. This approach ensures a more efficient and reliable migration process across SAS Viya 3 and Viya 4 environments.
Supplemental Resources
Here are some additional resources related to jobs and flows, as well as migration from Viya 3 to Viya 4.
Jobs and Flows official Documentation
SAS Viya Platform: Jobs and Flows
Jobs and Flows: Concepts
Schedule Job Flows
Jobs and Flows Tutorials & Blog Posts
Getting Started with Job Scheduling in SAS Viya
Tips for Scheduling Flows with SAS Viya Environment Manager
Jobs and Flows Training
Working with SAS® Viya® Jobs
Working with SAS Viya Jobs: Advanced
Scheduling and Orchestrating SAS® Programs and SAS® Studio Flows with SAS® Environment Manager Jobs and Flows
Managing and Querying Data Using Flows in SAS Studio
Using SAS® Studio Flows and Custom Steps in SAS® Viya® Fast Track
Migration
System Migration and Content Migration – SAS Help Center
Migration on the SAS Viya Platform: Reference
SAS Viya 3 Content Assessment
SAS Viya Job object model
Selective Backup and Restore of Viya Content
pyviyatools
You CAN take it with you! Saving SAS Viya Content
Find more articles from SAS Global Enablement and Learning here.