SAS Data Integration Studio, DataFlux Data Management Studio, SAS/ACCESS, SAS Data Loader for Hadoop and others

Migration of SAS Flows

Occasional Contributor
Posts: 5

Migration of SAS Flows

Hi All,

We need to migrate our SAS flows from the Development to the System Testing environment. Right now, all our SAS flows are created under SAS Management Console. Please help us migrate these flows to the other environment.

We are using a SAS Batch Server, and the log path for each SAS flow is defined on its own and can be changed using Advanced options. Is there any option to parameterize those log paths?

Respected Advisor
Posts: 4,173

Re: Migration of SAS Flows

OS? SAS version, SMC version?

Occasional Contributor
Posts: 5

Re: Migration of SAS Flows

Hi Patrick

SAS Server Version 9.2 (TS2M3)

OS Version : Linux 2.6.18-194.3.1.el5 (LIN X64) platform

SMC 9.2

SAS DI 4.2


Respected Advisor
Posts: 4,173

Re: Migration of SAS Flows

I'm not that happy with how the term "scheduling" is used within SAS DIS/SMC.

If I understand this right, you're simply talking about creating .sas files out of SAS metadata. This is, unfortunately, called "scheduling". You need a SAS server (the SAS Batch Server) to create these .sas files.

When moving to another environment (code migration), what you normally do is export your metadata from the source environment (creating .spk files), move these files to your target environment, import the .spk files into the target environment (I'm using DIS for this), and then re-deploy your jobs (= scheduling in the target environment, which creates new .sas files there).
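The export/import step can also be scripted with the platform's batch ExportPackage/ImportPackage tools instead of the wizards. This is only a hedged sketch: the install path, connection profile names, and metadata folder below are hypothetical placeholders, and the flags should be verified against your own 9.2 installation. The commands are echoed rather than executed so the sketch stands alone.

```shell
#!/bin/sh
# Hedged sketch: ExportPackage/ImportPackage are the SAS 9.2 batch
# metadata promotion tools. SASHOME, the profile names, and the
# metadata folder are hypothetical -- verify flags on your site.
SASHOME=/usr/local/SASHome                      # hypothetical install root
TOOLS=$SASHOME/SASPlatformObjectFramework/9.2   # hypothetical tool path

# Export the jobs/flows from Dev into a .spk package (echoed only):
echo "$TOOLS/ExportPackage -profile DevProfile \
  -package /tmp/flows.spk -objects '/Shared Data/Jobs'"

# Import the same package into System Test (echoed only):
echo "$TOOLS/ImportPackage -profile SysTestProfile \
  -package /tmp/flows.spk -target '/Shared Data/Jobs'"
```

After the import you would still re-deploy the jobs in the target environment, as described above.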

What "scheduling" also does is create additional scheduling objects. If you're using LSF for scheduling (and here comes the terminology mess), then you use these objects to build your scheduling flows in SMC (not job flows now, but scheduling flows where you define how the jobs are executed). So that's where the question comes in: what scheduler are you using?

I don't see a reason to change the default path for the batch server logs. Those are only the logs about "scheduling" your jobs. I can see a lot of reasons why you might want to define the log location for the jobs that are executed regularly. Whatever scheduler you're using (LSF or something else), you will batch-submit the .sas files with a command line like: sas.sh -prog <program name> ....

There are a lot of additional parameters possible. sas.sh sets a lot of the values if they are not passed explicitly, and I believe the location for logs is one of them. So if you want to define the location of the log, you do something like: sas.sh -prog <path/program name> -log <path/logname> ....
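Put together, the batch-submit command with an explicit log location might look like the sketch below. The `-prog`/`-log` flags are as described above; the program and log paths are hypothetical placeholders, and the final command is echoed instead of executed so the sketch is self-contained.

```shell
#!/bin/sh
# Sketch only: sas.sh with -prog/-log as described above.
# PROG and LOGDIR are hypothetical placeholders for your environment.
PROG=/sas/jobs/load_customers.sas      # hypothetical deployed job
LOGDIR=/sas/logs/dev                   # hypothetical log location

# Timestamp the log name so repeated runs don't overwrite each other:
LOGFILE="$LOGDIR/$(basename "$PROG" .sas)_$(date +%Y%m%d_%H%M%S).log"

# The command you would batch-submit (echoed here, not executed):
echo sas.sh -prog "$PROG" -log "$LOGFILE"
```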

You could use a UNIX environment variable as part of the pathname for the log. If you want to do this then I would set this environment variable in the .profile of the user under which the process runs.

One thing I've done more than once is to define a UNIX environment variable $LEV, as this is normally the only part that changes between environments. I then use this environment variable both for batch submitting and within SAS code using SYMGET().
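The $LEV idea above can be sketched as follows. This is an illustration only: the config directory layout is hypothetical, and the variable is given a fallback value so the sketch runs stand-alone; in practice you would export it from the batch user's ~/.profile as described.

```shell
#!/bin/sh
# Sketch of the $LEV environment-variable approach described above.
# In the batch user's ~/.profile you would set, for example:
#   export LEV=Lev1     # Dev; Lev2 for System Test, and so on
LEV=${LEV:-Lev1}        # fallback so this sketch runs on its own

# Hypothetical config layout: only $LEV changes between environments.
LOGDIR=/sas/config/$LEV/SASApp/BatchServer/Logs
echo "Logs would go to: $LOGDIR"

# Within SAS code, the same value can then be picked up from the
# environment, as mentioned in the post above.
```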

Super User
Posts: 5,432

Re: Migration of SAS Flows

If it's "normal" promotion, use the export/import wizards for both jobs and flows.

In the target environment you should redeploy the moved jobs.

Why do you need to parameterize the log path?

What scheduler do you use? In 9.3 you can set environment variables on the flow, but I believe that doesn't work for operating system services (cron, Windows Task Scheduler).

Data never sleeps