
Deploying and scheduling jobs programmatically (9.2)

Occasional Contributor
Posts: 17

Deploying and scheduling jobs programmatically (9.2)

Hi,

Is it possible to deploy SAS DI jobs programmatically and then add them to a predefined job flow?  I want to set up an LSF job flow for ad hoc jobs that runs after our normal ETL batches have finished, and have the jobs added to it without user intervention.

Cheers in advance

Super Contributor
Posts: 356

Deploying and scheduling jobs programmatically (9.2)

Hi.

It may not be the best option, but you could:

Schedule a flow that runs a job that dynamically builds %include statements.  The job could look at a specific folder where all the ad hoc programs are saved and run each of them.  Of course, you'll need to move each ad hoc program after it has been run, otherwise it will be run again the next day.

However, doing this you lose all the good things about metadata, i.e. impact analysis etc.
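Something along these lines, as a minimal sketch only: the /sas/adhoc and /sas/adhoc/archive directories are placeholders, and the pipe and systask statements assume XCMD is allowed for the batch server.

filename adhoc pipe 'ls /sas/adhoc/*.sas 2>/dev/null';

data _null_;
   infile adhoc truncover;
   input pgm $256.;
   /* run each ad hoc program ... */
   call execute(cats('%include "', pgm, '";'));
   /* ... then move it aside so it is not picked up again tomorrow */
   call execute(cats('systask command "mv ', pgm, ' /sas/adhoc/archive/" wait;'));
run;

filename adhoc clear;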

Regards

Barry

Occasional Contributor
Posts: 17

Deploying and scheduling jobs programmatically (9.2)

I was hoping to avoid that and have it visible via the Job Schedule plugin in SMC...

A colleague and I did a similar thing a while back for scheduling base SAS files and EG projects. It was basically a DI job that picked up all the Enterprise Guide and .sas files from a folder available in the Folders tab of EG and created a number of .sas files with includes for each of them in four predefined job flows.  It had some smarts to make sure we didn't start a cottage industry of regularly scheduled EG projects performing ETL tasks.  It was great because it worked, but it gave very little visibility of what is running via standard monitoring.  All the monitoring had to be done by manipulating log files or through predefined error and status handling that the user had to add to the code, and most EG users don't bother with such rigor.

I guess this raises another question:

Can the deployment location for DI jobs be controlled via authorisations/security?  E.g. users in group A can deploy to batch folder x, while users in group B can deploy to batch folder y?  Hmm, I may have to try that out...

Cam

Super User
Posts: 5,260

Deploying and scheduling jobs programmatically (9.2)

Since most of this is metadata, I think it should be possible to program using the metadata API.

But after adding jobs to a flow, the flow has to be rescheduled so that LSF will be aware of the new jobs. Whether that's possible programmatically, I don't know.
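For example, a minimal PROC METADATA sketch that just lists scheduling objects by Id and Name (the JFJob type is used purely as an illustration, and the metadata connection options METASERVER=, METAPORT=, METAUSER= and METAPASS= are assumed to be set already):

filename request temp;
filename response temp;

data _null_;
   file request;
   put '<GetMetadataObjects>'
     / '  <Reposid>$METAREPOSITORY</Reposid>'
     / '  <Type>JFJob</Type>'
     / '  <Objects/>'
     / '  <NS>SAS</NS>'
     / '  <Flags>0</Flags>'
     / '  <Options/>'
     / '</GetMetadataObjects>';
run;

proc metadata in=request out=response;
run;

/* dump the raw XML response to the log for inspection */
data _null_;
   infile response;
   input;
   put _infile_;
run;

Adding or changing objects would then be an AddMetadata/UpdateMetadata request in the same style, but whether that alone makes LSF pick up the change is exactly the open question.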

/Linus

Data never sleeps
Occasional Contributor
Posts: 17

Deploying and scheduling jobs programmatically (9.2)

Issuing a command to bsub should fix the issue with resubmitting it.  I doubt the metadata definition for a submitted job would actually permeate through from issuing a bsub command, though.  I think it would need to be a two-stage process:

  • Issue the bsub command (rough sketch below).
  • Issue an Add/Update Metadata request via PROC METADATA etc. to modify the objects dealing with job schedules.  (I can't remember exactly, but I think it's JFJob and another object as well.)
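A rough sketch of stage 1 only, assuming XCMD is allowed; the job name, the sasbatch.sh path and the deployed program below are all placeholders, and stage 2 would be a PROC METADATA AddMetadata/UpdateMetadata request against the scheduling objects.

systask command
   'bsub -J adhoc_job "/sas/config/Lev1/SASApp/BatchServer/sasbatch.sh -batch -noterminal -sysin /sas/deploy/adhoc/adhoc_job.sas"'
   wait status=bsubrc;

%put NOTE: bsub returned &bsubrc;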

Cam

SAS Employee
Posts: 36

Deploying and scheduling jobs programmatically (9.2)

In 9.3 the answer to this question is yes: we have added a command-line interface for deploying DI jobs programmatically.  You can reference the online Help or the DI Studio User's Guide for the specifics, and there is a wrapper script that handles the details which you can call from a command line.
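As a hedged sketch only (host, port, credentials, folder and server names are placeholders, and the exact DeployJobs options should be checked against the DI Studio User's Guide), a call to the batch deployment tool from SAS might look something like this:

x 'DeployJobs -host metaserver.example.com -port 8561 -user sasadm@saspw -password "********" -deploytype DEPLOY -objects "/Shared Data/Jobs/MyAdhocJob" -sourcedir /sas/deploy/adhoc -deploymentdir /sas/deploy/adhoc -metarepository Foundation -appservername SASApp';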

Respected Advisor
Posts: 3,900

Re: Deploying and scheduling jobs programmatically (9.2)

The batch deployment which nar@sas mentions would solve job deployment. I believe it wouldn't handle scheduling (not 100% sure; maybe nar@sas will show us that I'm wrong).

Otherwise, building on what Barry proposes, you could create a flow which calls wrapper jobs in parallel. Each wrapper job (a macro call) would work like this: the first node calls the first job in the list of .sas files in the deployment directory for ad hoc jobs, the second node calls the second job, and so on; if there is no n-th job, the wrapper code does nothing.

Maybe something like this: the first node creates a SAS table with the ad hoc jobs, and nodes 2 to n are wrapper nodes, each calling an ad hoc job and moving it to an archive if execution is successful. Something along those lines.
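As a minimal sketch of such a wrapper macro, assuming a deployment directory and an archive subdirectory (both placeholders) and that XCMD is allowed:

%macro run_adhoc(n, dir=/sas/deploy/adhoc, archive=/sas/deploy/adhoc/archive);
   %local pgm;

   /* build the list of .sas files and pick out the n-th one */
   filename lst pipe "ls &dir/*.sas 2>/dev/null";
   data _null_;
      infile lst truncover;
      input pgm $256.;
      if _n_ = &n then call symputx('pgm', pgm);
   run;
   filename lst clear;

   %if %length(&pgm) %then %do;
      %include "&pgm";
      /* archive the program only if it finished without errors */
      %if &syserr <= 4 %then %do;
         systask command "mv &pgm &archive/" wait;
      %end;
   %end;
   /* if there is no n-th job, the wrapper simply does nothing */
%mend run_adhoc;

/* e.g. node 3 of the flow would run: */
%run_adhoc(3)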
