01-27-2015 06:41 AM
Is it possible to create a shell script to deploy/redeploy a SAS job? General syntax or examples are welcome. Also, is it possible to create the job's respective SPK via the script itself? Thanks
01-27-2015 06:52 AM
You can schedule both export and import of SPK packages - it's documented fairly extensively.
I'm not sure that you can schedule/script job deployment with the standard tools. The question is: why do you want to do that?
In my mind, you need to do some tests/checks that the imported jobs are OK, and that all necessary post-import tasks are performed, before you put the job into production (deploy).
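For the export/import part mentioned above, the platform's batch package tools can be scripted. A minimal sketch, assuming placeholder host names, ports, credentials, and metadata folder paths; the exact flag names should be verified against the ExportPackage/ImportPackage documentation for your release:

```shell
# Hypothetical batch export of a job to an SPK, then import into another
# metadata server. All connection values and paths are placeholders.
ExportPackage -host dev-meta.example.com -port 8561 \
    -user sasadm@saspw -password "********" \
    -package /tmp/myjob.spk \
    -objects "/Shared Data/Jobs/MyJob(Job)"

ImportPackage -host prod-meta.example.com -port 8561 \
    -user sasadm@saspw -password "********" \
    -package /tmp/myjob.spk \
    -target "/Shared Data/Jobs"
```

Both tools ship with the SAS Platform Object Framework, so a scheduler can run them unattended once the credentials are handled securely.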
01-27-2015 07:34 AM
Instead of manually deploying/redeploying a job, a shell script can deploy the job. If the job is already deployed, the script will rename the existing .sas file, i.e. job_name.sas -> job_name_old_count.sas, and redeploy the job to job_name.sas. Similarly for the .SPK file.
Basically, I want to automate the process.
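The rename-then-redeploy step described above can be sketched as a small POSIX shell function. The deployment directory and the final redeploy command are assumptions; a real script would invoke the actual deployment tool where the placeholder line is:

```shell
#!/bin/sh
# Sketch of the backup-then-redeploy logic described in the post.
# The directory layout and the redeploy step itself are assumptions.
backup_and_redeploy() {
    deploy_dir=$1
    job=$2

    # If the job is already deployed, rename job.sas -> job_old_N.sas,
    # picking the first unused counter value.
    if [ -f "$deploy_dir/$job.sas" ]; then
        n=1
        while [ -f "$deploy_dir/${job}_old_$n.sas" ]; do
            n=$((n + 1))
        done
        mv "$deploy_dir/$job.sas" "$deploy_dir/${job}_old_$n.sas"
    fi

    # Placeholder for the real redeploy step (e.g. a command-line deploy
    # tool); creating an empty file keeps this sketch self-contained.
    : > "$deploy_dir/$job.sas"
}
```

The same pattern would apply to the .SPK file; only the extension changes.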
01-27-2015 07:53 AM
Still find it a bit "dangerous". How can you be sure that the imported metadata is actually working in the new environment?
And quite often a new version is accompanied by change scripts and other manual tasks (parameter files etc.). How do you automate that?
And as I mentioned, I'm not aware of any ready-made function for this.
01-29-2015 03:17 AM
I'm kind of new to SAS, so I don't understand what the problem could be. When I run a job, I usually run it via another script, which invokes the .sas file in the specified location. This is done using the SYSIN option in the script.
Generally, when we deploy/redeploy the job in DIS manually and then import it into a new environment, it works fine. Can you please explain why doing this process via a script would not work? I just wanted to know if it is possible to deploy via a script.
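For reference, the batch invocation described above typically looks something like this; the paths are placeholders, while -sysin and -log are standard SAS command-line options:

```shell
# Run a deployed job in batch; all paths are placeholders.
sas -sysin /sas/deployed_jobs/job_name.sas \
    -log /sas/logs/job_name.log \
    -print /sas/output/job_name.lst
rc=$?

# SAS sets the exit status to indicate warnings/errors, so the wrapper
# script can react to failures.
if [ "$rc" -gt 1 ]; then
    echo "job_name failed with rc=$rc" >&2
fi
```

This only runs an already-deployed .sas file, though; it does not perform the deployment itself, which is the part in question.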
01-29-2015 03:36 AM
Again, I'm not aware of any such functionality, and since no one else has answered, chances are that there is none. To confirm, contact your SAS representative.
Reason (which really doesn't matter if you can't automate this): during maintenance of a DW, some changes mean that you need to change target table structures (adding/renaming/removing columns, indexes, keys) - manual change scripts often need to be developed and executed in the production environment before the updated ETL flow can be scheduled.
Also, there could be new input data, new logins, etc. So there are a lot of dependencies that need to be checked before scheduling new/updated jobs.
01-30-2015 01:31 PM
The deploy/redeploy job function is a DI Studio feature; see the SAS(R) Data Integration Studio 4.9: User's Guide.
See: "Note: Under change management, only administrators can deploy jobs"
Change management is a process to ensure that you only make validated changes.
SAS(R) Data Integration Studio 4.9: User's Guide (Using a Command Line to Deploy Jobs) is there so that the process can be automated, mostly under the control of a release management tool.
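The command-line route referenced above is the documented DeployJobs batch tool. A hedged sketch follows; the connection values and paths are placeholders, and the exact flag set should be verified against the "Using a Command Line to Deploy Jobs" section for your release:

```shell
# Hypothetical batch redeploy of a DI Studio job via the DeployJobs
# command-line tool; all connection values and paths are placeholders.
DeployJobs -host meta.example.com -port 8561 \
    -user sasadm@saspw -password "********" \
    -deploytype REDEPLOY \
    -objects "/Shared Data/Jobs/MyJob" \
    -appservername SASApp \
    -deploymentdir /sas/config/Lev1/SASApp/SASEnvironment/SASCode/Jobs
```

Run from a release management tool, this gives the automation asked for while keeping the deployment under change control.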
Yes, you could implement all kinds of things like that with a trial-and-error approach. No, you should not do that in a professional setting: a higher maturity level requires being in control.
As there could be unintended changes in the SAS metadata, your deployment could possibly fail. But why would you deploy when the code is OK?
More likely you are on some SIEM approach; in that case, go for a SIEM design and solution.