Hi @FK1
A DI Studio Job is a metadata object with associations to any number of transformation objects. The transformations are in turn associated with each other, which defines the execution order, and each transformation consists of a prototype with any number of sub-objects, described by attributes and associations to other sub-objects.
When a job is deployed in DI Studio, two things happen:
All the metadata objects which define the job are assembled and transformed into source code that is written to a disk file for later execution as a stand-alone SAS program (or, if the Job is redeployed, the existing disk file is overwritten).
A new metadata object (a JFJob) is created with attributes like the name and the path to the source file, and with an association to the Job object (or, if the Job is redeployed, the existing JFJob object is updated instead). A small query sketch for these objects follows below.
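If you want to look at the JFJob objects directly, something like this DATA step will list them, using the standard metadata functions (run it while connected to the metadata server; the dataset name and the attribute selection are just for illustration):

data deployed_jobs;
  length uri name updated $256;
  call missing(uri, name, updated);
  nobj = 1; n = 1;
  do while (nobj > 0);
    /* "@Id contains '.'" is the usual trick to match every object of a type */
    nobj = metadata_getnobj("omsobj:JFJob?@Id contains '.'", n, uri);
    if nobj > 0 then do;
      rc = metadata_getattr(uri, "Name", name);
      rc = metadata_getattr(uri, "MetadataUpdated", updated);
      output;
    end;
    n + 1;
  end;
  keep name updated;
run;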
When a Job is deployed, the result is one physical disk file. A DI Studio Job can include other jobs, but these are treated as transformations in the deployment process, which means that even if they look like jobs on the canvas, an included job is just a container, a "shorthand" notation for the transformations it contains. So a JFJob always has exactly one Job as its source, and the deployment of a Job results in one source file.
But the same Job can be deployed under different names, so one Job can be the source of more than one JFJob. The source file is generated from the metadata registrations of the Job as they are at the time of deployment. So if a job is deployed, later changed, and then deployed again under another name, two JFJobs will exist with the same Job object as source, but pointing at program files with different code. The only way to maintain order in this chaos is to check that every JFJob has a MetadataUpdated timestamp greater than or equal to the MetadataUpdated timestamp of its source Job.
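The timestamp check can be sketched with the same metadata functions. One caveat: I am not certain of the name of the association between a JFJob and its source Job, so "SourceCode" below is a placeholder; look the real name up in your metadata model (for example with the METABROWSE command) before using this:

data stale_deployments;
  length jfuri joburi jfname jobname $256 jfupd jobupd $32;
  call missing(jfuri, joburi, jfname, jobname, jfupd, jobupd);
  nobj = 1; n = 1;
  do while (nobj > 0);
    nobj = metadata_getnobj("omsobj:JFJob?@Id contains '.'", n, jfuri);
    if nobj > 0 then do;
      rc = metadata_getattr(jfuri, "Name", jfname);
      rc = metadata_getattr(jfuri, "MetadataUpdated", jfupd);
      /* follow the (assumed) association to the source Job */
      if metadata_getnasn(jfuri, "SourceCode", 1, joburi) > 0 then do;
        rc = metadata_getattr(joburi, "Name", jobname);
        rc = metadata_getattr(joburi, "MetadataUpdated", jobupd);
        /* MetadataUpdated is an ISO-style string, so a plain string
           comparison orders the timestamps correctly */
        if jobupd > jfupd then output;
      end;
    end;
    n + 1;
  end;
  keep jfname jobname jfupd jobupd;
run;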
The source files written by deployment exist in their own right as physical files, so they are not deleted when their parent JFJobs are deleted. The only way to check for orphaned source files (batch jobs) is to verify that every file in the batch job directory has a metadata registration as a File object associated with a JFJob object.
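That check can also be sketched in a few DATA steps: list the physical files, list the registered File objects, and take the difference. The directory path is of course site-specific, and for simplicity this sketch compares file names only and skips the extra step of verifying the association from each File object to a JFJob:

/* 1. Physical files in the (assumed) deployment directory */
data disk_files;
  length fname $256;
  rc  = filename("d", "/sas/batchjobs");
  did = dopen("d");
  do i = 1 to dnum(did);
    fname = dread(did, i);
    output;
  end;
  rc = dclose(did);
  keep fname;
run;

/* 2. File objects registered in metadata */
data meta_files;
  length uri fname $256;
  call missing(uri, fname);
  nobj = 1; n = 1;
  do while (nobj > 0);
    nobj = metadata_getnobj("omsobj:File?@Id contains '.'", n, uri);
    if nobj > 0 then do;
      rc = metadata_getattr(uri, "FileName", fname);
      output;
    end;
    n + 1;
  end;
  keep fname;
run;

/* 3. Disk files with no metadata registration are orphan candidates */
proc sql;
  create table orphans as
  select fname from disk_files
  except
  select fname from meta_files;
quit;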
We have automated these checks along with many others, so we receive a daily report of obsolete files, Jobs with more than one JFJob, Jobs changed after the latest deployment, etc. The corrections are done manually to avoid catastrophes in case of errors, but we maintain almost complete order in our SAS environment and enforce business rules for naming objects etc. without much work.
I wish you a happy new year with the hope that the SAS 9 engine will be with us for many years to come as the core product behind all the bells and whistles.
Erik