Hi Everyone,
I have written code that creates a data set listing all deployed jobs available in metadata, together with their metadata IDs and job names.
For that I made use of the "metadata_resolve" function:
jobobj="omsobj:JFJob?@PublicType='DeployedJob'";
jobcount=metadata_resolve(jobobj,type,id);
Similarly, I wrote the same kind of code, but for available jobs rather than deployed jobs, i.e.
jobobj="omsobj:Job?@PublicType='Job'";
jobcount=metadata_resolve(jobobj,type,id);
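For completeness, a minimal self-contained sketch of the first query (untested here, since it needs a live metadata server connection; the data set and variable names are illustrative): metadata_resolve returns the number of matching objects and fills the type and id variables, which must be given a length beforehand or the values come back truncated.

```sas
/* Sketch only - assumes an active metadata server connection */
data deployed_jobs;
  length jobobj $80 type $32 id $17 JfJobUri $256 JfJobName $60;
  call missing(type, id, JfJobUri, JfJobName);
  jobobj = "omsobj:JFJob?@PublicType='DeployedJob'";
  jobcount = metadata_resolve(jobobj, type, id);   /* number of matching objects */
  do i = 1 to jobcount;
    rc = metadata_getnobj(jobobj, i, JfJobUri);    /* URI of the i-th object */
    rc = metadata_getattr(JfJobUri, "Name", JfJobName);
    output;
  end;
  keep type id JfJobName;
run;
```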
What I want is to relate a deployed job back to its "origin". For example, let's assume the data set with the information about deployed jobs has this observation:
Is there a way to derive its corresponding job (including the job's metadata ID) from metadata, rather than relying on the job name and merging it against the list of available jobs?
Cheers,
FK1
Hi @FK1
Sorry, I didn't see your post until today. To get the corresponding job, you need to follow the association from Deployed Job to Job. The association chain can be viewed in the SAS Metadata Browser, which can be found under Solutions in the SAS Display Manager:
The following code writes a data set with all deployed jobs and the corresponding jobs. In this case it is very simple because there is always one and only one job associated with a given JfJob.
data test (drop=jfjobcnt r1-r4 i);
  length JfJobUri $256 JfJobName $60 JobUri $256 JobName $60;
  call missing(JfJobUri, JfJobName, JobUri, JobName);
  jfjobcnt = metadata_getnobj("omsobj:JFJob?@PublicType='DeployedJob'", 1, JfJobUri);
  do i = 1 to jfjobcnt;
    r1=metadata_getnobj("omsobj:JFJob?@PublicType='DeployedJob'", i, JfJobUri);
    r2=metadata_getattr(JfJobUri, "Name", JfJobName);
    r3=metadata_getnasn(JfJobUri, "AssociatedJob", 1, JobUri);
    r4=metadata_getattr(JobUri, "Name", JobName);
    output;
  end;
run;
Using SAS Data Step Metadata Functions is not fast. It took about 30 seconds to make a list of 5270 jobs, and it could be done in less than a second with Proc Metadata + an XML specification file, but that is much more complicated to set up and get to work.
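For reference, a hedged sketch of the Proc Metadata approach Erik mentions. Assumptions not confirmed in the thread: the flag value 388 (OMI_GET_METADATA 256 + OMI_XMLSELECT 128 + OMI_TEMPLATE 4) and the AssociatedJob association name, which is taken from the data step code above. It needs a metadata server connection, and the XML response still has to be parsed.

```sas
filename resp temp;

/* Ask the metadata server for all DeployedJob objects; doubling ""
   inside the SAS string yields a literal " in the XML request. */
proc metadata
  in="<GetMetadataObjects>
        <Reposid>$METAREPOSITORY</Reposid>
        <Type>JFJob</Type>
        <Objects/>
        <NS>SAS</NS>
        <Flags>388</Flags>
        <Options>
          <XMLSelect search=""@PublicType='DeployedJob'""/>
          <Templates>
            <JFJob Id="""" Name="""">
              <AssociatedJob/>
            </JFJob>
          </Templates>
        </Options>
      </GetMetadataObjects>"
  out=resp;
run;

/* The response in RESP can then be read with the XMLV2 libname
   engine and an XML map - the "complicated to set up" part. */
```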
Hello @ErikLund_Jensen ,
you wrote:
In this case it is very simple because there is always one and only one job associated with a given JfJob.
one quick follow-up question: is it technically possible to have a many-to-one relationship between jobs and an associated JfJob?
I was under the impression that there always exists one and only one job associated with a given JfJob.
Or, to ask differently: can multiple Jobs be associated with one given deployed job?
Hi @FK1
A DI Studio Job is a metadata object with associations to any number of transformation objects. These transformations are associated with each other, so a certain execution order is defined, and each transformation consists of a prototype with any number of sub-objects described by attributes and associations to other sub-objects.
When a job is deployed in DI Studio, two things happen:
When a Job is deployed, the result is one physical disk file. A DI Studio Job can include other jobs, but these jobs are treated as transformations in the deployment process, which means that even if an included job looks like a job on the canvas, it is just a container, a "shorthand" notation for the transformations it contains. So a JfJob always has one and only one Job as source, and the deployment of a Job results in one source file.
But the same job can be deployed under different names, so one Job can be the source of more than one JfJob. The source file is generated from the Job's actual metadata registrations at the time of deployment. So if a job is deployed, later changed, and then deployed under another name, two JfJobs will exist with the same Job object as source, but pointing at program files with different code. The only way to maintain order in this chaos is to check that every JfJob has a MetadataUpdated timestamp at or after the MetadataUpdated timestamp of the corresponding Job.
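The timestamp check described above could be sketched roughly as follows (untested, needs a metadata server; the exact string format MetadataUpdated comes back in is an assumption, hence the forgiving ANYDTDTM informat; data set and variable names are illustrative):

```sas
/* Flag deployed jobs whose source Job was changed after deployment */
data stale_deployments;
  length JfJobUri JobUri $256 JfJobName JobName $60 JfUpd JobUpd $40;
  call missing(JfJobUri, JobUri, JfJobName, JobName, JfUpd, JobUpd);
  n = metadata_getnobj("omsobj:JFJob?@PublicType='DeployedJob'", 1, JfJobUri);
  do i = 1 to n;
    rc = metadata_getnobj("omsobj:JFJob?@PublicType='DeployedJob'", i, JfJobUri);
    rc = metadata_getattr(JfJobUri, "Name", JfJobName);
    rc = metadata_getattr(JfJobUri, "MetadataUpdated", JfUpd);
    if metadata_getnasn(JfJobUri, "AssociatedJob", 1, JobUri) > 0 then do;
      rc = metadata_getattr(JobUri, "Name", JobName);
      rc = metadata_getattr(JobUri, "MetadataUpdated", JobUpd);
      /* output the JfJob if its source Job changed after deployment */
      if input(JobUpd, anydtdtm40.) > input(JfUpd, anydtdtm40.) then output;
    end;
  end;
  keep JfJobName JobName JfUpd JobUpd;
run;
```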
The source files written by deployment exist in their own right as physical files, so they are not deleted when the parent JfJobs are deleted. The only way to check for orphaned source files (batch jobs) is to verify that every file in the batch-job directory has a metadata registration as a file object associated with a JfJob object.
We have automated these checks along with many others, so we receive a daily report on obsolete files, more than one JfJob per Job, Jobs changed after latest deployment, etc. The corrections are done manually to avoid catastrophes in case of errors, but we maintain almost complete order in our SAS environment and enforce business rules for naming objects etc. without much work.
I wish you a happy new year with the hope that the SAS 9 engine will be with us for many years to come as the core product behind all the bells and whistles.
Erik
Hello @ErikLund_Jensen ,
thank you so much for this very detailed, precise and profound description of the "mechanics" behind deploying a Job (via DI).
So, to summarize, it can be said that:
You mentioned that:
We have automated these checks along with many others, so we receive a daily rapport over obsolete files, more than one JFJob for each Job, Jobs changed after latest deployment etc.
Are these checks done by means of SAS utilities, like self-written macros, STPs, etc.? If possible, I'd be interested in getting the underlying code 🙂
Cheers,
FK1