Hi,
Is there an easy way for a DI job to figure out the metadata path/folder it is in?
When a stored process runs, the metadata folder is available in &_metafolder.
For a DI job, &ETLS_JOBNAME has the name of the job, but I can't see information on the metadata folder the job is in anywhere.
Thanks,
--Q.
One way to do it is to use the jobID macro variable and the following code to get the folder location:
data _null_;
   length _uri location _location $200;
   _rc=1;
   _uri="&jobID";
   /* Get the folder (Tree) the job is registered in, and its name */
   _rc=metadata_getnasn(_uri,"Trees",1,_uri);
   _rc=metadata_getattr(_uri,"Name",location);
   /* Walk up the ParentTree associations, prepending each parent folder name */
   _tree=1;
   do while (_tree>0);
      _tree=metadata_getnasn(_uri,"ParentTree",1,_uri);
      if _tree > 0 then do;
         _rc=metadata_getattr(_uri,"Name",_location);
         location=catx('/',_location,location);
      end;
   end;
   location = '/'||location;
   put location=;
run;
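If you need the path later in the job (for example in user-written code), one option is to drop it into a global macro variable; a minimal addition, placed just before the run; statement above (jobFolder is just an example name, not something DI Studio sets for you):

call symputx('jobFolder', location, 'G');  /* &jobFolder now holds the metadata folder path */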
Sorry, but I think you have to build a routine that queries the metadata, perhaps using the mentioned ETLS_JOBNAME.
Speculating a bit: the difference between a stored process and a DI Studio job is that a stored process is called via metadata. A deployed DI Studio job has a metadata registration, but when it is executed, it's the standalone SAS program that is being called, usually via some scripts (depending on the host/scheduling server).
Thanks Linus,
Makes sense. Still learning my way around DIS. But now that you mention it, it fits with what I'm seeing. Speculating even more, it looks like when I deploy a job, it's basically writing a .sas file somewhere on whatever server. And then when I use SMC to create a flow and then schedule a flow (using LSF lite or whatever), it seems basically analogous to just using cron to batch submit that .sas file. So when I run the deployed job, to SAS it's just a batch submit. I guess if I looked at the .sas file sitting on the server, I might see a %LET statement assigning ETLS_JOBNAME as part of the wrapper code added by DIS when the job was deployed. So I guess if they wanted to make it easier to get at the metadata path, they would need to add code to generate a similar global macro var when the job is deployed.
Will think about whether it's worth querying the metadata to get the path. But for now, will probably just end up hard coding a %LET statement as "pre code" to the job.
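Something like this as the precode, I guess (a rough sketch; the macro variable name and path are placeholders I'd have to maintain by hand whenever the job moves folders):

%let etls_metafolder=/Dev/ETL/Jobs;   /* made-up name and path, not set by DI Studio */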
--Q.
I suppose a question would be why do you need to know the metadata folder the job metadata is in?
Barry
Playing around with faking a /Dev /Test /Prod environment using metadata folders. For a stored process, when I promote a stored process from /Dev to /Test, I don't have to change anything in the stored process code itself. That is, because the stored process knows it is in a /Dev metadata directory, it knows it is running in Development mode, and runs with the appropriate options etc.
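For example, the stored process precode does something along these lines (rough sketch; it assumes development content lives under a path containing /Dev, and ENVNAME is just my own macro variable):

/* Branch on the metadata folder the stored process lives in */
%global envname;
%macro check_env;
   %if %index(&_metafolder,/Dev) %then %let envname=DEV;
   %else %let envname=PROD;
%mend check_env;
%check_env
%put NOTE: Running in &envname mode (folder: &_metafolder);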
--Q.
An interesting way of achieving this.
We have three separate metadata environments (all on the same box), so when you promote to another environment you need to change nothing, and there is no need for any of the smarts you describe.
Barry
Thanks Barry,
Yes, I've been thinking of my /Dev /Test /Prod folders as a "poor man's" replacement for having separate server environments.
You say you've got different metadata environments all on one box? What does "Metadata Environment" mean? If I'm using the SPWA, where the URL is basically server:port/metadatapath, are different metadata environments distinguished by different server names (even though they run on one box)? In terms of SAS licensing fees, are you paying for multiple server licenses, or does one server license still allow multiple metadata environments?
Without getting into licensing issues... As a SAS Developer, when I first asked "How can we set up a development server", the answer was "We'd have to pay for a second (or third) SAS license." But maybe I can ask the admins to consider setting up a separate metadata environment???
Thanks
--Q.
For the URL, we actually have the web tier on separate boxes, as there is no SAS licensing issue there (no actual SAS installed on those boxes).
I was being a bit simplistic when I said one box; we actually have a metadata server, an EDI server, and an EBI server, and on each box they are logically split into dev/test/prod. All this could be on one box, but that would put load on it.
When I say metadata environments, what I mean is that we have three metadata servers/repositories using ports 8561, 8562, and 8563, all configured on the one box, with no extra cost involved. The risk is that dev can impact prod etc., but you have that issue with your setup as well. So to answer your question "does one server license still allow multiple metadata environments" - YES.
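From the coding side, pointing a session at one of the three environments is just a matter of the metadata connection options, something like this (host name, credentials, and which port maps to which environment are placeholders here):

/* Placeholder host/credentials - connect to the metadata server on port 8562 */
options metaserver="metahost.ourcompany.com"
        metaport=8562
        metauser="myuser"
        metapass="XXXXXXXX"
        metarepository="Foundation";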
If you wanted a separate development server then there would definitely be additional cost - in my opinion there shouldn't be, but that is a different discussion.
Sorry for the disjointed response
Thanks again Barry
That's very helpful. 3 metadata servers, no extra cost sounds appealing! I will have to explore it further. Can see how with one server, you're still at risk of some process running on dev impacting prod, but as you say we have that risk now. Separate metadata servers/repositories sounds more useful than my current approach of separate metadata *folders*....
--Q.
I also believe that you can have your metadata server(s) on another machine at no extra cost - however you would need to discuss with your SAS rep.
This would remove some of the load from your machine that has the main SAS processing on it.
Thanks much LuLu, looks nifty!
--Q.