Hi,
My department is currently preparing for a SAS server migration which involves a SAS version upgrade as well.
In the past, when we have migrated SAS metadata, we noticed that while importing the .spk file into the new environment some jobs fail/break at certain transformations, which results in additional effort recreating the job to fix the failure.
Has anybody faced similar issues, and do you know what steps can be taken to prevent such rework?
Thanks in advance!
Hi,
if you have created your own transformations or modified the default ones, you need to import those transformations before importing any job; then you can import the actual job.
When importing an .spk package, you need to take care of the dependencies: import the objects with the fewest dependencies first and the objects with the most dependencies last.
An example:
Users - Roles - Groups.
Another:
Servers - libraries - tables
Another:
tables - transformations - jobs - deployed jobs (you still might need to re-deploy your jobs)
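The ordering rule above is essentially a topological sort of the object dependency graph. A minimal sketch in Python (the object names and dependency map below are illustrative only, not anything read from SAS metadata):

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each object lists what it depends on.
# Import order must always place an object's dependencies before it.
deps = {
    "users": set(),
    "roles": {"users"},
    "groups": {"roles"},
    "servers": set(),
    "libraries": {"servers"},
    "tables": {"libraries"},
    "transformations": {"tables"},
    "jobs": {"transformations"},
    "deployed jobs": {"jobs"},  # may still need re-deployment after import
}

# static_order() yields a valid import order: fewest-dependency objects first.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

The exact order among independent objects (e.g. "users" vs. "servers") is arbitrary; what matters is that no object is imported before everything it depends on.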
After migrating 2 projects following this principle, my team was able to minimise the rework on the DI jobs. There were a couple of issues, however:
1. One appeared to be linked to new functionality (or a bug) in the latest version of DI Studio, which was overwriting parameter values in a DI job. In the end we had to delete the parameter and add it to the "Precode" instead.
2. A SAS project name conflict causing an issue for a DI job [can be ignored!]
So, to summarise, I would vote for this suggestion as the best solution; even though it would have been wonderful if an automated deployment tool were available for this purpose.
Hi,
what error message(s) are you getting? I am wondering, because only some transformations/jobs fail.
Thanks
Anja
Hi Anja,
you are right, only a few transformations/jobs get broken. If I recall correctly, the import process doesn't show errors; only when we look at the imported job do we realise that some joins/mappings got broken.
Is there any reason why such failures happen? I am not sure whether it was due to some missing dependent object.
If there are some automated ways to do such migrations, please let me know.
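On the automation point: SAS 9.4 ships command-line batch tools, ExportPackage and ImportPackage (under SASPlatformObjectFramework in the SAS installation directory), which can script package export and import and write a log of the run. A hedged sketch only — the paths, host names, folder names, and credentials below are placeholders for your own deployment:

```
# Hypothetical paths/credentials -- adjust for your deployment.
cd /opt/sas/SASHome/SASPlatformObjectFramework/9.4

# Export a folder of DI jobs from the source metadata server
./ExportPackage -host srcmeta.example.com -port 8561 \
    -user sasadm@saspw -password '********' \
    -package /tmp/di_jobs.spk \
    -objects "/Shared Data/DI Jobs(Folder)"

# Import into the target metadata server, logging the run for review
./ImportPackage -host tgtmeta.example.com -port 8561 \
    -user sasadm@saspw -password '********' \
    -package /tmp/di_jobs.spk \
    -target "/Shared Data/DI Jobs" \
    -log /tmp/di_jobs_import.log
```

Check the exact options against the batch import/export chapter of the SAS 9.4 Intelligence Platform System Administration Guide for your maintenance release before relying on this.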
Thanks!
Hi,
are you importing the data into foundation or any custom repositories?
Sometimes, when custom repositories were used and objects were exported from there and then imported into the Foundation repository, the GUID that is specific to each transformation causes the import (or the transformation) to fail, as it detects the same key as already existent.
Is the export being done from within SASMC Folders, or within DI?
Did you check the Metadata Server log and Object Spawner log to see if there are any errors or warnings?
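Scanning those logs for problems can be scripted. A minimal sketch — the log directory and file name below are assumptions (on a default SAS 9.4 layout the metadata server logs live under a path like .../Lev1/SASMeta/MetadataServer/Logs); for illustration it first writes a sample log:

```shell
# Hypothetical log location -- point LOGDIR at your real Logs directory.
LOGDIR=/tmp/demo_logs
mkdir -p "$LOGDIR"

# Sample entries standing in for a real SASMeta_MetadataServer_*.log
cat > "$LOGDIR/SASMeta_MetadataServer_demo.log" <<'EOF'
2024-01-15T10:01:02 INFO  Import of package started.
2024-01-15T10:01:05 WARN  Duplicate GUID detected for transformation object.
2024-01-15T10:01:06 ERROR Association could not be restored for job mapping.
EOF

# Surface only warnings and errors from the import window
grep -En "WARN|ERROR" "$LOGDIR"/*.log
```

Running this prints just the WARN and ERROR lines with their file and line numbers, which is usually enough to see whether an import silently damaged an object.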
Thanks
Anja
Hi Anja,
Thanks for these hints. I usually export and import from the custom repository tab itself, but I hadn't paid enough attention to these points in the past. Next time I will be cautious.
I use DI Studio for export and import. Does it make any difference?
I don't have access to the Metadata Server and Object Spawner logs. Next time, I will ask the admin team to provide me these logs.
Thanks a lot for the suggestions!
Hi, thanks for the advice. However, in my case we are migrating to a new server which has a different SAS version [SAS 9.4].
Do you think it is still a good idea to use the SAS Migration Utility?
Personally, I have always used a migration like that (new server, new SAS release) to completely rework my metadata setup and correct all logical flaws that the current version had.
Most of the time, there's also a lot of cruft found that can be weeded out.
Yes, same here. But this time I would like to minimise the rework involved in rebuilding the DI jobs. I am consolidating all the best practices from experts and hoping for the best while executing.
Nothing wrong with that. If your metadata structure has already matured to the point where you have no potholes that repeatedly cause you to utter indecent words when dealing with them, then a simple automatic migration makes sense. If one can use the word "simple" when it comes to SAS installs, anyway 😉