Hello,
Is there a way to change the SASLIBREF of multiple metadata-registered SAS data sets at once,
or to change the data server and DBMS of an existing library in SAS metadata?
The existing library is Impala, and the intended target library to which all the existing tables should be allocated is Snowflake.
Thanks
ir0nxl
Solution I used:
1. Delete the existing Impala library (say, with the libref 'implib') in the target environment (say, Test) under which all the tables are registered.
2. Create a new Snowflake library with the same libref 'implib' in the target environment (Test).
3. Export all the tables from the source environment (Dev) to Test. The SAS metadata engine is smart enough to pick up the target library (the newly created Snowflake library) with the same libref 'implib'.
4. Programmatically uppercase the PhysicalName/Name of all tables (uppercase names are required for Snowflake) using the metadata_setattr() function.
5. Finally, update the metadata of the tables by selecting them all and updating manually (or programmatically if preferred).
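For the uppercasing step, here is a minimal sketch using DATA step metadata functions. It assumes the session already points at the target (Test) metadata server via the META* system options, that the libref is 'implib' as in the example, and that Name and SASTableName are the attributes to change; verify the attribute names against your metadata model (for example with METABROWSE) before running.

```sas
/* Sketch: uppercase the registered name of every table under 'implib'.
   Assumes metaserver/metaport/metauser/metapass options are already set
   for the target environment. The attribute names are assumptions;
   check them before updating, as metadata_setattr writes immediately. */
data _null_;
  length lib_uri tab_uri name $ 256;

  /* Locate the SASLibrary object by its libref */
  nlib = metadata_getnobj("omsobj:SASLibrary?@Libref='implib'", 1, lib_uri);
  if nlib < 1 then do;
    put "WARNING: library with libref 'implib' not found in metadata.";
    stop;
  end;

  /* Walk the library's Tables association and uppercase each name */
  n = 1;
  do while (metadata_getnasn(lib_uri, "Tables", n, tab_uri) > 0);
    rc = metadata_getattr(tab_uri, "Name", name);
    rc = metadata_setattr(tab_uri, "Name", upcase(name));
    rc = metadata_getattr(tab_uri, "SASTableName", name);
    rc = metadata_setattr(tab_uri, "SASTableName", upcase(name));
    n + 1;
  end;
run;
```

Since metadata updates take effect immediately, try this against a non-production repository (or a small test library) first.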
The easiest way to do this would be to create new libraries in the SAS Management Console Library Manager mirroring the existing SAS librefs, then use the Register Tables option in each data library's properties. You can then delete the old libraries, which removes all of the registered data sets for those libraries in one go.
Wouldn't that create new objects? We want to retain the same object_id so that we do not have to change 800+ DIS jobs.
Are the LIBREFs pointing to an external database server? If so, you can change the server name on the data library without changing the library, as long as the data library engine isn't changing. For example, staying as ODBC but pointing to a new database server containing the same databases and schemas.
Changing from Impala to Snowflake. Changing the server or connection of the registered Library is not possible
@ird0nxl - That's unfortunate. We are lucky to be using ODBC only, so switching servers is quick and easy.
I don't use DIS, but having tables identified by object_id seems like a significant limitation.
Hello @ird0nxl
"Changing from Impala to Snowflake. Changing the server or connection of the registered Library is not possible"
Accessing Impala typically requires the SAS/ACCESS Interface to Impala; similarly, for Snowflake one would typically use the SAS/ACCESS Interface to Snowflake.
Just curious: how do you plan to manage without making this change?
You can change the server engine by using the metadata function:
rc = metadata_setattr(liburi, "Engine", 'SASIOSNF');
Then you can change the other relevant options in the library via SMC or DIS.
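Purely as a hedged sketch of that engine change (the libref 'implib' and the lookup query are illustrative assumptions; the engine value is the one quoted above), locating the library object and switching its engine could look like:

```sas
/* Sketch: find the library by libref and change its metadata Engine
   attribute. Run against the target metadata server; the libref and
   query are illustrative assumptions for this thread's example. */
data _null_;
  length liburi $ 256;
  nlib = metadata_getnobj("omsobj:SASLibrary?@Libref='implib'", 1, liburi);
  if nlib >= 1 then
    rc = metadata_setattr(liburi, "Engine", 'SASIOSNF');
  else
    put "WARNING: library with libref 'implib' not found in metadata.";
run;
```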
But I found a simpler way than manipulating the library metadata (see the solution posted under the original question).
Thanks for the response. My understanding is that using SAS Management Console is simpler and is the recommended best practice (https://go.documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/lrmeta/p1e5qsh142wjgwn1kz6gpsq1xeyi.htm).
Writing a program, testing it on dev/QA, and passing it through change review before implementing it in production is a laborious and time-consuming process.
I realize that there is a lot to learn in SAS; it would help many of us if someone could explain the advantages of doing this in code rather than in SAS Management Console.
Using SMC or DIS is best practice.
But you need to find a way to migrate 800+ tables in a single library from one DBMS to a new DBMS; updating them individually is not an option, so the best option was to update the library.
I managed to change the server and other relevant properties of the library using SAS metadata functions.
But the solution I opted for involves no programmatic metadata manipulation of the library itself, only uppercasing the PhysicalName/Name of all tables, which is an unfortunate requirement of Snowflake.