Hello all, I am having big problems with Data Integration Studio. We have a large number of tables and jobs, and our metadata backups are 4GB in size. Recently we have noticed the following problems:

- DI Studio is often very slow, especially when opening the properties of tables or generating code.
- We sometimes receive error messages such as "Error writing metadata" when trying to save jobs. We then have to close the job without saving changes and redo the work.
- Some of our jobs won't open at all anymore. We previously thought this only happened with jobs which were particularly long or complicated, but it has also happened with simpler jobs.
- Sometimes we have problems editing metadata: e.g. when trying to change the library allocation of a table, it reverts to the library it was in before once the properties window has been closed.

We have tried to reduce the size of our metadata by deleting tables and jobs which are no longer needed, then performing a "purge" operation through SAS Management Console, before adding the RUNANALYSIS and REORG options to the %OMABAKUP script (the call we used is sketched at the end of this post). Following this we were temporarily able to open jobs which we were previously not able to. However, this was only short-lived. We have also noticed that the size of our metadata backups has not gone down at all and remains at 4GB.

Does anyone have any experience of dealing with large amounts of metadata and how this affects performance? Also, what steps do we need to take to get metadata permanently removed and so reduce the size of our backups?

We are using Windows Server 2005 32-bit. Our clients use Windows XP 32-bit. Thanks in advance.
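For reference, the %OMABAKUP call we ran looks roughly like this. The paths are placeholders, and this is a sketch assuming the documented DestinationPath, ServerStartPath and RposmgrPath parameters:

    /* Back up the repositories and rebuild the repository data sets
       (Reorg=YES) to try to reclaim space left by deleted objects */
    %omabakup(DestinationPath="D:\SASBackups",
              ServerStartPath="D:\SAS\Config\Lev1\SASMain",
              RposmgrPath="MetadataServer\rposmgr",
              Reorg=YES,
              RunAnalysis=YES);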
Windows 2005 server? Didn't realise there was a 2005.
Because you are using a 32-bit Windows machine, you are going to run into limitations/constraints. The SAS Metadata Server is an in-memory process and loads "modules" when someone accesses that part of the metadata, and you'll hit issues when the process memory is around 1.9GB, so cleaning up your metadata will only do so much. If you are continually having issues you may need to reconsider your OS.
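If you want to see how close you are to that ceiling, you can check the server process's memory from a command prompt on the server box. This assumes the metadata server is running as a sas.exe process (the usual case, though other SAS sessions also run as sas.exe, so on a busy box you may need to identify the right one):

    tasklist /FI "IMAGENAME eq sas.exe"

The Mem Usage column shows how much memory each matching process is holding.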
As suggested, large metadata repositories might cause a problem with a 32-bit OS.
If you continue to have problems with large metadata repositories, consider contacting Tech Support. They might be able to say whether the methods described in the SAS Note below would be appropriate in your situation.
Our Technical Support division helpfully reminded me of the METACLEANSE procedure and a plug-in for DI Studio. Both of these can be used to clean up unused metadata, thereby reducing the size of the repository. Please see the following SAS Note for detailed instructions: http://support.sas.com/kb/33/577.html.
As others commented, another option is to determine if you have any unused Custom or Project repositories. All repositories (not just Foundation) contribute to the total amount of memory being managed by the server, so if there are unused repositories that you can remove (or perhaps just "Unregister"), that will help to eliminate the memory issues.
If you go through these procedures and still find yourself close to the "memory ceiling" for 32-bit Windows, please post again.
There are a couple of other settings to try, to see if they help. Be sure to shut down the application before changing these settings, then restart it after they are set. Also be very careful with the syntax; there are dash and spacing requirements. It is best to just copy/paste the settings in as-is so that you don't miss any of the syntax.
1. You can locate your distudio.ini or etlstudio.ini file, which will be in your install directory, and add a setting to that file, where the N in the setting's name is a number higher than the other numbers already in the file (an illustrative sketch follows point 2 below). This setting allows the class-size buffer to grow to accommodate more content. As more classes are loaded, which happens at startup of the application and as you continue to work with it, this buffer can fill up and won't expand to accommodate all of the code.
2. You can also add a second parameter if it is not already in that same ini file.
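To illustrate point 1: SAS Java clients of this vintage read numbered JavaArgs_N entries from their .ini files, so the added line would look something like the sketch below. The JVM option shown (MaxPermSize, the space the JVM holds loaded classes in) is an assumption on my part for the "class size buffer", not a confirmed setting; confirm the exact option and value with Tech Support before relying on it.

    ; hypothetical entry - assumes JavaArgs_8 is currently the highest-numbered line
    JavaArgs_9=-XX:MaxPermSize=128m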
I tried the metacleanse options and it has reduced our repository size at all. The other idea was creating a new repository and then migrating what we need over. If we keep the metadata in another custom repository, will we still get memory issues? The reason we are keen to keep the other stuff is just in case we need it. Thanks
Did you mean to say in your last post that you tried METACLEANSE and that it did NOT reduce the size of your repository? You wrote that it DID reduce the size, but I thought perhaps you meant the other.
In any case, all repositories managed by the same Metadata Server contribute to the total memory under management by the Metadata Server process, which is what is constrained on 32-bit Windows. So simply moving metadata to a custom repository under the same Metadata Server won't help the situation.
If you're still experiencing issues, I would recommend that you open a case with Technical Support.
If you want to remove metadata but have it around in backup form just in case you need it again, the best thing to do is create an export package with unneeded items and then delete those items from the repository. After deleting the items, you'd want to follow the same approach you did before (backup with REORG=YES, MetaCleanse). Export only covers a subset of the metadata object types, however, so this isn't a general solution.
Attempting to get below the 3 GB mark and stay there is a bit like trying to keep your Inbox clean - it's only going to help for a little while. If you're nudging up against this boundary, we'd strongly recommend an upgrade to a 64-bit OS and to 9.2. After doing this, you'll never face this problem again.