We are having serious problems with Data Integration Studio: we have a large number of tables and jobs, and our metadata backups are 4 GB in size.
Recently we have noticed the following problems:
- DI Studio is often very slow, especially when opening table properties or generating code.
- We sometimes receive error messages such as "Error writing metadata" when trying to save jobs. We then have to close the job without saving and redo the work.
- Some of our jobs won't open at all anymore. We previously thought this only happened with jobs that were particularly long or complicated, but it has also happened with simpler ones.
- Sometimes we have problems editing metadata: for example, when we try to change the library assignment of a table, it reverts to the previous library once the properties window is closed.
We have tried to reduce the size of our metadata by deleting tables and jobs that are no longer needed, performing a "purge" operation through SAS Management Console, and then adding the RUNANALYSIS and REORG options to the %OMABAKUP script.
Following this we were temporarily able to open jobs that previously would not open.
However, this was short-lived. We have also noticed that the size of our metadata backups has not gone down at all and remains at 4 GB.
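For reference, our backup job calls the %OMABAKUP macro roughly like this (the paths below are placeholders, not our real directories, and I am quoting the options from memory):

```sas
/* Sketch of our nightly metadata backup call -- paths are placeholders.
   Reorg=YES is supposed to rebuild the repository data sets to reclaim
   space from deleted objects; RunAnalysis=YES reports on the space used. */
%omabakup(DestinationPath="D:\MetadataBackups",
          ServerStartPath="D:\SAS\MetadataServer",
          RposmgrPath="MetadataServer\rposmgr",
          Reorg=YES,
          RunAnalysis=YES);
```

Even with Reorg=YES in place, the backup size has stayed at 4 GB, which is what makes us think the deleted objects are not actually being removed from the repository.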
Does anyone have experience of dealing with large amounts of metadata and how this affects performance? Also, what steps do we need to take to permanently remove metadata and so reduce the size of our backups?
Hi, we use SAS 9.1.3 and I think DI Studio is version 4.3. The metadata server is on 32-bit Windows Server 2005, and each client runs 32-bit Windows XP. We have around 10 developers editing against the metadata server and another 7 reading from it.