
Data integration studio

Contributor
Posts: 33

Data integration studio

Hello all, I am having big problems with Data Integration Studio. We have a large number of tables and jobs, and our metadata backups are 4 GB in size. Recently we have noticed the following problems:

- DI Studio is often very slow, especially when opening the properties of tables or generating code.
- We sometimes receive error messages such as "Error writing metadata" when trying to save jobs. We then have to close the job without saving changes and redo the work.
- Some of our jobs won't open at all anymore. We previously thought this only happened with jobs that were particularly long or complicated, but it has also happened with simpler jobs.
- Sometimes we have problems editing metadata: e.g. when trying to change the library allocation of a table, it reverts back to the library it was in before once the properties window has been closed.

We have tried to reduce the size of our metadata by deleting tables and jobs which are no longer needed, then performing a "purge" operation through SAS Management Console, and then adding the RUNANALYSIS and REORG options to the %OMABAKUP backup script. Following this we were temporarily able to open jobs which we previously could not. However, this was only short-lived. We have also noticed that the size of our metadata backups has not gone down at all and remains at 4 GB.

Does anyone have any experience of dealing with large amounts of metadata and how it affects performance? Also, what steps do we need to take to get metadata permanently removed and so reduce the size of our backups?

We are using Windows Server 2005, 32-bit. Our clients use Windows XP, 32-bit.

Thanks in advance
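For reference, the backup call we run looks roughly like this; the paths are placeholders for our real configuration directories:

/* Back up the metadata repositories, reclaiming unused disk space  */
/* (REORG=YES) and re-running the usage analysis (RUNANALYSIS=YES). */
%omabakup(DestinationPath="D:\SASBackup",
          ServerStartPath="D:\SAS\Config\Lev1\SASMain",
          RposmgrPath="MetadataServer\rposmgr",
          Reorg=YES,
          RunAnalysis=YES);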
Super Contributor
Posts: 356

Re: Data integration studio

Hi

Windows 2005 server? Didn't realise there was a 2005.

Because you are using a 32-bit Windows machine, you are going to run into limitations/constraints. The SAS Metadata Server is an in-memory process: it loads "modules" when someone accesses that part of the metadata, and you'll hit issues when the process memory reaches around 1.9 GB. Cleaning up your metadata will therefore only do so much. If you are continually having issues, you may need to reconsider your OS.

What SAS version and DI Studio version are you using?

Barry
SAS Employee
Posts: 75

Re: Data integration studio

As suggested, large metadata repositories might cause a problem with a 32-bit OS.

If you continue to have problems with large metadata repositories, consider contacting Tech Support. They might be able to say whether the methods described in this SAS Note would be appropriate in your situation:

http://support.sas.com/kb/33/577.html

I suggest that you check with Tech Support before trying this.
SAS Employee
Posts: 51

Re: Data integration studio

What version of DI Studio are you running? The options for addressing these types of problems differ between 9.1.3 and 9.2.

Thanks,

Tim Stearn
Contributor
Posts: 33

Re: Data integration studio

We are using DI Studio 3.4
Contributor
Posts: 33

Re: Data integration studio

and SAS version 9.1.3
Frequent Contributor
Posts: 80

Re: Data integration studio

We have struck this, tried all the options, and have come up against the limit of the 32-bit operating system and the way the SAS Metadata Server has been (un?)architected to handle it.

We are now planning to move to SAS 9.2 / 64-bit to solve the issues.

Try scheduling a restart of the metadata service each night, as this sometimes helps.
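If it helps to script the shutdown half from SAS rather than from the Windows services console, PROC METAOPERATE can stop the server. This is only a sketch; the host, port, and credentials are placeholders for your own environment, and your scheduler still has to start the service again afterwards:

/* Stop the metadata server so the nightly scheduler can restart it. */
/* Connection details below are placeholders.                        */
proc metaoperate
  server="your-metadata-host"
  port=8561
  userid="sasadm"
  password="********"
  protocol=bridge
  action=stop;
run;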

Also try to get rid of any dependent repositories, if you have any.

One site I know of started with a fresh metadata repository and recreated all their content; that worked, but it was a little time-intensive ;-)

Or upgrade to 64-bit Windows, which probably means moving to SAS 9.2 at the same time.
SAS Employee
Posts: 51

Re: Data integration studio

Thanks for the version information. I'll get back to you with possible follow-ups shortly.
SAS Employee
Posts: 51

Re: Data integration studio

Our Technical Support division helpfully reminded me of the METACLEANSE procedure and a plug-in for DI Studio. Both of these can be used to clean up unused metadata, thereby reducing the size of the repository. Please see the following SAS Note for detailed instructions: http://support.sas.com/kb/33/577.html.

As others commented, another option is to determine if you have any unused Custom or Project repositories. All repositories (not just Foundation) contribute to the total amount of memory being managed by the server, so if there are unused repositories that you can remove (or perhaps just "Unregister"), that will help to eliminate the memory issues.
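If you want a quick inventory of what the server is carrying, one way is to ask it for its repository list with PROC METADATA. This is only a sketch; the connection options are placeholders for your own environment:

/* List every repository registered on the metadata server.      */
/* With no OUT= fileref, the response XML is written to the log. */
proc metadata
  server="your-metadata-host"
  port=8561
  userid="sasadm"
  password="********"
  protocol=bridge
  in='<GetRepositories><Repositories/><Flags>0</Flags><Options/></GetRepositories>';
run;

The response lists each repository by name, which makes it easier to spot Project or Custom repositories that nobody is using anymore.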

If you go through these procedures and still find yourself close to the "memory ceiling" for 32-bit Windows, please post again.

Thanks,

Tim Stearn
SAS Employee
Posts: 36

Re: Data integration studio

There are a couple of other settings to try, to see if they help. Be sure to shut down the application before changing these settings, then restart it after they are set. Also be very careful with the syntax: there are dash and spacing requirements, so it is best to copy/paste the settings in as-is so that you don't miss anything.

1. Locate your distudio.ini or etlstudio.ini file, which will be in your install directory, and add a setting like this to it:

9.1 syntax:
CommandLineArgs=-XX:MaxPermSize=128m -DentityExpansionLimit=1000000

9.2 syntax:
JavaArgs_N=-XX:MaxPermSize=128m

where N is a number higher than the other numbers already in the file; here is an example:

JavaArgs_12=-XX:MaxPermSize=128m

This setting allows the JVM's class buffer (the permanent generation) to grow to accommodate more content. As more classes are loaded, which happens at startup of the application and as you continue to work with it, this buffer can fill up; without the setting it won't expand enough to accommodate all of the code.

2. You can also add this parameter if it is not already in that same ini file:

9.1 syntax:
CommandLineArgs=-XX:MaxPermSize=128m -DentityExpansionLimit=1000000

9.2 syntax:
JavaArgs_N=-DentityExpansionLimit=1000000

Here is some info on this second setting: http://java.sun.com/j2se/1.5.0/docs/guide/xml/jaxp/JAXP-Compatibility_150.html

This works around a Sun Java issue. We put the workaround into the ini file in a hotfix, but if you are not at the current hotfix level, the setting might not be in your file.
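Putting the two settings together, the relevant lines of a 9.2 ini file might end up looking like this (the slot numbers 12 and 13 are only examples; use numbers that aren't already taken in your file):

JavaArgs_12=-XX:MaxPermSize=128m
JavaArgs_13=-DentityExpansionLimit=1000000

On 9.1, both arguments go on the single CommandLineArgs line instead, as shown above.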
Contributor
Posts: 33

Re: Data integration studio

Hello

I tried the METACLEANSE options and it has reduced our repository size at all. The other idea was to create a new repository and then migrate what we need over. If we keep the metadata in another custom repository, will we still get memory issues? The reason we are keen to keep the other stuff is just in case we need it. Thanks
SAS Employee
Posts: 51

Re: Data integration studio

Hi,

Did you mean to say in your last post that you tried METACLEANSE and that it did NOT reduce the size of your repository? You wrote that it DID reduce the size, but I thought perhaps you meant the opposite.

In any case, all repositories managed by the same Metadata Server contribute to the total memory under management by the Metadata Server process, which is what is constrained on 32-bit Windows. So simply moving metadata to a custom repository under the same Metadata Server won't help the situation.

If you're still experiencing issues, I would recommend that you open a case with Technical Support.

Thanks,

Tim Stearn
Contributor
Posts: 33

Re: Data integration studio

Hi

Sorry, I meant that it had not reduced the size. Where would be the best place to delete the metadata and export the unwanted stuff out?
SAS Employee
Posts: 51

Re: Data integration studio

If you want to remove metadata but have it around in backup form just in case you need it again, the best thing to do is create an export package with unneeded items and then delete those items from the repository. After deleting the items, you'd want to follow the same approach you did before (backup with REORG=YES, MetaCleanse). Export only covers a subset of the metadata object types, however, so this isn't a general solution.

Attempting to get below the 3 GB mark and stay there is a bit like trying to keep your Inbox clean - it's only going to help for a little while. If you're nudging up against this boundary, we'd strongly recommend an upgrade to a 64-bit OS and to 9.2. After doing this, you'll never face this problem again.

Thanks,

Tim Stearn