08-18-2016 08:29 AM
I need help with SAS memory management. I currently have 96 GB of LASR memory, but my production database is huge, so we only keep 90 days of data (the important tables only) in LASR. The LASR memory is now almost full, and because of this the SAS server goes down frequently; when it does, all the data is truncated from memory. Can you suggest how to manage SAS memory, save data on disk rather than in LASR, or optimize the size of the tables in LASR? What is the best way to optimize a table's data size?
proc sql;
  connect to odbc (dsn=inventory user=ashu password='');
  create table libname.test_table as
    select c1 format=$30. length=30
    from connection to odbc
      ( /* pass-through query goes here; the inner SELECT was missing from the original post */ );
  disconnect from odbc;
quit;
08-18-2016 09:14 AM
Saving data on disk removes the possibility to (immediately) analyze it in VA/VS, so whether that's feasible depends on your whole analytical environment/process.
There are a few techniques (not limited to these) that can save memory in LASR; what to do depends on your data, requirements and query patterns:
08-20-2016 07:45 PM - edited 08-20-2016 07:46 PM
I don't have practical experience with it, but it appears that version 2.4 added a SQUEEZE option which allows for compressed in-memory tables.
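For reference, a minimal sketch of loading a compressed table via the SQUEEZE= data set option on a LASR (SASIOLA engine) library. The library name lasr, the host/port values, and the source table work.sales are all assumptions for illustration, not from the original post:

/* Assumes a LASR library assigned via the SASIOLA engine, for example:   */
/* libname lasr sasiola host="lasrserver" port=10010 tag=hps;             */
/* SQUEEZE=YES requests a compressed in-memory table.                     */
data lasr.sales (squeeze=yes);
    set work.sales;   /* work.sales is a placeholder source table */
run;

As noted below, compression trades CPU for memory, so test the query response times before rolling it out.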
08-21-2016 01:37 AM
Compression is available in the more recent versions of SAS VA, but the downside is that it can have a very significant impact on performance and needs to be used with extreme caution. For large tables like the one being discussed it could be a total CPU killer. Have a look at the other options suggested first.
Also, if you don't need row-level detail in VA, consider pre-summarising your data first to reduce its size.
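The pre-summarisation idea can be sketched with PROC SUMMARY before loading into LASR. The data set and variable names (detail.sales, region, sale_date, amount) are made up for illustration:

/* Collapse transaction-level detail to one row per region and day, so only */
/* the much smaller summary table needs to be loaded into LASR memory.      */
proc summary data=detail.sales nway;
    class region sale_date;           /* grouping variables */
    var amount;                       /* analysis variable  */
    output out=work.sales_summary (drop=_type_ _freq_)
           sum=total_amount;
run;

You would then load work.sales_summary into LASR instead of the full detail table.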