I am processing a large amount of data, and during the processing many intermediate data sets are created that are needed for the calculations but not for the final results. Because of I/O speed limitations (and storage limitations) I have broken the data into smaller chunks, and to improve speed I have defined an in-memory library using MEMLIB. I use a DATA step with CALL EXECUTE to run a macro on each chunk of data. The data sets required for the calculations take roughly 100 MB and are replaced in each iteration.

The problem is that SAS crashes when I run the DATA step. I am sure it is not the storage taken by the data sets. I tested by limiting the observations in the DATA step (i.e. the number of iterations): with obs=25, SAS uses about 140 MB of memory on average; with obs=200 it uses about 1.2 GB; and above 200 it crashes. When I run the same 300 iterations as separate DATA steps of 25 observations each, memory stays around 140 MB, so it is not the data sets held in memory. But with obs=300, memory usage climbs above 1.2 GB within seconds of starting and SAS crashes and closes. I suspect SAS does not know how much memory the macro will use, so it buffers the data based on the data set, and when the macro starts it crashes. I could not solve the problem with BUFSIZE or BUFNO.

Here is the DATA step:
data _null_;
  set ric_date(obs=200);
  /* make the current ric_date value available to the macro as &taq_day */
  call symput('taq_day', ric_date);
  /* build the matching market data set name: Mkt_ plus the part of ric_date after the underscore */
  AXJO_date = catt('Mkt_', substr(ric_date, index(ric_date, '_') + 1));
  call symput('Mkt_day', AXJO_date);
  /* run %HFEQ_ALL for this chunk only when the two substrings match */
  if substr(ric_date, index(ric_date, '_') + 1) = substr(AXJO_date, index(ric_date, '_') + 1) then
    call execute('%HFEQ_ALL');
  /*%HFEQ_ALL*/
run;
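
For context, the in-memory library is set up along these lines (the library name and path below are illustrative, not my actual ones):

/* MEMLIB keeps the library's data sets in memory (Windows) */
libname fastlib "C:\temp\memdata" memlib;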
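
In case it matters, the 25-at-a-time test just sliced ric_date with FIRSTOBS= and OBS= and ran the same step repeatedly; a sketch of one slice (the boundaries shown are an example):

data _null_;
  /* second slice of 25 observations (26 through 50); the rest of the step is unchanged */
  set ric_date(firstobs=26 obs=50);
run;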
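
What I tried with BUFSIZE and BUFNO was along these lines (the specific values are illustrative):

/* global attempt: change the number and size of I/O buffers */
options bufno=10 bufsize=64k;

/* per-data-set attempt: BUFNO= as a data set option on the SET statement */
data _null_;
  set ric_date(obs=300 bufno=10);
run;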