
How to handle large data

Contributor
Posts: 53

Hello, I am working with a very large data set of somewhere over 50 million records. The code I have attached works OK when I subset out a month, but it will not work on a much larger chunk of the data.

What is the best way for me to do this on the entire dataset, which consists of 5 years of data?

Is sorting and then using a DATA step my best option?

Thank you so much for all your expertise.

proc sql;
    create table ambient_temp_dc as
    select RNH, RN,
           datepart('occur date'n) as date,
           hour('occur date'n) as hour,
           avg(basic_AT) as avg_ambient
    from rmdeoap.gets_dw_eoa_flt_dccca_v
    /* 'occur date'n is a datetime, so compare it against a datetime constant */
    where RNH = 'CSX' and 'occur date'n ge '01Nov2014:00:00:00'dt
    group by RNH, RN, date, hour;
quit;


All Replies
Super User
Posts: 6,936

Re: How to handle large data

Try to sort your existing dataset first.

proc sort
    data=rmdeoap.gets_dw_eoa_flt_dccca_v (
      where=(RNH = 'CSX' and 'occur date'n ge '01Nov2014:00:00:00'dt)
    )
    out=tempdata;
  /* sorting by the datetime variable also puts the data in date and hour order */
  by RNH RN 'occur date'n;
run;

If that succeeds, the data step (or proc means) to build the average is a breeze.
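Not part of the original reply, just a minimal sketch of what that follow-up DATA step could look like, assuming the proc sort above produced work.tempdata ordered by RNH, RN, and 'occur date'n; the view name and the accumulator variables are illustrative only.

/* a view adds the hour boundary so it can be used in a BY statement */
data hourly_v / view=hourly_v;
  set tempdata;
  occur_hour = intnx('hour', 'occur date'n, 0);  /* truncate the datetime to the hour */
  format occur_hour datetime19.;
run;

data ambient_temp_dc (keep=RNH RN date hour avg_ambient);
  set hourly_v;
  by RNH RN occur_hour;              /* already in this order after the sort */
  if first.occur_hour then do;       /* reset the accumulators for each new hour */
    sum_at = 0;
    n_at   = 0;
  end;
  sum_at + basic_AT;                 /* sum statements retain and ignore missing values */
  n_at   + (not missing(basic_AT));
  if last.occur_hour then do;        /* one output row per RNH*RN*date*hour group */
    date = datepart(occur_hour);
    hour = hour(occur_hour);
    if n_at > 0 then avg_ambient = sum_at / n_at;
    output;
  end;
  format date date9.;
run;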

Note that the where condition on the input data set reduces the size of the utility file built while sorting, so you may escape your disk full problem. proc sql has the nasty habit of throwing all the data into one big utility file and doing all its stuff there, causing a tankerload of competing disk accesses that tend to slow down systems to a crawl and often cause unnecessary disk space problems.

proc sort can be made faster and more stable by using the UTILLOC= system option (set when SAS starts) to assign a separate physical disk for the utility files.
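For context (not from the original reply): UTILLOC= can only be set at SAS invocation, for example on the command line or in the configuration file; the path below is only a placeholder.

# command-line example on UNIX (placeholder path)
sas -utilloc /fastdisk/sasutil myprogram.sas

# or add a line like this to the sasv9.cfg configuration file
-UTILLOC /fastdisk/sasutil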

---------------------------------------------------------------------------------------------
Maxims of Maximally Efficient SAS Programmers
Accepted Solution
11-27-2014 02:32 AM
Super User
Posts: 17,828

Re: How to handle large data

50 million records isn't that big, so there are various ways to handle this. I'd start off by trying a standard proc means even without the sort, though you may need it.

You can use a format such as DATETIME12. to group the data by hour, rather than calculating the hour and date up front; you can still split them back out afterwards if you really need them as separate variables.

If none of the above solutions work, you can always process the data a month at a time.

proc means data=rmdeoap.gets_dw_eoa_flt_dccca_v
    (where=(RNH='CSX') keep=RNH RN 'occur date'n basic_AT) noprint;
  class RNH RN 'occur date'n;
  types RNH*RN*'occur date'n;        /* keep only the full crossing */
  format 'occur date'n datetime12.;  /* the format groups the datetime by date and hour */
  output out=ambient_temp_dc mean(basic_AT)=avg_ambient;
run;
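Two follow-ups to the reply above, neither of them part of the original post. First, if you really do want separate date and hour columns, a small DATA step on the PROC MEANS output can split them back out (the output still stores the class variable as a datetime):

data ambient_temp_dc;
  set ambient_temp_dc;
  date = datepart('occur date'n);  /* calendar date of the hourly group */
  hour = hour('occur date'n);      /* hour of day of the hourly group */
  format date date9.;
run;

Second, a hedged sketch of the month-at-a-time fallback: the macro name, the date range, and the assumption that work.ambient_temp_dc does not already exist are all placeholders for illustration.

%macro by_month(start=01NOV2009, end=01NOV2014);
  %local start_d n_months i mstart mend;
  %let start_d  = %sysfunc(inputn(&start., date9.));
  %let n_months = %sysfunc(intck(month, &start_d., %sysfunc(inputn(&end., date9.))));

  %do i = 0 %to %eval(&n_months. - 1);
    %let mstart = %sysfunc(intnx(month, &start_d., &i., b));  /* first day of month &i. */
    %let mend   = %sysfunc(intnx(month, &start_d., &i., e));  /* last day of month &i. */

    proc means data=rmdeoap.gets_dw_eoa_flt_dccca_v
        (where=(RNH='CSX' and datepart('occur date'n) between &mstart. and &mend.)
         keep=RNH RN 'occur date'n basic_AT) noprint;
      class RNH RN 'occur date'n;
      types RNH*RN*'occur date'n;
      format 'occur date'n datetime12.;
      output out=one_month mean(basic_AT)=avg_ambient;
    run;

    /* stack the monthly summaries into one table */
    proc append base=ambient_temp_dc data=one_month force;
    run;
  %end;
%mend by_month;

%by_month()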



Super User
Posts: 9,681

Re: How to handle large data

I prefer a DATA step + array approach.
