nickbarrett
Calcite | Level 5

OK, I know how to export a table to CSV, and I have found a BAT-file way to split the CSV into 5000-row chunks, but it does not keep the header for each file, just the first file. So I am looking for a do loop that will export:

 

proc export DATA=clean_table
   OUTFILE="C:\Table_2019_04_02.csv"
   DBMS=DLM REPLACE;
   DELIMITER=',';
run;

 

Depending on the day, the source file could contain between 3000 and 200000 rows of data. I would love the files to be exported like this:

 

clean_table 14654 rows

 

Table_2019_04_02-01.csv (5000 rows)

Table_2019_04_02-02.csv (5000 rows)

Table_2019_04_02-03.csv (4654 rows)

 

Any help would be appreciated! Thanks! 

3 REPLIES
ballardw
Super User

Not a generic solution but should work for the request.

 

proc export DATA=clean_table (firstobs=1 obs=5000)
   OUTFILE="C:\Table_2019_04_02_01.csv"
   DBMS=DLM REPLACE;
   DELIMITER=',';
run;
proc export DATA=clean_table (firstobs=5001 obs=10000)
   OUTFILE="C:\Table_2019_04_02_02.csv"
   DBMS=DLM REPLACE;
   DELIMITER=',';
run;
proc export DATA=clean_table (firstobs=10001 )
   OUTFILE="C:\Table_2019_04_02_03.csv"
   DBMS=DLM REPLACE;
   DELIMITER=',';
run;

Firstobs specifies the number of the first observation to process and OBS specifies the number of the last observation to process.

When OBS= is not specified, the step reads through to the last observation, so the final file picks up whatever rows remain.
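Since the row count varies from day to day, the hard-coded steps above can be wrapped in a macro %DO loop that computes the FIRSTOBS=/OBS= boundaries for you. A sketch along these lines (the macro name and parameters are mine, not tested against your data):

```sas
%macro split_export(ds, prefix, size=5000);
  %local dsid nobs rc nfiles i first;

  /* number of observations in the input dataset */
  %let dsid=%sysfunc(open(&ds));
  %let nobs=%sysfunc(attrn(&dsid,nlobs));
  %let rc=%sysfunc(close(&dsid));

  /* number of output files needed */
  %let nfiles=%sysfunc(ceil(&nobs/&size));

  %do i=1 %to &nfiles;
    %let first=%eval((&i-1)*&size+1);
    /* OBS= past the end of the data just reads to the last row */
    proc export data=&ds (firstobs=&first obs=%eval(&i*&size))
       outfile="&prefix.-%sysfunc(putn(&i,z2.)).csv"
       dbms=dlm replace;
       delimiter=',';
    run;
  %end;
%mend split_export;

%split_export(clean_table, C:\Table_2019_04_02)
```

Each PROC EXPORT run writes its own header row, which is the behavior the BAT split was missing.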

 

 

Tom
Super User

Why do you want to split them?

For such a simple split you can just do it with a single data step.

Here is one that will put in the header row for you. Make sure your real dataset does not already include the variable names used here (_EOF, FNAME, _FILE, _P, _NOBS).

%let path=%sysfunc(pathname(work));
%let n=5;
%let dsn=sashelp.class;

proc transpose data=&dsn(obs=0) out=names;
var _all_;
run;

data _null_;
  do _n_=1 to &n ;               /* write up to &n observations per file */
    if _n_=1 then link header;   /* new file: write the header row first */
    set &dsn end=_eof;
    put (_all_) (+0);
  end;
  if _eof then stop;
return;
header:
  length fname $200 ;
  _file+1;                       /* file counter used in the name */
  fname=cats("&path/file",_file,'.csv');
  putlog fname=;
  file out filevar=fname dsd ;   /* open the next output file */
  do _p=1 to _nobs;              /* write the variable names from NAMES */
    set names point=_p nobs=_nobs;
    put _name_ @;
  end;
  put;
return;
run;

 

Reeza
Super User

@Tom's solution is much more efficient than mine.

 

This macro splits the file into pieces for you, and then you can export them. You could modify the macro to do the export as the last step as well.

 

https://gist.github.com/statgeek/abc3c6ce1dbeedb84fe7f11da0603cda

 

If you're new to macro programming, I recommend the UCLA tutorials, and there's a tutorial on my GitHub main page (you can find it from the Gist page) that walks through the steps to convert a working program to a macro. It's not great but it's there 🙂

 

Also, you can do this directly within a data step, using the FILEVAR= option on the FILE statement, which lets you change the output file on the fly. This is essentially Tom's solution.
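A minimal sketch of the FILEVAR= idea (chunk size hard-coded to 5, sashelp.class as stand-in data, no header row; Tom's step above shows how to add headers):

```sas
data _null_;
  set sashelp.class;
  length fname $200;
  retain fname;
  /* start a new output file every 5 observations */
  if mod(_n_-1, 5) = 0 then
    fname = cats("%sysfunc(pathname(work))/part", ceil(_n_/5), ".csv");
  file csvout filevar=fname dsd;   /* reopens whenever FNAME changes */
  put Name Sex Age Height Weight;  /* explicit list so FNAME itself is not written */
run;
```

FILEVAR= tells SAS to treat the FILE statement's target as whatever path the variable currently holds, closing and opening files as the value changes.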

 


 


Discussion stats
  • 3 replies
  • 1979 views
  • 1 like
  • 4 in conversation