02-14-2014 04:29 PM
There are people with SAS datasets in the terabyte range. Output disk space and run time will likely be the issue.
I would recommend using an OBS option to import just a few hundred records first, to make sure the formats and variables look right before loading the whole thing. Nothing like waiting a few hours for an import to finish only to discover a variable came in with 6 characters instead of 12...
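A minimal sketch of that approach, assuming a pipe-delimited file named big.dat (both the filename and the delimiter are assumptions; adjust to the real layout). The OBS= system option caps how many observations subsequent steps read:

```sas
/* Limit every following step to 200 observations */
options obs=200;

proc import datafile='big.dat'
    out=work.sample
    dbms=dlm
    replace;
    delimiter='|';   /* assumption: adjust to the actual delimiter */
    getnames=no;
run;

/* Inspect lengths, types, and formats before committing to the full load */
proc contents data=work.sample;
run;

/* Reset before reading the whole file */
options obs=max;
```
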
02-14-2014 05:45 PM
You can read it as long as you have enough disk space. If it is just a text file, you are better off writing the code to read it yourself. Alternatively, take a small set of sample records, run them through PROC IMPORT, pull back the generated code, clean it up, and point it at the large file.
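One way to sketch Tom's suggestion, assuming a small extract named sample.dat with the same layout as the full file (filename and variables below are hypothetical): PROC IMPORT writes the DATA step it generates to the log, so you can copy it, correct it, and aim it at the large file:

```sas
/* Step 1: let PROC IMPORT guess the layout from a small extract */
proc import datafile='sample.dat'
    out=work.test
    dbms=dlm
    replace;
    getnames=no;
run;

/* Step 2: copy the generated DATA step from the log, fix any wrong
   informats or lengths, then point it at the full file */
data work.full;
    infile 'big.dat' lrecl=32767 dsd truncover;
    /* input statement pasted and corrected from the log;
       these variables are hypothetical */
    input var1 :$12. var2 :8. var3 :date9.;
run;
```
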
02-14-2014 06:08 PM
Thank you, Tom. It is a .dat file. I have the data library which tells the name, position, length and type of each variable.
I used a DATA step with an INFILE statement and OBS=50, but it still took an unbearably long time to get a result.
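Since the record layout gives the name, starting position, length, and type of each variable, fixed-column input in a DATA step is the direct route. A minimal sketch with hypothetical variable names and column positions (replace them with the ones from the layout document):

```sas
data work.sample;
    infile 'big.dat' lrecl=100000 obs=50;
    /* hypothetical layout: substitute the real positions and types */
    input
        id        $ 1-12    /* character, columns 1-12  */
        visit_dt    13-20   /* numeric,   columns 13-20 */
        amount      21-32   /* numeric,   columns 21-32 */
    ;
run;
```
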
02-14-2014 06:27 PM
OBS=50 on INFILE statement should cause it to stop pretty quickly.
You could try coding a STOP statement instead.
data small ;
   infile 'big.dat' lrecl=100000 obs=50;
   input /* your variables here */ ;
run;

data small ;
   infile 'big.dat' lrecl=100000 ;
   if _n_ > 50 then stop;
   input /* your variables here */ ;
run;