huge data file import

Contributor
Posts: 36

huge data file import

Hi,

I have a huge .dat file of size 63G. Is it even possible to read into PC-SAS? My computer has 12G memory.

Thanks.

Super User
Posts: 11,343

Re: huge data file import

There are people with SAS data sets in the terabyte range, so 63G is feasible. Output disk space and run time will likely be the issues.

I would recommend using the OBS= option to read just a few hundred records first, to make sure the formats and variables look right before loading the whole thing. Nothing like waiting a few hours for something to finish importing only to discover a variable was imported with 6 characters instead of 12...
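
For instance (just a sketch; the delimiter, variable names, and lengths below are made up since I don't know your layout):

data check ;
  infile 'big.dat' dlm='|' dsd lrecl=32767 obs=300 ;  /* read only the first 300 records */
  length id $12 amount 8 status $6 ;                  /* placeholder variables */
  input id amount status ;
run;

proc contents data=check ; run;        /* check the types and lengths you ended up with */
proc print data=check (obs=20) ; run;  /* eyeball a few values */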

Super User
Posts: 7,067

Re: huge data file import

You can read it as long as you have enough disk space. If it is just a text file then you are better off writing the code to read it yourself. Or take a small set of sample records, run them through PROC IMPORT, pull back the generated code, clean it up, and point it at the large file.
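
Something along these lines (a sketch; the sample file name and delimiter are placeholders):

/* sample.dat = the first few hundred lines cut out of the big file */
proc import datafile='sample.dat'
            out=work.sample
            dbms=dlm
            replace ;
  delimiter='|' ;      /* whatever the file actually uses */
  getnames=no ;
  guessingrows=500 ;
run;

PROC IMPORT writes the DATA step it generated to the log. Copy that, fix any lengths or informats it guessed wrong, and point the INFILE at the 63G file.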

Contributor
Posts: 36

Re: huge data file import

Thank you, Tom. It is a .dat file. I have the record layout, which gives the name, position, length, and type of each variable.

I used a DATA step with an INFILE statement and OBS=50, but it still took an unbearably long time to get a result.
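
The step looks roughly like this (variable names, positions, and informats here are placeholders standing in for the real layout):

data small ;
  infile 'big.dat' lrecl=2000 obs=50 ;
  input @1   id      $char10.    /* name, position, length, type from the layout */
        @11  visitdt yymmdd8.
        @19  score   4. ;
run;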

Super User
Posts: 7,067

Re: huge data file import

OBS=50 on the INFILE statement should cause it to stop pretty quickly.

You could try coding a STOP statement instead.

data small ;
  infile 'big.dat' lrecl=100000 obs=50;
  ....
run;

or

data small ;
  if _n_ > 50 then stop;
  infile 'big.dat' lrecl=100000 ;
  ....
run;
