SAS Data Integration Studio, DataFlux Data Management Studio, SAS/ACCESS, SAS Data Loader for Hadoop and others

Import large csv problem

Occasional Contributor
Posts: 7

Import large csv problem

Hi all,
I'm trying to import a CSV file with 4 million rows and ~150 columns, and I don't know the order of the columns.
Running GUESSINGROWS over a file that size is impractical.
And I get truncated character columns.

Any ideas what can I do?
Super User
Posts: 7,762

Re: Import large csv problem

Very simple: write a data step according to the file description.

To get help with this, it's best to post that description here.
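For instance, if the description said the file had an ID, a date, and a name, the data step might look like this (file name, column names, informats, and lengths are all made up for illustration and must be adjusted to the real description):

```
/* Hypothetical sketch - replace names, informats, and lengths
   with what the actual file description says */
data want;
    infile 'yourfile.csv' dsd truncover firstobs=2;
    input id     : 8.
          mydate : yymmdd10.
          name   : $50.;
    format mydate yymmdd10.;
run;
```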

---------------------------------------------------------------------------------------------
Maxims of Maximally Efficient SAS Programmers
Occasional Contributor
Posts: 7

Re: Import large csv problem

As I wrote, I don't know the order, and sometimes not even the description, of the columns... because of this I can't write a data step....
Super User
Posts: 7,762

Re: Import large csv problem

Then DEMAND the description from whoever gave you that file.

Your only other option is to run proc import with a large enough guessingrows value, and hope for the best.

Unless you want to inspect all rows with the good old eyeballs Mk 1.

---------------------------------------------------------------------------------------------
Maxims of Maximally Efficient SAS Programmers
Super User
Posts: 19,770

Re: Import large csv problem

Posted in reply to KurtBremser

There isn't much of a choice then. I suggest the standard process of using proc import to generate code. Copy the code from the log and customize it until the errors are gone.

 

I would recommend setting GUESSINGROWS to 1 million for the initial proc import.

You can limit the total number of rows read using the OBS system option:

options obs=1500000;

Then reset it afterwards:

options obs=max;
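Put together, a first pass along these lines might look like this (the file path and output dataset name are placeholders):

```
options obs=1500000;

proc import datafile='myfile.csv'
    out=work.first_pass
    dbms=csv
    replace;
    guessingrows=1000000;
run;

options obs=max;
```

The generated data step in the log can then be copied, and the lengths and informats adjusted by hand.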

Occasional Contributor
Posts: 7

Re: Import large csv problem

I tried GUESSINGROWS with 4 million... it never finishes.😃
Super User
Posts: 19,770

Re: Import large csv problem

If 1 million doesn't get it, that would be very surprising.

I suggested 1.5 million.

Super User
Posts: 7,762

Re: Import large csv problem


evgenys wrote:
I tried GUESSINGROWS 4 million ...it doesn't end.😃

Oh yeah, 4 million rows * 150 columns takes a lot of time to inspect. It is what happens when you try something stupid.

Get back to the data source and demand the description, and tell them that the data is useless without it.

---------------------------------------------------------------------------------------------
Maxims of Maximally Efficient SAS Programmers
Occasional Contributor
Posts: 7

Re: Import large csv problem

😃
I did that first... but until the description arrives, I have to cope with the file as it is...
PROC Star
Posts: 1,167

Re: Import large csv problem

This isn't an easy problem!

 

Do any of your fields have internal commas? If not, then you could use (untested):

 

data _null_;
    retain MaxCols 0;
    infile x end=LastRec; /* x = fileref pointing at your CSV */
    input;
    ColCount = count(_infile_, ",") + 1;
    if ColCount > MaxCols then MaxCols = ColCount;
    /* symputx strips blanks and accepts a numeric value directly */
    if LastRec then call symputx("ColCount", MaxCols);
run;

 

This should give you the highest number of columns in your file.

 

Then maybe something along the lines of:

 

%macro GetLen;
data _null_;
    retain MaxLen 0;
    infile x end=LastRec;
    input;
    %do i = 1 %to &ColCount;
        /* "m" modifier makes consecutive commas count as empty fields */
        ColLen = length(scan(_infile_, &i, ",", "m"));
        if ColLen > MaxLen then MaxLen = ColLen;
    %end;
    if LastRec then call symputx("ColLen", MaxLen);
run;
%mend;
%GetLen;


And finally:

 

%macro GetData;
data Want;
    length Col1-Col&ColCount. $ &ColLen.;
    infile x;
    input;
    %do i = 1 %to &ColCount;
        Col&i. = scan(_infile_, &i, ",", "m");
    %end;
run;
%mend;
%GetData;

 

Good luck!

Tom

Occasional Contributor
Posts: 7

Re: Import large csv problem

Thanks Tom. I'll try it tomorrow.
Super User
Super User
Posts: 7,039

Re: Import large csv problem

[ Edited ]

You don't need to know what is in a CSV file to read it in as character strings. If you don't even know how many columns there are just use a larger number (150 in the example below) than you expect.  If the last column is not empty then increase the number and read the file again.

 

data temp (compress=yes);
   infile 'myfile.csv' dsd truncover firstobs=2 ;
   length x1-x150 $200 ;
   input x1-x150;
run;

You can then analyze the character strings yourself and make your own decision on what is in it.  

  • Find the maximum length (if you find any with maximum length close to 200, you might want to use more than $200 in the step above).
  • Check whether they can be converted to a number by using the INPUT function with the COMMA32. informat.
  • Check whether they can be converted to a date, time, or datetime by using the ANYDTDTE, ANYDTTME, and ANYDTDTM informats, respectively.

If the first line has variable names then you could read that line in separately and use it to rename the variables.
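The checks in the list above could be sketched roughly like this (variable names are placeholders; the `??` modifier on the INPUT function suppresses invalid-data messages, so a missing result means the conversion failed):

```
data checks;
    set temp;
    /* maximum observed length of x1 (LENGTHN returns 0 for blanks) */
    len1 = lengthn(x1);
    /* numeric check: num1 is missing when x1 is not a valid number */
    num1 = input(x1, ?? comma32.);
    /* date check: dt1 is missing when x1 is not a recognizable date */
    dt1 = input(x1, ?? anydtdte32.);
run;
```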

Respected Advisor
Posts: 4,919

Re: Import large csv problem

Here is a quick way to take a 1% random sample of the records using a line pointer control in the input statement

 

data sample;
    length str $200;
    infile "&sasforum\datasets\frame.csv" truncover line=lineNo;
    linePt = ceil(2 * 100 * rand("uniform"));
    input #linePt str &;
    line + lineNo;
    keep line str;
run;
PG
Super User
Posts: 11,343

Re: Import large csv problem

My rough stab would be to run proc import with GUESSINGROWS in the 32000 range, then save and inspect the generated data step code.

I would likely increase the lengths of the character variables, in case the generated $73 or such doesn't quite work for all of them; adding around 10 percent is likely enough.

 

But if you don't have a document that says what any of the columns are then what are you going to do with this data file? Unless you have column headers long enough to be explanatory it may be hard to tell what anything else is.

Occasional Contributor
Posts: 7

Re: Import large csv problem

Let's see... you have software whose output CSV file can change. You don't know what you will get... and you can't change your import code every time. That's why you need a suitable generic approach.
Discussion stats
  • 19 replies
  • 1231 views
  • 3 likes
  • 7 in conversation