08-17-2013 08:01 AM
to start a list:
- making a directory listing, either with SAS functions or with external commands, and using that listing as input, either in macros or as input files
- using wildcards in the INFILE statement
- using the FILENAME= option on the INFILE statement
- using SAS macros to build generated-source loops
- using a data step to generate SAS source that can be %included later
- PROC IMPORT
- XML input processing
- using XML Mapper to define the XML processing
- using the PC Files Server to import Windows files
- Enterprise Guide generating some code...
All kinds of combinations and variations on those, plus options dedicated to the type of files being imported.
08-17-2013 08:10 AM
08-17-2013 10:21 AM
The LIBNAME statement to Excel supports sheet names as tables:
SAS/ACCESS(R) 9.4 Interface to PC Files: Reference (SAS LIBNAME Statement Syntax for PCFILES Engine, Example 2: Assigning a Libref to a Microsoft Excel Workbook)
PROC IMPORT has a SHEET= option:
SAS/ACCESS(R) 9.4 Interface to PC Files: Reference (Microsoft Excel Workbook Files)
The XML approach is more work for me to investigate, as I have no experience with that on the Excel side.
Having it as a libname makes the sheets visible as dataset tables.
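A minimal sketch of both routes (the path, libref, dataset names, and sheet name are made-up examples; SAS/ACCESS to PC Files is assumed):

```sas
/* LIBNAME route: the workbook becomes a library, sheets become tables */
libname xl excel "C:\data\sales.xlsx";
data work.jan;
   set xl.'Jan$'n;          /* worksheet names carry a trailing $ */
run;
libname xl clear;

/* PROC IMPORT route, using the SHEET= option */
proc import datafile="C:\data\sales.xlsx"
     out=work.jan dbms=excel replace;
   sheet="Jan";
run;
```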
03-01-2014 09:00 AM
Needed to recap the approach.... So here it is.
SAS functions, dir-listing: having a list of files, it is possible to execute generated SAS code.
There is no need to do that, though, as you have the FILENAME= option and wildcards in file names; see the documentation for an explanation of wildcard/filename usage.
An example of using wildcards to read the small files can look like:
filename tjaka "E:/sastest" ;
data work.filelst (keep=filenm) ;
   length filenmc filenm $255 ;
   infile tjaka('*.txt') filename=filenmc end=flend eov=frstrcr lrecl=32767 recfm=F truncover;
   input @1 var1 $1. ;
   if _n_ = 1 or frstrcr then do;  /* first record of a non-empty file (EOV= is not set for the very first file) */
      filenm = filenmc;            /* remember the physical file name */
      frstrcr = 0;                 /* reset the EOV flag */
      output;
   end;
run;
Instead of CALL EXECUTE (see the 9.3 documentation) there could be an INPUT statement processing the files; with 9.4 you could use a DOSUBL call instead.
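A sketch of both variants, assuming the work.filelst dataset (variable filenm) from the wildcard example above; the out1, out2, ... dataset names and the generated INPUT statement are made-up placeholders:

```sas
/* CALL EXECUTE queues one generated step per file; the steps run
   after this data step finishes */
data _null_;
   set work.filelst;
   call execute(cats('data out', _n_,
      '; infile "', filenm, '" lrecl=32767 truncover;',
      ' input line $char200.; run;'));
run;

/* 9.4 alternative: DOSUBL runs the generated code immediately */
data _null_;
   set work.filelst;
   rc = dosubl(cats('data out', _n_,
      '; infile "', filenm, '" lrecl=32767 truncover;',
      ' input line $char200.; run;'));
run;
```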
The approach with DOPEN/DNUM/DREAD/DCLOSE is an alternative way of processing a complete directory. Testing on file names is necessary when needed.
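A sketch of that directory-function approach (the path is a made-up example; the .txt test shows the file-name check mentioned above):

```sas
/* Build a directory listing with the SAS file functions */
data work.dirlist (keep=filenm);
   length filenm $255;
   rc  = filename('mydir', 'E:/sastest');   /* assign a fileref to the directory */
   did = dopen('mydir');                    /* open the directory                */
   if did > 0 then do;
      do i = 1 to dnum(did);                /* DNUM: number of members           */
         filenm = dread(did, i);            /* DREAD: i-th member name           */
         if upcase(scan(filenm, -1, '.')) = 'TXT' then output;  /* test on file name */
      end;
      rc = dclose(did);
   end;
   rc = filename('mydir');                  /* clear the fileref                 */
run;
```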
03-01-2014 04:04 PM
If your worksheet names are SAS compliant, here's a way, originally from Nat Wooding on SAS-L:
Libname in excel 'C:\Park\bhupindertest.xlsx';  /* EXCEL engine (SAS/ACCESS to PC Files) */
Proc contents data=in._all_ out=a noprint;
run;
Data _null_;
   set a (keep=memname) end=eof;
   name = compress(memname, '$');
   name = translate(trim(name), '_', ' ');
   str = compbl("Data " || name || "; set in.'" || memname || "'n;");
   Call Execute(str);
   if eof then do;
      str = 'run;';
      call execute(str);
   end;
run;
03-02-2014 12:15 PM
Hi! A somewhat different solution: Some years ago I had to import data from many different Excel-sheets (with numbers as both numbers and as text) into SAS tables. The standard SAS tools did not work very well at that installation.
1) Export each Excel sheet to a .CSV file. When I looked at such a file in Notepad I could see all the data, written in different notations (E-format, decimal comma, etc.).
2) Then I wrote a SAS data step program that did, in a loop:
Read one line into a LONG character variable. Search this variable for the first semicolon. Extract all characters before it into a variable (strip unnecessary extra characters, use the E-format, etc., and read the result into a SAS numeric variable).
Then remove everything in the long character variable up to and including the first semicolon.
Bottom line: it took 2-3 days to write the code. A VERY fast and reliable way to convert the Excel sheets into a SAS table. Fast to execute - only a short coffee break for each run.
This may sound like a very rough approach. However, it worked very nicely - when nothing else worked!
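A minimal sketch of that parsing loop, assuming semicolon-delimited files with decimal commas; the file path and variable names are made-up placeholders (the original code is not in the post):

```sas
/* Parse each line of the exported CSV field by field */
data work.fromcsv (keep=col value);
   infile 'C:\data\sheet1.csv' lrecl=32767 truncover;
   length line $32767 token $100;
   input line $char32767.;
   do col = 1 to countw(line, ';', 'M');      /* 'M': count empty fields too */
      token = scan(line, col, ';', 'M');      /* field before the next ;     */
      token = translate(strip(token), '.', ','); /* decimal comma -> point   */
      value = input(token, ?? best32.);       /* ?? suppresses error notes;  */
                                              /* BEST32. also reads E-format */
      output;
   end;
run;
```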
I hope you find this useful.
/ Br Anders