12-26-2017 12:49 PM
12-26-2017 02:36 PM
Take a look at the EXIST and FEXIST functions. They tell you whether a SAS data set exists (EXIST) or a file exists (FEXIST).
Inside your %DO loop, you can add an interior %DO group that basically says, "If the file I'm looking for exists, then %DO."
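A minimal sketch of that pattern (the macro name CHECK_LOOP, the dataset prefix, and the number range are made up for illustration; adjust them to your naming scheme):

```sas
%macro check_loop(prefix, start, stop);
  %do i=&start %to &stop;
    /* Only generate code for datasets that actually exist in WORK */
    %if %sysfunc(exist(work.&prefix._&i)) %then %do;
      %put NOTE: Processing &prefix._&i;
      /* ... code to process one dataset goes here ... */
    %end;
  %end;
%mend check_loop;

%check_loop(florida, 101, 150)
```

For external files rather than SAS datasets, FEXIST works the same way but takes a fileref assigned with FILENAME.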
12-26-2017 05:24 PM
You need to provide more information if you want help. Why would a loop that generates DATA step code be any different from a loop that generates PROC IMPORT code?
Why are you looping from a to b if not every number between a and b exists? Why not set it up to loop over a list of valid values instead?
%let valid_list=Florida_101 Texas_469;
%do i=1 %to %sysfunc(countw(&valid_list,%str( )));
  %let next=%scan(&valid_list,&i,%str( ));
  ....
%end;
12-27-2017 02:48 AM
12-27-2017 08:54 AM
Still not sure I understand. If you imported them into the WORK library and you want all of the FLORIDA datasets, then why not just use wildcards?
data all_florida ; set florida_: ; run;
12-27-2017 09:30 AM
12-27-2017 10:04 AM - edited 12-27-2017 10:29 AM
@Tom Thank you Tom. The key point of my problem is that, for some states, and even for the same state, the same variables in different years may be of different types: in some years variable X1 may be numeric, while in another year X1 becomes a character variable when imported to SAS due to a lack of data in that year. So when I use wildcards, it does not work.
So finally you have stated an actual problem. This is caused by using PROC IMPORT, which guesses at what is in your source files. If you can convert the source files to text files (CSV files, for example), then you can read them in consistently and you will no longer have this problem.
If your problem is just with empty variables that got created as numeric instead of character, then the solution is to just drop them (if they are empty, you will lose nothing).
Automating this might be more work than it's worth. Personally, I would just get the list of files. For example, you could run a simple step like this:
data _null_;
  set sashelp.vtable;
  where libname = 'WORK' and memname like 'FLORIDA_%';
  put memname '(drop=)';
run;
and then copy the list from the log into a new data step and add in any variables that need to be dropped.
data all_florida ; set FLORIDA_101 (drop=) FLORIDA_102 (drop=X1) ; run;
If you really need to automate this, then you should create a dataset to use as a template, query the metadata for your new files, and compare it to the expected result. Then you can use the result of that comparison to generate appropriate code for each dataset that will drop, rename, transform, etc. any variables that got defined in the wrong way.
As to the question in your title, the example data step above is probably a good place to start for the "looping". Instead of looping in macro code, make the normal data step iteration your loop. For example, if you created a macro named %FIX() that could fix up the variables for one of the files, you could use CALL EXECUTE in a data step to call it for all of the datasets.
data _null_;
  set sashelp.vtable;
  where libname='WORK' and memname like 'FLORIDA_%';
  call execute(cats('%nrstr(%fix)(',memname,')'));
run;
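For completeness, a hypothetical skeleton of the %FIX macro referenced above (the actual drop/rename/convert logic would come from the metadata comparison described earlier; the output naming here is just an assumption):

```sas
%macro fix(memname);
  data fixed_&memname;
    set &memname;
    /* Drop, rename, or convert mis-typed variables here,    */
    /* based on your comparison against the template, e.g.:  */
    /* drop X1;                                              */
  run;
%mend fix;
```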
12-27-2017 03:49 AM
Store your valid value combinations in a dataset, and use CALL EXECUTE off that dataset. Much better and more scalable than a list in a macro variable.
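A sketch of that approach, assuming a driver dataset VALID with variables STATE and NUM (names chosen for illustration) and a %FIX macro like the one discussed elsewhere in this thread:

```sas
/* Driver dataset holding the valid state/number combinations */
data valid;
  input state :$20. num;
datalines;
Florida 101
Texas 469
;

/* Generate one macro call per observation */
data _null_;
  set valid;
  call execute(cats('%nrstr(%fix)(', state, '_', num, ')'));
run;
```

The %NRSTR wrapper delays macro execution until after the generating data step finishes, which avoids timing problems when the called macro itself contains data steps.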
12-27-2017 09:21 AM
12-27-2017 02:46 PM
@KurtBremser Thank you KurtBremser! But I have no idea about the CALL EXECUTE routine. Could you kindly give me sample code for my problem so that I can get started? Thanks a lot!
I see that @Tom has already given you a nice example in the meantime.