11-01-2013 04:53 PM
Is there an easy way to change this loop so it only modifies character variables? I have over 2,000 variables, so running it against the whole data set takes a very long time. Thanks!
/*--- Get the names of the variables ---*/
proc contents data = whatever out = names noprint; run;
/*--- Make macro vars needed ---*/
proc sql noprint;
    select count(distinct name) into :name_count from names;
    select distinct name into :name1 - :name9999 from names;
quit;
/*--- Strip spaces from name_count ---*/
%let name_count = &name_count.;
%put There are &name_count. variables in the data set;
/*--- Run the test for all variables ---*/
%do i = 1 %to &name_count.;
    &&name&i. = compress(&&name&i.,',');
%end;
11-01-2013 05:04 PM
No reason to use a macro:
array charvars _character_;
do _t = 1 to dim(charvars);
    charvars[_t] = compress(charvars[_t],',');
end;
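In context, the full step might look like this (a sketch; HAVE and WANT are placeholder data set names):

```sas
data want;
    set have;
    array charvars _character_;              /* all character variables */
    do _t = 1 to dim(charvars);
        charvars[_t] = compress(charvars[_t], ',');
    end;
    drop _t;                                 /* drop the loop index */
run;
```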
11-04-2013 03:43 PM
This seemed to work well for the most part but a couple of variables are still delimiting incorrectly. Is there a way to check what is causing the problem?
11-04-2013 03:48 PM
a couple of variables are still delimiting incorrectly. Is there a way to check what is causing the problem?
Can you explain more about what's happening and what you're trying to do?
11-04-2013 04:02 PM
I thought this would be a simple task. All we want to do is export a large dataset to Excel or as a CSV file. Several posters have given suggestions, but none seem to work properly.

The data has comment fields, and within them interviewers have typed commas, so when exporting as a CSV file the variables do not separate correctly. What is interesting is that most of the data seems to export correctly, even when there appears to be a comma in a field. For some reason, for several variables the file does not separate correctly: data ends up shifted into the wrong columns and creates blank rows. Is there a way to figure out why it is being separated incorrectly?

One of my first thoughts was to just remove all commas, export the data, and replace them in Excel, but when I run the code that was suggested to remove commas it does not seem to work properly and this error still occurs. I thought exporting to Excel would be a simple task, but it is turning out to be much more challenging. Any suggestions as to how to fix these errors would be greatly appreciated.
11-04-2013 04:13 PM
Have you tried exporting as a tab-delimited table? Excel will import tab-delimited data, and this should preserve embedded commas. If this still does not work you may need to show us examples of data that does not import correctly. Disguise the contents if you wish, but preserve all punctuation and the field lengths.
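A sketch of the tab-delimited export (HAVE and the output path are placeholders):

```sas
proc export data=have
    outfile="C:\temp\my_file.txt"   /* hypothetical path */
    dbms=tab
    replace;
run;
```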
11-04-2013 04:44 PM
Vague memories of this question
How big is the dataset? Are you on Excel 2010?
Have you considered the tagset/XML version?
ods tagsets.excelxp file="C:\temp\my_file.xml" style=meadow;
proc print data=have label noobs;
    format _numeric_ _character_;  /* clears any existing formats */
run;
ods tagsets.excelxp close;
11-04-2013 05:33 PM
When I use PROC EXPORT to create a CSV, any field containing a comma is written as quote-qualified text, which Excel reads correctly. What application are you opening the file with that makes you think the commas are causing problems?
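For reference, this is the kind of export I mean (HAVE and the path are placeholders); PROC EXPORT quotes any field that contains the delimiter:

```sas
proc export data=have
    outfile="C:\temp\my_file.csv"   /* hypothetical path */
    dbms=csv
    replace;
run;
```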
11-05-2013 09:56 AM
Commas are probably NOT your problem. Your text probably has embedded end-of-line characters that are causing Excel to misread the generated CSV file. Try using the suggested data step with the array to eliminate the end-of-line characters (or translate them to blanks or some other printable character).
array charvars _character_;
do _t = 1 to dim(charvars);
    charvars[_t] = compress(charvars[_t],'0D0A'x);
end;
The other possibility is that your records are longer than the LRECL setting that you are using in SAS to write the CSV file, so that SAS is inserting extra line breaks. What method are you using to write the CSV file? SAS can easily generate a CSV file with a data step; it is writing the header row of variable labels that is complicated.
file 'myfile.csv' dsd dlm=',' lrecl=1000000;
put (_all_) (+0);
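A minimal sketch of the data-step approach (HAVE, the file name, and the header text are placeholders; a real header row would list your actual variable names or labels):

```sas
data _null_;
    set have;                                  /* HAVE is a placeholder name */
    file 'myfile.csv' dsd dlm=',' lrecl=1000000;
    if _n_ = 1 then put 'var1,var2,var3';      /* hypothetical header row */
    put (_all_) (+0);                          /* write every variable, comma-delimited */
run;
```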
11-01-2013 05:10 PM
As a side note, if you DID do this in a macro, no reason to make the macro modify the whole darned dataset every iteration... not as nice as the array method (way more work) but shouldn't be much different time wise.
To make this only character variables, use where type=2. Also, better practice is to make the macro handle just one variable, and then use the PROC SQL call to generate the list of macro calls - it's more flexible than macro loops and makes the code more reusable.
%macro compressvar(name);
    &name = compress(&name,',');
    %put Compressing &name;
%mend;
proc contents data = sashelp.class out = names noprint; run;
proc sql noprint;
    select distinct cats('%compressvar(',name,')') into :compresslist separated by ' '
        from names where type=2;
quit;
data want; set sashelp.class;
    &compresslist; *this is all of the calls to the compressvar macro;
run;
11-04-2013 06:25 PM
Huzzah, a question I can answer! All joking aside, I hope this is correct.
I would bet my salary that you are experiencing issues because some of the fields that you're trying to export have CRLF (carriage return/line feed) characters within the comments.
SAS doesn't know how to handle these in comment fields, so if the field also has a comma, SAS splits it into new values and then creates a blank row, which is exactly what you are seeing.
If you can, remove the CRLF values from your variables before you export, and everything should work out fine.
Something like this will work.
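(The snippet appears to have been lost from this post; presumably it resembled the earlier suggestion of stripping the CR/LF characters with COMPRESS. A sketch, where comment_field is a placeholder name:)

```sas
/* Strip carriage returns and line feeds from a single field */
comment_field = compress(comment_field, '0D0A'x);
```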
You can put this into your loop pretty easily, and then try exporting the datafile.
If this doesn't work, please ignore the "bet my salary" portion of my response. If it does, then glad to be of assistance!