This topic is solved and locked.
Elle
Quartz | Level 8

I have a dataset with a character variable that takes around 300 unique values. Its length is 100, and some of the values are really long and contain symbols as well.

I need to use this variable as an ID in a PROC TRANSPOSE, but since the values do not follow SAS variable naming conventions, I was thinking I could recode them in bulk and keep track of which code I assigned to which value.

The only problem is that I have no idea how to approach this.

The code below simulates my issue (the test dataset is exaggerated; my real data has intelligible values followed by some codes, but you get the picture).

 

data test1;
	length byvar $2;
	/* 3 random two-letter by values */
	do i=1 to 3;
		do j=1 to 2;
			substr(byvar,j)=byte(int(65+26*ranuni(0)));
		end;
		output;
	end;
	keep byvar;
run;
data test2;
	length sensor $100;
	/* 10 random 35-character sensor names (letters and symbols) */
	do i=1 to 10;
		do j=1 to 35;
			substr(sensor,j)=upcase(byte(int(113+12*ranuni(0))));
		end;
		output;
	end;
	keep sensor;
run;
proc sql noprint;
	/* cross join: every byvar paired with every sensor */
	create table test as
	select * from test1, test2;
quit;
data test;
	set test;
	length sensor_value $5;
	/* 3 timestamped readings per byvar/sensor combination */
	do k=1 to 3;
		dt=datetime()-rannor(0);
		l=k+floor(rannor(0));
		t=intnx('SECOND',dt,l,'B');
		if substr(sensor,1,1)='D' then sensor_value=byte(int(65+26*ranuni(0)))||byte(int(65+26*ranuni(0)))||byte(int(65+26*ranuni(0)))||byte(int(65+26*ranuni(0)));
		else sensor_value=strip(put(int(676+261*ranuni(0)),5.));
		output;
	end;
	format t datetime16.;
	keep byvar sensor t sensor_value;
run;
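
A quick way to see how many distinct values would actually need recoding in the simulated data (the output table name sensor_values below is just illustrative):

proc sql;
	/* list the distinct sensor values that would need a recode */
	create table sensor_values as
	select distinct sensor
	from test;
quit;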

 

The troublesome variable is sensor.

Is there a way to replace all of its unique values in bulk with, say, VAL1-VAL300?

Any other suggestions / approaches are welcome. 

4 REPLIES
ballardw
Super User

Here is an example of one approach. You should have the data set SASHELP.CLASS available.

This selects the unique values of a variable, creates a recode value for each one, and then merges the recoded value back onto the original data set to create a new one.

 

proc sql;
   /* pull the distinct values of the variable to be recoded */
   create table list as
   select distinct name
   from sashelp.class
   order by name
   ;
quit;

data temp;
   set list;
   length recode $ 10;
   /* assign a sequential recode value to each distinct value */
   recode = catt('var',_n_);
run;

proc sql;
   /* merge the recode back onto the original data set */
   create table recoded as
   select a.*, b.recode
   from sashelp.class as a
        left join
        temp as b
        on a.name=b.name
   ;
quit;
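
Applied to the simulated data from the question, the same pattern would look roughly like this (a sketch only: the names sensor_list, sensor_map, test_recoded, and want are illustrative, and the BY structure of the final transpose depends on the real data):

proc sql;
   /* distinct sensor values from the test data */
   create table sensor_list as
   select distinct sensor
   from test
   order by sensor
   ;
quit;

data sensor_map;
   set sensor_list;
   length recode $ 10;
   /* VAL1, VAL2, ... one code per distinct sensor value */
   recode = cats('VAL', _n_);
run;

proc sql;
   /* attach the code to every row of the original data */
   create table test_recoded as
   select a.*, b.recode
   from test as a
        left join
        sensor_map as b
        on a.sensor = b.sensor
   ;
quit;

/* the recode is a valid SAS name, so it can serve as the ID */
proc sort data=test_recoded;
   by byvar t;
run;

proc transpose data=test_recoded out=want;
   by byvar t;
   id recode;
   var sensor_value;
run;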

 

Elle
Quartz | Level 8

Thank you for the code. It works. Why I didn't think of this is beyond me.

Astounding
PROC Star
It's certainly possible to add a variable. Would that do the trick? Just before the output statement:

V + 1;
fake_key = cats("val", put(v, z3.));
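
In the simulated test2 step, for example, those two lines would sit just before the OUTPUT statement, so each generated sensor value gets its own key (a minimal sketch; fake_key is only illustrative, and this assumes the values are created in a data step rather than already existing):

data test2;
	length sensor $100 fake_key $6;
	do i=1 to 10;
		do j=1 to 35;
			substr(sensor,j)=upcase(byte(int(113+12*ranuni(0))));
		end;
		/* one OUTPUT per distinct sensor, so v numbers the values */
		v + 1;
		fake_key = cats("val", put(v, z3.));
		output;
	end;
	keep sensor fake_key;
run;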

Elle
Quartz | Level 8

I am not sure I follow.

The code I provided generates data similar to what I have. In my system this data (the equivalent of the test data) already exists. The question is: how can I take the variable sensor from the test dataset and recode it in bulk?

