freda
Fluorite | Level 6

Hello,

I modified the code as follows:

proc sql;
create table table1 as
select unique la1.*, count(?) as la1_no_hand_la1
from la1
left join
la1 past1
on (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_sign < la1.date_exp and la1.status="$EX" )
or (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_sign < la1.date_sold and la1.status="$VE")
or (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_exp < la1.date_exp and past1.status="$EX" and la1.status="$EX")
or (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_exp < la1.date_sold and past1.status="$EX" and la1.status="$VE")
or (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_sold < la1.date_exp and past1.status="$VE" and la1.status="$EX")
or (la1.listing_1=past1.listing_1 and la1.date_sign< past1.date_sold < la1.date_sold and past1.status="$VE" and la1.status="$VE")
group by 1
order by 1 ;

However, I don't know what I should put in place of the ? in count(?), because I don't understand what past1 stands for. I tried count(past1.date_sign) as in the original code, but it does not work this time.
The result I want is the number of listings at hand during the contract period. For example, for id=1, listing_1 A has its own contract period from date_sign to date_exp (or to date_sold if the listing has been sold). I want to count all the listings that the first listing agent, A, was involved in during that period.
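(For reference on the count(?) part: in a left join, counting a column that comes from the joined table, such as count(past1.date_sign), gives the number of matching rows per group, because COUNT of a column ignores missing values, so an unmatched row contributes 0. A minimal sketch using made-up tables base and other with a column match_date, not the real data:)

proc sql;
   create table count_demo as
   select a.id,
          count(b.match_date) as n_matches  /* 0 when no row of OTHER matched, otherwise the number of matches */
   from base a
   left join other b
      on a.id = b.id
   group by a.id
   order by a.id;
quit;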

 

Please see the attachments for the data. 

Thank you

Freda

1 ACCEPTED SOLUTION

Accepted Solutions
Kurt_Bremser
Super User

First of all, your join will return no results, as you are testing for statuses beginning with a $ sign, but no such statuses are present.

Second, you had best start checking your conditions manually against the data. Even with the literals stripped of the dollar signs, I got only 2 matches for the whole dataset, for id=2.

e.g. id=1 is an observation with status='EX', so it would need a match

- where the other (past1) date_sign is between date_sign and date_exp (not present)

- where the other status is 'EX' and the other date_exp falls between date_sign and date_exp (also not present)

- where the other status is 'VE' and the other date_sold falls between date_sign and date_exp (also not present)
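One way to do that manual check (a sketch, not the final answer, assuming id uniquely identifies each observation in la1): list every candidate pair that shares listing_1 and inspect the date and status conditions by eye before coding them back into the ON clause.

proc sql;
   create table check_pairs as
   select la1.id,
          la1.listing_1,
          la1.status,
          la1.date_sign,
          la1.date_exp,
          la1.date_sold,
          past1.id        as past_id,         /* missing when no other row shares listing_1 */
          past1.status    as past_status,
          past1.date_sign as past_date_sign,
          past1.date_exp  as past_date_exp,
          past1.date_sold as past_date_sold
   from la1
   left join la1 past1
      on la1.listing_1 = past1.listing_1 and la1.id ne past1.id  /* same agent, different observation */
   order by la1.id, past_id;
quit;

Because it is a left join, listings with no candidate pair still show up, with the past_* columns missing, so nothing is hidden during the inspection.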

 

View solution in original post

6 REPLIES
freda
Fluorite | Level 6

The result is supposed to be:

id    result
1     1
2     5
3     0
4     1
5     0
6     2
7     0
8     0
9     0
10    1
11    1
12    0
13    2
14    0
15    0
16    0
17    0
18    1
19    0
20    1
21    0
22    0
23    0
24    1
25    1
26    0
27    0
28    1
29    1
30    0

freda
Fluorite | Level 6
Hi,
I did not import this data with a data step. I used proc import:
proc import datafile = 'C:\Users\Samuel\Desktop\testing.csv'
out = la1
dbms = csv
replace;
run;

Thank you
Kurt_Bremser
Super User

@freda wrote:
Hi,
I did not import this data with a data step. I used proc import:
proc import datafile = 'C:\Users\Samuel\Desktop\testing.csv'
out = la1
dbms = csv
replace;
run;

Thank you

You do not know it yet, but you did use a data step. You will find the code in the log of the PROC IMPORT.

freda
Fluorite | Level 6
Here is the code in the log:


/**********************************************************************
* PRODUCT: SAS
* VERSION: 9.4
* CREATOR: External File Interface
* DATE: 09AUG18
* DESC: Generated SAS Datastep Code
* TEMPLATE SOURCE: (None Specified.)
***********************************************************************/
data WORK.LA1 ;
   %let _EFIERR_ = 0; /* set the ERROR detection macro variable */
   infile 'C:\Users\Samuel\Desktop\testing.csv' delimiter = ',' MISSOVER DSD lrecl=32767 firstobs=2 ;
   informat id best32. ;
   informat listing_1 $1. ;
   informat listing_2 $1. ;
   informat listing_3 $1. ;
   informat listing_4 $1. ;
   informat selling_1 $1. ;
   informat selling_2 $1. ;
   informat selling_3 $1. ;
   informat selling_4 $1. ;
   informat date_sign mmddyy10. ;
   informat date_list mmddyy10. ;
   informat date_exp mmddyy10. ;
   informat date_sold mmddyy10. ;
   informat status $2. ;
   informat la1 best32. ;
   informat ra1 best32. ;
   informat date_sign_com best32. ;
   informat listing_1_com $1. ;
   informat nola1 best32. ;
   informat nola2 best32. ;
   informat temp best32. ;
   informat temp2 best32. ;
   informat temp3 best32. ;
   informat temp4 best32. ;
   informat la1_number_la1_5 best32. ;
   informat la2_number_la1_5 best32. ;
   format id best12. ;
   format listing_1 $1. ;
   format listing_2 $1. ;
   format listing_3 $1. ;
   format listing_4 $1. ;
   format selling_1 $1. ;
   format selling_2 $1. ;
   format selling_3 $1. ;
   format selling_4 $1. ;
   format date_sign mmddyy10. ;
   format date_list mmddyy10. ;
   format date_exp mmddyy10. ;
   format date_sold mmddyy10. ;
   format status $2. ;
   format la1 best12. ;
   format ra1 best12. ;
   format date_sign_com best12. ;
   format listing_1_com $1. ;
   format nola1 best12. ;
   format nola2 best12. ;
   format temp best12. ;
   format temp2 best12. ;
   format temp3 best12. ;
   format temp4 best12. ;
   format la1_number_la1_5 best12. ;
   format la2_number_la1_5 best12. ;
   input
      id
      listing_1 $
      listing_2 $
      listing_3 $
      listing_4 $
      selling_1 $
      selling_2 $
      selling_3 $
      selling_4 $
      date_sign
      date_list
      date_exp
      date_sold
      status $
      la1
      ra1
      date_sign_com
      listing_1_com $
      nola1
      nola2
      temp
      temp2
      temp3
      temp4
      la1_number_la1_5
      la2_number_la1_5
   ;
   if _ERROR_ then call symputx('_EFIERR_',1); /* set ERROR detection macro variable */
run;

NOTE: The infile 'C:\Users\Samuel\Desktop\testing.csv' is:
Filename=C:\Users\Samuel\Desktop\testing.csv,
RECFM=V,LRECL=32767,File Size (bytes)=2867,
Last Modified=August 09, 2018 17:06:08 o'clock,
Create Time=August 09, 2018 17:06:04 o'clock

NOTE: 30 records were read from the infile 'C:\Users\Samuel\Desktop\testing.csv'.
The minimum record length was 77.
The maximum record length was 92.
NOTE: The data set WORK.LA1 has 30 observations and 26 variables.
NOTE: DATA statement used (Total process time):
real time 0.06 seconds
cpu time 0.04 seconds