Using lag function to get cumulative position over time

Super Contributor

Hi,

I've been working on this for a while now (using the LAG function) but I'm not getting anywhere.

Basically, the dataset is sorted by scrape time. Each row is a bet, and I assume that after the event the stake is returned to my account. My starting capital is 10000. In the first row my capital is 10000 - stake. In the second row I need to look back at the first row: using LAG, if the first event is already over then that stake has been returned.

 

Here is my code:

 

data WORK.DATASET;
infile datalines dsd truncover;
input stake:32. scrapetime: DATETIME14. eventtime: DATETIME14.;
datalines4;
52,15SEP16:16:25,15SEP16:16:40
18,15SEP16:16:51,15SEP16:17:20
15,15SEP16:16:52,15SEP16:16:57
13,15SEP16:16:57,15SEP16:17:45
117,15SEP16:17:04,16SEP16:15:30
1188,15SEP16:17:04,15SEP16:17:10
562,15SEP16:17:34,15SEP16:17:40
28,15SEP16:17:34,15SEP16:17:50
422,15SEP16:17:47,15SEP16:18:20
806,15SEP16:17:53,15SEP16:19:50
129,15SEP16:17:59,15SEP16:18:10
37,15SEP16:18:25,15SEP16:18:30
32,15SEP16:18:46,15SEP16:19:01
26,15SEP16:18:48,15SEP16:19:01
74,15SEP16:19:06,15SEP16:19:11
90,15SEP16:19:29,15SEP16:19:45
13,15SEP16:19:41,15SEP16:19:47
;;;;

 

/* capture the number of observations into &nrows */
data _NULL_;
   if 0 then set dataset nobs=n;
   call symputx('nrows',n);
   stop;
run;

 

%let initialcapital=10000;

 

%macro test;

data test;
   set dataset;

   if _n_ = 1 then do;
      running_capital = &initialcapital - stake;
   end;
   else do;
      /* generate one look-back test per observation in the dataset */
      %do look_back=1 %to &nrows;
      if lag&look_back.(eventtime) < scrapetime then
         running_capital = running_capital + lag&look_back.(stake);
      %end;
      running_capital = running_capital - stake;
   end;
run;

%mend;
%test;

Super User

Re: Using lag function to get cumulative position over time

It will help us if you also show the expected result, given your example data.

Super Contributor

Re: Using lag function to get cumulative position over time

Hi, I don't know what the expected result is, as I've simplified the code to highlight where I'm having trouble.

Super User

Re: Using lag function to get cumulative position over time

I submit that if you cannot determine by hand (manually, as in pencil and paper) what your desired result is for a small number of records, then you really can't code a solution.

Super Contributor

Re: Using lag function to get cumulative position over time

No, I can work out the desired result; I will do that and send an update. The data set is just a series of chronological bets. I start with capital of 10000, and as I move down the rows I'm betting part of my capital as the stake amount. For simplicity (though it isn't intuitive) I have assumed that when the event is over you get your stake returned. The result I want to get to is the running capital at the last row.

The difficulty I was having is that some events haven't taken place yet when I place another bet, so I need to adjust my capital at each row to reflect which events are over and which ones are still outstanding.
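
To make the bookkeeping concrete, this is roughly the kind of logic I'm after. It's only a rough, unverified sketch (it uses temporary arrays instead of LAG, and the 1000-element bound and the output name BOOKKEEPING_SKETCH are arbitrary choices of mine):

data bookkeeping_sketch;
   set dataset;
   /* remember every bet placed so far (1000 is an arbitrary upper bound on rows) */
   array ev{1000} _temporary_;
   array st{1000} _temporary_;
   retain running_capital 10000;

   /* return the stake of any earlier bet whose event is already over */
   do i = 1 to _n_ - 1;
      if ev{i} ne . and ev{i} < scrapetime then do;
         running_capital = running_capital + st{i};
         ev{i} = .;   /* each stake should only come back once */
      end;
   end;

   /* place the current bet */
   running_capital = running_capital - stake;
   ev{_n_} = eventtime;
   st{_n_} = stake;

   drop i;
run;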
Super Contributor

Re: Using lag function to get cumulative position over time

The running_capital at the last row should be 9887
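
To show the working for the first few rows (a hand calculation only): row 1 stakes 52, so the position is 10000 - 52 = 9948. At row 2 (scrape time 16:51) the first event finished at 16:40, so the 52 comes back and 18 is staked: 9948 + 52 - 18 = 9982. At row 3 (16:52) no further events are over, so 9982 - 15 = 9967, and so on down to the last row.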
Super User

Re: Using lag function to get cumulative position over time

Here's a reasonable way to approach this.

 

/* Build a "returns" record for each bet, timed at the moment the event finishes */
proc sort data=have (drop=scrapetime) out=returns;
   by eventtime;
run;

data returns2;
   set returns;
   rename eventtime = scrapetime;
run;

/* Interleave the bets (in1) with the returns (in2) in time order */
data want;
   retain running_capital 10000;
   set have (in=in1) returns2 (in=in2);
   by scrapetime;
   if in1 then running_capital = running_capital - stake;   /* bet placed: stake goes out */
   else do;                                                 /* event over: stake comes back */
      running_capital = running_capital + stake;
      delete;
   end;
run;

 

It's untested, but definitely represents an easier way to approach the problem.  Also, you don't necessarily have to delete the returns records if you think that information would be valuable (or if it helps illustrate how the program is working).
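
If it's only the closing position you're after, one way to pull it out of WANT afterwards (again untested, and FINAL_POSITION is just a name I've made up) would be:

/* keep just the last observation's running_capital */
data final_position;
   set want end=last;
   if last;
   keep running_capital;
run;

proc print data=final_position;
run;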

 
