05-04-2016 09:26 PM - edited 05-04-2016 09:31 PM
I'm stripping down a problem I'm having to a simplified version using a data step merge. I'm hoping it is a quick misunderstanding of that data step that someone can enlighten me on.
I have two datasets to merge. I've dummied some data below and changed the format as my data is proprietary.
data onshelf;
  input product $ storeid $ mancode $ expdate packages;
  informat expdate mmddyy10.;
  format expdate mmddyy10.;
  datalines;
12345 4985 882 06/19/2016 17
12345 4985 912 07/03/2016 6
12345 4985 125 07/17/2016 19
;
run;

data outlook;
  input product $ storeid $ 'fcst_05/29/2016'n 'fcst_06/05/2016'n
        'fcst_06/12/2016'n 'fcst_06/19/2016'n 'fcst_06/26/2016'n;
  datalines;
12345 4985 12 4 5 2 2
;
run;
Thinking about bringing these datasets together:

data together;
  merge onshelf outlook;
  by storeid product;
run;
The first record would be (generically):

product  storeid  mancode  expdate     packages  forecasts
12345    4985     882      06/19/2016  17        12 4 5 2 2
What I want to do, is consume the inventory on shelf with the forecasted volume.
So, consume 12 of the packages with the first forecast, leaving 5 on the shelf. Then consume 4 more inventory units with the second forecast, leaving 1 on the shelf. Then consume that last unit of inventory with the third forecast. This leaves packages at 0 and 4 units remaining in the third forecast. This record would then be output, e.g.:

product  storeid  mancode  expdate     packages  forecasts
12345    4985     882      06/19/2016  0         0 0 4 2 2
I can get to this pretty easily with an array and a little conditional logic to remove volume from packages and from the arrayed forecasts.
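For concreteness, that consumption logic looks roughly like this (a sketch only; I've assumed here that the date-named forecast columns are renamed f1-f5 for readability):

```sas
data together;
  merge onshelf outlook;
  by storeid product;
  array f {5} f1-f5;                     /* forecast columns, assumed renamed */
  /* walk the forecasts left to right, draining packages as we go */
  do i = 1 to dim(f) while (packages > 0);
    consumed = min(packages, f[i]);      /* can't consume more than is forecast */
    f[i] = f[i] - consumed;
    packages = packages - consumed;
  end;
  drop i consumed;
run;
```

On the sample record (packages=17, forecasts 12 4 5 2 2) this yields packages=0 and forecasts 0 0 4 2 2, as described above.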
The problem is on the next record.
product  storeid  mancode  expdate     packages
12345    4985     912      07/03/2016  6
Notice this is the same product and storeid, but a different expiration date (different manufacturer line date for instance).
I want to use the forecast values left over from record 1 (0 0 4 2 2). However, I can't seem to get SAS to retain those values in the array. It is being reset to (12 4 5 2 2).
Is this type of activity possible? This is where I got my title from as it seems like I want to retain the values of the forecast as long as I'm in the same product, storeid combination.
Thanks in advance.
05-04-2016 10:02 PM
Can you post some more sample data and expected output? At least two cases so it's easier to test and output that matches your rules.
I think changing your data structure may make this easier, if you go long instead of wide.
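For example, a long layout could be produced with something like this (a sketch; output dataset and column names are just placeholders):

```sas
proc sort data=outlook;
  by product storeid;
run;

/* Without a VAR statement, PROC TRANSPOSE flips all numeric
   variables not named in the BY statement, i.e. the forecast columns */
proc transpose data=outlook
               out=outlook_long(rename=(_name_=fcst_week col1=forecast));
  by product storeid;
run;
```

One row per product/storeid/week makes the "consume inventory against successive forecasts" step a straightforward retained running total instead of array logic across wide columns.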
05-04-2016 10:42 PM
You're right. It's not possible. Are you positive that your second data set doesn't contain the forecast variables?
Are you able to post the code that you used? Simplify it for the data that you illustrated, but call the forecast variables f1 through f5.
05-05-2016 08:36 AM
Maybe I've missed something, but I don't see your problem.
Variables that are read with a MERGE statement are automatically retained (see section "Redundancy" in the documentation of the RETAIN statement).
So, if you modify the forecast values based on the PACKAGES value of the first observation of ONSHELF, these modified forecast values will be retained and hence will be available for further modifications in the second observation and so on until the end of the BY group. They would not be "reset to (12 4 5 2 2)."
Simplified example: Reduce forecast values and PACKAGES by 1 in each observation (and avoid negative values). Of course, your algorithm would not be that trivial.
data outlook;
  input product $ storeid $ f1-f5;
  datalines;
12345 4985 12 4 5 2 2
;

data test;
  merge onshelf outlook;
  by storeid product;
  array f f:;
  do i=1 to dim(f);
    f[i]=max(0,f[i]-1);
  end;
  packages=max(0,packages-1);
  drop i;
run;
product  storeid  mancode  expdate     packages  f1  f2  f3  f4  f5
12345    4985     882      06/19/2016  16        11   3   4   1   1
12345    4985     912      07/03/2016   5        10   2   3   0   0
12345    4985     125      07/17/2016  18         9   1   2   0   0
05-05-2016 11:28 AM
Thanks, all, for the input. It turns out I had a couple of extraneous OUTPUT statements left over from debugging that led me down the "non-retained" path. It looked like the forecast measures were being re-initialized, but I was actually just looking at duplicated output records. I was able to get what I needed with my original logic.