I believe HASHEXP doesn't significantly impact memory size but rather how efficiently the key lookup performs.
I'm not aware of any hard-and-fast algorithm for determining the optimal HASHEXP, but based on what I've read in the documentation, a value of 9 or 10 seemed appropriate to me for 8M rows with a single key.
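As a minimal sketch of what that looks like (dataset and variable names here are placeholders, not from this thread), HASHEXP is just an argument on the DECLARE statement; HASHEXP: 10 gives 2**10 = 1,024 hash buckets:

```sas
/* Sketch only: work.lookup, work.big, acct and consdate are illustrative names. */
data matched;
   if _n_ = 1 then do;
      declare hash h(dataset: 'work.lookup', hashexp: 10); /* 1,024 buckets */
      h.defineKey('acct');
      h.defineData('consdate');
      h.defineDone();
      call missing(consdate);
   end;
   set work.big;
   if h.find() = 0 then output;   /* keep rows whose key is in the lookup */
run;
```

The bucket count only spreads keys across more chains to shorten searches; the items themselves take the same memory either way.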
I guess the most important thing is for the OP to check that the variables loaded into the hash are defined with reasonable lengths and don't carry a default length of 200 characters or so, because in memory those characters get fully expanded even if the COMPRESS option is turned on.
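One hedged way to check and fix that before loading the hash (again, names are illustrative): measure the longest actual value, then re-declare the variable with a tighter length:

```sas
/* Sketch only: origacct and work.lookup are placeholder names. */
proc sql noprint;
   select max(lengthn(origacct)) into :maxlen trimmed
   from work.lookup;
quit;

data work.lookup_slim;
   length origacct $ &maxlen;   /* shrink from a padded default length */
   set work.lookup;
run;
```

Loading `work.lookup_slim` instead of the original keeps each hash item at the true key width rather than the padded one.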
Paul Dorfman has a nice paper on this exact technique: Hash + Point = Key. But of course you would then have to read a record from the 8M-row file about 64M times (16M records in the big file times 4 reads per record) in order to identify the oldest record.
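For anyone unfamiliar with the idea, here's a rough sketch of the Hash + Point = Key pattern, under assumed names (work.lookup, work.big, acct): the hash stores only the row number (RID) for each key, and the full record is fetched with POINT= only on a hit:

```sas
/* Sketch only: datasets and variables are illustrative, not the OP's. */
data want;
   if _n_ = 1 then do;
      declare hash h();
      h.defineKey('acct');
      h.defineData('rid');
      h.defineDone();
      /* load key -> row number pairs */
      do rid = 1 to nobs;
         set work.lookup point=rid nobs=nobs;
         rc = h.replace();
      end;
   end;
   set work.big;
   if h.find() = 0 then do;
      p = rid;
      set work.lookup point=p;   /* direct-access read of the matching row */
      output;
   end;
run;
```

The memory win comes from keeping only an 8-byte RID per key in the hash; the price is the extra direct-access read for every lookup hit, which is where the ~64M reads above come from.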
But for this problem I don't see enough benefit, because (I presume) the OP isn't retrieving many variables from the lookup file. Storing the row number of the lookup table takes the same space as storing the consdate, and if @Steelers_In_DC doesn't also need ORIGACCT to make it work, I don't see any memory benefit at all.