I believe HASHEXP doesn't significantly affect memory size but rather how efficiently the key lookup performs, since it sets the number of hash buckets (2**hashexp).
I'm not aware of any "hard" algorithm for determining the optimal HASHEXP, but based on what I've read in the documentation, a value of 9 or 10 seemed appropriate to me for 8M rows with a single key.
I guess the most important thing is for the OP to check that the variables going into the hash are reasonably defined and don't have some default length of 200 characters or so, because in memory these variables are stored at their full declared length, even if the COMPRESS= option is turned on for the source dataset.
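As a minimal sketch of that point (all names here are hypothetical, not taken from the OP's code): trim the lookup table to tight, explicit lengths before loading it into the hash, so each in-memory entry is only as wide as it needs to be.

```sas
/* Hypothetical names: LOOKUP (8M-row lookup), BIG (16M-row file),
   ACCT (key), CONSDATE (date to retrieve).                         */

/* Re-declare the variables with minimal lengths before loading;
   the hash stores characters at full declared length regardless
   of any COMPRESS= setting on the dataset.                         */
data work.lookup_slim;
    length acct $12 consdate 8;
    set work.lookup(keep=acct consdate);
run;

data work.matched;
    length consdate 8;
    if _n_ = 1 then do;
        declare hash h(dataset:'work.lookup_slim', hashexp:10);
        h.defineKey('acct');
        h.defineData('consdate');
        h.defineDone();
        call missing(consdate);
    end;
    set work.big;            /* the 16M-row file */
    if h.find() = 0;         /* keep only matched rows */
    format consdate date9.;
run;
```

With an 8-byte numeric date and a short character key, each hash entry stays small; with a $200 key the same 8M-row hash would be an order of magnitude larger.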
Paul Dorfman has a nice paper on this exact technique: Hash + Point = Key. But of course you would then have to read a record from the 8M file about 64M times (16M records in the big file with 4 reads per record) in order to identify the oldest record.
But for this problem, I don't see enough benefit, because (I presume) the OP isn't retrieving lots of variables from the lookup file. Storing the row number of the lookup table will take the same space as storing CONSDATE. If @Steelers_In_DC doesn't also need ORIGACCT to make it work, I don't see any memory benefit at all.
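For reference, the technique discussed above can be sketched roughly like this (variable and dataset names are hypothetical, not Dorfman's): the hash stores only each key's observation number in the lookup table, and the actual variables are fetched on demand with SET ... POINT=, so they never occupy hash memory.

```sas
/* Hypothetical names: LOOKUP (lookup table), BIG (transaction file),
   ACCT (key), RID (observation number stored as hash data).         */
data work.matched;
    if _n_ = 1 then do;
        declare hash h(hashexp:10);
        h.defineKey('acct');
        h.defineData('rid');
        h.defineDone();
        /* load key -> row-number pairs; only the key is read here */
        do rid = 1 to n_lk;
            set work.lookup(keep=acct) point=rid nobs=n_lk;
            rc = h.add();
        end;
    end;
    set work.big;
    if h.find() = 0 then do;
        /* direct-access read pulls the full record only on a match */
        set work.lookup point=rid;
        output;
    end;
    drop rid rc;
run;
```

The trade-off is exactly the one noted above: the hash shrinks to key plus an 8-byte row number, but every match costs an extra direct-access read against the lookup file.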