We are hashing 4 columns using PROC GROOVY. When we run our script in small time increments, rather than an entire year, we have ZERO issues. Our smaller datasets can be done in quarters, but our larger ones would have to be done in much smaller time frames, taking weeks to piece together, as the maximum number of rows we can write at once is 140K. The crazy part is that it always fails in the 144K range. Any ideas as to what could be causing the issue?

***The line and column numbers always change, but the number of observations is always in the 144K range.***

Error: Object instantiation failed at line 298 column 62.
ERROR: DATA STEP Component Object failure. Aborted during the Execution phase.
NOTE: There were 144595 observations read from the data set FY16.

filename cp temp;
proc groovy classpath=cp;
submit parseonly;
import java.security.MessageDigest

class Sha1 {
  public String encode(String message) {
    return new BigInteger(1, MessageDigest.getInstance("SHA1").digest(message.getBytes())).toString(16);
  }
}
endsubmit;
quit;
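For reference, here is a standalone Java sketch of the same hashing logic, which can be run outside of SAS to check the encoder in isolation (the `String.format` padding is an addition, not in the original Groovy: `BigInteger.toString(16)` silently drops leading zero digits, so any hash whose first bytes are zero comes back shorter than 40 characters).

```java
import java.math.BigInteger;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Standalone equivalent of the Groovy Sha1 class, for testing outside SAS.
public class Sha1 {
    public static String encode(String message) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("SHA-1").digest(message.getBytes());
        // Pad to 40 hex digits so leading zero bytes are preserved;
        // BigInteger.toString(16) would drop them.
        return String.format("%040x", new BigInteger(1, digest));
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // SHA-1 of "abc" is the well-known test vector
        // a9993e364706816aba3e25717850c26c9cd0d89d
        System.out.println(encode("abc"));
    }
}
```

Running this with a few known test vectors confirms whether the hash logic itself is sound, which helps separate a hashing bug from the DATA step object-instantiation failure seen in the log.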