I've been running a heavy SAS procedure over SSH on the WRDS cloud, where each batch job performs around 10 million iterations. Every 1,000 iterations, I use PROC DATASETS to delete all the temporary datasets so that the WORK library doesn't explode. However, my batch job keeps getting terminated for excessive memory usage after roughly 100k iterations. The problem doesn't seem to be driven by any single iteration; rather, memory consumption piles up as the iteration count increases. Can you think of any specific reason for this?
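
For concreteness, here is a minimal sketch of the loop structure I'm describing. The macro and dataset names (run_batch, tmp_&i) are made up for illustration, not my actual code:

```sas
%macro run_batch(n_iter=, cleanup_every=1000);
  %do i = 1 %to &n_iter;

    /* Heavy step: each pass writes a temporary dataset to WORK.
       tmp_&i is a hypothetical name standing in for the real output. */
    data work.tmp_&i;
      x = &i;
    run;

    /* Every &cleanup_every iterations, drop the accumulated temporaries
       so the WORK library does not grow without bound. */
    %if %sysfunc(mod(&i, &cleanup_every)) = 0 %then %do;
      proc datasets library=work nolist;
        delete tmp_:;  /* colon wildcard: all datasets whose names start with tmp_ */
      quit;
    %end;

  %end;
%mend run_batch;

/* Invoked once per batch job, e.g.: %run_batch(n_iter=10000000); */
```

Even with this cleanup in place, the job's memory footprint (not WORK disk usage) keeps climbing until the grid kills it.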