If you have moderately to highly complex data loading jobs, with transformations and cleansing, it's best practice to have data "flow" into the warehouse in steps, such as ODS, staging, transformed daily bulk, target detail DW, star-schema DM, information marts, etc. Each step should land in permanent data storage. It's also handy to keep each job to loading one level at a time, which lets you organize scheduling in a simplified way (remember not to let a job refer to data created by a subsequent job in the data flow).
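A minimal sketch of what that layering can look like in Base SAS code. The librefs, paths, and table names are placeholders, not anything from your environment; the point is that each job reads only from the previous layer and writes only to its own:

```sas
/* Hypothetical librefs for two adjacent layers of the flow */
libname stage  "/data/dw/stage";
libname detail "/data/dw/detail";

/* Job 1: load the staging layer only (assumes an ODS libref
   defined elsewhere); cleansing/transformations happen here */
data stage.customer;
   set ods.customer;
   /* ...transformation logic... */
run;

/* Job 2, scheduled after Job 1: load target detail from staging
   only - it never reaches back past its own input layer */
data detail.customer;
   set stage.customer;
run;
```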
In 9.2, there is a restart feature which allows you to restart a job at a specified point, which can be a temporary table (SAS will make the temporary table temporarily permanent between runs).
If you still feel that you need to save some intermediate data between sessions, you can define those tables as permanent during development, and then change them back to temporary before going to test/prod.
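One way to make that dev/prod switch cheap is to point a single libref either at a permanent library (development) or at WORK (test/prod), so the job code itself never changes. This is just a sketch; the `env` macro variable and paths are made up for illustration:

```sas
%let env = dev;  /* change to prod before promotion */

%macro set_tmp_lib;
  %if &env = dev %then %do;
    /* permanent library: intermediate tables survive between sessions */
    libname tmp "/data/dw/dev_debug";
  %end;
  %else %do;
    /* alias for WORK: tables are temporary again in test/prod */
    libname tmp (work);
  %end;
%mend;
%set_tmp_lib
```

Every step then writes to `tmp.sometable`, and only the libname assignment differs between environments.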
/Linus
Data never sleeps