With a temporary dataset, the main difference is that your job may not be restartable, depending on how the temporary dataset is used. Also, a temporary dataset's disk space is typically allocated from a different DASD storage pool than a permanent dataset's would be -- but that is not guaranteed unless you verify it with your Data Storage Management staff. So, depending on how critical restart/recovery is for your batch process, I would say that the "overhead" is a non-issue; focus instead on whether restartability is or is not important for your SAS batch processing.
Regarding a technical reference or website, you would be best served by discussing the topic/point specifically with your Data Storage Management personnel.
If you have average/complex data loading jobs, with transformations and cleansing, it's best practice to have data "flow" into the warehouse in steps, such as ODS, staging, transformed daily bulk, target detail DW, star-schema DM, information marts, etc. Each step should land in permanent data storage. It's also handy to keep each job loading only one level at a time, which lets you organize scheduling in a simplified way (and remember not to let a job refer to data created by a subsequent job in the flow).
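As a rough sketch of that layered flow (the physical library names and the CUSTOMER table are hypothetical, and only three of the layers are shown), each job reads from the previous layer's permanent library and writes to its own:

```sas
/* Hypothetical librefs -- one permanent library per layer          */
libname ods   'your.ods.library';      /* operational data store    */
libname stage 'your.staging.library';  /* cleansed / transformed    */
libname dw    'your.dw.library';       /* target detail warehouse   */

/* Job 1: land the raw extract in the ODS layer                     */
data ods.customer;
   infile extract dsd truncover;
   input cust_id name :$40. city :$30.;
run;

/* Job 2 (scheduled after Job 1): cleanse into staging              */
data stage.customer;
   set ods.customer;
   city = propcase(city);              /* example cleansing rule    */
run;

/* Job 3: load the detail DW from staging only -- never from data   */
/* that a later job in the flow would create                        */
data dw.customer;
   set stage.customer;
run;
```

Because every job's input and output are permanent, a failed job can simply be rerun from its own layer without repeating the upstream extract.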
In 9.2, there is a restart feature that allows you to restart a job at a specified point, which can be a temporary table (SAS makes the temporary table temporarily permanent between runs).
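If what you are after is the Base SAS 9.2 checkpoint/restart facility for batch jobs, a minimal sketch looks like the following (the checkpoint library name and its physical path are assumptions; check the exact option names against your site's documentation):

```sas
/* Assign a permanent library to hold checkpoint records.           */
/* The libref name and dataset path here are hypothetical.          */
libname chkptlib 'your.checkpoint.library';

/* First (checkpoint-mode) run -- record each completed step:       */
/*    sas batchjob.sas -stepchkpt -stepchkptlib chkptlib            */

/* Restart run after a failure -- already-completed DATA and PROC   */
/* steps are skipped and execution resumes at the failing step:     */
/*    sas batchjob.sas -steprestart -stepchkptlib chkptlib          */
```

The key point for this thread is that the checkpoint library must be permanent between runs, which is exactly the "temporarily permanent" behavior described above.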
If you still feel that you need to save some intermediate data between sessions, you can define those tables as permanent during development, and then change them to temporary before promoting to test/prod.
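One way to make that dev/prod switch without editing the job itself is to point a single libref at either a permanent library or WORK, driven by a macro variable. Everything here (the &env parameter, the MID libref, the library path) is a hypothetical sketch, not a standard convention:

```sas
/* &env would be set by the scheduler or autoexec:                  */
/* DEV keeps intermediates permanent, anything else uses WORK       */
%let env = DEV;

%macro set_midlib;
   %if &env = DEV %then %do;
      libname mid 'your.debug.library';  /* permanent: inspectable  */
   %end;
   %else %do;
      libname mid (work);                /* alias of WORK: scratched */
   %end;
%mend set_midlib;
%set_midlib

/* Job code never changes between environments: it always           */
/* reads and writes through the MID libref                          */
data mid.step1_out;
   set dw.customer;
run;
```

In development you can open mid.step1_out after the run to debug; in test/prod the same code leaves nothing behind.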