Deployments of SAS IT Resource Management 3.x that are implemented with duplicate data checking use special duplicate data control files as part of each staging job. If a staging job ever needs to be rerun, these control datasets must be “reset.” This article explains the issue and offers a way to address it.
The reasons for rerunning a staging job vary, but when the need arises, it is immensely helpful to have a simple process ready to execute that “resets” the staging process. With duplicate data checking enabled, rerunning the staging job against the same input file filters out all of the data as duplicates. This article shows how to “reset” the duplicate data control files to their state as of the previous update.
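As a minimal sketch of the idea, the control datasets can be copied to a backup library before the staging job runs and restored from that backup if the job must be rerun. The library paths and the dupctl: dataset-name pattern below are illustrative assumptions, not the actual names used by IT Resource Management in your deployment.

   /* Hypothetical library holding the duplicate data control datasets */
   libname stglib  "/itrm/staging/control";
   /* Hypothetical backup location for the saved copies */
   libname ctlbkup "/itrm/staging/control_backup";

   /* Before running the staging job: save a copy of the control datasets */
   proc copy in=stglib out=ctlbkup memtype=data;
      select dupctl:;   /* assumed naming pattern for the control datasets */
   run;

   /* If the staging job must be rerun: restore the saved copies,
      "resetting" duplicate data checking to its previous state */
   proc copy in=ctlbkup out=stglib memtype=data;
      select dupctl:;
   run;

Restoring the backup before the rerun means the duplicate data checking logic sees the control information exactly as it was before the failed run, so the same input file is processed rather than filtered out.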