Deployments of SAS IT Resource Management 3.x that implement duplicate data checking use special duplicate data control files as part of each staging job. If a staging job ever needs to be rerun, these special data sets must first be "reset." This article explains the issue and offers a way to address it.
The reasons for rerunning a staging job vary, but when the situation arises, it is immensely helpful to have a simple process ready to "reset" the staging environment. With duplicate data checking enabled, rerunning the staging job with the same input file will filter out all of the data. This article shows how to "reset" the duplicate data control files to their state before the previous update.