

Started ‎11-30-2011 by
Modified ‎10-05-2015 by

Deployments of SAS IT Resource Management 3.x that implement duplicate data checking use special duplicate-data control files as part of each staging job. If a staging job needs to be rerun, these control datasets must first be "reset." This article explains the issue and offers a way to address it.


The reasons for rerunning a staging job vary, but when the situation arises, it is immensely helpful to have a simple, ready-to-run process that "resets" the staging process. With duplicate data checking enabled, rerunning the staging job with the same input file will filter out all of the data, because the control files record that data as already processed. The paper shows how the user can "reset" the duplicate data control files to their state before the previous update.
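As an illustration only, the reset idea amounts to a backup-and-restore of the duplicate-data control dataset. The library and dataset names below (STAGING, BACKUP, WEB_DUPCHK) and the paths are assumptions for the sketch, not names from the ITRM documentation; consult the attached paper for the actual control files used by your staging jobs.

```sas
/* Hypothetical sketch: library paths and the control dataset name
   WEB_DUPCHK are assumptions, not ITRM-documented names. */

libname staging 'c:\itrm\staging';   /* staging library (assumed path) */
libname backup  'c:\itrm\backup';    /* backup library (assumed path)  */

/* Step 1 - run BEFORE each staging job:
   save a copy of the current duplicate-data control dataset. */
proc copy in=staging out=backup memtype=data;
   select web_dupchk;
run;

/* Step 2 - run only when the staging job must be rerun:
   restore the saved copy, returning the duplicate-data check
   to its state before the previous update. */
proc copy in=backup out=staging memtype=data;
   select web_dupchk;
run;
```

With the control dataset restored, rerunning the staging job with the same input file no longer filters out all of the data, because the record of the previous run has been rolled back.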


