SAS Data Integration Studio, DataFlux Data Management Studio, SAS/ACCESS, SAS Data Loader for Hadoop and others

How to create a job which gives failures at record level and not rule level?


I recently started using SAS DM Studio. I was able to create new data jobs, and when I click on Monitoring Reports under Tools, I can see the results for the jobs that were run. However, here is the issue I am facing.

Job1 - The job was created with data from multiple tables, and the total number of records processed is about 10000. FID_X_UNIQIS is a key that is unique across the entire system. In the results I see rows with the same FID_X_UNIQIS multiple times, because there are 3 different rules written against that record. The results I see now are rule-based, not record-based. Is there a way I can get the results at the record level? (Even if a record has failed multiple rules, I still want to see it as one single failure.)
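In case it helps illustrate what I'm after: if the monitoring results can be exported as a table, collapsing the rule-level rows down to one row per record is a simple group-by on the unique key. This is just a sketch in pandas, outside of DM Studio; the column names (FID_X_UNIQIS aside) are hypothetical placeholders for whatever the exported report actually contains.

```python
import pandas as pd

# Hypothetical sample of a rule-level export: one row per (record, failed rule).
# RULE_NAME is an assumed column name; adjust to the real export layout.
rule_level = pd.DataFrame({
    "FID_X_UNIQIS": [101, 101, 101, 102, 103, 103],
    "RULE_NAME":    ["R1", "R2", "R3", "R1", "R2", "R3"],
})

# Collapse to record level: one row per FID_X_UNIQIS,
# keeping which rules failed and how many.
record_level = (
    rule_level
    .groupby("FID_X_UNIQIS", as_index=False)
    .agg(failed_rules=("RULE_NAME", lambda s: sorted(s.unique())),
         failure_count=("RULE_NAME", "nunique"))
)
print(record_level)
```

Here record 101 (which failed three rules) shows up as a single row with failure_count 3, which is the record-level view I'm looking for.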

I tried to achieve this by exporting the data to an Excel sheet, but DM Studio becomes too slow when I increase the export page size above 100. It took me hours just to export 5000 rows of data. If there is any other way to achieve this, please let me know.
