02-01-2018 03:48 AM
I have a program that reads the metadata of all SAS datasets that reside on the network and on the SAS server.
The program automatically assigns libraries for all paths in which datasets have been stored.
At the same time it reads the dataset properties in a step where dictionary.columns is used.
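For context, reading dataset properties via dictionary.columns typically looks something like this (a minimal sketch; MYLIB is a placeholder libref, not the actual library names the program assigns):

```sas
/* Sketch: collect column metadata for every dataset in one      */
/* assigned library. MYLIB is a hypothetical libref.             */
proc sql;
   create table work.ds_columns as
   select libname, memname, name, type, length, label
   from dictionary.columns
   where libname = 'MYLIB';   /* librefs are stored in upper case */
quit;
```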
This works fine; however, I bumped into a corrupt dataset that cannot be opened, and then the error below occurs.
The program stops with this error for just one dataset.
How can I circumvent that?
ERROR: An exception has been encountered.
Please contact technical support and provide them with the following traceback information:
The SAS task name is [SQL (2)]
ERROR: Write Access Violation SQL (2)
Exception occurred at (00D2968F)
Address Frame (DBGHELP API Version 4.0 rev 5)
0000000000D2968F 000000000E20C960 tkmk:tkBoot+0x1794F
0000000000D278C0 000000000E20C9C0 tkmk:tkBoot+0x15B80
00000000022F5947 000000000E20C9C8 sashost:Main+0x1E4B7
000000000995BDF4 000000000E20CA58 sasyoio:tkvercn1+0x1ADB4
02-01-2018 04:08 AM
First of all, I'd follow the advice and open a track with SAS TS, as SAS should give a message for a damaged dataset and not crash.
Then I'd rename the dataset, so that your program won't pick it up, but it will still be available for the support people.
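Renaming can be done with PROC DATASETS (a sketch; MYLIB.BADDS is a placeholder for the damaged dataset). Note that if PROC DATASETS itself fails on the corrupt file, renaming it at the operating-system level is the fallback:

```sas
/* Sketch: rename the damaged dataset so the metadata scan       */
/* skips it, while keeping the file available for SAS TS.        */
proc datasets lib=mylib nolist;
   change badds = badds_damaged;
quit;
```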
02-01-2018 10:46 AM
Have you been able to identify which dataset is the problem? Or at least which library it is in?
And the ever popular: is this repeatable with the same data set causing the same error?
02-01-2018 11:00 AM
I have. For the moment I've skipped it. Luckily enough there was only 1 file :-)
However, I think I need to build in something to prevent this specific file and similar files from being scanned for metadata.
I should try to open it; if it can't be opened, it should be skipped. The first thing that comes to mind is:
I don't know if this works for datasets, but I'll check.
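One way to sketch such a check is with the OPEN function in macro code (a hedged example, not necessarily the snippet the poster had in mind; CHECK_OPEN, LIB, and DS are hypothetical names). A caveat: if the access violation is triggered by merely opening the file, this guard may crash the same way, so testing it on the damaged file first would be prudent:

```sas
/* Sketch: skip a dataset if it cannot be opened.                */
%macro check_open(lib, ds);
   %local dsid rc;
   %let dsid = %sysfunc(open(&lib..&ds));
   %if &dsid > 0 %then %do;
      /* Dataset opened fine: release it and include it in the scan */
      %let rc = %sysfunc(close(&dsid));
      %put NOTE: &lib..&ds opened successfully - include in scan.;
   %end;
   %else %do;
      /* OPEN failed: report the reason and skip this dataset */
      %put WARNING: &lib..&ds could not be opened - skipping. %sysfunc(sysmsg());
   %end;
%mend check_open;
```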
02-02-2018 01:50 AM
Did you have a look at the file before you deleted/renamed it?
You could modify the initial find command I gave you to exclude files below a certain size, for instance.
OTOH, if this happens quite rarely, just treat the resulting ERROR as a means of finding "dead" dataset files you can weed out.