Hi,

We have a large DBF file (6,884,020 records) containing mapping data that we want to read into SAS. When we run the following code in Enterprise Guide 8.3, we only get a little over 4,000,000 rows read in:

    proc import datafile = '\\server\mapping\Export_OutputWo3setA.dbf'
        out = WORK.MapSetA
        REPLACE
        dbms = dbf;
    run;

Thinking we may have hit some limitation, we divided the one file into two smaller DBF files, one with 3,610,374 records and the other with 3,273,646. After running the same code, we received 3,017,085 and 2,848,721 rows respectively. We are not getting any errors when reading in the data. We pulled out a very small subset of 157,804 records, and all of them were read into SAS.

Our next step is to subset the data into these smaller DBF files and read them that way, but we thought we should check with the experts before going through the effort. There seem to be some data issues, but we don't know what they are or how to check for them. Has anyone seen this behavior with DBF files?
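One way to check whether the files themselves are the problem: the dBASE file format stores the expected record count in bytes 4-7 of the header (a little-endian unsigned 32-bit integer), along with the header length (bytes 8-9) and the per-record length (bytes 10-11). Comparing that declared count against what PROC IMPORT actually loaded, and against the file's size on disk, can show whether the file is truncated or contains bytes (such as an embedded 0x1A end-of-file marker) that stop a reader early. Below is a minimal Python sketch of reading those header fields; the toy in-memory header is constructed purely for illustration, not taken from your file:

```python
import struct

def dbf_header_info(header: bytes) -> dict:
    """Parse the fixed fields of a DBF file header.

    Per the dBASE spec: bytes 4-7 hold the record count,
    bytes 8-9 the header length, and bytes 10-11 the length
    of one record, all little-endian unsigned integers.
    """
    record_count, = struct.unpack_from("<I", header, 4)
    header_len, record_len = struct.unpack_from("<HH", header, 8)
    return {
        "record_count": record_count,
        "header_len": header_len,
        "record_len": record_len,
        # Size the file should be: header + records + 1 EOF byte (0x1A)
        "expected_file_size": header_len + record_count * record_len + 1,
    }

# Toy 32-byte header for illustration: version byte and date stamp,
# then a record count of 6,884,020, a 97-byte header, 50-byte records.
toy_header = (
    bytes([0x03, 99, 1, 1])
    + struct.pack("<I", 6884020)
    + struct.pack("<HH", 97, 50)
    + bytes(20)
)

info = dbf_header_info(toy_header)
print(info["record_count"])       # declared number of records
print(info["expected_file_size"]) # compare with os.path.getsize(path)
```

Against your real file you would read the first 32 bytes (`open(path, "rb").read(32)`) instead of the toy header. If the declared record count matches 6,884,020 but the actual file size is smaller than `expected_file_size`, the file is truncated; if the sizes agree but SAS still stops short, the rows themselves likely contain bytes that PROC IMPORT misreads as end-of-data.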