09-11-2014 11:42 AM
I am using a libname statement to access an excel file. In the past, we have found that when writing out to an excel file using this methodology the data is not always replaced. To avoid that problem, we use a proc datasets and then delete the dataset before trying to replace it. In this particular program, I have five different datasets that I am writing out to the excel file. Three of them are cleared with the delete statement, two are not. The two that are not are large (about 96,000 and 19,000 records). The following is the code I am running:
LIBNAME WrkBk EXCEL 'network location\Output Excel File.xlsm';
PROC DATASETS LIB=WrkBk NOLIST;
    DELETE BigData1;  /* repeated for each of the five sheets; the other four names are not given in the post */
QUIT;
LIBNAME WrkBk CLEAR;
We are using Excel 2010 and SAS 9.2 TS Level 2M3. After the DELETE statement executes, the "dataset" BigData1 is no longer in the list of datasets. However, if the Excel file is opened before the data step that repopulates it has run, the data is still there. The three smaller tables all clear out as I expect.
Any assistance will be greatly appreciated. Currently we are manually deleting the data before we run the code.
09-11-2014 01:15 PM
When you open that spreadsheet in Excel, Excel copies all of the data into memory.
Opening the same file again afterwards reads the data from the spreadsheet file, not from Excel's memory.
If you want to protect that spreadsheet, you will need to implement locking/enqueueing on the Windows system:
prevent any Excel user from reading the file before you want to update it, and block the update when anyone has the file open for reading.
This enqueueing is not present in Windows by default.
It is all behaving as designed. What exactly is your problem?
09-11-2014 02:12 PM
Maybe I didn't explain the problem fully enough.
I am getting two different behaviors from the PROC DATASETS DELETE statement. With the three smaller datasets, the data is deleted from the Excel file when the statement is submitted. That is as I expect. However, when the same statements are executed on the larger datasets, the data is not deleted from the Excel file. That is not as I expect.
09-11-2014 02:30 PM
This might be a tech support question.
I typically use a proc sql drop table instead to clear out my tables and haven't had any issues.
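That drop-then-rewrite approach might look like the following sketch. The sheet name BigData1 comes from the thread; the source dataset WORK.BigData1 is an assumption for illustration.

```sas
LIBNAME WrkBk EXCEL 'network location\Output Excel File.xlsm';

/* Drop the sheet's named range so the next write creates it fresh */
PROC SQL;
    DROP TABLE WrkBk.BigData1;
QUIT;

/* Rewrite the sheet; WORK.BigData1 is a hypothetical source name */
DATA WrkBk.BigData1;
    SET work.BigData1;
RUN;

LIBNAME WrkBk CLEAR;
```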
09-11-2014 02:59 PM
What I see:
- The storage location is a network drive (Windows); this is why I am suspicious of sharing issues.
- You had problems before where tables were sometimes not replaced (normally a delete, then a write after a rename).
- You still have problems deleting the bigger tables (Excel sheet names).
Try to eliminate possible causes.
Check whether the same failure also occurs when the Excel file is on a local drive rather than a network drive.
If you can reproduce all of this on a local machine, you can also send it to Tech Support as a case.
Cross your fingers that you don't get the answer "migrate to the latest version, 9.4".
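The local-drive check above could be sketched like this: copy the workbook to a local disk first, then repeat the delete against the copy. Both paths here are placeholders, not the poster's actual locations.

```sas
/* Copy the workbook locally before testing; both paths are hypothetical */
OPTIONS NOXWAIT;
X 'copy "\\server\share\Output Excel File.xlsm" "C:\temp\Output Excel File.xlsm"';

/* Repeat the delete against the local copy and re-check the sheet */
LIBNAME WrkBkLoc EXCEL 'C:\temp\Output Excel File.xlsm';
PROC DATASETS LIB=WrkBkLoc NOLIST;
    DELETE BigData1;
QUIT;
LIBNAME WrkBkLoc CLEAR;
```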
09-11-2014 03:55 PM
Similar but not quite the same. We had the problem with tables both over and under 65K records. I just reran to see if the tables were still not clearing after we manually deleted the rows yesterday. It turns out that the tables are now being deleted by the PROC DATASETS statement. My hypothesis is that there was a formula somewhere inside all of those rows of data and that was causing the issue. I'm still trying to verify.
Regarding Jaap's point, I doubt it is a sharing issue. Only one person really accesses the file, and three of the five tables cleared as expected. The other problems we've had in the past have primarily been sorting issues. When accessing an Excel file via a libname statement and the EXCEL engine, any existing table appears to be updated rather than replaced. This keeps the original order of the data but replaces the changed values. I suspect the write out to the Excel file is actually using a SQL UPDATE statement behind the scenes.
09-11-2014 03:47 PM
The SAS/ACCESS(R) 9.3 Interface to PC Files: Reference also notes cautions at that level.
09-11-2014 04:16 PM
Sharing caused by other users can indeed be eliminated.
Sharing at the network I/O level could be another cause. A networked drive is classified in the "local intranet zone" and behaves slightly differently from a local drive.
I have seen this show up depending on whether UNC names (intranet network) or drive letters (local) are used.
With a Java runtime I have seen this make a difference in JVM behavior.
Another weird error could come from "Opportunistic Locking and Read Caching on Microsoft Windows Networks".
At this moment I would blame the ACE driver. Maybe a formula instead of plain data in the Excel file is the trigger.
The ACE driver is the SQL-like interface Microsoft uses for data connections; you can find it behind almost every interface.