Hello, and thanks in advance for your help. First of all, I am a SAS administrator, among other things.
Our SAS environment is hosted on Windows 2012 and consists of 4 servers:
Base/Foundation, Visual Analytics, Metadata, and Middle-Tier.
Our SAS version is 9.4M3.
My most pressing need is to understand the best way to load data into memory, and keep it there, when moving data from Enterprise Guide to SAS Visual Analytics.
Enterprise Guide uses the Base/Foundation server for its processing and writes the generated datasets to a folder on the Base/Foundation server.
There is an option in Enterprise Guide to load the datasets into Visual Analytics; however, the next time the LASR server is restarted, there is a chance the tables will not reload in Visual Analytics. That is exactly what I need to avoid. So far I have been relying on "ReloadOnStart", but SAS Support says that is a bad idea, and I agree.
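For context, the load I do from Enterprise Guide is essentially equivalent to the following (the host, port, tag, and paths are placeholders, not our real values):

/* Placeholder library paths and LASR connection details -- not our actual values */
libname stage "D:\SASData\marts";                     /* datasets produced on Base/Foundation */
libname valasr sasiola host="vaserver.example.com"    /* LASR server behind Visual Analytics  */
                       port=10010 tag="vapublic";

/* Push the table into LASR memory; it is gone again after a LASR restart */
data valasr.sales;
   set stage.sales;
run;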
I've read in other threads and on user blogs about ways people have kept datasets in memory going from BI to VA. One of them mentioned writing a Windows script that moves datasets from the Base/Foundation server to the SAS Autoload drop zone.
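If it helps frame the question, here is a rough sketch of what that copy step might look like if it were run from a scheduled SAS job rather than a pure Windows script (the library and drop-zone paths below are examples only, not our actual locations):

/* Example paths only -- substitute the real Base library and Autoload drop zone */
libname marts "D:\SASData\marts";
libname dropzone "\\vaserver\Config\Lev1\AppData\SASVisualAnalytics\VisualAnalyticsAdministrator\AutoLoad";

/* Copy selected datasets so the Autoload scheduled task picks them up on its next cycle */
proc copy in=marts out=dropzone memtype=data;
   select sales customers orders;
run;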
Another mentioned writing datasets to a "co-location," but did not explain how to create or maintain one. Our Base/Foundation server's dataset library is already a shared network location.
Why can't I simply reassign the Autoload drop zone to that shared folder on the Base SAS server, and if that is possible, what is the best way to do it? Alternatively, I suggested the Windows-script option to SAS Support, but I have a concern: if I set up a Windows scheduled task to copy datasets from Base SAS to the VA drop zone every 30 minutes, is that enough to guarantee the datasets have finished being written by Base SAS? I then suggested using "datalock", but SAS Support does not recommend that either. Right now we are leaning toward a Windows script that moves datasets from the SAS library to the Append folder.
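One idea I have toyed with as an alternative to datalock: as I understand it, SAS writes a replacement table to a temporary .lck file and only renames it when the step finishes, so the copy job could simply skip any dataset that still has a .lck file sitting next to it. A rough sketch, reusing the example libraries above and a made-up macro name:

/* Hypothetical helper: copy a dataset only if no .lck file suggests it is still being written */
%macro copy_if_idle(ds);
   %if %sysfunc(fileexist(D:\SASData\marts\&ds..sas7bdat.lck)) %then %do;
      %put NOTE: &ds is still being written -- skipping it this cycle.;
   %end;
   %else %do;
      proc copy in=marts out=dropzone memtype=data;
         select &ds;
      run;
   %end;
%mend copy_if_idle;

%copy_if_idle(sales)

Would something like that be reliable enough, or is there a cleaner way?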
Does anyone out there have a solution to this in use at the moment?