Hi there. In our environment there are hundreds of tables living in LASR, and whenever maintenance is scheduled on the servers, or the servers go down for some reason, we always have trouble getting the tables back into the LASR server. The main reason we struggle with reloading is that some of the tables are loaded via Data Integration jobs, some are manually loaded by developers from external flat files, some are auto-loaded via VA AutoLoad libraries, etc. So we don't know which tables need a DI job rerun (or something else) to get them back into LASR.
It would be really nice if we could get a report out of SAS that tells us how those tables were loaded into LASR, i.e. via a DI job, auto-loaded by VA, imported from external files, etc.
If you are on distributed Linux, with several nodes/machines, it may interest you to know that there is a way to "save" all the data from a LASR server into HDFS (repeated once per LASR server you have).
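As a rough illustration of that idea, here is a hedged sketch of saving an in-memory table to SASHDAT in HDFS and reloading it later. The libref, server host, port, and HDFS path are all made-up examples, and the exact statements and options should be checked against your LASR Analytic Server documentation for your release:

```
/* Hypothetical sketch only - names, host, port and paths are examples.  */

/* Save an in-memory table from the LASR server to SASHDAT in HDFS.      */
proc imstat;
   table mylasr.sales;                 /* mylasr = pre-assigned LASR libref */
   save path="/hdfs/va/backup" replace;
quit;

/* After an outage, reload from HDFS instead of re-running every job.    */
libname hdat sashdat path="/hdfs/va/backup"
        server="grid001.example.com";  /* head node of the grid (example) */

proc lasr add data=hdat.sales port=10010;
   performance host="grid001.example.com" nodes=all;
run;
```

Reloading from SASHDAT is typically much faster than re-running the original DI jobs or imports, which is why this approach is attractive for planned maintenance windows.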
Thanks for your prompt response. Yes, we do have a distributed Linux environment, and we do have those administrative reports configured already. No doubt these reports are very helpful and informative. However, I just took another look at them: they give a bunch of information like who is using LASR tables, what actions are being performed, in what capacities they are being used, etc., but I'm afraid I didn't find anything that tells me how those tables were loaded into LASR. Is there a way I can extract this information from logs or somewhere else into an Excel sheet, maybe via some SAS code?
All the audit information is stored in the WIP database (PostgreSQL, generally speaking), in SharedServices, in the audit table. In that table you will find a lot of extended information, and it can be extended even more, depending on your enabled logging options.
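To make that concrete, here is a hedged sketch of querying the SharedServices audit data from SAS. The host, port, credentials, table name, and column names below are assumptions for illustration only; the actual audit table and column names vary by release, so verify them in your own SharedServices database before relying on this:

```
/* Hypothetical sketch - connection details and column names are        */
/* assumptions; check them against your own WIP/SharedServices schema.  */
libname wip postgres server="meta.example.com" port=9432
        database=SharedServices user=dbmsowner password="XXXXXXXX";

proc sql;
   create table work.lasr_audit as
   select *                     /* e.g. user, action, object, timestamp */
   from wip.sas_audit_entry     /* example audit table name             */
   where upcase(action) like '%LOAD%';
quit;
```

Once the audit rows are in a WORK table, exporting them to Excel is a one-liner with PROC EXPORT.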
However, even if you find additional information there, I seriously doubt SAS has a way to know where the load came from in all scenarios.
Instead, you have 2 options:
You set AutoLoad to load data with a specific system account (say, sa_saslasr_autoload1), you set your ETL to run and load data with another system account (say, sa_sasetl_batch), and your users load data with their own user accounts. In this case, you would report the data loads on the administration report per system account (e.g. sa_saslasr_autoload1 and sa_sasetl_batch), and then a separate one for your users. You could create a customized copy of the admin reports to present this information the way you need.
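The account-based classification above could be sketched like this, assuming the audit rows have already been pulled into a table such as work.lasr_audit (the account names and the usernm column are examples, not real deployment values):

```
/* Hypothetical sketch - account names and columns are examples.         */
proc sql;
   select case usernm
            when 'sa_saslasr_autoload1' then 'VA AutoLoad'
            when 'sa_sasetl_batch'      then 'DI / ETL job'
            else                             'Manual user import'
          end as load_source
        , count(*) as loads
   from work.lasr_audit
   group by calculated load_source;
quit;
```

The whole approach stands or falls on strict account separation: if developers ever run ETL loads under their personal IDs, those loads will be misclassified as manual imports.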
You can create an alternative, e.g. a macro in your ETLs and your AutoLoads that adds an observation to a specific tracking table. I have no idea how to apply this to imports done by the users, but perhaps SAS Technical Support might point you to the internal SAS code that is used to import the data; then you could create a customized copy that includes that kind of macro.
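A minimal sketch of such a tracking macro might look like the following. Everything here (the macro name, the audit libref, the table and column names, the source codes) is invented for illustration; the point is just to append one observation per load:

```
/* Hypothetical sketch - all names are examples.                         */
%macro log_lasr_load(table=, source=);
   data _loadrec;
      length table $64 source $32;
      table  = "&table";
      source = "&source";      /* e.g. DI_JOB, AUTOLOAD, USER_IMPORT    */
      loaded = datetime();
      format loaded datetime20.;
   run;
   proc append base=audit.lasr_load_log data=_loadrec;
   run;
%mend log_lasr_load;

/* Called at the end of a DI job, for example: */
%log_lasr_load(table=VALIBLA.SALES_2016, source=DI_JOB);
```

Over time this gives you exactly the registry you asked for: one row per load, stating how each LASR table got there.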
I would go for option 1: it is more standard, requires no modification of SAS internal code, and is much easier to maintain!
PS: If your Workspace Server is set up with host authentication, the user's own ID will be recorded during user imports. But if you have SAS token authentication, the centralized system account used for the token will be recorded instead.