Hello @Alok_Pal,
all the audit information is stored in the WIP Data Server (PostgreSQL, generally speaking), in the SharedServices database, in the audit tables. In these tables you will find a lot of extended information, and they can be extended even more depending on your enabled logging options.
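As a starting point, you can browse those tables from a SAS session with a PostgreSQL libname. This is only a sketch: the host, port, credentials, and especially the audit table name are assumptions you must verify against your own SharedServices schema (it requires SAS/ACCESS Interface to PostgreSQL).

```
/* Sketch only: server, port, credentials and table name below are
   assumptions -- check your own WIP Data Server / SharedServices
   schema before relying on them. Default WIP port is usually 9432. */
libname wip postgres server="yourwipserver" port=9432
        user=&dbuser password=&dbpass database=SharedServices;

proc sql outobs=50;
   /* inspect a sample of the audit records and their columns */
   select *
   from wip.sas_audit;   /* assumed audit table name */
quit;
```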
However, even if you find additional information there, I seriously doubt SAS has a way to know where the load is coming from in every scenario.
Instead, you have two options:
- You set AutoLoad to load data with a specific system account (say, sa_saslasr_autoload1), you set your ETL to run and load data with another system account (say, sa_sasetl_batch), and your users will load data with their own user accounts. In this case, you will need to report the data loads on the Administration report per system account (e.g. sa_saslasr_autoload1 and sa_sasetl_batch) and then another one for your users. You could create a customized copy of the Admin reports to report this information as you need.
- You can create an alternative, e.g. a macro in your ETLs and your AutoLoads that adds an observation to a specific table. I have no idea how to apply this to the imports made by the users, but perhaps SAS Technical Support might point you to the internal SAS code that is used to import the data, so you can create a customized copy that includes that kind of macro.
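For option 2, the macro could look something like the sketch below. Everything here is hypothetical: %log_load is not a SAS-supplied macro, and audit.load_log is an assumed custom dataset that you would create and secure yourself. You would call the macro from your ETL jobs and AutoLoad scripts right after each load.

```
/* Hypothetical helper: appends one observation per load event to a
   custom audit dataset (audit.load_log is an assumed name). */
%macro log_load(table=, source=);
   data _load_rec;
      length table $64 source $32 user $32;
      table     = "&table";
      source    = "&source";      /* e.g. ETL, AUTOLOAD              */
      user      = "&sysuserid";   /* OS account running the session  */
      loaded_at = datetime();
      format loaded_at datetime20.;
   run;

   proc append base=audit.load_log data=_load_rec force;
   run;
%mend log_load;

/* Example call, placed right after a table load in an ETL job */
%log_load(table=SALES_2016, source=ETL);
```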
I would go for option 1: more standard, and no modification of SAS internal code. And much easier to maintain!
PS. If your Workspace Server is set up with Host Authentication, the user ID will be used during user imports. But if you have SAS Token Authentication, the centralized system account used as the token will be used.
Hope it helps!
Kind regards,
Juan