Hi,

Does anyone know where the best place to get documentation around this is? I'm a bit lost as to where to look!

For context, our configuration is:

- SAS 9.4 ML2 on Windows Server 2012, 64-bit, with SAS/ACCESS Interface to Hadoop
- Cloudera CDH 5.8 (Hive and HDFS data sources) on Linux (CentOS 7)
- CDH is Kerberized against Active Directory (2008 R2) and is accessed via a kinit-ed user
- The SAS server is part of the same AD domain (and hence the same Kerberos realm), so no cross-realm configuration is required

We've successfully got SAS talking to Cloudera Hadoop via the SAS/ACCESS connector for Hadoop by logging in to the SAS base server, manually running kinit against an AD user with access to Cloudera Hadoop, then starting the SAS client and running the relevant LIBNAME and PROC SQL statements to issue queries against Hive (roughly what we run is sketched at the end of this post). Results are returned into SAS once the Hive job has completed on the cluster, so that part operates exactly as we'd expect.

However, we really need to set this up so that a client (e.g. SAS Enterprise Guide), also authenticated against the same AD, can talk to the SAS server and then onward to Cloudera using the same Kerberos credentials (credential cache / TGT) available on the Enterprise Guide client. I understand this is done by issuing a kinit on the SAS server via the SAS Object Spawner process, which looks to run a batch script, WorkspaceServer.bat. That makes sense, although I can't really find any workable examples of it! I've also put a sketch at the end of this post of what I imagine that batch change might look like, but I may be well off.

So, my question: is it possible (and is there any documentation anyone knows about that explains it) to issue the kinit via the Object Spawner, using the user credentials passed from the client running Enterprise Guide? The article at http://blogs.sas.com/content/sgf/2014/10/15/sas-high-performance-analytics-connecting-to-secure-hadoop/, in the 'Making connections in a standard SAS session' section, seems to suggest this is all possible, but it's missing a lot of detail on how to actually do it.

The flow through a typical transaction would be:

1. User S logs into Windows on a client with Enterprise Guide installed.
2. User S starts Enterprise Guide and logs into the SAS server.
3. The SAS server (via the Object Spawner?) runs a kinit for the user S running Enterprise Guide.
4. The SAS server can then successfully use Hive / HDFS as the kinit'd user S.
5. Data, requests, etc. are passed from Enterprise Guide through the SAS server and on to Hive / HDFS; results flow back the other way.

Many thanks,
Simon
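PS - for reference, this is roughly what our working (manually kinit-ed) session looks like. The host name, Hive principal, JAR/config paths and table name below are placeholders rather than our real values:

/* Point SAS/ACCESS to the Hadoop client JARs and cluster config files */
options set=SAS_HADOOP_JAR_PATH="D:\hadoop\jars";
options set=SAS_HADOOP_CONFIG_PATH="D:\hadoop\conf";

/* Connect to Kerberized HiveServer2, relying on the ticket from the manual kinit */
libname hdp hadoop
   server="hiveserver01.example.com"
   port=10000
   schema=default
   hive_principal="hive/_HOST@EXAMPLE.COM";

/* Simple query to confirm the round trip to Hive works */
proc sql;
   select count(*) as row_count
   from hdp.sample_table;
quit;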
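And this is the sort of thing I imagine might go into WorkspaceServer.bat - purely a sketch of my current (possibly wrong) understanding, not something I've tested. It assumes MIT Kerberos for Windows is installed on the SAS server, that a keytab is somehow available for the connecting user, and that the spawner exposes the client user name as %USERNAME% - all of which are assumptions on my part rather than anything I've found documented:

rem Hypothetical addition to WorkspaceServer.bat - untested, all names are placeholders
rem Point the workspace server session at a per-user credential cache
set KRB5CCNAME=%TEMP%\krb5cc_%USERNAME%
rem Obtain a TGT for the connecting user from an (assumed) per-user keytab
"C:\Program Files\MIT\Kerberos\bin\kinit.exe" -k -t "D:\keytabs\%USERNAME%.keytab" %USERNAME%@EXAMPLE.COM

If anyone can confirm whether this is even the right mechanism, or point me at what the Object Spawner actually makes available to WorkspaceServer.bat, that would be hugely appreciated.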