Well, actually what you have done is make a connection to the Hive server, not to HDFS. I think the problem is that SAS cannot connect to HDFS, and hence the original error. When you run the original code:

libname hdplib hadoop server="XXXXXX" user="XXX" password="$XXXX" port=10001;
data hdplib.class;
   set sashelp.class(obs=10);
run;

I think you will see that the temp file is being written to the local file system and not to HDFS. When Hive then attempts to move the temp file from HDFS into Hive, it fails because the file is not available on the HDFS file system.

I see you are not pointing to a configuration file. That file tells SAS where to find the HDFS and MapReduce components, so perhaps this is the issue.

You will probably face the same issue if you use a FILENAME statement to Hadoop, which underlines that it is an HDFS connectivity issue and not a Hive one:

filename out hadoop '/tmp/' user='sasdemo' pass='Orion123' recfm=v lrecl=32167 dir;
data _null_;
   file out(shoes);
   put 'write data to shoes file';
run;

In my case the log shows:

438  filename out hadoop '/tmp' cfg='/tmp/richard.cfg'
439       user='sasinst' pass=XXXXXXXXXX recfm=v lrecl=32167 dir debug;
440  data _null_;
441     file out(shoes4);
442     put 'write data to shoes file';
443  run;
ERROR: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.DistributedFileSystem

Sorry I can't help further; I am having the exact same issue and have decided to reinstall Hadoop to check whether that is the cause.
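Since the NoClassDefFoundError points at the Hadoop HDFS client classes rather than at Hive, it is worth checking that SAS can actually find the Hadoop client JARs and the cluster configuration files before the LIBNAME or FILENAME statement runs. A sketch, assuming SAS/ACCESS Interface to Hadoop with hypothetical paths (/opt/hadoop/jars and /opt/hadoop/conf — substitute your own):

```sas
/* Point SAS at the Hadoop client JARs; if the hadoop-hdfs JAR is
   missing from this directory, errors like NoClassDefFoundError for
   org.apache.hadoop.hdfs.DistributedFileSystem can result. */
options set=SAS_HADOOP_JAR_PATH="/opt/hadoop/jars";

/* Point SAS at the cluster configuration files (core-site.xml,
   hdfs-site.xml, etc.) so it knows where the HDFS and MapReduce
   services live. */
options set=SAS_HADOOP_CONFIG_PATH="/opt/hadoop/conf";
```

These settings (or a cfg= file as in the log above) are what tell SAS where the HDFS and MapReduce components are; without them a LIBNAME can still reach the Hive server on its port while direct HDFS access fails.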