Hi, I'm trying to connect SAS 9.4 on Windows to the Cloudera Hadoop 5.0.1-1 VMware image (installed in Oracle VM VirtualBox).

As described in the SAS forums, I copied the required configuration files and JAR files to local Windows directories and assigned those directory paths to SAS_HADOOP_CONFIG_PATH and SAS_HADOOP_JAR_PATH respectively. Please find attached screen captures of the config files and JAR files I pulled from the Cloudera system.

When I run PROC HADOOP and LIBNAME statements, I get Java-related errors. Can anyone help me fix this? The code and log output are below.

/** Connecting with PROC HADOOP **/

25   options set=SAS_HADOOP_CONFIG_PATH="D:\hadoop\conf";
26   options set=SAS_HADOOP_JAR_PATH="D:\hadoop\jarlib";
27   /* create authdomain in SAS Metadata named "HADOOP" *
28    * copy file from my local file system to HDFS      *
29    * HDFS location is /user/sas                       *
30    *****************************************************/
31   proc hadoop username='cloudera' password=XXXXXXXXXX verbose;
32      hdfs mkdir='/home/cloudera/Desktop/newdir';
33      hdfs copytolocal='/home/cloudera/Desktop/samplenew.txt'
34           out='D:\hadoop\sampletest.txt';
ERROR: java.io.FileNotFoundException: File /home/cloudera/Desktop/samplenew.txt does not exist
ERROR:   at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:511)
ERROR:   at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
ERROR:   at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
ERROR:   at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:397)
ERROR:   at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:337)
ERROR:   at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
ERROR:   at com.dataflux.hadoop.DFHDFS$10.run(DFHDFS.java:322)
ERROR:   at java.security.AccessController.doPrivileged(Native Method)
ERROR:   at javax.security.auth.Subject.doAs(Subject.java:415)
ERROR:   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
ERROR:   at com.dataflux.hadoop.DFHDFS.copyToLocal(DFHDFS.java:313)
35   run;

NOTE: The SAS System stopped processing this step because of errors.
NOTE: PROCEDURE HADOOP used (Total process time):
      real time           0.07 seconds
      cpu time            0.01 seconds

/** Connecting with LIBNAME statement **/

36   libname hivelib hadoop server="10.237.22.44" port=22 user="cloudera" password=XXXXXXXXXX
36 !         subprotocol=hive
37           cfg="D:\hadoop\conf\core-site.xml";
ERROR: java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
ERROR: Unable to connect to the Hive server.
ERROR: Error trying to establish connection.
ERROR: Error in the LIBNAME statement.
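For reference, here is a variant I am considering trying next. It is only a guess on my part: it assumes the source file would first have to exist in HDFS (e.g. under /user/cloudera, since copytolocal reads from HDFS rather than the VM's Linux desktop), and that HiveServer listens on its default port 10000 instead of port 22, which I used above:

   /* Untested sketch -- paths and port are assumptions, not verified */
   proc hadoop username='cloudera' password=XXXXXXXXXX verbose;
      hdfs mkdir='/user/cloudera/newdir';               /* HDFS path, not a local Linux path */
      hdfs copytolocal='/user/cloudera/samplenew.txt'   /* source must already be in HDFS */
           out='D:\hadoop\sampletest.txt';
   run;

   libname hivelib hadoop server="10.237.22.44" port=10000   /* 22 is the SSH port, not Hive */
           user="cloudera" password=XXXXXXXXXX subprotocol=hive
           cfg="D:\hadoop\conf\core-site.xml";

I would appreciate confirmation on whether these assumptions are correct for the Cloudera QuickStart setup.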