SJ-sasadmin
Calcite | Level 5

I'm validating the FILENAME statement with the program below from EG and getting an error. The permissions on "/tmp" are 777. I'm able to log in from any SAS server to any Cloudera 5.4.5 Hadoop server. We are running RHEL 6.7 on both the SAS and Hadoop servers.

filename out hadoop "/tmp/sas-test-file"
   user="sasabc" pass="xxxx";

data _null_;
   file out;
   put "here is a line in myfile";
run;

 

Error: 

ERROR: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
ERROR: at com.dataflux.hadoop.DFConfiguration.<init>(DFConfiguration.java:88)
ERROR: at com.dataflux.hadoop.DFConfiguration.<init>(DFConfiguration.java:73)
ERROR: Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
ERROR: at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
ERROR: at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
ERROR: at java.security.AccessController.doPrivileged(Native Method)
ERROR: at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
ERROR: at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
ERROR: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
ERROR: ... 2 more
ERROR: Validate the contents of the Hadoop configuration file and ensure user permissions are correct.
ERROR: Unable to create Hadoop configuration.
ERROR: Failure in Hadoop create instance action. Use the debug option for more information.

 

Here are the jar files:

commons-configuration-1.9.jar
jackson-core-asl-1.8.11.jar
hive-exec-1.1.0-cdh5.4.5.jar
guava-15.0.jar
hadoop-auth-2.6.0-cdh5.4.5.jar
hadoop-common-2.6.0-cdh5.4.5.jar
hadoop-core-2.6.0-mr1-cdh5.4.5.jar
hadoop-hdfs-2.6.0-cdh5.4.5.jar
hive-jdbc-1.1.0-cdh5.4.5.jar
hive-metastore-1.1.0-cdh5.4.5.jar
hive-service-1.1.0-cdh5.4.5.jar
httpclient-4.3.jar
httpcore-4.3.jar
libfb303-0.9.2.jar
pig-0.12.0-cdh5.4.5-withouthadoop.jar
protobuf-java-2.5.0.jar
commons-beanutils-1.8.3.jar
commons-cli-1.2.jar
commons-collections-3.2.1.jar
commons-logging-1.2.jar
commons-lang3-3.1.jar
jackson-jaxrs-1.9.2.jar
jackson-mapper-asl-1.9.12.jar
jackson-xc-1.9.2.jar
slf4j-log4j12-1.7.5.jar
slf4j-api-1.7.5.jar
hadoop-client-2.3.0-mr1-cdh5.0.0.jar

 

Paths set in the "sasv9.cfg" file:

-SET SAS_HADOOP_CONFIG_PATH "C:\hdp\config"
-SET SAS_HADOOP_JAR_PATH "C:\hdp\lib"

 

I'm attaching the log output of this program and of the "proc javainfo picklist" program as well.
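For reference, a minimal way to confirm what the SAS session actually resolves for those two environment variables, independent of the attached logs (just two %SYSGET calls; nothing below is specific to this deployment):

/* Echo the Hadoop environment variables as seen by this SAS session.
   An empty or wrong SAS_HADOOP_JAR_PATH would help explain the
   NoClassDefFoundError for org.apache.hadoop.conf.Configuration above. */
%put SAS_HADOOP_JAR_PATH=%sysget(SAS_HADOOP_JAR_PATH);
%put SAS_HADOOP_CONFIG_PATH=%sysget(SAS_HADOOP_CONFIG_PATH);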

5 REPLIES
SASKiwi
PROC Star

The documentation lists certain requirements that must be met for this to work:

 

http://support.sas.com/documentation/cdl/en/lestmtsref/68024/HTML/default/viewer.htm#p0we15v9bcy9qon...

 

I would check these out in the first instance. What SAS version are you using - 9.4 M3?

Penguin
Calcite | Level 5

 

Based on the error, it looks as if something failed because of the Hadoop configuration files that were put in SAS_HADOOP_CONFIG_PATH. These environment variables, JAR files, and configuration files need to be on the SAS Application Server, not on the Windows PC.

 

#1 - Why is the path mentioned below a Windows path? 

-SET SAS_HADOOP_CONFIG_PATH "C:\hdp\config"
-SET SAS_HADOOP_JAR_PATH "C:\hdp\lib"

 

These environment variables need to be set on the SAS Application Server (the Linux server), not on the Windows PC (unless, of course, you are running SAS on Windows).
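For comparison, on a Linux SAS Application Server the same two entries would simply use Linux-style paths, for example (the directories below are placeholders, not actual paths from this deployment):

-SET SAS_HADOOP_CONFIG_PATH "/opt/sas/hadoop/config"
-SET SAS_HADOOP_JAR_PATH "/opt/sas/hadoop/lib"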

 

 

#2 - Which sasv9.cfg did you update? There are a lot of sasv9.cfg files; you probably need to update the ones on the SASApp server under config/Lev1/SASApp and/or config/Lev1/SASApp/WorkspaceServer, as these are applied in a specific order of precedence. To check, run config/Lev1/SASApp/WorkspaceServer.sh -nodms -verbose on the SASApp server to see the options, then type endsas; to quit.
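The same check can also be made from inside any SAS session on SASApp (a minimal sketch; PROC OPTIONS simply reports the CONFIG option, which lists the sasv9 config files the session applied, in order):

/* Show which sasv9*.cfg files this session actually applied, so you can
   see which file the Hadoop -SET lines should go into. */
proc options option=config;
run;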

#3 - Set options in sasv9_usermods.cfg and/or sasv9_local.cfg, not in sasv9.cfg. The sasv9.cfg file(s) can get overwritten during install, configuration, or upgrade tasks.

 

#4 - Your JAR list looks short; usually there are 100+ JARs. Specifically, I do not see any for YARN or MR1; you probably need one or the other but not both, and getting that wrong will cause errors.

 

#5 - This error looks related to the XML configuration files. You will usually need hdfs-site.xml, hive-site.xml, core-site.xml, yarn-site.xml, oozie-site.xml, and possibly the manually created merged XML file explained in the SAS/ACCESS Interface to Hadoop documentation.

 

In SAS 9.4 TS1M3 you can use the SAS Deployment Manager to collect the Hadoop configuration files and JAR files and to set the SAS_HADOOP_CONFIG_PATH and SAS_HADOOP_JAR_PATH environment variables on the Linux server for you.

 

SJ-sasadmin
Calcite | Level 5

Thank you for the suggestion.

Actually, I took another approach and manually fetched the JARs and XML files. There are now over 100 JARs, and for the XML I've created a merged XML file.

Looks like I was updating the wrong sasv9 file. I will try step #2 later on and keep you posted.

 

Again, thanks for looking into this.

Matt
Quartz | Level 8

Can someone send me the list of JAR files required for the SAS/ACCESS to Hadoop configuration? I have gathered a few JAR files from the Hadoop admin and am running into the same issues.

 

Also, we have made two SAS compute servers part of the Hadoop cluster as Hadoop nodes. Does this change the SAS/ACCESS to Hadoop configuration?

SJ-sasadmin
Calcite | Level 5

I'm getting the error below now.

EG error:

ERROR: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
NOTE: Validate the contents of the Hadoop configuration file and ensure user permissions are correct.
ERROR: Unable to create stream service from /test/sas-test.
NOTE: The SAS System stopped processing this step because of errors.

 

EG program:
filename out hadoop "/test/sas-test/"
   cfg="/path/config/HadoopKerberosConfig.xml"
   user='xxx' pass='xxx';

data _null_;
   file out;
   put 'writing from sas';
run;

-------------------

 

I believe that when SAS writes to HDFS it connects to the Hive server; is that true?

I've verified that hdfs, root, hive, yarn, and the user writing the SAS program all have valid Kerberos tickets on the Hive server.

It looks like it's complaining about the authentication method. I'm attaching the "HadoopKerberosConfig.xml" file as well.
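Not an authoritative fix, but one variation that may be worth testing: on a Kerberos-secured cluster the connection is normally authenticated with the Kerberos ticket of the account running the workspace server session, and supplying user=/pass= usually goes with simple authentication. A stripped-down statement like the sketch below (same path and cfg= file as the program above, no user=/pass=, run after a valid ticket exists, e.g. from kinit) would show whether the error changes:

/* Sketch only: rely on the session's existing Kerberos ticket instead of
   user=/pass=; the path and cfg file are the ones from the program above. */
filename out hadoop "/test/sas-test/"
   cfg="/path/config/HadoopKerberosConfig.xml";

data _null_;
   file out;
   put 'writing from sas';
run;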

 

root@nj1sasmd2 /path/to/config
> ls -ltr
total 72
-rwxr-xr-x 1 root root 3964 Nov 10 21:14 yarn-site.xml
-rwxr-xr-x 1 root root 1510 Nov 10 21:14 topology.py
-rwxr-xr-x 1 root root 920 Nov 10 21:14 topology.map
-rwxr-xr-x 1 root root 315 Nov 10 21:14 ssl-client.xml
-rwxr-xr-x 1 root root 4638 Nov 10 21:14 mapred-site.xml
-rwxr-xr-x 1 root root 314 Nov 10 21:14 log4j.properties
-rwxr-xr-x 1 root root 2247 Nov 10 21:14 hdfs-site.xml
-rwxr-xr-x 1 root root 407 Nov 10 21:14 hadoop-env.sh
-rwxr-xr-x 1 root root 3868 Nov 10 21:14 core-site.xml
-rwxr-xr-x 1 root root 14384 Nov 12 06:31 HadoopKerberosConfig.xml

---------------------

Both "/path/lib" and "/path/config" have 777 permissions.

The /test directory has 777 permissions.

The merged "HadoopKerberosConfig.xml" file now contains the core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml configuration. I've also tried it with only yarn or only mapred.

 

 

------

 

> grep -A2 auth HadoopKerberosConfig.xml
<name>hadoop.security.authentication</name>
<value>kerberos</value>
</property>
--
<name>hadoop.security.authorization</name>
<value>true</value>
</property>
--
<value>authentication</value>
</property>
<property>
<name>hadoop.security.auth_to_local</name>
<value>DEFAULT</value>
</property>

------------------------------------------------------

 

I have 145 JARs.

[id@server lib]$ ls | wc -l
145

---------------------------------

 

Options set internally at initialization:

CONFIG /path/SASFoundation/9.4/sasv9.cfg
/path/SASFoundation/9.4/nls/en/sasv9.cfg
/path/SASFoundation/9.4/sasv9_local.cfg
/path/config/Lev1/SASApp/sasv9.cfg
/path/Lev1/SASApp/sasv9_usermods.cfg
/path/Lev1/SASApp/WorkspaceServer/sasv9.cfg
/path/Lev1/SASApp/WorkspaceServer/sasv9_usermods.cfg

--------------------------

One question: I don't see "sasv9_local.cfg" in the WorkspaceServer directory, but it is in /sashome/SASFoundation/9.4/. Should I update it?

 

> grep HADOOP /path/config/Lev1/SASApp/WorkspaceServer/sasv9.cfg

-SET SAS_HADOOP_CONFIG_PATH "/path/config"
-SET SAS_HADOOP_JAR_PATH "/path/lib"
-----------------------------
root@nj1sascn1 /path/config/Lev1/SASApp/WorkspaceServer
> grep HAD /path/config/Lev1/SASApp/WorkspaceServer/sasv9_usermods.cfg
-SET SAS_HADOOP_CONFIG_PATH "/path/config"
-SET SAS_HADOOP_JAR_PATH "/path/lib"

-----------------------------
