mdavidson
Quartz | Level 8

Hello All --

I've created a new data library that points to Hadoop, but I'm not able to run any queries against it. For example, here is the LIBNAME statement:

libname test hadoop subprotocol=hive2 port=10000 server="xxx.xx.com"  schema=xxx;

When I run the following query, I get an error message:

proc sql;
   create table test as
   select *
   from test.customer
   where state='RI';
quit;

ERROR: Unable to create stream service from /tmp/sasdata_16_01_48_861_00004. Use the debug option for more information is neither

       an HDFS file nor an HDFS directory. A valid HDFS file or directory is required.

Doing some research, it looks like most issues like this come down to JAR files not being present. I checked the JAR files listed in SAS_HADOOP_JAR_PATH, and here is what is there; everything seems to check out. Any ideas on what might be causing the problem? A quick check of what the session actually sees is shown after the list.

commons-cli-1.2.jar
core-site.xml
guava-11.0.2.jar
hadoop-auth-2.0.0-cdh4.5.0.jar
hadoop-common-2.0.0-cdh4.5.0.jar
hadoop-core-2.0.0-mr1-cdh4.5.0.jar
hadoop-hdfs-2.0.0-cdh4.5.0.jar
hive-exec-0.10.0-cdh4.5.0.jar
hive-jdbc-0.10.0-cdh4.5.0.jar
hive-metastore-0.10.0-cdh4.5.0.jar
hive-service-0.10.0-cdh4.5.0.jar
hive-site.xml
libfb303-0.9.0.jar
pig-0.11.0-cdh4.5.0.jar
protobuf-java-2.4.0a.jar
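
A quick way to double-check what the session actually sees from inside SAS, assuming SAS_HADOOP_JAR_PATH is set as an environment variable:

/* Echo the environment variable the Hadoop engine reads. */
%put NOTE: SAS_HADOOP_JAR_PATH=%sysget(SAS_HADOOP_JAR_PATH);

/* Show the SAS-side helper JARs resolved through the picklist;
   the cluster JARs themselves come from SAS_HADOOP_JAR_PATH above. */
proc javainfo picklist 'hadoop/hdoopsasjars.txt';
run;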

1 ACCEPTED SOLUTION

Accepted solution: JBailey's reply below, which lists the JAR files that work for HDP 2.0 and for CDH 4.5.

17 REPLIES
mdavidson
Quartz | Level 8

Also, here are the results of proc javainfo picklist 'hadoop/hdoopsasjars.txt'; run;

Picklist URLs:

file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/sas.hadoop.hivehelper_904000.0.0.20130522190000_v940/sas.hadoop.hivehelper.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/Log4J_1.2.15.0_SAS_20121211183158/log4j.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/commons_beanutils_1.8.2.0_SAS_20121211183319/commons-beanutils.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/commons_collections_3.2.1.0_SAS_20121211183225/commons-collections.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/commons_logging_1.1.1.0_SAS_20121211183202/commons-logging.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/jackson_1.9.7.0_SAS_20121211183158/jackson.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/slf4j_1.5.10.0_SAS_20121211183229/slf4j-api.jar
file:/mnt/sas/sashome/SASVersionedJarRepository/eclipse/plugins/slf4j_1.5.10.0_SAS_20121211183229/slf4j-log4j12.jar

Total URLs: 8

JBailey
Barite | Level 11

Hi,

Here is what I have for CDH 4.5.

avro-1.7.4.jar
commons-cli-1.2.jar
commons-collections-3.2.1.jar
commons-configuration-1.6.jar
commons-httpclient-3.1.jar
commons-logging-1.1.1.jar
guava-11.0.2.jar
hadoop-auth-2.0.0-cdh4.5.0.jar
hadoop-common-2.0.0-cdh4.5.0.jar
hadoop-core-2.0.0-mr1-cdh4.5.0.jar
hadoop-hdfs-2.0.0-cdh4.5.0.jar
hive-exec-0.10.0-cdh4.5.0.jar
hive-jdbc-0.10.0-cdh4.5.0.jar
hive-metastore-0.10.0-cdh4.5.0.jar
hive-service-0.10.0-cdh4.5.0.jar
libfb303-0.9.0.jar
log4j-1.2.17.jar
pig-0.11.0-cdh4.5.0-withouthadoop.jar
pig-0.11.0-cdh4.5.0.jar
protobuf-java-2.4.0a.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar

keds
Calcite | Level 5

Hi JBailey,

We are trying to connect to Hortonworks 2.2.0.2.0.6.0-101 and Cloudera CDH 4.6 using SAS/ACCESS Interface to Hadoop.

How do we get the list of required JAR files? Does SAS provide any documentation for this?

We are getting the error below when trying to connect. Any thoughts?

4   LIBNAME hdpcdh HADOOP  PORT=10000 SERVER="XXXXXXXXXX"  SCHEMA=default
15   USER=XXXXXXXXXXX  PASSWORD=XXXXXXXXXX subprotocol=hive2 ;

ERROR: java.sql.SQLException: Could not establish connection to jdbc:hive2://XXXXXXXXXXXX:10000/default: java.net.ConnectException:
       Connection timed out: connect
ERROR: Unable to connect to the Hive server.
ERROR: Error trying to establish connection.
ERROR: Error in the LIBNAME statement.

Thanks

keds

DaveR_SAS
SAS Employee

The Configuration Guide for SAS 9.4 Foundation includes the topic "Configuring Hadoop JAR Files." Here is the PDF that includes that topic: http://support.sas.com/documentation/installcenter/en/ikfdtnunxcg/66380/PDF/default/config.pdf

DaveR_SAS
SAS Employee

Sorry: The doc link I supplied above is appropriate for SAS 9.4M1. I've just been told the following: "HW 2.2.0 and CDH4.6 came out after SAS 9.4M1 (which is the latest release in the field). I suspect their SAS consultant will have to help them find the correct files."

JBailey
Barite | Level 11

Hi Keds,

I don't have your exact setup, but I have something close. Keep in mind that you cannot access HDP and CDH at the same time (from a single SAS process); this is a consequence of the SAS_HADOOP_JAR_PATH= environment variable and the way the JARs are loaded. A short sketch after the two JAR lists below illustrates the point.

Here is what I have for HDP 2.0:

   commons-cli-1.2.jar
   guava-12.0.1.jar
   hadoop-auth-2.2.0.2.0.6.0-101.jar
   hadoop-common-2.2.0.2.0.6.0-101.jar
   hadoop-hdfs-2.2.0.2.0.6.0-101.jar
   hadoop-mapreduce-client-common-2.2.0.2.0.6.0-101.jar
   hadoop-mapreduce-client-core-2.2.0.2.0.6.0-101.jar
   hadoop-mapreduce-client-jobclient-2.2.0.2.0.6.0-101.jar
   hadoop-yarn-api-2.2.0.2.0.6.0-101.jar
   hadoop-yarn-client-2.2.0.2.0.6.0-101.jar
   hadoop-yarn-common-2.2.0.2.0.6.0-101.jar
   hive-exec-0.12.0.2.0.6.1-101.jar
   hive-jdbc-0.12.0.2.0.6.1-101.jar
   hive-metastore-0.12.0.2.0.6.1-101.jar
   hive-service-0.12.0.2.0.6.1-101.jar
   httpclient-4.1.3.jar
   httpcore-4.1.4.jar
   libfb303-0.9.0.jar
   pig-0.12.0.2.0.6.1-101.jar
   protobuf-java-2.5.0.jar

Here is what I have for CDH 4.5:

   avro-1.7.4.jar
   commons-cli-1.2.jar
   commons-collections-3.2.1.jar
   commons-configuration-1.6.jar
   commons-httpclient-3.1.jar
   commons-logging-1.1.1.jar
   guava-11.0.2.jar
   hadoop-auth-2.0.0-cdh4.5.0.jar
   hadoop-common-2.0.0-cdh4.5.0.jar
   hadoop-core-2.0.0-mr1-cdh4.5.0.jar
   hadoop-hdfs-2.0.0-cdh4.5.0.jar
   hive-exec-0.10.0-cdh4.5.0.jar
   hive-jdbc-0.10.0-cdh4.5.0.jar
   hive-metastore-0.10.0-cdh4.5.0.jar
   hive-service-0.10.0-cdh4.5.0.jar
   libfb303-0.9.0.jar
   log4j-1.2.17.jar
   pig-0.11.0-cdh4.5.0-withouthadoop.jar
   pig-0.11.0-cdh4.5.0.jar
   protobuf-java-2.4.0a.jar
   slf4j-api-1.6.1.jar
   slf4j-log4j12-1.6.1.jar
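
To make the one-distribution-per-session point concrete, here is a minimal sketch; the directory name is just a placeholder for wherever you collected the JARs above:

/* Placeholder path holding only the CDH 4.5 JARs listed above. Point the
   session at exactly one distribution before assigning any Hadoop libref;
   switching to the HDP 2.0 set means starting a new SAS session. */
options set=SAS_HADOOP_JAR_PATH="/opt/sas/hadoopjars/cdh45";

libname cdhlib hadoop subprotocol=hive2 port=10000
        server="cdh-hive.example.com" schema=default;

Setting the variable in the SAS configuration file, or in the shell before SAS starts, works just as well; the important part is that only one JAR set is visible to the process.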

Nikolay
Calcite | Level 5

Hi Jeff,

Thank you for your list of JAR files for Hortonworks.

JBailey
Barite | Level 11

My pleasure. I hope you found it helpful. If you need anything, holler.

pmohanty
Fluorite | Level 6

Hi,

We have Hortonworks 1.2 configured on our machine.

The Hadoop JAR files we have set in SAS_HADOOP_JAR_PATH are:

guava-11.0.2.jar
hadoop-core-2.0.0-mr1-cdh4.3.1.jar
hive-exec-0.10.0-cdh4.3.1.jar
hive-jdbc-0.10.0-cdh4.3.1.jar
hive-metastore-0.10.0-cdh4.3.1.jar
hive-service-0.10.0-cdh4.3.1.jar
libfb303-0.9.0.jar
pig-0.11.0-cdh4.3.1.jar
protobuf-java-2.4.0a.jar

but the following three JAR files are missing:

hadoop-auth-2.0.0-cdh4.3.1.jar
hadoop-common-2.0.0-cdh4.3.1.jar
hadoop-hdfs-2.0.0-cdh4.3.1.jar

The Hadoop cluster configuration files set in SAS_HADOOP_CONFIG_PATH are:

hdfs-site.xml
mapred-site.xml
core-site.xml

So my question is whether we need the other three JAR files mentioned above to interface with Hadoop, connect to a specific Hadoop cluster, and store data in HDFS using the SPD Engine.

You mentioned earlier that we do not need SAS/ACCESS to interface with a Hadoop cluster using the SPD Engine.

But in our case, when we ran just the LIBNAME statement below:

LIBNAME hdplib SPDE '/SAS/sangram/data' HDFSHOST=DEFAULT;

we got an error about not being able to get the Kerberos configuration.
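
For reference, the full setup we are attempting looks like the sketch below; the two directory paths are placeholders for where the JAR and configuration files are kept:

/* Placeholders for our JAR and cluster-configuration directories; both must
   match the Hortonworks cluster, and a valid Kerberos ticket must already
   exist before the LIBNAME runs. */
options set=SAS_HADOOP_JAR_PATH="/opt/sas/hadoopjars/hdp";
options set=SAS_HADOOP_CONFIG_PATH="/opt/sas/hadoopcfg/hdp";

LIBNAME hdplib SPDE '/SAS/sangram/data' HDFSHOST=DEFAULT;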


So, please help me with this: do we need the other three JAR files, and if so, can you point us to where to get them? It would be greatly appreciated.

Thank you; waiting for a reply.

Sangramjit

JBailey
Barite | Level 11

Hi Sangramjit,

You may want to open a different discussion for this topic. The SPDE interface is distinct from SAS/ACCESS. I notice that you say you are using Hortonworks, but you are listing Cloudera JAR files. This will lead to problems. You want to make certain that your JAR files match your distribution.

Troubleshooting Kerberos issues can be fraught with peril; it requires someone to go through your XML configuration files. If your problem is Kerberos, it is best to open a track with SAS Technical Support.

Best wishes,

Jeff

pmohanty
Fluorite | Level 6

Hi Jeff,

I have the following Hadoop JAR files for Hortonworks 1.2 placed on my machine:

protobuf-java-2.4.1.jar
libfb303-0.9.0.jar
hive-service-0.11.0.1.3.3.4-2.jar
hive-metastore-0.11.0.1.3.3.4-2.jar
hive-jdbc-0.11.0.1.3.3.4-2.jar
hive-exec-0.11.0.1.3.3.4-2.jar
hadoop-core-1.2.0.1.3.3.4-2.jar
guava-11.0.2.jar
pig-0.11.1.1.3.3.4-2.jar

But the other JAR files for Hortonworks 1.2, namely:

hadoop-auth-*.jar
hadoop-common-*.jar
hadoop-hdfs-*.jar

are not there. The rest of the configuration files are present and are set via the respective environment variables.

While executing the SAS script, we get the following errors:

ERROR: Could not connect to HDFS.
ERROR: Libref HDPLIB is not assigned.
ERROR: Error in the LIBNAME statement.
...
ERROR: Call to method org.apache.hadoop.fs.FileSystem::get(URI, Configuration) failed.
ERROR: java.lang.ExceptionInInitializerError
...
java.lang.IllegalArgumentException: Can't get Kerberos configuration


Can you please tell me whether this type of error comes from the three missing JAR files above, or whether something else is going on?


Thanks,

Sangramjit



pmohanty
Fluorite | Level 6

Jeff, can you go through the discussion "SPD Engine: Storing Data in the Hadoop Distributed File System" opened by Sangramjit Panda?

Thanks,

Sangramjit

fatcat
Calcite | Level 5

Hi, I am trying to connect to Hortonworks with SAS/ACCESS Interface to Hadoop. I am able to get the klist and to ssh from the SAS client machine to one of the nodes, but I get an error when trying to connect via SAS Enterprise Guide. Any idea?

libname zhdplib  hadoop subprotocol='hive2' server='myserver'   schema=myschema; user='user123' pwd='Test123';

ERROR: Unable to connect to the Hive server.
ERROR: Error trying to establish connection.
ERROR: Error in the LIBNAME statement.
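
Could the semicolon after schema=myschema be part of the problem? It ends the LIBNAME statement at that point, so user= and pwd= are never passed to the engine. Here is the corrected form I plan to try (server, schema, and credentials are placeholders):

/* Single terminating semicolon so the credentials stay on the statement;
   all values here are placeholders. */
libname zhdplib hadoop subprotocol=hive2 server='myserver' schema=myschema
        user='user123' password='Test123';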

 

Alex
