05-27-2014 07:05 AM
I have a task to make a connection from SAS to Hortonworks.
There is an instance of SAS 9.4 with SAS/ACCESS to Hadoop on Windows Server.
Also there is a cluster of Hortonworks (with 10 nodes, OS: RedHat).
Unfortunately, I couldn't find any information about which JAR files I need.
I only found "Chapter 4: Configuring Hadoop JAR Files" in the configuration guide.
If someone has experience with this task, please let me know how you did it.
05-27-2014 07:53 AM
Did you follow this:
I successfully connected to IBM BigInsights following this.
Where to find the appropriate JAR files varies by distribution - there is no distribution-specific complete "pick list". You will probably need to search for them manually in the Hortonworks installation. If you find multiple copies, pick the ones with the latest version number.
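A minimal sketch of hunting for the JARs on a cluster node, as described above. The HDP install paths and the example JAR names are assumptions, not taken from the thread:

```shell
# Assumed typical HDP locations; adjust for your cluster layout.
find /usr/lib/hadoop /usr/lib/hadoop-hdfs /usr/lib/hive \
    -name '*.jar' 2>/dev/null || true

# If the same JAR shows up in several versions, keep the latest
# (example file names are hypothetical):
printf '%s\n' hive-exec-0.12.0.jar hive-exec-0.13.1.jar | sort -V | tail -n 1
```

The `sort -V` (version sort) keeps numeric version components in order, so the last line of its output is the newest version.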
05-27-2014 08:47 AM
LinusH, you're right.
I followed exactly the configuration documentation you mentioned.
But unfortunately, that documentation only lists the JAR files for Cloudera CDH 4.0.1 (a really old version).
How did you find which JAR files were needed for IBM BigInsights?
And could you please post the list of those files?
05-29-2014 07:15 AM
I have HDP 2.0 version.
I found the 20 JAR files listed in your link above.
Also I set SAS_HADOOP_JAR_PATH.
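For reference, one hedged way to point SAS at the collected JARs from within a session (the path below is hypothetical; the variable can also be set as an OS environment variable before SAS starts):

```sas
/* Hypothetical folder holding the Hadoop client JARs copied
   from the cluster; SAS/ACCESS to Hadoop loads them from here */
options set=SAS_HADOOP_JAR_PATH="C:\hadoop\jars";
```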
But when I started the program, I ran into another problem: it hangs at "processing submitted statements".
I was forced to quit by terminating the SAS System.
Could you tell me why the program might hang?
I set up the connection both the first and the second way according to your video "Getting Started with SAS® and Hadoop Part 1".
Jeff, thank you for your video.
05-30-2014 06:20 AM
I solved the problem with connection.
The problem was an incorrect LIBNAME statement. My Hive server was HiveServer2, and it was critical to add the option SUBPROTOCOL=HIVE2.
When I wrote libname hdp hadoop server="192.168.14.26" user="hadoop1" SUBPROTOCOL=HIVE2; everything worked.
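For anyone landing here later, a sketch of the working statement plus a quick sanity check. The server and user values are from this post; PORT=10000 is the usual HiveServer2 default and is an assumption, as is the listing step:

```sas
/* SUBPROTOCOL=HIVE2 is required when the cluster runs HiveServer2;
   port 10000 is the common HiveServer2 default (an assumption) */
libname hdp hadoop server="192.168.14.26" port=10000
        user="hadoop1" subprotocol=hive2;

/* List the Hive tables to confirm the connection is alive */
proc datasets lib=hdp;
run;
```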
06-10-2014 10:30 AM
It is very easy to overlook the SUBPROTOCOL= option. In the near future we will likely make HIVE2 the default. The use of Hive1 (for want of a better way to describe it) is rapidly disappearing. Eventually, the option may go away. But in the meantime, it sure is easy to overlook it.