
Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?

Occasional Contributor
Posts: 17

Hi,

I have configured the SAS Hadoop environment using the following steps:

1) Using SAS 9.4 on my local laptop (without SAS/ACCESS to Hadoop)

2) Installed and ran the CDH 5.8 VM using VMware Player

Manually set up the SAS Hadoop environment:

3) Modified the hosts file to point to the CDH 5.8 quickstart.cloudera host by adding its IP address and hostname

4) Defined SAS_HADOOP_RESTFUL=1

5) Defined SAS_HADOOP_CONFIG_PATH to point to the following files:

     core-site.xml, hdfs-site.xml, mapred-site.xml and yarn-site.xml (see the sketch below for steps 4 and 5)
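For reference, steps 4 and 5 were set roughly like the sketch below, using OPTIONS SET= in the SAS session (they could equally go into the SAS configuration file); C:\hadoop\conf is just a placeholder for wherever the cluster's XML files were copied:

/* rough sketch of the RESTful setup (C:\hadoop\conf is a placeholder path) */
options set=SAS_HADOOP_RESTFUL 1;
options set=SAS_HADOOP_CONFIG_PATH "C:\hadoop\conf";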

 

%put %sysget(SAS_HADOOP_JAR_PATH);
%put %sysget(SAS_HADOOP_CONFIG_PATH);
%put %sysget(SAS_HADOOP_RESTFUL);

/* create a directory in HDFS */
proc hadoop username='cloudera' password='cloudera' verbose;
   hdfs mkdir='/user/cloudera/newdirectory3';
run;

The above steps work fine without defining SAS_HADOOP_JAR_PATH at all (presumably because, with SAS_HADOOP_RESTFUL=1, PROC HADOOP goes through the WebHDFS REST API and does not need the client JARs).
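For completeness, here is a rough sketch of how a local file could then be copied into that new HDFS directory over the same connection (C:\temp\test.txt is just a made-up file name):

/* rough sketch: copy a made-up local file into the new HDFS directory */
proc hadoop username='cloudera' password='cloudera' verbose;
   hdfs copyfromlocal='C:\temp\test.txt' out='/user/cloudera/newdirectory3/test.txt';
run;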

 

However, I also tried the non-RESTful route: define SAS_HADOOP_RESTFUL=0,

set SAS_HADOOP_CONFIG_PATH to point to the config files, and

set SAS_HADOOP_JAR_PATH to point to the JARs.
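That setup looked roughly like the sketch below (C:\hadoop\conf and C:\hadoop\jars are just placeholders for my local copies of the cluster's XML and JAR files):

/* rough sketch of the non-RESTful setup using the client JARs */
options set=SAS_HADOOP_RESTFUL 0;
options set=SAS_HADOOP_CONFIG_PATH "C:\hadoop\conf";
options set=SAS_HADOOP_JAR_PATH "C:\hadoop\jars";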

 

With that configuration I get the following error related to DataFlux Studio, even though I do not have SAS DataFlux installed on my laptop:

(see attached screenshot: saserror.png)


All Replies
SAS Employee
Posts: 215

Re: Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?

Hi @ajain59

 

There is likely a problem with the JAR files. I suggest running the hadooptracer.py tool to pull the JAR and XML files from the cluster.

Occasional Contributor
Posts: 17

Re: Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?

Where can I find this tool, and how do I use it?

 

 

 

Occasional Contributor
Posts: 17

Re: Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?


I found the link to the Python script here:

https://support.sas.com/resources/thirdpartysupport/v94/hadoop/hadoop-configuration-guide-base-acces...

 

and I am now going to use it to extract all the JAR files and config files.

Thanks for your help. I will share if it resolves the issue.

 

Thanks,

Ashish

Solution
03-07-2017 05:15 AM
SAS Employee
Posts: 215

Re: Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?

Just so it is documented in this thread... information regarding the tool can be found here:

 

https://support.sas.com/resources/thirdpartysupport/v94/hadoop/hadoop-configuration-guide-base-acces...

 

This page is also useful: 

https://support.sas.com/resources/thirdpartysupport/v94/hadoop/

 

Occasional Contributor
Posts: 17

Re: Why do I have to keep SAS_HADOOP_RESTFUL=1 even though it is an optional parameter as per the SAS documentation?

Thanks for the help... now I am able to copy the correct JARs and config files, and it is working. Hip hip hurray!

 

 

☑ This topic is solved.
