🔒 This topic is solved and locked. Need further help from the community? Please sign in and ask a new question.
pauger2
Calcite | Level 5

Hi All!

 

I am having some issues using SAS/ACCESS to Hadoop.

I would like to resolve this issue ASAP and I hope someone had similar experience and can share how they resolved it.

 

I can connect to the Hadoop environment via a LIBNAME statement.

I can also read a table from the Hadoop environment and write it out as a SAS dataset, but I cannot write to a database in the Hadoop environment.

 

When I submit a sample DATA step to write, my session goes away and I don't even get a SAS log that would tell me what's happening.

If anyone has any suggestions they would like to share, I would very much appreciate it.

I am using SAS 9.4 TS Level 1M3 on Windows and my Hadoop distribution is Hortonworks.  Thank you so much and I look forward to your suggestions.

 

 

1 ACCEPTED SOLUTION

Accepted Solutions
JBailey
Barite | Level 11

Hi @pauger2,

 

Have you installed the hotfix for Knox? 

 

http://support.sas.com/kb/56/644.html

 

View solution in original post

10 REPLIES
ballardw
Super User

It may help someone diagnose your issue if you provide an example of the code that makes the "session go away".

pauger2
Calcite | Level 5

Hello Ballardw:

Thanks for your response.  Below is the code I am using.

When I submit this code, my SAS session terminates by itself and no SAS log is generated.

This code creates a directory entry and puts a filename in the metadata, but the actual data doesn't make it, and I end up with an empty Hive table.

OPTIONS SET=SAS_HADOOP_RESTFUL='1';
OPTIONS SET=KNOX_GATEWAY_URL='';

LIBNAME HDP HADOOP URI=''
  SERVER='' USER=xxxxxx PW=xxxxxxxx DATABASE=xxxxxxxxxxxx;

DATA HDP.SAMPLE(DBCREATE_TABLE_OPTS='STORED AS SEQUENCEFILE');
  SET LOCAL.SAMPLE;
RUN;

JBailey
Barite | Level 11
Hi,

When you say "my session goes away" what do you mean? Is SAS abending or is the connection to Hadoop lost?

Many customers experience issues because of missing JARs or XML config files with incorrect information in them.
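A common first check along those lines (a sketch only; the paths are placeholders you would replace with your site's actual locations) is to confirm that the client can find the Hadoop JARs and cluster config files via the SAS_HADOOP_JAR_PATH and SAS_HADOOP_CONFIG_PATH options before the LIBNAME is submitted:

```sas
/* Placeholder paths - point these at the client JARs and the
   *-site.xml files pulled from your Hortonworks cluster */
OPTIONS SET=SAS_HADOOP_JAR_PATH='C:\hadoop\jars';
OPTIONS SET=SAS_HADOOP_CONFIG_PATH='C:\hadoop\conf';
```

If either option points at an empty or stale directory, the LIBNAME may appear to connect while later reads or writes fail in hard-to-diagnose ways.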
pauger2
Calcite | Level 5

Hello JBailey.

Thanks for your response.

"My SAS session goes away" means it's gone.. bye bye.. just from running the code below, and no SAS log is generated.

With the code below, we can connect to Hadoop.  We can also read data from Hadoop, but we cannot write to Hadoop.

The code simply creates a directory entry and puts a filename in the metadata, but the actual data doesn't make it, and we end up with an empty Hive table.

Do you happen to have a complete set of JAR and XML config files from a client who has successfully connected, read, and written?

We're using Hortonworks for our Hadoop distribution.  We're connecting using PC SAS on Windows and we have SAS 9.4 TS Level 1M3.

Anything you can contribute to help solve this issue would be greatly appreciated.  Thanks a lot.

OPTIONS SET=SAS_HADOOP_RESTFUL='1';
OPTIONS SET=KNOX_GATEWAY_URL='';

LIBNAME HDP HADOOP URI=''
  SERVER='' USER=xxxxxx PW=xxxxxxxx DATABASE=xxxxxxxxxxxx;

DATA HDP.SAMPLE(DBCREATE_TABLE_OPTS='STORED AS SEQUENCEFILE');
  SET LOCAL.SAMPLE;
RUN;

JBailey
Barite | Level 11

Hi @pauger2,

 

Have you installed the hotfix for Knox? 

 

http://support.sas.com/kb/56/644.html

 

pauger2
Calcite | Level 5

I don't believe we did.

Below is the hotfix we recently installed.

 

http://support.sas.com/kb/57/099.html

pauger2
Calcite | Level 5

Great news JBailey!

 

My SAS session is no longer disappearing and I was able to write 1000 rows to HDFS... very exciting...

 

The only issue now is that when we attempt to write 2000 rows or more, we get the error below.  It seems to be a permission issue that we hope to work out with our Hadoop admin team.

 

NOTE: The data set HDP.WRITES_SAMPLE2 has 3000 observations and 11 variables.
ERROR: Execute error on statement: LOAD DATA INPATH
       '/tmp/sasdata-2016-09-28-19-47-22-419-e-00001.dlv' OVERWRITE INTO TABLE
       sastmp_09_28_19_48_23_228_00002. Could not load
       /tmp/sasdata-2016-09-28-19-47-22-419-e-00001.dlv into table
       sastmp_09_28_19_48_23_228_00002 in schema xxxxxxxxxxxx_TBLS. A common cause of this
       issue is conflicting HDFS permissions between the data file and the Hive warehouse
       directory for the table. Another possible cause is the "sticky" bit set on HDFS directory
       /tmp.
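As a hedged illustration of the "sticky bit" hint in that error (the `hdfs` command line is an assumption about your client setup, and the `has_sticky_bit` helper is hypothetical): in the first column of `hdfs dfs -ls -d /tmp`, a trailing `t` (or `T`) in the mode string marks the sticky bit, e.g. `drwxrwxrwt`. A small POSIX-shell check could look like this:

```shell
#!/bin/sh
# has_sticky_bit: succeed when an ls-style mode string (e.g. drwxrwxrwt)
# carries the sticky bit, shown as a trailing 't' (or 'T' when the
# directory is not world-executable).
has_sticky_bit() {
  case "$1" in
    *t|*T) return 0 ;;
    *)     return 1 ;;
  esac
}

# On a live cluster you would capture the real mode string, e.g.:
#   perms=$(hdfs dfs -ls -d /tmp | awk '{print $1; exit}')
perms="drwxrwxrwt"   # sample value for a world-writable /tmp with sticky bit
if has_sticky_bit "$perms"; then
  echo "sticky bit set on /tmp"
else
  echo "sticky bit not set on /tmp"
fi
```

If the sticky bit turns out to be the cause, clearing it (or staging the data file somewhere other than /tmp) is typically something the Hadoop admin team would handle.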

 

Thanks for helping out.

 

 

sasprofile
Quartz | Level 8

Hello Pauger2,

 

I see you have raised a question about a SAS-to-Hadoop connectivity issue; I hope your issue is resolved.

 

I need your help in establishing a connection from SAS 9.4 to Hadoop.

 

I have never worked with SAS connecting to Hadoop before.

 

I would appreciate it if anyone could assist me with some high-level, step-by-step instructions for connecting from SAS to Hadoop.

 

 

Thank you in advance.

SASKiwi
PROC Star

@sasprofile - You've already asked the same question in your own separate post. You are more likely to get your answers there, rather than by repeating your questions on old posts.

JBailey
Barite | Level 11

Hi, 

 

I replied to the other thread. Hope it helps.

 

Best wishes,

Jeff


Discussion stats
  • 10 replies
  • 3592 views
  • 0 likes
  • 5 in conversation