Lcopello
Fluorite | Level 6

Hi All,

When trying to write a SAS dataset out to Hadoop I get the error below. I will let everyone know that I have already tried many of the fixes that have been posted, but nothing has worked. In our data lake it looks like the table is being created, but it contains only the headers and no rows. We were able to write to the data lake two weeks ago, but we have since switched to a new cluster and can no longer write from SAS EG, SAS EM, or SAS Studio. Any help would be appreciated. The code I use is also below.

ERROR: Execute error on statement: LOAD DATA INPATH '/tmp/sasdata-2017-07-14-14-02-27-980-e-00008.dlv' OVERWRITE INTO TABLE
        sastmp_07_14_14_02_28_143_00009. Could not load /tmp/sasdata-2017-07-14-14-02-27-980-e-00008.dlv into table
        sastmp_07_14_14_02_28_143_00009 in schema UTILITY. A common cause of this issue is conflicting HDFS permissions between the
        data file and the Hive warehouse directory for the table.  Another possible cause is the "sticky" bit set on HDFS directory
        /tmp.
______________________________________________________________________________________
Code used that produces the error
-------------------------------------------------------------------------------------- 

options mprint mlogic symbolgen mlogicnest compress=yes;

/* Point SAS at the Hadoop cluster configuration files and client JARs */
options set=SAS_HADOOP_CONFIG_PATH='D:\hadoopcfg';
options set=SAS_HADOOP_JAR_PATH='D:\hdp22';
options set=SAS_HADOOP_RESTFUL='1';

/* HiveServer2 connection; SAS stages its temporary load files in HDFS /tmp */
libname liz hadoop server="xxxxx" PORT=10000 database=user_xxxxx SUBPROTOCOL=hive2
        user="xxxxx" password="xxxxxxxx" HDFS_TEMPDIR="/tmp";

 

proc sql;
   /* Drop the previous copy, then recreate it in Hive stored as ORC */
   drop table utility.HolidayCalendar2;

   create table utility.HolidayCalendar2 (DBCREATE_TABLE_OPTS='STORED AS ORC') as
      select * from utility.referenceCalendar;
quit;
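
Since the log specifically calls out the sticky bit on /tmp, one possible variation (sketch only; the HDFS path below is a placeholder for a directory the connecting user actually owns) is to point HDFS_TEMPDIR away from /tmp:

/* Sketch only: stage the temporary load files in a user-owned HDFS       */
/* directory instead of /tmp. "/user/xxxxx/sastmp" is a placeholder path. */
libname liz hadoop server="xxxxx" PORT=10000 database=user_xxxxx SUBPROTOCOL=hive2
        user="xxxxx" password="xxxxxxxx" HDFS_TEMPDIR="/user/xxxxx/sastmp";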

4 REPLIES
Patrick
Opal | Level 21

@Lcopello

Well, if the environment changed then I'd take the information in the SAS Log very seriously:

A common cause of this issue is conflicting HDFS permissions between the data file and the Hive warehouse directory for the table. Another possible cause is the "sticky" bit set on HDFS directory /tmp.
 
Have you already excluded the above two possible causes?
 
If you can't get to the bottom of this, then I suggest you contact SAS Technical Support, as you will likely have to provide site-specific information which you can't post in a public forum.
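
The usual way to check both is to list /tmp and the Hive warehouse directory with the hdfs client on a cluster edge node: a trailing "t" in the permission string on /tmp (e.g. drwxrwxrwt) means the sticky bit is set, and the owner and permissions of the warehouse directory need to be compatible with the user loading the data. If a Hadoop client happens to be installed on the SAS server and XCMD is allowed, a rough sketch of running that check from SAS would be the following (the warehouse path is only an example, adjust it for your cluster):

/* Sketch only: requires an "hdfs" client on the SAS server's PATH and   */
/* the XCMD system option enabled; otherwise run the same command on an  */
/* edge node. "/apps/hive/warehouse" is an example warehouse location.   */
filename hdfsls pipe 'hdfs dfs -ls -d /tmp /apps/hive/warehouse';

data _null_;
   infile hdfsls;
   input;
   put _infile_;   /* write the permission/owner listing to the SAS log */
run;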
Lcopello
Fluorite | Level 6

Patrick,

 

We have tried the things that were mentioned in the error, plus some other things we found online while researching the issue.

 

Thanks for your help.

Lcopello
Fluorite | Level 6

We found a solution to the problem. We ended up having to change the SAS configuration file for Hadoop.
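
For anyone hitting the same thing after a cluster change: the directories pointed to by SAS_HADOOP_CONFIG_PATH and SAS_HADOOP_JAR_PATH generally have to be refreshed with the configuration files and client JARs from the new cluster. A rough sketch with placeholder paths (not our actual ones):

/* Sketch with placeholder paths: the config directory should contain the */
/* new cluster's core-site.xml, hdfs-site.xml, hive-site.xml (and yarn/   */
/* mapred XMLs), and the JAR directory its matching client JARs, before   */
/* the Hadoop libname is assigned.                                        */
options set=SAS_HADOOP_CONFIG_PATH='D:\hadoopcfg_newcluster';
options set=SAS_HADOOP_JAR_PATH='D:\newcluster_jars';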

 

Thanks,

 

Patrick
Opal | Level 21

@Lcopello

Can I suggest you describe the issue you had and its resolution in as much detail as possible, and then mark that post as the solution?

This could help others with similar challenges use your resolution as guidance.

