Hi All,

When trying to write a SAS dataset out to Hadoop, I get the error below. I'll note up front that I have already tried many of the fixes that have been posted, but nothing has worked. In our data lake it looks like the table is being created, but it contains only the headers. We were able to write to the data lake two weeks ago, but we have since switched to a new cluster and can no longer write from SAS EG, SAS EM, or SAS Studio. Any help would be appreciated. The code I use is also below.

ERROR: Execute error on statement: LOAD DATA INPATH '/tmp/sasdata-2017-07-14-14-02-27-980-e-00008.dlv' OVERWRITE INTO TABLE sastmp_07_14_14_02_28_143_00009. Could not load /tmp/sasdata-2017-07-14-14-02-27-980-e-00008.dlv into table sastmp_07_14_14_02_28_143_00009 in schema UTILITY. A common cause of this issue is conflicting HDFS permissions between the data file and the Hive warehouse directory for the table. Another possible cause is the "sticky" bit set on HDFS directory /tmp.

______________________________________________________________________________________
Code used that gets the error
--------------------------------------------------------------------------------------

options mprint mlogic symbolgen mlogicnest compress=yes;
options set=SAS_HADOOP_CONFIG_PATH='D:\hadoopcfg';
options set=SAS_HADOOP_JAR_PATH='D:\hdp22';
options set=SAS_HADOOP_RESTFUL='1';

libname liz hadoop
   server="xxxxx"
   port=10000
   database=user_xxxxx
   subprotocol=hive2
   user="xxxxx"
   password="xxxxxxxx"
   hdfs_tempdir="/tmp";

proc sql;
   drop table utility.HolidayCalendar2;
   create table utility.HolidayCalendar2 (DBCREATE_TABLE_OPTS='STORED AS ORC') as
      select * from utility.referenceCalendar;
quit;
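For what it's worth, the error text itself points at the sticky bit on the HDFS /tmp directory as one possible cause. One way to check is to run `hdfs dfs -ls -d /tmp` on the cluster and inspect the mode string: a trailing `t` (or `T`) means the sticky bit is set, which can stop the Hive service user from moving files owned by other users during LOAD DATA INPATH. A minimal sketch of reading that mode string (the helper name and example modes are illustrative, not from the original post):

```python
def sticky_bit_set(mode: str) -> bool:
    """Return True if a POSIX-style mode string, as printed by
    `hdfs dfs -ls -d` (e.g. 'drwxrwxrwt'), has the sticky bit set.
    A lowercase 't' means sticky + others-execute; uppercase 'T'
    means sticky without others-execute."""
    return mode[-1] in ("t", "T")

# Example mode strings as they might appear in the listing output:
print(sticky_bit_set("drwxrwxrwt"))  # sticky bit set -> True
print(sticky_bit_set("drwxrwxrwx"))  # no sticky bit  -> False
```

If the sticky bit turns out to be the problem, the usual workaround is to point `HDFS_TEMPDIR=` in the LIBNAME statement at a directory the connecting user owns, rather than the shared /tmp.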