a_SAS
Obsidian | Level 7

 

We are trying to write data from SAS to HDFS. SAS Enterprise Guide appears to write the data successfully, and the log shows the data being written to HDFS, but for some reason the data does not show up in the Hadoop cluster; the file is created with a size of 0 bytes. Please suggest what could be causing this issue.

 

---code -------------------------------------------

libname hdfslib hadoop server='abcd.com'
    HDFS_TEMPDIR='/user/rhive/test'      /* HDFS directory for temporary files              */
    HDFS_DATADIR='/user/rhive/lib'       /* HDFS directory where the data files are written */
    HDFS_METADIR='/user/rhive/testabc';  /* HDFS directory for the .sashdmd metadata files  */

/* Copy SASHELP.CARS over to HDFS */
data hdfslib.carsdata7;
set sashelp.cars;
run;

libname hdfslib clear;

-------------------------------------------------

Log file details below:

 

NOTE: Writing HTML(EGHTML) Body file: EGHTML
21
22 GOPTIONS ACCESSIBLE;
23 libname hdfslib hadoop server='abcd.com'
24 HDFS_TEMPDIR='/user/rhive/test'
25 HDFS_DATADIR='/user/rhive/lib'
26 HDFS_METADIR='/user/rhive/testabc';
NOTE: Libref HDFSLIB was successfully assigned as follows:
Engine: HADOOP
Physical Name: /user/rhive/testabc
27
28 /* Copy SASHELP.CARS over to HDFS */
29 data hdfslib.carsdata7;
30 set sashelp.cars;
31 run;

NOTE: SAS variable labels, formats, and lengths are not written to DBMS tables.
NOTE: There were 428 observations read from the data set SASHELP.CARS.
NOTE: The data set HDFSLIB.CARSDATA7 has 428 observations and 15 variables.
NOTE: DATA statement used (Total process time):
real time 2.68 seconds
user cpu time 0.03 seconds
system cpu time 0.02 seconds
memory 1916.43k
OS Memory 18404.00k
Timestamp 08/04/2017 04:47:31 PM
Step Count 2 Switch Count 33
Page Faults 0
Page Reclaims 1856
Page Swaps 0
Voluntary Context Switches 3732
Involuntary Context Switches 0
Block Input Operations 0
Block Output Operations 0

32
33 libname hdfslib clear;
2 The SAS System 16:46 Friday, August 4, 2017

NOTE: Libref HDFSLIB has been deassigned.
34
35
36
37
38 GOPTIONS NOACCESSIBLE;
39 %LET _CLIENTTASKLABEL=;
40 %LET _CLIENTPROJECTPATH=;
41 %LET _CLIENTPROJECTNAME=;
42 %LET _SASPROGRAMFILE=;
43
44 ;*';*";*/;quit;run;
45 ODS _ALL_ CLOSE;
46
47
48 QUIT; RUN;
49

-------------------

Hadoop cluster listing, showing a 0-byte file:


-rw-r--r-- 3 b_sas hdmi-technology 0 2017-08-04 16:10 /user/rhive/testabc/carsdata7.sashdmd

4 REPLIES
JBailey
Barite | Level 11

Hi @a_SAS

 

This is most likely a configuration problem with the XML files. Current versions of SAS can be configured using the hadooptracer.py Python tool. I have seen this problem quite often with the Cloudera Quickstart and Hortonworks Sandbox, too.
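Once hadooptracer.py has gathered the cluster's configuration XML files and JAR files, SAS has to be pointed at those directories before the Hadoop libref is assigned. A minimal sketch, assuming the collected files live under /opt/sas/hadoop/conf and /opt/sas/hadoop/jars (hypothetical paths; substitute the locations from your own deployment):

/* Point SAS at the Hadoop config and JAR directories collected by
   hadooptracer.py (the paths below are placeholders for your site). */
options set=SAS_HADOOP_CONFIG_PATH="/opt/sas/hadoop/conf";
options set=SAS_HADOOP_JAR_PATH="/opt/sas/hadoop/jars";

libname hdfslib hadoop server='abcd.com'
    HDFS_TEMPDIR='/user/rhive/test'
    HDFS_DATADIR='/user/rhive/lib'
    HDFS_METADIR='/user/rhive/testabc';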

 

The configuration guides for SAS/ACCESS to Hadoop can be found here:

https://support.sas.com/en/documentation/third-party-software-reference/9-4/guides-papers-for-hadoop...

 

This SAS Communities thread may help:

https://communities.sas.com/t5/SAS-Data-Management/No-sasdata-written-to-Hadoop-only-metadata/m-p/15...

 

Best wishes,

Jeff

 

a_SAS
Obsidian | Level 7

Thank you for the reply. I finally figured out this issue.

 

Just sharing: the entry shown below was in our yarn-site.xml file.

 

After removing this line, SAS is able to write data into HDFS.

 

yarn-site.xml:

 

<xi:include href="rmha-site.xml" xmlns:xi="http://www.w3.org/2001/XInclude" />
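For anyone hitting the same symptom, a quick way to confirm the fix took effect is to re-run the load and read the table back; if only the .sashdmd metadata was written, the read step returns no rows. A minimal sketch based on the code from the original post:

libname hdfslib hadoop server='abcd.com'
    HDFS_TEMPDIR='/user/rhive/test'
    HDFS_DATADIR='/user/rhive/lib'
    HDFS_METADIR='/user/rhive/testabc';

/* Reload the table */
data hdfslib.carsdata7;
    set sashelp.cars;
run;

/* Read the data back: should show 5 rows and report 428 observations.
   If the HDFS data file is still 0 bytes, no rows come back.          */
proc print data=hdfslib.carsdata7(obs=5);
run;

proc contents data=hdfslib.carsdata7;
run;

libname hdfslib clear;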

marcellorangel
Calcite | Level 5
We have a similar problem and are not able to make progress. We tried the posted solution, but in our case yarn-site.xml does not contain that include entry (href="rmha-site.xml"). We have also changed the property dfs.client.use.datanode.hostname to true in the hdfs-site.xml file, to no avail.

We are able to connect from SAS Viya to a data lake running Hadoop with Kerberos using a SAS libname statement. All tables are visible and accessible, and we can create new tables in Hadoop from data that is already in the data lake. However, when we try to create a table in Hadoop from data that is in SAS, only the metadata is created; the data itself is never written. The temporary file is always created with zero bytes, and after a long wait with no response (about 4 minutes) we receive a BlockMissingException error while reading blocks. We have also tried temporary folders without the sticky bit, using the HDFS_TEMPDIR option on the libname statement, and that did not work either.

==> We changed the test to the code below. SAS Studio returns an OK, but the file is created with zero bytes in HDFS:

filename out hadoop '/tmp/' recfm=v lrecl=32167 dir;
data _null_;
file out(shoes);
put 'write data to shoes file';
run;

==> The command below to create a folder in HDFS works perfectly:

proc hadoop;
hdfs mkdir='/tmp/new_directory';
run;

We have no other alternatives for testing; thank you in advance for your help.

Environment: SAS Viya, Hadoop (Cloudera), Linux Red Hat.
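A further test that may help isolate this (not from the thread, just a suggestion): MKDIR only talks to the NameNode, whereas writing file contents requires the SAS client machine to reach the DataNodes directly, which is exactly the path that BlockMissingException and zero-byte files point at. A small sketch under the assumptions that the cluster configuration is already picked up (as in the MKDIR test above) and that /tmp/hello.txt is writable on the SAS client:

/* Push actual file contents through the DataNode write path.
   COPYFROMLOCAL must stream block data to a DataNode, so it fails
   the same way the libname write does if the DataNodes are
   unreachable from the SAS client, while MKDIR still succeeds.    */
filename local '/tmp/hello.txt';
data _null_;
    file local;
    put 'hello from sas';
run;

proc hadoop;
    hdfs copyfromlocal='/tmp/hello.txt' out='/tmp/hello_from_sas.txt';
run;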
Patrick
Opal | Level 21

Feels like something worth contacting SAS Tech Support about.

