We have a problem similar to this one and have not been able to make progress. We tried the posted solution, but in our case the yarn-site.xml file does not contain the include entry href="rmha-site.xml". We have already set the property dfs.client.use.datanode.hostname to true in hdfs-site.xml, to no avail.

We were able to connect to a data lake running Hadoop with Kerberos from SAS Viya using a SAS LIBNAME statement. All tables are visible and accessible, and we can create new tables in Hadoop from data that is already in the data lake. However, when we try to create a table from data that is in SAS, the table is created with its metadata only and the data is never loaded into Hadoop. The temporary file is always created with zero bytes and, after a long time with no response (about 4 minutes), we receive a BlockMissingException error while reading blocks. We have also tried pointing the HDFS_TEMPDIR= option of the LIBNAME statement at a temporary directory without the sticky bit, and that did not work either.

We changed the test to the code below and received an OK return in SAS Studio, but the file was still created in HDFS with zero bytes:

filename out hadoop '/tmp/' recfm=v lrecl=32167 dir;

data _null_;
  file out(shoes);
  put 'write data to shoes file';
run;

The following command to create a directory in HDFS works perfectly:

proc hadoop;
  hdfs mkdir='/tmp/new_directory';
run;

We have run out of alternatives to test; thank you in advance for your help.
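For reference, this is roughly how the LIBNAME statement with HDFS_TEMPDIR= is structured. This is only a minimal sketch: the libref, server name, directories, and configuration paths below are placeholders, not our actual values, and with Kerberos the connection relies on a valid ticket rather than USER=/PASSWORD= options.

/* Sketch of the Hadoop LIBNAME with HDFS_TEMPDIR=, placeholder values only */
options set=SAS_HADOOP_CONFIG_PATH='/opt/sas/hadoop/conf';  /* client config dir (placeholder) */
options set=SAS_HADOOP_JAR_PATH='/opt/sas/hadoop/jars';     /* Hadoop JAR dir (placeholder)    */

libname hdp hadoop
  server='hive-server.example.com'    /* HiveServer2 host (placeholder)      */
  hdfs_tempdir='/tmp/sasdata'         /* temp directory without sticky bit   */
  ;

/* The step that fails: writing data that lives in SAS out to Hadoop.
   The table is created with metadata only and the temp file stays at 0 bytes. */
data hdp.class_copy;
  set sashelp.class;
run;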
SAS Viya Hadoop Cloudera Linux RedHat