The actual issue was that the Hadoop JAR files were not set up for this environment; the problem was resolved after configuring them. A mapping folder on the Hadoop side, where the datasets would sit, was also missing, but that was not the root cause.
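For reference, SAS/ACCESS Interface to Hadoop locates the client JAR and configuration files through two environment variables set at deployment level. A minimal sketch of the relevant configuration (the paths below are placeholders, not values from this thread, so adjust them for your own deployment):

```sas
/* In sasv9_usermods.cfg (or equivalent site config file).
   Both paths are hypothetical examples. */
-set SAS_HADOOP_JAR_PATH "/opt/sas/hadoopjars"
-set SAS_HADOOP_CONFIG_PATH "/opt/sas/hadoopconfig"
```

If either variable is unset or points at an incomplete JAR collection, connection attempts typically fail with generic errors like the ones discussed here.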
Errors from a SAS/ACCESS Interface to Hadoop connection can be quite generic, so it is best to work with your Hadoop admin team to resolve these kinds of issues. To troubleshoot further, I would suggest trying the options below.
1) Check whether the connection works outside of SAS using beeline. If it does, use the same connection details to build the LIBNAME statement.
2) Check that the user trying to connect to Hive has write access to the /tmp directory on HDFS.
3) Add extra trace logging options to the code: options sastrace=',,d,' sastraceloc=saslog nostsuffix;
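Putting steps 1 and 3 together, here is a hedged sketch of what the SAS side might look like. The server, port, schema, and credentials are placeholders, not values from this thread; reuse whatever connection details worked in your beeline test:

```sas
/* Trace options from step 3, written to the SAS log for diagnosis */
options sastrace=',,d,' sastraceloc=saslog nostsuffix;

/* Hypothetical LIBNAME for step 1 -- all connection values below
   are example placeholders, to be replaced with the details that
   succeeded in beeline */
libname myhive hadoop
    server='hive-server.example.com'
    port=10000
    schema=default
    user=myuser
    password=mypass;
```

With tracing enabled, the SAS log will show the SQL passed to Hive, which usually narrows down whether the failure is in connectivity, authentication, or permissions.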
@SangaviS it would be awesome if you could document a little bit of what you did and tag your post as the answer to make it easier for others to find the solution in this thread if they have a similar error. Thanks.