Posted 12-14-2023 12:53 PM
I found the solution below in our community, but would like to adapt it to include columns. This passes through to Hadoop:
select * from connection to hadoop (show tables);
Is there a modification that can show both tables and columns?
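One option (an untested sketch, not a confirmed answer: in HiveQL, DESCRIBE lists a table's columns, so you could pass it through per table; my_table is a placeholder for a name returned by SHOW TABLES, and the connection options are abbreviated):

```sas
/* Sketch: pass a Hive DESCRIBE statement through an explicit connection. */
/* Replace my_table with one of the names returned by SHOW TABLES.        */
proc sql;
   connect to hadoop (server='' schema=&schema.);
   select * from connection to hadoop (describe my_table);
   disconnect from hadoop;
quit;
```

Hive also supports `show columns in my_table` as an alternative form, if DESCRIBE output is too verbose.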
3 REPLIES
How are you connecting to HADOOP?
Are you using ODBC? Do these ODBC-specific queries help?
https://support.sas.com/kb/15/721.html
This is not ODBC:
proc sql;
connect to hadoop
(READ_METHOD=HDFS /* this attempts an HDFS read, which is faster than JDBC */
server='' /* the hiveserver we talk to */
LOGIN_TIMEOUT=300
schema=&schema. /* this is the schema where you want to read or write data */
uri=
I have never worked with HADOOP (or HIVE), but a couple of points.
1) Your original query is using pass-through to run HADOOP code. So if you want to do the same thing, find out what HADOOP code lists the variables in a dataset.
2) Why not just see if you can use normal SAS code to get the contents of the datasets? Make a LIBREF using the HADOOP engine and run normal SAS code. So something like:
libname mylib hadoop .....;
proc contents data=mylib.my_dataset;
run;
Where MY_DATASET is one of the "table" names you got from the SHOW TABLES command.
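Building on that idea, once a libref exists, SAS's DICTIONARY tables can list every table and column in one query rather than one PROC CONTENTS per table. A sketch, assuming the libref MYLIB from the LIBNAME statement above (libref names are stored uppercase in the dictionary):

```sas
/* Sketch: list all tables and their columns visible through the libref. */
proc sql;
   select memname, name, type, length
      from dictionary.columns
      where libname = 'MYLIB';
quit;
```

Note that this makes SAS read metadata for every table through the engine, which can be slow against a large Hive schema.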