Help using Base SAS procedures

Write SAS table to Hive on Hadoop

New Contributor
Posts: 4

Write SAS table to Hive on Hadoop

I would like to write a SAS table to Hadoop. How do I do that? My current code...


proc sql;
   connect to hadoop (user=john password=xxxxx server="" port=10000
                      schema=hiveSchema subprotocol=hive2 dbmax_text=300
                      cfg="/opt/sas/myhadoopmachine/myhadoopmachine_core_hdfs_site.xml");
   execute ( create table hivelib.mytable as
             select *
             from saslib.mytable
           ) by hadoop;



results in this error:


ERROR: Execute error: Error while compiling statement: FAILED: SemanticException [Error 10001]: Line 1:49 Table not found 'mytable'


The basic issue is that I can execute SQL commands on Hive. The code above expects saslib.mytable to exist on the Hadoop server under the schema saslib, but saslib is actually my SAS library and mytable is the SAS data set in it.


How do I pass this information in the PROC SQL above so that the SAS data set is copied to Hadoop?








SAS Employee
Posts: 1

Re: Write SAS table to Hive on Hadoop



It doesn't look like your procedure is complete in your example. See this techniques-in-processing-in-Hadoop paper; it has a few examples using PROC SQL.


It has this sample code that writes to Hadoop using PROC SQL, which might help:


proc sql;
   connect to hadoop (server=duped user=myUserID);
   execute (create table myUserID_store_cnt
            row format delimited fields terminated by '\001'
            stored as textfile as
            select customer_rk, count(*) as total_orders
            from order_fact
            group by customer_rk) by hadoop;
   disconnect from hadoop;
quit;




Super User
Posts: 5,256

Re: Write SAS table to Hive on Hadoop

If your source data resides in SAS (outside Hadoop), you can't use explicit pass-through: Hive is not aware of SAS data sets. For most situations a simple CREATE TABLE (or DATA step) against a Hive libref is sufficient.
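A minimal sketch of that libref approach. The server name, credentials, and libref here are placeholders borrowed from the original post's connection options; adjust them to your environment:

proc sql code (sketch):

libname hdp hadoop server="" user=john password=xxxxx
        schema=hiveSchema subprotocol=hive2 port=10000;

/* Implicit pass-through: SAS reads saslib.mytable locally
   and loads the rows into a new Hive table */
proc sql;
   create table hdp.mytable as
   select * from saslib.mytable;
quit;

/* Or equivalently with a DATA step */
data hdp.mytable;
   set saslib.mytable;
run;

With a libref, SAS/ACCESS handles moving the data from the SAS library into Hive, so no explicit pass-through EXECUTE block is needed.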
Data never sleeps