02-06-2017 02:33 AM - edited 02-06-2017 06:17 AM
I have been trying to update my master data in Hadoop with transaction data.
So my question is: how do I modify this master data in Hadoop from Enterprise Guide?
LIBNAME hdfs SASHDAT PATH="/staging" SERVER=SERVERNODE INSTALL="/Hadoop/TKGrid";

data hdfs.MASTER;
   set TRANSACTION;
   modify hdfs.MASTER key=Index;
   set TRANSACTION (drop=ObsCreatedDate);
   if _iorc_ = 0 then replace;
   else output;
   call missing(of _all_);

PROC DELETE data=TRANSACTION;
run;
It gives this error:
ERROR: The SASHDAT engine is a uni-directional engine. Data flows from the SAS client to the Hadoop Distributed File System. The engine cannot be used to fetch data from HDFS.
Then I tried with PROC HPDS2, but I still get the same error. Please help.
02-06-2017 04:34 AM - edited 02-06-2017 04:41 AM
Your code defines library "hdfs", but the data step uses libraries "staging" and work.
And it is exactly what the message says: SASHDAT can only be used to write data to HDFS, not to read from it. At least not in the way you tried.
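For comparison, the MODIFY/KEY= update pattern itself is fine when the target sits in a bi-directional library (e.g. a Base SAS library). A minimal sketch, assuming the same dataset and variable names as your post (MASTER, TRANSACTION, Index) and a hypothetical library "local":

```sas
/* Sketch only: upsert TRANSACTION rows into MASTER via keyed lookup.
   Works against an updatable engine (Base, SPDE), not SASHDAT.      */
data local.MASTER;
   set TRANSACTION (drop=ObsCreatedDate);   /* read one transaction row */
   modify local.MASTER key=Index;           /* keyed lookup in master   */
   if _iorc_ = 0 then replace;              /* key found: update row    */
   else do;
      _error_ = 0;                          /* clear failed-lookup flag */
      output;                               /* key not found: append    */
   end;
run;
```

The point is that the engine behind the target library, not the DATA step logic, is what blocks you here.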
Did you take your PROC HPDS2 example from http://support.sas.com/documentation/cdl/en/inmsref/67213/HTML/default/viewer.htm#p0kn1b8a7yt44fn1qw... (Example 6)?
Please post the complete log, including the libname(s) and data/proc steps.
02-07-2017 02:25 AM
02-07-2017 09:17 AM
SPDE is included in the Base SAS license, at least in the stand-alone one. I am not sure whether SPDE is supported if you have "only" the Base SAS embedded with Visual Analytics. Have you verified this with SAS?
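If SPDE turns out to be licensed for you, assigning an SPDE library is a one-liner; the paths here are placeholders for your environment:

```sas
/* Sketch: an SPDE library supports both read and update,
   unlike the uni-directional SASHDAT engine.              */
libname spdelib spde '/data/spde/meta'
   datapath=('/data/spde/data')
   indexpath=('/data/spde/index');
```

A MODIFY/KEY= step against spdelib.MASTER would then work the same way it does against a Base library.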
But don't use a tool for something it isn't built for. Do you really need this data to reside in Hadoop?