Hi,
I have been trying to update my master data in Hadoop with transaction data.
So my question is: how can I modify this master data in Hadoop from Enterprise Guide?
LIBNAME hdfs SASHDAT PATH="/staging" SERVER=SERVERNODE INSTALL="/Hadoop/TKGrid";

data hdfs.MASTER;
   set TRANSACTION (drop=ObsCreatedDate);
   modify hdfs.MASTER key=Index;
   if _iorc_ = 0 then replace;
   else output;
   call missing(of _all_);
run;

PROC DELETE data=TRANSACTION;
run;
It gives this error:
ERROR: The SASHDAT engine is a uni-directional engine. Data flows from the SAS client to the Hadoop Distributed File System. The
engine cannot be used to fetch data from HDFS.
Then I tried PROC HPDS2, but I get the same error. Please help.
Your code defines library "hdfs", but the data step uses libraries "staging" and work.
And it is exactly what the message says: SASHDAT can only be used to write data to HDFS, not to read from it. At least not in the way you tried.
Did you take your proc hpds2 example from http://support.sas.com/documentation/cdl/en/inmsref/67213/HTML/default/viewer.htm#p0kn1b8a7yt44fn1qw... (example 6)?
Please post the complete log, including the libname(s) and data/proc steps.
That was a typing error; I'll correct it.
And yes, I have tried the HPDS2 technique.
The other engines like SPDE or HADOOP are not licensed to us; that's why I was trying to figure out every other possible engine for editing the data.
SPDE is included in the Base SAS license, at least in the stand-alone one. I'm not sure whether SPDE is supported if you have "only" the Base SAS embedded in Visual Analytics. Have you verified this with SAS?
But don't use a tool for something it isn't built for. Do you really need this data to reside in Hadoop?
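Since SASHDAT only flows data from the SAS client into HDFS, one workaround is to keep the authoritative master table in an ordinary, bidirectional SAS library, apply the keyed transaction updates there, and then rewrite the whole table to HDFS. A minimal sketch, assuming a hypothetical local library path /data/master and the same Index key and TRANSACTION data set as in your code:

```
/* ordinary Base SAS library: readable and writable */
libname saslib "/data/master";
/* SASHDAT library: write-only, used only for the final reload */
libname hdfs sashdat path="/staging" server=SERVERNODE install="/Hadoop/TKGrid";

/* keyed update of the local master: replace matches, append non-matches */
data saslib.master;
   set transaction (drop=ObsCreatedDate);
   modify saslib.master key=Index;
   if _iorc_ = 0 then replace;
   else do;
      _error_ = 0;   /* clear the error flag set by the failed key lookup */
      output;
   end;
run;

/* reload the updated table into HDFS; writing is all SASHDAT needs to do here */
data hdfs.master;
   set saslib.master;
run;
```

This keeps the read/modify cycle entirely inside Base SAS, so no read-capable Hadoop engine is required; the cost is rewriting the full table to HDFS on each update.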