I have been trying to update my master data in Hadoop with transaction data. So my question is: how do I modify this master data in Hadoop from Enterprise Guide?
LIBNAME hdfs SASHDAT PATH="/staging" SERVER=SERVERNODE INSTALL="/Hadoop/TKGrid";

data hdfs.MASTER;
   set TRANSACTION;
   modify hdfs.MASTER key=Index;
   set TRANSACTION (drop=ObsCreatedDate);
   if _iorc_ = 0 then replace;
   else output;
   call missing(of _all_);
PROC DELETE data=TRANSACTION;
run;
It gives this error:
ERROR: The SASHDAT engine is a uni-directional engine. Data flows from the SAS client to the Hadoop Distributed File System. The engine cannot be used to fetch data from HDFS.
Then I tried with PROC HPDS2, but I got the same error. Please help.
Your code defines library "hdfs", but the data step uses libraries "staging" and work.
And it is what the error message says: SASHDAT can only be used to write data to HDFS, not to read from it, at least in the way you tried.
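One way around this (just a sketch, not tested against your environment; the "local" libref and all paths are placeholders) is to keep the master table in a bi-directional library, run the MODIFY update there, and only push the finished result to HDFS with SASHDAT:

```
/* Hypothetical sketch: update master in a regular Base library,  */
/* then load the result into HDFS. Librefs/paths are examples.    */
libname local "/sasdata/master";                      /* read/write */
libname hdfs sashdat path="/staging"
        server=servernode install="/Hadoop/TKGrid";  /* write-only */

data local.master;
   set transaction (drop=ObsCreatedDate);  /* drive with transactions */
   modify local.master key=Index;          /* keyed lookup in master  */
   if _iorc_ = 0 then replace;             /* key found: update       */
   else do;                                /* key not found: append   */
      _error_ = 0;
      output;
   end;
run;

/* SASHDAT replaces the target, so reload the whole updated table */
data hdfs.master;
   set local.master;
run;
```

Since SASHDAT is write-only, every refresh has to rewrite the complete table in HDFS rather than updating it in place.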
Did you take your proc hpds2 example from http://support.sas.com/documentation/cdl/en/inmsref/67213/HTML/default/viewer.htm#p0kn1b8a7yt44fn1qw... (example 6)?
Please post the complete log, including the libname(s) and data/proc steps.
SPDE is included in the Base SAS license, at least in the stand-alone one. I'm not sure whether SPDE is supported if you have "only" the Base SAS embedded in Visual Analytics. Have you verified this with SAS?
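If SPDE turns out to be licensed, the SPD engine can store data in HDFS and, unlike SASHDAT, read it back. A minimal sketch (the libref and HDFS path are placeholders; HDFSHOST=DEFAULT assumes the Hadoop client environment, e.g. SAS_HADOOP_CONFIG_PATH and SAS_HADOOP_JAR_PATH, is already set up):

```
/* SPD Engine library on HDFS - bi-directional, unlike SASHDAT */
libname spdat spde '/user/sasdata' hdfshost=default;

data spdat.master;              /* write to HDFS */
   set work.master;
run;

proc print data=spdat.master (obs=5);   /* ...and read it back */
run;
```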
But don't use a tool for something it isn't built for. Do you really need this data to reside in Hadoop?