Dear All,

I am trying to ingest data from a SAS server into Hive using a DATA step. It was working until last week. The process is:

-> Takes the file from the SAS server and moves it to the HDFS temp directory as a .dlv file.
-> Creates a temporary table in the schema given in the Hadoop configuration connect string.
-> Finally runs INSERT INTO table orignaltable from table sastmp_04_04_21_***.
-> The file under /tmp is then deleted.

Recently I am facing an issue: the temporary table is created in Hive, but the INSERT INTO table orignaltable from table sastmp_04_04_21_*** step fails. The original table is created with no rows, and I cannot find the file under /tmp.

Any expert suggestions would be greatly appreciated. Here is the code I am using:

options set=SAS_HADOOP_RESTFUL=1;
options set=SAS_HADOOP_JAR_PATH="/hadoop/jars";
options set=SAS_HADOOP_CONFIG_PATH="/hadoop/conf";
options nofmterr;

%let svr = %NRSTR('testing.domain.com');
%let stng = %NRSTR('STORED AS PARQUET');

libname myhadoop hadoop
  server=&svr
  hdfs_tempdir='/tmp/sastmp'
  user=hive
  password=pxxxx
  schema=schema1
  port=10000
  DBCREATE_TABLE_OPTS=&stng
  subprotocol=hive2;

libname sai '/mydropzone';

data myhadoop.carshelp;
  set sai.cars_temp;
run;
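For reference, this is a minimal sketch of the tracing options I can add before the DATA step to capture the underlying Hive error in the SAS log (assuming SAS 9.4 with SAS/ACCESS Interface to Hadoop; the option values are standard SAS/ACCESS tracing settings, not specific to my site):

```
/* Sketch: enable detailed SAS/ACCESS tracing so the full DBMS error
   behind the failed INSERT shows up in the SAS log.
   sastrace=',,,d'  - request detailed DBMS trace messages
   sastraceloc=saslog - route trace output to the SAS log
   nostsuffix       - drop timestamp suffixes for readability */
options sastrace=',,,d' sastraceloc=saslog nostsuffix;
```

I can post the resulting log output if that helps.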