SaiSankar116
Calcite | Level 5

Dear All, 

 

I am trying to ingest data from a SAS server into Hive using a DATA step. It was working until last week. The process is:

 

-> Takes the file from the SAS server and moves it to the HDFS /tmp directory as a .dlv file.

-> Creates a temporary table in the schema given in the Hadoop configuration connect string.

-> Finally runs INSERT INTO table originaltable from table sastmp_04_04_21_***.

-> The file inside /tmp is then deleted.
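In Hive terms, the steps above amount to roughly the following sequence (table and file names are illustrative, following the pattern visible in the trace log further down; the column list is elided):

```sql
-- 1. Create a temporary landing table for the delimited upload
CREATE TABLE sastmp_04_04_21_xxx (...)
  ROW FORMAT DELIMITED;

-- 2. Move the uploaded .dlv file from HDFS /tmp into the landing table
LOAD DATA INPATH '/tmp/sastmp/sasdata-xxx.dlv'
  OVERWRITE INTO TABLE sastmp_04_04_21_xxx;

-- 3. Copy the rows into the target table (the step that now fails)
INSERT INTO TABLE originaltable SELECT * FROM sastmp_04_04_21_xxx;

-- 4. Drop the landing table, which also removes the staged file
DROP TABLE sastmp_04_04_21_xxx;
```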

 

Recently I have been facing an issue.

The temporary table is getting created in Hive.

The INSERT INTO table originaltable from table sastmp_04_04_21_*** step failed, though the original table is created with no rows, and I am not finding the file under /tmp.

 

Any expert suggestions would be greatly appreciated.

 

 

Here is the code I am using:

 

options set=SAS_HADOOP_RESTFUL=1;
options set=SAS_HADOOP_JAR_PATH="/hadoop/jars";
options set=SAS_HADOOP_CONFIG_PATH="/hadoop/conf";

 

OPTIONS nofmterr;
%let svr = %NRSTR('testing.domain.com');

%let stng = %NRSTR('STORED as PARQUET');

libname myhadoop hadoop server=&svr hdfs_tempdir='/tmp/sastmp' user=hive password=pxxxx
schema=schema1 port=10000
DBCREATE_TABLE_OPTS=&stng
subprotocol=hive2;

libname sai '/mydropzone';

data myhadoop.carshelp;
set sai.cars_temp;
run;

4 REPLIES
LinusH
Tourmaline | Level 20

Any code/log will help to understand the details of your operation.

Do I understand correctly that you don't actually execute this in SAS (explicit SQL pass-through)?
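For comparison, an explicit SQL pass-through version of the final load would look roughly like this; the connection details mirror those elsewhere in this thread, and the table names are placeholders:

```sas
/* Explicit pass-through: the statement inside EXECUTE is sent to Hive
   verbatim instead of being generated by the libname engine.
   Server, credentials, and table names are illustrative. */
proc sql;
  connect to hadoop (server='testing.domain.com' port=10000
                     user=hive password=pxxxx schema=schema1
                     subprotocol=hive2);
  execute (
    INSERT INTO TABLE originaltable SELECT * FROM sastmp_landing
  ) by hadoop;
  disconnect from hadoop;
quit;
```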

Data never sleeps
SaiSankar116
Calcite | Level 5

Here is the sample code I am using; please help me. Thanks for your time.

 

options set=SAS_HADOOP_RESTFUL=1;
options set=SAS_HADOOP_JAR_PATH="/hadoop/jars";
options set=SAS_HADOOP_CONFIG_PATH="/hadoop/conf";

OPTIONS nofmterr;
%let svr = %NRSTR('testing.domain.com');

%let stng = %NRSTR('STORED as PARQUET');

libname myhadoop hadoop server=&svr hdfs_tempdir='/tmp/sastmp' user=hive password=pxxxx
schema=schema1 port=10000
DBCREATE_TABLE_OPTS=&stng
subprotocol=hive2;

libname sai '/mydropzone';

data myhadoop.carshelp;
set sai.cars_temp;
run;

LinusH
Tourmaline | Level 20
OK, next step: the log. Also include the following at the top (SASTRACE=',,,d' writes the SQL that SAS sends to the DBMS into the SAS log):

options sastrace=',,,d' sastraceloc=saslog nostsuffix msglevel=i;
Data never sleeps
SaiSankar116
Calcite | Level 5

HADOOP_8: Executed: on connection 2
LOAD DATA INPATH '/tmp/sastmp/sasdata-2019-04-04-16-30-00-841-e-00001.dlv' OVERWRITE INTO TABLE sastmp_04_04_16_30_02_424_00002


HADOOP_9: Executed: on connection 2
set hive.exec.dynamic.partition.mode=nonstrict


HADOOP_10: Executed: on connection 2
set hive.exec.dynamic.partition=true


HADOOP_11: Executed: on connection 2
set mapred.max.split.size=536870912


HADOOP_12: Executed: on connection 2
INSERT INTO TABLE `cars_temp` SELECT * FROM sastmp_04_04_16_30_02_424_00002


HADOOP_13: Executed: on connection 2
DROP TABLE sastmp_04_04_16_30_02_424_00002

ERROR: Execute error:
ERROR: The INSERT INTO table cars_temp from table sastmp_04_04_16_30_02_424_00002 failed.
NOTE: SAS set option OBS=0 and will continue to check statements. This might cause NOTE: No observations in data set.
NOTE: DATA statement used (Total process time):
real time 8.21 seconds
cpu time 0.51 seconds

 

Extra information:

*********************

The actual cars_temp table is getting created with zero records. ERROR: The INSERT INTO table cars_temp from table sastmp_04_04_16_30_02_424_00002 failed. I checked all the folder permissions (777) and everything looks good. We recently upgraded to HDP 3.1.0.
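One possibility worth checking after the HDP 3.1.0 upgrade (an assumption on my part, not confirmed in this thread): Hive 3 creates managed tables as transactional (ACID) by default, and inserting into an ACID table from a non-transactional staging table can fail. A sketch of asking SAS to create the target table as an external (non-transactional) table instead, assuming the DBCREATE_TABLE_EXTERNAL= option is available in the SAS/ACCESS to Hadoop release in use:

```sas
/* Assumption: on HDP 3.x, Hive managed tables are ACID by default.
   DBCREATE_TABLE_EXTERNAL=YES asks SAS to issue CREATE EXTERNAL TABLE,
   which stays non-transactional. Verify the option against the
   SAS/ACCESS to Hadoop documentation for your release. */
libname myhadoop hadoop server=&svr hdfs_tempdir='/tmp/sastmp'
  user=hive password=pxxxx schema=schema1 port=10000
  subprotocol=hive2
  DBCREATE_TABLE_OPTS='STORED AS PARQUET'
  DBCREATE_TABLE_EXTERNAL=YES;
```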

Discussion stats
  • 4 replies
  • 1131 views
  • 0 likes
  • 2 in conversation