libname pdoa hadoop
    uri="jdbc:hive2://datalab-hive-prod.gslb.db.com:10000/pr_pwcc_analysis;principal=hive/datalab-hive-prod.gslb.db.com@DBG.ADS.DB.COM;ssl=true?mapreduce.job.queuename=root.ACE-S-FRA-DATALAB-PROD-PC_PDOA;"
    server="datalab-hive-prod.gslb.db.com"
    port=10000
    schema="pc_pdoa_analysis_pcb_nemp";
data pdoa;
    set pdoa.Eap_pd697h;
    where active=1 and partid=4000300;
run;
When the same extract is done using ODBC, the expected values are returned.
I can work around the issue using explicit pass-through and the from_utc_timestamp function:
proc sql;
    connect to hadoop (
        uri="jdbc:hive2://datalab-hive-prod.gslb.db.com:10000/pr_pwcc_analysis;principal=hive/datalab-hive-prod.gslb.db.com@DBG.ADS.DB.COM;ssl=true?mapreduce.job.queuename=root.PWCC.ACE-S-FRA-DATALAB-PROD-PC_PDOA"
        server="datalab-hive-prod.gslb.db.com"
        port=10000
        schema="pc_pdoa_analysis_pcb_nemp"
        READ_METHOD=HDFS);
    create table test_pdoa as
    select * from connection to hadoop (
        SELECT from_utc_timestamp(ts_to,"Europe/Berlin") as ts_berlin, *
        FROM Eap_pd697h
        where active=1 and partid=4000300
    );
    disconnect from hadoop;
quit;
My question is: is there a way to adjust the LIBNAME statement so that the from_utc_timestamp conversion is applied to all timestamp/datetime columns automatically?
You could look at the TIMEZONE system option, but I suspect that's a rabbit hole more complex than the workaround you have in place.
https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.2/nlsref/n0px72paaaqx06n1ozps024j78cl.htm
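A minimal sketch of what trying that option would look like. TIMEZONE= is a real SAS system option (see the link above), but whether SAS/ACCESS to Hadoop actually applies it when converting Hive TIMESTAMP columns is exactly the uncertainty mentioned, so verify the extracted values against the pass-through result before relying on it:

```sas
/* Set the session time zone before running the extract.
   This affects how SAS interprets datetime values for the session;
   it is NOT guaranteed to replicate from_utc_timestamp() behavior. */
options timezone='Europe/Berlin';

data pdoa;
    set pdoa.Eap_pd697h;        /* same libref as in the question */
    where active=1 and partid=4000300;
run;
```

If the values still come back shifted, the explicit pass-through with from_utc_timestamp remains the reliable approach.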