🔒 This topic is solved and locked. Need further help from the community? Please sign in and ask a new question.
suneelveluru1
Calcite | Level 5

Dear All,

 

In need of your advice.

We are working with a customer who has both a SAS system and a Hadoop system. The challenge is to establish connectivity between SAS and Hadoop; moreover, the customer wants us to develop a connector instead of buying a license for Hadoop connectivity.

Is there any way I can bridge SAS and Hadoop? I have seen some options, listed below.

 

So far we have observed three approaches to connect:

 

Option 1: LIBNAME statements can be used to make Hive tables look like SAS data sets, on top of which SAS procedures and DATA steps can operate.

Option 2: PROC SQL provides the ability to execute Hive SQL commands directly on Hadoop.

Option 3: PROC HADOOP provides the ability to submit MapReduce, Apache Pig, and HDFS commands from the SAS execution environment directly to your CDH cluster.

If none of the above options are feasible, is there any other way to connect?

Thanks,

Suneel

1 ACCEPTED SOLUTION

Accepted Solutions
suneelveluru1
Calcite | Level 5

Dear Gergely,

 

Thanks a ton for your prompt reply. We are following the instructions you gave; here is the link I found for configuring Hadoop with SAS:

https://support.sas.com/resources/thirdpartysupport/v94/hadoop/hadoopbacg.pdf

 

Thanks,

Suneel.

 


2 REPLIES
gergely_batho
SAS Employee
Option 1 and Option 2 are basically the same: once you have established a connection to Hive or Impala, you can use PROC SQL, the DATA step, or any other procedure to access data in Hadoop.
Yes: PROC SQL with the Hive or Impala LIBNAME engine is able to pass through (a.k.a. push down) queries to Hadoop, as with almost all database engines. A SAS/ACCESS Interface to Hadoop or SAS/ACCESS Interface to Impala license is needed for Option 1/Option 2.
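To illustrate Options 1 and 2, here is a minimal sketch assuming a SAS/ACCESS Interface to Hadoop license; the server name, port, schema, user, and the hdp.sales table are all placeholders, not values from this thread:

```sas
/* Option 1: the Hadoop LIBNAME engine makes Hive tables look like SAS data sets.
   Server, port, schema, and table names below are placeholders. */
libname hdp hadoop server="hive-server.example.com" port=10000
        schema=default user=sasuser;

/* Implicit pass-through: SAS pushes the WHERE clause down to Hive. */
proc sql;
    select count(*) from hdp.sales where region = 'EMEA';
quit;

/* Option 2: explicit pass-through -- the inner query runs verbatim in Hive. */
proc sql;
    connect to hadoop (server="hive-server.example.com" port=10000);
    select * from connection to hadoop
        (select region, sum(amount) as total from sales group by region);
    disconnect from hadoop;
quit;
```

Implicit pass-through lets you keep ordinary SAS syntax while the engine pushes down what it can; explicit pass-through guarantees the query executes as written in Hive.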

Option 3: With only a Base SAS license you can use PROC HADOOP to run MapReduce, Pig, and HDFS commands. You can also read and write HDFS files by creating a fileref with the HADOOP access method.
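A sketch of Option 3 with Base SAS only; the cluster configuration file path, username, and HDFS/local paths are placeholder assumptions:

```sas
/* Placeholder: merged Hadoop client configuration file for the cluster */
filename cfg "/opt/sas/hadoop/conf/combined-site.xml";

/* PROC HADOOP submits HDFS commands from SAS to the cluster */
proc hadoop cfg=cfg username="sasuser" verbose;
    hdfs mkdir="/user/sasuser/staging";
    hdfs copyfromlocal="/tmp/input.csv" out="/user/sasuser/staging/input.csv";
run;

/* The HADOOP access method reads an HDFS file directly via a fileref */
filename in hadoop "/user/sasuser/staging/input.csv" cfg=cfg user="sasuser";
data work.staged;
    infile in dsd firstobs=2;
    input id amount;
run;
```

The same PROC HADOOP step can also carry MAPREDUCE and PIG statements to submit jobs to the cluster.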

Option 4: You could start external programs from SAS with the X command. Those external programs could upload/download files and data sets to/from Hadoop.
Similarly, you can run Java and Groovy programs from SAS.
Depending on the applications on Hadoop, if they support some kind of API (REST, for example), you can connect to them with custom-developed SAS programs.
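As a sketch of the REST route, PROC HTTP in Base SAS can call a Hadoop REST endpoint such as WebHDFS; the namenode host, port, and HDFS path here are assumptions, not details from this thread:

```sas
/* Placeholder WebHDFS endpoint; adjust host, port, and path for your cluster */
filename resp temp;

proc http
    url="http://namenode.example.com:9870/webhdfs/v1/user/sasuser?op=LISTSTATUS"
    method="GET"
    out=resp;
run;

/* Dump the JSON directory listing to the SAS log for inspection */
data _null_;
    infile resp;
    input;
    put _infile_;
run;
```

The same pattern works for any REST API the Hadoop applications expose; alternatively, the X command can shell out to `hdfs dfs -put`/`-get` if a Hadoop client is installed alongside SAS.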

The downside of the latter solutions is performance. You usually use Hadoop because the data is big and you don't want to wait; making intermediate copies on the client or server side slows down execution.


Discussion stats
  • 2 replies
  • 980 views
  • 0 likes
  • 2 in conversation