Architecting, installing and maintaining your SAS environment

How to move data from SAS to MapR

Occasional Contributor
Posts: 6

How to move data from SAS to MapR

 
 

Hi,

 

We are running SAS 9.3 on a Linux server. We have a requirement to migrate SAS library data to MapR. The SAS library connection uses Oracle tables. We tried to search for the library files on the SAS server but were not able to find them. Please suggest how we can find those SAS data files and where they are stored.

These SAS files need to be transferred to MapR; the scp command can be used for the file transfer.

 

Thanks

AS

Super User
Posts: 3,101

Re: How to move data from SAS to MapR

SAS libraries can be set up in a number of ways, including with a LIBNAME statement in a SAS program, which could be stored almost anywhere. However, I suggest you first check SAS Management Console to see whether the library is defined in SAS metadata - see the Data Library Manager plug-in in SMC.
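If the library does show up (in metadata or in a program), a quick way to see where its data actually lives is to assign it in a SAS session and check what it resolves to. A minimal sketch, assuming the library is assigned under the hypothetical libref MYLIB:

/* physical path, or the engine connection string for a database library */
%put NOTE: MYLIB points to %sysfunc(pathname(MYLIB));

/* engine and member list for the library */
proc contents data=MYLIB._all_ nods;
run;

If PROC CONTENTS reports the engine as ORACLE rather than V9/BASE, there are no .sas7bdat files on the SAS server to copy - the data sits in Oracle and would have to be extracted first.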

Frequent Contributor
Posts: 117

Re: How to move data from SAS to MapR

Your requirement isn't very specific. MapR is a Hadoop distribution that comes with many storage tools and file/data formats. On the other hand, an Oracle table is a proprietary data format - as far as I know, not compatible with any Hadoop-native storage format. You've been asked to migrate data from an Oracle database to Hadoop MapR, using SAS as a kind of handy bridge/migration tool at your disposal. Dumping the data from Oracle with any SQL client that generates flat files (e.g. CSV) would work just as well, if I understand your question correctly.
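If SAS is to be that bridge, one straightforward route - a rough sketch only, assuming SAS/ACCESS to Oracle is licensed, with all names below (credentials, Oracle path, schema, table, output file) as placeholders - is to read the Oracle table through a LIBNAME and dump it to a CSV file, which can then be scp'd onto the MapR cluster:

/* connect to Oracle through SAS/ACCESS to Oracle */
libname oralib oracle user=myuser password="XXXXXXXX" path=orclsrv schema=myschema;

/* dump one table to a flat CSV file on the SAS server */
proc export data=oralib.mytable
    outfile="/sasdata/extract/mytable.csv"
    dbms=csv replace;
run;

libname oralib clear;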

 

 

There are also tools on the MapR (Hadoop) side, such as Sqoop + Oozie or Sqoop2, which can transfer data directly between Oracle and Hadoop/HDFS (flat files stored in distributed chunks inside Hadoop) over a JDBC connection - if your network rules allow it. With an Oracle JDBC driver installed on the corresponding Hadoop node (the one executing Sqoop or Sqoop2), together with the required network flow opening (if feasible), Sqoop can run SQL requests against the Oracle DBMS and feed the Hadoop file system, generating CSV files or, in some cases, Hive tables.

 

In which format are you expected to deliver the data on the target system, namely the MapR Hadoop cluster? Flat files (CSV), binary files (Parquet, Avro, etc.), Hive tables?

 

Unfortunately, your SAS 9.3 system won't be able to communicate directly with MapR, since the SAS/ACCESS to Hadoop connector only supports the MapR distribution as of the latest SAS 9.4 release, M3.
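For reference only: on a 9.4 M3 (or later) system with SAS/ACCESS Interface to Hadoop licensed, a direct connection would look roughly like the sketch below - server, port, schema and credentials are placeholders, and the exact options depend on the Hive/MapR configuration:

/* connect to Hive on the MapR cluster via SAS/ACCESS Interface to Hadoop */
libname hdp hadoop server="maprnode01.example.com" port=10000 schema=default
        user=myuser password="XXXXXXXX";

/* push a SAS data set up as a Hive table */
proc copy in=work out=hdp;
    select mytable;
run;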

 

HTH ...
