06-15-2016 12:53 PM
I'm not sure where to post this question, so I'm going to start here.
I have a fairly deep Base SAS background. I've been playing with the Hortonworks Sandbox to get introduced to 'big data' and to tooling for data discovery and analysis, using, say, micro-marts.
What I found is that as you walk the data discovery path on a Hadoop platform, once you reach the point where you need to create new 'columns' or otherwise transform the data, you end up writing what are essentially SQL queries in Hive. In my humble opinion, these are more difficult to use and more error prone than SAS DATA steps, which let you manipulate and discover the data stepwise.
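To make the comparison concrete, here is a minimal sketch of the same derived-column transformation written both ways. The table and column names (sales, revenue, cost, margin) are hypothetical, not from any real environment:

```sas
/* SAS: derive a new column in a DATA step (hypothetical names) */
data work.sales_margin;
  set work.sales;
  margin = revenue - cost;   /* new variable appears alongside the originals */
run;
```

```sql
-- Hive: the same derivation as a create-table-as-select (CTAS)
CREATE TABLE sales_margin AS
SELECT s.*,
       s.revenue - s.cost AS margin
FROM sales s;
```

For a single derived column the two are comparable; the difference shows up as the logic grows.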
Has anyone made a similar comparison of programmer ease of use, and perhaps sees it differently?
06-16-2016 02:23 AM
06-16-2016 06:13 AM
Thanks for your reply.
Yes, Hadoop (and I'm referring here to the example environment of the Hortonworks Sandbox) is made to handle the physical side of data management with high performance. In that space, SAS High-Performance Analytics can also handle the physical side.
My question is focused on ease of use for the user/programmer doing data discovery. For me at least, SAS DATA step processing is far easier than SQL queries (long, complex, error prone, and obtuse) for data viewing, data discovery, data transformation, and data model emergence, except for simpler queries that join or subset data tables.
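As an illustration of the 'stepwise' point, consider a two-step transformation (all names hypothetical): with DATA steps each intermediate dataset can be inspected before continuing, while in Hive the same logic tends to nest into a single query.

```sas
/* Step 1: derive margin; work.step1 can be viewed before going on */
data work.step1;
  set work.sales;
  margin = revenue - cost;
run;

/* Step 2: keep profitable rows and derive a ratio */
data work.step2;
  set work.step1;
  if margin > 0;
  margin_pct = margin / revenue;
run;
```

```sql
-- Hive: equivalent logic folded into one nested query
SELECT t.*,
       t.margin / t.revenue AS margin_pct
FROM (SELECT s.*, s.revenue - s.cost AS margin
      FROM sales s) t
WHERE t.margin > 0;
```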
Other points of view on the ease-of-use factors are invited (SAS support?).
06-16-2016 08:43 AM