07-05-2011 04:25 PM
I apologize for asking a question that is not directly linked to a SAS procedure. I am asking because many people have asked whether I have any experience with large data sets. The largest data set I have ever dealt with is about 200 MB, and I haven't noticed much difference in how I work with it.
I realize that nowadays data sets can be several gigabytes or even larger, but I don't understand what a data analyst would do differently with such large data compared with normal-sized data. Could anyone give me some hints? Or is there a book I can read?
Thank you very much.
07-05-2011 05:48 PM
Definitely NOT my area of expertise, but from the position of one who has managed SAS programmers and statisticians, I can guess why they would be asking.
I think it comes down to what a candidate knows about optimizing the analysis of really large amounts of data: what are the benefits of things like Teradata and SQL pass-through, when are such techniques/technologies most useful, and what conditions might make using them a waste of time? Similarly, what are the benefits of using indexes, and under what conditions is their use worthwhile?
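For example, here is a minimal sketch of explicit SQL pass-through to Teradata, followed by creating an index (the server name, credentials, library, and table names are all hypothetical):

```sas
/* Explicit SQL pass-through: the inner query runs on the database
   server, so only the filtered rows cross the network back to SAS.
   Connection details below are placeholders. */
proc sql;
   connect to teradata (server='tdprod' user=myuser password=mypass);
   create table work.big_subset as
   select * from connection to teradata
      (select acct_id, balance
       from dw.accounts              /* hypothetical source table */
       where balance > 100000);
   disconnect from teradata;
quit;

/* An index pays off when later queries select a small fraction of
   rows via the indexed variable; it's wasted overhead if most jobs
   read the whole data set anyway. */
proc datasets library=work nolist;
   modify big_subset;
   index create acct_id;
quit;
```

The general point is that with small data you can afford to pull everything into SAS and subset locally, while with really large data you want the filtering pushed to the database and the data set structured so SAS can avoid full scans.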
As for suggested reading, I'll leave that to those who have actually read the materials.