Dear all,
I apologize for asking a question that is not directly related to SAS procedures. I am asking because many people have asked whether I have any experience with large data sets. The largest data set I have ever worked with is about 200 MB, and at that size I don't notice much difference.
I realize that nowadays data sets can be several GB or even larger, but I don't understand what a data analyst would do differently with such large data compared with normal-sized data. Could anyone give me some hints? Or is there a book I can read?
Thank you very much.
Definitely NOT my area of expertise but, from the position of one who has managed SAS programmers and statisticians, I can guess why they would ask.
I think it comes down to what a candidate knows about optimizing the analysis of really large amounts of data. For example: what are the benefits of technologies such as Teradata and SQL pass-through, when are those techniques/technologies most useful, and under what conditions would using them be a waste of time? Similarly, what are the benefits of indexes, and under what conditions would their use actually pay off?
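To make those two ideas concrete, here is a minimal sketch of what SQL pass-through and index creation look like in SAS. All library, table, and column names (and the connection options) are hypothetical placeholders; pass-through pushes the aggregation to the database so only the summarized rows travel back to SAS, and an index helps only when queries select a small subset of rows by the indexed column.

```sas
/* SQL pass-through: the inner query runs on the Teradata server,     */
/* so SAS receives the aggregated result, not millions of raw rows.  */
/* Connection options and table names below are illustrative only.   */
proc sql;
   connect to teradata (user=myuser password=mypass server=tdserv);
   create table work.sales_summary as
   select * from connection to teradata
      ( select region, sum(sales_amt) as total_sales
        from sales_db.transactions
        group by region );
   disconnect from teradata;
quit;

/* Index: worthwhile when WHERE clauses repeatedly select a small    */
/* fraction of a large table by this key; a waste of time (and disk) */
/* when most queries read the whole table anyway.                    */
proc datasets library=mylib nolist;
   modify bigtable;
   index create custid;
quit;
```

The trade-off the reply hints at: building and storing an index costs time and space, so it only helps when subsequent queries are selective enough that SAS can skip most of the table.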
As for suggested reading, I'll leave that to those who have actually read the materials.
Art
Thank you, Art.
Any other comments, please?