Does anyone know of any tips for optimizing PROC GINSIDE? It seems to use a fair amount of I/O time. I didn't see any documentation on this in particular, but is there an index you could put on either the input table or the map table that would speed up the process? I am mapping points into 2000 block groups, and it seems to be faster if I split the map dataset up by state. I could go further and split by county, but that would be a lot of tables. Does anyone else have similar tests or know of any techniques for this proc?
Thanks--
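In case it helps anyone landing here later, below is a rough sketch of the split-by-state approach mentioned above, not a definitive implementation. It assumes the point data set (WORK.POINTS here) has X and Y variables in the same coordinate system as the map, plus a character STATE code, and that the block-group boundary file (WORK.BG_MAP) has matching STATE, COUNTY, and BLKGRP variables; all of those data set and variable names are placeholders, so adjust them to your data. The idea is simply that GINSIDE scans far fewer polygons per call when both inputs are cut down to one state at a time.

/* Sketch only: assumes WORK.POINTS has X, Y (same coordinate system as  */
/* the map) and a character STATE code, and that WORK.BG_MAP has STATE,  */
/* COUNTY, and BLKGRP ID variables. All names here are placeholders.     */
%macro ginside_by_state;
   %local i st stlist;

   /* Start fresh so reruns do not double-append */
   proc datasets lib=work nolist;
      delete points_bg;
   quit;

   /* Build a list of distinct state codes from the map data set */
   proc sql noprint;
      select distinct state into :stlist separated by ' '
      from work.bg_map;
   quit;

   %do i = 1 %to %sysfunc(countw(&stlist));
      %let st = %scan(&stlist, &i);

      /* Subset both inputs to one state so GINSIDE has fewer polygons to test */
      data work._pts;
         set work.points;
         where state = "&st";
      run;

      data work._map;
         set work.bg_map;
         where state = "&st";
      run;

      proc ginside data=work._pts map=work._map out=work._out insideonly;
         id state county blkgrp;
      run;

      /* Accumulate the per-state results */
      proc append base=work.points_bg data=work._out force;
      run;
   %end;
%mend ginside_by_state;

%ginside_by_state

If the point data does not already carry a state code, one alternative is to run each state's map subset against the remaining unmatched points with INSIDEONLY and drop matched points from the pool after each pass, at the cost of extra passes over the point file.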
I know this is an old post, but I'm running into the same issue if anyone has any ideas. I have over 600 million rows of census block data.
Is your data 600 million records or the block level map boundary data file?
Hi ballardw,
It's the block-level map boundary data file. My address file has a little more than 13,000 records.
Thank you
Hi @jerry898969,
Did you ever find a solution to the PROC GINSIDE issue?
Sorry to revive a bit of an old discussion, but I am now running into the same problem.
I've already tried a few workarounds, but nothing has helped.
Thank you.