I encountered a similar problem recently with a large simulation: I wanted to do a whole bunch of independent computations on each replicate simulation run, much like your computations on each replicate sampled dataset. Consider the SAS code from SAS/CONNECT - Tips and Tricks:

/** Tell SAS you want to do some parallel processing, so use your multiple cores **/
options sascmd="sas";

/** Break the job into multiple tasks to send to each core. You have to balance the load yourself. **/
/** It's ugly, but I put the SURVEYSELECT and REG procedures in each "task chunk" -- I'm not sure how to pass your local dataset to each core otherwise. **/
/** Since the replicate samples are independent anyway, this shouldn't matter much. **/

/** Start task chunk 1 **/
signon task1;
%syslput remvar1=somevalue;
rsubmit task1 wait=no log="task1.log" output="task1.lst";
  proc surveyselect out=chunk1 ;
  run;
  proc reg data=chunk1 ... ;
    by rep;
  run;
endrsubmit;

/** Start task chunk 2 **/
signon task2;
%syslput remvar2=somevalue;
rsubmit task2 wait=no log="task2.log" output="task2.lst";
  proc surveyselect out=chunk2 ;
  run;
  proc reg data=chunk2 ;
    by rep;
  run;
endrsubmit;

/** Start task chunk n **/
signon taskn;
%syslput remvarn=somevalue;
rsubmit taskn wait=no log="taskn.log" output="taskn.lst";
  proc surveyselect out=chunk_n ;
  run;
  proc reg data=chunk_n ;
    by rep;
  run;
endrsubmit;

/** Tell SAS not to do anything until all of the parallel tasks have completed **/
waitfor _all_ task1 task2 ... taskn;

/* do some further local processing */

/** Sign out of the parallel SAS sessions **/
signoff task1;
signoff task2;
signoff taskn;

The SURVEYSELECT samples are independent of each other, so you want to feed these independent tasks to each of your cores. Put PROC SURVEYSELECT and PROC REG between the rsubmit and endrsubmit statements and they will run on different cores (assuming you use wait=no). I had trouble recovering the data from each core.
The datasets go to temporary WORK libraries and are lost when you sign off each session. You can see them living (temporarily) somewhere like C:\Users\AppData\Local\Temp\SAS Temporary Files. It's kind of cool to watch them get created and then disappear during each spawned parallel SAS session.

Some people suggest saving the libname directory in a macro variable. That didn't work for me: the macro variable stored was not in fact the temporary file location created by the parallel session, and I'm not sure why. My workaround was to put a libname statement in each task chunk and send the work done on each core to a permanent file location that you know. Then reference that libname in your local SAS session to recover the work done in each chunk on each core.

That's about it. In my opinion it is quite annoying: SAS makes the user do all the work of passing datasets, macro variables, etc. to the separate cores. The user must balance the load on their own, control memory usage, and so on. It might be easier with the %Distribute macro, but I could not get that to work either.

For your problem I think it would be much more feasible with sample(), lm(), foreach(), doParallel(), and snow()/parallel() in R. Way nicer features than SAS and much better documentation.
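To sketch the workaround: inside each task chunk, point a libname at a permanent directory and write the procedure output there instead of WORK, then attach the same libname locally after waitfor. The path, library name, and OUTEST= dataset below are placeholders I made up for illustration; substitute your own.

```sas
/** Hypothetical task chunk: write results to a permanent library instead of WORK **/
rsubmit task1 wait=no log="task1.log" output="task1.lst";
  /* Placeholder path -- use a directory the remote session can reach */
  libname outlib "C:\myproject\results";
  proc surveyselect out=chunk1 ;
  run;
  /* OUTEST= sends the fitted coefficients to the permanent library */
  proc reg data=chunk1 outest=outlib.est1 ... ;
    by rep;
  run;
endrsubmit;

/** Later, back in the local session, attach the same directory to recover the work **/
libname outlib "C:\myproject\results";
```

Because every chunk writes to the same directory under a distinct dataset name (est1, est2, ..., est_n), the local session can simply concatenate them once waitfor returns.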