Do you have SAS Data Integration Studio? Are you running a job with the job scheduler in the Enterprise Intelligence Platform? Are you doing this as a regular batch job without using DI Studio? Are you trying to write to the WebDAV server (Xythos) from a stored process?
When you are writing programs to execute on the Stored Process Server or the Workspace Server, you sometimes have to change what you do -- it might be different from a regular Base SAS job. Platform configuration and server configuration might affect how you run the job (whether you are using grid servers, server pooling, etc.).
You should probably work with Tech Support on this question, because the kind of load scheduling/pooling you have on your servers, and how you are submitting the code (via a DI Studio job or via a batch SAS job), could affect how you write your program. Before you go down a CALL EXECUTE path -- which places all the generated code into a single "stream" executed in one SAS session -- it is worth having Tech Support take your specific Platform configuration into account.
I understand, and I still think you should work with Tech Support. You need to decide which method you're going to use. Just coding a CALL EXECUTE or using a %INCLUDE will not, by itself, achieve parallel processing.
In a single batch job, you would be submitting code to a single SAS session -- and, as you have noted, for 30,000 outputs, that may not be desirable. Unless you code the ability to spawn separate SAS sessions (as described in the papers), your single batch job would run everything in ONE SAS session -- probably not what you want. Much of this functionality (grid computing, load balancing, spawning servers as needed) is built into how the servers work in the Enterprise Intelligence Platform.
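To make the "spawn separate SAS sessions" idea concrete, here is a minimal sketch of a driver that launches each job as its own OS process. Python is used as a stand-in for whatever scripting your site allows; the `sas -sysin` command line and the job file names are assumptions -- the actual invocation syntax depends on your install.

```python
import subprocess

def spawn_all(commands):
    """Launch every command as an independent OS process, then wait for all.
    Each command would be one separate SAS session, e.g.
    ["sas", "-sysin", "job0001.sas"]  (hypothetical command and paths)."""
    procs = [subprocess.Popen(cmd) for cmd in commands]
    return [p.wait() for p in procs]  # one exit code per job

# Hypothetical usage -- each entry is a separate SAS invocation:
# jobs = [["sas", "-sysin", f"job{i:04d}.sas"] for i in range(30000)]
# spawn_all(jobs)
```

Note that this launches everything at once: for 30,000 jobs that would swamp the machine, which is exactly why a concurrency limit (or letting the Platform's load balancing do it) matters.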
I had a very similar task about a year ago. It is very much dependent on your operating system and scheduler. SAS can generate the jobs and submit them to your batch queue. At the simplest, you can use SAS to generate the job files, then submit them to the batch processor. The OS then has control of the jobs, and you will need to monitor the batch queue if you need to return control to SAS at the end.
By monitoring the queue you can control how many jobs run concurrently. In my case I needed to limit it to 5 or 6 so that I did not impede other systems. When one job finishes, start the next.
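The "run at most 5 or 6 at a time, start another when one finishes" policy can be sketched as a small polling loop. Again this is Python standing in for the OS-level scripting, and the SAS command line in the comment is a hypothetical placeholder:

```python
import subprocess
import time

def run_throttled(commands, max_running=5):
    """Submit commands but keep at most max_running alive at once.
    When a job finishes, the next pending one is started -- the same
    policy as watching a batch queue and topping it up."""
    pending = list(commands)
    running = []
    exit_codes = []
    while pending or running:
        # Top up the pool to the concurrency limit.
        while pending and len(running) < max_running:
            running.append(subprocess.Popen(pending.pop(0)))
        # Reap finished jobs; keep the ones still running.
        still_running = []
        for proc in running:
            if proc.poll() is None:
                still_running.append(proc)
            else:
                exit_codes.append(proc.returncode)
        running = still_running
        time.sleep(0.1)  # poll interval; avoids busy-spinning
    return exit_codes

# Hypothetical usage:
# run_throttled([["sas", "-sysin", f] for f in job_files], max_running=5)
```

On a real scheduler (LSF, cron-driven queues, etc.) you would query the queue instead of polling process handles, but the control flow is the same.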
This is all OS-level coding executed from within SAS. SAS will not multitask from a single session on its own.