Once you have reached this state of job complexity (150 reports is considerable), you should start building your own job infrastructure.
Since you are using a dedicated SAS server (not just a SAS workstation), and the number of reports points to an organisation of a certain size, I assume there is an organisation-wide scheduling system already in place. I strongly recommend attaching the data warehouse jobs to that.
For this I suggest:
1) have an individual program for each report (or for a group of reports created from the same source(s))
2) all those programs must be able to run in batch, with the same prerequisites (think Linux environment variables used for program control)
3) create a bash script that performs a single batch run, handles the creation of individual job logs, and returns proper exit codes; you can find a blueprint for this in the Lev1/SASApp/BatchServer directory (you may even be able to customize it to your needs via the _usermods files)
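Such a wrapper might look like the sketch below. All paths, the log-directory layout, and the chosen SAS options are assumptions to adapt to your site; the sas.sh under Lev1/SASApp/BatchServer remains the authoritative template. It also illustrates point 2): environment variables exported before the call are visible inside the SAS session.

```shell
#!/bin/bash
# batch_run.sh -- run one SAS program in batch: one log per run, proper exit code.
# SASCMD and LOGDIR are placeholder defaults -- adapt them to your installation.

SASCMD="${SASCMD:-/opt/sas/SASFoundation/9.4/sas}"
LOGDIR="${LOGDIR:-/var/log/sasjobs}"

run_sas_job() {
    local program="$1"
    local jobname logfile rc
    jobname="$(basename "$program" .sas)"
    logfile="$LOGDIR/${jobname}_$(date +%Y%m%d_%H%M%S).log"
    mkdir -p "$LOGDIR"

    # Exported environment variables (point 2) are inherited by the SAS
    # session and can be read there, e.g. with %sysget(REPORT_DATE).
    "$SASCMD" -batch -noterminal -sysin "$program" -log "$logfile"
    rc=$?

    # SAS convention: 0 = clean, 1 = warnings only, >1 = errors
    if [ "$rc" -gt 1 ]; then
        echo "FAILED rc=$rc: $program (log: $logfile)" >&2
    fi
    return "$rc"
}

# Usage: run_sas_job /path/to/report_sales.sas
```

Returning the SAS return code unchanged is what makes the script usable from a scheduler later: the scheduler only needs to evaluate the exit code.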
Now you can start to build the connection from the organisation scheduler to the SAS box, so that your jobs run once the main scheduler has successfully run the jobs that create your inputs.
As an intermediate solution, you could also write a master script that repeatedly calls the script from 3) for all your jobs, checks for return codes, and handles the sequence as necessary.
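That master script could be as simple as the sketch below. The job names are placeholders, and the per-job wrapper from step 3) is called batch_run.sh here purely for illustration; a real version would also log each step and perhaps notify on failure.

```shell
#!/bin/bash
# master_run.sh -- interim "poor man's scheduler": run all jobs in sequence
# and stop the chain as soon as one ends in error (rc > 1 in SAS terms).
# Job names and the wrapper name are placeholders.

RUNNER="${RUNNER:-./batch_run.sh}"   # the single-job wrapper from step 3

JOBS=(
    load_staging.sas
    build_warehouse.sas
    report_sales.sas
)

run_all() {
    local job rc
    for job in "${JOBS[@]}"; do
        $RUNNER "$job"
        rc=$?
        if [ "$rc" -gt 1 ]; then
            echo "chain stopped at $job (rc=$rc)" >&2
            return "$rc"
        fi
    done
    return 0
}
```

Once the organisation scheduler takes over, this loop disappears: each job becomes its own scheduler task with explicit dependencies instead of a fixed sequence.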
While this means a lot of work to set up, once you have the infrastructure in place, adding new jobs and dependencies is a "piece of cake".