I'm curious how dictionary.macros is allocated on UNIX. If several batch jobs are running on UNIX does each batch job have its own dictionary.macros allocated in temporary space or is one dictionary.macros dataset shared by all UNIX processes?
Could you clarify what you mean by "dictionary.macros", please? Or possibly you can cite a documentation reference where you saw the phrase mentioned?
For compiled macros, each user has an object catalog in their own WORK allocation, in addition to any SASAUTOS= CONFIG/OPTIONS setting.
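For illustration, the autocall search path can be extended via SASAUTOS= - a minimal sketch, where the directory path is hypothetical:

```sas
/* Hypothetical: prepend a site macro directory to the autocall
   search path, keeping the default SASAUTOS fileref at the end. */
options mautosource sasautos=('/site/macros' sasautos);
```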
And, regarding the reference to "dictionary", you may be thinking of the DICTIONARY views maintained for each user's session, which provide a varied assortment of metadata throughout the SAS program's execution life-span.
Possibly you will find some useful search hits using the Google advanced search argument below - a required phrase match, limited to just the SAS.COM website/domain:
I'm dealing with batch software at this point and want to make sure that each process ID on UNIX has its own copy.
I have found dictionary.macros to be a bottleneck for jobs that make too much use of macro variable arrays - "too much" being tens of thousands of macro variables. There is lots of thrashing back and forth to disk when those macro variables are used. There is definitely a downside to macro variables for high-volume applications; DATA step processing can be more efficient if designed properly.
I wanted to make sure that dictionary.macros would not be shared between jobs causing an even bigger contention point.
Besides one SUGI hit, Google did not find much on the creation of dictionary.macros.
All macro variables are exposed through a view: SASHELP.VMACRO.
It's better to work with a subset of SASHELP.VMACRO if you use big sequences of macro variables.
Every SAS session deals with its own macro variables.
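As a sketch of pulling just a subset rather than the whole view (the MYAPP_ name prefix is hypothetical):

```sas
/* Query only the macro variables you care about, instead of
   reading all of SASHELP.VMACRO. Prefix MYAPP_ is illustrative. */
proc sql;
  create table work.my_macros as
    select name, value
      from sashelp.vmacro
      where scope = 'GLOBAL' and name like 'MYAPP_%';
quit;
```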
Performance is impacted by many things.
However, how dictionary.macros is hosted won't be your problem.
That is more likely caused by application design - as you identify, too much use of macro variable arrays. It may be worth investing time to research alternative designs.
Without knowing your application or its implementation, the obvious issue you highlight is open to alternative approaches. Macro arrays are efficient at small volumes but may not scale up efficiently, as you have seen.
Large volumes of data should be handled differently.
The SAS data set architecture is a very robust alternative to macro arrays, and almost without practical limitation.
The SAS 9.2 platform supports in-memory libraries, which can improve performance up to practical limits (memory above 2 GB, IIRC).
Hardware like solid state drives (SSDs) can move the issue outside the SAS version question.
SAS Grid Manager can help you scale outwards to multiple SAS servers.
Macro processing need not create your batch system's constraint - macro variable values can be passed between connected SAS servers.
However, I think your system needs to be designed to deal with these arrays of information differently - and not hosted on macro variables.
You're right that there is "definitely a downside" in your application, but with sensible design, macro variables need not cause a "downside" for other applications.
We might be able to help further, given a fuller description.
Also, consider that a SAS application's code compilation phase is normally a small part of the overall performance optimization challenge. I would be more concerned about DATA and PROC step execution phase performance as a first priority.
Also, my understanding is that SAS macro variable values are stored in memory (symbol tables), while SAS also maintains a view, SASHELP.VMACRO, with information about each declared macro variable.
Unfortunately, "memory" in SAS 9.1 means dictionary.macros cached to a UNIX file system directory. I originally realized the problem when somebody else made a cartesian join in a PROC SQL step that created macro arrays. The bad join created so many macro variables that it filled up the tmp space on a test server, and all of that space was in dictionary.macros.
If SAS 9.2 allows dictionary.macros to really reside in memory instead of a file system directory, that would be helpful with thoughtful use of macro variables. Applications trying to utilize a terabyte of macro variables will probably always be problematic.
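As a sketch of the pattern that can blow up (table names and the join are hypothetical), a PROC SQL SELECT ... INTO with a numbered range creates one macro variable per matching row, so an accidental cartesian product multiplies the count:

```sas
/* Hypothetical: no join condition between A and B gives a cartesian
   product, and every result row spawns another macro variable. */
proc sql noprint;
  select a.id, b.amount
    into :id1-:id999999, :amt1-:amt999999
    from work.big_a as a, work.big_b as b;   /* missing WHERE clause */
quit;
%put NOTE: &sqlobs rows, so &sqlobs macro variables per column.;
```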
I'm stuck with SAS 9.1 on multiple systems with little chance of an upgrade. SAS is not these servers' main reason for being.
I have found that using multi-dimensional, third-generation-style programming in DATA steps can solve the performance issue: loops within loops within loops during a single DATA step, much like programming in COBOL or other older languages. The design takes much more thought, but SAS DATA step functionality is well suited to the task.
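A minimal sketch of that approach (the array dimensions and calculation are placeholders): a DATA step temporary array holds the values entirely in memory, replacing a macro variable array, with nested DO loops doing the multi-dimensional work.

```sas
/* Hypothetical: a 2-D in-memory temporary array with nested loops,
   instead of tens of thousands of macro variables on disk. */
data _null_;
  array rate{12,100} _temporary_;              /* stays in memory   */
  do month = 1 to 12;                          /* outer loop        */
    do product = 1 to 100;                     /* inner loop        */
      rate{month,product} = month * product;   /* placeholder calc  */
    end;
  end;
run;
```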
I doubt the SAS Institute would recommend the type of SAS macro coding I've been seeing.