JMarkW
Fluorite | Level 6
I'm curious how dictionary.macros is allocated on UNIX. If several batch jobs are running on UNIX, does each batch job have its own dictionary.macros allocated in temporary space, or is one dictionary.macros dataset shared by all UNIX processes?
11 REPLIES
DanielSantos
Barite | Level 11
From what I know (UNIX or not), each session has its own temporary macro space.

Although you can preload the same macros to different processes, no macro is shared between them.

Cheers from Portugal.

Daniel Santos @ www.cgd.pt
JMarkW
Fluorite | Level 6
Thanks from North Carolina
sbb
Lapis Lazuli | Level 10
You may want to clarify what you mean by "dictionary.macros", please. Or possibly you can cite some documentation reference where you saw the phrase mentioned?

For compiled macros, each user has an object catalog in their own WORK allocation, in addition to the SASAUTOS= CONFIG/OPTIONS setting.

And, regarding the reference to dictionary, possibly you are referring to the DICTIONARY views maintained for each user's session, providing a varied assortment of goodies through the SAS program execution life-span.

Possibly, you will find some useful search hits/matches using the Google advanced search argument below, which requires a phrase match and limits results to just the SAS.COM website/domain:

+"dictionary.macros" site:sas.com


Scott Barry
SBBWorks, Inc.
JMarkW
Fluorite | Level 6
The dictionary.macros that you can query with PROC SQL:

proc sql;
select *
from dictionary.macros;
quit;

I'm dealing with batch software at this point and want to make sure that each process ID on UNIX has its own copy.

I have found dictionary.macros to be a bottleneck for jobs that make too much use of macro variable arrays. "Too much" means tens of thousands of macro variables: there is a lot of thrashing back and forth to disk in the use of macro variables. There is definitely a downside to macro variables for high-volume applications. DATA step processing can be more efficient if designed properly.
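For context, this is the kind of pattern I mean (the table and variable names are just placeholders, not our real code):

proc sql noprint;
   select acct_id
      into :acct1-:acct99999      /* upper bound and names are placeholders */
      from work.accounts;         /* hypothetical source table */
quit;
%let acct_count = &sqlobs;        /* how many macro variables were actually filled */

Every row becomes another macro variable, and every one of those shows up as a row in dictionary.macros.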

I wanted to make sure that dictionary.macros would not be shared between jobs causing an even bigger contention point.

Besides one SUGI hit, Google did not find much on how dictionary.macros gets created.
sbb
Lapis Lazuli | Level 10
SAS maintains this read-only perspective (data tables and SAS views) about much of your SAS processing/operating environment, as earlier mentioned. There is discussion in this SUGI conference paper:

http://www2.sas.com/proceedings/sugi30/070-30.pdf

If you have a SAS performance problem, I encourage you to open a track with SAS Technical Support to determine whether your SAS operating configuration is the root cause of any performance issues.

Scott Barry
SBBWorks, Inc.
SAS_user
Calcite | Level 5
Macros and macro variables are different things.

All macro variables are visible through a view, SASHELP.VMACRO, and every SAS session deals with its own macro variables.
It's better to work with a subset of SASHELP.VMACRO if you use large sequences of macro variables.
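For example (the WHERE conditions are just an illustration), pull only the subset you need instead of reading the whole view:

proc sql;
   select name, value
      from sashelp.vmacro
      where scope = 'GLOBAL'
        and name like 'ACCT%';    /* hypothetical name prefix */
quit;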
Peter_C
Rhodochrosite | Level 12
performance is impacted by many things.
However, dictionary macros hosting won't be your problem.
That is more likely caused by application design: as you identify, too much use of macro variable arrays. That may be worth investing time to research alternative designs.
Without knowing your applications or implementation, the very obvious issue you highlight is open to alternative approaches. Macro arrays are efficient at small volumes but may not scale up efficiently, as you have seen.
Large volumes of data should be handled differently.
The SAS data set architecture is a very robust alternative to macro arrays, and almost without practical limitation.
The SAS 9.2 platform supports in-memory libraries, which can improve performance up to practical limits (memory above 2 GB, IIRC).
Hardware like solid-state drives (SSDs) can move the issue outside of any particular SAS version.
SAS Grid Manager can help you scale outwards to multiple SAS servers.
Macro processing need not create your batch system's constraint - macro variable values can be passed between connected SAS servers.
However, I think your system needs to be designed to deal with these arrays of information differently - not hosted in macro variables.
You're right that there is "definitely a downside" in your application, but with sensible design macro variables need not cause a "downside" for other applications.
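For illustration only - an untested sketch, with made-up data set and variable names - this is the kind of single-step hash lookup that can replace a macro variable array:

/* work.rates (key, rate) and work.trans (key, amount) are made-up tables */
/* standing in for whatever the macro array would have held               */
data result;
   if 0 then set work.rates;           /* puts KEY and RATE into the PDV */
   if _n_ = 1 then do;
      declare hash h(dataset: 'work.rates');
      h.defineKey('key');
      h.defineData('rate');
      h.defineDone();
   end;
   set work.trans;
   call missing(rate);                 /* clear the value carried from the previous row */
   if h.find() = 0 then adjusted = amount * rate;
   else adjusted = .;
run;

The lookup table stays in memory for the duration of the step, and no macro variables are created at all.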

We might be able to help further if you can describe your application in more detail.

PeterC
sbb
Lapis Lazuli | Level 10
Also, a consideration point is that a SAS application's code compilation phase is normally a small part of the overall application performance optimization challenge. I would be more concerned about the DATA and PROC step execution phase performance, as a first priority.

Scott Barry
SBBWorks, Inc.
sbb
Lapis Lazuli | Level 10
Also, my understanding is that SAS macro variable values are stored in memory (symbol tables), and SAS also maintains a SAS view, SASHELP.VMACRO, with information about each declared macro variable.

Scott Barry
SBBWorks, Inc.

SAS Macro Language: Reference, Macro Variables
http://support.sas.com/documentation/cdl/en/mcrolref/61885/HTML/default/a002047074.htm
JMarkW
Fluorite | Level 6
Unfortunately, "memory" in SAS 9.1 means dictionary.macros cached to a UNIX file system directory. I originally realized the problem when somebody else made a Cartesian join in a PROC SQL step that created macro arrays. The bad join created so many macro variables that it filled up the tmp space on a test server. All of that space was in dictionary.macros.

If SAS 9.2 allows dictionary.macros to really reside in memory instead of in a file system directory, that would be helpful, given thoughtful use of macro variables. Applications trying to utilize a terabyte of macro variables will probably always be problematic.
JMarkW
Fluorite | Level 6
I'm stuck with SAS 9.1 on multiple systems with little chance of an upgrade. SAS is not the main reason these servers exist.

I have found that multi-dimensional, third-generation-style programming in DATA steps can solve the performance issue: loops within loops within loops during a single DATA step, much like programming in COBOL or other older languages. The design takes much more thought, but SAS DATA step functionality is well suited to the task.
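Roughly the shape of what I mean, as an untested sketch with made-up table and variable names:

/* load a small lookup table into a temporary array once, then do an  */
/* in-memory lookup for every detail row, all inside one data step    */
data scored;
   array lim(12) _temporary_;              /* size and names are placeholders */
   if _n_ = 1 then do until (last_lim);
      set work.limits end=last_lim;        /* hypothetical lookup: month, limit_amt */
      lim(month) = limit_amt;
   end;
   set work.detail;                        /* hypothetical detail rows: month, amount */
   over_limit = (amount > lim(month));     /* array lookup instead of &&lim&month */
   drop limit_amt;
run;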

I doubt the SAS Institute would recommend the type of SAS macro coding I've been seeing.
