Hey guys, are there any benchmarking tools for measuring the performance of EG and the SAS server? I'm hoping to propose some performance tuning, but that's only meaningful if there are practical ways to benchmark performance.
MOM, BMC Best1 (Performance Assurance for Servers), TeamQuest, HP MeasureWare (Glance+, among others), Tivoli, and other vendor tools gather system performance metrics and aid in system performance analysis.
Native to Windows is the Performance monitor (PerfMon).
Native to Unix are sar, vmstat, iostat.
Additional Unix tools may be pstat, prstat, top, topas, nmon.
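To give a feel for how output from these tools can be turned into numbers, here's a minimal Python sketch (not from the thread; it assumes the default Linux `vmstat` column layout, and the sample output below is made up for illustration) that parses captured `vmstat` output and averages the CPU columns:

```python
# Minimal sketch: summarize CPU usage from captured `vmstat` output,
# e.g. collected with `vmstat 5 12 > vmstat.txt` during peak hours.
# Assumes the default Linux column layout ending in: us sy id wa st.

SAMPLE = """\
procs -----------memory---------- ---swap-- -----io---- -system-- ------cpu-----
 r  b   swpd   free   buff  cache   si   so    bi    bo   in   cs us sy id wa st
 2  0      0 812345  10234 512345    0    0    10    22  150  300 35 10 50  5  0
 4  0      0 798765  10234 515678    0    0    15    30  180  420 60 15 20  5  0
"""

def cpu_summary(text):
    """Average the user/system/idle CPU columns over all sample rows."""
    rows = []
    for line in text.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():   # data rows start with the run-queue count
            us, sy, idle = int(parts[12]), int(parts[13]), int(parts[14])
            rows.append((us, sy, idle))
    n = len(rows)
    return {"avg_user": sum(r[0] for r in rows) / n,
            "avg_sys":  sum(r[1] for r in rows) / n,
            "avg_idle": sum(r[2] for r in rows) / n}

print(cpu_summary(SAMPLE))
```

The same idea applies to sar or iostat output: capture during heavy usage, parse, and summarize, so you have numbers rather than impressions to argue from.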
The best benchmarks for SAS are your biggest SAS jobs; alternatively, simply monitor performance during the heaviest usage and determine where the bottlenecks lie.
Message was edited by: Chuck
Thanks Chuck. I'll skip vendor products like IBM Rational. I was hoping to quantify how well SAS performs per session, but it's difficult given that there are too many users and, as yet, little control over their behavior. How do you guys approach SAS solutioning without a given performance benchmark? Thanks for bearing with my ignorance.
I collect performance metrics for all servers that I'm allowed to.
I collect as many business metrics as I'm allowed to access.
I do performance analysis and usage characterizations of the systems.
I discover useful business information.
I find the correlations between system resource utilization and business workloads.
I then report on my findings and provide advice based on my discoveries.
This is real capacity planning.
Charting, including producing trending/forecast charts that track actual vs. predicted usage, is really just operational monitoring and part of capacity management; it is not capacity planning (although I do use charts as part of my analysis process, since I have a visual orientation).
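As a toy illustration of that correlation step (all numbers below are invented; a real analysis would use metrics you've collected), here is a sketch pairing daily CPU utilization with a business workload driver and computing a Pearson correlation in plain Python:

```python
import math

# Hypothetical daily samples: CPU busy % on the SAS server and
# business transactions processed that day (invented for illustration).
cpu_busy     = [35, 42, 55, 61, 70, 48, 66]
transactions = [10_200, 12_500, 16_800, 18_100, 21_000, 14_300, 19_700]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(cpu_busy, transactions)
print(f"CPU vs. transactions: r = {r:.3f}")
```

A strong correlation like this lets you forecast resource needs from projected business volumes, which is the heart of capacity planning rather than just monitoring.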
While I make extensive use of SAS to do this, SAS is also just another system that requires capacity planning and thus performance analysis and usage characterization.
Benchmarks are used only for comparison of differing hardware.
Granted, benchmarks are more meaningful for gauging the relative performance of two different boxes, but won't you need to make a judgment call about what physical solution is needed (based on how aggressively you intend to crunch data and run sessions)?
Given the high cost of setup, I'd want to see some concrete figures or estimates to justify the type of servers used, the physical architecture, and the logical architecture (at the application layer). It sounds demanding, but I'd ask the same questions if it were a web server or a typical WebLogic server. Once we know the demands SAS makes on a box, we'd be able to estimate what physical box (or what technical specs) we should look at, so we don't end up buying a dual-core PC with 4 GB of RAM when what we really need is an E10K, or a midrange server with 4 CPUs and 16 GB of memory.
1) You have a SAS environment right now that you want to upgrade, right?
2) If so, then you measure the current usage and determine what kind of bottlenecks exist and how long things take.
3) You now ask a lot of questions, like "Would a faster box really improve our productivity? By how much?" Sometimes a faster box reduces human productivity, because "multi-programming/tasking" with long-running processes goes away and work becomes serialized due to the faster turnaround of results.
4) SAS does a lot of valid benchmarking of its targeted applications to help new customers determine how big a box to start with. For example, Risk Dimensions. You provide them with an idea of usage and data characteristics, and they can answer back with "a DL585 with 4x 3.0 GHz Opteron processors should be adequate".
Upgrading a system is like designing a system. These are engineering tasks, so use an engineer's mind, not a technician's. Technicians get into the details of the problem too quickly. Start at a higher level first. Determine: is there a problem, or is it simply a desire? What is desired? What is needed to produce the desired result? Effective programming begins with knowing what the desired result is (specifications), not with "I need this compiler because it's cool" or "I need this box because I want it to be fast". Once the goal is well defined, then you work down into the details needed to accomplish it.
As an electrical engineer, I was once given "we need a circuit to join two systems together". My first move was not to go to the stock room, pull parts, and start building something. It was to determine: what does it mean for the two systems to be joined? What is meant by "joined"? What needs to work? Then I went on to: how do the systems work? What information needs to be shared/passed? How is that information communicated? Eventually, I got down to "OK, I need a transformer to couple the audio lines" and "I need an opto-isolator to transfer some state signals", plus I determined I needed a flip-flop and two ICs of NAND gates to control all the decision logic. Finally, I asked a more experienced engineer what I needed to do to harden the circuits against static electricity, which involved adding some diodes and a couple of MOVs. All the parts and their values were determined by careful top-down thinking, decision-making, and design.
You are asking some valid questions, and make some valid points, but stop and listen to yourself. These things are necessary considerations. So do them. Get the numbers, make the correlations, etc. But first, identify and define the goals and the reasons for them, then get what you need to work your way down to the details of what you want.
You're right. I hope I don't offend with my tone; I was frustrated with myself, not with any person (or even SAS). As much as I gorge myself on SAS literature, there's only so far I can go in terms of testing performance, usability, or configuration without a test platform that I can afford to tinker with.
The metrics for SAS may not be as clear as what I'd see for oracle or weblogic (throughput, number of concurrent threads or transactions per second, volume of traffic handled), but SAS is still an application server, so traditional benchmarking principles should still apply. I'll have to sit down one day and run through all the literature and build a more coherent schema of the architecture and workflow, but my own bandwidth has been eaten up so far.
I was hoping to see some case studies of how other firms or universities have set up their physical and application-layer architecture (and corresponding data architecture). This would be a good way of learning from other customers' merits and mistakes.
I do need to change my mindset. I was working with Red Hat in another work stint, and it took me a while to re-learn how to learn, because Solaris (and its community) is so different. With SAS, I have to still my mind and assess what frame of mind will deliver the right results.
Message was edited by: Joshua
If you benchmark servers, it provides a basis for comparing servers.
If you benchmark "sas" and other analysis programs/systems, it allows comparison of the systems.
If you want to size the hardware for a SAS environment, then you are back to benchmarking hardware, and simply noting the performance of the SAS processes of interest.
You can use the system options STIMER and FULLSTIMER to get elapsed-time measurements for SAS processes, along with the amount of memory SAS used for each step.
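FULLSTIMER writes per-step timing notes to the SAS log. As a sketch of how you might tabulate them for analysis (the log excerpt and regular expressions below are hand-made assumptions about the typical note format, which varies by host and SAS release), in Python:

```python
import re

# Hand-made example of FULLSTIMER-style notes in a SAS log; treat the
# exact wording and layout as an assumption, not a spec.
LOG = """\
NOTE: PROCEDURE SORT used (Total process time):
      real time           12.34 seconds
      user cpu time       8.90 seconds
      memory              204800.00k

NOTE: DATA statement used (Total process time):
      real time           3.21 seconds
      user cpu time       2.10 seconds
      memory              10240.00k
"""

STEP_RE = re.compile(r"NOTE: (.+?) used \(Total process time\):")
TIME_RE = re.compile(r"^\s*(real time|user cpu time)\s+([\d.]+) seconds", re.M)

def step_times(log):
    """Return a list of (step name, {metric: seconds}) in log order."""
    steps = []
    chunks = STEP_RE.split(log)   # ['', step1, body1, step2, body2, ...]
    for name, body in zip(chunks[1::2], chunks[2::2]):
        metrics = {m: float(v) for m, v in TIME_RE.findall(body)}
        steps.append((name, metrics))
    return steps

for name, metrics in step_times(LOG):
    print(f"{name}: real={metrics['real time']}s cpu={metrics['user cpu time']}s")
```

Run your biggest jobs with FULLSTIMER on, collect the logs over a representative period, and you have exactly the per-step numbers needed to find where the time actually goes.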
If you want to design a new environment, then you need to step back and run through the design steps like I wrote earlier.
Minimally, for an effective EG environment, you need a Windows server to hold the metadata repository and a SAS server with the appropriate database clients and drivers installed, plus Base SAS, SAS/IT, SAS/ACCESS for each database and for PC files, and SAS/GRAPH.
Maximally, you have the whole suite of stuff: EG, Web Report Studio, BI, all the SAS components, SAS MetaData Servers, SAN connections, Database Connections, multiple physical SAS servers, including test and dev, and more.
If you need help with this, then contact SAS and have them come out and assess your business for BI, etc. After they leave, step back, take a deep breath, and look at what they say soberly. Remember, salesmen want to sell you the world, when all you may need is a corkscrew.