SAS BI Server is a suite of products from SAS, put together to help an organization improve its "Information Maturity" so that it can more effectively discover new markets, find efficiency improvements, report more consistently, and better understand what is really going on.
Reporting and analysis are the big draws. The tools generally rely on a SAS Metadata Server. Reporting is generally some sort of web publication. Most of the tools are graphically oriented; hence the Windows bias.
SAS on a mainframe usually means a long-established culture of SAS programmers. SAS programming is not the same as programming in COBOL, FORTRAN, PL/C, Pascal, C, Ada, Java, .NET, VB, PL/SQL, PowerBuilder, etc., and so can easily become an arcane practice.
SAS was originally conceived to provide non-computer specialists with statistical analysis of data (SAS = Statistical Analysis System) on a computer using simple coding practices. The language is DATA centric, not machine-control centric. Coding consists of identifying the data to be processed, where the results are going to go, and what to do with the data. Detailed work is accomplished through the DATA step, and common or highly sophisticated analytics through PROCs -- e.g. "proc reg" to do regression analysis, or "proc forecast" to do advanced statistical forecasting.
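To make that style concrete, here is a minimal sketch of the DATA-step-plus-PROC pattern. The data set, variables, and values are made up for the example:

```sas
/* DATA step: identify the data, compute row-by-row detail. */
/* The data set and variable names here are hypothetical.   */
data work.sales;
   input region $ units price;
   revenue = units * price;   /* the "detailed work" happens here */
   datalines;
East 100 9.95
West 150 8.50
North 80 10.25
;
run;

/* PROC step: hand the prepared data to a packaged analysis. */
proc reg data=work.sales;
   model revenue = units;     /* regression of revenue on units */
run;
```

Note the division of labor: the DATA step describes what to do with each record, and the PROC encapsulates the statistics, so the analyst never touches the machinery underneath.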
SAS coding on the mainframe was accomplished with the local text editor, usually ISPF; all SAS processing was batch and required writing a program. I did this every day, all day long, for over 5 years.
Enter EG (Enterprise Guide)! Wow, what a great ad hoc tool. I can simply connect, drop in a data set, construct queries, chart results, run correlation analyses, transposes, etc. without having to write any code. My productivity at ad hoc analysis seriously improved, sort of. EG has a lot of drawbacks for multi-tasking human workloads, and in dealing with large and huge data sets. There are things I still have to do inside Excel that I had hoped to convert to EG.
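Behind the clicks, EG still generates ordinary SAS code for each task. As a sketch (the table and column names are hypothetical), a query step built in the point-and-click query builder comes out looking something like this:

```sas
/* Sketch of the kind of code a query task generates.       */
/* Table and column names are made up for the illustration. */
proc sql;
   create table work.query_result as
   select region,
          sum(revenue) as total_revenue
   from work.sales
   group by region
   order by total_revenue desc;
quit;
```

That is part of EG's appeal: the generated code can be inspected, copied, and reused in a batch program when the point-and-click workflow runs out of steam.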
Now there is not only EG but a plethora of point-and-click tools, making the power and magic of SAS available to a whole bunch of non-programmers, SAS's original intent.
But underneath it all is a complicated, non-trivial infrastructure. There has to be. Consider old-fashioned programming: a compiler is a highly sophisticated, complicated, non-trivial piece of programming that exists just to simplify the programming process for human beings, so that more people can attack more problems. Object orientation helps this effort for interactive programs, and for what would otherwise be hugely complicated systems-modeling efforts.
Look at the hierarchy of abstraction in an electronic digital computer:
At the bottom are very specific electronic circuits composed of transistors, capacitors, and resistors forming flip-flops, registers, adders, etc. Originally, computers had to be manually wired to implement a program; if the program changed, the wiring changed. Later, it became possible to add memory to the box, and the von Neumann architecture of stored programs came into being. Ones and zeroes, low and high voltages (or vice versa), were stored and sequenced through the machine. Programming changed from hard wiring to setting those ones and zeroes directly, by setting switches and pushing buttons. This was still referred to as machine-level programming, in machine language.
The next step was the application of mnemonics to the sets of ones and zeroes to make the instructions readable by a human being, so that many people could develop a program and others actually enter it: "set A = ...", "set B = ...", "ADD A,B", etc.
Next came programs that translated human-readable code into machine code: assemblers, then COBOL and FORTRAN compilers. This started the whole field of computer language development and programming analysis: FORTH, ALGOL, LISP, BASIC, PL/1, PL/C, C, SAS, PROLOG, Ada, shell programming, etc.
Each of these layers provides a level of abstraction away from the underlying electronics that actually do the work we want done. But doing the abstraction/conversion from one layer to the next is itself a non-trivial piece of software or other system feature.
The development of windowing and graphical user interfaces required non-trivial coding in the OS to provide those services.
VB embodies a non-trivial piece of coding: showing a box at a specific location on a form, and translating that into an efficient expression of code so the form can be recreated when the job is executed. This non-trivial piece of work took a great deal of planning and design by Microsoft engineers and coders before it was useful to other human beings.
It is the same situation here: creating a SAS BI environment so that people who are specialists at data and business analysis can access and process corporate data to discover useful and meaningful information, without also having to be experts at programming. The difference is that instead of just buying a compiler and a PC for a developer, you have to provide a network of servers, provide data access, and populate a metadata server with meaningful information about the data. This is not an appliance that you just plug in and use.
The best analogy I can think of is city planning. A corporation is like a city. A city provides infrastructure for its residents: water, sewer, waste removal, streets, simplified access to electrical power, simplified access to telecommunications, etc. This requires careful planning by the city engineers and planners. The IT division of a corporation is supposed to provide the infrastructure for technology services and information. To do so effectively requires careful design and planning.
Is it worth it? What do you think?
Message was edited by: Chuck