No, we still have SAS running on the mainframe. One reason = MXG, and there are others.
There was little to no conversion from mainframe to SAS BI here. SAS BI was implemented as a "new" thing. Mainframe data is run through ETL processes, either with SAS or with Informatica, to feed the DB2 database(s) and SAS with the data that supports BI. SAS Environment = Windows.
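As a rough illustration of what the SAS side of such an ETL feed can look like (a minimal sketch -- the library names, schema, data source, and variables here are hypothetical, and it assumes SAS/ACCESS Interface to DB2 is licensed):

```sas
/* Hypothetical ETL step: clean a mainframe extract, then load DB2. */
/* All names, paths, and connection options are placeholders.       */

libname stage 'D:\etl\stage';                      /* landing area for the extract  */
libname dw db2 datasrc=DWDB2 schema=BI
           user=etluser password="&dbpw";          /* requires SAS/ACCESS to DB2    */

data stage.claims_clean;
   set stage.claims_raw;
   if missing(claim_id) then delete;               /* basic cleansing before load   */
   service_dt = input(service_dt_txt, yymmdd10.);  /* text date -> SAS date         */
   format service_dt date9.;
run;

proc append base=dw.claims data=stage.claims_clean;  /* load into the DB2 table     */
run;
```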
At another place, SAS, COBOL and C code was converted to populate the data warehouse -- Oracle on AIX -- which became the primary data source, directly connected through metadata. REXX conversions were mostly to Unix Korn shell scripts or new SAS code (had to train the mainframer involved in proper SAS coding perspectives/mentalities). There were a number of SAS programs that got "converted" from OS/390 batch jobs to running as unattended batch on a Unix SAS server, but each group (there were 7) was responsible for its own conversions. Parallel processing occurred for about six (6) months to ensure correct quarterly processing. Part of the conversion included the adoption of Control-M to control scheduling and sequencing of the job streams. Environment = Unix (except for EG). Use of SAS/CONNECT provided access to TBs of historical data on tape, to mitigate having to move it into the Unix environment and retain it on disk.
Downside to Windows = 32-bit processing
Downside to Unix = some SAS capabilities are available only on Windows boxes.
Upside of Windows = more pieces supported, more familiar to user community.
Upside of Unix = better IO performance (best throughput) and 64-bit processing.
In all cases, SAS administration makes a lot of difference for success. Well informed/trained and technically competent SAS Administration = mucho success and much happiness ("This is great!"). Deficiencies in these areas = pain for the user community ("Why can't I do this?", "How can I do this?", "What do you mean ...", "Well this was a waste of time", "How long ...", "I'm converting back", "I don't do that, it's too hard", "I don't use that, it doesn't work the way I need it to work", etc.).
Since the BI environment is going to be around for a long time, and will require a significant investment in infrastructure and people's time, I recommend careful planning by some real design engineers. Design by technician and design by management always causes grief to the end users.
Quality design engineering takes a look at the big picture = "What is trying to be accomplished" and pays attention to the details when it is appropriate, adjusting the design as appropriate. Technicians get embroiled in the details too soon, "designing" from the inside out, providing too much of one thing, not enough of another, and missing outside interconnecting pieces. Management may have the big picture, but then they delegate the task away for detail management, which always causes design changes, and always causes problems because management thinks "I've already dealt with this" and "we're doing it this way because I/we said so". The worst is design by project management so that they can meet their time schedules.
It is important to keep a total system perspective, and then to choose the right tools for the required jobs.
See my other remarks on performance in the Enterprise Guide forum -- Performance differences: SAS/Server vs SAS/PC.
Data movement is the slowest and most inefficient thing to do.
If the mainframe is going to remain, then keep its data there and simply access it through EG, the MetaData Server, and SAS/CONNECT and/or SAS/SHARE.
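A minimal sketch of that "leave the data where it lives" pattern with SAS/CONNECT (the host name, session name, and dataset are assumptions for illustration, not anyone's actual setup):

```sas
/* Sign on to a SAS session on the mainframe; host and names are placeholders */
%let mvshost=mvs.example.com;
options comamid=tcp remote=mvshost;
signon mvshost;

rsubmit;
   /* This code runs ON the mainframe, next to the data */
   proc summary data=prod.claims nway;
      class region;
      var paid_amt;
      output out=work.summ sum=;
   run;
   /* Ship only the small summary back, never the raw data */
   proc download data=work.summ out=work.summ;
   run;
endrsubmit;

signoff mvshost;
```

The point of the sketch is the design choice in L13's sense: the heavy lifting happens where the data already is, and only results move across the wire.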
A hybrid environment has the greatest strength, in my opinion -- mainframe, unix, linux, windows -- the right tool for the right job.
Management needs to be sober about people resources. Consultants vs. full-time employee(s) is a non-trivial consideration. It's not just about salary vs. contract, benefits vs. contract. There's also longevity, and motivation. What about support? If you spend > $100,000 a year for support, isn't that the cost of a salaried person anyway? It may look like you don't have to pay a contractor benefits, but that's not really true, because they get paid more so that they can pay for their own benefits and down/bench time (when they aren't working). There is a mantra for outsourcing, but what really is the quality of the work done elsewhere? What about errors, bugs, support, etc.? Lots to think about.
Message was edited by: Chuck
Sorry for my intrusion. Your post indicates there is so much planning, design, infrastructure, and cost involved in moving from the mainframe to BI. What is the usual justification for moving from the mainframe to other platforms for BI? Is it worth all the work involved?
In our case, the justification (IF we do it) will be cost savings. Mainframe software costs are huge, and being a state agency, budget is critical. The intangibles are what we are looking at now. Yes, your yearly SAS software maintenance bill will be less later on, but up front... not. And as Chuck says, you want to do it properly. I have already discovered the 5 layers of admin and access, so that is much different from mainframe admin.
SAS BI Server is a suite of products from SAS, specifically put together to help an organization (business) improve its "Information Maturity" so that it can more effectively discover new markets, discover efficiency improvements, have more consistent reporting, better information about what is really going on, etc.
Reporting and analysis are big things. The tools generally rely on a SAS MetaData server. Reporting is generally some sort of web publication. Most of the tools are graphically oriented. Thus the Windows bias.
SAS on a mainframe is usually a long established culture of SAS programmers. SAS programming is not the same as COBOL, FORTRAN, PL/C, PASCAL, C, ADA, Java, .net, VB, PL/SQL, PowerBuilder, etc. programming, and so can easily become an arcane practice.
SAS was originally conceived to provide non-computer specialists with statistical analysis of data (SAS = Statistical Analysis System) on a computer using simple coding practices. The language is DATA centric, not machine control centric. Coding consists of identifying the data to be processed, where the results are going to go, and what to do with the data. Detailed work is accomplished through the DATA step, and common or highly sophisticated analytics through PROCs -- e.g. "proc reg" to do regression analysis, or "proc forecast" to do highly advanced statistical forecasting.
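For readers who have never seen it, a toy example of that data-centric style (the dataset and variables are made up for illustration):

```sas
/* Identify the data, say where results go, say what to do with it */
data work.sales;
   input region $ units price;
   revenue = units * price;        /* detailed work happens in the DATA step */
   datalines;
East 120 9.99
West 85 12.50
North 42 15.00
;
run;

/* Sophisticated analytics via a PROC: regress revenue on units */
proc reg data=work.sales;
   model revenue = units;
run;
```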
SAS coding on the mainframe was accomplished with the local text editor, usually ISPF; all SAS processing was batch and required writing a program. I did this every day, all day long, for over 5 years.
Enter EG! Wow, what a great ad hoc tool. I can simply connect, drop in a data set, construct queries, chart results, run correlative analyses, transposes, etc. without having to write any code. My productivity at ad hoc analysis seriously improved, sort of. EG has a lot of drawbacks for multi-tasking human workloads, and in dealing with large and huge datasets. There are things I still have to do inside Excel that I had hoped to convert to EG.
Now not only is there EG, but a plethora of point and click tools, making the power and magic of SAS available to a whole bunch of non-programmers, SAS's original intent.
But, underneath it all, is a complicated, non-trivial infrastructure. There has to be. Consider old-fashioned programming. A compiler is a highly sophisticated, complicated and non-trivial piece of programming, just to simplify the programming process for human beings, so that more people can attack more problems. Object orientation helps this effort for interactive programs, and what would otherwise be hugely complicated systems' modeling efforts.
Look at the hierarchy of abstraction in an electronic digital computer:
At the bottom are very specific electronic circuits composed of transistors, capacitors and resistors forming flip-flops, registers, adders, etc. Originally, computers had to be manually wired to implement a program. If the program changed, the wiring changed. Later, it became possible to add memory to the box, and the von Neumann architecture of stored programs came into being. Ones and zeroes, low and high voltages (or vice versa), were stored and sequenced through the machine. Programming changed from hard wiring to setting these ones and zeroes directly, by setting switches and pushing buttons. This was still referred to as machine level programming, in machine language.
The next step was the application of mnemonics to the sets of ones and zeroes to make the instructions readable by a human being, so that many could develop a program and others actually enter the programs. "set A = ...", "set B = ...", "ADD A,B", etc.
Next came programs that translated human readable code into the machine code = COBOL and FORTRAN compilers and assemblers. This started the whole field of computer language development, and programming analysis: FORTH, ALGOL, LISP, BASIC, PL/1, PL/C, C, SAS, PROLOG, ADA, shell programming, etc.
Each of these layers provides a level of abstraction away from the underlying electronics that actually do the work we want done. But to do the abstraction/conversion from one layer to the next, is a non-trivial piece of software or other system feature(s).
The development of the windowing and graphical user interface is a non-trivial piece of coding for the OS to provide those services.
VB is a non-trivial piece of coding to show a box at a specific location on a form and to translate that into an efficient expression of code for the form to be recreated when the job is executed. This non-trivial piece of work took a great deal of planning and design by Microsoft engineers and coders for it to be useful for other human beings to use.
It is the same situation here, creating a SAS BI environment so that people who are specialists at data and business analysis can access and process corporate data to discover useful and meaningful information, without also having to be experts at programming. The difference, is that instead of just buying a compiler and a PC for a developer, you have to provide a network of servers, provide data access, and populate a metadata server with meaningful information about the data. This is not an appliance that you just plug in and use.
The best analogy that I can think of is city planning. A corporation is like a city. A city provides infrastructure for its residents: water, sewer, waste removal, streets, simplified access to electrical power, simplified access to telecommunications, etc. This requires careful planning by the city engineers and planners. The IT division of a corporation is supposed to provide the infrastructure for technology services and information. To do so effectively requires careful design and planning.
Is it worth it? What do you think?
Thank you for such an informative reply. I totally agree with you and kshirley's point that careful planning is an absolute must to make migration a success. I haven't had a chance to play with BI/EG, so I don't know how powerful it is. But let me play a little Devil's advocate here. On the mainframe, interactive reporting and analysis applications can be implemented with tools like SAS AF/FSP/SHARE, and if internet delivery is desired, I heard WebSphere on Open MVS can be used as the front end (I haven't had a chance to explore that area yet). Won't this be the same as BI? Besides, most of the big corporations that have a mainframe would most likely already license SAS as a tuning and performance monitoring tool. Would the cost of adding AF, FSP, etc. be greater than the cost of setting up the BI server and its infrastructure?
SHARE/AF/etc. still doesn't relieve the need for planning, design, human resources, etc. In fact, it requires even more.
I would recommend an exploratory presentation on BI from SAS and a Sales Engineer to review your environment.
Also, as I have said before, and probably elsewhere, SAS and data on the mainframe should stay on the mainframe. BI is simply a way to make the mainframe data readily available to non-SAS programmers and non-Mainframers.
Don't know if you have your answers, but we have totally converted from an MVS mainframe to SAS running on a Windows server. Usage on the mainframe was minimal -- 20-25 jobs, and only a few of those were run daily as scheduled production jobs. Our approach was a bit different: jobs are submitted from the mainframe using XPATH. XPATH runs a Windows batch file, and all output is returned to the mainframe logs.
From the users' point of view, the only thing that changed was the job JCL. All code still resides on the mainframe.