Microsoft has its Copilot, and there is IntelliSense (code suggestions as you type), but I think there can be something more profound. Let's call it SAS Guru.

As you code in the various interfaces (SAS Enterprise Guide or SAS Studio), the tool could score your code (maybe even using some statistics!). When it finds code that seems inefficient (repeated terms, cryptically short variable names, lack of comments, lack of coding structures, long procedural-style code, etc.), it could send the badly scored pieces in real time to a generative AI sidebar that makes contextual suggestions for code enhancements:

"SAS Guru: It appears that this function is doing an average payment calculation; consider renaming variable "c" to "cAvgPmt" for clarity."

"SAS Guru: This code could be rewritten to enhance maintainability by creating variables for these values and rewriting it with a %do loop; see this rewritten code suggestion... %do ..."

Similarly, when leveraging a data access method or writing a PROC SQL step or DATA step, the query or DATA step could be sent to the AI to suggest enhancements. Crucially, the tool could also send the underlying table configuration in real time to get the best possible answer: knowing the data types, the database interface, the table structure, the context of those structures (to suggest in-database analytics), and what is indexed, i.e. things a non-integrated AI couldn't inherently glean. In the same vein, it would know when you're merging data between data sources and could suggest more optimal ways of doing it.

When code is running, you could use AI to give an analysis of the log results or the overall code, building information about iterative runs and presenting that AI-generated analysis back:

"SAS Guru: This run was faster than your last run (December 3rd at 6am) by 18 minutes; that's 10% faster. The performance of all successful runs to date is between 1 hour and 1.5 hours.
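To make the %do-loop suggestion concrete, here is a minimal sketch of the kind of before/after rewrite SAS Guru might propose (dataset and variable names such as raw, pmts, and amt_2021 are made up purely for illustration):

```sas
/* Before: repeated near-identical assignments; adding a year
   means copying and editing another line by hand */
data pmts;
  set raw;
  pmt_2021 = amt_2021 / n_2021;
  pmt_2022 = amt_2022 / n_2022;
  pmt_2023 = amt_2023 / n_2023;
run;

/* After: the year range becomes macro parameters and a %do loop
   generates the assignments, so extending the range is a
   one-parameter change */
%macro avg_pmts(start=2021, end=2023);
  data pmts;
    set raw;
    %do yr = &start %to &end;
      pmt_&yr = amt_&yr / n_&yr;
    %end;
  run;
%mend avg_pmts;
%avg_pmts()
```

The point is not this specific rewrite but that an integrated assistant could offer it in context, already wired to the variable names in your program.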
Your fastest runs occurred between 5pm and 6pm, and your slowest runs occurred in the mornings. Your overall program took 45 minutes to run; the 4th procedure (PROC SQL - ...) took the longest time (40m). It brought in 100 columns and 1m records, but your program only uses 35 of these columns and 20,000 rows."

(Using AI to suggest a resolution:)

"First, consider using mod(account_number,50)=0 in the WHERE clause to build a sample set without importing all the data. Next, consider updating the SELECT with these (...used columns...); this will drop your data import by approximately 70%. Also consider moving your 6th procedure into the "whatever database" using this suggested code rewrite; this will save 6 minutes and reduce your data import by 20%."

I'm a performance guy, so I'd love this: you could definitely use it to suggest parameters to functions, better functions, and so on. If using AI is a stretch, doing some post-run analysis and suggestions based on the log (or iterative logs) would still be a really nice enhancement.
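To illustrate the sampling and column-pruning suggestions together, here is a rough sketch of the rewritten query such a tool might offer (the table and column names are hypothetical, and db is an assumed libref pointing at the source database):

```sas
proc sql;
  create table work.pmt_sample as
  select account_number,   /* only the columns the program    */
         pmt_date,         /* actually uses, instead of       */
         pmt_amt           /* select * pulling all 100        */
  from db.payments
  where mod(account_number, 50) = 0;  /* roughly a 2% sample,
                                         built before import
                                         rather than after    */
quit;
```

With a SAS/ACCESS libref, a WHERE clause and an explicit column list like this can often be passed through to the database, so the filtering happens in-database and the full 1m rows never cross the wire. That is exactly the kind of optimization the assistant could suggest because it knows the libref, the table structure, and the indexes.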