The previous examples used SAS jobs to create the visual output displayed in the SAS Visual Analytics (VA) report. This certainly expands the possibilities, but the static nature of ODS output may not meet interactive requirements, and at a minimum it doesn't fit well with the overall look and feel of a VA report.
This article introduces a creative way of leveraging SAS jobs to extend VA's capabilities, not so much to visualize the data as to manipulate it or enhance it with new calculations before it's visualized in VA. There are two ways this can be done:
The SAS job streams a table back to the DDC (Data-Driven Content), most likely in JSON format produced with PROC JSON, and the DDC uses a third-party visualization package to display the results. This isn't much different from the way DDCs are normally used. The key difference is that the data the DDC displays is passed to and transformed by the job, instead of coming directly from VA.
The SAS job stores the processed result table in memory (on the CAS server), and VA uses that table like any other source table that has been loaded into memory. This allows you to use out-of-the-box VA objects to visualize the data, perfectly blended into the report.
Because option #1 is so similar to the traditional usage of DDC, we are not going to explore it here. The only thing that might be new to you in that approach is the part that streams data back to the DDC in JSON format, and you will see some of that happening in option #2.
HelloCASWorld Example
This example is built on top of the example called HelloBigWorldFormatted, introduced in the previous article. As mentioned above, it produces a table loaded in CAS so it can be explored in VA like any other in-memory table. In addition, the example explores a few ideas for enhancements:
Handle errors that might occur in the SAS job. This is especially important because the job doesn't produce any visual content, so feedback to the DDC, and ultimately to the VA report, is the only way to communicate the status of the job execution.
Make the job reusable in many different reports by passing the name of the table to be loaded in memory as a parameter in the JSON message that VA sends to the DDC and the DDC sends to the job.
Make the DDC reusable with any job that produces a CAS table as the output by passing it parameters such as the name of the job to be executed and a message to be displayed while the DDC waits for the job to finish. Those changes make the DDC work as a proxy, whose sole function is to make the connection between VA and the job.
Instead of exploring this new example with all the enhancements added at once, you will first be introduced to the basic version of the job and the DDC code that are able to load a table in memory, followed by the final enhanced version with all the bells and whistles.
SAS Job Code (first pass)
This job is very similar to the one called HelloBigWorldFormatted. Starting with that job, you need to add a few extra lines in the initialization portion of the code to account for connecting to CAS and setting the CASUSER libname:
Picture 1- Setting up CAS session and library
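For reference, here is a minimal sketch of what this initialization might look like (the session name is illustrative):
cas mySession;          /* start a CAS session for this job */
caslib _all_ assign;    /* assign librefs for the existing caslibs, including CASUSER */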
Also, in the main processing portion of the code you need to replace the PROC PRINT statement that generates the ODS output with the code that loads the output table in memory:
Picture 2- Saving output table to CAS
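The load itself could be as simple as the sketch below, assuming the job's processed output sits in a WORK table (the table names here are illustrative):
proc casutil;
   droptable casdata="hellocasworld" incaslib="casuser" quiet;   /* remove a previous copy, if any */
quit;

data casuser.hellocasworld (promote=yes);    /* write the result to CAS and promote it to global scope */
   set work.result;                          /* processed output produced earlier in the job */
run;
Promoting the table to global scope is what allows it to outlive the job's CAS session and be used as a source table by the VA report.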
The CASUSER output table name is fixed in this implementation, but obviously it would be nice to parameterize this table name, so that you could reuse the same job with multiple VA reports if needed, each one writing to its own table. This parameter could be set in the DDC code, but then you would have the same problem (the DDC could only be used in one VA report). Alternatively, you could set this parameter in the VA report and pass it to the DDC, which would then pass it to the job. This is exactly what you will see in the enhanced implementation later in this article.
Finally, you need to terminate the CAS session created in the initialization phase:
Picture 3- Terminating CAS session
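Assuming the session was named as in the earlier sketch, this is a single statement:
cas mySession terminate;   /* release the CAS session */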
Data-Driven Content Code (first pass)
The changes necessary in the DDC code are also minimal. Considering that the job no longer sends information back to the DDC, you basically need to adjust the actions taken once the function callJob() finishes execution. Because this DDC doesn't have any job output to display, it displays some job status messages during execution and clears them when the execution ends:
Picture 4- Modified pre and post callJob() execution to inform job status
SAS Visual Analytics Report (first pass)
The report in this example looks like the following:
Picture 5- VA report (first pass)
The List Table at the top was added just for your reference. It is based on the input CARS data source, loaded in memory from the SASHELP library, and displays what's expected as the output of the DDC/job. The object in the middle is the DDC. It doesn't contain any visible output, except for a few status messages while the DDC/job are being executed. The List Table at the bottom uses the table loaded in the CASUSER library that the job created. The Button Bar at the very top of the report has actions to filter the first List Table and the DDC objects, so you can filter the data and see the job in action.
There are a few additional things about the VA report that should be highlighted:
When designing the VA report, you must remember that the job’s output table in the CASUSER library doesn’t exist until you execute the job successfully at least once.
Only after the table is available in the CASUSER library can you add it to the report and use it as the source of other VA report objects, like the List Table at the bottom.
Tables from the CASUSER library that are added to VA reports keep a reference to the report author. This reference must be removed before the report is shared with a broader audience. As the report designer, you do that by editing the report BIRD XML:
a. Save the VA report
b. Hit Ctrl+Alt+B to open the SAS Visual Analytics Diagnostics window
c. Make sure the BIRD tab across the top and the XML button are selected (these are the defaults)
d. Click on the Data icon on the left
e. Scroll down searching for the reference to the CASUSER library
f. Remove the (<userid>) suffix that is appended to the CASUSER library name
Picture 6- Removing userid from CASUSER
g. Click the Load button on the top left
h. Save the VA report
If you have the right privileges for exporting and importing content in SAS Environment Manager (menu option Manage Environment), you can replace all the steps above by exporting the report and then reimporting it. You just need to replace the target CASUSER(<userid>) library with just CASUSER in the GUI during the import process.
You must set automatic refresh on the objects that depend on the table loaded in the CASUSER library, otherwise they will not refresh as the data changes with each job execution. You do that in the object's Options tab by turning on the option labeled either "Automatically refresh object" or "Periodically reload object data", depending on the VA release. In this example, the object that needs automatic refresh is the List Table at the bottom of the report. Remember that automatic refresh only works when the report is in view mode.
Picture 7- Setting up auto-refresh in all objects that depend on the table created by the job
SAS Job Code (with enhancements)
Although the previous implementation works just fine, it lacks some basic capabilities to make it more robust and flexible.
First, what if there is an issue with the job's logic and it fails? The HTTP call to the job would still be successful, but the results produced by the job would be incomplete or incorrect. To respond to this better, you have to make the job a bit more intelligent by handling errors at every step of the process and informing the DDC if something goes wrong.
Second, as discussed previously, what if you need to replicate the job functionality in another VA report? If you simply reuse the same job, you will end up with two reports writing to the same CASUSER table. A simple solution would be to make a copy of the job just to replace the hard-coded name of the output table, but that would create maintenance overhead down the road. The best solution is to parameterize the name of the output table that the job creates. Working towards more robust job code, you can also test for the existence of this parameter (the macro variable CASTAB in this example) and issue an error if the parameter doesn't exist.
You will do those things by defining two macro functions. One called stopOnError that sends back a JSON message to the DDC in case of an error, and one called checkParams that validates the existence of expected parameters:
Picture 8- Macro functions to check job execution status and validate input parameters
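As a rough sketch, the two macro functions could look something like this (the macro names come from the text; the bodies, the _webout fileref, and the messages are illustrative):
%macro stopOnError(msg);
   /* &SYSERR greater than 4 means the previous step ended in error */
   %if &syserr > 4 %then %do;
      proc json out=_webout nosastags;
         write open object;
         write values "success" false;      /* job did not finish as expected */
         write values "retcode" &syserr;    /* return code of the failing step */
         write values "message" "&msg";     /* additional information for the DDC */
         write close;
      run;
      %abort cancel;                        /* stop the job immediately */
   %end;
%mend stopOnError;

%macro checkParams(param);
   /* fail fast if a required parameter was not passed by the DDC */
   %if not %symexist(&param) %then %do;
      proc json out=_webout nosastags;
         write open object;
         write values "success" false;
         write values "retcode" -1;
         write values "message" "Required parameter &param was not provided";
         write close;
      run;
      %abort cancel;
   %end;
%mend checkParams;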
Below is an example of calling the stopOnError macro function after each step in the process:
Picture 9- Checking job execution status after each step
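As a sketch of that pattern (the processing step itself is hypothetical):
data work.result;
   set work.vadata;    /* table uploaded by the DDC */
   /* ...calculations... */
run;
%stopOnError(Error while processing the uploaded table);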
The JSON message produced by the PROC JSON in this example has three key-value pairs:
success: set to true or false depending on whether the job executed successfully
retcode: the value of SAS's predefined macro variable SYSERR in case of error, or 0 (zero) if successful
message: text message containing additional information about the status of the execution
If the job execution reaches the end, you should send a JSON message back to the DDC saying the job executed successfully:
Picture 10- Sending a JSON message back to the DDC informing the job executed successfully
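A minimal sketch of that final message, assuming the same _webout fileref as before:
proc json out=_webout nosastags;
   write open object;
   write values "success" true;
   write values "retcode" 0;
   write values "message" "Table &CASTAB successfully loaded in CASUSER";
   write close;
run;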
With the CAS table being parameterized, the main processing steps would need to be re-written to replace the fixed table name with the macro variable &CASTAB:
Picture 11- Adapting the SAS code to name the output CAS table according to the parameter
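Concretely, the load step from the first pass could now look like this sketch, with the fixed table name replaced by the macro variable:
proc casutil;
   droptable casdata="&CASTAB" incaslib="casuser" quiet;   /* remove a previous copy, if any */
quit;

data casuser.&CASTAB (promote=yes);   /* output table named after the CASTAB parameter */
   set work.result;
run;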
Data-Driven Content Code (with enhancements)
In the enhanced DDC code you need to do the following:
Retrieve the parameter containing the name of the job’s output table that came in the message received from VA and pass it to the job. That will become the macro variable CASTAB you have just seen.
Retrieve additional parameters from the VA message, such as the name of the job to be executed and the text message to be displayed while the DDC waits for the job to execute, and apply them where appropriate. This works towards making the DDC code configurable and reusable for multiple use cases.
Check for possible error messages coming from the job by checking the job’s returned JSON message {success: …, retcode: …, message: …}
For the first two items above, you can leverage the function getVAParameters() that is available in the GitHub utility library called util/contentUtil.js.
Picture 12- Adding contentUtil.js utility script file
You need to modify the onDataReceived() callback function to retrieve the parameters from VA, validate them, and send a message back if something is missing. That's implemented with the code highlighted in yellow at the beginning and the bigger block of code in the middle, as indicated below. You also need to modify the code that verifies the response received from the job and sends the proper messages back to VA, which is done in the highlighted block at the end. The line that calls the va.contentUtil.initializeSelections() function is just a best practice to remove any columns that indicate brushing, which doesn't make sense in a DDC whose output is an in-memory table:
Picture 13- DDC JavaScript code with enhancements
Observe in the last highlighted block above that when the job finishes and execution resumes in the callJob() function, the DDC reports the status of the job execution. If the execution succeeds, a temporary "Done!" message is displayed for two seconds before being replaced with an empty string.
Considering that the job now needs the name of the output CASUSER table to be passed as a parameter called CASTAB (not case sensitive), and that the job returns JSON back to the DDC, the callJob() function also needs to be slightly modified to account for both:
Picture 14- Modifications in callJob() function to pass the name of the output table as parameter and receive execution status information
Observe that the job parameter _output_type is set to JSON, and the "Accept" field in the header of the HTTP call is set to application/json, matching what the job returns to the DDC. With those modifications, the DDC is generic enough to work as a proxy for launching jobs that produce non-visual output.
Just to recap, the DDC receives from VA the data (aggregated table) and three parameters:
_job_name – the name of the job to be executed
_job_executing_message – the message to be displayed while the job is being executed
_job_output_cas_table – the name of the table to be created in CASUSER library (this is the only parameter passed down to the job)
The data is uploaded to the server and the name of the table to be created is passed to the job. The job reads the uploaded table, processes it, and saves the output table with the desired name. The job then returns a JSON object with the execution status to the DDC, which interprets it and displays the proper message.
SAS Visual Analytics Report (with enhancements)
The report in this example looks exactly like the previous one, and the same observations made about that previous report apply here. In addition, this enhanced version of the report must have the following:
You must define at least the parameters _job_name and _job_output_cas_table in the VA report and assign a value to them. The parameter _job_executing_message is optional. All of them are character parameters and their names are case sensitive. For this example, the following values were assigned:
_job_output_cas_table: "CARS_COPY"
_job_executing_message: "Waiting for job to finish..."
_job_name: "/Public/Jobs/SAS Communities/HelloCASWorld"
Picture 15- Example of VA parameter indicating the name of the output table to be created
VA parameters are only passed in the JSON message to the DDC if the parameters affect the data that is being passed to the DDC. This is further explained in the article Using parameters with Data-Driven Content in SAS Visual Analytics. In this example, you guarantee that all three parameters are passed to the DDC by setting a dummy advanced filter in the DDC object that looks like this:
Picture 16- Dummy advanced filter applied to the DDC object to guarantee the parameters are being passed in the VA message
Deploying This Example
All files used in this example are available for download from the GitHub project sas-visualanalytics-thirdpartyvisualizations, under the folder samples/IntegrationWithSASJobs.
It requires the CARS table to be loaded into memory in the Public caslib. The table is originally available in the SASHELP library.
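If the CARS table is not yet available there, one way to load it is sketched below (assuming you have write access to the Public caslib):
cas mySession;
proc casutil;
   load data=sashelp.cars outcaslib="public" casout="CARS" promote;   /* load and promote to global scope */
quit;
cas mySession terminate;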
Deployment steps:
Download the GitHub project
Unzip it into your Web Server document folder. These instructions assume you have unzipped the GitHub project in a folder called github in the Web Server document root folder. You will only need the content of folders github/utils and github/samples/IntegrationWithSASJobs.
While logged into the SAS Job Execution Web application, do the following:
a. Create a new job in a folder of your preference (e.g. /Public/Jobs/SAS Communities) and name it HelloCASWorld (see first article for more details if needed)
b. Open the job for editing and copy & paste the content of the file github/samples/IntegrationWithSASJobs/3.HelloCASWorld/HelloCASWorld.sas
c. Save the job
d. Add job parameter _action=FORM as discussed in the first article
e. Create a new job form, either as separate content or attached to the job (SAS Visual Analytics 8.5 or above only), as explained in the first article. If it's separate content, give it the same name as the job
f. Open the job form for editing and copy & paste the content of the file github/samples/IntegrationWithSASJobs/3.HelloCASWorld/HelloCASWorld.html (1)
g. Make changes to the host name (search for your.host.name and replace it accordingly) and to the path of the src attributes on lines 20, 21, and 22 (1):
Picture 17- Modify src appropriately
h. Save the job form
While logged into SAS Visual Analytics, do the following:
a. Create a new empty report
b. With the report open, hit Ctrl+Alt+B to bring up the SAS Visual Analytics Diagnostics window
c. With the BIRD tab selected across the top (that’s the default), replace the BIRD XML content with the content of the file github/samples/IntegrationWithSASJobs/3.HelloCASWorld/VA-DDC-Job Hello CAS World.xml
d. Hit Load (this will close the diagnostics window)
e. With the DDC object at the bottom of the report selected, go to the Options pane on the right and fix the URL entry: replace your.host.name accordingly and make sure the path to the job is correct (same value entered in #3a)
f. Make changes to the value assigned to the parameter _job_name if necessary, so it matches your environment (same value entered in #3a)
g. Save the report
Note: (1) Starting with release 2023.06, the examples that used to inherit jQuery from their parent (the SAS Visual Analytics web application) no longer work, so we have provided replacement code. These replacement files have the same names as the originals but end with .v4. Because of that, some references to line numbers may not exactly match in those .v4 files.
We could have exported all the content as a package, but that would require special privileges to import. Sharing the example as standalone files gives you the opportunity to better explore the SAS Job Execution Web application, familiarize yourself with the content, and understand how the pieces are connected.
Note: The first time a user opens this report, it fails because the CASUSER table doesn't exist yet. Reopening the report will work, as will any subsequent access to the report. This will happen with any report that depends on a table that is dynamically loaded into memory when a user opens the report for the first time.
Next Steps
This example showed how jobs can be used to load tables into memory, but in fact the output table can be stored anywhere, including external files, relational databases, or any other storage type your SAS environment has access to. You can also update and delete rows from existing tables, and you can do so using a SAS Visual Analytics report as the front end. There is an example on GitHub that updates tables, but it uses the SAS Viya REST APIs to do so directly from the DDC JavaScript code. Doing something similar using jobs is left to you as an exercise 🙂
In the next article you will apply the integration between SAS Visual Analytics, the Data-Driven Content object, and SAS jobs to a business use case: creating a Pareto chart that allows dynamic filtering on the cumulative total percentage. You will be looking at GDP by country and year, and the goal is to let you report on the short list of countries that are responsible for, say, 80% of the global GDP.
References
SAS Job Execution Web Application 2.2: User’s Guide documentation
JSON Libname Engine documentation
Reading data with the SAS JSON libname engine documentation
Data-Driven Content: leveraging third-party visualizations in SAS Visual Analytics (part 1 of 2) by SAS’s Renato Luppi
Data-Driven Content: leveraging third-party visualizations in SAS Visual Analytics (part 2 of 2) by SAS’s Renato Luppi
Using parameters with Data-Driven Content in SAS Visual Analytics by SAS’s Renato Luppi
sas-visualanalytics-thirdpartyvisualizations GitHub project and examples
Learn More…
Programming Considerations for Data-Driven Visualizations documentation
Customize data visualizations in SAS Visual Analytics with #D3Thursday by SAS's Ryan West
Leverage a Google 3D-PieChart within VA using the Data-Driven Content Object by SAS's Meera Venkataramani
Custom Visualizations in SAS Visual Analytics - A Strategy for Using the Data-Driven Content Object by SAS's Robby Powell
Use Custom Visualizations to Extend SAS® Visual Analytics by SAS's Robby Powell
Reading data with the SAS JSON libname engine by SAS’s Chris Hemedinger