
Publish SAS Models to Databricks


With the release of SAS Model Manager 2025.06, you can publish SAS models to Databricks on Azure and AWS with the click of a button. Behind the scenes, SAS Model Manager uses PROC ACCELERATOR to push the SAS models into the Spark database within Databricks. Pushing the modeling logic into the database reduces data movement, allowing models to execute faster on batch data. PROC ACCELERATOR does not require CAS and offers a framework for easily adding new publishing destinations in the future.

 

In this article, we will review how to configure Databricks for publishing, how to create the publishing destination, and how to publish a model from SAS Model Manager.

 

Configuring Databricks

 

If you’ve already configured your Databricks deployment for PROC ACCELERATOR, then good news! You’re nearly ready to publish SAS models from SAS Model Manager to Databricks. If not, the Databricks Deployment Steps will help you get started. (Note that in 2025.03, SAS changed the redistributed JDBC drivers used by SAS/ACCESS Interface to Spark. You can review this page to resolve any issues related to JDBC drivers.)

 

Creating the Publishing Destination

 

To create the publishing destination for SAS Model Manager, leverage the models plug-in to the SAS Viya CLI. Before creating the publishing destination, you must create a profile and sign in. Then run the following CLI command, replacing the placeholder values with your preferred names, the user or group that needs access, and the connection information for your Databricks deployment. The databaseConnection argument is optional.

 

sas-models-cli destination createDatabricks 
 --name nameForTheDestination 
 --credDomainID nameForTheDomain
 --identityId userOrGroupIDThatHasAccess
 --identityType selectUserOrGroup
 --databricksUserId token
 --databricksAccessToken databricksAccessToken
 --databaseHost databricksHost
 --databaseSchema model 
 --databaseConnection defaultStringColumnLength=1024
 --httpPath "JDBCorODBCpathForDatabricksCluster"
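Before running the command above, you need a CLI profile and an authenticated session. A minimal sketch using the SAS Viya CLI (the profile name and endpoint URL below are placeholders for your own deployment):

```shell
# Create a CLI profile and set the endpoint of your SAS Viya deployment
# (you will be prompted for the endpoint URL and output options)
sas-viya --profile myProfile profile init

# Sign in with your SAS Viya credentials before using the models plug-in
sas-viya --profile myProfile auth login
```

Once authenticated, the destination createDatabricks command runs under that profile.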

 

You can also use options like overwriteInputTable to copy the input data from CAS to the Spark library and keepOutputResult to leave the output in your Spark library after the publishing validation tests execute. After running the CLI command, Databricks will appear as an option in the drop-down menu of SAS Model Manager for users and/or groups with access.

 

Publishing from SAS Model Manager

 

Once Databricks has been configured and the publishing destination has been created, you just need to select your SAS models in SAS Model Manager, click the Publish button, and select the Databricks destination from the drop-down menu. Once published, you can run publishing validation from SAS Model Manager to score data inside Databricks.

 

 

The published model can also be called via SAS code using the RUNMODEL statement from PROC ACCELERATOR.
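As a rough illustration of what such a call might look like, here is a pseudocode sketch; the option names, libref, and table names are assumptions for illustration, not verified PROC ACCELERATOR syntax:

```sas
/* Sketch only: "databricksDest", "myModel", and the sparklib tables
   are placeholders, and the RUNMODEL options shown are assumptions. */
proc accelerator;
   runmodel destination="databricksDest"
            model="myModel"
            input=sparklib.input_table
            output=sparklib.scored_table;
run;
```

Consult the PROC ACCELERATOR documentation for the exact RUNMODEL statement syntax.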

 

Next Steps

 

What publishing destination would you like to see next for SAS Model Manager? Let us know in the comments!

