In a recent article, @NicolasRobert shares three ways to import your data files into SAS Viya on Microsoft Azure Marketplace. For details, see the complete article here:
SAS Viya on Microsoft Azure: Three Ways to Import your Data files
ADLS Gen2 is a low-cost object storage solution for the cloud, used for building enterprise data lakes on Azure. Microsoft customers use ADLS Gen2 to store massive amounts of structured and unstructured data.
You are using ADLS Gen2 when you create a Storage Account and check “Enable hierarchical namespace”.
In this article, Nicolas describes how to connect SAS Viya to Microsoft Azure Data Lake Storage Gen2 (or simply ADLS2):
SAS Viya on Microsoft Azure Marketplace: Accessing your Data on ADLS Gen2
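To give a flavor of what the article covers, a caslib over ADLS Gen2 typically looks something like the sketch below. The storage account, file system, and application ID are placeholders, and the exact data source options should be checked against the documentation for your SAS Viya release.

/* Sketch: caslib over an ADLS Gen2 file system (placeholder values) */
cas mysession;

caslib adls2 datasource=(
   srctype="adls",
   accountName="mystorageaccount",                         /* Storage Account name       */
   fileSystem="myfilesystem",                              /* hierarchical-namespace container */
   applicationId="11111111-2222-3333-4444-555555555555"    /* Azure AD app (client) ID   */
   ) subdirs;

/* Load a file from ADLS Gen2 into CAS memory */
proc casutil incaslib="adls2" outcaslib="adls2";
   load casdata="sales.csv" casout="sales" replace;
quit;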
@ChrisHemedinger - I started writing ORC to ADLS for use in Databricks using LIBNAME because the write performance through JDBC/ODBC was too slow. This approach is fast and doesn't require a running Databricks session to access the data, as JDBC/ODBC does. For SAS Viya on Azure Pay-Go, I discovered that I had to set azureauthcacheloc in the compute context's SAS options field to a new path before the Azure authentication JSON file would save properly.
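Roughly, the write side looks like this; the path and library name are placeholders, and I've omitted the ADLS connection options on the LIBNAME statement (see the ORC engine documentation for the exact option names):

/* ORC engine libname pointing at an ADLS Gen2-backed path          */
/* (add the ADLS connection options documented for the ORC engine)  */
libname adlsorc orc "/deltalake/raw";

/* Write a SAS table out as ORC files that Databricks can read      */
/* later, without a running Databricks session on our side          */
data adlsorc.sales_orc;
   set work.sales;
run;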
Can you advise how to set cas.AZUREAUTHCACHELOC so that it uses the same authentication file as the one in the azureauthcacheloc option's path?
v/r, -Brad
Hi @balbarka,
I don't have the expertise on this, but others have written articles that address it. See this one by @UttamKumar, and feel free to post a comment/question there for more help.
Again, this is specific to a SAS Azure Pay-Go managed deployment, and I suspect it won't be the same (or necessary) for other deployments. I ultimately solved this by setting the option below for all SAS Job Execution compute contexts; you can navigate to the setting through the compute context's SAS options field:
-azureauthcacheloc "/export/sas-viya/data"
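For a single session, the same location can also be set with an OPTIONS statement. As I understand it, cas.AZUREAUTHCACHELOC is the CAS-server counterpart and is set through the CAS configuration rather than in SAS code, but I haven't verified that on Pay-Go.

/* Per-session equivalent of the compute context setting above */
options azureauthcacheloc="/export/sas-viya/data";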