ChrisHemedinger
Community Manager

Importing your local data files to SAS

In this recent article, @NicolasRobert shares 3 ways to import your data files to SAS Viya on Microsoft Azure Marketplace:

  1. Use the Manage Data application to import data from your client machine.
  2. Use Upload Files in SAS Studio ("Develop code and flows") to make your local data available to your SAS programs (a quick example follows this list).
  3. Use scp (secure copy) to transfer files from your local environment to the "jumpbox" virtual machine that's configured with the SAS Viya on Microsoft Azure deployment.
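For example, once a file is uploaded with the second option, your program can read it like any other server-side file. Here is a minimal sketch (not from the article; the path and file name are placeholders for your own home directory and data):

/* Read a CSV file previously uploaded via SAS Studio's Upload Files */
proc import datafile="/home/myuserid/sales.csv"  /* placeholder path */
            out=work.sales
            dbms=csv
            replace;
run;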

For details, see the complete article here:

 SAS Viya on Microsoft Azure: Three Ways to Import your Data files 

 

Accessing your Data on ADLS Gen2

ADLS Gen2 is a low-cost object storage solution for the cloud, used for building enterprise data lakes on Azure. Microsoft customers use ADLS Gen2 for storing massive amounts of structured and unstructured data.

 

You are using ADLS Gen2 when you create a Storage Account and check “Enable hierarchical namespace”.

 

In this article, Nicolas describes how to connect SAS Viya to Microsoft Azure Data Lake Storage Gen2 (or simply ADLS2):

 SAS Viya on Microsoft Azure Marketplace: Accessing your Data on ADLS Gen2 
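The article walks through the full setup. As a rough sketch only (the account, file system, and application ID values below are placeholders, and your environment may require additional parameters such as a tenant ID), an ADLS caslib can look like this:

/* Start a CAS session and define a caslib pointing at ADLS Gen2         */
/* (values are placeholders; follow the article for your own deployment) */
cas mySession;

caslib adls datasource=(
   srctype="adls",
   accountname="mystorageaccount",   /* ADLS Gen2 storage account name */
   filesystem="mydatalake",          /* container (file system) name   */
   applicationid="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"  /* app registration ID */
);

/* List the files visible through the new caslib */
proc casutil;
   list files incaslib="adls";
quit;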

3 REPLIES
balbarka
Calcite | Level 5

@ChrisHemedinger I started writing ORC files to ADLS for use in Databricks via libname because the write performance through JDBC / ODBC was too slow. This approach is fast and, unlike JDBC / ODBC, doesn't require a running Databricks session to access the data. For SAS Viya on Azure Pay-Go, I discovered that I had to set azureauthcacheloc in the compute context's SAS options field to a new path before the Azure authentication JSON file would save properly.
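For reference, the path change itself can also be expressed in code; a minimal sketch, assuming the option is accepted at run time in your deployment (the path is the one used later in this thread):

/* Point the Azure authentication token cache at a writable location */
options azureauthcacheloc="/export/sas-viya/data";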

 

Can you advise how to set cas.AZUREAUTHCACHELOC so that it uses the same authentication file as the one in the azureauthcacheloc option path?

v/r, -Brad

ChrisHemedinger
Community Manager

Hi @balbarka,

 

I don't have expertise on this myself, but others have written articles that address it. See this one by @UttamKumar, and feel free to post a comment/question there for more help.

 

balbarka
Calcite | Level 5

Again, this is specific to a SAS Azure Pay-Go managed deployment, and I suspect it won't be the same (or even necessary) for other deployments. I ultimately solved this by setting the option for all SAS Job Execution compute contexts. You can navigate to the setting as follows:

  • Open the top-left corner menu of SAS Studio
  • Select ADMINISTRATION > Manage Environment
  • Select Contexts (the icon looks like a wrench in front of a database)
  • Change your view to Compute Contexts
  • Select the SAS Job Execution compute context
  • Choose Edit -> Advanced
  • Add the line below to the options section
-azureauthcacheloc "/export/sas-viya/data"
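To confirm the new value in a fresh session, a quick check (not part of the original steps) is:

/* Display the current value of the AZUREAUTHCACHELOC system option */
proc options option=azureauthcacheloc;
run;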

 

