Hello, I've established a connection between SAS Data Integration Studio and Snowflake, and I'm able to load tables from SAS DI into Snowflake. However, I'd like to leverage Snowflake's computational power when running the jobs in SAS DI Studio. How can I configure the transformations to execute on Snowflake? For example, I can set "Database Pass-Through" to "Yes" and "Target Table is Pass-Through" to "Yes" in the "Set Operators" transformation options, which displays a red "S" for "Snowflake Database" in the transformation icon on the diagram. However, this approach doesn't work for the "Extract" transformation, and the Snowflake icon isn't displayed. Does anyone have a solution or insights on how to achieve this?
It would be interesting to see what actually happens when both the source and the target of the Extract transformation are in Snowflake. In the best case this results in an implicit SQL pass-through.
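A minimal sketch of that scenario: when source and target tables share the same Snowflake libref, SAS/ACCESS can often push a simple extract down as a single in-database CREATE TABLE ... AS SELECT. The server, database, schema, credentials, and table names below are placeholders, not values from this thread:

```sas
/* Hypothetical Snowflake libref -- all connection options are placeholders */
libname sf snow server="myaccount.snowflakecomputing.com"
        database=MYDB schema=PUBLIC warehouse=MYWH
        user=myuser password="****";

/* Source and target are in the same Snowflake libref, so SAS/ACCESS
   may generate a single CREATE TABLE ... AS SELECT that executes
   entirely inside Snowflake instead of pulling rows back to SAS */
proc sql;
   create table sf.customers_emea as
   select id, name, region
   from sf.customers
   where region = 'EMEA';
quit;
```

Whether the push-down actually happens depends on the functions and syntax used in the query; the trace options mentioned below will show it in the log.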
If you activate these options, the SAS log will show you how the SAS/ACCESS interface handles the query:
options msglevel=i sastrace=',,,d' sastraceloc=saslog nostsuffix;
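If the trace shows that implicit pass-through is not happening, a fallback is explicit SQL pass-through, where the query text is sent to Snowflake verbatim (for example from a User Written Code or SQL transformation in DI Studio). Again, all connection options and names below are hypothetical:

```sas
/* Explicit pass-through: the statement inside EXECUTE runs on Snowflake.
   Connection options and table names are placeholders. */
proc sql;
   connect to snowflake as sf
      (server="myaccount.snowflakecomputing.com" database=MYDB
       warehouse=MYWH user=myuser password="****");
   execute (
      create or replace table public.customers_emea as
      select id, name, region
      from public.customers
      where region = 'EMEA'
   ) by sf;
   disconnect from sf;
quit;
```

This guarantees in-database execution at the cost of bypassing DI Studio's generated code and impact analysis for that step.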
Another option would be to replace the Extract transformation with a Join transformation (which, funnily enough, doesn't actually require a join).