js5
Pyrite | Level 9

Since we did not have SAS/ACCESS to Spark licensed, I ended up implementing it myself, and performance has been great by comparison. In brief (a rough sketch of the same steps follows the list):

  1. create an empty target table using implicit SQL passthrough
  2. export the data to CSV, formatting datetime and date values to the default Spark formats
  3. upload the CSV to S3
  4. run a COPY INTO statement on the Databricks warehouse using explicit SQL passthrough
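
Roughly, the pipeline looks like the sketch below. This is only an outline, not our exact code: the libref, DSN, schema, table, bucket and staging paths are placeholders, the AWS CLI is assumed to be available on the SAS server, and the date/datetime formats may need adjusting to match what your Spark side expects.

/* 1. Create an empty target table on the warehouse (structure only, obs=0). */
libname dbx odbc dsn='Databricks_WH' schema='my_schema';   /* placeholder DSN and schema */

proc sql;
    create table dbx.sales_target as
    select * from work.sales(obs=0);
quit;

/* 2. Write the data to CSV with ISO-style date/datetime values. */
data work.sales_fmt;
    set work.sales;
    format order_date yymmdd10. load_ts e8601dt19.;
run;

proc export data=work.sales_fmt
    outfile='/staging/sales.csv'
    dbms=csv replace;
run;

/* 3. Push the CSV to S3, here via the AWS CLI called from a pipe. */
filename awscmd pipe "aws s3 cp /staging/sales.csv s3://my-bucket/staging/sales.csv";
data _null_;
    infile awscmd;
    input;
    put _infile_;
run;

/* 4. Load the file into the warehouse table with explicit passthrough.      */
/*    S3 access from Databricks (instance profile or credentials) is assumed */
/*    to be configured already.                                               */
proc sql;
    connect using dbx;
    execute (
        COPY INTO my_schema.sales_target
        FROM 's3://my-bucket/staging/sales.csv'
        FILEFORMAT = CSV
        FORMAT_OPTIONS ('header' = 'true')
    ) by dbx;
    disconnect from dbx;
quit;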

 

belgeric
Calcite | Level 5

Hi,

 

I see the thread is recent, so just in case:

If you don't have SAS 9.4M9 _and_ the Spark connector, I would use a similar approach, but there is a possible improvement: use the Parquet format.

And yes, you can use Parquet even with old SAS versions, via DuckDB -- look up DuckDB + ODBC.

 

Once you have set up the ODBC DSN, you can use:

proc sql;
    connect to odbc(dsn='DuckDB_Parquet');
    /* This command tells DuckDB to write the SAS-linked table directly to a Parquet file */
    execute (
        COPY (SELECT * FROM main.final_output)
        TO 'C:\exports\data_output.parquet' (FORMAT 'PARQUET')
    ) by odbc;
    disconnect from odbc;
quit;
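
For completeness, main.final_output has to exist inside DuckDB before the COPY runs. One way to get a SAS table there, assuming the DuckDB ODBC DSN accepts writes from SAS (driver support for inserts varies) and using illustrative libref and dataset names:

libname duck odbc dsn='DuckDB_Parquet' schema=main;

/* Copy the SAS dataset into DuckDB through the ODBC libname */
data duck.final_output;
    set work.final_output;
run;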
