bl_jyskebank_dk
Obsidian | Level 7

We are experimenting with SAS libnames pointing to Cosmos DB in the Azure cloud. The connection works through ODBC (driver from CDATA). We can read data from tables in Cosmos DB, but creating new tables in Cosmos DB is harder.

 

A simple bulk load is not accepted by the SAS ODBC driver. Not even a simple data step, proc copy, or proc append can create a new table in Cosmos. We tried pass-through SQL: it creates a table in Cosmos, but we are unable to insert rows into it; because the database has no schema, the subsequent INSERT statements are rejected, as the number of columns does not match what Cosmos DB holds.
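Roughly, the failing pattern looks like this (a sketch only; the DSN, collection, and columns are placeholders):

proc sql;
   connect to odbc(dsn=AZURE_MLU);
   /* Creating the table itself succeeds... */
   execute (
      create table NewCollection (name varchar(50), age int)
   ) by odbc;
   /* ...but the insert is rejected: the schemaless collection does not
      match the column list the statement assumes */
   execute (
      insert into NewCollection (name, age) values ('test', 42)
   ) by odbc;
   disconnect from odbc;
quit;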

 

Does anyone have experience in simple loading of data from SAS to Cosmos DB in Azure?

1 ACCEPTED SOLUTION
bl_jyskebank_dk
Obsidian | Level 7

A little more research shows that calling CREATE TABLE via SQL pass-through, as described in the link below, creates a table with the specified fields and automatically adds an _id field. It also inserts a dummy record into the collection. You can then use proc append to add data to the table. Remember to omit the dummy record when selecting data from the collection. A bit spooky, but that's how a schemaless Cosmos works, I guess...

 

http://cdn.cdata.com/help/EHD/odbc/pg_createtable.htm
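A minimal sketch of that pass-through call, based on the CREATE TABLE syntax documented on the CDATA page above; the DSN, collection name, and column list are examples, not taken from a real setup:

proc sql;
   connect to odbc(dsn=AZURE_MLU);
   /* Per the post above: the driver creates the collection, adds an
      _id field, and inserts one dummy record */
   execute (
      create table People (name varchar(50), age int, address varchar(100))
   ) by odbc;
   disconnect from odbc;
quit;

libname bldocs odbc datasrc=AZURE_MLU;

/* FORCE lets the append proceed even when the SAS data set and the
   collection's schema map do not match exactly */
proc append base=bldocs.People data=work.yourdata force;
run;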

 

In Cosmos it's called a collection, not a table 🙂
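Omitting the dummy record when reading back could look like this, reusing the bldocs libname from the sketch above (how you recognize the dummy row depends on the dummy values in your collection):

proc sql;
   create table work.people as
      select *
      from bldocs.People
      where name ne 'dummy';  /* filter out the dummy record */
quit;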

 


9 REPLIES
NZ_Hockey_Mum
Calcite | Level 5
Thanks for posting this ... I am about to set up SAS connections to Cosmos DB.

I have had issues with bulk uploads to an Azure SQL database using ODBC; it worked with OLE DB.
bl_jyskebank_dk
Obsidian | Level 7

Did you get the OLE DB connection working alright?

Does it work with bulk loads?

And how is your experience with performance?

Our performance towards Azure is poor.

If we don't get that solved, the cloud is probably not an option for us, since we have a lot of data to transfer.

NZ_Hockey_Mum
Calcite | Level 5
We can download and upload quickly using OLE DB drivers between our SAS servers and the Azure SQL database.

Worked out that the ODBC drivers would insert a row at a time but not in bulk ... 13 hours for 7 columns and 350,000 records. Hence we tried the SAS OLE DB drivers.

I am waiting for IT to install the Cosmos drivers for the Azure NoSQL database. So that will be interesting. I only need to download from that database.

Good luck!
bl_jyskebank_dk
Obsidian | Level 7

A bit more detail about loading data from SAS on your server into Cosmos DB in the Azure cloud through the CDATA ODBC driver...

 

1. Create an empty collection in Cosmos DB via the Azure portal (don't try to use SAS to do it!).

2. Use SAS to insert a dummy record, with all columns/fields containing dummy values, using pass-through SQL to ODBC.

3. Insert the rest of your data using proc append with the force option.

4. Remove your dummy record using pass-through SQL to ODBC.

 

Example:

 

1. Create an empty collection in Cosmos DB via the Azure portal (don't try to use SAS to do it!).

 

2.

proc sql;
   connect to odbc(dsn=AZURE_MLU);
   execute (
      insert into [Empty_Collection] (_id, name, age, address)
      values ('<DUMMYKEY>', 'dummy', 0, 'dummy')
   ) by odbc;
quit;

 

3.

LIBNAME bldocs ODBC DATASRC=AZURE_MLU;

proc append base=bldocs.Empty_Collection data=yourdata force;
run;

 

4.

proc sql;
   connect to odbc(dsn=AZURE_MLU);
   execute (
      delete from Empty_Collection where _id = '<DUMMYKEY>'
   ) by odbc;
   disconnect from odbc;
quit;

 

It's not fast going through the internet, so consider loading only changed data rather than doing full loads. We see roughly 1,000 records with 11 columns per minute from our server.
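One way to load only changes, sketched under the assumption that the source data carries a modification timestamp (the MODIFIED_DT variable and the cutoff value are hypothetical):

%let lastload = '01JAN2019:00:00:00'dt;  /* hypothetical cutoff from the previous load */

data work.delta;
   set work.yourdata;
   where modified_dt > &lastload;  /* keep only rows changed since then */
run;

proc append base=bldocs.Empty_Collection data=work.delta force;
run;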

PascalG
Calcite | Level 5

Hi bl_jyskebank_dk,

 

Thank you for sharing your experience!

I tried recreating your example, but my log keeps telling me that the table into which I want to insert values is not found in the DB.

The libname works fine.

Which API did you choose when creating the Cosmos DB? I used Core (SQL) and did not have any luck with that, but it seemed the most fitting one for working with tabular data from SAS.

 

Thanks!

bl_jyskebank_dk
Obsidian | Level 7

Hi PascalG.

I created the Cosmos DB itself and the empty collection via the Azure Portal (the Azure website), not using SAS and proc sql or anything else in SAS. Plain Azure stuff.

Regards.

PascalG
Calcite | Level 5

Hi again,

 

I created an empty collection named "Test" in the Azure Portal and tried inserting values into it. I get told that a "Schema map entry" could not be found (see screenshots). Have you changed anything in the ODBC Schema Editor?

 

Best

Pascal

bl_jyskebank_dk
Obsidian | Level 7

Hi Pascal.

I did not get that error or anything like it.

It's been some time since I played around with it.

Until we get higher network speed, we will not pursue this solution.

Regards

