A bit more detail about loading data from SAS on your server into Cosmos DB in the Azure cloud through the CData ODBC driver...
1. Create an empty collection in Cosmos DB via the Azure portal (don't try to use SAS to do it!).
2. Use pass-through SQL to ODBC from SAS to insert a dummy record with dummy values in all columns/fields. This insert defines the column structure of the collection, so that proc append can match it in the next step.
3. Insert the rest of your data using proc append with the force option.
4. Remove your dummy record using pass-through SQL to ODBC.
Example:
1. Create an empty collection in Cosmos DB via the Azure portal (don't try to use SAS to do it!).
2.
proc sql;
   connect to odbc(dsn=AZURE_MLU);
   execute (
      insert into [Empty_Collection] (_id, name, age, address) values ('<DUMMYKEY>', 'dummy', 0, 'dummy')
   ) by ODBC;
   disconnect from odbc;
quit;
3.
LIBNAME bldocs ODBC DATASRC=AZURE_MLU;
proc append base=bldocs.Empty_Collection data=yourdata force;
run;
4.
proc sql;
   connect to odbc(dsn=AZURE_MLU);
   execute (
      delete from Empty_Collection where _id = '<DUMMYKEY>'
   ) by ODBC;
   disconnect from odbc;
quit;
Loading over the internet is not fast, so consider loading only changed records rather than doing full loads. As a reference point, we loaded 1,000 records with 11 columns in about one minute from our server.
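One way to load only the changes is an untested sketch along these lines. It assumes you keep a local snapshot of the previously loaded data in a dataset called work.prevload (a hypothetical name), and it only covers brand-new rows; rows whose key already exists in the collection would first need a pass-through delete as in step 4:

proc sql;
   /* rows present in yourdata but not in the previous snapshot */
   create table work.changes as
   select * from yourdata
   except
   select * from work.prevload;
quit;
proc append base=bldocs.Empty_Collection data=work.changes force;
run;

After a successful load, replace work.prevload with a copy of yourdata so the next run computes the delta against the current state.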