05-11-2017 03:06 PM
Dear SAS Community,
Does anyone have experience using Azure SQL Server to read in a sas7bdat file? I found an article saying R can do it as a plug-in to SQL Server. The reason I ask is that, using a DSN LIBNAME to Azure, putting the sashelp.class table (19 records, less than 50 KB) on Azure takes 33 seconds, and I need Azure to handle 90 GB of data transfers.
I would prefer a direct read without adding another tool. Thank you.
05-11-2017 04:21 PM
If you are using SAS/ACCESS to ODBC to interact with MS SQL Server on Azure, check out the BULKLOAD= option for your ODBC LIBNAME statement.
While you are at it, I would also recommend enabling SAS/ACCESS tracing/debugging via the following options:
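A minimal sketch of such a LIBNAME statement, assuming an ODBC DSN is already configured (the DSN name, credentials, and table name below are placeholders, not from this thread):

/* Hypothetical example: enable BCP-based bulk loading via the ODBC engine */
libname azdb odbc dsn=myAzureDSN user=myuser password=mypass bulkload=yes;
data azdb.class;   /* copies the rows in one bulk-load operation */
   set sashelp.class;
run;

Without BULKLOAD=YES, the engine sends row-by-row inserts, which is typically the cause of very slow loads like the 33 seconds you describe.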
/*Turn on SAS tracing */
options sastrace=',,t,' sastraceloc=saslog nostsuffix;
For additional details and usage examples, check out the SGF paper "SASTRACE: Your Key to RDBMS Empowerment".
Hope this helps.
05-15-2017 07:01 PM
The issue is I get the message below. I hadn't considered using SASTRACE. Thank you for reminding me of it.
ERROR: BCP initialize error: [Microsoft][ODBC Driver 13 for SQL Server]Connection is not
enabled for BCP
Does anyone know how to enable BCP for Azure?
05-12-2017 07:15 AM
R Services are only available for on-premise SQL Server. I don't think Azure SQL Database supports R code and R packages yet.
If you plan to move that much data into the cloud, you should also consider other scenarios, such as exporting your data to CSV files and shipping the data on a hard disk to your Azure datacenter. You can process the data much faster once it is already located in the cloud.
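The CSV export step above can be sketched with PROC EXPORT (the library, dataset, and output path here are placeholders, assuming one CSV per table):

/* Hypothetical example: dump a SAS dataset to CSV for offline transfer */
proc export data=mylib.bigtable
    outfile='/export/bigtable.csv'
    dbms=csv
    replace;
run;

For 90 GB you would repeat this per table (or drive it with a macro over the dataset names) and then load the CSVs on the Azure side.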
05-12-2017 09:09 AM
Yeah, you are pushing a rather large table into the DB. Here are some options that could make it a little faster:
libname x ...... insertbuff=10000 bulkload=yes dbicommit ;
05-15-2017 06:58 PM
Thank you. DBIcommit isn't an option in SAS 9.2. The system option
options dbidirectexec;
seems to have sped up the process, but not by enough to use it.
I am going with the BCP load option available in SQL Server.