05-16-2013 11:33 AM
We have a problem loading datasets with large text fields into SQL Server using DI Studio. When generating the tables in SQL Server we use the datatype varchar(max).
On the SAS library assigned to the output tables we set two options: the bulk-load option and a maximum text length (DBMAX_TEXT) of 32,767.
The SAS metadata reads the varchar(max) columns from SQL Server as $32767. So far, so good.
When the job runs, the following error occurs:
ERROR: Error binding parameters: [Microsoft][ODBC SQL Server Driver]Invalid field size for datatype
The combination of library options mentioned above triggers this error.
If we disable the bulk-load option, there is no problem.
If we set the maximum text length to 8,000, there is also no problem. (Why 8,000: this is the default size that SQL Server reserves for varchar(max); if necessary, SQL Server will add extra space.)
What can we do to load these large character fields with the bulk-load option?
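For reference, the library setup described above would look roughly like this. This is my reconstruction from the post, not a quote from it; the DSN and schema names are placeholders:

libname sqlout odbc dsn=MyDSN schema=dbo
        bulkload=yes       /* the bulk-load option that triggers the error */
        dbmax_text=32767;  /* fails in this combination; 8,000 works, per the post */

With these two options combined, writing a table with a varchar(max) column produces the binding error above; dropping either option lets the job run.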
11-20-2014 02:20 PM
I understand your pain. I do not use the bulk-load option, but this is how I download large fields from SQL Server.
For tables whose fields are under 32,767 characters I use the following macro, a pass-through query that downloads from a SQL database. For the field with 200,000 characters I wrote code to determine the number of slices required and then generated the required pass-through code to run. If anyone knows a better way I would love to hear it, because SAS support says the field has to be split. The way modern systems are built, this limitation is going to be an issue.
** macro to pull all the data from the table ;
%macro PullDataBig(outputdataset, tableName, DatasetLabel);
  proc sql;
    connect to odbc as ODBCConnection
      (noprompt="UID=&GLB_USER;PWD=&GLB_PASSWORD;DSN=&GLB_DSN_NAME;SERVER=&GLB_IP;DATABASE=&GLB_DB" dbmax_text=32767);
    create table &outputdataset (compress=binary label="&DatasetLabel") as
    select * from connection to ODBCConnection
      ( select * from &tableName );
    disconnect from ODBCConnection;
  quit;
%mend PullDataBig;
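The slicing approach mentioned above can be sketched as a pass-through query that pulls the long column in 32,767-character pieces, each landing in its own SAS variable (since a SAS character variable tops out at 32,767 bytes). The table and column names here are placeholders, not from the original post:

** sketch: pull a ~200,000-character column in 32,767-character slices ;
proc sql;
  connect to odbc as con
    (noprompt="UID=&GLB_USER;PWD=&GLB_PASSWORD;DSN=&GLB_DSN_NAME" dbmax_text=32767);
  create table work.bigfield as
  select * from connection to con (
    select id,
           substring(bigcol,     1, 32767) as bigcol_1,
           substring(bigcol, 32768, 32767) as bigcol_2,
           substring(bigcol, 65535, 32767) as bigcol_3
           /* ... continue for ceil(length/32767) slices ... */
    from dbo.bigtable
  );
  disconnect from con;
quit;

In practice you would generate the SUBSTRING lines with a macro loop driven by MAX(DATALENGTH(bigcol)), which is what the post's "determine the number of slices" code does.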
2 hours ago
Hi, did you manage to find a solution for this one? I'm currently having the same problem.
If I remove the bulk-load option, I guess it's going to take very long to insert the data when there are 100k+ or even millions of records.