01-13-2016 12:13 PM
Running into an issue; maybe someone has experience with it. We're trying to insert XML text into an Oracle table column whose data type is XMLTYPE. Under the hood, Oracle stores this as a CLOB with special XML handling abilities. But DM Studio cannot handle this insertion. We got the error message below, which is extremely misleading because it is not a DATE issue. The table does have a date column, but we tested to be sure that column is not the problem:
[HY000] [DataFlux][ODBC Oracle Wire Protocol driver][Oracle]ORA-01483: invalid length for DATE or NUMBER bind variable (1483) [ODST38]
When we changed the type of the column from XMLTYPE to CLOB, the insertion worked. So the question is: can DM Studio handle XMLTYPE data?
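For reference, a minimal sketch of the column definition involved (table and column names here are invented for illustration, not our actual schema):

```sql
-- Hypothetical names. With the XMLTYPE column, the insert from
-- DM Studio failed with ORA-01483; after changing it to CLOB,
-- the same insert succeeded.
CREATE TABLE xml_docs (
    id      NUMBER,
    load_dt DATE,      -- present, but not the cause of the error
    doc     XMLTYPE    -- changed to CLOB as the workaround
);
```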
01-18-2016 03:30 AM
The driver documentation says the following:
The driver supports tables containing columns whose data type is specified as XMLType,
except those with binary or object relational storage.
Does either of those apply in your case?
02-09-2016 03:59 PM
Sorry for the very late reply -- it got crunchy around here for a few weeks. To answer your question: no, we do not use either of those storage options. It's a normal Oracle installation.
We're now inserting our XML into a CLOB column instead, but we're finding it is consistently truncated at 65,536 characters (64 KB).
09-16-2016 03:22 PM
I meant to get back and post what we found regarding this issue. I apologize to those who may have been waiting.
The issue is caused by a defect in the Data Management architecture (not the database drivers). This was acknowledged by SAS Support. Here's the problem: when you read from or write to an Oracle column of type CLOB (the underlying type of XMLTYPE), the data is truncated to 64 KB. That applies to both reading and writing. And it's a silent failure, which makes this a fairly significant issue.
After SAS confirmed the problem, the last we heard was that it is on the official defect list and that they understand it's a serious issue. We haven't heard any estimate of when a fix is anticipated.
Our workaround was to create a utility PL/SQL procedure that takes in a dozen or more parameters (we sized this on the largest value we could find in our data set). Most of these parameters are VARCHAR2 chunks of up to 64 KB. The procedure "assembles" the parameter values and writes the result to the table column we need. Obviously a hack, but we needed one given the circumstances -- and it only enables writing, not reading. We then call this procedure from the data job after first tokenizing the data into 64 KB pieces.
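As a rough illustration of the reassembly side of that workaround: the sketch below is a simplified, hypothetical version (all names are invented, and it is trimmed to four chunk parameters where the real procedure took a dozen or more).

```sql
-- Hypothetical sketch of the chunk-reassembly procedure.
-- The caller splits the document into ordered pieces and passes
-- them as p_chunk1, p_chunk2, ... in a single call.
CREATE OR REPLACE PROCEDURE write_big_xml (
    p_id     IN NUMBER,
    p_chunk1 IN VARCHAR2 DEFAULT NULL,
    p_chunk2 IN VARCHAR2 DEFAULT NULL,
    p_chunk3 IN VARCHAR2 DEFAULT NULL,
    p_chunk4 IN VARCHAR2 DEFAULT NULL
) AS
    v_doc CLOB;
BEGIN
    -- TO_CLOB on the first operand keeps the whole concatenation
    -- in CLOB territory; trailing NULL chunks append nothing.
    v_doc := TO_CLOB(p_chunk1) || p_chunk2 || p_chunk3 || p_chunk4;
    INSERT INTO xml_docs (id, doc)
    VALUES (p_id, XMLTYPE(v_doc));
END;
/
```

Because the truncation happens in the DM Studio data path, assembling the full document inside the database is what sidesteps the 64 KB limit; each individual bind value stays under it.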
Hope this helps. Good luck!