Does Spark-SQL support writing to an XMLTYPE column of an Oracle table? To clarify, I mean a column in an Oracle table whose data type is XMLTYPE; I am not asking about reading/writing XML files in this question.
Using Spark-SQL I read from the Oracle table with the XMLTYPE column, then try to write it to another Oracle table with the same XMLTYPE column. The tables are defined like this:
CREATE TABLE src_clxml ( xml_unid numeric(19) NOT NULL, "XML" sys.xmltype NULL );
CREATE TABLE tgt_clxml ( xml_unid numeric(19) NOT NULL, "XML" sys.xmltype NULL );
The read is fine, no errors. I can print the XML column data with 'show' and it's all there. But when I try to write it to the Oracle target table I get this error:
java.sql.BatchUpdateException: ORA-01461: can bind a LONG value only for insert into a LONG column
If I change the target table's XMLTYPE column to CLOB, the write works. The problem is that the target column MUST stay XMLTYPE.
Is there some way to convert or cast the XMLTYPE column Spark has read, which Spark appears to interpret as a LONG when writing, to the XMLTYPE the database column expects?
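If there is no direct cast, I assume I could fall back to converting on the database side: keep writing to a staging table whose "XML" column is CLOB (which works, as noted above), then convert with Oracle's XMLTYPE() constructor in an INSERT ... SELECT. A rough sketch of that idea (the staging table tgt_clxml_stg is made up, and I have not verified this end to end):

import java.sql.DriverManager

val conn = DriverManager.getConnection("myurl", "myuser", "***")
try {
  val stmt = conn.createStatement()
  // XMLTYPE() builds the XML value inside Oracle itself,
  // so nothing is bound over JDBC as a LONG
  stmt.executeUpdate(
    """INSERT INTO tgt_clxml (xml_unid, "XML")
      |SELECT xml_unid, XMLTYPE("XML") FROM tgt_clxml_stg""".stripMargin)
} finally {
  conn.close()
}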
Is this a JDBC issue?
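From what I can tell, Spark's JDBC writer binds string columns with setString, which Oracle apparently rejects as a LONG bind once the value gets large, whereas plain JDBC has a dedicated XML bind through java.sql.SQLXML. A minimal standalone sketch of the bind I would expect Oracle to accept (untested in my environment, and I am not sure my Oracle driver version supports SQLXML; the payload is made up):

import java.sql.DriverManager

val conn = DriverManager.getConnection("myurl", "myuser", "***")
try {
  val ps = conn.prepareStatement(
    """INSERT INTO tgt_clxml (xml_unid, "XML") VALUES (?, ?)""")
  val sqlxml = conn.createSQLXML()       // JDBC 4.0 SQLXML holder
  sqlxml.setString("<doc>sample</doc>")  // made-up payload for the test
  ps.setBigDecimal(1, java.math.BigDecimal.ONE)
  ps.setSQLXML(2, sqlxml)                // XML bind instead of a string/LONG bind
  ps.executeUpdate()
} finally {
  conn.close()
}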
Here is the code I am using:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession

class ClXml(val sc: SparkContext, val ss: SparkSession, val args: Array[String]) {
  val s1   = args(1)
  val t1   = args(2)
  val sPpt = args(3)
  val tPpt = args(4)

  import ss.implicits.StringToColumn

  // Source is our own wrapper around the Oracle read
  val clxml = new Source(sc, ss, s1, sPpt)

  val clxmlDF = clxml.getDf
    .select(
      $"xml_unid".as("xml_unid"),
      $"guid".as("guid"),
      $"xml".as("xml"),
      $"permanent_id".as("permanent_id"),
      $"searchable".as("searchable"),
      $"mapped_id".cast("int").as("mapped_id")
    )

  clxmlDF.write
    .format("jdbc")
    .option("driver", "oracle.jdbc.driver.OracleDriver")
    .option("url", "myurl")
    .option("dbtable", "mytable")
    .option("user", "myuser")
    .option("password", "***")
    .mode("append")
    .save()
}
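If the built-in writer cannot be coerced into the right bind, I assume I could bypass it and insert per partition with a PreparedStatement that wraps the parameter in XMLTYPE() on the Oracle side. A sketch of that fallback, trimmed to the two columns from the DDL above and untested (it assumes clxmlDF and the implicits import from the class above are in scope):

import java.io.StringReader
import java.sql.DriverManager
import org.apache.spark.sql.Row

clxmlDF
  .select($"xml_unid", $"xml")
  .foreachPartition { (rows: Iterator[Row]) =>
    val conn = DriverManager.getConnection("myurl", "myuser", "***")
    val ps = conn.prepareStatement(
      """INSERT INTO tgt_clxml (xml_unid, "XML") VALUES (?, XMLTYPE(?))""")
    try {
      rows.foreach { row =>
        // xml_unid comes back from Oracle as a decimal, so bind it as one
        ps.setBigDecimal(1, row.getDecimal(0))
        val xml = row.getString(1)
        // stream the text so the driver binds character data, not a LONG
        ps.setCharacterStream(2, new StringReader(xml), xml.length)
        ps.addBatch()
      }
      ps.executeBatch()
    } finally {
      conn.close()
    }
  }

The idea would be that the conversion happens in the SQL text itself, so the driver only ever binds character data, never an XMLTYPE or LONG. Is this the right direction, or is there a way to make the built-in df.write path do it?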