| Setting | Value |
| --- | --- |
| String size | 7631 bytes |
| Database version | Oracle9i Enterprise Edition Release 9.2.0.8.0 |
| Oracle NLS_CHARACTERSET | WE8ISO8859P1 |
| JDBC driver version | Oracle Database 10g Release 2 (10.2.0.4) |
| Table column datatype | NCLOB |
Attempt 1: setString

In Oracle JDBC 10g, there is a new Connection property called SetBigStringTryClob which allows the statement's setString method to process strings greater than 32765 bytes.
```java
Properties props = new Properties();
props.put("user", "username");
props.put("password", "password");
props.put("SetBigStringTryClob", "true");
Connection conn = DriverManager.getConnection(dbUrl, props);
PreparedStatement st = conn.prepareStatement(INSERT_SQL);
st.setString(1, bigString);
st.executeUpdate();
```

This attempt failed: it inserted garbage (lots of inverted question marks and other funny characters) into my CLOB column.

Attempt 2: setStringForClob
The Oracle-specific method setStringForClob can be used for binding data greater than 32765 bytes. Note that this method is called internally if you call setString and have SetBigStringTryClob set to true (as in Attempt 1).
```java
OraclePreparedStatement st = (OraclePreparedStatement) conn.prepareStatement(INSERT_SQL);
st.setStringForClob(1, bigString);
```

This attempt failed with the same result as the previous one: it inserted garbage (lots of inverted question marks and other funny characters) into my CLOB column.

Attempt 3: setCLOB
Create a temporary Oracle CLOB, populate it, and call setClob.
```java
CLOB clob = CLOB.createTemporary(conn, true, oracle.sql.CLOB.DURATION_SESSION, Const.NCHAR);
clob.trim(0);
Writer writer = clob.getCharacterOutputStream();
writer.write(bigString.toCharArray());
writer.flush();
writer.close();
st.setClob(1, clob);
```

This attempt failed with:
ORA-12704: character set mismatch
Attempt 4: setCharacterStream

Use setCharacterStream to get a stream to write characters to the CLOB.

```java
Reader reader = new StringReader(bigString);
int readerLength = bigString.toCharArray().length;
st.setCharacterStream(1, reader, readerLength);
```

Failed: garbage inserted again!

Attempt 5: Insert an empty_clob() and then update it
Insert an empty_clob() into the table, retrieve the locator, and then write the data to the CLOB.
```java
String sql = "INSERT INTO clob_table (clob_col) " +
             "VALUES (empty_clob())";
PreparedStatement st = conn.prepareStatement(sql);
st.executeUpdate();

sql = "SELECT clob_col FROM clob_table FOR UPDATE";
st = conn.prepareStatement(sql);
ResultSet rs = st.executeQuery();
rs.next();
Clob clob = rs.getClob(1);
Writer writer = clob.setCharacterStream(0);
writer.write(bigString.toCharArray());
writer.flush();
writer.close();
rs.close();
```

Success! However, I'm not happy with the two database calls: one to create the empty CLOB and another to update it. There must be a better way!

Attempt 6: PL/SQL
Wrap the SQL insert statement in PL/SQL to work around the size limitation.
```java
String sql = "BEGIN INSERT INTO clob_table (clob_col) " +
             "VALUES (?); END;";
st = conn.prepareStatement(sql);
st.setString(1, bigString);
st.executeUpdate();
```

Success! And with only one database call!
Note that setString can only process strings of less than 32766 characters, so if your string is bigger than this, you should use the empty_clob() technique from Attempt 5.
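To make that cutoff concrete, here is a small, hypothetical helper (the class and method names are my own invention, not part of any Oracle API) that sketches the decision rule: use the PL/SQL wrapper from Attempt 6 while the string fits within setString's limit, and fall back to the empty_clob() technique from Attempt 5 otherwise.

```java
// Hypothetical sketch of the decision rule described above; the names
// ClobInsertStrategy and choose() are invented for illustration only.
public class ClobInsertStrategy {
    // setString can bind at most 32765 characters (per the post).
    static final int SETSTRING_LIMIT = 32765;

    /** Returns which technique applies for the given string. */
    public static String choose(String bigString) {
        return bigString.length() <= SETSTRING_LIMIT
                ? "plsql-setString"          // Attempt 6
                : "empty_clob-then-update";  // Attempt 5
    }
}
```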
Phew! After six attempts, I've finally found two which work. Why does this have to be so complicated?!
References:
- Oracle JDBC FAQ
- Oracle 10g JDBC API
- Using Oracle 10g drivers to solve the 4000 character limitation
- Handling CLOBs - Made easy with Oracle JDBC 10g
First off, Oracle is one of the most complex and advanced database management systems around; never assume anything is easy (especially BLOBs and CLOBs). Other than that, it sounds like you weren't setting the encoding properly for the database strings, which is why you saw junk characters; or rather, you saw your data converted to a different encoding. Search Google for some examples of how to set the character encoding (or recode your characters) on the input and output of BLOBs/CLOBs and you should find your answer.
Thanks for that. I thought that the JDBC driver transparently converted the character set so that the database server and Java client communicate in the same language? I shall Google for more information.
Nope, transferring BLOBs/CLOBs in JDBC isn't like transferring strings/numbers; the additional details have to be set. Think about it in terms of file types: are all the files on your hard drive text files? Sure, you could open them in a text viewer, but only some are meant to be read that way.
ReplyDeleteI tried using setAsciiStream, but that produced junk as well:
ReplyDeleteByteArrayInputStream bis = new ByteArrayInputStream(bigString.getBytes());
st.setAsciiStream(1, bis, bigString.getBytes().length);
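The junk characters seen in these attempts can be reproduced without a database: round-tripping text through a charset that cannot represent it silently replaces the unmappable characters with question marks. This is only a database-free sketch of the mechanism; it uses ISO-8859-1 as a stand-in for the database's WE8ISO8859P1 character set.

```java
import java.nio.charset.StandardCharsets;

public class CharsetJunkDemo {
    // String.getBytes(Charset) replaces unmappable characters with the
    // charset's replacement byte, which for ISO-8859-1 is '?'.
    public static String roundTrip(String s) {
        byte[] bytes = s.getBytes(StandardCharsets.ISO_8859_1); // lossy encode
        return new String(bytes, StandardCharsets.ISO_8859_1);  // decode back
    }

    public static void main(String[] args) {
        // The euro sign is not in ISO-8859-1, so it comes back as '?'.
        System.out.println(roundTrip("price: €100")); // prints "price: ?100"
    }
}
```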
Hi FS,
I have tested JDBC driver version Oracle Database 10g Release 2 (10.2.0.1.0) against Oracle Database 10g Release 2, and it seems that pstmt.setString() works for CLOB columns. Could it be that this property (SetBigStringTryClob) requires Oracle 10g R2 and above?
-Sriram
Hi-
Thanks for this post... it really helped. I wanted to let you know that you can call setStringForClob() instead of setString, which will handle strings greater than 32K.
Hi,
Thanks for the post. I tried setStringForClob from OraclePreparedStatement and it worked.
I have also tried reading a CLOB field from the database which has more than 32K characters with getString. That was a success, too.
Hi Fahd, the Attempt 6 solution works great. Thanks!