Oracle JDBC and 4000 char limit

We are trying to save a UTF-16 encoded string to an Oracle database whose character set is AL32UTF8.

Our program works fine against a database that uses WE8MSWIN1252 as its character set. When we run it against a database that uses AL32UTF8, it fails with java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column.

In the test file below, everything works fine until our input gets too long.

The input string can exceed 4000 characters. We want to keep as much of it as possible, although we understand that it will have to be truncated.

Our table columns are declared with CHAR length semantics (see below). We hoped this would allow us to store up to 4000 characters regardless of character set. Can this be done? If so, how?

We tried converting the String to UTF-8 with a ByteBuffer, without success. OraclePreparedStatement.setFormOfUse(...) did not help either.

Switching to CLOB is not an option. If the string is too long, it simply needs to be trimmed.

This is our code at the moment:

    public static void main(String[] args) throws Exception {
        String ip = "193.53.40.229";
        int port = 1521;
        String sid = "ora11";
        String username = "obasi";
        String password = "********";
        String driver = "oracle.jdbc.driver.OracleDriver";
        String url = "jdbc:oracle:thin:@" + ip + ":" + port + ":" + sid;
        Class.forName(driver);

        String shortData = "";
        String longData = "";
        String data;
        for (int i = 0; i < 5; i++) shortData += "é";
        for (int i = 0; i < 4000; i++) longData += "é";

        Connection conn = DriverManager.getConnection(url, username, password);
        PreparedStatement stat = null;
        try {
            stat = conn.prepareStatement("insert into test_table_short values (?)");
            data = shortData.substring(0, Math.min(5, shortData.length()));
            stat.setString(1, data);
            stat.execute();

            stat = conn.prepareStatement("insert into test_table_long values (?)");
            data = longData.substring(0, Math.min(4000, longData.length()));
            stat.setString(1, data);
            stat.execute();
        } finally {
            try { stat.close(); } catch (Exception ex) {}
        }
    }

This is the script that creates the test tables:

    CREATE TABLE test_table_short (
        DATA VARCHAR2(5 CHAR)
    );

    CREATE TABLE test_table_long (
        DATA VARCHAR2(4000 CHAR)
    );

The test works fine with the short data. With the long data, however, we keep getting the error. Even when longData is only 3000 characters long, it still fails.

Thanks in advance!

2 answers

Prior to Oracle 12.1, a VARCHAR2 column is limited to storing 4000 bytes of data in the database character set, even if it is declared VARCHAR2(4000 CHAR). Since each character in your string requires 2 bytes of storage in the AL32UTF8 (UTF-8) character set, you cannot store more than 2000 of them in the column. That number changes, of course, if some of your characters need only 1 byte of storage or if some need more than 2 bytes. When the database character set is Windows-1252, every character in your string needs only one byte, so you can store all 4000 characters in the column.
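To see the arithmetic concretely, here is a small sketch (not part of the original answer) that prints the byte length of the question's 4000-character test string in both character sets; 'é' needs 1 byte in Windows-1252 but 2 bytes in UTF-8:

    import java.nio.charset.Charset;
    import java.nio.charset.StandardCharsets;

    public class ByteLengthDemo {
        public static void main(String[] args) {
            // Same test string as in the question: 4000 copies of 'é'
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < 4000; i++) sb.append("é");
            String longData = sb.toString();

            // 4000 bytes in Windows-1252 -> fits a pre-12.1 VARCHAR2(4000 CHAR)
            System.out.println(longData.getBytes(Charset.forName("windows-1252")).length);

            // 8000 bytes in UTF-8 -> exceeds the 4000-byte limit, hence ORA-01461
            System.out.println(longData.getBytes(StandardCharsets.UTF_8).length);
        }
    }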

Since you have longer strings, could you declare the column as a CLOB rather than a VARCHAR2? That effectively removes the length limit (there is a cap on CLOB size, which depends on the Oracle version and block size, but it is at least in the multi-gigabyte range).
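For completeness, an insert into a CLOB column can be bound as a character stream so the VARCHAR2 bind limit never comes into play. This is only a sketch; the table test_table_clob and its single CLOB column are assumed here and do not appear in the question:

    import java.io.StringReader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class ClobInsert {
        // Assumed table: CREATE TABLE test_table_clob ( DATA CLOB );
        static void insertLongText(Connection conn, String longData) throws Exception {
            PreparedStatement stat = conn.prepareStatement("insert into test_table_clob values (?)");
            try {
                // Binding a Reader avoids the 4000-byte limit that applies to setString on a VARCHAR2 column
                stat.setCharacterStream(1, new StringReader(longData), longData.length());
                stat.execute();
            } finally {
                stat.close();
            }
        }
    }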

If you are using Oracle 12.1 or later, the max_string_size parameter lets you increase the maximum size of a VARCHAR2 column from 4000 bytes to 32767 bytes. (Switching it to EXTENDED is a one-way change that has to be done with the database started in UPGRADE mode, followed by running utl32k.sql.)


The problem was solved by trimming the string to the required byte length. Note that this cannot be done simply with

    data.substring(0, length)

as that trims by character count, and the resulting string's UTF-8 encoding can still be up to three times as many bytes as allowed.

    while (data.getBytes("UTF-8").length > length) {
        data = data.substring(0, data.length() - 1);
    }

Note: do not use data.getBytes() without a charset argument, since the result depends on the platform default encoding (the file.encoding setting) and may be Windows-1252 bytes or UTF-8 bytes depending on the environment.
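If the strings are very long, calling getBytes on every loop iteration re-encodes the whole value each time. A single-pass alternative with a CharsetEncoder is sketched below; it is not part of the original answer, and truncateUtf8 is just a name chosen here:

    import java.nio.ByteBuffer;
    import java.nio.CharBuffer;
    import java.nio.charset.CharsetEncoder;
    import java.nio.charset.StandardCharsets;

    public class Utf8Truncate {
        // Trims s so that its UTF-8 encoding fits into maxBytes,
        // without cutting a multi-byte character in half.
        static String truncateUtf8(String s, int maxBytes) {
            CharsetEncoder encoder = StandardCharsets.UTF_8.newEncoder();
            ByteBuffer out = ByteBuffer.allocate(maxBytes);
            CharBuffer in = CharBuffer.wrap(s);
            encoder.encode(in, out, true); // stops after the last character that fully fits
            return s.substring(0, in.position());
        }
    }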

If you use Hibernate, you can do this trimming in an org.hibernate.Interceptor.
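A rough sketch of that idea, assuming Hibernate 5's EmptyInterceptor base class (TruncatingInterceptor, the 4000-byte limit and the helper method are assumptions made here, not part of the original answer):

    import java.io.Serializable;
    import org.hibernate.EmptyInterceptor;
    import org.hibernate.type.Type;

    public class TruncatingInterceptor extends EmptyInterceptor {
        private static final int MAX_BYTES = 4000; // pre-12.1 VARCHAR2 byte limit

        @Override
        public boolean onSave(Object entity, Serializable id, Object[] state,
                              String[] propertyNames, Type[] types) {
            boolean changed = false;
            for (int i = 0; i < state.length; i++) {
                if (state[i] instanceof String) {
                    String trimmed = truncateUtf8((String) state[i], MAX_BYTES);
                    if (!trimmed.equals(state[i])) {
                        state[i] = trimmed;
                        changed = true;
                    }
                }
            }
            return changed; // true tells Hibernate the state array was modified
        }

        private static String truncateUtf8(String s, int maxBytes) {
            // Same single-pass trim as the sketch above, repeated so this class compiles on its own
            java.nio.charset.CharsetEncoder encoder = java.nio.charset.StandardCharsets.UTF_8.newEncoder();
            java.nio.ByteBuffer out = java.nio.ByteBuffer.allocate(maxBytes);
            java.nio.CharBuffer in = java.nio.CharBuffer.wrap(s);
            encoder.encode(in, out, true);
            return s.substring(0, in.position());
        }
    }

The interceptor still has to be registered on the SessionFactory or Session; how that is done depends on the Hibernate version in use.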

