Limiting SQL Arguments in Oracle

There seems to be a limit of 1000 expressions in an IN list in Oracle SQL (it raises ORA-01795 beyond that). I came across this when generating queries such as:

 select * from orders where user_id IN (<list of more than 1000 ids>) 

My workaround is to create a temporary table, insert the user IDs into it first, and join against it, rather than issuing a JDBC query with a giant list of parameters in the IN clause.

Does anyone know of a simpler workaround? Since we use Hibernate, I wonder if it can handle such a workaround transparently.

+7
sql oracle hibernate
4 answers

An alternative approach would be to pass an array to the database and use the TABLE() function in the IN clause. This will probably perform better than a temporary table, and it will certainly be more efficient than running multiple queries. But you will need to watch PGA memory usage if you have a large number of sessions doing this. Also, I'm not sure how easy it will be to wire into Hibernate.

Note: TABLE() functions run in the SQL engine, so the collection type must be declared as a SQL type:

 create or replace type tags_nt as table of varchar2(10);
 /

The following example populates an array with a couple of thousand random tags, then uses the array in the IN clause of a query.

 declare
     search_tags tags_nt;
     n pls_integer;
 begin
     select name
       bulk collect into search_tags
       from ( select name
                from temp_tags
               order by dbms_random.value )
      where rownum <= 2000;

     select count(*) into n
       from big_table
      where name in ( select * from table (search_tags) );

     dbms_output.put_line('tags match '||n||' rows!');
 end;
 /
+4

As long as the temporary table is a global temporary table (i.e. its contents are visible only to the session), this is the recommended way to do it (and I would go that route for anything more than a dozen arguments, let alone thousands).

I am wondering where and how you are building a list of 1000 arguments. If this is a semi-permanent grouping (for example, all employees based in a certain location), then that grouping should live in the database and the join should be done there. Databases are designed and built to perform joins very quickly, much faster than pulling a bunch of identifiers back to the middle tier and then sending them back to the database.

 select * from orders where user_id in (select user_id from users where location = :loc) 
+3

You can add additional predicates to break the list into chunks of at most 1000:

 select * from orders
  where user_id IN (<first batch of 1000>)
     OR user_id IN (<second batch of 1000>)
     OR user_id IN (...)
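A rough Java sketch of building such a batched predicate (the class and method names are hypothetical; real code should use bind variables rather than concatenating values, this only illustrates the chunking):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class InClauseBatcher {

    // Build "col IN (...) OR col IN (...)" with at most batchSize
    // ids per IN list, to stay under Oracle's 1000-expression limit.
    static String buildInPredicate(String column, List<Long> ids, int batchSize) {
        List<String> batches = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += batchSize) {
            String batch = ids.subList(i, Math.min(i + batchSize, ids.size()))
                              .stream()
                              .map(String::valueOf)
                              .collect(Collectors.joining(", "));
            batches.add(column + " IN (" + batch + ")");
        }
        return String.join(" OR ", batches);
    }

    public static void main(String[] args) {
        List<Long> ids = new ArrayList<>();
        for (long i = 1; i <= 2500; i++) ids.add(i);
        // 2500 ids produce three IN lists joined with OR
        String sql = "select * from orders where "
                   + buildInPredicate("user_id", ids, 1000);
        System.out.println(sql.length() + " chars, "
                   + sql.split(" OR ").length + " batches");
    }
}
```

Note that a query rewritten this way loses the ability to use a single bind-variable plan, so it is best reserved for ad-hoc ID lists.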
+3

I agree with the earlier comments: if these identifiers are in your database, use joins or correlated subqueries. However, if your list of identifiers comes from another source, for example a SOLR result, you can avoid the temp table by issuing several queries, each with no more than 1000 identifiers, and then merging the query results in memory. If you put the original identifiers into a unique collection, such as a hash set, you can pull out 1000 identifiers at a time.
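A minimal Java sketch of this chunk-and-merge approach (the `fetch` callback is a hypothetical stand-in for whatever JDBC or Hibernate query you run per chunk):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import java.util.function.Function;

public class ChunkedFetcher {

    // Split the unique ids into chunks of at most chunkSize, run
    // fetch once per chunk, and merge the results in memory.
    static <I, R> List<R> fetchInChunks(Set<I> ids, int chunkSize,
                                        Function<List<I>, List<R>> fetch) {
        List<I> ordered = new ArrayList<>(ids);
        List<R> merged = new ArrayList<>();
        for (int i = 0; i < ordered.size(); i += chunkSize) {
            List<I> chunk = ordered.subList(i, Math.min(i + chunkSize, ordered.size()));
            merged.addAll(fetch.apply(chunk)); // e.g. one query per chunk
        }
        return merged;
    }

    public static void main(String[] args) {
        Set<Long> ids = new LinkedHashSet<>();
        for (long i = 0; i < 2300; i++) ids.add(i);
        // Stand-in for a real query: just echoes each chunk's size.
        List<Integer> sizes = fetchInChunks(ids, 1000, chunk -> List.of(chunk.size()));
        System.out.println(sizes); // [1000, 1000, 300]
    }
}
```

Using a set up front deduplicates the IDs, so no row is fetched (or merged) twice.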

+1
