ORA-01795
sanjai_9@yahoo.com, March 28, 2002 - 5:53 pm UTC
Tom,
We have a SELECT statement in which we dynamically bind the values for t.e (please see the statement below). The values of t.e depend on certain conditions; using a FOR loop we dynamically bind the values in the IN clause.
But whenever the number of values exceeds 1000 we get this error. I read this page. Do we need to change our Java code to handle this, or is there a way to get around it?
SELECT t.a, t1.b , t.c, t.d
FROM t, t1
WHERE t.a = t1.a1
AND ( t.e IN (?, ?, ?, ?........ ) )
java.sql.SQLException: ORA-01795: maximum number of expressions in a list is 1000
Thanks a lot.
March 28, 2002 - 8:57 pm UTC
Well, as the message sort of implies -- the max number of elements in an in-list (whether you BIND them or HARD code them -- please bind them) is 1,000.
You cannot exceed that.
Suggest you create a global temporary table, array insert your "in list" into this table and use
select .... and t.e in ( select * from global_temp_table )
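As a sketch (the table and column names here are just illustrative):

create global temporary table in_list_gtt
( val number )
on commit delete rows;

-- from the application, array insert the in-list values, binding each one
insert into in_list_gtt ( val ) values ( :bind_val );

-- and the original query becomes
select t.a, t1.b, t.c, t.d
from t, t1
where t.a = t1.a1
and t.e in ( select val from in_list_gtt );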
Could you please tell me how I can handle this?
AdamWang, July 19, 2002 - 3:28 am UTC
Hi Tom,
I'm in trouble, because our CA product must use "select * from a where b in (1,2,3,.......)". Could you tell me another way to handle it?
July 19, 2002 - 7:28 am UTC
Look up cursor_sharing
search for it here, read about it in the docs, see chapter 10 in my Expert One on One.
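(For reference, that looks like:

alter session set cursor_sharing = force;

With cursor_sharing set to force, the database replaces literals with system-generated bind variables at parse time, so statements that differ only in their literal values can share a cursor. Note that the 1,000-element limit on the in-list itself still applies.)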
Another solution for java
Michael, July 22, 2002 - 7:12 am UTC
This works quite well.
First:
CREATE TYPE NUMBERTABLE AS TABLE OF NUMBER;
/
Then in JAVA:
import java.sql.*;
import oracle.sql.*;
import oracle.jdbc.*;
public class ArrayBindTest {
    public static void main(String[] args) throws SQLException {
        // build a test array with well over 1000 elements
        long longArray[] = new long[3000];
        for (int i = 0; i < longArray.length; i++) {
            longArray[i] = 4500 + i;
        }
        System.out.println("Starting ...");

        DriverManager.registerDriver(new OracleDriver());
        Connection conn = DriverManager.getConnection(
            "jdbc:oracle:thin:@e2hp11:1521:destest", "scott", "tiger");
        conn.setAutoCommit(false);

        // bind the whole array as a single collection parameter
        ArrayDescriptor desc = ArrayDescriptor.createDescriptor("NUMBERTABLE", conn);
        ARRAY array_to_pass = new ARRAY(desc, conn, longArray);

        OraclePreparedStatement pstat =
            (OraclePreparedStatement) conn.prepareStatement(
                "SELECT ENAME FROM EMP " +
                "WHERE EMPNO IN (SELECT * FROM TABLE (SELECT CAST(? AS NUMBERTABLE) FROM DUAL))");
        pstat.setARRAY(1, array_to_pass);

        OracleResultSet rset = (OracleResultSet) pstat.executeQuery();
        while (rset.next()) {
            System.out.println(rset.getString(1));
        }
        rset.close();
        pstat.close();
        conn.close();
        System.out.println("Finished ...");
    }
}
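(A side note for anyone reading this today: oracle.sql.ARRAY and ArrayDescriptor are deprecated in recent JDBC drivers; as far as I know, the replacement is to build the collection with OracleConnection.createOracleArray("NUMBERTABLE", longArray) and bind it with the standard setArray call. The SQL itself stays the same.)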
Modify the parameter
oracleo, January 02, 2008 - 12:24 am UTC
Is there any way that I can modify the value of 1000 to a greater value at the DB level?
Happy New Year!!
thanks
January 02, 2008 - 11:12 am UTC
no.
If you have 1,000 plus elements in your dynamic in list, you have made a mistake in design in my opinion (what end user would pick 1,000 discrete elements from a list box?!?!?)
Anyway, use a global temporary table
array insert your elements into it (using a bind variable of course)
and simply select * from t where c in ( select x from gtt );
very bind friendly
unlimited "in items"
In a database with only select privilege
Pradeep P, April 02, 2012 - 2:30 am UTC
Is there any way to do this in a production database where you have only "create session" and read access?
Do we need to export/import the table to another non-production database and continue with creating the GTT there?
April 03, 2012 - 6:16 am UTC
why not ask for permission to be able to create a global temporary table in that schema?
I cannot imagine exporting/importing to run a simple query, that would be "not smart"
explain to the production database people that you can either
a) hit their database with a massive query where you list out each and every value - impacting their shared pool negatively and so on
or
b) you could have a global temporary table set up for them so they do not do that.
maybe they'll agree and do the right thing, you never know.
IN vs. OR?
A reader, April 05, 2012 - 12:41 am UTC
Hi tom,
As we know, in () will eventually be handled as OR by Oracle. Is it done by the optimizer as a query rewrite?
Then why not write the 'in' query directly as an 'or' query, to save Oracle's time? Here you mentioned one issue:
the 'or' query might be very long and exhaust the shared pool easily. Besides this, any other objections?
April 06, 2012 - 9:47 am UTC
it is done as a matter of parsing the query, yes.
it would save no time, it is such a trivial operation.
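(You can see that transformation in any execution plan: an in-list such as item in ('ab','cd') shows up in the predicate section as access("ITEM"='ab' OR "ITEM"='cd') - the autotrace examples further down this page show exactly that.)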
Instead of IN (), what about 1000+ select-literal-from-dual-union statements in a subquery?
Alan, April 06, 2012 - 2:06 pm UTC
I can imagine a functional user sending a developer an ad hoc list of 1000+ key values where those records need to be updated somehow. I have seen developers write multiple UPDATE statements in groups of 1000 key values to handle the ad hoc update.
I sidestep the 1000-item limit by coding a CONCATENATE formula in the first row of a functional user's spreadsheet that builds a string like: "select 'cell_value' from dual union". Then: 1) cut-n-paste the formula into the remaining 1000+ spreadsheet rows; and 2) cut-n-paste all the CONCATENATE results into one big subquery.
Old way: IN (literal_1, literal_2, etc)
New way: IN (select literal_1 from dual union
select literal_2 from dual union
etc
)
How badly am I wearing out the parser? Is it really an issue if the SQL will only run once?
April 08, 2012 - 5:40 pm UTC
please use a global temporary table for this.
please do not union all 1000+ things - first - very very very bind "unfriendly"
second, - well, see #1
you'll run that sql once, but how many times do you do this "process"?
just insert the data into a global temporary table and do "where in (select * from gtt)"
I use a lot of IN statement and get ORA-00913
Alexander, September 22, 2013 - 5:34 pm UTC
I have a list with more than 1000 elements, so I wrote a function which partitions my list into chunks of 1000 elements, like: (propertyName IN (...) OR propertyName IN (...) OR propertyName IN (...)).
Dim goods As List(Of Integer) = New List(Of Integer)()
Dim rateValue As Integer = 100000
For i As Integer = 0 To rateValue
    goods.Add(i)
Next
Dim Sql As System.Text.StringBuilder = New System.Text.StringBuilder()
Sql.AppendLine("SELECT G.ID, G.NAME")
Sql.AppendLine("FROM GOODS G")
Sql.AppendFormat("WHERE {0}", GetSqlListPartition(goods, "G.ID"))
When I execute this query, the error "ORA-00913: too many values" appears.
But if I set rateValue = 10000 and generate the query, it executes OK. What is the problem here?
how to insert IN List into table
Durga, May 22, 2018 - 10:50 am UTC
Hi Tom,
I am trying to do the following:
In my code the IN statement will be dynamically created, and it may have more than 1050 values in it.
We want to handle these values without getting the error:
ora-01795 maximum number of expressions in a list is 1000
To insert these elements into a table, I tried to use the DBMS_UTILITY.COMMA_TO_TABLE procedure.
However, I faced issues as my values are all numbers and not varchar.
Can you suggest a way to insert these values into a GTT?
Also, to handle this error, this is what I tried:
example:
select * from employees
where emp_id in (select * from table(sys.odcinumberlist(1,2,3,4 ….1020 values)));
However, i get error
ORA-00939: too many arguments for function
Can you please suggest how should i proceed?
Thank you.
Sincerely,
Durga
May 23, 2018 - 7:21 am UTC
You're on the right track with the nested table type (a GTT would work just as well).
To overcome the 1000-entry limit, just build it up in chunks, eg
SQL> create or replace
2 package PKG is
3
4 procedure add(params sys.odcinumberlist);
5 procedure reset;
6 function vals return sys.odcinumberlist;
7 end;
8 /
Package created.
SQL>
SQL>
SQL> create or replace
2 package body PKG is
3 g sys.odcinumberlist;
4
5 procedure add(params sys.odcinumberlist) is
6 begin
7 for i in 1 .. params.count loop
8 g.extend;
9 g(g.count) := params(i);
10 end loop;
11 end;
12
13 procedure reset is
14 begin
15 g := sys.odcinumberlist();
16 end;
17
18 function vals return sys.odcinumberlist is
19 begin
20 return g;
21 end;
22
23 end;
24 /
Package body created.
SQL>
SQL>
SQL> exec pkg.reset;
PL/SQL procedure successfully completed.
SQL> exec pkg.add(sys.odcinumberlist(1,2,3,4,5,6,7,8,9));
PL/SQL procedure successfully completed.
SQL> exec pkg.add(sys.odcinumberlist(10,20,30,40,50,60,70,80,90));
PL/SQL procedure successfully completed.
SQL> exec pkg.add(sys.odcinumberlist(100,200,300,400,500,600,700,800,900));
PL/SQL procedure successfully completed.
SQL>
SQL> select *
2 from table(pkg.vals);
COLUMN_VALUE
------------
1
2
3
4
5
6
7
8
9
10
20
30
40
50
60
70
80
90
100
200
300
400
500
600
700
800
900
27 rows selected.
SQL>
Thanks for the prompt help and reply
Durga, May 24, 2018 - 5:31 am UTC
Thank you so much for responding to my question.
That was my first question on AskTom, even though I have been using this site for so many years :)
My input is
Durga, May 25, 2018 - 1:17 pm UTC
Hello Sir,
I have a question about the solution given above -
Part 1:
If I want to use the sys.odcinumberlist solution you have given me, how do I get the input to this package in the sys.odcinumberlist datatype?
How do I convert the values in my IN list into the sys.odcinumberlist datatype?
The input values which you have given as
exec pkg.add(sys.odcinumberlist(10,20,30,40,50,60,70,80,90));
In my code, this input list will be dynamically created as a VARCHAR string such as
'10,20,30,40,50,60,70,80,90'
along with single quotes
or
(10,20,30,40,50,60,70,80,90)
along with brackets
My input will be a VARCHAR
How can I modify my input so that pkg.add will be able to process it?
Please suggest.
Part 2:
I have a dynamic IN-list, and I have inserted that IN-list into a nested table using code given on AskTom, and it works perfectly.
I used this link -
https://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:110612348061
Original code:
'AND rpsm.PLAN_PLACE_DS_VEH_STATIC_ID IN ' || v_list
v_list is VARCHAR and has value "(1,2,3,4,5...........)"
without the double quotes
New code:
'AND rpsm.PLAN_PLACE_DS_VEH_STATIC_ID IN ' || '(select * from THE(select cast(in_list_tk(v_list) as mytableType ) from dual ) a)'
v_list is of type nested table of NUMBER
The select on nested table runs fine when ran independently
(select * from THE(select cast(in_list_tk(1,2,3,4,5) as mytableType ) from dual ) a)
but when we concatenate that select into the new code, it gives an "invalid number" error.
How to resolve that?
I have tried CAST and TO_CHAR - but didn't help
Please advise.
Thank you for your help and suggestions.
Best regards,
Durga
May 28, 2018 - 1:54 am UTC
I can see from your other followup that you have some code to parse the CSV. So rather than INSERT, you simply append to the datatype, eg
CREATE OR REPLACE PROCEDURE setup_in_list (p_in_list IN VARCHAR2) AS
  l_text    VARCHAR2(32767) := p_in_list || ',';
  l_idx     NUMBER;
  l_element VARCHAR2(32767);
  g         sys.odcinumberlist := sys.odcinumberlist();
BEGIN
  LOOP
    l_idx := INSTR(l_text, ',');
    EXIT WHEN NVL(l_idx, 0) = 0;
    l_element := TRIM(SUBSTR(l_text, 1, l_idx - 1));
    l_text := SUBSTR(l_text, l_idx + 1);
    g.extend;
    g(g.count) := l_element;
  END LOOP;
  pkg.add(g);  -- hand the parsed values over to the package collection from above
END;
/
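Assuming the PKG package from the earlier followup is in place, usage would then look something like:

exec pkg.reset
exec setup_in_list('10,20,30,40,50')

select * from table(pkg.vals);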
Overcome the 1000 list limitation
Dieter, May 28, 2018 - 9:49 am UTC
WHERE t.a = t1.a1
AND ( (1, t.e) IN ((1, ?), (1, ?), (1, ?), (1, ?)...[5000 more]..... (1,?) ) )
as soon as it is a tuple, the 1000-expression limitation does not apply.
But, as mentioned earlier, it is most certainly not a good idea to do it this way ...
Dieter
May 29, 2018 - 1:54 am UTC
Ugh :-)
In-list with more than 3000 parameters
A reader, June 07, 2019 - 12:29 pm UTC
Hi Connor,
With your kind inputs, I was able to write this code last year -
This procedure takes IN-list data as its input and inserts it into a GTT (IN_LIST_DATA).
The data in GTT IN_LIST_DATA is queried as a part of select query to return the elements of IN-list.
This procedure has been written to solve the issue -
ORA-01795: maximum number of expressions in a list is 1000
----------------------------------------------------
create or replace PROCEDURE setup_in_list (p_in_list IN VARCHAR2) AS
l_text VARCHAR2(32767) := p_in_list || ',';
l_idx NUMBER;
l_element VARCHAR2(32767);
BEGIN
-- p_in_list is input IN-list
-- deleting elements from GTT in_list_data
DELETE FROM in_list_data;
LOOP
l_idx := INSTR(l_text, ',');
EXIT WHEN NVL(l_idx, 0) = 0;
l_element := TRIM(SUBSTR(l_text, 1, l_idx - 1));
l_text := SUBSTR(l_text, l_idx + 1);
INSERT INTO in_list_data (
num_element)
VALUES (
to_number(trim(l_element)));
END LOOP;
END setup_in_list;
However, the elements I have in the input IN-list are static IDs which are 12 digits long. With a delimiter between them, that is 13 characters per number, so a VARCHAR2(32767) can hold at most about 32,767 / 13 ≈ 2,520 items in a list.
But my input IN-list has around 6000 records, so how can I change my existing setup_in_list to handle this larger input?
Please advise.
Sincerely,
Durga
June 11, 2019 - 5:43 am UTC
Pass it as a clob, and you're good to go
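For instance, here is a sketch of one way to adapt the earlier setup_in_list for a CLOB input (assuming the same IN_LIST_DATA global temporary table, and that individual elements stay well under 32k characters). Walking an offset with INSTR avoids repeatedly rebuilding the large CLOB:

create or replace procedure setup_in_list (p_in_list in clob) as
  l_start   pls_integer := 1;
  l_idx     pls_integer;
  l_element varchar2(32767);
begin
  -- same IN_LIST_DATA global temporary table as before
  delete from in_list_data;
  loop
    l_idx := instr(p_in_list, ',', l_start);
    if nvl(l_idx, 0) = 0 then
      -- last element has no trailing comma
      l_element := trim(substr(p_in_list, l_start));
    else
      l_element := trim(substr(p_in_list, l_start, l_idx - l_start));
    end if;
    if l_element is not null then
      insert into in_list_data (num_element)
      values (to_number(l_element));
    end if;
    exit when nvl(l_idx, 0) = 0;
    l_start := l_idx + 1;
  end loop;
end;
/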
Need good explanation for tuples
DMITRIY, January 06, 2020 - 12:19 pm UTC
Hello!
The question above was very poorly answered!
Q:
Overcome the 1000 list limitation
May 28, 2018 - 9:49 am UTC
Reviewer: Dieter from Germany
A:
Followup
May 29, 2018 - 1:54 am UTC
Ugh :-)
Please explain yourself. What does this "ugh" mean?
Also, give details about the optimal solution and maybe some performance comparisons.
There is also the problem of production DB permissions for creating global temporary tables and other objects.
Thank you.
January 07, 2020 - 3:18 am UTC
From the original answer, and repeated throughout the thread
"Please do not dyamically create a statement with hard wired IN list values or OR'ed expressions. That is the worst thing you can do to us. We'll spend more time parsing queries then actually executing them!!"
So hacking around it by using tuples might avoid your 1000 limit, but it still just crucifies you on parsing costs. As we said:
"Either use the static temp table or use one of the object tables or PLSQL tables as outlined in the link."
With versions 12.2+ using JSON
Stew Ashton, January 07, 2020 - 11:19 am UTC
With more recent versions (I am told 12.2 patched will work, and I know 18c and 19c work) JSON can be used to split CLOB strings into rows. No need for a GTT or a stored procedure. Performance seems excellent.
- The DATA and CLOB_DATA subqueries are there to create the CLOB.
- The solution is all in the main query.
- JSON_ARRAY does two things: it escapes any special characters and it encloses the string in '[' and ']'. The string becomes a JSON array with one element.
- The REPLACE turns each comma into an element delimiter, so now the JSON array contains many elements.
- Finally, JSON_TABLE unescapes any special characters and returns each element as a separate row.
- Here I am expecting numbers, so I define SUBS as a number and JSON_TABLE does the conversion for me. I add ERROR ON ERROR, otherwise JSON_TABLE will silently discard any non-numbers without telling me!
with data(list) as (
select listagg(level,',') within group(order by null)
from dual
connect by level <= 1000
)
, clob_data(clob_list) as (
select to_clob(list)
|| ',' || list || ',' || list || ',' || list || ',' || list
|| ',' || list || ',' || list || ',' || list || ',' || list
|| ',' || list || ',' || list || ',' || list || ',' || list
|| ',' || list || ',' || list || ',' || list || ',' || list
from data
)
select subs from clob_data, json_table(
replace(json_array(clob_list returning clob), ',', '","'),
'$[*]' columns (
subs number path '$'
error on error
)
);
Happy New Year and best regards,
Stew
January 07, 2020 - 2:19 pm UTC
Nice suggestion Stew, thanks for sharing.
Another way used by one of our vendors
Patrick McManus, January 07, 2020 - 2:05 pm UTC
One of our software vendors gets around the 1,000 entry limit for IN (..) expressions by using this construct, allowing for an arbitrarily large number of values:
WHERE t1.x IN (x0..x999)
OR t1.x IN (x1000...x1999)
OR t1.x IN (x2000..x2999)...
It certainly works; whether or not it is more efficient than a global temporary table is another matter. Given the number of concurrent users that would be generating INSERTs and DELETEs against a temporary table (with overlapping values), I doubt that approach would be practical.
January 07, 2020 - 2:21 pm UTC
Are those hard-coded values?!
"Concurrency" and GTTs???
Stew Ashton, January 07, 2020 - 7:01 pm UTC
"Given the number of concurrent users that would be generating INSERTs and DELETEs against a temporary table (with overlapping values), I doubt that approach would be practical."
What does that even mean?
Every user (= session) has his own temporary table data. Only the definition is global. There is no concurrency whatsoever.
Regards,
Stew
January 08, 2020 - 1:02 am UTC
Your guess is as good as mine :-)
Surprise - list of tuples, triples etc. has a limit over 1000
Edgar, June 17, 2020 - 1:14 pm UTC
June 17, 2020 - 1:41 pm UTC
"Magic" indeed!
Thanks for sharing.
mathguy, August 13, 2021 - 2:13 pm UTC
@Edgar (comment from June 17, 2020)
I tested, and the limit for an in-list of tuples is 65,535 - as you seem to suggest with your example. Same Oracle version as yours, 12.2.0.1. Please note that 65,535 is 2^16 - 1, not 2^64.
There is no mystery about this limit - see below. Oracle has the 1000 elements limit for IN lists for whatever reason (not very clear to me), but it does not enforce it for tuples of two or more elements. Just making atomic elements into 1-tuples by wrapping them in parentheses doesn't work though; the tuples must be n-tuples with n >= 2.
Regardless, though, IN (list) conditions are expanded as long OR-delimited compound conditions. And Oracle has a hard limit of 2^16 - 1 = 65,535 component conditions for such compound conditions - regardless of what those conditions are (whether they come from an IN list with tuples, or whether they are inequalities, or LIKE conditions, or anything else). Very likely the same limit applies to AND-delimited compound conditions.
SYS.ODCIVARCHAR2LIST
Hans, February 03, 2022 - 6:14 am UTC
According to my tests, replacing
WHERE x IN ('ab','cd') OR x IN ('ef','gh')
with
WHERE x IN (SELECT * FROM sys.odcivarchar2list('ab','cd') UNION ALL SELECT * FROM sys.odcivarchar2list('ef','gh'))
(assuming it is used after JOINs with big results and long lists) is much faster, but not as fast as creating a temporary table.
February 03, 2022 - 6:30 pm UTC
Care to share your complete test?
Code on GitHub
Hans, February 04, 2022 - 12:15 am UTC
February 04, 2022 - 5:39 am UTC
I think that is going to be decided very much on a case by case basis, and possibly a little due to luck, in terms of which is faster or slower. But generally, the optimizer is going to have an easier job costing a query which does not have table functions (nested tables, varrays etc) in it, eg
SQL> create table t as
2 with letter as (
3 select chr(ascii('a')+rownum-1) c from dual connect by level <= 26
4 )
5 select t1.c||t2.c item, rpad(rownum,100) data
6 from letter t1, letter t2,
7 ( select 1 from dual connect by level <= 100 )
8 /
Table created.
SQL>
SQL> create index ix on t ( item );
Index created.
SQL>
SQL> set autotrace traceonly explain
SQL> select * from t
2 where item in ('ab','cd') or item in ('ef','gh');
Execution Plan
----------------------------------------------------------
Plan hash value: 427062661
---------------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
---------------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 400 | 41600 | 12 (0)| 00:00:01 |
| 1 | INLIST ITERATOR | | | | | |
| 2 | TABLE ACCESS BY INDEX ROWID BATCHED| T | 400 | 41600 | 12 (0)| 00:00:01 |
|* 3 | INDEX RANGE SCAN | IX | 400 | | 5 (0)| 00:00:01 |
---------------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
3 - access("ITEM"='ab' OR "ITEM"='cd' OR "ITEM"='ef' OR "ITEM"='gh')
SQL>
SQL> select * from t
2 where item in (select * from sys.odcivarchar2list('ab','cd') union all select * from sys.odcivarchar2list('ef','gh') );
Execution Plan
----------------------------------------------------------
Plan hash value: 2435189465
--------------------------------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time |
--------------------------------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 1633K| 3280M| | 2286 (1)| 00:00:01 |
|* 1 | HASH JOIN | | 1633K| 3280M| 7664K| 2286 (1)| 00:00:01 |
| 2 | TABLE ACCESS FULL | T | 67600 | 6865K| | 287 (1)| 00:00:01 |
| 3 | VIEW | VW_NSO_1 | 16336 | 31M| | 58 (0)| 00:00:01 |
| 4 | HASH UNIQUE | | 16336 | 32672 | | 58 (0)| 00:00:01 |
| 5 | UNION-ALL | | | | | | |
| 6 | COLLECTION ITERATOR CONSTRUCTOR FETCH| | 8168 | 16336 | | 29 (0)| 00:00:01 |
| 7 | COLLECTION ITERATOR CONSTRUCTOR FETCH| | 8168 | 16336 | | 29 (0)| 00:00:01 |
--------------------------------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
1 - access("ITEM"="COLUMN_VALUE")
SQL> set autotrace traceonly stat
SQL> select * from t
2 where item in ('ab','cd') or item in ('ef','gh');
400 rows selected.
Statistics
----------------------------------------------------------
0 recursive calls
0 db block gets
70 consistent gets
4 physical reads
0 redo size
49310 bytes sent via SQL*Net to client
712 bytes received via SQL*Net from client
28 SQL*Net roundtrips to/from client
0 sorts (memory)
0 sorts (disk)
400 rows processed
SQL>
SQL> select * from t
2 where item in (select * from sys.odcivarchar2list('ab','cd') union all select * from sys.odcivarchar2list('ef','gh') );
400 rows selected.
Statistics
----------------------------------------------------------
0 recursive calls
0 db block gets
1029 consistent gets
0 physical reads
0 redo size
48039 bytes sent via SQL*Net to client
782 bytes received via SQL*Net from client
28 SQL*Net roundtrips to/from client
0 sorts (memory)
0 sorts (disk)
400 rows processed
SQL> set autotrace off
SQL>
CPU-Time vs. Wall-Time
Hans, February 04, 2022 - 10:21 pm UTC
Thanks again for your time and you patience.
First some clarification: the splitting of the lists and linking them with OR (in the case of WHERE IN) or UNION ALL (in the case of SYS.ODCIVARCHAR2LIST) is only necessary when the list is longer than 1000 items.
I think I found the discrepancy. You are measuring CPU time on the DB server, while I am measuring response times on the client side.
Under high load on the DB server the results will differ, but for a lightly loaded server with a user waiting for a response, SYS.ODCIVARCHAR2LIST may be faster.
February 07, 2022 - 2:45 am UTC
I should note I'm not discounting any method. As always, it's not about "what is always best" but what works best for any particular scenario.
How to resolve problem with 1000 limitatations
Saša Petković, February 05, 2022 - 8:41 am UTC
Hi Chriss,
Please pay attention to this example, it may help when reached limit of 1000 items into IN clause.
create table test_where_in (id) as
select level from dual connect by level <= 10000;
If we run the following:
select * from test_where_in where id in (1, 2, 3, 4, 5, 6, 7, 8,...1001);
we get an error:
ORA-01795: maximum number of expressions in a list is 1000
To prevent this, we can use the following trick:
select * from test_where_in
where (id, 'test') in (
(1, 'test'),
(2, 'test'),
(3, 'test'),
..
(10000, 'test')
);
I have tested with 10000 items in the IN clause and it works fine.
Hope this trick helps someone.
Best
Saša Petković
February 07, 2022 - 3:03 am UTC
I think this caps out at around 64k, but still a useful trick to have.
This will not use index
Narendra Gupta, February 05, 2022 - 8:56 am UTC
If I have an index on column id, it will not use that index.
February 07, 2022 - 2:52 am UTC
SQL> create table t ( x int primary key, y int );
Table created.
SQL>
SQL> insert into t
2 select rownum, rownum
3 from dual
4 connect by level <= 100;
100 rows created.
SQL>
SQL> exec dbms_stats.gather_table_stats('','T');
PL/SQL procedure successfully completed.
SQL>
SQL> set autotrace traceonly explain
SQL> select * from t
2 where (x, 'test') in (
3 (1, 'test'),
4 (2, 'test'),
5 (3, 'test')
6 );
Execution Plan
----------------------------------------------------------
Plan hash value: 3205741637
---------------------------------------------------------------------------------------------
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
---------------------------------------------------------------------------------------------
| 0 | SELECT STATEMENT | | 3 | 18 | 2 (0)| 00:00:01 |
| 1 | INLIST ITERATOR | | | | | |
| 2 | TABLE ACCESS BY INDEX ROWID| T | 3 | 18 | 2 (0)| 00:00:01 |
|* 3 | INDEX UNIQUE SCAN | SYS_C0066638 | 3 | | 1 (0)| 00:00:01 |
---------------------------------------------------------------------------------------------
Predicate Information (identified by operation id):
---------------------------------------------------
3 - access("X"=1 OR "X"=2 OR "X"=3)
SQL>
SQL> set autotrace off
CPU-Time on Oracle user
Hans, February 05, 2022 - 10:05 pm UTC
Hello again!
I created queries with a list of length 20000, with items of length 32.
Running this the first time (query caching) brings the CPU usage of the database server to 100% for a few seconds (a single process, "oracle_7806_xe", observed with Linux "top") when using plain "WHERE IN", but only to about 15% for a short time when using "SYS.ODCIVARCHAR2LIST".
The statistics look the other way round, like yours...
In the case where two tables are joined first, the differences are even greater.
February 07, 2022 - 3:13 am UTC
My concern is that once you start building giant SQL statements, the cost of parsing them can often outweigh everything else, eg in this simple example below I drop 0.5 seconds just parsing the SQL
SQL> create table t ( x int primary key, y int );
Table created.
SQL> insert into t
2 select rownum, rownum
3 from dual
4 connect by level <= 100;
100 rows created.
SQL> exec dbms_stats.gather_table_stats('','T');
PL/SQL procedure successfully completed.
SQL> create table t1 as select rownum z from dual connect by level <= 3000;
Table created.
select * from t
where (x, 0) in (
(1,0),
(2,0),
(3,0),
(4,0),
(5,0),
(6,0),
(7,0),
(8,0),
(9,0),
...
(2997,0),
(2998,0),
(2999,0),
(3000,0)
)
call count cpu elapsed disk query current rows
------- ------ -------- ---------- ---------- ---------- ---------- ----------
Parse 1 0.06 0.51 0 0 0 0
Execute 1 0.00 0.00 0 0 0 0
Fetch 8 0.00 0.00 0 13 0 100
------- ------ -------- ---------- ---------- ---------- ---------- ----------
total 10 0.06 0.51 0 13 0 100
select t.*
from t, t1
where t.x = t1.z
call count cpu elapsed disk query current rows
------- ------ -------- ---------- ---------- ---------- ---------- ----------
Parse 1 0.00 0.00 0 0 1 0
Execute 1 0.00 0.00 0 0 0 0
Fetch 8 0.00 0.00 5 123 0 100
------- ------ -------- ---------- ---------- ---------- ---------- ----------
total 10 0.00 0.00 5 123 1 100
Example SQLs
Hans, February 06, 2022 - 9:49 pm UTC
Hello,
for better testing, I wrote a Java program to generate the statements:
https://github.com/hans-helmut/wherein/tree/main/testsql
I could not upload the statement for creating the tables, as it is over 400 MB.
To get clear results, I had to join two tables first. All tests were run with 1000 statements, each containing a list of 200 items. The total runtimes were 16 vs 32 seconds; sqlplus needs 8 seconds, so assuming both processes wait for each other, the DB server needs 8 vs 24 seconds.
Looking at "top", the DB server needs 35% vs 70% CPU, for twice as long. So using SYS.ODCIVARCHAR2LIST is 3 to 4 times faster in this test.
Maybe someone can confirm these results. In my opinion the cost estimates are not correct (as far as I can understand them).