I am facing a problem migrating data from one table to another table that contains a LONG datatype. The trigger works well for the other fields, but the LONG column behaves differently: it does not insert any image data for that column, and at the same time it shows no error message. The source and target tables have the same structure. The structure and trigger follow.
SQL> desc images
 Name                            Null?    Type
 ------------------------------- -------- ----
 IMAGE_ID                        NOT NULL NUMBER
 IMAGE_FILEPATH                  NOT NULL NUMBER
 IMAGE_FILENAME                  NOT NULL VARCHAR2(12)
 IMAGE_FILETYPE                  NOT NULL VARCHAR2(10)
 IMAGE_DATA                               LONG RAW
Trigger
-------
CREATE OR REPLACE TRIGGER images_testing
after INSERT OR UPDATE
ON IMAGES
FOR each ROW
BEGIN
INSERT INTO TEST_IMAGES
(image_id,
image_filepath,
image_filename,
image_filetype,
image_data)
VALUES
(:NEW.image_id,
:NEW.image_filepath,
:NEW.image_filename,
:NEW.image_filetype,
:NEW.image_data);
END;
Please help me sort out this problem.
Thanks
Sudha
From the Application Developer's Guide:
http://technet.oracle.com/doc/server73x/ADG73/ch9.htm#toc093
<quote>
<b>
LONG and LONG RAW Datatypes
</b>
LONG and LONG RAW datatypes in triggers are subject to the following restrictions:
o A SQL statement within a trigger can insert data into a column of LONG or LONG RAW datatype.
o If data from a LONG or LONG RAW column can be converted to a constrained datatype (such as CHAR and VARCHAR2), a LONG or LONG RAW column can be referenced in a SQL statement within a trigger. Note that the maximum length for these datatypes is 32000 bytes.
o Variables cannot be declared using the LONG or LONG RAW datatypes.
o :NEW and :OLD cannot be used with LONG or LONG RAW columns.
</quote>
So, you cannot use your approach to copy a LONG RAW from one table to another. In fact, if the LONG RAW exceeds 32k, you simply <b>will not be able to do this via a trigger at all</b>.
What you can do, if the LONG RAW is less than 32k, is use an AFTER statement trigger (not FOR EACH ROW) to process the copy. A FOR EACH ROW trigger captures the primary keys of the newly inserted or updated rows, and the AFTER statement trigger then reads each LONG RAW out into a temporary variable (up to 32k) and inserts it into the other table. See
http://asktom.oracle.com/~tkyte/Mutate/index.html
for the details. Your particular implementation, if the LONG RAWs are 32k or less, would look like this:
tkyte@ORA734.WORLD> create table images
2 ( image_id int PRIMARY KEY,
3 image_filepath int,
4 image_filename varchar2(12),
5 image_filetype varchar2(10),
6 image_data long raw
7 );
Table created.
tkyte@ORA734.WORLD> create table test_images
2 ( image_id int,
3 image_filepath int,
4 image_filename varchar2(12),
5 image_filetype varchar2(10),
6 image_data long raw
7 );
Table created.
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD> create or replace package state_pkg
2 as
3 type numArray is table of number
4 index by binary_integer;
5
5 modified numArray;
6 empty numArray;
7 end;
8 /
Package created.
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD> create or replace trigger images_BIU
2 before insert or update on images
3 begin
4 state_pkg.modified := state_pkg.empty;
5 end;
6 /
Trigger created.
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD> create or replace trigger images_AIUfer
2 after insert or update on images
3 for each row
4 begin
5 state_pkg.modified(state_pkg.modified.count+1) :=
6 :new.image_id;
7 end;
8 /
Trigger created.
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD> create or replace trigger images_AIU
2 after insert or update on images
3 begin
4 for i in 1 .. state_pkg.modified.count
5 loop
6 for x in ( select *
7 from images
8 where image_id =
9 state_pkg.modified(i) )
10 loop
11 insert into test_images
12 ( image_id, image_filepath,
13 image_filename,
14 image_filetype, image_data )
15 values
16 ( x.image_id, x.image_filepath,
17 x.image_filename,
18 x.image_filetype, x.image_data );
19 end loop;
20 end loop;
21 state_pkg.modified := state_pkg.empty;
22 exception
23 when others then
24 state_pkg.modified := state_pkg.empty;
25 raise;
26 end;
27 /
Trigger created.
Now, to test this out, we will insert a 10k raw (20,000 hex characters = 10,000 bytes) and see if it gets copied correctly:
tkyte@ORA734.WORLD> insert into images values
2 ( 1, 1, 'x', 'y', hextoraw( rpad( 'A', 20000, 'A') ) );
1 row created.
tkyte@ORA734.WORLD>
tkyte@ORA734.WORLD> set serveroutput on
tkyte@ORA734.WORLD> declare
2 x images%rowtype;
3 y images%rowtype;
4 begin
5
5 select * into x from images;
6 select * into y from test_images;
7
7 dbms_output.put_line( 'length(x) = ' ||
8 length( x.image_data ) );
9 dbms_output.put_line( 'length(y) = ' ||
10 length( y.image_data ) );
11
11 if ( x.image_data = y.image_data ) then
12 dbms_output.put_line( 'Are equal' );
13 else
14 dbms_output.put_line( 'Are NOT equal' );
15 end if;
16 end;
17 /
length(x) = 10000
length(y) = 10000
Are equal
PL/SQL procedure successfully completed.
Sure enough -- this works. Again, this will only work up to 32k, at which point it fails because the largest PL/SQL variable is 32k in size. Beyond that, given that you are using Oracle 7.3.4, your choices are very limited -- PL/SQL is out of the question. The best answer at that point is really to move to Oracle8 and use the BLOB datatype, which does not have this restriction -- the entire BLOB can be manipulated using PL/SQL, or any other language for that matter.
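For what it is worth, here is a sketch (not tested against your schema; table and trigger names assumed from the example above) of what the Oracle8 rework might look like. With BLOB columns there is no restriction on referencing :new in a row trigger, so the original single-trigger approach works directly, with no 32k limit and no state package:

create table images
( image_id       int primary key,
  image_filepath int,
  image_filename varchar2(12),
  image_filetype varchar2(10),
  image_data     blob        -- BLOB instead of LONG RAW
);

create table test_images
( image_id       int,
  image_filepath int,
  image_filename varchar2(12),
  image_filetype varchar2(10),
  image_data     blob
);

create or replace trigger images_testing
after insert or update on images
for each row
begin
  -- :new.image_data is legal here because image_data is a BLOB,
  -- not a LONG RAW -- the restriction quoted above no longer applies
  insert into test_images
  ( image_id, image_filepath, image_filename,
    image_filetype, image_data )
  values
  ( :new.image_id, :new.image_filepath, :new.image_filename,
    :new.image_filetype, :new.image_data );
end;
/

That is, the trigger you originally wrote becomes correct once the column is a BLOB.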