Hi Tom,
Thanks for answering questions on Oracle.
I read your response to the original question, which said external tables do not support LOBs. But it has been a long time since you answered, so I am asking again in the hope that something new has come along :).
I have files that need to be loaded into the system. A few lines in these files contain lakhs (hundreds of thousands) of characters each, which causes problems when I use external tables.
To work around this I have used the COLUMN TRANSFORMS option, as below:
1. The table definition names a dummy source file, which in turn points to the original file containing the CLOB data.
External table:

CREATE TABLE et_records
(
  nation_name  VARCHAR2(10 BYTE),
  whole_record CLOB
)
ORGANIZATION EXTERNAL
(
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY test_dir
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    (
      nation_name   CHAR(10),
      clob_filename CHAR(1000)
    )
    COLUMN TRANSFORMS (whole_record FROM LOBFILE (clob_filename) FROM (test_dir) CLOB)
  )
  LOCATION (test_dir:'dummy.txt')
)
REJECT LIMIT UNLIMITED;
2. dummy.txt contains

europe,data.int

where data.int is my actual data file.
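For reference, this is how I sanity-check the load (DBMS_LOB.GETLENGTH just confirms that the whole file landed in a single CLOB):

select nation_name,
       dbms_lob.getlength(whole_record) as chars_loaded
from   et_records;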
My first problem:
Now when I query the external table, the response (time to execute the query) is slow, and a single record is returned (as expected) with the column value shown as HUGECLOB in Toad; but when I double-click it, an OutOfMemory exception is thrown. I have tested the same setup with much less data in data.int and it worked fine. Is there a way to resolve these issues?
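A partial workaround sketch, in case it helps to know what I have tried: pulling only a fixed slice with DBMS_LOB.SUBSTR (the 4000-character chunk size and offset 1 are arbitrary choices here) avoids materializing the whole value in the client:

select nation_name,
       dbms_lob.substr(whole_record, 4000, 1) as first_chunk   -- amount, then offset
from   et_records;

But that only sidesteps the viewer crash; it does not address the slow query itself.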
Second problem:
Since the entire file comes back as a single record, I have written a query to break it into multiple records (splitting on newline, CHR(10)), but this query did not complete even after half an hour (a performance issue). Please suggest changes to the query below.
with t as (select whole_record as wr from et_records)
select regexp_substr(wr, '[^'||CHR(10)||']+', 1, rownum) wh_rec,
       rownum rn
from   t
connect by rownum <= length(regexp_replace(wr, '[^'||CHR(10)||']')) + 1;
Again, this query worked fine with less data.
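One alternative I am experimenting with is a rough sketch only (staging_lines is a hypothetical target table; everything else comes from the definitions above): it walks the CLOB once with DBMS_LOB.INSTR instead of re-scanning the whole CLOB with regular expressions for every output row:

declare
  l_clob clob;
  l_line clob;
  l_pos  pls_integer := 1;   -- current offset into the CLOB
  l_nl   pls_integer;        -- position of the next newline
  l_len  pls_integer;
begin
  select whole_record into l_clob from et_records where rownum = 1;
  l_len := dbms_lob.getlength(l_clob);

  while l_pos <= l_len loop
    l_nl := dbms_lob.instr(l_clob, chr(10), l_pos);   -- next newline at or after l_pos
    if l_nl = 0 then
      l_nl := l_len + 1;                              -- last line has no trailing newline
    end if;
    dbms_lob.createtemporary(l_line, true);
    if l_nl > l_pos then
      dbms_lob.copy(l_line, l_clob, l_nl - l_pos, 1, l_pos);  -- copy one line's worth
    end if;
    -- insert into staging_lines (line_text) values (l_line);  -- hypothetical table
    dbms_lob.freetemporary(l_line);
    l_pos := l_nl + 1;
  end loop;
end;
/

Each character is examined once, so this should scale roughly linearly with file size, unlike the CONNECT BY query, which re-runs REGEXP_SUBSTR over the full CLOB for every row it produces.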
Lastly, I think it is worth mentioning that I am already using SQLLDR to load the data, but I am still exploring external tables because:
- Once the data is parsed and loaded into the next level of tables, the staged copy is of no further use, so external tables seem the better option: they do not occupy database space and are much easier to use.
- I felt external tables would save time, since we would not need to run the loader as a separate step.
Please let me know your opinion.
Thanks,
Imran.