"Here when ever going to create external table it's required to know how many columns data is there at file."
That is not true. An external table can be *anything*, for example, if I wanted to read my database alert log, I could have:
CREATE TABLE alert_log
(
  text varchar2(4000)
)
ORGANIZATION EXTERNAL
(
  TYPE oracle_loader
  DEFAULT DIRECTORY datadir
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8ISO8859P1
    READSIZE 1048576
    FIELDS LDRTRIM
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (
      text (1:4000) CHAR(4000)
    )
  )
  LOCATION
  (
    'alert_mydb.log'
  )
)
REJECT LIMIT UNLIMITED
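Once that's in place, the file can be queried like any other table without loading a single row. For instance (a sketch; it assumes the datadir directory object points at the directory holding your alert log):

```sql
-- scan the alert log for ORA- errors straight off disk
select text
from   alert_log
where  text like '%ORA-%';
```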
Now, if I wanted to dig around *inside* that "text" column to find certain attributes, I could do that with PL/SQL. But since I'm using an external table, I can shortcut some of that parsing work with the external table itself. For example, typical alert log entries are:
Fri Nov 13 12:17:33 2015
Database mounted in Exclusive Mode
Lost write protection disabled
Create Relation IPS_PACKAGE_UNPACK_HISTORY
Completed: ALTER DATABASE MOUNT
Fri Nov 13 12:18:11 2015
So why not use my external table to detect date rows as well, e.g.:
CREATE TABLE alert_log
(
  text   varchar2(4000),
  prefix varchar2(3)
)
ORGANIZATION EXTERNAL
(
  TYPE oracle_loader
  DEFAULT DIRECTORY datadir
  ACCESS PARAMETERS
  (
    RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8ISO8859P1
    READSIZE 1048576
    FIELDS LDRTRIM
    MISSING FIELD VALUES ARE NULL
    REJECT ROWS WITH ALL NULL FIELDS
    (
      text   (1:4000) CHAR(4000),
      prefix (1:3)    CHAR(3)
    )
  )
  LOCATION
  (
    'alert_mydb.log'
  )
)
REJECT LIMIT UNLIMITED
So now I can do:
select
  case when prefix in ('Mon','Tue',...) then 'Y' end is_a_date_row,
  text
from alert_log
which will help me when parsing, and so on.
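Taking that one step further, the flag can be used to carry each timestamp down onto the log lines that follow it. A sketch (my own example, not from the original; it assumes rows come back in file order, which a single-file oracle_loader external table will deliver):

```sql
-- tag every log line with the most recent timestamp line above it,
-- using LAST_VALUE ... IGNORE NULLS to carry the date forward
select text,
       last_value(
         case when prefix in ('Mon','Tue','Wed','Thu','Fri','Sat','Sun')
              then text end
         ignore nulls) over (order by rn) as log_date
from (
  select rownum rn, text, prefix
  from   alert_log
);
```

From there, grouping or filtering the log by timestamp becomes plain SQL rather than hand-written parsing code.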