
Question and Answer

Connor McDonald

Thanks for the question, Sunny.

Asked: December 15, 2020 - 8:48 am UTC

Last updated: December 17, 2020 - 4:24 am UTC

Version: 12c R2

Viewed 1000+ times

You Asked

The table here has no CLOB or BLOB columns, only VARCHAR2 fields. 147,719 rows were imported, but the table size is 1.4375 GB. Is this reasonable?


Also, what is the "25.95 GB" mentioned in the import log file?

Following are the details of my import log output.

Import: Release 12.2.0.1.0 - Production on Mon Dec 15 11:53:49 2020

Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.
;;; 
Connected to: Oracle Database 12c Enterprise Edition Release 12.2.0.1.0 - 64bit Production
Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_TABLE_01":  system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=FILE_121220_0748.dmp LOGFILE=FILE_121220_0748.txt TABLES=HR.EMP_DOCS_MAS query=HR.EMP_DOCS_MAS:"where trns_srno=0 and trunc(entr_dt) between '01-DEC-2020' and '15-DEC-2020' " table_exists_action=replace 
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "HR"."EMP_DOCS_MAS"                        25.95 GB  147719 out of 503929138 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "SYSTEM"."SYS_IMPORT_TABLE_01" successfully completed at Mon Dec 15 11:55:56 2020 elapsed 0 00:02:06


I checked the total size of the imported table and it is only 1.4375 GB; see the following query.

select a.BYTES / 1024 / 1024 / 1024 gb_size, a.*
  from dba_segments a
 where a.segment_name = 'EMP_DOCS_MAS';


Output:

GB_SIZE: 1.4375
OWNER: HR
SEGMENT_NAME: EMP_DOCS_MAS
PARTITION_NAME:
SEGMENT_TYPE: TABLE
SEGMENT_SUBTYPE: ASSM
TABLESPACE_NAME: ATRDTL
HEADER_FILE: 5
HEADER_BLOCK: 1572897
BYTES: 1543503872
BLOCKS: 188416
EXTENTS: 23
INITIAL_EXTENT: 1543503872
NEXT_EXTENT: 1048576
MIN_EXTENTS: 1
MAX_EXTENTS: 2147483645
MAX_SIZE: 2147483645
RETENTION:
MINRETENTION:
PCT_INCREASE:
FREELISTS:
FREELIST_GROUPS:
RELATIVE_FNO: 1024
BUFFER_POOL: DEFAULT
FLASH_CACHE: DEFAULT
CELL_FLASH_CACHE: DEFAULT
INMEMORY: DISABLED
INMEMORY_PRIORITY:
INMEMORY_DISTRIBUTE:
INMEMORY_DUPLICATE:
INMEMORY_COMPRESSION:
CELLMEMORY:

And

select  sum(s.BYTES / 1024 / 1024/1024) gb  
  from dba_extents s
 where s.segment_name = 'EMP_DOCS_MAS';


GB
------
1.4375

and Chris said...

So what is the "25.95 GB" mentioned in the import log file?

That's the size of the table data in the export. There's a data filter (the QUERY parameter) in the import command, so you're only loading a fraction of it.

147,719 rows were imported but the size is 1.4375 GB. Is this reasonable?

That works out at ~10 KB/row (1,543,503,872 bytes / 147,719 rows ≈ 10,450 bytes per row).

So back at you: does this seem reasonable? I don't know what your table looks like!

Just remember: the *_segments views report the size allocated to the table. It's likely you have much more space allocated to the table than you're actually using.

For example, let's create a table, insert one row, and see how much space it's using:

create table t (
  c1 int
);

insert into t values ( 1 );
commit;
select bytes from user_segments
where  segment_name = 'T';

BYTES   
   65536 


64 KB for one row?! That's because most of the allocated space is empty.
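If you want to see how much of the allocated space actually holds data, one option is DBMS_SPACE.SPACE_USAGE, which works for segments in ASSM tablespaces (your output above shows SEGMENT_SUBTYPE = ASSM). A sketch, assuming the owner/table names from the question and a session with EXECUTE on DBMS_SPACE:

```sql
-- Sketch: report block usage inside the HR.EMP_DOCS_MAS segment.
-- FS1..FS4 buckets classify blocks by how much free space they have;
-- FULL_BLOCKS are the blocks actually packed with row data.
set serveroutput on
declare
  l_unf_blocks  number;  l_unf_bytes  number;
  l_fs1_blocks  number;  l_fs1_bytes  number;  -- 0-25%  free
  l_fs2_blocks  number;  l_fs2_bytes  number;  -- 25-50% free
  l_fs3_blocks  number;  l_fs3_bytes  number;  -- 50-75% free
  l_fs4_blocks  number;  l_fs4_bytes  number;  -- 75-100% free
  l_full_blocks number;  l_full_bytes number;
begin
  dbms_space.space_usage(
    segment_owner      => 'HR',
    segment_name       => 'EMP_DOCS_MAS',
    segment_type       => 'TABLE',
    unformatted_blocks => l_unf_blocks, unformatted_bytes => l_unf_bytes,
    fs1_blocks         => l_fs1_blocks, fs1_bytes         => l_fs1_bytes,
    fs2_blocks         => l_fs2_blocks, fs2_bytes         => l_fs2_bytes,
    fs3_blocks         => l_fs3_blocks, fs3_bytes         => l_fs3_bytes,
    fs4_blocks         => l_fs4_blocks, fs4_bytes         => l_fs4_bytes,
    full_blocks        => l_full_blocks, full_bytes       => l_full_bytes);

  dbms_output.put_line('Full blocks       : ' || l_full_blocks);
  dbms_output.put_line('Mostly-empty (FS4): ' || l_fs4_blocks);
  dbms_output.put_line('Unformatted blocks: ' || l_unf_blocks);
end;
/
```

Comparing FULL_BYTES against the BYTES figure from dba_segments shows how much of the 1.4375 GB allocation is genuinely occupied by rows.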


Comments

Any way to know actual size

Sunny, December 16, 2020 - 6:51 am UTC

Thanks for the reply.
Is there any method to identify the actual size of a table?
Connor McDonald
December 17, 2020 - 4:24 am UTC

Assuming your stats are up to date, then

NUM_ROWS * AVG_ROW_LEN * ( 1 + PCTFREE/100)

from user_tables is a reasonable approximation.
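Put as a query, that approximation looks something like this (a sketch, assuming statistics were recently gathered with DBMS_STATS and using the table name from the question; run as the table owner, or swap in dba_tables with an OWNER filter):

```sql
-- Estimate of space genuinely used by row data, per the formula
-- above; meaningless if NUM_ROWS / AVG_ROW_LEN are stale.
select table_name,
       num_rows,
       avg_row_len,
       round( num_rows * avg_row_len * (1 + pct_free/100)
              / 1024 / 1024 / 1024, 2 ) as est_used_gb
  from user_tables
 where table_name = 'EMP_DOCS_MAS';
```

The gap between this estimate and the dba_segments figure is the overhead of allocated-but-unused space (block free space, partially filled extents, and so on).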

More to Explore

Data Pump

All of the database utilities including Data Pump are explained in the Utilities guide.