Question and Answer

Connor McDonald

Thanks for the question, Michael.

Asked: April 23, 2017 - 2:14 pm UTC

Last updated: April 24, 2017 - 1:15 am UTC

Version: 11/12

Viewed 1000+ times

You Asked

Hello,

I need to copy 30% of the data in SOME_TABLE, which occupies 3TB in total.
Is there a way to estimate the "actual" size of that 30% of the table's rows?
Simple math puts it at ~900GB, but the real figure may vary because of CLOB columns and so on, so I wonder whether it can be calculated with a query against the dictionary views or against the data itself?

and Connor said...

If the sizes per row vary wildly, then you'd need to explicitly query the rows in order to get something close.

So a rough guide would be something like:

select sum(
         length(varchar2_col)    -- string col (character length; lengthb for bytes)
       + 7                       -- date col (a DATE is always 7 bytes)
       + 13                      -- timestamp col
       + length(num_col)/2       -- numeric col (roughly 2 digits per byte)
       + dbms_lob.getlength(c)   -- lob col
       )
from my_table
where ...
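
If scanning 3TB just to get an estimate is too expensive, a block sample can produce a ballpark figure much faster. A minimal sketch, assuming the same hypothetical column names as above, scaling a 5 percent sample up by 20x (the sample size is arbitrary; the nvl() calls stop a null column from knocking a whole row's expression out of the sum):

select 20 * sum(
         nvl(length(varchar2_col), 0)    -- string col
       + 7                               -- date col
       + 13                              -- timestamp col
       + nvl(length(num_col)/2, 0)       -- numeric col
       + nvl(dbms_lob.getlength(c), 0)   -- lob col
       ) as est_bytes
from   my_table sample (5)
where  ...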


If you wanted *precise* answers you could use the DUMP function, but that seems overkill.
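
As a sketch of that precise approach (again with the hypothetical columns from above): DUMP shows a value's internal representation, including its exact byte length, and VSIZE returns that length directly as a number. LOB columns are still best measured with dbms_lob.getlength.

select dump(varchar2_col) internal_rep,   -- e.g. Typ=1 Len=5: 104,101,108,108,111
       vsize(varchar2_col) bytes          -- just the Len= part, as a number
from   my_table
where  rownum = 1;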


Comments

Thanks

Michael Sharfman, April 24, 2017 - 6:37 am UTC

more or less what I was thinking :-)

other considerations

Paul, April 24, 2017 - 7:44 pm UTC

Depending on how "accurate" you want your estimate to be, you might also need to consider the partitioning and indexing on both your source and destination tables!
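
Related to the dictionary-view part of the original question: the current segment sizes of the table, its indexes, and its LOB segments can be read from DBA_SEGMENTS, which gives a feel for how much index and LOB overhead to expect on the destination. A sketch, assuming the table name from the question and DBA-level access (an owner predicate would normally be added too):

select s.segment_name,
       s.segment_type,
       round(s.bytes/1024/1024/1024, 1) gb
from   dba_segments s
where  s.segment_name = 'SOME_TABLE'
or     s.segment_name in (select index_name   from dba_indexes where table_name = 'SOME_TABLE')
or     s.segment_name in (select segment_name from dba_lobs    where table_name = 'SOME_TABLE');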
