
Question and Answer

Connor McDonald

Thanks for the question, Michael.

Asked: April 23, 2017 - 2:14 pm UTC

Last updated: April 24, 2017 - 1:15 am UTC

Version: 11/12

Viewed 1000+ times

You Asked

Hello,

I need to copy 30% of SOME_TABLE's data; the table occupies 3TB in total.
Is there a way to estimate the actual size that 30% of the rows would occupy?
I know it is ~900GB by simple arithmetic, but the real figure may vary because of CLOB columns and the like, so I wonder: can this be calculated with a query on the dictionary views, or on the data itself?

and Connor said...

If the sizes per row vary wildly, then you'd need to explicitly query the rows in order to get something close.
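
If they don't, a first cut from the dictionary alone may be enough. A minimal sketch, assuming optimizer statistics are reasonably current (note that AVG_ROW_LEN excludes out-of-line LOB storage, so it understates LOB-heavy tables):

select round(num_rows * 0.30 * avg_row_len / 1024 / 1024 / 1024, 1) as est_gb
from   user_tables
where  table_name = 'SOME_TABLE';

When row sizes do vary, though, only the data itself will tell you.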

So a rough guide would be something like:

-- nvl() stops a single null column from nulling out the whole row's sum
select sum(
  nvl(length(varchar2_col),0) +  -- string: bytes, assuming a single-byte charset
  7 +                            -- date col: fixed 7 bytes
  13 +                           -- timestamp col: 13 bytes with time zone
  nvl(length(num_col)/2,0) +     -- numeric col: roughly 1 byte per 2 digits
  nvl(dbms_lob.getlength(c),0)   -- lob col: characters for CLOB, bytes for BLOB
) as est_bytes
from my_table
where ...
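
On a 3TB table, a full scan just to produce an estimate can be expensive. A sampled estimate scaled back up may be close enough; a sketch, assuming the same columns as above and a 1% block sample:

select 100 *
       sum( nvl(length(varchar2_col),0) + 7 + 13
          + nvl(length(num_col)/2,0)
          + nvl(dbms_lob.getlength(c),0) ) as est_bytes
from   my_table sample block (1)
where  ...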


If you wanted *precise* answers you could use the DUMP function, but that seems overkill.
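
For example, VSIZE returns the same internal byte count that DUMP reports as "Len=" for each column value. A sketch (date_col and timestamp_col are hypothetical names; LOBs still need dbms_lob.getlength, since VSIZE does not apply to them):

select sum( nvl(vsize(varchar2_col),0)
          + nvl(vsize(date_col),0)
          + nvl(vsize(timestamp_col),0)
          + nvl(vsize(num_col),0)
          + nvl(dbms_lob.getlength(c),0) ) as est_bytes
from   my_table
where  ...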

Rating

  (2 ratings)

Comments

Thanks

Michael Sharfman, April 24, 2017 - 6:37 am UTC

More or less what I was thinking :-)

Other considerations

Paul, April 24, 2017 - 7:44 pm UTC

Depending on how "accurate" you want your estimate to be, you might also need to consider the partitioning and indexing on both your source and destination tables!
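
For instance, index and out-of-line LOB segments add to the footprint beyond the table data itself. A sketch for sizing the existing segments on the source side, assuming the table is SOME_TABLE in your own schema:

select segment_name, segment_type,
       round(sum(bytes)/1024/1024/1024, 1) as gb
from   user_segments
where  segment_name = 'SOME_TABLE'
   or  segment_name in (select index_name
                        from   user_indexes
                        where  table_name = 'SOME_TABLE')
   or  segment_name in (select segment_name
                        from   user_lobs
                        where  table_name = 'SOME_TABLE')
group  by segment_name, segment_type;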
