There is never going to be a "perfect" way to do this, and taken by itself it is a relatively meaningless metric.
A better approach (I'm assuming this is for capacity planning purposes) would be to schedule a job that runs at the end of each month. This job would save
a) table_name
b) size
c) date
In that fashion you can see a relevant metric - how fast the table is growing over time. I believe that would be genuinely useful - it would probably answer the real question they are trying to ask.
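A minimal sketch of such a job (the table name, job name, and schedule here are my own invention, not anything from the question - it reads user_segments so it needs no extra privileges):

```sql
-- hypothetical history table to hold (table_name, size, date)
create table table_size_history
( table_name  varchar2(30),
  bytes       number,
  measured_on date
);

begin
  dbms_scheduler.create_job(
    job_name        => 'CAPTURE_TABLE_SIZES',      -- made-up name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin
                          insert into table_size_history
                          select segment_name, sum(bytes), sysdate
                            from user_segments
                           where segment_type = ''TABLE''
                           group by segment_name;
                          commit;
                        end;',
    repeat_interval => 'FREQ=MONTHLY; BYMONTHDAY=-1',  -- last day of month
    enabled         => true );
end;
/
```

Once a few months of rows accumulate, charting growth per table is a simple query over table_size_history.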
For other guesstimates - you can do something like:
ops$tkyte%ORA11GR2> create table t
2 as
3 select *
4 from all_objects
5 /
ops$tkyte%ORA11GR2> exec dbms_stats.gather_table_stats( user, 'T' );
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> variable mon varchar2(30)
ops$tkyte%ORA11GR2> exec :mon := '01-sep-2010'
ops$tkyte%ORA11GR2> select :mon, cnt, cnt * avg_row_len
2 from (select count(*) cnt
3 from t
4 where t.created >= to_date( :mon, 'dd-mon-yyyy' )
5 and t.created < add_months( to_date( :mon, 'dd-mon-yyyy' ), 1 ) ),
6 user_tables
7 where table_name = 'T'
8 /
:MON CNT CNT*AVG_ROW_LEN
-------------------------------- ---------- ---------------
01-sep-2010 65926 6394822
ops$tkyte%ORA11GR2> exec :mon := '01-jan-2013';
ops$tkyte%ORA11GR2> select :mon, cnt, cnt * avg_row_len
2 from (select count(*) cnt
3 from t
4 where t.created >= to_date( :mon, 'dd-mon-yyyy' )
5 and t.created < add_months( to_date( :mon, 'dd-mon-yyyy' ), 1 ) ),
6 user_tables
7 where table_name = 'T'
8 /
:MON CNT CNT*AVG_ROW_LEN
-------------------------------- ---------- ---------------
01-jan-2013 341 33077
That would give you a good guess as to the unloaded size of the data. Another approach could be:
ops$tkyte%ORA11GR2> select :mon,
2 count(*),
3 sum(
4 nvl( vsize( OWNER ), 0 ) +
5 nvl( vsize( OBJECT_NAME ), 0 ) +
6 nvl( vsize( SUBOBJECT_NAME ), 0 ) +
7 nvl( vsize( OBJECT_ID ), 0 ) +
8 nvl( vsize( DATA_OBJECT_ID ), 0 ) +
9 nvl( vsize( OBJECT_TYPE ), 0 ) +
10 nvl( vsize( CREATED ), 0 ) +
11 nvl( vsize( LAST_DDL_TIME ), 0 ) +
12 nvl( vsize( TIMESTAMP ), 0 ) +
13 nvl( vsize( STATUS ), 0 ) +
14 nvl( vsize( TEMPORARY ), 0 ) +
15 nvl( vsize( GENERATED ), 0 ) +
16 nvl( vsize( SECONDARY ), 0 ) +
17 nvl( vsize( NAMESPACE ), 0 ) +
18 nvl( vsize( EDITION_NAME ), 0 )
19 ) sz
20 from t
21 where t.created >= to_date( :mon, 'dd-mon-yyyy' )
22 and t.created < add_months( to_date( :mon, 'dd-mon-yyyy' ), 1 )
23 /
:MON COUNT(*) SZ
-------------------------------- ---------- ----------
01-jan-2013 341 36384
ops$tkyte%ORA11GR2>
(hopefully you see how easy it would be to generate that query from user_tab_columns - so it doesn't really matter if your table has lots of columns, the query is trivial to generate - I did it with DESC)
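For example, something along these lines would emit the vsize expression for any table (a sketch - the trailing `+` on the last line would need to be trimmed by hand or with a little more SQL):

```sql
-- generate one "nvl( vsize( COL ), 0 ) +" line per column of T
select 'nvl( vsize( ' || column_name || ' ), 0 ) +' as expr
  from user_tab_columns
 where table_name = 'T'
 order by column_id;
```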
Inserting the data into a temporary table (hopefully the DBA would use a true global temporary table) and then querying other views - not dba_segments - to get the size would also be a guesstimate. It would not be exact either, as it would pack all of the rows for a given month together in a physical organization different from the one they really have in the existing table. It would be an estimate - but one achieved at a very, very high cost.
I believe the customer really wants the information for capacity planning - charting future growth of the table. If so, use the job approach (the information may actually already exist if you have the tuning/diagnostic packs - we already track some segments for space growth).
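If the diagnostic pack is licensed, a query along these lines against the AWR history might already show the growth (a sketch only - it ignores dbid/instance filtering, and querying the dba_hist views requires the diagnostic pack license):

```sql
-- space allocated to table T at each AWR snapshot
select o.object_name,
       sn.begin_interval_time,
       s.space_allocated_total
  from dba_hist_seg_stat s,
       dba_hist_snapshot sn,
       dba_objects       o
 where s.snap_id = sn.snap_id
   and s.obj#    = o.object_id
   and o.object_name = 'T'
 order by sn.begin_interval_time;
```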