
Question and Answer

Tom Kyte

Thanks for the question, Dinesh.

Asked: July 05, 2004 - 8:47 pm UTC

Last updated: June 06, 2012 - 5:39 pm UTC

Version: 8.1.7

Viewed 10K+ times.

You Asked

Hi Tom,
I have a question regarding tablespace growth. Is there a way we can predict the growth trend of tablespaces? I know that Oracle doesn't store historic information in the dictionary, but I saw that you can get trend analysis using Capacity Planner; in that case, where does Oracle get the historic information?
Also, is it possible to monitor the growth of a tablespace using a script, as we do with extents?
As always, your site is the best place to learn Oracle DBA skills.
Thanks in advance,
dinesh

and Tom said...

the capacity planner must be running from time to time in order to capture this historical information. It simply captures the dictionary information into its own repository.

You can mimic the same by setting up a job to run once a week/month that simply does:

insert into my_tablespaces
select sysdate, tablespace_name, sum(bytes)
from dba_data_files;


you can then use that data to analyze any trends you would like
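The snapshot-then-trend pattern works in any database. Below is a minimal sketch in Python, with SQLite standing in for the Oracle repository; the table name, column names, and sample numbers are illustrative, not taken from Oracle's dictionary:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE my_tablespaces (
        snap_date       TEXT,    -- SYSDATE in the Oracle version
        tablespace_name TEXT,
        bytes           INTEGER  -- SUM(bytes) from dba_data_files
    )
""")

# Pretend the scheduled weekly job ran three times.
conn.executemany("INSERT INTO my_tablespaces VALUES (?,?,?)", [
    ("2004-07-01", "USERS",  100),
    ("2004-07-01", "SYSTEM", 500),
    ("2004-07-08", "USERS",  120),
    ("2004-07-08", "SYSTEM", 500),
    ("2004-07-15", "USERS",  150),
    ("2004-07-15", "SYSTEM", 510),
])

# Simplest possible trend: growth between first and last snapshot.
growth = dict(conn.execute("""
    SELECT tablespace_name,
           MAX(CASE WHEN snap_date = '2004-07-15' THEN bytes END)
         - MAX(CASE WHEN snap_date = '2004-07-01' THEN bytes END)
      FROM my_tablespaces
     GROUP BY tablespace_name
""").fetchall())
print(growth)
```

With more snapshots you can fit a slope per tablespace and project forward, which is essentially what Capacity Planner does with its own repository.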


Comments

Markus, July 06, 2004 - 10:30 am UTC

The GROUP BY clause is missing from the SELECT statement (without it you get ORA-00937: not a single-group group function).

The complete statement:

insert into my_tablespaces
select sysdate, tablespace_name, sum(bytes)
from dba_data_files
group by tablespace_name;

query

A reader, December 29, 2005 - 5:37 am UTC

Hi

I used your approach; now I have some doubts about how to query the data. I have this:

select tablespace_name, bytes, load_date
from growth$data_files
where load_date between sysdate - 1 and sysdate
and tablespace_name = 'USERS'

TABLESPACE_NAME BYTES LOAD_DATE
------------------------------ ---------- -----------------
USERS 110100480 20051229 10:18:06
USERS 125829120 20051229 10:25:12
USERS 104857600 20051229 10:38:05

I want to get this output

104857600 - 110100480

the best I can get is this query:

select a.tablespace_name,
       a.bytes - b.bytes growth
  from (select *
          from growth$data_files
         where load_date = (select max(load_date)
                              from growth$data_files
                             where load_date between sysdate - 1 and sysdate)
           and tablespace_name = 'USERS') a,
       (select *
          from growth$data_files
         where load_date = (select min(load_date)
                              from growth$data_files
                             where load_date between sysdate - 1 and sysdate)
           and tablespace_name = 'USERS') b


Is there a way to get this data using a single query (not two subqueries) and without using analytic functions?

Tom Kyte
December 29, 2005 - 12:01 pm UTC

why without using analytics?

reason I ask - analytics is the right answer here - lag()/lead() is what you want.

Analytics functions like that are a feature of every supported Oracle database these days, why wouldn't you use them?

because I have all sorts of versions

A reader, December 30, 2005 - 5:30 am UTC

Hi

Unfortunately I have Oracle 7.3.4, 8.0.6, 8.1.7.4, 9.2.0.7 and 10.1.0.4 :-(

However how would lag/lead work in analytics?

I guess my best bet on old Oracle versions is the subquery approach.

Tom Kyte
December 30, 2005 - 9:33 am UTC

without analytics, you have the answer already then - analytics allow you to skip all of the painful extra work.

sms through visual basic 6.0

venkat, December 30, 2005 - 10:57 am UTC

Hi tom,
plz send me answer
I want to send SMS through Oxygen software, using Visual Basic. Sending the message works fine, but the recipient's message has www.oxygensoftware.com added to it. How do I send the message without the website name?

Tom Kyte
December 30, 2005 - 11:16 am UTC

plz? German postal codes?

Coupled with a question that has nothing to do with databases ;) Perfect.

On a question about tablespace growth trends.



Tablespace usage

Whisp, December 30, 2005 - 5:13 pm UTC

Tom,

For my scheduled job that collects tablespace bytes, do you think it would be useful to collect all history, or only the rows where the bytes have changed since the latest insert?

E.g.
Tablespace_name Bytes Date
...........................
Tab1 2048 25-12-2005
Tab1 2048 25-01-2006
Tab1 5000 25-02-2006

Or

Tablespace_name Bytes Date
...........................
Tab1 2048 25-12-2005
Tab1 5000 25-02-2006


Cheers

Whisp

Tom Kyte
December 31, 2005 - 10:54 am UTC

it would (to me) be sufficient to collect the data only when the data has changed.

We can always make up the data "later" to fill in the gaps using an outer join.
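To make the "only when changed" capture concrete, here is a small sketch in Python with SQLite (table and column names illustrative). It skips unchanged values on write and reconstructs the missing months on read by carrying the most recent stored value forward, in the spirit of the gap-filling Tom describes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ts_history
                (snap_date TEXT, tablespace_name TEXT, bytes INTEGER)""")

def record_snapshot(snap_date, name, size_bytes):
    """Insert only if the value changed since the last stored row."""
    row = conn.execute(
        """SELECT bytes FROM ts_history
            WHERE tablespace_name = ?
            ORDER BY snap_date DESC LIMIT 1""", (name,)).fetchone()
    if row is None or row[0] != size_bytes:
        conn.execute("INSERT INTO ts_history VALUES (?,?,?)",
                     (snap_date, name, size_bytes))

# Monthly job: the value is unchanged in January, grows in February.
record_snapshot("2005-12-25", "TAB1", 2048)
record_snapshot("2006-01-25", "TAB1", 2048)   # skipped - no change
record_snapshot("2006-02-25", "TAB1", 5000)

stored = conn.execute(
    "SELECT snap_date, bytes FROM ts_history ORDER BY snap_date").fetchall()
print(stored)   # [('2005-12-25', 2048), ('2006-02-25', 5000)]

def size_as_of(name, snap_date):
    """Fill the gap: carry the most recent stored value forward."""
    row = conn.execute(
        """SELECT bytes FROM ts_history
            WHERE tablespace_name = ? AND snap_date <= ?
            ORDER BY snap_date DESC LIMIT 1""", (name, snap_date)).fetchone()
    return row[0] if row else None

print(size_as_of("TAB1", "2006-01-25"))   # 2048, for the skipped month
```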

how to use lag/lead

A reader, December 31, 2005 - 5:42 pm UTC

Hi

Imagine we have this data

Tablespace_name Bytes Date
...........................
Tab1 2048 25-12-2005
Tab1 2048 25-01-2006
Tab1 5000 25-02-2006


How do we get the growth between 15-12-2005 and 25-02-2006 (2952) using lag/lead function as you mentioned...?

Tom Kyte
January 01, 2006 - 11:01 am UTC

select tablespace_name, bytes,
       lag(bytes) over (partition by tablespace_name order by load_date) last_bytes
  from my_tablespaces
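For readers without an Oracle instance at hand: SQLite (3.25+) also implements lag(), so the same query shape can be tried out from Python. This sketch uses the sample data from earlier in the thread; table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE my_tablespaces
                (load_date TEXT, tablespace_name TEXT, bytes INTEGER)""")
conn.executemany("INSERT INTO my_tablespaces VALUES (?,?,?)", [
    ("2005-12-25", "TAB1", 2048),
    ("2006-01-25", "TAB1", 2048),
    ("2006-02-25", "TAB1", 5000),
])

# lag() pairs each row with the previous row's bytes, per tablespace.
rows = conn.execute("""
    SELECT tablespace_name, bytes,
           LAG(bytes) OVER (PARTITION BY tablespace_name
                            ORDER BY load_date) AS last_bytes
      FROM my_tablespaces
     ORDER BY tablespace_name, load_date
""").fetchall()
for r in rows:
    print(r)
# ('TAB1', 2048, None)
# ('TAB1', 2048, 2048)
# ('TAB1', 5000, 2048)
```

`bytes - last_bytes` then gives the growth between consecutive snapshots without any self-join.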

lag/lead

A reader, January 01, 2006 - 1:02 pm UTC

Hi

I get this data,

select tablespace_name, bytes, load_date,
lag(bytes) over (partition by tablespace_name order by load_date) last_bytes
from jeff$data_files
where tablespace_name = 'USERS'
and load_date between '20051229 00:00:00' and '20060102 00:00:00'

TABLESPACE_NAME BYTES LOAD_DATE LAST_BYTES
------------------------------ ---------- ----------------- ----------
USERS 110100480 20051229 10:18:06
USERS 125829120 20051229 10:25:12 110100480
USERS 104857600 20051229 10:38:05 125829120
USERS 83886080 20060101 00:59:31 104857600


I need to get the growth from 20051229 10:18:06 to 20060101 00:59:31, which is 110100480 - 83886080, but that doesn't seem possible using the lag function?

Tom Kyte
January 01, 2006 - 1:37 pm UTC

sure it is - but be more "requirements/specification like" in your talk here.

Do you want the FIRST and the LAST record by tablespace_name?

what is special about your dates? what is the real, generic, general purpose goal here.

and don't forget, if you actually want an example, you best supply a create table and some non-trivial data to test with (eg: I seriously doubt you have a single tablespace, you want to process by tablespace so any reasonable example would have at least 2 tablespaces...)

requirement

A reader, January 01, 2006 - 2:27 pm UTC

Hi

The requirement is simple: input two dates and an optional tablespace name (if not supplied, show all), and show the growth of each corresponding tablespace between those two dates.

I have been looking at analytic functions for 3 or 4 days, but I am quite new to them, so I can't really use them to their full power!

Tom Kyte
January 01, 2006 - 4:26 pm UTC

and don't forget.... (re-read that part please...)

example

A reader, January 02, 2006 - 8:04 am UTC

CREATE TABLE "JEFF$DATA_FILES"
( "ID" NUMBER,
"LOAD_DATE" DATE,
"FILE_NAME" VARCHAR2(513),
"FILE_ID" NUMBER,
"TABLESPACE_NAME" VARCHAR2(30),
"BYTES" NUMBER,
"BLOCKS" NUMBER,
"STATUS" VARCHAR2(9),
"RELATIVE_FNO" NUMBER,
"AUTOEXTENSIBLE" VARCHAR2(3),
"MAXBYTES" NUMBER,
"MAXBLOCKS" NUMBER,
"INCREMENT_BY" NUMBER,
"USER_BYTES" NUMBER,
"USER_BLOCKS" NUMBER,
"ONLINE_STATUS" VARCHAR2(7)
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "USERS"

test data:

INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:18:06 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/users01.dbf'
, 8, 'USERS', 110100480, 26880, 'AVAILABLE', 8, 'YES', 17179860992, 4194302, 320, 110034944
, 26864, 'ONLINE');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:18:06 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/system01.dbf'
, 1, 'SYSTEM', 576716800, 140800, 'AVAILABLE', 1, 'YES', 17179860992, 4194302, 2560
, 576651264, 140784, 'SYSTEM');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:25:12 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/users01.dbf'
, 8, 'USERS', 125829120, 30720, 'AVAILABLE', 8, 'YES', 17179860992, 4194302, 320, 125763584
, 30704, 'ONLINE');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:25:12 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/system01.dbf'
, 1, 'SYSTEM', 576716800, 140800, 'AVAILABLE', 1, 'YES', 17179860992, 4194302, 2560
, 576651264, 140784, 'SYSTEM');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:38:05 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/users01.dbf'
, 8, 'USERS', 104857600, 25600, 'AVAILABLE', 8, 'YES', 17179860992, 4194302, 320, 104792064
, 25584, 'ONLINE');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '12/29/2005 10:38:05 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/system01.dbf'
, 1, 'SYSTEM', 576716800, 140800, 'AVAILABLE', 1, 'YES', 17179860992, 4194302, 2560
, 576651264, 140784, 'SYSTEM');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '01/01/2006 12:59:31 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/users01.dbf'
, 8, 'USERS', 83886080, 20480, 'AVAILABLE', 8, 'YES', 17179860992, 4194302, 320, 83820544
, 20464, 'ONLINE');
INSERT INTO JEFF$DATA_FILES ( LOAD_DATE, FILE_NAME, FILE_ID, TABLESPACE_NAME, BYTES, BLOCKS, STATUS,
RELATIVE_FNO, AUTOEXTENSIBLE, MAXBYTES, MAXBLOCKS, INCREMENT_BY, USER_BYTES, USER_BLOCKS,
ONLINE_STATUS ) VALUES (
TO_Date( '01/01/2006 12:59:31 AM', 'MM/DD/YYYY HH:MI:SS AM'), '/u02/oradata/lnx102/system01.dbf'
, 1, 'SYSTEM', 576716800, 140800, 'AVAILABLE', 1, 'YES', 17179860992, 4194302, 2560
, 576651264, 140784, 'SYSTEM');

commit;



Tom Kyte
January 02, 2006 - 10:24 am UTC

Your data was "un-interesting", so I tweaked it for SYSTEM.


ops$tkyte@ORA10GR2> select tablespace_name, bytes, load_date from jeff$data_files order by 1, 3;

TABLESPACE      BYTES LOAD_DATE
---------- ---------- --------------------
SYSTEM      576716801 29-dec-2005 10:18:06
                                           <<<===== start time
SYSTEM      576716802 29-dec-2005 10:25:12
SYSTEM      576716803 29-dec-2005 10:38:05
                                           <<<===== end time
SYSTEM      576716804 01-jan-2006 00:59:31
USERS       110100480 29-dec-2005 10:18:06
                                           <<<===== start time
USERS       125829120 29-dec-2005 10:25:12
USERS       104857600 29-dec-2005 10:38:05
                                           <<<====== end time
USERS        83886080 01-jan-2006 00:59:31

8 rows selected.


<b>What we want is for each tablespace - the bytes associated with the MAX(load_date) that is LESS THAN or EQUAL TO the input start time (see below for what I used).  In other words - the row right above the start time pointer above.

We want the bytes associated with the MAX(LOAD_DATE) that is LESS THAN or EQUAL TO the input end time.  In other words the row right above the end time pointer above.

These are the two "byte values" we want - here are the input dates:</b>

ops$tkyte@ORA10GR2>
ops$tkyte@ORA10GR2> variable d1 varchar2(21)
ops$tkyte@ORA10GR2> variable d2 varchar2(21)
ops$tkyte@ORA10GR2>
ops$tkyte@ORA10GR2> exec :d1 := '29-dec-2005 10:20:00'

PL/SQL procedure successfully completed.

ops$tkyte@ORA10GR2> exec :d2 := '01-jan-2006 00:50:00'

PL/SQL procedure successfully completed.

<b>here is one approach - take the set of tablespaces (you could obviously add a predicate to the query to get just SOME of them) and for each tablespace - get the row that satisfies the above need:</b>

ops$tkyte@ORA10GR2>
ops$tkyte@ORA10GR2> select :d1 start_time, :d2 end_time, tablespace_name,
  2         (select to_number( substr( max( to_char(load_date,'yyyymmddhh24miss') || bytes), 15 ) )
  3                from jeff$data_files
  4                   where tablespace_name = x.tablespace_name
  5                     and load_date <= to_date( :d1, 'dd-mon-yyyy hh24:mi:ss' )) start_bytes,
  6         (select to_number( substr( max( to_char(load_date,'yyyymmddhh24miss') || bytes), 15 ) )
  7                from jeff$data_files
  8                   where tablespace_name = x.tablespace_name
  9                     and load_date <= to_date( :d2, 'dd-mon-yyyy hh24:mi:ss' )) end_bytes
 10    from (select distinct tablespace_name from jeff$data_files) x
 11  /

START_TIME            END_TIME              TABLESPACE START_BYTES  END_BYTES
--------------------- --------------------- ---------- ----------- ----------
29-dec-2005 10:20:00  01-jan-2006 00:50:00  USERS        110100480  104857600
29-dec-2005 10:20:00  01-jan-2006 00:50:00  SYSTEM       576716801  576716803



There are other ways to do this via analytics and what not, but this works. 
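Tom's substr(max(to_char(load_date) || bytes)) trick (carry the date as a fixed-width, sortable prefix, take the MAX, then strip the prefix back off) is portable to other databases. A sketch of the same idea in Python with SQLite, where strftime stands in for to_char and the table name is illustrative, using the USERS rows from the test data above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE data_files
                (load_date TEXT, tablespace_name TEXT, bytes INTEGER)""")
conn.executemany("INSERT INTO data_files VALUES (?,?,?)", [
    ("2005-12-29 10:18:06", "USERS", 110100480),
    ("2005-12-29 10:25:12", "USERS", 125829120),
    ("2005-12-29 10:38:05", "USERS", 104857600),
    ("2006-01-01 00:59:31", "USERS",  83886080),
])

def bytes_as_of(cutoff):
    # MAX over 'yyyymmddhhmmss' || bytes picks the row with the latest
    # load_date <= cutoff; SUBSTR strips the 14-character date prefix.
    return conn.execute("""
        SELECT CAST(SUBSTR(MAX(STRFTIME('%Y%m%d%H%M%S', load_date) || bytes),
                           15) AS INTEGER)
          FROM data_files
         WHERE tablespace_name = 'USERS'
           AND load_date <= ?
    """, (cutoff,)).fetchone()[0]

start_bytes = bytes_as_of("2005-12-29 10:20:00")
end_bytes   = bytes_as_of("2006-01-01 00:50:00")
print(start_bytes, end_bytes)   # 110100480 104857600
```

The fixed-width date prefix is what makes the string MAX equivalent to "the bytes at the latest date", so no analytics (and no minimum Oracle version) are needed.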

10g

vj, October 05, 2006 - 1:08 pm UTC

thanks tom,

is there any command in 10g that i can leverage to get the tablespace growth analysis..

the objective is to get the following information,
Server Name,
DB Name
Tablespace Name
Tblspc size. 6 mths ago (GB)
Tblspc size. 5 mths ago (GB)
Tblspc size. 4 mths ago (GB)
Tblspc size. 3 mths ago (GB)
Tblspc size. 2 mths ago (GB)
Tblspc size. 1 mths ago (GB)
Tblspc size. current (GB)
Space growth in 6 mths
% of growth in 6 mths


Tom Kyte
October 05, 2006 - 1:34 pm UTC

not unless you've

a) been using a tool to track it (such as an Enterprise Manager pack)
b) been tracking it yourself.

10g new features

Swapnil, September 21, 2011 - 8:16 am UTC

In 10g and above you can use the query below:

SELECT TO_CHAR (sp.begin_interval_time,'DD-MM-YYYY') days
, ts.tsname
, max(round((tsu.tablespace_size* dt.block_size )/(1024*1024),2) ) cur_size_MB
, max(round((tsu.tablespace_usedsize* dt.block_size )/(1024*1024),2)) usedsize_MB
FROM DBA_HIST_TBSPC_SPACE_USAGE tsu
, DBA_HIST_TABLESPACE_STAT ts
, DBA_HIST_SNAPSHOT sp
, DBA_TABLESPACES dt
WHERE tsu.tablespace_id= ts.ts#
AND tsu.snap_id = sp.snap_id
AND ts.tsname = dt.tablespace_name
AND ts.tsname NOT IN ('SYSAUX','SYSTEM')
GROUP BY TO_CHAR (sp.begin_interval_time,'DD-MM-YYYY'), ts.tsname
ORDER BY ts.tsname, days;

The query above requires a Diagnostics Pack license

A reader, June 06, 2012 - 4:39 pm UTC

Just a warning - the query provided above on the DBA_HIST_TBSPC_SPACE_USAGE view requires a Diagnostics Pack license (for legal compliance).
Tom Kyte
June 06, 2012 - 5:39 pm UTC

yes, everything that starts with dba_hist_* would (in addition to others)