any other clients,
sns, January 10, 2006 - 3:09 pm UTC
Tom,
To emulate 10K users logging in simultaneously and executing certain SQL statements, is there any tool you are aware of?
I searched Google and found quite a few, but I am not sure whether they satisfy my purpose.
Is there a way to emulate 10K users without using any tool?
Thanks
January 10, 2006 - 3:14 pm UTC
without using a tool? only by developing your own.
Mercury Interactive's LoadRunner is pretty good.
If you have 10,000 simultaneous (concurrently active) users, you have a fairly large system - it would be worth your money to invest in such a tool.
develop on my own,
sns, January 10, 2006 - 3:32 pm UTC
Tom,
thanks for the response. However, it is up to management to decide whether to buy the tool.
I am interested in knowing how to develop one on my own. If you have any guidelines or ideas, could you please share them with me?
Thanks,
January 10, 2006 - 4:29 pm UTC
it would take quite a bit of effort.
first you need enough hardware to run 10,000 clients.
then you need the software infrastructure to start up 10,000 clients.
you need a reporting system so you get some metrics out of this (a small sketch of that piece follows below).
and so on.
google around - you'll find SourceForge-like projects already out there.
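For the reporting piece, even something as simple as having every emulated client log its timings into a results table gives you metrics to aggregate afterwards. A minimal sketch (the table name, column names, and the literal values are made up for illustration):

create table load_test_results
( test_run    varchar2(30),
  client_id   number,
  step_name   varchar2(30),
  elapsed_cs  number,                 -- centiseconds, from dbms_utility.get_time
  logged_at   date default sysdate
);

-- each emulated client wraps its work roughly like this
declare
    l_start  number;
begin
    l_start := dbms_utility.get_time;
    -- ... run the SQL under test here ...
    insert into load_test_results
           ( test_run, client_id, step_name, elapsed_cs )
    values ( 'run_1', 42, 'order_lookup',
             dbms_utility.get_time - l_start );
    commit;
end;
/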
JMeter
TBarry, January 10, 2006 - 4:22 pm UTC
If you can work with Java, JMeter is an open-source tool from the Apache Jakarta project. I don't have a ton of experience with it, but it seems pretty flexible. You can load-test web apps and you can load-test database access, but that access will be through JDBC.
http://jakarta.apache.org/jmeter/index.html
emulating users,
sns, January 10, 2006 - 4:54 pm UTC
Well, 10K is probably a big number that we eventually want to try. But we certainly want to start with 200-300 users first and then increase the number.
So for 200-300 users (clients), is there a simple way?
Thanks,
January 10, 2006 - 7:30 pm UTC
take heed of the poster right above? or google around - you'll stumble upon many things.
I just plugged "load test tool" into google and got what looks like a really good start.
Is it Possible using DBMS_JOB?
A reader, January 10, 2006 - 10:43 pm UTC
Tom,
I was reading a book on PL/SQL and found that similar testing was done using DBMS_JOB.SUBMIT. Is it possible to simulate this requirement with DBMS_JOB?
Thanks.
January 12, 2006 - 10:02 am UTC
not realistically.
first, job_queue_processes is 1,000 maximum.
second, everything would take place on the database server - this is unlikely to mimic your actual implementation.
load testing is used to test your system - if you have clients submitting SQL from another machine, running everything (the clients and all) on the same machine as the database is not realistic.
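That said, if all you want is to kick off some concurrent activity inside the database and you can live with those limitations, the kind of thing the book was probably showing looks roughly like this (run_workload is a hypothetical procedure that would contain the SQL you want to execute):

declare
    l_job  binary_integer;
begin
    for i in 1 .. 200                 -- number of "users" to emulate
    loop
        dbms_job.submit( job  => l_job,
                         what => 'run_workload;' );
    end loop;
    commit;                           -- the jobs only start running after the commit
end;
/

How many of those actually run at the same time is still capped by job_queue_processes, and it all runs on the database server - which is exactly the point above.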
grinder
anil, January 11, 2006 - 3:50 am UTC
hi,
there is a free tool called Grinder that we have used recently to simulate up to 3,000 concurrent users. It is very good.
rgds
anil
Thanks guys,
sns, January 11, 2006 - 10:36 am UTC
One more thing: I don't want my web server to be put under the load/stress test. I have SQL statements that run against the database, and I want just the database to be put under load/stress by emulating whatever number of users.
Anil: Thanks for letting me know about GRINDER.
Thanks,
stress testing
ignorant, March 08, 2007 - 2:48 pm UTC
Hi Tom,
I absolutely subscribe to your view of stress testing the application, not the database/hardware. However in some cases we can only try to get as close to that ideal as possible.
Consider this: I am working on a 9i system that is about to go live. The problem is that I can only roughly estimate loads, etc. based on the current application's load, and that is not going to help much. In other words, I need to throw that out the window (except for the number of concurrent users) and devise a stress test before it goes live.
In the absence of LoadRunner or any such tool (I won't get money for it), what do you think of the following approach?
1) Get one user to log on to the system.
2) Find the SID and trace it.
3) Get the user to do all regular "expected" tasks. This has to be a small enough subset of what the clients will ultimately do but should model all critical processes.
4) Stop trace.
5) Modify the SQL statements to 'parameterize' them and put them into some sort of PL/SQL block (see the sketch after this list).
6) Fire this procedure off concurrently with different (random) values, up to no_of_concurrent_users (from the previous system) + 10% overhead.
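For steps 5 and 6, a sketch of what one captured statement might turn into (the ORDERS/CUSTOMER_ID names and the value range are made up for illustration):

create or replace procedure run_workload
as
    l_cust_id  number;
    l_cnt      number;
begin
    -- step 6: a random input value in a plausible range
    l_cust_id := trunc( dbms_random.value( 1, 100000 ) );

    -- step 5: a statement captured from the trace, rewritten with a bind
    select count(*)
      into l_cnt
      from orders
     where customer_id = l_cust_id;

    -- ... the rest of the captured statements go here ...
end;
/

That procedure could then be fired off concurrently - for example with a DBMS_JOB loop like the one sketched earlier (with its limitations), or from real client sessions.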
Drawbacks:
1) Clearly, the time to come up with this will be quite long. I am looking for any recommendations you might have to reduce the effort (capturing queries into a table, etc.).
2) This does not accurately model what will happen because the queries remain the same once compiled. This might not happen in the "real" scenario.
3) This approach ignores SQL*Net message (network round-trip) times, etc., which might be relevant when production switches over.
Given these drawbacks, it should still help in finding out ugly queries and disk contention. But is it worth the effort?
What is your opinion? Also, can this process be streamlined a little more?
Yours truly,
ignorant.
March 08, 2007 - 3:37 pm UTC
testing is not a trivial task - many people never schedule the time for it. It takes a significant amount of effort and (if you ask me) should be part of the development schedule from day one (rarely is). That is, as the guys writing the code are writing the code - they are also thinking about how to test drive this code and building that infrastructure.
Your approach is "perhaps OK, but won't be entirely real world". I'm not sure it will work, however - how do you parameterize that working set with inputs that actually "work" (that do not result in errors/failures)? That will be tricky.
Sorry, no magical silver bullets here.
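One way to soften the "inputs that actually work" problem is to sample real key values into a driver table up front and have the workload pick from that pool at random, instead of generating arbitrary values. A sketch, again with made-up table and column names:

-- build a pool of known-good inputs from the real data
create table workload_keys as
    select customer_id
      from customers sample (5);      -- a 5% sample of real keys

-- inside the workload, pick one of them at random to use as the bind value
declare
    l_cust_id  number;
begin
    select customer_id
      into l_cust_id
      from ( select customer_id
               from workload_keys
              order by dbms_random.value )
     where rownum = 1;
    -- ... run the captured statement with l_cust_id as the bind ...
end;
/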