Django FCGI Stress Test 01.02.2007

Stress-testing the nginx+flup combination seems to give positive results:

  # siege -d1 -r10 -c25 nodnod.net

  ** siege 2.65
  ** Preparing 25 concurrent users for battle.
  Transactions:                     250 hits
  Availability:                 100.00 %
  Elapsed time:                  10.01 secs
  Data transferred:               1.72 MB
  Response time:                  0.16 secs
  Transaction rate:              24.98 trans/sec
  Throughput:                     0.17 MB/sec
  Concurrency:                    4.08
  Successful transactions:         250
  Failed transactions:               0
  Longest transaction:            1.96
  Shortest transaction:           0.01

Not too bad, I suppose. I would be ecstatic if I could get 25 concurrent users for 10 whole repetitions.
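
For reference, the setup under siege is the usual arrangement: nginx in front, proxying over a Unix socket to a flup-managed FastCGI process that wraps Django’s WSGI handler. Here is a minimal sketch of the FastCGI side, roughly what ./manage.py runfcgi does; the socket path and settings module are placeholder assumptions, not the actual config:

  # run_fcgi.py -- serve Django's WSGI handler over FastCGI via flup,
  # listening on a Unix socket that nginx's fastcgi_pass directive points at.
  # The socket path and settings module below are placeholders.
  import os
  os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
  from django.core.handlers.wsgi import WSGIHandler
  from flup.server.fcgi import WSGIServer
  WSGIServer(WSGIHandler(), bindAddress='/tmp/django-fcgi.sock').run()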

But something a little more challenging (100 concurrent users instead of 25, each making 10 requests) reveals a different picture:

  # siege -d1 -r10 -c100 nodnod.net

  ** siege 2.65
  ** Preparing 100 concurrent users for battle.
  The server is now under siege..      done.
  Transactions:                    1000 hits
  Availability:                 100.00 %
  Elapsed time:                  22.06 secs
  Data transferred:               4.96 MB
  Response time:                  1.05 secs
  Transaction rate:              45.33 trans/sec
  Throughput:                     0.22 MB/sec
  Concurrency:                   47.70
  Successful transactions:         713
  Failed transactions:               0
  Longest transaction:            5.59
  Shortest transaction:           0.00

What does this mean?

It’s interesting that with 25 users the concurrency was about 4, meaning the server processed requests quickly enough that, on average, only about 4 connections were open at any given moment.

With 100 users, however, server performance dropped drastically and the concurrency jumped to a whopping 47.70. That means that, on average, about 47 requests were waiting on the server at any given moment.
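
As a sanity check on those two concurrency figures: by Little’s law, the average number of in-flight requests is roughly the transaction rate multiplied by the mean response time. Plugging in the numbers siege reported for both runs (a back-of-the-envelope check of mine, not something siege prints):

  # Little's law: average in-flight requests ~= transaction rate * mean response time
  print(24.98 * 0.16)   # ~4.0,  close to the 4.08 reported for 25 users
  print(45.33 * 1.05)   # ~47.6, close to the 47.70 reported for 100 users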

To be honest though, I’m not all that worried.

100 concurrent users, each making 10 requests, is a lot of visitors for this site. Yes, server performance decreased. But on the other hand, there were no failed transactions; everyone would eventually get to the page.

I have a feeling that mod_python would probably perform better than FCGI.

I also have not done any sort of developer-dictated caching yet. Any caching that is happening is done automagically by Django.
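
If those numbers ever become a real problem, the obvious next step is explicit caching. Here is a minimal sketch using Django’s low-level cache API; the view, the cache key, and the render_front_page() helper are made-up names for illustration:

  from django.core.cache import cache
  from django.http import HttpResponse

  def front_page(request):
      # Serve a cached copy of the rendered page if one exists.
      body = cache.get('front_page_html')
      if body is None:
          body = render_front_page()                  # hypothetical expensive render
          cache.set('front_page_html', body, 60 * 5)  # keep it for five minutes
      return HttpResponse(body)

The per-view cache_page decorator in Django’s cache framework accomplishes much the same thing per URL, without the manual key handling.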