[TIP] testing-in-python Digest, Vol 50, Issue 10

nyxtom at gmail.com
Tue Mar 15 13:02:35 PDT 2011


Firstly, I would like to say that the TiP BoF was amazing. I was surprised
by how many people attended, and it's awesome to see people "passionate
about testing" (said with an accent).

Beyond that, one thing I definitely learned from the TiP BoF and Gary
Bernhardt's talk "Units Needs Testing Too" (http://pycon.blip.tv/file/4879997/)
is that if you want fast unit tests, you need to write tests that cover
localized code, so you get quicker feedback.

This means reducing the number of integration points, or mocking out the
parts where an exhaustive test simply isn't necessary for the case at hand.
With Django, there seems to be a tendency to exhaustively test views. This
creates a scenario where your test integrates with context processors,
middleware, database filters, and probably a number of other items you may
be unaware of. In short, if your test case isn't about integration or
exhaustive acceptance, then mock out the areas that aren't required.
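
(As a rough sketch of the kind of localized test I mean -- assuming the
standalone "mock" library and a hypothetical helper summarize_orders that a
view would normally call -- you can exercise the logic without the view
stack, the middleware, or the database:)

    # sketch: test business logic directly, with the data layer mocked out
    import unittest
    from mock import Mock  # the standalone "mock" package (pip install mock)

    def summarize_orders(repo):
        # hypothetical helper that a Django view would normally call
        orders = repo.recent_orders()
        return {"count": len(orders), "total": sum(o["amount"] for o in orders)}

    class SummarizeOrdersTest(unittest.TestCase):
        def test_totals_recent_orders(self):
            repo = Mock()
            repo.recent_orders.return_value = [{"amount": 3}, {"amount": 7}]
            self.assertEqual(summarize_orders(repo),
                             {"count": 2, "total": 10})

    if __name__ == "__main__":
        unittest.main()

Nothing from Django gets loaded here, so a few hundred tests in this style
tend to run in a fraction of the time the full view stack would take.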

At my company we're working on moving towards a more localized unit test
suite, while still maintaining our integration test suite. Running the
localized unit test suite lets us run hundreds of tests in about a second,
whereas our integration and automation tests take longer but don't run as
often.
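
(One way to split the two -- a sketch, assuming Python 2.7's unittest or the
unittest2 backport, plus a RUN_INTEGRATION environment variable that only
the build server sets:)

    # sketch: keep slow integration tests out of the default, fast run
    import os
    import unittest

    RUN_INTEGRATION = os.environ.get("RUN_INTEGRATION") == "1"

    class PricingTest(unittest.TestCase):
        """Pure business logic -- part of the fast, localized suite."""
        def test_net_price(self):
            self.assertEqual(200 * (100 - 10) // 100, 180)

    @unittest.skipUnless(RUN_INTEGRATION, "set RUN_INTEGRATION=1 to run")
    class CheckoutEndToEndTest(unittest.TestCase):
        """Touches the database and the full stack -- slow, run less often."""
        def test_checkout_flow(self):
            self.assertTrue(True)  # placeholder for the real end-to-end check

The fast suite then runs on every change, and the integration suite runs
with RUN_INTEGRATION=1 on the build server.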

That's all I can say, but I'm sure glad to be a part of this new community
I've discovered!

- Thomas

On Tue, Mar 15, 2011 at 3:00 PM,
<testing-in-python-request at lists.idyll.org> wrote:

>
> Today's Topics:
>
>   1. TiP BoF feedback & call for help (Jorge Vargas)
>   2. Re: TiP BoF feedback & call for help (Alfredo Deza)
>   3. Re: TiP BoF feedback & call for help (Mark Sienkiewicz)
>   4. Re: TiP BoF feedback & call for help (Greg Turnquist)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 14 Mar 2011 15:11:11 -0400
> From: Jorge Vargas <jorge.vargas at gmail.com>
> Subject: [TIP] TiP BoF feedback & call for help
> To: TesttingInPython <testing-in-python at lists.idyll.org>
> Message-ID:
>        <AANLkTi=VfnYuc-w+-0boaby0hchgr2GXM=86ZhXni2DA at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hello all,
>
> First of all, although I don't think anyone here holds a different
> opinion: I totally loved it.
>
> It's a shame I missed it last year.
>
> Second (and I'm sorry if you did cover this, but I got in a little late):
> I thought it would be more like other BoFs, where people discuss things
> rather than give lightning talks, because I was hoping to get some help
> figuring out why our tests are so slow. So here is my lightning talk
> regarding that.
>
> Hi, I got a problem (halp!)
> <picture of a sad kitty>
> My tests run slooooooooow
> But I don't know who to blame.
> This is Django, so it's probably its fault.
> <haters gonna hate>
> Or perhaps it's that my boss does not want to let me move them over to
> nose, so we're still stuck on vanilla unittest :(
> Or perhaps it's just a bad setup? Too many fixtures lying around?
> Or maybe it's some of the craziness we do to have a standalone package
> that does not require a Django project.
> We currently have 501 tests on our models alone.
> ALL of them just check our business logic and raise validation errors
> for different problems and such.
> Running under Jenkins on a VPS against both sqlite and postgres (so 1002
> runs), it takes anywhere from 20-30 minutes.
>
> So who is wrong? Me? Us? Jenkins? Django? unittest? Who to blame?
> I'd love to run my tests more, but a 30-minute feedback loop is just crazy.
>
> Looking forward to Santa Clara!
>
>
>
> ------------------------------
>
> Message: 2
> Date: Mon, 14 Mar 2011 15:26:19 -0400
> From: Alfredo Deza <alfredodeza at gmail.com>
> Subject: Re: [TIP] TiP BoF feedback & call for help
> To: Jorge Vargas <jorge.vargas at gmail.com>
> Cc: TesttingInPython <testing-in-python at lists.idyll.org>
> Message-ID:
>        <AANLkTikq9sr+1r0B1Gfx7w7E7Otox8rP8LC1sYeJt4E7 at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> On Mon, Mar 14, 2011 at 3:11 PM, Jorge Vargas <jorge.vargas at gmail.com> wrote:
>
> > So who is wrong? Me? Us? Jenkins? Django? unittest? Who to blame?
> > I'd love to run my tests more, but a 30-minute feedback loop is just crazy.
>
> Independently of why they are slow, I would play with py.test and the
> xdist plugin, which lets you run tests in parallel.
>
> At work, we went from 20-minute test runs to around 5 minutes.
>
> Since your tests are "vanilla unittest", you should be able to try out
> py.test without issues.
>
> For best results, try to match the number of processes to the number of
> cores on your machine.
>
> On our 4-core testing server, the command to run in parallel looks like
> this:
>
>    py.test -n 4
>
> It should work out of the box :)
>
> You need to install the following:
>
> pip install pytest
> pip install pytest-xdist
>
> In that order. Please do post back if you try it and let us know how it went!
>
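> (As a tiny sketch -- assuming a hypothetical file test_pricing.py -- a
> plain unittest TestCase needs no changes at all for py.test to collect it
> and spread it across the workers:)
>
>     # test_pricing.py -- vanilla unittest, collected by py.test as-is
>     import unittest
>
>     class DiscountTest(unittest.TestCase):
>         def test_ten_percent_off(self):
>             self.assertEqual(50 - 50 // 10, 45)
>
>     # run everything on 4 workers:
>     #   py.test -n 4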
>
> ------------------------------
>
> Message: 3
> Date: Mon, 14 Mar 2011 18:05:29 -0400
> From: Mark Sienkiewicz <sienkiew at stsci.edu>
> Subject: Re: [TIP] TiP BoF feedback & call for help
> To: Jorge Vargas <jorge.vargas at gmail.com>
> Cc: TesttingInPython <testing-in-python at lists.idyll.org>
> Message-ID: <4D7E9129.30306 at stsci.edu>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
>
> > So who is wrong? Me? Us? Jenkins? Django? unittest? Who to blame?
> > I'd love to run my tests more, but a 30-minute feedback loop is just crazy.
> >
>
> You'll just have to start measuring things.  Blaming Django doesn't get
> you anywhere even if it is accurate -- you have to understand what is
> happening in your system so you can know what you can do differently.
>
> First, how do you know there is anything wrong?  Your average throughput
> works out to about 1.8 seconds per test.  Is it possible that 20-30
> minutes is actually _fast_ for what you are doing?  If not, how do you
> know?
>
> Are all the tests taking 1.8 seconds +/- 2%?  Or do 90% of your tests
> take 0.1 seconds each and 10% of your tests take 17 seconds each?  What
> do the fast ones have in common?  What do the slow ones have in common?
>
> If you run a single test, how long does it take?  If you run 10 tests,
> does it take 10 times as long?  Do 100 tests take 100 times as long?
>
> If you're using unittest, how long does it spend in the test method, how
> long in the class setup, how long in the overhead code?
>
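> (If you want per-test numbers, here is a rough sketch -- nothing
> Django-specific, just a mixin that stamps each test with its wall-clock
> time:)
>
>     # sketch: print roughly how long each test takes
>     import sys
>     import time
>     import unittest
>
>     class TimedTestCase(unittest.TestCase):
>         def setUp(self):
>             self._t0 = time.time()
>
>         def tearDown(self):
>             elapsed = time.time() - self._t0
>             sys.stderr.write("%-60s %.3fs\n" % (self.id(), elapsed))
>
>     # have your test classes inherit from TimedTestCase (and call the
>     # super setUp/tearDown) to see which tests eat the time
>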
> Do your tests log in to the postgres server 500 times, or only once?
> And are the postgres tests faster or slower than the same tests run on
> sqlite?
>
> Do you run all your tests in a single process?  If not, how many
> processes and how much process creation time?  (I ran truss on my python
> interpreter once and saw about 3000 file open calls just to start an
> interactive interpreter.)
>
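> (A quick way to put a rough number on the per-process cost, assuming
> django is importable from a plain shell:)
>
>     # how long does just starting Python and importing django take?
>     time python -c "import django"
>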
> Once you collect some data (maybe even just a few minutes - you might
> not need a whole test run), you have some clues.  There is no point in
> optimizing until you know what needs to be better.  If you just poke
> about at random, you might completely optimize something that is
> responsible for 1% of your run time -- but you don't care if you save 18
> seconds out of half an hour.
>
> Mark
>
>
>
>
> ------------------------------
>
> Message: 4
> Date: Mon, 14 Mar 2011 17:15:38 -0500
> From: Greg Turnquist <greg.l.turnquist at gmail.com>
> Subject: Re: [TIP] TiP BoF feedback & call for help
> To: TesttingInPython <testing-in-python at lists.idyll.org>
> Message-ID:
>        <AANLkTimg6hsBCubDg8GQpFAYr=+zjg9cmUscwF3qEp07 at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> As they say, premature optimization is the root of all evil. Profile and
> determine where all your time is being spent.
>
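> (For instance -- a rough first pass with nothing but the stdlib profiler,
> assuming the usual manage.py entry point and a hypothetical app name
> "myapp":)
>
>     # profile a full test run and dump the stats to a file
>     python -m cProfile -o tests.prof manage.py test myapp
>
>     # then look at the biggest cumulative offenders
>     python -c "import pstats; pstats.Stats('tests.prof').sort_stats('cumulative').print_stats(20)"
>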
> On Mon, Mar 14, 2011 at 5:05 PM, Mark Sienkiewicz <sienkiew at stsci.edu> wrote:
>
> > You'll just have to start measuring things.  Blaming Django doesn't get
> > you anywhere even if it is accurate -- you have to understand what is
> > happening in your system so you can know what you can do differently.
>
> --
> Greg Turnquist (Greg.L.Turnquist at gmail.com)
>
> ------------------------------
>
>
> End of testing-in-python Digest, Vol 50, Issue 10
> *************************************************
>