[TIP] Test reporting in unittest (and plugins)

Michael Foord michael at voidspace.org.uk
Tue Aug 3 05:10:28 PDT 2010

On 03/08/2010 12:59, Jonathan Lange wrote:
> On Tue, Aug 3, 2010 at 12:23 AM, Michael Foord<michael at voidspace.org.uk>  wrote:
>> On 03/08/2010 00:14, Robert Collins wrote:
>>> Michael,
>>>      I'd *really* love it if you grabbed testtools, read up on the
>>> TestCase.addDetail API and looked at the modified contract for
>>> addSuccess/addError/addFailure/addSkip etc which is backwards
>>> compatible, provides a rich medium for additional data - and we're
>>> having *great* success with additional things building on this
>>> foundation to include additional data automatically.
>>> As for test naming - I used TestCase.id() *everywhere* in the UI now,
> the description interface is terrible as a machine-manageable
>>> interface, and most of the automated workflows being built on test
>>> frameworks want machine processing of data - see for instance
>>> http://pypi.python.org/pypi/testrepository or
>>> http://pypi.python.org/pypi/junitxml.
> ...
>> I can *look* at the projects you describe, and *hopefully* the documentation
>> will tell me what I need to know. I won't have time to go spelunking too
>> deeply into the source code to work out what they are doing though. :-(
>> Hopefully that won't be necessary.
> Well, of course you can also just steal the code and trust us that it
> works, which would actually save time for everyone :P

I doubt that I can, unfortunately. It needs to work with the plugin 
system and support all the use cases there.

> Anyway, I can try to give some examples of how we use
> TestCase.addDetail. I may be wrong on the details.
> In bzrlib, when tests fail, we add the relevant section of the
> .bzr.log file as a text/plain attachment. bzr has a custom reporter
> which calls getDetails() on tests to render the log appropriately. The
> mechanism is used for benchmarks and subprocess stderr dumping.
> In Launchpad, we use addDetail to store OOPS reports for tests that
> generate them. OOPS reports occur when there's an unhandled error in
> the webapp or our backend processes, and contain query logs and all
> sorts of yummy debugging stuff.
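To make sure I follow the pattern you're describing: roughly something like the sketch below? (This is a standalone imitation, not the real testtools implementation -- the `text_content` stand-in and the `"bzr-log"` detail name are made up for illustration.)

```python
import unittest

def text_content(text):
    # Stand-in for testtools.content.text_content: pair a payload with
    # its MIME type so a reporter knows how to render it.
    return ("text/plain", text)

class LogAttachingTest(unittest.TestCase):
    """Sketch of the bzr-style pattern: attach a log excerpt as a detail."""

    def setUp(self):
        self._details = {}

    def addDetail(self, name, content):
        # testtools keeps details in a dict of short name -> MIME content.
        self._details[name] = content

    def getDetails(self):
        return self._details

    def test_something(self):
        # A custom TestResult can call getDetails() after the run and
        # render the attachment; bzr attaches the relevant .bzr.log slice.
        self.addDetail("bzr-log", text_content("fake .bzr.log excerpt"))
        self.assertTrue(True)
```

The key point, if I've understood it, is that the test adds whatever it likes and the result object pulls the details out when *it* sees fit, so older result objects that never call getDetails() are unaffected.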
> AIUI, the difference between what we've developed for our own,
> genuine, real-world, honest-to-goodness, proven-in-the-wild needs and
> what's suggested in your post are:

Note that I have some specific use cases for porting genuine, real-world, 
honest-to-goodness, proven-in-the-wild nose plugins. The more use cases 
the better, though. :-)

>    * In testtools, the TestCase adds information as it sees fit using
> TestCase.addDetail. The TestResult gets the information whenever it
> sees fit using TestCase.getDetails.


>    * testtools stores this stuff as a dict of short name ->
> mime-encoded stuff. Strengths of this are that it's relatively easy to
> serialize unexpectedly complex data (which leads to easier
> parallelism). Weaknesses are that it's a bit heavyweight.

Right - parallelism is an important story that I will probably be 
working on next.
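That's presumably why the MIME-typed dict helps here: because every detail carries its own content type, a runner can serialize the whole lot for transport between worker processes without understanding any individual payload. A rough sketch of the idea (the detail names and the JSON wire format are my invention, not what testtools actually does):

```python
import json

# Details as a dict of short name -> (mime_type, payload), per the
# testtools contract described above.
details = {
    "log": ("text/plain", "tail of .bzr.log ..."),
    "oops": ("text/plain", "OOPS query log ..."),
}

def serialize_details(details):
    """Flatten the details dict to JSON for transport between processes."""
    return json.dumps(
        {name: {"content_type": ctype, "payload": payload}
         for name, (ctype, payload) in details.items()}
    )

wire = serialize_details(details)
restored = json.loads(wire)
assert restored["log"]["content_type"] == "text/plain"
```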

>   * testtools doesn't use this mechanism to support custom outcomes

What is the mechanism (and use cases) for custom outcomes?

>   * testtools.addDetail is fully backwards compatible with older
> TestResult objects – they just don't display the new data

This actually sounds *fairly* different from what I have in mind, 
although my system *would* support arbitrary metadata, so you could add 
mimetyped data if you want (mimetyped - really! :-).

My thinking is to add the APIs for this in the event and on the result 
object. I will probably create a "Report" data structure (object) that 
holds all the details, including arbitrary metadata.
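Something like this, perhaps (every name here is invented just to make the idea concrete -- nothing like this exists yet):

```python
class Report(object):
    """Hypothetical container for a test outcome plus arbitrary metadata.

    All names are illustrative; the real API is still to be designed.
    """

    def __init__(self, test_id, outcome):
        self.test_id = test_id    # machine-manageable id, per TestCase.id()
        self.outcome = outcome    # e.g. "success", "failure", "skip"
        self.metadata = {}        # arbitrary key -> value annotations

    def add_metadata(self, key, value):
        # Values could be MIME-typed pairs, as in testtools, or anything else.
        self.metadata[key] = value

report = Report("mypackage.tests.TestFoo.test_bar", "failure")
report.add_metadata("log", ("text/plain", "captured output ..."))
```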

I'll sketch something up and come back. Thanks *very* much for replying 
to this. This information will make it a lot easier to understand the 
testtools code.

All the best,


> Hope this helps,
> jml

