[TIP] decorators in testing : warning support for TestCase

Olemis Lang olemis at gmail.com
Mon Apr 6 06:21:26 PDT 2009


On Sat, Apr 4, 2009 at 7:48 AM, Michael Foord <fuzzyman at voidspace.org.uk> wrote:
>
> Any other useful decorators provided by the xUnit frameworks?
>

I assume you have read my previous message where I talk about the
confusion around the word «decorator», since it means two different
things for Python and for Java ... with that in mind, here is my
first post about the test decorators I've implemented so far.

One example that illustrates test decorators is extending the
current capabilities of test cases so they can issue and record
warnings. This could be useful to highlight a state which does not
really represent a failure, but can be considered suspicious or may
be of interest to the tester. The following use cases come to mind:

- The recent thread about highlighting the tests that were previously
  run but didn't show up at testing time ... this is not precisely an
  error, but perhaps it means that something weird was going on, or
  maybe not ;).

- The same use case we are talking about ... if setUp fails there are
  mainly three options to consider (see the sketch after this list) :

  * signal a test failure ... but IMHO this *SHOULD NOT* be done,
    since the test itself did not fail ... it just couldn't be
    performed ... so a better approach is ...
  * ... to signal an error, indicating that the test code was not
    accurate and needs to be fixed. But maybe that is not the case
    either. Perhaps under certain circumstances the lack of resources
    is not a problem with the test at all, but a consequence of a
    specific config ... and in that case maybe it is better ...
  * ... to log a warning message ... ;)

- Maybe more ... ;)
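
Just to make the setUp case concrete, here is a rough sketch (not my
actual code) of the three options above, for a setUp that cannot
acquire a resource. Note that `self.warn` is assumed to be provided
by the warning-aware decorator described further below ... it is not
part of stock unittest ::

    import unittest

    class ScarceResourceTest(unittest.TestCase):

        def setUp(self):
            try:
                # stand-in for some scarce external resource
                self.resource = open('/etc/scarce-resource.cfg')
            except IOError:
                # option 1: self.fail('resource missing') -> reported as a
                #           failure, arguably wrong since the test never ran
                # option 2: raise -> reported as an error, i.e. the test
                #           code itself is considered broken
                # option 3: just record a warning ... the missing resource
                #           may simply be a consequence of this configuration
                self.resource = None
                self.warn('scarce resource unavailable in this config')

        def test_resource_is_not_empty(self):
            if self.resource is None:
                return      # nothing to assert, warning already logged
            self.assertTrue(self.resource.read())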

I tried to use this kind of thing while testing CASPy (Comprehensive
Assertion Support for Python ;) which was actually developed by the
FLiOOPS project and was also the first motivation for building the
dutest module. The test suite in there is based on the idea of what
I call (possibly erroneously ... so CMIIW ...) «functional
benchmarking», since I wanted to compare how the library performed
with respect to similar implementations, considering how well it
satisfied DbC semantics. However, I found some features to be
undoubtedly relevant (since they were directly related to the
underlying semantics ...), and others more cosmetic (e.g. string
interpolation for error messages, etc.). So the test cases for the
latter must not be reported as failures (since a specific library
might not implement the feature, and that's just fine ...), but I
still wanted to record such situations in order to say «CASPy does #
things that ipdbc does not incorporate» or vice versa ... ;)

The first approach I envisioned to get this done required modifying
the run method ... since it had to intercept some exceptions and do
things a little bit differently (so I perceived that the run method
was a little bit monolithic -not too much- and could be refactored
-Template Method ?- in a way that different parts of it could be
overridden so as to offer specialized behaviors without needing to
reimplement the whole method ;).
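
To make the Template Method idea concrete, something along these
lines (a rough sketch only ... this is neither the stdlib code nor my
actual implementation) is what I have in mind : each step of run
becomes an overridable hook ::

    import sys
    import unittest

    class RefactoredTestCase(unittest.TestCase):
        """Sketch: run() as a Template Method, each step a hook."""

        def run(self, result=None):
            if result is None:
                result = self.defaultTestResult()
            result.startTest(self)
            try:
                ok = self._run_fixture(result, self.setUp)
                if ok:
                    ok = self._run_test_method(result)
                    ok = self._run_fixture(result, self.tearDown) and ok
                if ok:
                    result.addSuccess(self)
            finally:
                result.stopTest(self)

        def _run_fixture(self, result, step):
            # hook : a warning-aware subclass could record a warning here
            # instead of an error when e.g. setUp cannot find a resource
            try:
                step()
                return True
            except KeyboardInterrupt:
                raise
            except Exception:
                result.addError(self, sys.exc_info())
                return False

        def _run_test_method(self, result):
            # hook : decides whether an exception is a failure or an error
            # (_testMethodName is the attribute stock unittest uses)
            try:
                getattr(self, self._testMethodName)()
                return True
            except self.failureException:
                result.addFailure(self, sys.exc_info())
            except KeyboardInterrupt:
                raise
            except Exception:
                result.addError(self, sys.exc_info())
            return False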

The actual implementation [1]_ includes some extensions for text
test runners and needs to be improved (... yes, I don't have time
for this right now ...), but it works like the GoF decorator pattern
(therefore aggregation is used instead of inheritance ...) and a
wrapper is placed on top of the test case ... The decorator acts like
«alpha blending» in computer graphics ... it adds the warn method to
log warnings in a separate list in instances of TestResult ... but
for any other attribute access it passes the (get|set)attr request on
to the original instance of the test case ... so, the classes using
instances of the decorator see no difference with respect to the
original | underlying test case instance ...
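
Names like WarningTestResult, WarningTestDecorator and addWarning
below are purely illustrative (the real class is
oop.utils.testing.WarnTestCase, see [1]_), but a minimal sketch of
the wrapper idea could look like this ::

    import unittest

    class WarningTestResult(unittest.TestResult):
        """TestResult that also keeps warnings in a separate list."""

        def __init__(self):
            unittest.TestResult.__init__(self)
            self.warnings = []

        def addWarning(self, test, message):
            self.warnings.append((test, message))

    class WarningTestDecorator(object):
        """GoF-style decorator : wraps a TestCase *instance* (aggregation,
        not inheritance), adds warn(), and forwards everything else."""

        def __init__(self, test):
            self.__dict__['_test'] = test
            self.__dict__['_result'] = None

        def run(self, result=None):
            self.__dict__['_result'] = result
            # lend warn() to the wrapped test for the duration of the run,
            # so test (and fixture) methods can simply call self.warn(...)
            self._test.warn = self.warn
            try:
                return self._test.run(result)
            finally:
                del self._test.warn

        def warn(self, message):
            result = self.__dict__['_result']
            if result is not None and hasattr(result, 'addWarning'):
                result.addWarning(self._test, message)

        # «alpha blending» : any attribute we do not define comes from
        # (or goes to) the wrapped test case, so callers see no difference
        def __getattr__(self, name):
            return getattr(self.__dict__['_test'], name)

        def __setattr__(self, name, value):
            setattr(self.__dict__['_test'], name, value)

Used like this (also a sketch), the wrapped test behaves exactly like
the original one, except that warnings end up on the result ::

    class Sample(unittest.TestCase):
        def test_optional_feature(self):
            self.warn('string interpolation for error messages missing')

    result = WarningTestResult()
    WarningTestDecorator(Sample('test_optional_feature')).run(result)
    print(result.warnings)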

The foreseen approach (as opposed to the current one, which is based
on mixins) has some advantages :

- It is non-invasive ... TestCase remains just the same ...
- Any existing test case can be enhanced to have warning support by
  wrapping it in an instance of the specific test decorator ...

This is one of the classes I plan (... and have planned for a very
long time ...) to include in dutest, together with integration with
the warnings and | or logging modules ... once I can ;)
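
Regarding the warnings module, one possible shape for that integration
(purely speculative ... this is not a committed dutest API, and it
builds on the WarningTestDecorator sketched above) would be to capture
the warnings raised while the test runs and feed them to the same
addWarning hook ::

    import warnings

    class PyWarningsTestDecorator(WarningTestDecorator):
        """Also records warnings issued via the standard warnings module."""

        def run(self, result=None):
            # requires Python 2.6+ for warnings.catch_warnings(record=True)
            with warnings.catch_warnings(record=True) as caught:
                warnings.simplefilter('always')
                outcome = WarningTestDecorator.run(self, result)
            if result is not None and hasattr(result, 'addWarning'):
                for w in caught:
                    result.addWarning(self._test, str(w.message))
            return outcome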

The results are as follows :

  - PEP 306             FAILED (failures=35), WARNINGS = 42
  - ipdbc               FAILED (failures=48, errors=1) WARNINGS = 20
  - oop.dbc (CASPy)     FAILED (failures=4)
  - pydbc               FAILED (failures=33, errors=1)
  - Reinhold Plösch's DbC package could not be tested since it is
    not publicly available ... :-/

How impartial these test results are ... you'll find out about that
in a separate thread, since another use case for test decorators is
involved ...

Further comments about the results ... :
    * errors shown indicate test cases that should fail but didn't.
      This happens because I used the class `NoFailureError` to track
      these situations, and it is not a subclass of AssertionError ...
      perhaps I have to fix that ... :-/ (see the short illustration
      after this list)
    * Most of the failures counted for the PEP 306 reference impl are
      related to some very specific rules about the interactions
      between contracts and behavioral subtyping, and also to the
      fact that CASPy has not *pronounced an official position*
      (... uffffff ... said like this it seems like a big deal ...)
      about that ... However, some of them do fail. So please, don't
      feel bad about that ... yet ... ;)
    * Further work has been done (when possible and obvious ;) so
      that the test cases written for a particular framework could
      pass. This applies to ipdbc and pydbc, but the same was not
      done for PEP 306 since it is neither obvious nor simple to
      provide the extra behavior without modifying its source code.
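
For completeness, the failure | error distinction mentioned in the
first point comes straight from unittest : only exceptions that are
instances of TestCase.failureException (AssertionError by default)
are counted as failures, everything else becomes an error. A tiny
illustration, with NoFailureError written here as a stand-in for the
class mentioned above ::

    import unittest

    class NoFailureError(Exception):
        """Stand-in : raised when a check expected to fail did not."""

    class Demo(unittest.TestCase):
        def test_missed_contract_violation(self):
            raise NoFailureError('the contract violation went undetected')

    result = unittest.TestResult()
    Demo('test_missed_contract_violation').run(result)
    print(len(result.failures), len(result.errors))    # -> 0 1

    # deriving NoFailureError from AssertionError (i.e. from
    # TestCase.failureException) would turn it into a failure instead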

.. [1] oop.utils.testing.WarnTestCase et al.
       (https://apps.sourceforge.net/trac/flioops/browser/py/trunk/utils/testing.py?rev=83&marks=41-118#L39)

PS: I apologize if this is too long to read ... I posted this in a
separate thread to get feedback only on this topic, if any ... I hope
you don't mind ;)

-- 
Regards,

Olemis.

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/

Featured article:
No me gustan los templates de Django ...


