[TIP] Warning testing/catch_warnings context

Sebastian Berg sebastian at sipsolutions.net
Tue Jun 21 00:26:25 PDT 2016


Hi all,

not sure if this is the right place, or whether anyone is interested, but I
was wondering about broader interest in reliably testing warnings. (tl;dr:
I was thinking of a new warning context for numpy, in
https://github.com/numpy/numpy/pull/7763 and it might be interesting to
people frustrated with warning testing.)

Basically, the story is that in NumPy I have been seriously annoyed at not
being able to test warnings reliably. This means, on the one hand,
suppressing/filtering away warnings in tests where we do not care about
them, and on the other hand actually being able to check that they are
given correctly elsewhere.
Additionally, I was annoyed that when creating a new warning it is often
impossible to see which parts of the code are actually affected, because
the warning does not show up everywhere.

The biggest problem is that before a certain Python 3.4 point release,
this type of code is not reliable due to a bug (the new behaviour has
other oddities maybe, but it solves this issue):

```
import warnings

# First, ignore the warning somewhere:
with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)
    function_causing_deprecation_warning()

# Later, try to record the same warning:
with warnings.catch_warnings(record=True):
    warnings.simplefilter("always", DeprecationWarning)
    # Depending on the warning's location, the warning may not be given,
    # because the first block left it cached in the module's
    # __warningregistry__:
    same_or_other_function_causing_same_warning()
```
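On affected Python versions, the usual workaround (and something a test helper can do for you) is to clear the per-module `__warningregistry__` cache before recording. Here is a minimal sketch, assuming the warning is raised from the current module; `emit` is just a stand-in for a real warning-raising function:

```
import sys
import warnings

def emit():
    # Stand-in for a library function that warns:
    warnings.warn("this API is deprecated", DeprecationWarning, stacklevel=2)

# Clear the registry of the module that calls warnings.warn(), so that a
# warning previously filtered at that location is shown (and recorded) again:
mod = sys.modules[__name__]
if hasattr(mod, "__warningregistry__"):
    mod.__warningregistry__.clear()

with warnings.catch_warnings(record=True) as log:
    warnings.simplefilter("always", DeprecationWarning)
    emit()

assert len(log) == 1
```

On fixed Python versions the registry clearing is harmless, so a helper can do it unconditionally.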

The same happens if a catch_warnings context with recording catches an
extra warning. (That extra warning should perhaps be set to error, though
IMO ideally not in release versions: a new downstream deprecation warning
should not fail the tests, but it would be good for it to be visible.)


Another thing is that nesting these contexts is not always cleanly
possible. Also, catching one specific warning while printing (not
raising) other warnings is difficult or impossible, since the inner
catch_warnings context will catch all warnings set to "always".
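The nesting problem can be shown in a few lines: once an inner recording context is active, it replaces `showwarning` and swallows everything, so an outer context that would have liked to see (and e.g. print) unrelated warnings never receives them. A minimal sketch:

```
import warnings

with warnings.catch_warnings(record=True) as outer:
    warnings.simplefilter("always")
    with warnings.catch_warnings(record=True) as inner:
        # Meant to test only for DeprecationWarning...
        warnings.simplefilter("always", DeprecationWarning)
        warnings.warn("unrelated", UserWarning)
        warnings.warn("old API", DeprecationWarning)

# ...but the inner context caught *everything* set to "always", and the
# outer context (which might have printed the UserWarning) saw nothing:
assert len(inner) == 2
assert len(outer) == 0
```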


Because the first bug in particular annoyed me so much, I implemented a
new catch_warnings context on steroids to get around these issues, for
possible inclusion in numpy: https://github.com/numpy/numpy/pull/7763
Now I have been wondering whether there may be broader interest in such a
hack/context manager, or am I the only one frustrated by warnings in
tests?
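For the curious, the rough shape of such a context manager could look like the sketch below (class and method names are hypothetical, not numpy's actual API from the PR): record everything, clear the `__warningregistry__` of the modules you care about on entry, suppress only explicitly filtered messages, and re-show the rest on exit:

```
import re
import warnings

class SuppressWarningsSketch:
    """Hedged sketch of a 'catch_warnings on steroids' context: records
    all warnings, clears the __warningregistry__ of the given modules on
    entry (working around the registry bug), and on exit re-shows any
    warning that was not explicitly suppressed."""

    def __init__(self, modules=()):
        self._modules = modules
        self._filters = []

    def filter(self, category=Warning, message=""):
        # Suppress warnings of this category whose message matches:
        self._filters.append((category, re.compile(message)))

    def __enter__(self):
        self._cw = warnings.catch_warnings(record=True)
        self.log = self._cw.__enter__()
        warnings.simplefilter("always")
        for mod in self._modules:
            if hasattr(mod, "__warningregistry__"):
                mod.__warningregistry__.clear()
        return self

    def __exit__(self, *exc_info):
        self._cw.__exit__(*exc_info)
        for w in self.log:
            suppressed = any(
                issubclass(w.category, cat) and pat.match(str(w.message))
                for cat, pat in self._filters
            )
            if not suppressed:
                # Re-show (print, or hand to an outer recorder) whatever
                # was not explicitly suppressed:
                warnings.showwarning(w.message, w.category, w.filename, w.lineno)


# Usage: suppress one specific deprecation, let everything else through.
with SuppressWarningsSketch() as sup:
    sup.filter(DeprecationWarning, "old")
    warnings.warn("old thing is deprecated", DeprecationWarning)

assert len(sup.log) == 1
```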

Regards,

Sebastian

More information about the testing-in-python mailing list