[TIP] Tools to measure effectiveness of tests with declarative code

Danny Staple dastaple at cisco.com
Tue Apr 30 09:41:48 PDT 2019


Hello,
I am trying to build a metric that shows how effective tests are around Python code that is declarative in nature. In this case it is a Django app using the REST Framework: class-based/generic views, serialisers, Django filters, etc. This code is all configuration code. If put through a line-coverage tool (like coverage.py), all of the declaration lines will have been covered, since they execute at import time; however, this says nothing about whether the tests have exercised the filters they define, or even attempted to hit the views.
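To illustrate the problem, here is a minimal sketch in plain Python; the class and attribute names are hypothetical stand-ins for a DRF viewset, not real framework code. Every line in a declarative class body runs the moment the module is imported, so a line-coverage tool marks them covered even if no test ever exercises them:

```python
# Hypothetical stand-in for a rest_framework ModelViewSet declaration.
# All three attribute assignments below execute at import time, before
# any test has made a single request.
class ProductViewSet:
    queryset_name = "Product.objects.all()"      # stand-in for `queryset`
    serializer_name = "ProductSerializer"        # stand-in for `serializer_class`
    filterset_fields = ["name", "min_price", "max_price"]

# Merely importing the module is enough to "cover" those lines; nothing
# here proves a test ever hit the view or applied one of the filters.
assert ProductViewSet.filterset_fields == ["name", "min_price", "max_price"]
```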

I am looking for a tool I could use, for example when starting `./manage.py test myapp`, that would show this. A good start would be a tool that shows URL or view coverage in Django.
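In the absence of a dedicated tool, one way to approximate view coverage is a test-only middleware that records which view callables actually handle requests during the test run. This is a sketch of the idea, not an existing package; the names `ViewCoverageMiddleware` and `seen_views` are assumptions:

```python
# Hypothetical test-only Django middleware that records which views are
# hit during a test run. Add it to MIDDLEWARE in the test settings; after
# the run, compare `seen_views` against the views declared in urls.py
# (e.g. walked via django.urls.get_resolver()) to find untested views.

seen_views = set()

class ViewCoverageMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        return self.get_response(request)

    def process_view(self, request, view_func, view_args, view_kwargs):
        # Record the dotted path of the view about to handle this request.
        seen_views.add(f"{view_func.__module__}.{view_func.__qualname__}")
        return None  # returning None lets Django continue normal dispatch
```

`process_view` is called by Django just before the resolved view runs, which makes it a convenient hook for this kind of bookkeeping without touching the views themselves.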

I have been searching in vain for such a tool, with mutmut (mutation-based testing) being perhaps the closest I’ve found. Are there other tools in the Python testing ecosystem that would do this, i.e. give me visibility of views and filters actually being used after they are declared?

Thanks,
Danny
