[TIP] Identifying slowest py.test fixtures

André Caron andre.l.caron at gmail.com
Sat Jul 23 12:20:26 PDT 2016

@Bruno: thanks, that's really enlightening!  In hindsight, this is rather
obvious.  Still getting the hang of this :-)

I'll see if I can play around with a development build to flesh out my
plug-in so that I can roll it out on my projects as soon as the 3.0 release
is out.

@René: pytest-profiling does that and it is quite an interesting
suggestion, at least as a temporary workaround.  However, my experience with
function-level profiling is that it gives too much detail for this use
case: you have to track down the fixture functions yourself and mentally
filter out all the noise in the graph to get a clear picture of where time
is spent.  This is useful for a one-off analysis, a kind of "blitz", but
I'd like a simplified text report shown on each build so that I routinely
take a quick look at the data.
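For context, the noise problem described above is easy to reproduce with the
standard-library profiler (which is essentially what pytest-profiling wraps).
A minimal sketch, where `expensive_fixture_setup` is a hypothetical stand-in
for a slow fixture rather than anything from this thread:

```python
import cProfile
import io
import pstats

# Hypothetical stand-in for an expensive fixture setup function.
def expensive_fixture_setup():
    return sum(i * i for i in range(200000))

# Profile it the way pytest-profiling would profile a whole test session.
profiler = cProfile.Profile()
profiler.enable()
expensive_fixture_setup()
profiler.disable()

# Render a function-level report to a string.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
report = stream.getvalue()

# The fixture appears in the report, but buried among every other call in
# the session; you still have to hunt for it by name.
print("expensive_fixture_setup" in report)
```

In a real session the report contains hundreds of rows, which is exactly the
manual filtering the paragraph above objects to.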


On Sat, Jul 23, 2016 at 11:25 AM, René Dudfield <renesd at gmail.com> wrote:

> What about running pytest under the python profiler?
> On Sat, Jul 23, 2016 at 4:42 PM, Bruno Oliveira <nicoddemus at gmail.com>
> wrote:
>> On Sat, Jul 23, 2016 at 11:29 AM André Caron <andre.l.caron at gmail.com>
>> wrote:
>>> These two new hooks look like what I was looking for, so I guess that
>>> yes, this will help a lot :-)  Do you have a ballpark figure of how soon
>>> is "soon"?
>> We are aiming for a release right after EuroPython, probably on the week
>> of August 1st.
>>> Unfortunately, printing is subject to log capture plug-ins, so it's very
>>> inconvenient.  I see your example plug-in uses printing as well.
>> Oh sorry, I meant to demonstrate only how to use the new hooks, certainly
>> printing is not the way to go.
>>> How would I go about printing a report at the end like the `--durations`
>>> argument does?  I assume I also need to use the `pytest_terminal_summary()`
>>> hook, but I'm not sure how to pass info from your example
>>> `pytest_fixture_setup()` to the reporting function.  Any hints?
>> You are right, `pytest_terminal_summary` is the best way to approach
>> this. As to how to pass the information, aside from using global variables
>> you could implement a plugin class yourself and register it during
>> `pytest_configure`:
>> import time
>>
>> import pytest
>>
>> class MeasureSetupsPlugin:
>>     def __init__(self):
>>         self.elapsed_times = {}
>>
>>     @pytest.hookimpl(hookwrapper=True)
>>     def pytest_fixture_setup(self, fixturedef, request):
>>         started = time.time()
>>         yield
>>         elapsed = time.time() - started
>>         self.elapsed_times.setdefault(fixturedef.argname, []).append(elapsed)
>>
>>     def pytest_terminal_summary(self, terminalreporter):
>>         terminalreporter.write_sep("=", "Fixture setup times (max)")
>>         for name, times in sorted(self.elapsed_times.items()):
>>             terminalreporter.write_line("%s %.3f" % (name, max(times)))
>>
>> def pytest_configure(config):
>>     config.pluginmanager.register(MeasureSetupsPlugin(), 'measure_setups')
>> You could decide to display all manner of different statistics during
>> pytest_terminal_summary, like number of setups, slowest setup, etc.
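The statistics Bruno mentions can all be derived from the same per-fixture
dict his plugin builds.  A sketch, using a hypothetical `elapsed_times` dict
with made-up fixture names, in the shape `MeasureSetupsPlugin` produces:

```python
# Hypothetical sample data in the shape MeasureSetupsPlugin collects:
# fixture name -> list of setup durations in seconds.
elapsed_times = {
    "db": [0.512, 0.498, 0.505],
    "tmp_config": [0.003, 0.002],
}

def summarize(times):
    # Per-fixture statistics: setup count, total, slowest, and mean.
    return {
        "count": len(times),
        "total": sum(times),
        "max": max(times),
        "mean": sum(times) / len(times),
    }

# Sort by total time spent so the worst offenders come first.
for name, times in sorted(elapsed_times.items(),
                          key=lambda item: -sum(item[1])):
    stats = summarize(times)
    print("%-12s count=%d total=%.3fs max=%.3fs mean=%.3fs"
          % (name, stats["count"], stats["total"],
             stats["max"], stats["mean"]))
```

In the plugin itself, the `print` calls would become
`terminalreporter.write_line` calls inside `pytest_terminal_summary`.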
>> Hope this helps!
>> Cheers,
>> Bruno.
>> _______________________________________________
>> testing-in-python mailing list
>> testing-in-python at lists.idyll.org
>> http://lists.idyll.org/listinfo/testing-in-python