[TIP] How to parameterize a fixture based on a parser option with py.test?

Wouter Overmeire lodagro at gmail.com
Fri Dec 13 06:30:20 PST 2013


2013/12/13 Wouter Overmeire <lodagro at gmail.com>

> Somehow I missed the point that it is possible to access the config object
> through the metafunc argument of pytest_generate_tests when I first looked
> at the docs. Based on your recommendations I gave it another go, problem
> solved. Thanks for your feedback!
>
> There is, however, another problem now. There is a big difference in the
> order of configuration / test execution when deferring the setup of
> parametrized resources.
> When using the params argument of the pytest.fixture decorator, the tests
> are run such that for a given config of the platform, all tests run one
> after the other. This is what I want, since configuring the platform is
> time-consuming compared to running the tests. When using the
> pytest_generate_tests function to parametrize the platform fixture, the
> tests are run such that one given test is executed for all configs, then
> the next test for all selected configs, and so on. Is there an easy way to
> change the ordering? Or should I implement my own ordering, maybe using
> pytest_collection_modifyitems?
>

metafunc.parametrize(...., scope='session') does the trick
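
For the archive, a rough sketch of what such a conftest.py could look like
(the option defaults and the make_config_set factory below are placeholders,
not the actual project code; the fixture side is sketched at the bottom of
this mail):

    # conftest.py -- sketch only
    import pytest

    def pytest_addoption(parser):
        # the two user-facing options from this thread; defaults made up here
        parser.addoption("--platform", action="store", default="dummy",
                         help="platform to run the tests on")
        parser.addoption("--config", action="store", default="smoke",
                         help="name of a config set")

    def make_config_set(name):
        # placeholder factory; the real one builds ~100 configs per set
        return {"smoke": [{"gain": 1}, {"gain": 10}],
                "full": [{"gain": g} for g in range(1, 101)]}[name]

    def pytest_generate_tests(metafunc):
        if "platform" in metafunc.fixturenames:
            configs = make_config_set(metafunc.config.getoption("--config"))
            # indirect=True hands each config to the platform fixture via
            # request.param; scope="session" groups the runs so all tests
            # for one config execute before the next config is set up
            metafunc.parametrize("platform", configs, indirect=True,
                                 scope="session")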


>
>
>
> 2013/12/13 holger krekel <holger at merlinux.eu>
>
>> Hi Wouter,
>>
>> On Fri, Dec 13, 2013 at 12:21 +0100, Wouter Overmeire wrote:
>> > Using py.test (version 2.5.0) I would like to run a number of tests on
>> > a user-selectable platform with multiple configurations.  A platform is
>> > a bunch of measurement instruments and hardware under test. The tests
>> > are not aware of which specific platform they run on, nor of how the
>> > platform is configured; they get a platform fixture object and do stuff
>> > with it.
>> >
>> > Using two parser arguments, "--platform" and "--config", the user can
>> > select a platform and a platform config. The platform fixture function
>> > gets the options from the request object, then creates, configures and
>> > returns the platform. So far so good.
>>
>> :)
>>
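
(For the archive: the "so far so good" single-config version above boils
down to a fixture that reads the options straight from the request; roughly
the sketch below, where Platform is only a stand-in for the real instrument
setup and the options are the ones registered in conftest.py:)

    import pytest

    class Platform:
        """Stand-in for the real instruments + hardware under test."""
        def __init__(self, name):
            self.name = name
            self.config = None
        def configure(self, config):
            self.config = config

    @pytest.fixture(scope="session")
    def platform(request):
        # read the user-selected platform and config from the parser options
        plat = Platform(request.config.getoption("--platform"))
        plat.configure(request.config.getoption("--config"))
        return plat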
>> > However, tests need to run for many configs (on the order of 100) on a
>> > single platform, and configs are grouped in sets. The user should be
>> > able to select the config set. Instead of selecting a single config I
>> > would like to select a set, so the "--config" option is now the name (a
>> > string) referring to a factory function generating a list of configs.
>> > But I'm stuck on how to parametrize the platform fixture function using
>> > the "--config" parser option. The fixture decorator has a "params"
>> > argument to parametrize a fixture, which can do the trick for a
>> > hardcoded config set, but how to set the values based on a parser
>> > option? I also tried to use pytest.yield_fixture, but this supports
>> > only a single yield statement :-(.
>>
>> I recommend you look into the pytest_generate_tests hook:
>>
>>     http://pytest.org/latest/parametrize.html#pytest-generate-tests
>>
>> It allows you to define the actual parameters for building a fixture
>> from the command line.  In the API description of Metafunc.parametrize()
>> on that page you will find "indirect".  If you set it to True, then your
>> fixture function can access the indirect parameter via the usual
>> "request.param".  See also the following example:
>>
>>
>> http://pytest.org/latest/example/parametrize.html#deferring-the-setup-of-parametrized-resources
>>
>> HTH,
>> holger
>>
>
>
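
For completeness, a rough sketch of the fixture and test side to go with the
conftest.py sketch above (Platform and the test bodies are placeholders;
with scope='session' the runs come out grouped per config, i.e. all tests
for the first config, then all tests for the next config, and so on):

    # test_platform.py -- sketch only
    import pytest

    class Platform:
        """Stand-in for the real instruments + hardware under test."""
        def __init__(self, config):
            self.config = config      # the expensive setup happens here

        def measure(self):
            return self.config.get("gain", 1)

    @pytest.fixture
    def platform(request):
        # request.param is one config dict injected by
        # metafunc.parametrize("platform", configs, indirect=True,
        # scope="session"); the scope given there controls the grouping
        return Platform(request.param)

    def test_gain_not_negative(platform):
        assert platform.measure() >= 0

    def test_gain_in_range(platform):
        assert platform.measure() <= 100

Invoked as e.g. "py.test --platform=dummy --config=full" this runs both
tests against every config in the set, one config at a time.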

