[TIP] NoseTests xUnit with test times

Prasanna Santhanam Prasanna.Santhanam at citrix.com
Mon Nov 5 07:47:03 PST 2012


I guess what I'm wondering is why nosetests marks the overall run as FAILED when 58 tests have actually passed. The errored and failed tests are ones I already know will fail in this run.
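Would decorating those known-bad tests with unittest's expectedFailure keep them from flipping the overall result? A rough sketch of what I mean (test name made up; assuming nose honors the decorator):

    import unittest

    class TestKnownIssues(unittest.TestCase):
        @unittest.expectedFailure
        def test_known_bug(self):
            # Known to fail in this environment; with plain unittest the
            # decorator makes this count as an expected failure rather
            # than a failure, so it doesn't turn the result into FAILED.
            self.assertEqual(1, 2)

    if __name__ == "__main__":
        unittest.main()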

Thanks in anticipation,




From: Tim Aerdts [mailto:fragger123 at gmail.com]
Sent: Monday, November 05, 2012 06:50 PM
To: Prasanna Santhanam
Cc: testing-in-python at lists.idyll.org
Subject: Re: [TIP] NoseTests xUnit with test times

Hello,

I think nosetests simply does not report the individual times on stdout, then.

As to your second problem: you might want to run nosetests with --verbose, which sets the verbosity level to 2 and may give you more output, or with --verbosity=3 for even more detail.
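For example, reusing the test file from your mail:

    $ nosetests --verbose integration/smoke/test_service_offerings.py
    $ nosetests --verbosity=3 integration/smoke/test_service_offerings.py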

But it seems like 2 errors occurred, which usually means unexpected exceptions or other things that went wrong (I'm no expert here), and 10 failures, which usually means the tests themselves failed, e.g. assertions that didn't hold.
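A tiny illustration of the distinction, with a made-up test case in plain unittest:

    import unittest

    class Demo(unittest.TestCase):
        def test_counts_as_failure(self):
            # A failed assertion is reported as a failure.
            self.assertEqual(1, 2)

        def test_counts_as_error(self):
            # Any other uncaught exception is reported as an error.
            raise ValueError("boom")

    if __name__ == "__main__":
        unittest.main()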

-Tim


On Mon, Nov 5, 2012 at 1:55 PM, Prasanna Santhanam <prasanna.santhanam at citrix.com> wrote:
On Wed, Oct 31, 2012 at 12:44:27PM +0530, Tim Aerdts wrote:
> Hello Prasanna,
>
> For me it seems like this is enabled by default.
>
> <testsuite name="nosetests" tests="3" errors="0" failures="0" skip="0">
> <testcase classname="test_someclass" name="test_some_method" time="2.042" />
> <testcase classname="test_someclass" name="test_some_other" time="0.058" />
> <testcase classname="test_someclass" name="test_some_other" time="0.022" />
> </testsuite>
>
> I run my nosetests with the following parameters.
>
> nosetests --with-xunit --verbose --with-coverage --cover-xml --cover-package=mypackage
>
> Hope this helps your case.
>
> -Tim

Hello Tim - Thanks for your response. I do have the times for the individual
tests in my nosetests.xml, but for some reason they don't show up on
stdout. I guess this is a setting in the xunit plugin. For example (a sketch
for pulling the times back out of the XML follows the two runs below):

With the xUnit plugin I see:
$ nosetests -v --with-marvin --marvin-config=../cloud-autodeploy/$hypervisor.cfg --load --with-xunit integration/smoke/test_service_offerings.py
Test to create service offering ... ok
Test to update existing service offering ... ok
Test to delete service offering ... ok

----------------------------------------------------------------------
XML: nosetests.xml
----------------------------------------------------------------------
Ran 3 tests in 0.143s

OK

With unittest-xml-reporting I see:
$ python -m marvin.deployAndRun --config=$hypervisor.cfg --load --xml integration/smoke/test_service_offerings.py

test_01_create_service_offering (test_service_offerings.TestCreateServiceOffering)
Test to create service offering ... OK (0.057s)
test_02_edit_service_offering (test_service_offerings.TestServiceOfferings)
Test to update existing service offering ... OK (0.036s)
test_03_delete_service_offering (test_service_offerings.TestServiceOfferings)
Test to delete service offering ... OK (0.035s)
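
In the meantime I can pull the per-test times out of the XML myself. A rough
sketch with ElementTree, going by the attribute names in your snippet:

    import xml.etree.ElementTree as ET

    # nosetests.xml is the default output file of the xunit plugin;
    # each <testcase> carries classname, name and time attributes.
    suite = ET.parse("nosetests.xml").getroot()
    for case in suite.findall("testcase"):
        print("%s.%s took %ss" % (case.get("classname"),
                                  case.get("name"),
                                  case.get("time")))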


Additionally, I wanted to understand what nose is using to determine the
overall result of a run. In the above run you see 'OK' printed at the end. In
the actual environment where I run many more tests I see

----------------------------------------------------------------------
XML: nosetests.xml
----------------------------------------------------------------------
Ran 73 tests in 11938.352s

FAILED (SKIP=3, errors=2, failures=10)

Many tests passed, so why does nose report the overall run as FAILED? How can I fix this?


Thanks,

--
Prasanna.,

> On Wed, Oct 31, 2012 at 7:27 AM, Prasanna Santhanam <prasanna.santhanam at citrix.com> wrote:
> Hi All,
>
> Is it possible to retrieve the time for each test fixture when using
> the nosetests xunit plugin? I didn't see any command-line options for
> this by default.
>
> I've used unittest-xml-reporting with some success; it reports the
> fixture run times, which is quite useful for optimizing and speeding
> up tests.
>
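
P.S. For reference, a bare-bones unittest-xml-reporting runner (independent
of marvin; the output directory name is arbitrary) looks something like this:

    import unittest
    import xmlrunner  # provided by the unittest-xml-reporting package

    if __name__ == "__main__":
        # Writes one xUnit-style XML report per test module into
        # the given directory, including per-test times.
        unittest.main(testRunner=xmlrunner.XMLTestRunner(output="test-reports"))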


_______________________________________________
testing-in-python mailing list
testing-in-python at lists.idyll.org
http://lists.idyll.org/listinfo/testing-in-python



--
Kind regards,
Tim Aerdts
http://www.tuimz.nl