[TIP] Different results when running under coverage

Peter Fein pfein at pobox.com
Mon Nov 8 08:15:32 PST 2010


Turns out this was due to an incompletely implemented set of rich comparison operators.  The ordering
resulting from the default id-based comparison just happened to be consistently correct when running
without coverage, and consistently wrong when running with it.  Creepy.

Three cheers to the death of arbitrary object comparison in Py3k!

--Pete

On 11/06/2010 04:20 AM, Ned Batchelder wrote:
> Hi Peter,
>
> Unfortunately, I can't reproduce this. When I run cover-twiggy-tests.sh,
> the test passes, and I get an HTML report showing 70% overall coverage.
> I think the only difference in my setup from yours is that I am using
> the tip of the coverage repo, but there have been no changes there since
> 3.4 that would affect the outcome.
>
> We can take the rest of the investigation off-list.
>
> --Ned.
>
> On 11/6/2010 2:06 AM, Peter Fein wrote:
>> Hi-
>>
>> Got a strange situation... I've got some code that gives different
>> results when run under coverage.py than not (with both timid = True and
>> False). I haven't had a chance to isolate a test case (not sure I
>> could), but I figured I'd ask if anyone has seen behavior like this.
>>
>> coverage.py 3.4 & Python built from source:
>> Python 2.7 (r27:82500, Oct 13 2010, 23:07:50)
>> [GCC 4.4.3] on linux2
>>
>> I've been able to reproduce it consistently, perhaps you can too:
>>
>> hg clone -u coverage-problems https://python-twiggy.googlecode.com/hg/ python-twiggy
>>
>> To run the tests normally, in the top-level directory:
>>
>> TWIGGY_UNDER_TEST=1 python scripts/unittest_main.py -b tests.test_integration
>>
>> 1 test, should pass
>>
>> To run under coverage:
>>
>> ./scripts/cover-twiggy-tests.sh -b tests.test_integration
>>
>> 1 test, fails - missing this line in out1:
>>
>> 2010-10-28T02:15:57Z:WARNING:second.child:cheese=hate:No
>>
>> Even stranger, the default branch of twiggy, which has only minor
>> differences from this branch (and *not* in test_integration.py), doesn't
>> manifest this problem. Very odd.
>>
>> I'm offline until Monday, so thanks in advance.
>>
>> --Pete
>>



