flaky's Issues

UnicodeEncodeError when raising an exception with non-ascii characters

The Python 2.6 tests will fail with a UnicodeEncodeError if an exception is raised with non-ascii characters in the message.

For example, I updated the test_non_flaky_thing() test in test_example.py to include the non-ascii message in the exception:

@genty
class ExampleFlakyTestsWithUnicodeTestNames(ExampleFlakyTests):
    @genty_dataset('ascii name', 'ńőń ȁŝćȉȉ ŝƭȕƒƒ')
    def test_non_flaky_thing(self, message):
        self._threshold += 1
        if self._threshold < 1:
            raise Exception(
                "Threshold is not high enough: {0} vs {1} for '{2}'.".format(
                    self._threshold, 1, message),
            )

This caused a UnicodeEncodeError when running the Python 2.6 tests:

py26 runtests: PYTHONHASHSEED='2117695302'
py26 runtests: commands[0] | nosetests --with-flaky --exclude=pytest|test_options_example
....................................
===Flaky Test Report===

Traceback (most recent call last):
  File ".tox/py26/bin/nosetests", line 11, in <module>
    sys.exit(run_exit())
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/core.py", line 121, in __init__
    **extra_args)
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/unittest.py", line 817, in __init__
    self.runTests()
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/core.py", line 207, in runTests
    result = self.testRunner.run(self.test)
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/core.py", line 66, in run
    result.printErrors()
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/result.py", line 110, in printErrors
    self.config.plugins.report(self.stream)
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/plugins/manager.py", line 99, in __call__
    return self.call(*arg, **kw)
  File "/Users/awee/open_source/flaky_fork/flaky/.tox/py26/lib/python2.6/site-packages/nose/plugins/manager.py", line 167, in simple
    result = meth(*arg, **kw)
  File "/Users/awee/open_source/flaky_fork/flaky/flaky/flaky_nose_plugin.py", line 182, in report
    self._add_flaky_report(stream)
  File "/Users/awee/open_source/flaky_fork/flaky/flaky/_flaky_plugin.py", line 236, in _add_flaky_report
    stream.write(self._stream.getvalue())
UnicodeEncodeError: 'ascii' codec can't encode characters in position 760-762: ordinal not in range(128)
ERROR: InvocationError: '/Users/awee/open_source/flaky_fork/flaky/.tox/py26/bin/nosetests --with-flaky --exclude=pytest|test_options_example'
__________________________________________________________________________________ summary __________________________________________________________________________________
ERROR:   py26: commands failed
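The failure comes from flaky writing its buffered report back to nose's stream, which on Python 2.6 implicitly encodes with the ASCII codec. A minimal sketch of a defensive write (the helper name and the fallback strategy are assumptions, not flaky's actual fix):

```python
def write_report(stream, text):
    """Write unicode report text to a stream; if the stream's implicit
    encoding (ASCII on Python 2.6) rejects it, fall back to explicit
    UTF-8 with replacement so the report stays readable."""
    try:
        stream.write(text)
    except UnicodeEncodeError:
        stream.write(text.encode('utf-8', 'replace'))
```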

Flaky, xdist, and fixture exceptions can result in test suites running indefinitely

This happens if and only if all of the following are true:

  • using xdist
  • using flaky
  • An exception occurs in a fixture
  • There are remaining tests left to be assigned to processes, and every process has died due to the exceptions occurring in fixtures.

Sample test script:

import pytest
import flaky


@pytest.fixture
def fixture():
    raise AssertionError


@pytest.mark.usefixtures('fixture')
@flaky.flaky(max_runs=2, min_passes=1)
class TestCase:

    @pytest.mark.foobar
    def test_foo(self):
        pass

    def test_bar(self):
        pass

    def test_baz(self):
        pass

Sample output:

[dshuga@i-110516 ~/indeed/shield/products/examples master]
(shield) $ pytest -v -n1 test_example.py
================================================================== test session starts ===================================================================
platform darwin -- Python 3.5.1, pytest-3.0.5, py-1.4.31, pluggy-0.4.0 -- /Users/dshuga/.pyenv/versions/shield/bin/python
cachedir: .cache
rootdir: /Users/dshuga/indeed/shield/products/examples, inifile: 
plugins: indeed-shield-0.0.0, flaky-3.3.0, warnings-0.1.0, xdist-1.15.0
[gw0] darwin Python 3.5.1 cwd: /Users/dshuga/indeed/shield/products/examples
[gw0] Python 3.5.1 (default, May  4 2016, 14:23:33)  -- [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]
gw0 [3]
scheduling tests via LoadScheduling

test_example.py::TestCase::test_foo

(stays this way indefinitely)

[dshuga@i-110516 ~/indeed/shield/products/examples master]
(shield) $ pytest -v -n3 test_example.py
================================================================== test session starts ===================================================================
platform darwin -- Python 3.5.1, pytest-3.0.5, py-1.4.31, pluggy-0.4.0 -- /Users/dshuga/.pyenv/versions/shield/bin/python
cachedir: .cache
rootdir: /Users/dshuga/indeed/shield/products/examples, inifile: 
plugins: indeed-shield-0.0.0, flaky-3.3.0, warnings-0.1.0, xdist-1.15.0
[gw0] darwin Python 3.5.1 cwd: /Users/dshuga/indeed/shield/products/examples
[gw1] darwin Python 3.5.1 cwd: /Users/dshuga/indeed/shield/products/examples
[gw2] darwin Python 3.5.1 cwd: /Users/dshuga/indeed/shield/products/examples
[gw0] Python 3.5.1 (default, May  4 2016, 14:23:33)  -- [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]
[gw1] Python 3.5.1 (default, May  4 2016, 14:23:33)  -- [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]
[gw2] Python 3.5.1 (default, May  4 2016, 14:23:33)  -- [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]
gw0 [3] / gw1 [3] / gw2 [3]
scheduling tests via LoadScheduling

test_example.py::TestCase::test_baz 
test_example.py::TestCase::test_bar 
test_example.py::TestCase::test_foo 
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/main.py", line 98, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/main.py", line 133, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 745, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 339, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 334, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/shield/lib/python3.5/site-packages/_pytest/vendored_packages/pluggy.py", line 614, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/3.5.1/envs/shield/lib/python3.5/site-packages/xdist/dsession.py", line 536, in pytest_runtestloop
INTERNALERROR>     self.loop_once()
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/3.5.1/envs/shield/lib/python3.5/site-packages/xdist/dsession.py", line 555, in loop_once
INTERNALERROR>     call(**kwargs)
INTERNALERROR>   File "/Users/dshuga/.pyenv/versions/3.5.1/envs/shield/lib/python3.5/site-packages/xdist/dsession.py", line 593, in slave_slavefinished
INTERNALERROR>     assert not crashitem, (crashitem, node)
INTERNALERROR> AssertionError: ('test_example.py::TestCase::()::test_bar', <SlaveController gw1>)
INTERNALERROR> assert not 'test_example.py::TestCase::()::test_bar'

============================================================== no tests ran in 1.39 seconds ==============================================================

[dshuga@i-110516 ~/indeed/shield/products/examples master]
(shield) $
[dshuga@i-110516 ~/indeed/shield/products/examples master]
(shield) $ pytest -v -n1 -p no:flaky test_example.py
================================================================== test session starts ===================================================================
platform darwin -- Python 3.5.1, pytest-3.0.5, py-1.4.31, pluggy-0.4.0 -- /Users/dshuga/.pyenv/versions/shield/bin/python
cachedir: .cache
rootdir: /Users/dshuga/indeed/shield/products/examples, inifile: 
plugins: indeed-shield-0.0.0, warnings-0.1.0, xdist-1.15.0
[gw0] darwin Python 3.5.1 cwd: /Users/dshuga/indeed/shield/products/examples
[gw0] Python 3.5.1 (default, May  4 2016, 14:23:33)  -- [GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]
gw0 [3]
scheduling tests via LoadScheduling

test_example.py::TestCase::test_foo 
[gw0] ERROR test_example.py::TestCase::test_foo 
test_example.py::TestCase::test_bar 
[gw0] ERROR test_example.py::TestCase::test_bar 
test_example.py::TestCase::test_baz 
[gw0] ERROR test_example.py::TestCase::test_baz 

========================================================================= ERRORS =========================================================================
__________________________________________________________ ERROR at setup of TestCase.test_foo ___________________________________________________________
[gw0] darwin -- Python 3.5.1 /Users/dshuga/.pyenv/versions/shield/bin/python
@pytest.fixture
    def navigate_to_indeed():
>       raise AssertionError
E       AssertionError

test_example.py:14: AssertionError
__________________________________________________________ ERROR at setup of TestCase.test_bar ___________________________________________________________
[gw0] darwin -- Python 3.5.1 /Users/dshuga/.pyenv/versions/shield/bin/python
@pytest.fixture
    def navigate_to_indeed():
>       raise AssertionError
E       AssertionError

test_example.py:14: AssertionError
__________________________________________________________ ERROR at setup of TestCase.test_baz ___________________________________________________________
[gw0] darwin -- Python 3.5.1 /Users/dshuga/.pyenv/versions/shield/bin/python
@pytest.fixture
    def navigate_to_indeed():
>       raise AssertionError
E       AssertionError

test_example.py:14: AssertionError
================================================================ 3 error in 1.49 seconds =================================================================

[dshuga@i-110516 ~/indeed/shield/products/examples master]
(shield) $ 

Probably the same root cause as #114

using flaky outside unit tests

I would like to be able to use @flaky outside unit tests, on normal Python code, to implement a simple and clean retry mechanism.

Note: if this is already possible, it should be documented.
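The kind of general-purpose retry the request describes can be sketched as a plain decorator independent of any test framework (the names max_runs and delay mirror flaky's vocabulary but this is not flaky's API):

```python
import functools
import time


def retry(max_runs=3, delay=0):
    """Retry a plain function up to max_runs times, re-raising the
    last exception if every attempt fails."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(max_runs):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    last_exc = exc
                    if delay:
                        time.sleep(delay)
            raise last_exc
        return wrapper
    return decorator
```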

ValueError: too many values to unpack

Found an issue when building with Flaky.

flaky/flaky_nose_plugin.py", line 145, in _get_test_method_name
_, _, class_and_method_name = test.address()
ValueError: too many values to unpack

It looks like test.address() can return a tuple with more than the expected three elements, which breaks the fixed three-way unpack.
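A defensive version of the failing unpack could take only the last element instead of assuming exactly three (the helper name is hypothetical; nose's test.address() normally returns a (file, module, name) triple):

```python
def get_test_method_name(address):
    """Return the class-and-method portion of a nose test address,
    tolerating address tuples longer than the usual triple by taking
    only the final element."""
    *_, class_and_method_name = address
    return class_and_method_name
```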

Add an ability to specify non-retried exceptions

I'd like to use the "flaky" module, but before it's useful for my use case, I'll need to be able to specify which exception classes to retry, either as a whitelist or as a blacklist.

As an example, I'd like to make sure exceptions in my "product crashed" class hierarchy are reported immediately.
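Later flaky releases document a rerun_filter parameter that can express exactly this blacklist: the filter receives the failure and returns whether to rerun. A sketch against a hypothetical ProductCrashed hierarchy (the class and function names are made up for illustration):

```python
class ProductCrashed(Exception):
    """Hypothetical 'product crashed' exception hierarchy."""


def dont_rerun_crashes(err, name, test, plugin):
    """rerun_filter callback: err is the (type, value, traceback) of
    the failure; returning False reports the failure immediately
    instead of retrying. Intended to be attached as
    @flaky(max_runs=3, rerun_filter=dont_rerun_crashes)."""
    return not issubclass(err[0], ProductCrashed)
```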

Flaky counts a skipped unittest test as a failure

If a test is skipped via unittest.skip(), it's counted as a failure and rerun. (If it's skipped via pytest.mark.skipif, it seems to work fine.)

This causes the test to get reported multiple times (it's correctly reported as skipped each time in the xml), which causes problems with xdist again (trying to remove tests from the queue multiple times).
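A plausible fix is for the plugin to special-case the exception unittest uses to signal skips before counting a run as a failure; a sketch (the helper name is an assumption):

```python
import unittest


def is_unittest_skip(err):
    """Return True when the failure tuple actually represents a skip:
    unittest.skip() signals skips by raising unittest.SkipTest, which
    should be reported as skipped, not counted as a failure or rerun."""
    return issubclass(err[0], unittest.SkipTest)
```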

Using flaky breaks test reporting

Info
Report generation breaks when using flaky with Allure or nosedbreport.

Steps (allure)

  • Setup allure
  • Setup flaky
  • This is my test: test.py
from flaky import flaky

@flaky
def test_1():
     assert 1 == 1, 'error message'
  • Run this test with allure without flaky
nosetests test.py --with-allure --logdir=./allure-results
  • Check result generated
    <ns0:test-suite xmlns:ns0="urn:model.allure.qatools.yandex.ru" start="1452172898710" stop="1452172898711">
      <name>test</name>
      <test-cases>
        <test-case start="1452172898710" status="passed" stop="1452172898710">
          <name>test.test_1</name>
          <attachments/>
          <labels/>
          <steps/>
        </test-case>
      </test-cases>
      <labels/>
    </ns0:test-suite>
  • Run this test again with allure and flaky
nosetests test.py --with-allure --logdir=./allure-results-flaky --with-flaky
  • Check result generated
    <ns0:test-suite xmlns:ns0="urn:model.allure.qatools.yandex.ru" start="1452172868032" stop="1452172868033">
      <name>test</name>
      <test-cases/>
      <labels/>
    </ns0:test-suite>

pytest, tox, flaky: 'TestReport' object has no attribute 'item'

During results reporting I get this error. It seems to be a basic issue; any help is appreciated.

....test completed successfully .....

INTERNALERROR> Traceback (most recent call last):
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/main.py", line 90, in wrap_session
INTERNALERROR> session.exitstatus = doit(config, session) or 0
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/main.py", line 121, in _main
INTERNALERROR> config.hook.pytest_runtestloop(session=session)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in call
INTERNALERROR> return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 301, in call
INTERNALERROR> return outcome.get_result()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR> _reraise(*ex) # noqa
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in init
INTERNALERROR> self.result = func()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 299, in
INTERNALERROR> outcome = _CallOutcome(lambda: self.oldcall(hook, hook_impls, kwargs))
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in
INTERNALERROR> _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR> res = hook_impl.function(*args)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/main.py", line 146, in pytest_runtestloop
INTERNALERROR> item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in call
INTERNALERROR> return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 301, in call
INTERNALERROR> return outcome.get_result()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR> _reraise(*ex) # noqa
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in init
INTERNALERROR> self.result = func()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 299, in
INTERNALERROR> outcome = _CallOutcome(lambda: self.oldcall(hook, hook_impls, kwargs))
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in
INTERNALERROR> _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
INTERNALERROR> return _wrapped_call(hook_impl.function(_args), self.execute)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 253, in _wrapped_call
INTERNALERROR> return call_outcome.get_result()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR> _reraise(_ex) # noqa
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in init
INTERNALERROR> self.result = func()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR> res = hook_impl.function(*args)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 84, in pytest_runtest_protocol
INTERNALERROR> self.runner.pytest_runtest_protocol(item, nextitem)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/runner.py", line 65, in pytest_runtest_protocol
INTERNALERROR> runtestprotocol(item, nextitem=nextitem)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/runner.py", line 75, in runtestprotocol
INTERNALERROR> reports.append(call_and_report(item, "call", log))
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/runner.py", line 123, in call_and_report
INTERNALERROR> hook.pytest_runtest_logreport(report=report)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in call
INTERNALERROR> return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 301, in call
INTERNALERROR> return outcome.get_result()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR> _reraise(*ex) # noqa
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in init
INTERNALERROR> self.result = func()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 299, in
INTERNALERROR> outcome = _CallOutcome(lambda: self.oldcall(hook, hook_impls, kwargs))
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in
INTERNALERROR> _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR> res = hook_impl.function(*args)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/resultlog.py", line 74, in pytest_runtest_logreport
INTERNALERROR> res = self.config.hook.pytest_report_teststatus(report=report)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in call
INTERNALERROR> return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR> return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 301, in call
INTERNALERROR> return outcome.get_result()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR> _reraise(*ex) # noqa
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in init
INTERNALERROR> self.result = func()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 299, in
INTERNALERROR> outcome = _CallOutcome(lambda: self.oldcall(hook, hook_impls, kwargs))
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in
INTERNALERROR> _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
INTERNALERROR> return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 249, in _wrapped_call
INTERNALERROR> wrap_controller.send(call_outcome)
INTERNALERROR> File "/tmp/devpi-test3/dlb-dcp-test-1.1.0.dev12/.tox/prod1/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 150, in pytest_report_teststatus
INTERNALERROR> item = report.item
INTERNALERROR> AttributeError: 'TestReport' object has no attribute 'item'
(debug) [scutils-plugin]Forcing to serialize the specmap file during scutils session finish
(info) [dvcp-dev-2.2.0.58-junit-cucm-prod1.specmap] saving report to /home-local/voiceqa/jenkins/workspace/voiceqa-dlb-dcp-tests-ProdAutoLab-voiceqaDevIndex/dvcp-dev-2.2.0.58-junit-cucm-prod1.specmap
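The crash happens because pytest_report_teststatus can receive reports that flaky's own runtest protocol never decorated with an item attribute (here, reports routed through resultlog by another plugin). A defensive sketch of the guard (names assumed, not flaky's actual code):

```python
def flaky_report_teststatus(report):
    """Only handle reports produced by flaky's own runtest protocol;
    reports from other plugins lack the 'item' attribute, so fall
    through to pytest's default status handling for those."""
    item = getattr(report, 'item', None)
    if item is None:
        return None  # not ours: let default handling proceed
    return item
```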

packages in the tox environment for the above run:

(prod1)[voiceqa@dsv-bvsrv-08 ~]$ pip freeze
You are using pip version 6.1.1, however version 8.0.0 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
beautifulsoup4==4.4.1
configparser2==4.0.0
dlb-dcp-ap==1.1.0.dev13
dlb-dcp-prov-client==1.2.0.dev9
dlb-dcp-test==1.1.0.dev12
dlb-voice-common==1.0.0
dolby-commander==2.1.0
flaky==3.0.3
gti-common==1.2.0.dev1
gti-containers==3.0
gti-docutils==1.0
gti-scutils==1.0.6
gti-toolwrappers==2.0.1
gti-utils==2.0.1
html2text==2016.1.8
httplib2==0.9.2
pexpect==4.0.1
ptyprocess==0.5
py==1.4.31
Pyro4==4.39
pytest==2.8.5
pytest-scutils==1.0.1
pytest-specmap==1.0.3
python-dateutil==2.4.2
pytl==1.0.2
PyYAML==3.11
requests==2.9.1
serpent==1.12
six==1.10.0
wget==3.2
xlrd==0.9.4
xlutils==1.7.1
xlwt==1.0.0
(prod1)[voiceqa@dsv-bvsrv-08 ~]$

pypi tarball is missing test/base_test_case.py

While packaging for downstream and running the test suite I hit

EEEEEE
======================================================================
ERROR: Failure: ImportError (No module named 'test.base_test_case')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib64/python3.3/site-packages/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "/usr/lib64/python3.3/site-packages/nose/loader.py", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "/usr/lib64/python3.3/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/usr/lib64/python3.3/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib64/python3.3/imp.py", line 180, in load_module
    return load_source(name, filename, file)
  File "/usr/lib64/python3.3/imp.py", line 119, in load_source
    _LoadSourceCompatibility(name, pathname, file).load_module(name)
  File "<frozen importlib._bootstrap>", line 584, in _check_name_wrapper
  File "<frozen importlib._bootstrap>", line 1022, in load_module
  File "<frozen importlib._bootstrap>", line 1003, in load_module
  File "<frozen importlib._bootstrap>", line 560, in module_for_loader_wrapper
  File "<frozen importlib._bootstrap>", line 868, in _load_module
  File "<frozen importlib._bootstrap>", line 313, in _call_with_frames_removed
  File "/var/tmp/portage/dev-python/flaky-2.4.0/work/flaky-2.4.0/test/test_flaky_decorator.py", line 6, in <module>
    from test.base_test_case import TestCase
ImportError: No module named 'test.base_test_case'

======================================================================
The same ImportError traceback repeats for test/test_flaky_nose_plugin.py (line 12), test/test_flaky_plugin.py (line 8), and test/test_multiprocess_string_io.py (line 8), each failing on:

    from test.base_test_case import TestCase
ImportError: No module named 'test.base_test_case'
  File "<frozen importlib._bootstrap>", line 313, in _call_with_frames_removed
  File "/var/tmp/portage/dev-python/flaky-2.4.0/work/flaky-2.4.0/test/test_nose_example.py", line 11, in <module>
    from test.base_test_case import TestCase, expectedFailure, skip
ImportError: No module named 'test.base_test_case'

======================================================================
ERROR: Failure: ImportError (No module named 'test.base_test_case')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib64/python3.3/site-packages/nose/failure.py", line 39, in runTest
    raise self.exc_val.with_traceback(self.tb)
  File "/usr/lib64/python3.3/site-packages/nose/loader.py", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "/usr/lib64/python3.3/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "/usr/lib64/python3.3/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/usr/lib64/python3.3/imp.py", line 180, in load_module
    return load_source(name, filename, file)
  File "/usr/lib64/python3.3/imp.py", line 119, in load_source
    _LoadSourceCompatibility(name, pathname, file).load_module(name)
  File "<frozen importlib._bootstrap>", line 584, in _check_name_wrapper
  File "<frozen importlib._bootstrap>", line 1022, in load_module
  File "<frozen importlib._bootstrap>", line 1003, in load_module
  File "<frozen importlib._bootstrap>", line 560, in module_for_loader_wrapper
  File "<frozen importlib._bootstrap>", line 868, in _load_module
  File "<frozen importlib._bootstrap>", line 313, in _call_with_frames_removed
  File "/var/tmp/portage/dev-python/flaky-2.4.0/work/flaky-2.4.0/test/test_utils.py", line 8, in <module>
    from test.base_test_case import TestCase
ImportError: No module named 'test.base_test_case'

===Flaky Test Report===


===End Flaky Test Report===
----------------------------------------------------------------------
Ran 6 tests in 0.017s

FAILED (errors=6)

Ignore empty flaky test report

When flaky is installed, every py.test invocation generates a flaky test report, even if flaky isn't used in the project.

It would be nice if it could be skipped when there is nothing to show, or at least when it's not used.

===Flaky Test Report===


===End Flaky Test Report===

Extra testcases in Junit xml output with pytest

When running flaky together with pytest using junit xml output:

pytest --junit-xml=result.xml

it appears that each rerun of a flaky test is registered as an extra testcase in the JUnit XML output, with no attributes except time, for example:
<testcase time="0.0019998550415"></testcase>

Is this working as intended?

Exceptions not raised from class-based fixtures

Very useful library! I did find a bug with flaky: if a class-based fixture raises an exception, the exception is not raised, and for any tests that use that fixture, it appears as if the fixture never ran or existed. This makes it very hard to debug the fixture. (Originally I didn't know why this was happening, so I started uninstalling packages until I removed flaky, and things started working again.)

(Actually, I don't think it's limited to class-based fixtures; this happens in other cases too.)

Test Example 1 (does a divide by zero to illustrate a bad fixture)

import pytest

class TestStuff(object):
    @pytest.fixture(autouse=True)
    def setup(self):
        5 / 0
        self.foo = 'foo'

    def test_stuff(self):
        assert self.foo == 'foo'

Resulting output:

test_flaky.py F

=================================================== FAILURES ====================================================
_____________________________________________ TestStuff.test_stuff ______________________________________________

self = <test_flaky.TestStuff object at 0x10aac3d50>

    def test_stuff(self):
>       assert self.foo == 'foo'
E       AttributeError: 'TestStuff' object has no attribute 'foo'

Test Example 2 (variation of above, without autouse=True)

class TestTwo(object):
    @pytest.fixture
    def setup(self):
        5 / 0
        self.foo = 'foo'

    def test_two(self, setup):
        assert self.foo == 'foo'

Resulting output:

self = <CallInfo when='call' exception: 'setup'>, func = <function <lambda> at 0x101b84668>
plugin = <flaky.flaky_pytest_plugin.FlakyPlugin object at 0x101b2d1d0>

    def call(self, func, plugin):
        """
            Call the test function, handling success or failure.
            :param func:
                The test function to run.
            :type func:
                `callable`
            :param plugin:
                Plugin class for flaky that can handle test success or failure.
            :type plugin:
                :class: `FlakyPlugin`
            """
        is_call = self.when == 'call'
        try:
>           self.result = func()

venv/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py:356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
venv/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py:181: in <lambda>
    lambda: ihook(item=item, **kwds),
venv/lib/python2.7/site-packages/_pytest/core.py:521: in __call__
    return self._docall(self.methods, kwargs)
venv/lib/python2.7/site-packages/_pytest/core.py:528: in _docall
    firstresult=self.firstresult).execute()
venv/lib/python2.7/site-packages/_pytest/core.py:393: in execute
    return wrapped_call(method(*args), self.execute)
venv/lib/python2.7/site-packages/_pytest/core.py:113: in wrapped_call
    return call_outcome.get_result()
venv/lib/python2.7/site-packages/_pytest/core.py:138: in get_result
    py.builtin._reraise(*ex)
venv/lib/python2.7/site-packages/_pytest/core.py:123: in __init__
    self.result = func()
venv/lib/python2.7/site-packages/_pytest/core.py:394: in execute
    res = method(*args)
venv/lib/python2.7/site-packages/_pytest/runner.py:90: in pytest_runtest_call
    item.runtest()
venv/lib/python2.7/site-packages/_pytest/python.py:1181: in runtest
    self.ihook.pytest_pyfunc_call(pyfuncitem=self)
venv/lib/python2.7/site-packages/_pytest/core.py:521: in __call__
    return self._docall(self.methods, kwargs)
venv/lib/python2.7/site-packages/_pytest/core.py:528: in _docall
    firstresult=self.firstresult).execute()
venv/lib/python2.7/site-packages/_pytest/core.py:394: in execute
    res = method(*args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

pyfuncitem = <Function 'test_two'>

    @pytest.mark.trylast
    def pytest_pyfunc_call(pyfuncitem):
        testfunction = pyfuncitem.obj
        if pyfuncitem._isyieldedfunction():
            testfunction(*pyfuncitem._args)
        else:
            funcargs = pyfuncitem.funcargs
            testargs = {}
            for arg in pyfuncitem._fixtureinfo.argnames:
>               testargs[arg] = funcargs[arg]
E               KeyError: 'setup'

Expected Output:

self = <test_flaky.TestOne object at 0x104c9fa90>

    @pytest.fixture(autouse=True)
    def setup(self):
>       5 / 0
E       ZeroDivisionError: integer division or modulo by zero

test_flaky.py:7: ZeroDivisionError

Test file with these examples at https://gist.github.com/alvinchow86/82ff8671dd8db847e251.

This is with Python 2.7.9, running on OS X Yosemite. Can reproduce this with just these PIP dependencies.

flaky==2.2.0
py==1.4.30
pytest==2.7.2

Exit status is set to 0 on failure instead of 1

When running tests with nose and the flaky plugin, the exit status was not being set to 1 when the only tests that failed were ones run with the flaky decorator that had max tries greater than 1. If a test that was not decorated with @flaky also failed, then the exit status was properly set to 1.

For example, here is a test that always fails, run with nose and without flaky. The exit status is 1.

$ nosetests test_flaky.py:TestFlaky
F
-----------------------------------------------------------------------------
1) FAIL: test_flaky_exit (test_flaky.TestFlaky)

   Traceback (most recent call last):
    test_flaky.py line 7 in test_flaky_exit
      self.assertTrue(False)
   AssertionError: False is not true


-----------------------------------------------------------------------------
1 test run in 0.0 seconds. 
1 FAILED (0 tests passed)
$ echo $?
1

... And here is the same test, run with nose and using flaky with the default 2 max tries. It correctly runs the test twice and prints the correct results, but then exits with 0 status.

$ nosetests test_flaky.py:TestFlaky --with-flaky
-F
-----------------------------------------------------------------------------
1) FAIL: test_flaky_exit (test_flaky.TestFlaky)

   Traceback (most recent call last):
    test_flaky.py line 7 in test_flaky_exit
      self.assertTrue(False)
   AssertionError: False is not true


-----------------------------------------------------------------------------
2 tests run in 0.0 seconds. 
1 FAILED, 1 skipped (0 tests passed)
$ echo $?
0

The root cause is that the flaky plugin passes a newly created TextTestResult object to _rerun_test rather than the original object that nose created, leaving nose unaware of any rerun results.

Flaky report missing in stdout w/ nosetests

I only ever see the flaky report when my tests short circuit and (presumably) stderr prints to console. However, if my tests run completely, there's no flaky report at the end. I'm running nosetests 1.3.4 and flaky 2.0.0.

Here's a sample command line:

nosetests -a '!quarantine' -v -s --with-flaky --processes=4 --process-timeout=12000 mytests.py

I'm certain that flaky is loaded and it shows up in nosetests --plugins.

tearDown() not called between reruns

Hi,

Thanks for what seems to be a very promising idea...
I tried to use flaky with nosetests. Failing tests are rerun fine, but tearDown() does not get called between reruns; it only runs at the very end, once all reruns are finished. That's a problem, because a test can leave something behind in the environment. Is this how flaky was designed, or is it an actual issue?

Thanks,
Regards,
Yaro

Errors out when running with nose multiprocess

When running with --processes flag in nose tests, I receive below error message:

20:51:00     is_multiprocess = getattr(options, 'multiprocess_workers', 0) > 0
20:51:00 TypeError: unorderable types: str() > int()

I believe this is caused here and I've created a PR to fix this issue.

FYI, below is the command I used to run my tests:

nosetests --processes=6 --process-timeout=3600 --process-restartworker --with-flaky --force-flaky --max-runs=3 -v tests/functional 2>&1

`ensure_unicode_string()` throws an exception when argument is an Exception with non-ASCII bytes in the message

If an object (like an Exception) is passed into ensure_unicode_string and the exception has non-ASCII bytes in the message, it will raise a UnicodeError (which will be caught) and then makes the false assumption that the argument must be a string (and therefore have a .decode() method) resulting in an AttributeError being raised.

This can happen whenever a test raises an exception that has a message containing non-ASCII bytes. When that happens, the exception gets passed to ensure_unicode_string() here https://github.com/box/flaky/blob/master/flaky/_flaky_plugin.py#L43 and raises an exception.

Test case:

def test_ensure_unicode_string_handles_nonascii_exception_message(self):
    ex = Exception(u'\u2013'.encode('utf-8'))

    string = ensure_unicode_string(ex)
    self.assertEqual(string, u'\u2013')
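One possible shape of a fix, checking for bytes before assuming the argument can be converted with str(), could look like this (a sketch under Python 3 semantics; ensure_unicode_string_fixed is a hypothetical name, not flaky's implementation):

```python
def ensure_unicode_string_fixed(obj):
    """Best-effort conversion of any object to text, tolerating
    exceptions whose message is a bytes object with non-ASCII bytes."""
    if isinstance(obj, bytes):
        return obj.decode('utf-8', 'replace')
    if isinstance(obj, BaseException) and obj.args:
        # An exception's message may itself be a bytes object.
        first = obj.args[0]
        if isinstance(first, bytes):
            return first.decode('utf-8', 'replace')
    return str(obj)

ex = Exception('\u2013'.encode('utf-8'))
assert ensure_unicode_string_fixed(ex) == '\u2013'
```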

Doesn't work with teamcity-messages

Hey, looks like flaky is not working very well with teamcity-messages (under pytest for my case).

For some reason, every retry seems to log an empty line, and a failure status is logged alone on its own line without the test name.

Here's the log I'm encountering:

[14:37:58][Step 4/8] test.test_name
[14:37:59][test.test_name] 
[14:37:59][Step 4/8] 
[14:37:59][Step 4/8] 
[14:37:59][Step 4/8] 
[14:37:59][Step 4/8] FAILED

So I'm wondering if maybe test retries are missing some metadata allowing them to be correctly logged?

no:flaky instructions in README don't work

Docs say:

With py.test, flaky will automatically run. It can, however be disabled via the command line:
py.test no:flaky

but when I run it pytest shows an error:

ERROR: file not found: no:flaky

unittest addCleanup only run once per test with nose plugin

I discovered an issue with hooking up flaky with our test suite:

Cleanup/teardown functions added to tests via unittest's addCleanup are run only once, even if a test is re-run:

>>> calling the test
>>> flaky startTest
>>> cleanup function
>>> flaky stopTest
>>> calling the test
>>> flaky startTest
>>> flaky stopTest

I'd expect the cleanup function to be run after every test run.

Here's the behavior of unittest's doCleanups, which runs the cleanup functions:

doCleanups() pops methods off the stack of cleanup functions one at a time, so it can be called at any time.

My understanding is that since doCleanups() mutates the stack of cleanup functions, there are no cleanup functions left to run when calling the test a second time.

I hope to have time to poke around and try to find a solution for this today. Maybe we can save a copy of the cleanup function stack before running the test, and then re-add the stack to the test after each test run if the test is marked for re-run? Any other ideas come to mind?
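The doCleanups() behavior quoted above, and the save-and-restore idea, can be demonstrated directly (a sketch that pokes at unittest's private _cleanups list, so it may break across Python versions):

```python
import unittest

class Demo(unittest.TestCase):
    def runTest(self):
        pass

calls = []
test = Demo()
test.addCleanup(calls.append, 'cleanup')

# Save a copy of the cleanup stack before the first run.
saved = list(test._cleanups)

test.doCleanups()          # pops and runs the cleanup function
assert calls == ['cleanup']

test.doCleanups()          # the stack is now empty: nothing runs
assert calls == ['cleanup']

# Re-adding the saved stack lets a rerun execute the cleanups again.
test._cleanups = list(saved)
test.doCleanups()
assert calls == ['cleanup', 'cleanup']
```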

no newline when running with --verbose

Hey, cool plugin but it seems that it does not play well with the --verbose parameter. In some situations it does not print the results correctly. Consider the following test file:

import random
import unittest
from hamcrest import assert_that, equal_to


class Test(unittest.TestCase):
    def test_1(self):
        """Test 1"""
        assert_that(random.choice((1, 2)), equal_to(1))

    def test_2(self):
        """Test 2"""
        assert_that(random.choice((1, 2)), equal_to(1))

And run the tests with:

nosetests --with-flaky --force-flaky --verbose

Depending on the result of the tests you can end up with (the first line is the problem):

Test 1 ... Test 2 ... FAIL

======================================================================
FAIL: Test 2
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/.../test1.py", line 13, in test_2
    assert_that(random.choice((1, 2)), equal_to(1))
AssertionError:
Expected: <1>
     but: was <2>


===Flaky Test Report===

test_1 failed (1 runs remaining out of 2).
	<class 'AssertionError'>

Expected: <1>
     but: was <2>

	<traceback object at 0x107436348>
test_1 passed 1 out of the required 1 times. Success!
test_2 failed (1 runs remaining out of 2).
	<class 'AssertionError'>

Expected: <1>
     but: was <2>

	<traceback object at 0x107436c88>
test_2 failed; it passed 0 out of the required 1 times.
	<class 'AssertionError'>

Expected: <1>
     but: was <2>

	<traceback object at 0x107436f88>

===End Flaky Test Report===
----------------------------------------------------------------------
Ran 2 tests in 0.002s

EXPECTED:

Test 1 ... ok
Test 2 ... FAIL

Flaky ran with py.test counts skipped tests as failing and reruns them

It looks like flaky (version 2.1.1) counts an intentionally skipped test as failed, and will rerun it up to max_runs times before reporting it as failed and allowing the test to be counted as skipped.

Code that displays the problem:

import unittest, flaky

@flaky.flaky(max_runs=5)
@unittest.skip("example")
class SkipTestCase(unittest.TestCase):
    def test_skip(self):
        assert True

When run with py.test, this outputs

============================= test session starts ==============================
platform linux2 -- Python 2.7.8 -- py-1.4.27 -- pytest-2.7.0
rootdir: /home/jluke/Development/flaky-skip-demo, inifile: 
plugins: xdist, colordots, timeout, flaky
collected 1 items

test_flaky_skip.py s
===Flaky Test Report===

test_skip failed (4 runs remaining out of 5).
    <class 'builtins.Skipped'>
    example
    [<TracebackEntry /usr/lib/python2.7/site-packages/_pytest/unittest.py:114>, <TracebackEntry /usr/lib/python2.7/site-packages/_pytest/runner.py:468>]
test_skip failed (3 runs remaining out of 5).
    <class 'builtins.Skipped'>
    example
    [<TracebackEntry /usr/lib/python2.7/site-packages/_pytest/unittest.py:114>, <TracebackEntry /usr/lib/python2.7/site-packages/_pytest/runner.py:468>]
test_skip failed (2 runs remaining out of 5).
    <class 'builtins.Skipped'>
    example
    [<TracebackEntry /usr/lib/python2.7/site-packages/_pytest/unittest.py:114>, <TracebackEntry /usr/lib/python2.7/site-packages/_pytest/runner.py:468>]
test_skip failed (1 runs remaining out of 5).
    <class 'builtins.Skipped'>
    example
    [<TracebackEntry /usr/lib/python2.7/site-packages/_pytest/unittest.py:114>, <TracebackEntry /usr/lib/python2.7/site-packages/_pytest/runner.py:468>]
test_skip failed; it passed 0 out of the required 1 times.
    <class 'builtins.Skipped'>
    example
    [<TracebackEntry /usr/lib/python2.7/site-packages/_pytest/unittest.py:114>, <TracebackEntry /usr/lib/python2.7/site-packages/_pytest/runner.py:468>]

===End Flaky Test Report===

========================== 1 skipped in 0.01 seconds ===========================

I see no reason why flaky would try to rerun a skipped test multiple times.

Tag 3.0.1 release

There's a 3.0.1 release up on PyPI, but no corresponding tag here on GitHub; I assume this is just a simple oversight.

does it work with pytest-xdist?

I'm running tests in parallel using pytest-xdist. There were test failures in tests decorated with @flaky, and flaky printed an empty report:

===Flaky Test Report===


===End Flaky Test Report===

I'm not sure, but it seems flaky didn't try to re-run failed tests. There were no flaky-related messages in the log.

pytest: test re-run should also include resetup of all the test-scoped fixtures

This is obviously required if fixtures are mutable. For example: a test logs in using a 'browser' fixture and checks some content on a page loaded by AJAX. The test can be flaky if there's some timeout for the AJAX content to load, but with this plugin the rerun will not work, as the test will not be able to log in again when it is already logged in.
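The failure mode can be reduced to a fixture holding mutable state that a rerun observes (make_browser and log_in are hypothetical names for illustration; no pytest machinery involved):

```python
def make_browser():
    # Hypothetical stand-in for a test-scoped 'browser' fixture.
    return {'logged_in': False}

def log_in(browser):
    if browser['logged_in']:
        raise RuntimeError('already logged in')
    browser['logged_in'] = True

browser = make_browser()   # fixture set up once, before the first run
log_in(browser)            # first run mutates the fixture

try:
    log_in(browser)        # a rerun sees the stale, mutated state
    rerun_succeeded = True
except RuntimeError:
    rerun_succeeded = False

assert not rerun_succeeded  # the rerun fails unless the fixture is rebuilt
```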

flaky doesn't work well with tests defined in base classes

I have tests like this:

class BaseTest(unittest.TestCase):
    param = 'foo'

    @flaky()
    def test_method1(self):
         # ...

class AnotherTest(BaseTest):
    param = 'bar'

test_method1 is run twice here (once via each class). It seems that this drives flaky mad - see https://gist.github.com/kmike/ab01f0595d5776f58295. The output is very large, but if you scroll to the bottom there are log messages like

test_timeout failed (-1 runs remaining out of 2).
test_meta_redirect_delay_wait_enough passed 3 out of the required 1 times. Success!
test_timeout failed (-15 runs remaining out of 2).

and the "===Flaky Test Report===" output is a little weird overall.

I think it could be related to reusing test methods, but maybe there are some other issues.

Test classes inheriting TestCase aren't rerun with pytest?

Hi! This seems like a really useful tool! But I'm having some troubles getting it to work with my project.

In the readme there's an example of "Marking a class flaky":

@flaky
class TestMultipliers(TestCase):
    def test_flaky_doubler(self):
        value_to_double = 21
        result = get_result_from_flaky_doubler(value_to_double)
        self.assertEqual(result, value_to_double * 2, 'Result doubled incorrectly.')

Note how it's inheriting TestCase. That's how we are writing our test classes in the project I tried to use flaky in.

The problem is that, when using pytest, it does not seem to retry tests in classes that inherit TestCase. It only retries them when the class inherits object, as in the test_pytest_example.py module.

Please see my fork for an example of a test case that fails but shouldn't (?)

Best regards
Simon

Whats with the `<traceback object at 0x7f87ed651098>` in the "Flaky Test Report"?

Steps to reproduce:

# tests.py
import unittest
from flaky import flaky


@flaky
class MyTest(unittest.TestCase):
    def test_fail(self):
        raise Exception("very important message")
Then run nosetests --with-flaky:
E
======================================================================
ERROR: test_fail (tests.MyTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/mg/tmp/flaky-test/tests.py", line 16, in test_fail
    raise Exception
Exception

===Flaky Test Report===

test_fail failed (1 runs remaining out of 2).
    <type 'exceptions.Exception'>
    very important message
    <traceback object at 0x7fc0b5634248>
test_fail failed; it passed 0 out of the required 1 times.
    <type 'exceptions.Exception'>
    very important message
    <traceback object at 0x7fc0b5634128>

===End Flaky Test Report===
----------------------------------------------------------------------
Ran 1 test in 0.002s

FAILED (errors=1)

The memory addresses of the traceback objects do not seem to be useful. Why are you printing them? Why not print the actual traceback? (You may want to use traceback.format_exception() to format the traceback with the exception type + message, in the usual format, and then add some indentation.)
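The suggested traceback.format_exception approach could look like the following (format_error_tuple is a hypothetical helper sketched for illustration, not flaky's API):

```python
import sys
import traceback

def format_error_tuple(err):
    """Render an (exc_type, exc_value, tb) triple as an indented,
    human-readable traceback instead of the traceback object's repr."""
    exc_type, exc_value, tb = err
    lines = traceback.format_exception(exc_type, exc_value, tb)
    return ''.join('\t' + line for line in lines)

try:
    raise Exception('very important message')
except Exception:
    report = format_error_tuple(sys.exc_info())

# The report now contains the full formatted traceback, exception
# type, and message, rather than '<traceback object at 0x...>'.
```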

Flaky doesn't work with xdist

Flaky adds the item field to the report in pytest_runtest_makereport, which is of a type that execnet can't serialize in gateway_base.py, which leads to an error like this:
INTERNALERROR> File "/usr/local/lib/python2.7/dist-packages/execnet/gateway_base.py", line 1388, in _save
INTERNALERROR> raise DumpError("can't serialize %s" % (tp,))
INTERNALERROR> DumpError: can't serialize <class '_pytest.unittest.TestCaseFunction'>

Versions used: Python 2.7.6, pytest-2.8.3, py-1.4.30, pluggy-0.3.1, flaky-3.0.1, random-0.02, xdist-1.13.1

Django's LiveServerTestCase is not supported

Django uses a __call__ method on the test for setup and teardown of the live server. This currently breaks with flaky. I have a pull request coming momentarily that addresses this.
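The pattern can be reproduced in isolation: a TestCase subclass doing setup and teardown in __call__ instead of setUp/tearDown, so a rerun mechanism that invokes the bound test method directly would bypass it entirely (a minimal sketch of the pattern, not Django's actual code):

```python
import unittest

class LiveServerLikeCase(unittest.TestCase):
    """Mimics Django's pattern of doing setup/teardown in __call__."""
    server_running = False

    def __call__(self, result=None):
        type(self).server_running = True       # e.g. start the live server
        try:
            return super().__call__(result)    # normal run() machinery
        finally:
            type(self).server_running = False  # e.g. stop the live server

    def test_server_is_up(self):
        self.assertTrue(self.server_running)

result = unittest.TestResult()
LiveServerLikeCase('test_server_is_up')(result)
assert result.wasSuccessful()
```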

flaky makes py.test --pep8 crash.

Initially reported here: errbotio/errbot#672
Similar issue here: #34

(botbot)➜  err-jenkins git:(master) coverage run --source jenkinsBot -m py.test --pep8
================================================= test session starts =================================================
platform linux -- Python 3.4.3, pytest-2.9.0, py-1.4.31, pluggy-0.3.1
rootdir: /home/jtanay/dev/botbot/plugins/err-jenkins, inifile: 
plugins: flaky-3.1.0, cov-2.2.1, xdist-1.14, pep8-1.0.6
collected 12 items 
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/main.py", line 94, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/main.py", line 125, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/main.py", line 150, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 253, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 278, in get_result
INTERNALERROR>     raise ex[1].with_traceback(ex[2])
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/flaky/flaky_pytest_plugin.py", line 68, in pytest_runtest_protocol
INTERNALERROR>     self._copy_flaky_attributes(item, test_instance)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/flaky/_flaky_plugin.py", line 404, in _copy_flaky_attributes
INTERNALERROR>     _, test_callable, _ = cls._get_test_declaration_callable_and_name(test)
INTERNALERROR>   File "/home/jtanay/.virtualenvs/botbot/lib/python3.4/site-packages/flaky/flaky_pytest_plugin.py", line 359, in _get_test_declaration_callable_and_name
INTERNALERROR>     elif hasattr(test.module, callable_name):
INTERNALERROR> AttributeError: 'Pep8Item' object has no attribute 'module'
============================================ no tests ran in 0.94 seconds =============================================

flaky doesn't work with py.test --doctest-modules

When doctests are enabled in pytest and flaky is installed, pytest raises an error when collecting the tests:

(splash)kmike ~/svn/splash [master+?]> py.test --doctest-modules splash
============================================================= test session starts =============================================================
platform darwin -- Python 2.7.5 -- py-1.4.26 -- pytest-2.6.4
plugins: flaky, greendots, xdist
collected 1035 items 
INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/main.py", line 84, in wrap_session
INTERNALERROR>     doit(config, session)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/main.py", line 122, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 413, in __call__
INTERNALERROR>     return self._docall(methods, kwargs)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 424, in _docall
INTERNALERROR>     res = mc.execute()
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 315, in execute
INTERNALERROR>     res = method(**kwargs)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/main.py", line 142, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 413, in __call__
INTERNALERROR>     return self._docall(methods, kwargs)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 424, in _docall
INTERNALERROR>     res = mc.execute()
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/_pytest/core.py", line 315, in execute
INTERNALERROR>     res = method(**kwargs)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 18, in pytest_runtest_protocol
INTERNALERROR>     PLUGIN.run_test(item, nextitem)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 102, in run_test
INTERNALERROR>     self._copy_flaky_attributes(item, test_instance)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/flaky/_flaky_plugin.py", line 253, in _copy_flaky_attributes
INTERNALERROR>     test_callable, _ = cls._get_test_callable_and_name(test)
INTERNALERROR>   File "/Users/kmike/envs/splash/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 222, in _get_test_callable_and_name
INTERNALERROR>     elif hasattr(test.module, callable_name):
INTERNALERROR> AttributeError: 'DoctestItem' object has no attribute 'module'

crash if using Nose short test naming convention

If a test uses Nose's "short style" (module-level function) tests, flaky fails to resolve the test name and crashes.

Test case:

test_nose_example.py

# USAGE:
#   nosetests -s --with-flaky ./test/test_nose_example.py

from nose.tools import eq_

def test1():
    eq_(1, 1)

flaky passes tests even if it fails on every try

When I run flaky with selenium, it appears that flaky suppresses errors and possibly failures as well.

The code below shows how the flaky decorator is applied (the complete integration test is long, so this is an abridged version).

@flaky(max_runs=3, min_passes=1)
class TestLoginModalInteractions(ModalTestCases):

    def test_ajax_modal_launch(self):
        # open modal from login link
        self.open_modal_link()

        # check everything is there
        self.assertIdExists('login_form')
        self.assertClassExists('mfp-bg')

        # check everything closes correct
        self.driver.find_element_by_class_name('mfp-bg').click()
        mfp_modal_blur = self.driver.find_elements_by_class_name('mfp-bg')
        self.assertTrue(len(mfp_modal_blur) == 0)

This is my output when I run the test with the flaky decorator:
[screenshot: test output with the flaky decorator applied]

and here is the output when I run it without the flaky decorator:
[screenshot: test output without the flaky decorator]

I've read over your docs; I think this is a bug, but correct me if I'm wrong.

AttributeError: _call_infos when running with pytester plugin

I'm using the pytester plugin to test my pytest plugin for Hypothesis. If I happen to have flaky installed when I do, it goes a bit wrong.

If I run the following test while flaky is installed:

import pytest

pytest_plugins = str('pytester')

TESTSUITE = """
def test_a_thing():
    pass

"""


def test_output_without_capture(testdir):
    script = testdir.makepyfile(TESTSUITE)
    result = testdir.runpytest(script, '--verbose', '--capture', 'fd')
    assert result.ret == 0

I get the following output:

INTERNALERROR> Traceback (most recent call last):
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/main.py", line 90, in wrap_session
INTERNALERROR>     session.exitstatus = doit(config, session) or 0
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/main.py", line 121, in _main
INTERNALERROR>     config.hook.pytest_runtestloop(session=session)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/main.py", line 146, in pytest_runtestloop
INTERNALERROR>     item.config.hook.pytest_runtest_protocol(item=item, nextitem=nextitem)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 724, in __call__
INTERNALERROR>     return self._hookexec(self, self._nonwrappers + self._wrappers, kwargs)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 338, in _hookexec
INTERNALERROR>     return self._inner_hookexec(hook, methods, kwargs)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 333, in <lambda>
INTERNALERROR>     _MultiCall(methods, kwargs, hook.spec_opts).execute()
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 595, in execute
INTERNALERROR>     return _wrapped_call(hook_impl.function(*args), self.execute)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 253, in _wrapped_call
INTERNALERROR>     return call_outcome.get_result()
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 279, in get_result
INTERNALERROR>     _reraise(*ex)  # noqa
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 264, in __init__
INTERNALERROR>     self.result = func()
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/_pytest/vendored_packages/pluggy.py", line 596, in execute
INTERNALERROR>     res = hook_impl.function(*args)
INTERNALERROR>   File "/home/david/projects/scratch/flakytest/flaker/lib/python2.7/site-packages/flaky/flaky_pytest_plugin.py", line 98, in pytest_runtest_protocol
INTERNALERROR>     del self._call_infos
INTERNALERROR> AttributeError: _call_infos

Note: The test passes (I'm not totally sure why), but it still causes this output.

I think this is because pytester is running in inprocess mode, so the plugin is being called recursively and trying to delete the attribute twice. Indeed, adding the flags "-p pytester --runpytest subprocess" to get pytester to default to subprocess mode causes the problem to go away.
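If that diagnosis is right, a defensive guard in the plugin's cleanup would avoid the crash when the hook runs re-entrantly. A minimal sketch of the pattern (a toy stand-in class, not flaky's actual code):

```python
class ProtocolPlugin:
    """Toy stand-in for a plugin that stashes per-protocol state on self."""

    def run_protocol(self, nested=False):
        self._call_infos = {}
        if nested:
            # Simulate pytester's in-process mode re-entering the hook.
            self.run_protocol(nested=False)
        # Guard the cleanup: a nested invocation may already have deleted
        # the attribute, so an unconditional `del` would raise
        # AttributeError here, as in the traceback above.
        if hasattr(self, '_call_infos'):
            del self._call_infos
```

With a plain `del self._call_infos`, the outer invocation would crash after the nested one had already removed the attribute.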

Rerun flaky tests after the others

If a flaky test fails, it looks like it's rerun immediately.

For flaky integration tests which run against a third party, the external service may only be temporarily down. Running the same test immediately won't help much.

But deferring it, and sticking it at the end of the queue so it's run after all the remaining tests, might give the external service enough time to sort itself out and let the flaky test pass.

I'm seeing this now (using pytest). A flaky test passes in one Travis job, but fails in the next.

Is there an option for deferring flaky tests until after the rest of the test suite?

If not, would it be possible to add one?

Thank you!
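As far as I can tell flaky doesn't offer this today. Framework details aside, the deferred-retry idea can be sketched on its own (a hypothetical helper, not part of flaky):

```python
def run_with_deferred_retries(tests, max_runs=2):
    # First pass: run every test once, remembering which ones failed.
    failures = []
    for test in tests:
        try:
            test()
        except Exception:
            failures.append(test)
    # Second pass: retry the failures only after the whole suite has run,
    # giving a temporarily-down external service time to recover.
    still_failing = []
    for test in failures:
        for _ in range(max_runs - 1):
            try:
                test()
                break
            except Exception:
                pass
        else:
            still_failing.append(test)
    return still_failing
```

A test that only fails transiently passes on the deferred second pass; a permanently broken one is still reported as failing.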

Call setup_method & teardown_method on each re-run

When using setup_class or setup_method to prepare fixtures for a test (like opening a browser in setup_method and closing it in teardown_method), shouldn't flaky call those methods as well on each run?

It might be related with #53

@flaky(min_passes=3, max_runs=6)
class TestFlaky:

    def setup_method(self, method):
        # It's called once
        pass

    def teardown_method(self, method):
        # Called once as well
        pass

    def test_fail_always(self):
        assert None

Delay

It would be useful to have an optional delay parameter that adds a sleep between retries.

An optional msg or message parameter would also be useful, as a way to record in the output why the test is expected to be flaky (especially when it flakes or fails).

Using rerun_filter causes INTERNALERROR

With the new version and the setup being rerun, I was able to start using flaky. But when I added a rerun_filter to the tests, I got the following error:

INTERNALERROR> TypeError: 'NoneType' object is not subscriptable

Some digging suggests that the err param is being passed as None to the rerun filter method:

def is_not_video_error(err, *args):
    return not issubclass(err[0], VideoProcessingException)

@flaky(rerun_filter=is_not_video_error)
class ...

Here's the full stack trace: http://pastebin.com/kY7nE8E8

Am I missing something in my code?
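For what it's worth, guarding against err being None makes the filter safe regardless of why flaky passes None (VideoProcessingException is stubbed out here so the sketch is self-contained):

```python
class VideoProcessingException(Exception):
    """Stub standing in for the real exception from the issue above."""

def is_not_video_error(err, *args):
    # err can apparently arrive as None (per the traceback above);
    # treat that as "not a video error" and allow the rerun.
    if err is None:
        return True
    # Otherwise err is an exc_info-style tuple: (type, value, traceback).
    return not issubclass(err[0], VideoProcessingException)
```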
