

A more pythonic testing framework.


Statistics on Testify

  • Number of watchers on Github: 285
  • Number of open issues: 10
  • Average time to close an issue: 8 months
  • Main language: Python
  • Average time to merge a PR: 7 days
  • Open pull requests: 3+
  • Closed pull requests: 11+
  • Last commit: about 3 years ago
  • Repo created: almost 10 years ago
  • Repo last updated: over 1 year ago
  • Size: 1.71 MB
  • Organization / Author: yelp
  • Latest release: v0.11.0

Testify - A Testing Framework

PLEASE NOTE: Yelp is in the process of switching to py.test. We recommend you use it instead of Testify.


Testify is a replacement for Python's unittest module and nose. It is modeled after unittest, and existing unittest classes are fully supported.

However, Testify has features above and beyond unittest:

  • Class-level setup and teardown fixture methods, which are run only once for an entire class of test methods.

  • A decorator-based approach to fixture methods, enabling features like lazily-evaluated attributes and context managers for tests.

  • Enhanced test discovery. Testify can drill down into packages to find test cases (similar to nose).

  • Support for detecting and running test suites, grouped by modules, classes, or individual test methods.

  • Pretty test runner output (hooray color!).

  • Extensible plugin system for adding additional functionality around reporting.

  • Comes complete with other handy testing utilities, including turtle (for mocking), code coverage integration, profiling, and numerous common assertion helpers for easier debugging.

  • More Pythonic naming conventions.

Example Test Case

from testify import *

class AdditionTestCase(TestCase):

    @class_setup
    def init_the_variable(self):
        self.variable = 0

    @setup
    def increment_the_variable(self):
        self.variable += 1

    def test_the_variable(self):
        assert_equal(self.variable, 1)

    @suite('disabled', reason='ticket #123, not equal to 2 places')
    def test_broken(self):
        # raises 'AssertionError: 1 !~= 1.01'
        assert_almost_equal(1, 1.01, threshold=2)

    @teardown
    def decrement_the_variable(self):
        self.variable -= 1

    @class_teardown
    def get_rid_of_the_variable(self):
        self.variable = None

if __name__ == "__main__":
    run()

Unittest Compatibility

Testify will discover and run unittest test cases without any code changes; just point it at a directory containing your tests:

$ testify my_unittests/

To take advantage of more advanced Testify features, just replace unittest.TestCase with testify.TestCase!
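To make the compatibility concrete, here is a minimal sketch of the same addition test written against the standard library's unittest, which Testify's discovery runs unchanged. Nothing in this block is Testify-specific.

```python
import unittest

class AdditionTestCase(unittest.TestCase):
    def setUp(self):
        # Per-test setup: the unittest spelling of Testify's @setup fixtures.
        self.variable = 0
        self.variable += 1

    def test_the_variable(self):
        self.assertEqual(self.variable, 1)
```

Once you want class-level fixtures, @let, or the other features below, the only required change is swapping unittest.TestCase for testify.TestCase.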


Fixtures

Testify provides the following fixtures for your enjoyment:

  • @setup: Run before each individual test method on the TestCase (that is, all methods that begin with 'test').

  • @teardown: Like setup, but run after each test completes (regardless of success or failure).

  • @class_setup: Run before a TestCase begins executing its tests. Note that this is not a class method; you still have access to the same TestCase instance as your tests.

  • @class_teardown: Like class_setup, but run after all tests complete (regardless of success or failure).

  • @setup_teardown: A context manager for individual tests, where test execution occurs during the yield. For example:

    @setup_teardown
    def mock_something(self):
        with mock.patch('foo') as foo_mock:
            self.foo_mock = foo_mock
            yield
        # this is where you would do teardown things
  • @class_setup_teardown: Like setup_teardown, but all of the TestCase's methods are run when this yields.

  • @let: This declares a lazily-evaluated attribute of the TestCase. When accessed, this attribute will be computed and cached for the life of the test (including setup and teardown). For example:

    @let
    def expensive_attribute(self):
        return expensive_function_call()

    def test_something(self):
        assert self.expensive_attribute

    def test_something_else(self):
        # no expensive call is made
        assert True
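The lazy-plus-cached behavior of @let can be illustrated with a small plain-Python descriptor. This is a sketch of the semantics only, not Testify's internals; the names Let, FakeTestCase, and expensive_function_call are ours.

```python
class Let:
    """Compute an attribute on first access, then cache it on the instance."""

    def __init__(self, func):
        self.func = func
        self.name = func.__name__

    def __get__(self, instance, owner):
        if instance is None:
            return self
        # Non-data descriptor: writing into the instance dict shadows
        # this descriptor, so later accesses skip the computation.
        value = self.func(instance)
        instance.__dict__[self.name] = value
        return value

calls = []

def expensive_function_call():
    calls.append(1)  # record how many times we actually compute
    return 42

class FakeTestCase:
    @Let
    def expensive_attribute(self):
        return expensive_function_call()

t = FakeTestCase()
assert t.expensive_attribute == 42  # first access computes the value...
assert t.expensive_attribute == 42  # ...later accesses hit the cache
assert len(calls) == 1
```

Testify additionally resets the cache between tests, so each test method sees a fresh computation; the sketch above only covers a single test's lifetime.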

Order of Execution

In pseudo code, Testify follows this schedule when running your tests:

   Run all 'class_setup' methods
   Enter all 'class_setup_teardown' context managers
   For each method beginning with 'test':
       Run all 'setup' methods
       Enter all 'setup_teardown' context managers
           Run the method and record failure or success
       Exit all 'setup_teardown' context managers
       Run all 'teardown' methods
   Exit all 'class_setup_teardown' context managers
   Run all 'class_teardown' methods
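The schedule above can be sketched in plain Python by recording each step. This is illustrative only (one fixture of each kind, names are ours), but the ordering matches the pseudocode.

```python
from contextlib import contextmanager

order = []

@contextmanager
def class_setup_teardown():
    order.append("class_setup_teardown: enter")
    yield
    order.append("class_setup_teardown: exit")

@contextmanager
def setup_teardown():
    order.append("setup_teardown: enter")
    yield
    order.append("setup_teardown: exit")

def run_schedule(test_methods):
    order.append("class_setup")
    with class_setup_teardown():
        for name in test_methods:
            order.append("setup")
            with setup_teardown():
                order.append(name)  # the test itself runs inside every fixture
            order.append("teardown")
    order.append("class_teardown")

run_schedule(["test_a", "test_b"])
```

Running this leaves `order` starting with the class-level setups, wrapping each test in its per-test fixtures, and ending with the class-level teardowns.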
...When Subclassing

Your fixtures are just decorated methods, so they can be inherited and overloaded as expected. When you introduce subclasses and mixins into the... mix, things can get a little crazy. For this reason, Testify makes a couple of guarantees about how your fixtures are run:

  • A subclass's fixture context is always contained within its parent's fixture context (as determined by the usual MRO). That is, fixture context is pushed and popped in FILO order.

  • Fixtures of the same type (and defined at the same level in the class hierarchy) will be run in the order they are defined on the class.
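The FILO guarantee can be sketched with ordinary inheritance: the subclass's setup runs inside the parent's context, so teardowns unwind in reverse. The class and method names here are illustrative, not Testify API.

```python
events = []

class Parent:
    def fixtures_setup(self):
        events.append("parent setup")

    def fixtures_teardown(self):
        events.append("parent teardown")

class Child(Parent):
    def fixtures_setup(self):
        super().fixtures_setup()      # parent context opens first...
        events.append("child setup")

    def fixtures_teardown(self):
        events.append("child teardown")
        super().fixtures_teardown()   # ...and closes last (FILO)

c = Child()
c.fixtures_setup()
events.append("test body")
c.fixtures_teardown()

assert events == [
    "parent setup",
    "child setup",
    "test body",
    "child teardown",
    "parent teardown",
]
```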

Testify open issues
  • almost 4 years Write Integration test for testing --test-case-result
  • over 4 years assert_almost_equal: complain if second argument is non-integer or negative
  • almost 6 years assert_raises_such_that should assert that the lambda's evaluation is None or is True
  • over 6 years assert_raises_and_contains does not have a contextmanager version
  • almost 7 years results from class_setup_teardown don't indicate whether they came from setup or teardown phase
  • about 7 years Use option groups to make --help more clear
  • over 7 years Add an option to set pattern to match files and/or test cases.
  • over 8 years testify doesn't include class_setup/teardown in "Total test time"
  • about 9 years add the ability to skip tests
Testify open pull requests
  • Adding integration test for json.result formatting
  • Don't print tput warning if no TERM
  • Support for tags on test classes and methods