
behave-django's Introduction

behave-django


Behave BDD integration for Django

Features

  • Web browser automation ready
  • Database transactions per scenario
  • Use Django's test client
  • Use unittest + Django assert library
  • Use behave's command line arguments
  • Use behave's configuration file
  • Fixture loading
  • Page objects

Version Support

behave-django is tested against the officially supported combinations of Python and Django (Django 3.2, 4.2, 5.0 on Python 3.7 through 3.12).

behave-django requires a few newer features of behave and hence installs a recent unreleased version of behave as a dependency.

Documentation

How to Contribute

Please, read the contributing guide in the docs.

behave-django's People

Contributors

avsd, bittner, blue-hope, fizista, ivancrneto, jenisys, kingbuzzman, mixxorz, morty, nhill-cpi, nikolas, pauloxnet, pydolan, sebastianmanger, sih4sing5hong5


behave-django's Issues

Dev-server has to already be running?

Hi,

I can't figure out what I'm missing. I'm just starting to use behave-django, and I'm finding that my tests run but fail when my Django dev-server at 127.0.0.1:8000 is not already running. When it is already running before I start the tests, the tests pass. I thought behave-django starts a dev server in the background, so I believe I'm missing something very basic. Can you help me out?

Thanks.

Scenario Independence

In the Fixture loading feature, I swapped the first two scenarios, 'Load fixtures' and 'Load fixtures for this scenario and feature'. The new second scenario fails. Scenarios should not pass only because they run in a specific order; scenarios should be totally independent.

The scenario 'Load fixtures for this scenario and feature' violates scenario independence with:

        context.fixtures.append('behave-second-fixture.json')

The fixture append is a permanent update to the feature and impacts the remaining scenarios.

I write test scenarios to be independent and run in random order.
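A common remedy for this, as a sketch: rebuild context.fixtures from a fresh copy in before_scenario, so an append made during one scenario cannot leak into the next. FEATURE_FIXTURES is a name made up for this example.

```python
# environment.py pattern (sketch): re-initialise the fixture list per scenario
# instead of mutating one shared list. FEATURE_FIXTURES is a hypothetical name.
FEATURE_FIXTURES = ['behave-fixtures.json']

def before_scenario(context, scenario):
    # list() takes a copy, so scenario-level appends are discarded afterwards
    context.fixtures = list(FEATURE_FIXTURES)
```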

Build fails, probably because behave can't find steps directory

The current build for PR #63 fails. When I run behave or python manage.py behave locally I get:

$ behave
ConfigError: No steps directory in '/home/peter/Development/repos/behave-django'
$ python manage.py behave
Creating test database for alias 'default'...
ConfigError: No steps directory in '/home/peter/Development/repos/behave-django'
Destroying test database for alias 'default'...

I've done some research and found some probably unrelated issue in the behave project. Not sure.

The problem seems to be related to how behave discovers the steps/ folder when paths are specified in the behave configuration.

Send-email testing with mail.outbox

I've written this simple test:

@when('asking for review and saving')
def act(self):
    site = Site(domain='http://127.0.0.1:8000', name='127.0.0.1')
    site.save()

    request = {'SITE_ID': site.id}
    self.asking_reservation = {'reservation_id': self.reservation1.id}
    self.deserialized_data = AskForReviewSerializer(
        data=self.asking_reservation, context={'request': request})


@then('validate data and send email for review asking')
def test(self):
    with freeze_time(self.reservation_creation_date_plus_two_months):
        assert self.deserialized_data.is_valid() == True
        assert len(mail.outbox) == 0

        self.deserialized_data.save()
        assert len(mail.outbox) == 1

And this is the serializer I'm testing which sends an email when saving:

class AskForReviewSerializer(serializers.Serializer):
    reservation_id = serializers.IntegerField()

    def save(self, *args, **kwargs):
        request = self.context.get('request')

        current_site = get_current_site(request)
        reservation_id = self.validated_data['reservation_id']

        reservation = models.Reservation.objects.get(pk=reservation_id)
        if not reservation.is_reviewable:
            raise serializers.ValidationError('This reservation is not reviewable')

        user = reservation.user

        path = f'/reservations/{reservation_id}/review'
        url = settings.BASE_URL + path
        context = {'current_site': current_site,
                   'user': user,
                   'review_url': url,
                   'request': request}

        get_adapter().send_mail(
            'review/email/ask_for_review',
            user.email,
            context)

    class Meta:
        fields = ('reservation_id', )

But mail.outbox is never affected when running the tests; its length is always zero (I ensured that my EMAIL_BACKEND is 'django.core.mail.backends.locmem.EmailBackend').

Full test: https://github.com/coretabs/dorm-portal/blob/master/features/steps/reviews.py#L117

Any idea how can I test the email sending behavior using behave-django?

Websocket support (django channels)

Is it possible to configure behave-django to test django apps with channels integrated?

For WSGI there is wsgiref used right in the documentation:

import threading
from wsgiref import simple_server
from selenium import webdriver
from my_application import model
from my_application import web_app

def before_all(context):
    context.server = simple_server.WSGIServer(('', 8000))
    context.server.set_app(web_app.main(environment='test'))
    context.thread = threading.Thread(target=context.server.serve_forever)
    context.thread.start()
    context.browser = webdriver.Chrome()

However Django channels is using ASGI and daphne: http://channels.readthedocs.io/en/latest/asgi.html

Currently I’m deploying my app (for debugging) by:

$ venv/bin/daphne <app>.asgi:channel_layer --port 80 --bind 0.0.0.0 -v2
$ venv/bin/python manage.py runworker --settings=<app>.settings -v2

How can I configure behave-django to use daphne (and run runworker)?

I tried to set "context.server" in "before_all", googled a lot but I still can't figure it out.

Should I override DjangoBehaveTestSuiteRunner and if so, how?

--noinput

Hi, when I run normal tests using the test management command and the test database already exists, the command line asks me whether I want the existing database to be deleted, e.g.:

Creating test database for alias 'default'...
Got an error creating the test database: (1007, "Can't create database 'test_project'; database exists")
Type 'yes' if you would like to try deleting the test database 'test_project', or 'no' to cancel:

The same prompt can pop up for the behave command.

The test management command (and many others) has a --noinput parameter to suppress these prompts and always confirm/overwrite:
https://docs.djangoproject.com/en/1.11/ref/django-admin/#cmdoption-test-noinput

Would it be possible to add --noinput to behave command as well?

Feature proposal: register fixtures for any 'step_impl'

Prerequisites

I am adding behave-django to an oldish project which provides many JSON fixtures. So I am working with various fixtures and many different steps. As fixtures are loaded in environment.py, knowing which fixtures are loaded for a given step_impl is not always transparent, at least for me. I have therefore developed a decorator for fixtures which can be applied to any step_impl.

The decorator accepts a list of strings that are then loaded in before_scenario in environment.py. As I prefer callables to JSON fixtures, the decorator also accepts paths to callables.

Example

This is a step:

@fixtures('languages.json', 'behave.fixtures.pages.wagtail_page')
@when('someone does something')
def step_impl(context):
    ...

In before_scenario in environment.py, the callables registered for all steps of the given scenario are executed, and all other entries are appended to context.fixtures.

Question

So, please let me know in case you'd like me to provide a pull request, or if you consider this too specific to my project. Thanks a lot for your efforts!
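For reference, the core of such a decorator can be sketched in plain Python. The names below are illustrative, not an actual behave-django API:

```python
def fixtures(*fixture_files):
    """Attach fixture names to a step function for later collection
    (sketch of the proposed decorator, not behave-django's code)."""
    def decorator(func):
        # Accumulate rather than overwrite, so stacked decorators compose
        registered = list(getattr(func, 'registered_fixtures', []))
        func.registered_fixtures = registered + list(fixture_files)
        return func
    return decorator

# Usage: stack it on top of a step implementation
@fixtures('languages.json', 'behave.fixtures.pages.wagtail_page')
def step_impl(context):
    pass
```

A before_scenario hook can then walk the scenario's steps and load whatever each step's registered_fixtures attribute names.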

Integrating with jenkins

Hey,

I tried this project and it works very well with my BDD scenarios. I would like to integrate it with Jenkins the same way I integrate my unit tests using django-jenkins, to see the results in the dashboard.

Is there a way to do it? If there isn't, could you point me in the right direction?

Are scenarios run in a database transaction?

Hello,
I have some test cases that appear to be failing because of how the database is reset after each scenario. Specifically, database sequences are not being reset. I stumbled on what seems like conflicting statements in the documentation.

From https://pythonhosted.org/behave-django/usage.html#database-transactions-per-scenario

Each scenario is run inside a database transaction, just like your regular TestCases

From the last paragraph of https://pythonhosted.org/behave-django/usage.html#fixture-loading

This is because Django’s LiveServerTestCase resets the test database after each scenario

Now if you dig into LiveServerTestCase via https://docs.djangoproject.com/en/1.9/topics/testing/tools/...

LiveServerTestCase does basically the same as TransactionTestCase

Or, LiveServerTestCase is a subclass of TransactionTestCase. And further down...

  • A TransactionTestCase resets the database after the test runs by truncating all tables. A TransactionTestCase may call commit and rollback and observe the effects of these calls on the database.
  • A TestCase, on the other hand, does not truncate tables after a test. Instead, it encloses the test code in a database transaction that is rolled back at the end of the test. This guarantees that the rollback at the end of the test restores the database to its initial state.

My first question is: is it correct that behave-django is using a LiveServerTestCase, which means no database transaction is happening?

Second, is there any way to change what happens when the database is reset? I.e., specify that the TestCase transaction behavior be used, or alternatively allow modifying things like TransactionTestCase.reset_sequences.

Reset the sequence of table

I was using fixtures to load data in the test. The problem I found is that the id of Django's auth_permission table was not starting from 1. Since the fixtures I was loading use the id of auth_permission as a foreign key, the error I got is that it cannot map the fixture id to the auth_permission id in the database.

@jenisys

AttributeError: 'NoneType' object has no attribute 'addError'

python manage.py behave estoresx/features

Complete log:

Feature: Register for new account and login # estoresx/features/account.feature:1
Traceback (most recent call last):
  File "manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 354, in execute_from_command_line
    utility.execute()
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 346, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/django/core/management/base.py", line 394, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/django/core/management/base.py", line 445, in execute
    output = self.handle(*args, **options)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave_django/management/commands/behave.py", line 92, in handle
    exit_status = behave_main(args=behave_args)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/__main__.py", line 111, in main
    failed = runner.run()
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/runner.py", line 659, in run
    return self.run_with_paths()
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/runner.py", line 680, in run_with_paths
    return self.run_model()
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/runner.py", line 481, in run_model
    failed = feature.run(self)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/model.py", line 461, in run
    failed = scenario.run(runner)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave/model.py", line 773, in run
    runner.run_hook('before_scenario', runner.context, self)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave_django/environment.py", line 62, in run_hook
    django_test_runner.before_scenario(context)
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/behave_django/environment.py", line 30, in before_scenario
    context.test()
  File "/home/ravi/bit/estores/env/local/lib/python2.7/site-packages/django/test/testcases.py", line 191, in __call__
    result.addError(self, sys.exc_info())
AttributeError: 'NoneType' object has no attribute 'addError'

All my behave tests pass when I use the --use-existing-database option with the command above.

behave attempts to install fixtures each time and fails since they already exist

Hi,

I have a couple of scenarios, all of which need the same set of fixtures. After the first scenario finishes, behave attempts to re-install the fixtures even though they've been loaded already. I've tried setup_databases and teardown_databases, thinking that would flush the fixtures. I've even tried calling "flush" directly from Django's call_command.

I'm not sure the best way to manage my fixtures. As part of my django.setup() I already call initial_data to load my db...which doesn't seem to be accessible unless I use fixtures in before_scenario.

How do I remove my fixtures after each scenario and prevent the error? Or, how do I load the fixtures only once?

Unable to access context.test in before_scenario

We've set up before_scenario to swap out Django's test client with Django Rest Framework's APIClient:

from rest_framework.test import APIClient

def before_scenario(context, scenario):
    context.test.client = APIClient()

This worked in 0.5.0. In 1.0.0, it seems that I can't access context.test at this point:

Exception AttributeError: 'PatchedContext' object has no attribute 'test'
Traceback (most recent call last):
  File "manage.py", line 30, in <module>
    main()
  File "manage.py", line 26, in main
    execute_from_command_line(sys.argv)
  File "/usr/local/lib/python3.6/site-packages/django/core/management/__init__.py", line 364, in execute_from_command_line
    utility.execute()
  File "/usr/local/lib/python3.6/site-packages/django/core/management/__init__.py", line 356, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/usr/local/lib/python3.6/site-packages/django/core/management/base.py", line 283, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/usr/local/lib/python3.6/site-packages/django/core/management/base.py", line 330, in execute
    output = self.handle(*args, **options)
  File "/usr/local/lib/python3.6/site-packages/behave_django/management/commands/behave.py", line 128, in handle
    exit_status = behave_main(args=behave_args)
  File "/usr/local/lib/python3.6/site-packages/behave/__main__.py", line 109, in main
    failed = runner.run()
  File "/usr/local/lib/python3.6/site-packages/behave/runner.py", line 672, in run
    return self.run_with_paths()
  File "/usr/local/lib/python3.6/site-packages/behave/runner.py", line 693, in run_with_paths
    return self.run_model()
  File "/usr/local/lib/python3.6/site-packages/behave/runner.py", line 471, in run_model
    self.run_hook('before_all', context)
  File "/usr/local/lib/python3.6/site-packages/behave_django/environment.py", line 91, in run_hook
    behave_run_hook(self, name, context, *args)
  File "/usr/local/lib/python3.6/site-packages/behave/runner.py", line 405, in run_hook
    self.hooks[name](context, *args)
  File "features/environment.py", line 14, in before_all
    context.test.client = APIClient()
  File "/usr/local/lib/python3.6/site-packages/behave/runner.py", line 214, in __getattr__
    raise AttributeError(msg)
AttributeError: 'PatchedContext' object has no attribute 'test'

As far as I can tell, I should be able to access context.test here.

Fix navigation drawbacks in docs

The documentation has a substantial drawback when it comes to navigating through content.

Problem example

See the Contributing chapter. How do you get back to the welcome page, for example?

This issue seems to be inherent to the Alabaster theme. Other projects have the same problem, see e.g. the behave docs, but it's less relevant there because the docs have the behave "logo" (the image with the words box) that takes you home. We should probably have something similar.

  • Can we fix the scarce navigation via some configuration switch? (Sphinx, Alabaster)
  • We should probably come up with a (Pony) logo and add it as a "home link" to the docs.

Logo ideas

"Given a pony, when I run behave with behave-django, then my project has awesome tests."

How to configure where to find fixtures?

Hi,

I am struggling to let behave know where my fixtures are. Currently, I have a shopozor_features folder at the root of my project containing all my features. This folder is configured in the setup.cfg file like this:

[behave]
paths = shopozor_features

Currently, my features folder is very small:

shopozor_features
  - Authentication
    - LogUserIn.feature
  - fixtures
    - user-data.json
  - steps
    - logUserIn.py
  - environment.py

Here's the code of my environment.py:

def before_scenario(context, scenario):
    context.fixtures = ['user-data.json']

def django_ready(context):
    context.test.client = ApiClient(user=AnonymousUser())
    context.django = True

Now, when I use it in my step, behave looks for the fixture file at the root of my project. How do I let behave know that the fixture file is located under shopozor_features/fixtures/user-data.json? I could of course write this:

def before_scenario(context, scenario):
    context.fixtures = ['shopozor_features/fixtures/user-data.json']

but my little finger tells me that there must be a way to tell behave that all the fixtures are located in the folder shopozor_features/fixtures. How do I achieve this?
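One straightforward option, sketched below, is to keep the directory prefix in a single place in environment.py. FIXTURE_DIR and load_fixtures are names made up for this example; note that Django's loaddata also searches the directories listed in the FIXTURE_DIRS setting and each app's fixtures/ subdirectory, which may be the cleaner route.

```python
import os

# Hypothetical helper: keep the fixtures directory in one place
FIXTURE_DIR = 'shopozor_features/fixtures'

def load_fixtures(context, *names):
    # Prefix every fixture name with the directory
    context.fixtures = [os.path.join(FIXTURE_DIR, name) for name in names]

def before_scenario(context, scenario):
    load_fixtures(context, 'user-data.json')
```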

Fixtures loaded via decorator do not always reset between scenarios

If running two scenarios that each load different fixtures and context.fixtures is already defined (e.g. in before_all or before_feature), then the fixtures loaded in the first scenario remain in the context for the second scenario.

For example, in the following setup, the "Testing 2" scenario would have fixtures "somedata.json", "test1.json", and "test2.json" loaded:

def before_all(context):
    context.fixtures = ['somedata.json']

############

@fixtures('test1.json')
@given('I have test1 data')
def step_impl(context):
    pass

@fixtures('test2.json')
@given('I have test2 data')
def step_impl(context):
    pass

############

Scenario: Testing 1
    Given I have test1 data

Scenario: Testing 2
    Given I have test2 data

Migrations data is invalid when testing

I inserted some data in my migrations.

I can use them in unittest.

But when I ran behave, I got an ObjectDoesNotExist exception.

I traced the code to the main function in behave/__main__.py:

    runner = Runner(config)

Before this line, the objects were in the database. After it, my database was empty.

How should I trace the code to fix this problem?

PS: pip freeze

behave==1.2.5
behave-django==0.3.0

Django Versions for supporting behave-django

Hi, there:

in README

behave-django is tested against the officially supported combinations of Python and Django (Django 1.8, 1.9, 1.10, 1.11, 2.0 on Python 2.7, 3.3, 3.4, 3.5, 3.6, excluding the aged Python 3.2).

However, in your repo, requirements.txt forces Django to be at least 1.11:
Django>=1.11

Can you clarify the supported Django versions? As of now, the latest behave-django is working well with Django 1.9 for me.

Thanks,

How can I see context.response.content in the browser?

Hello,
I'm facing a problem and hoping that someone can help me.

I'm trying to do:

I want to run an authentication test of a Django application using selenium with behave and behave-django. I also want to follow selenium's execution by accessing the vnc server (vnc://localhost:5900).

My environment:

macOS(v10.14.1), Docker(v18.06.1-ce), Django(v2.1.4), behave(v1.2.6), behave-django(v1.1.0), selenium(v3.141.0)

The problem:

The vnc server refuses access to localhost, but I can access other websites, for example https://www.google.com/. Why?


Also, I can probably inspect the error via context.response.content.split(), but it is very hard to read.


Here, I know that the status_code is 200 and that the page to be accessed certainly exists.

ipdb> context.response
<TemplateResponse status_code=200, "text/html; charset=utf-8">
ipdb> Page.objects.all()=6

Questions:

  1. How can I access localhost?
  2. How can I make context.response.content more readable?

Update documentation with minor detail about the fixtures decorator

While writing up notes on settings overriding, I was thinking about how the fixtures decorator works, and I realized that the order of steps doesn't matter with regards to when the fixture is loaded; it'll always be loaded before all steps.

For example, in the following, the 4th step (Then user with pk="1" has a name of "john") fails even though the decorator sits on step 3: the fixture loads before step 1 runs, step 1 then sets the user's name to "bob", and so the expectation of "john" in step 4 is not met:

# Step function:
@fixtures('users.json')
@given('Given I load a fixture of users (including pk=1 and name=john)')
def step_impl(context):
    pass

# Gherkin file:
Scenario: User test name
    Given I create a user with pk="1" and name="bob"
    Then user with pk="1" has a name of "bob"
    Given I load a fixture of users (including pk=1 and name=john)
    Then user with pk="1" has a name of "john"

I think this is fine, but the documentation should probably have a big warning so that users are fully aware of how it works.
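The behaviour can be illustrated with a plain-Python sketch (this mirrors the idea only, not behave-django's actual code): the runner makes one pass over all steps of a scenario to collect declared fixtures and load them, and only afterwards executes the steps in their written order.

```python
def run_scenario(steps, load_fixture, run_step):
    """Two-pass sketch: fixtures first, then step execution."""
    # Pass 1: gather fixtures declared on *any* step, regardless of position
    for func in steps:
        for fixture in getattr(func, 'registered_fixtures', []):
            load_fixture(fixture)
    # Pass 2: only now execute the steps in order
    for func in steps:
        run_step(func)
```

This is why the decorator's position within the scenario makes no difference to when its fixture data becomes visible.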

no context.base_url

I have behave-django 0.4.1 with behave 1.2.5, and splinter installed, but the context has no base_url, nor get_url or anything like that.

It has the following:
['BEHAVE', 'USER', '__class__', '__contains__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattr__', '__getattribute__', '__gt__', '__hash__', '__init__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_config', '_dump', '_emit_warning', '_mode', '_origin', '_pop', '_push', '_record', '_root', '_runner', '_set_root_attribute', '_stack', 'execute_steps', 'user_mode']

Fixture loading using decorators not working ?

Hello,

I'm facing a strange bug when trying to use the @fixtures decorator to load data. Here is my step definition:

from behave_django.decorators import fixtures
from behave import given


@fixtures("test_project.json")
@given('a default project database exist')
def project_database(context):
    pass

When I try to check whether the objects exist, they don't. I tried to find out where this could come from, so I added this to my environment.py:

def django_ready(context):
    print(context.test.fixtures)

It always prints None, hinting that there are no fixtures at all, even when I try to load them. I added a few prints, changing this plugin's environment.py file to look like this:

def load_registered_fixtures(context):
    """
    Apply fixtures that are registered with the @fixtures decorator.
    """
    for step in context.scenario.all_steps:
        match = step_registry.registry.find_match(step)
        print(step_registry.registry.steps)
        if match and hasattr(match.func, 'registered_fixtures'):
            if not context.test.fixtures:
                context.test.fixtures = []
            context.test.fixtures.extend(match.func.registered_fixtures)

The resulting print is a bit strange:

{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}
{'then': [], 'step': [], 'when': [], 'given': []}

So it looks like the steps are not recognized, but they are - the tests do run well, except for the part where they need that data. It looks like behave's step_registry is not the one used internally? I don't exactly know why. Anyway, the fixtures-loading function can't find any of the steps of the scenario, so it doesn't load anything...

Is there some way to make it load the fixtures as it should?

Thank you !

Reload data migrations after flush

Hello.

I have data migrations in my app. Migrations are executed once, after database creation. After each scenario the database is flushed, except for Django's own data (post_migrate). What is the best way to reload my own data migrations after each scenario?

Thanks

New port for each scenario...

I use Django + Behave + Selenium for the functional tests.

I just execute python manage.py behave, and for each scenario there is a new Django instance (or rather, for each test the port of the live server changes), which seems strange.

Do you know why?

Thank you

Landscape builds fail (after 27 Aug 2016)

The Landscape badge on top of the README shows a misleading health figure of 96%.

Misleading, because it doesn't reflect the current state of the code base. For some reason the last successful Landscape build was on 27 August 2016. I guess there is a permission issue. If you manually trigger the builds, they fail with an error.

@mixxorz Can you please take a look at the settings? I would have done that, but I've never been given full access to this repo. Hence, all I can do is ask.

Note that all I see in my Landscape account is tied to my GitHub user. The successful builds also happened on behalf of user @bittner, until 27 Aug 2016. Maybe that helps to lock down the issue.

Q: overriding settings

Is there a recommended way to override settings for the live server? Normally I'd use @override_settings(DEBUG=True) so I can see tracebacks while debugging, but had to settle for this:

def before_scenario(context, scenario):
    if "DEBUG" in scenario.tags:
        from django.conf import settings
        settings.DEBUG = True

(and corresponding after_scenario())

It works, but is there a better way?
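A slightly tidier variant of the same idea, sketched below with a stand-in settings object (in a real project you would target django.conf.settings, or reach for django.test.override_settings): save the old value in before_scenario and restore it in after_scenario, so the override cannot leak past the tagged scenario.

```python
class _Settings:
    # Stand-in for django.conf.settings, just for this sketch
    DEBUG = False

settings = _Settings()

def before_scenario(context, scenario):
    if 'DEBUG' in scenario.tags:
        # Remember the original value before overriding it
        context._saved_debug = settings.DEBUG
        settings.DEBUG = True

def after_scenario(context, scenario):
    if 'DEBUG' in scenario.tags:
        # Restore, whatever the original value was
        settings.DEBUG = context._saved_debug
```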

How to run Channels 2 worker?

Hi,
my tests worked fine so far with Channels 1, using the manual integration of behave. Now I'm upgrading to Channels 2, and I don't know how to start the workers anymore.
Until now I started some workers as threads in before_all, but with Channels 2 I only get the following error when I try to do that:

RuntimeError: There is no current event loop in thread 'Worker'

What can I do, any hints?

'There is 1 other session using the database.'

When I run:
python manage.py behave

Django begins preparing for tests but then throws this error. behave-django appears to be complaining about its own connection to the database (or Django is complaining about behave-django's use of the database). Any idea what's going on here?

Python 2.7
Django 1.9.8

Thanks.

Remove deprecation code (old hooks)

@mixxorz I plan to remove the old hooks functions that currently print out a deprecation warning. This is just one of the things I plan for the next version 0.4.0:

  1. Have deprecated code removed
  2. Switch from optparse to argparse
  3. Maybe add some handy, optional helpers (Selenium helpers for the moment, along with recommendations on how to use Selenium for testing Django projects)

Let me know what you think about all this.

Command behave seems to ignore verbosity argument

I'm getting the same output regardless of the value of -v / --verbosity:

verbosity: 3

/code # MODE=ci python manage.py behave --verbosity 3 --no-logcapture --no-skipped --tags="wip"
Creating test database for alias 'default'...
Feature: Provider List # features/providers/list.feature:1
  I want to see the list of providers.
  I should be able to access the page and see the full list.
  Background:   # features/providers/list.feature:5

  @wip
  Scenario: Accessing the provider list page.                    # features/providers/list.feature:20
    Given there are the following groups                         # features/steps/groups_steps.py:8 0.027s
      | name            | permissions |
      | Administradores | providers   |
      | Gerentes        |             |
    And there are the following users                            # features/steps/users_steps.py:8 0.652s
      | username | first_name | last_name | group           |
      | user1    | Gabriel    | Jesus     | Administradores |
      | user2    | Elias      | Gonçalves | Gerentes        |
    And there are the following providers                        # features/steps/providers_steps.py:12 1.652s
      | name                | type        | host                   | hosts |
      | Labrisco SUMA       | susemanager | http://labrisco.usp.br | 10    |
      | ViaVarejo Saltstack | saltstack   | http://via.varejo.br   | 20    |
    When accessing the page "providers:list" as "[email protected]" # features/steps/steps.py:37 0.791s
    Then I see the following providers                           # features/steps/providers_steps.py:35 0.006s
      | name                | status  | type                 | host                   | hosts_count |
      | Labrisco SUMA       | Working | SuseManager Provider | http://labrisco.usp.br | 10 found    |
      | ViaVarejo Saltstack | Working | Saltstack Provider   | http://via.varejo.br   | 20 found    |

1 feature passed, 0 failed, 11 skipped
1 scenario passed, 0 failed, 27 skipped
5 steps passed, 0 failed, 119 skipped, 0 undefined
Took 0m3.130s
Destroying test database for alias 'default'...

verbosity: 0

/code # MODE=ci python manage.py behave --verbosity 0 --no-logcapture --no-skipped --tags="wip"
Creating test database for alias 'default'...
Feature: Provider List # features/providers/list.feature:1
  I want to see the list of providers.
  I should be able to access the page and see the full list.
  Background:   # features/providers/list.feature:5

  @wip
  Scenario: Accessing the provider list page.                    # features/providers/list.feature:20
    Given there are the following groups                         # features/steps/groups_steps.py:8 0.031s
      | name            | permissions |
      | Administradores | providers   |
      | Gerentes        |             |
    And there are the following users                            # features/steps/users_steps.py:8 0.618s
      | username | first_name | last_name | group           |
      | user1    | Gabriel    | Jesus     | Administradores |
      | user2    | Elias      | Gonçalves | Gerentes        |
    And there are the following providers                        # features/steps/providers_steps.py:12 1.657s
      | name                | type        | host                   | hosts |
      | Labrisco SUMA       | susemanager | http://labrisco.usp.br | 10    |
      | ViaVarejo Saltstack | saltstack   | http://via.varejo.br   | 20    |
    When accessing the page "providers:list" as "[email protected]" # features/steps/steps.py:37 0.781s
    Then I see the following providers                           # features/steps/providers_steps.py:35 0.007s
      | name                | status  | type                 | host                   | hosts_count |
      | Labrisco SUMA       | Working | SuseManager Provider | http://labrisco.usp.br | 10 found    |
      | ViaVarejo Saltstack | Working | Saltstack Provider   | http://via.varejo.br   | 20 found    |

1 feature passed, 0 failed, 11 skipped
1 scenario passed, 0 failed, 27 skipped
5 steps passed, 0 failed, 119 skipped, 0 undefined
Took 0m3.094s
Destroying test database for alias 'default'...

Is anyone else getting the same behavior?

Django Timezone Change

Sometimes we need to use timezone.now() in Django, for example when creating models:
default=timezone.now()
And when we test this module with time-based tests, we have problems pinning down this time. My proposal is to make a context-aware function that changes Django's timezone.now to a value set in context.date.
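A minimal sketch of the proposal, using unittest.mock to pin the clock for one scenario. The helper name freeze_now and the idea of taking the frozen value from context.date are assumptions, not existing behave-django API:

```python
# Hypothetical helper: pin timezone.now() to a fixed value for one
# scenario. Relies on behave's context.add_cleanup() to undo the patch.
from datetime import datetime, timezone as tz
from unittest import mock


def freeze_now(context, target='django.utils.timezone.now', fixed=None):
    """Patch `target` to return `fixed` until the scenario ends."""
    fixed = fixed if fixed is not None else datetime(2020, 1, 1, tzinfo=tz.utc)
    patcher = mock.patch(target, return_value=fixed)
    patcher.start()
    # behave runs registered cleanups after the scenario, restoring the clock.
    context.add_cleanup(patcher.stop)
```

Called from before_scenario (e.g. whenever context.date is set), this would make every timezone.now() call inside the scenario return the frozen value.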

Move docs to ReadTheDocs and setup automated doc builds

Currently, we're hosting our docs on pythonhosted.org. While this is fine, we can't automate the process of releasing documentation. Moving to RTD will allow us to automate doc uploading.

I've actually never done something like this before. I'm assuming we set up some task on the CI server that uploads the docs to RTD?

Landscape badge not being updated

I've noticed that builds on Landscape aren't triggered anymore.

@mixxorz Can you check whether you can (re)connect the GitHub repository to Landscape? There is something broken here. If you don't have the necessary permissions either, @jenisys must connect the repo to Landscape.

Cheers!

"FOREIGN KEY constraint failed" exception raised in django_test_runner.teardown_test(context)

django==2.1.7
python3.7
behave_django==1.1.0
behave==1.2.6

When I run the "./manage.py behave flipr/features" command, it runs the test with an empty step implementation but raises an exception in django_test_runner.teardown_test(context) at environment.py(110). Running Django tests, however, works fine. Please let me know if you need any more information; any help is appreciated.

This is the full stacktrace:

(venv) ➜  flipr git:(partnership) ✗ ./manage.py behave  flipr/features
Creating test database for alias 'default'...
Feature: Partnership Tests # flipr/features/affiliate.feature:2

  Scenario: Create a New Story with affiliate ID         # flipr/features/affiliate.feature:3
    Given We access the endpoint "/plans/?affiliateId=1" # flipr/features/steps/steps.py:7 0.000s
Exception IntegrityError: FOREIGN KEY constraint failed
Traceback (most recent call last):
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/backends/base/base.py", line 239, in _commit
    return self.connection.commit()
sqlite3.IntegrityError: FOREIGN KEY constraint failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "./manage.py", line 15, in <module>
    execute_from_command_line(sys.argv)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/__init__.py", line 381, in execute_from_command_line
    utility.execute()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/__init__.py", line 375, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/base.py", line 316, in run_from_argv
    self.execute(*args, **cmd_options)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/base.py", line 353, in execute
    output = self.handle(*args, **options)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave_django/management/commands/behave.py", line 128, in handle
    exit_status = behave_main(args=behave_args)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/__main__.py", line 183, in main
    return run_behave(config)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/__main__.py", line 127, in run_behave
    failed = runner.run()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/runner.py", line 804, in run
    return self.run_with_paths()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/runner.py", line 824, in run_with_paths
    return self.run_model()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/runner.py", line 626, in run_model
    failed = feature.run(self)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/model.py", line 321, in run
    failed = scenario.run(runner)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave/model.py", line 758, in run
    runner.run_hook("after_scenario", runner.context, self)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave_django/environment.py", line 110, in run_hook
    django_test_runner.teardown_test(context)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave_django/environment.py", line 87, in teardown_test
    context.test._post_teardown(run=True)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/behave_django/testcase.py", line 20, in _post_teardown
    super(BehaviorDrivenTestMixin, self)._post_teardown()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/test/testcases.py", line 908, in _post_teardown
    self._fixture_teardown()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/test/testcases.py", line 943, in _fixture_teardown
    inhibit_post_migrate=inhibit_post_migrate)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/__init__.py", line 148, in call_command
    return command.execute(*args, **defaults)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/base.py", line 353, in execute
    output = self.handle(*args, **options)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/commands/flush.py", line 80, in handle
    emit_post_migrate_signal(verbosity, interactive, database)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/core/management/sql.py", line 51, in emit_post_migrate_signal
    **kwargs
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/dispatch/dispatcher.py", line 175, in send
    for receiver in self._live_receivers(sender)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/dispatch/dispatcher.py", line 175, in <listcomp>
    for receiver in self._live_receivers(sender)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/contrib/auth/management/__init__.py", line 79, in create_permissions
    Permission.objects.using(using).bulk_create(perms)
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/models/query.py", line 471, in bulk_create
    obj_without_pk._state.db = self.db
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/transaction.py", line 212, in __exit__
    connection.commit()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/backends/base/base.py", line 261, in commit
    self._commit()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/backends/base/base.py", line 239, in _commit
    return self.connection.commit()
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/utils.py", line 89, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/Users/fnegarestan/Documents/GitHub/flipr/venv/lib/python3.7/site-packages/django/db/backends/base/base.py", line 239, in _commit
    return self.connection.commit()
django.db.utils.IntegrityError: FOREIGN KEY constraint failed

Feature File:

Feature: Partnership Tests
  Scenario: Create a New Story with affiliate ID
    Given We access the endpoint "/plans/?affiliateId=1"

Step File:

from behave import *

@given('We access the endpoint "{url}"')
def step_impl(context, url):
    pass

Create a logo

We should probably come up with a (Pony) logo.

Where to use it

  • Logo of the GitHub repository
  • Logo in documentation (works as a "home link" to the sidebar)

Logo ideas

"Given a pony, when I run behave with behave-django, then my project has awesome tests."

(Note: This issue was originally part of issue #3)

Test fails if order changes

When I run this command (manage-room features first):

python manage.py behave features/manage-room.feature features/manage-reservations.feature --stop

Things go fine, but when I run it in the reverse order:

python manage.py behave features/manage-reservations.feature features/manage-room.feature --stop

It will run into this problem:

api.engine.models.RadioOption.DoesNotExist: RadioOption matching query does not exist.

I tried to find what's happening by adding these lines:

class DormManagementNewRoomSerializer(serializers.Serializer):

    def create(self, validated_data):

        print('lololololo', models.RadioOption.objects.all())
        print('lololololo', validated_data['room_type_id'])
        print(qqq) # to stop it here

        dormitory = models.Dormitory.objects.get(pk=self.context.get('dorm_pk'))

        # ...

And I noticed the id (13) doesn't exist in the RadioOption created objects:

 Captured stdout:
      lololololo <QuerySet [<RadioOption: Spring option id: 5>, <RadioOption: Winter option id: 6>, <RadioOption: Single option id: 7>, <RadioOption: Double option id: 8>, <RadioOption: Breakfast option id: 9>, <RadioOption: Dinner option id: 10>, <RadioOption: Big Balcony option id: 11>, <RadioOption: Small Balcony option id: 12>]>
      lololololo 13

The database should be flushed after each scenario, as mentioned in the documentation here (yet the ids seem to start from 5).

And when I use the --simple switch, things go fine in both previously mentioned cases.

I'm not sure whether this is a bug in behave-django or wrong usage on my side (since I use the same step in many scenarios).

Shared step implementation: https://github.com/coretabs/dorm-portal/blob/master/features/steps/manage-dorm.py#L26

[Question] Set server host and port for one scenario

I have one scenario inside of a feature file where the LiveServer needs to be running on port 8000 due to an external service I have no control over expecting it to be running there.

I've tried setting the port value in BehaviorDrivenTestCase inside of before_scenario but that causes the scenario AFTER the current one to run on port 8000.

I've also tried setting the port in before_all in the same manner and running all the tests on the same port but that causes an address in use error since the server isn't torn down before the next scenario starts.

Is there a "best practice" for setting the port for a scenario? I can work around this by creating a scenario before the one that needs to run on port 8000 whose sole purpose is to set the port number, but that doesn't seem like a good solution.
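There is no official per-scenario port API here, so this is only a workaround sketch for the "address in use" part of the problem: before a scenario binds the fixed port, wait until the previous live server has actually released it. The helper names are hypothetical, not behave-django API:

```python
# Hypothetical helpers: detect when a fixed port (e.g. 8000) becomes
# free again, so the next live server can bind it without EADDRINUSE.
import socket
import time


def port_is_free(port, host='localhost'):
    """Return True if nothing is currently listening on (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        # connect_ex returns 0 on success, i.e. something is listening.
        return sock.connect_ex((host, port)) != 0


def wait_until_free(port, timeout=5.0):
    """Poll until the port is free or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if port_is_free(port):
            return True
        time.sleep(0.1)
    return False
```

Calling wait_until_free(8000) from before_scenario (for the tagged scenario only) would at least avoid racing the teardown of the previous server.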

How to get features written for behave-django to work in PyCharm?

PyCharm has its own behave runner. Even with behave-django in the project, PyCharm's runner doesn't know how to hook up behave-django's database management or other features such as context.base_url. Is there a way to set up behave-django with an arbitrary runner, or would it require explicit support in behave-django or PyCharm or both?

Related:

How to generate test execution time breakdown

Hi, @mixxorz:

I want to have a logging about the test execution to analyze where the most of time has been consumed.

the list of timestamps could be as following:

  1. Time to set up the environment
  2. Time to create the database
  3. Time to load fixtures
  4. Time to execute test steps
  5. Time to destroy the database

Currently, I can use the behave API to know the time spent in each step, scenario, and feature, but I don't know how to measure the time spent creating and destroying the database.
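behave-django doesn't expose database setup/teardown timings directly, but for the phases you control from environment.py (fixture loading, environment setup), a small timer wrapped around each hook could produce the breakdown above. This is only a sketch; the phase names are illustrative:

```python
# Hypothetical phase timer for environment.py hooks: call start()/stop()
# around each phase you control, then read report() in after_all.
import time


class PhaseTimer:
    """Record named phases and their elapsed wall-clock seconds."""

    def __init__(self):
        self._starts = {}
        self.elapsed = {}

    def start(self, name):
        self._starts[name] = time.perf_counter()

    def stop(self, name):
        self.elapsed[name] = time.perf_counter() - self._starts.pop(name)

    def report(self):
        """Return {phase: seconds} rounded for logging."""
        return {name: round(secs, 3) for name, secs in self.elapsed.items()}
```

For example, before_scenario could call timer.start('fixtures') before loading fixtures and timer.stop('fixtures') afterwards. Database creation/destruction happens inside the management command before before_all and after after_all run, so those two phases can't be measured from the hooks alone.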

Thanks

Switch from optparse to argparse

As Django 1.9 complains with a deprecation warning (and more and more developers start using Django 1.9), we should consider switching to argparse in a backward-compatible manner. Affected code locations:

Compatibility Details

Up to Django 1.7, optparse's make_option is used for the option_list. The implementation of our management command depends heavily on that. Django 1.8 finally makes the switch to argparse. I fear an implementation change on our side will probably have to follow their approach and may need to take the Django version in use into account.
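For reference, the argparse style that Django 1.8+ expects (it hands the command a parser via Command.add_arguments) could look like this sketch; the option names below are illustrative, not the command's actual option list:

```python
# Sketch of the argparse style used by Django >= 1.8 management commands.
# In a real Command this would live in add_arguments(self, parser);
# the flags shown here are examples only.
import argparse


def build_parser():
    parser = argparse.ArgumentParser(prog='manage.py behave')
    parser.add_argument('--use-existing-database', action='store_true',
                        help='Reuse an existing test database (example flag).')
    parser.add_argument('--simple', action='store_true',
                        help='Use a non-transactional test case (example flag).')
    return parser
```

The backward-compatible part would then be keeping the make_option-based option_list for Django ≤ 1.7 while defining add_arguments for 1.8+.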

See Also

Historical Note

This issue is a continuation of issue 24 in the original repository location at mixxorz/behave-django.

django 1.10 compatibility

Hi,
I've started using Django, and felt like BDD for my project would be nice.

Is there any plan for Django 1.10 compatibility soon?

with behave-django 0.3.0:

› python manage.py behave
Traceback (most recent call last):
  File "manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/django/core/management/__init__.py", line 367, in execute_from_command_line
    utility.execute()
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/django/core/management/__init__.py", line 359, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/django/core/management/__init__.py", line 208, in fetch_command
    klass = load_command_class(app_name, subcommand)
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/django/core/management/__init__.py", line 40, in load_command_class
    module = import_module('%s.management.commands.%s' % (app_name, name))
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 665, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/behave_django/management/commands/behave.py", line 69, in <module>
    class Command(BaseCommand):
  File "/Users/laurent/.virtualenvs/proj/lib/python3.5/site-packages/behave_django/management/commands/behave.py", line 71, in Command
    option_list = BaseCommand.option_list + get_behave_options() + \
AttributeError: type object 'BaseCommand' has no attribute 'option_list'

According to SO, option_list has been deprecated for a while and was removed in Django 1.10:

http://stackoverflow.com/a/35181811/843194

Thanks!
