Running Django tests with Selenium in Docker

To run the tests, I usually run a separate container with:

docker-compose run --rm web /bin/bash 

Here, web is the container running Django. From that shell, I run py.test from time to time.

To reach Selenium from the Django container, and to let the browser in the Selenium container reach the Django live server, I decided to use the "net" parameter, which lets containers share a network stack. So I added this to the yml:

    selenium:
      image: selenium/standalone-firefox
      net: "container:web"

Unfortunately, this does not work: port 4444 is not visible from the Django container.

It only works if, instead of net: "container:web", I specify the name of the auto-generated container, for example net: "container:project_web_run_1".
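Spelled out, the variant that does work looks like this, with the container name Compose auto-generated for this project:

    selenium:
      image: selenium/standalone-firefox
      net: "container:project_web_run_1"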

Also, instead of docker-compose run --rm ..., I tried docker-compose up --no-deps with the command parameter changed to py.test functional_tests, but that didn't work either.

Is this the right way to use Selenium with containers?

+8
python django selenium
4 answers

This is how I do it. The main problem is that docker-compose run generates a different hostname each time (project_container_run_x, where x is hard to predict). In the end I just went with the IP address instead. I also make sure DEBUG is set to True during the test (see setUp below), otherwise I get a Bad Request.

I am using StaticLiveServerTestCase as follows:

    import os
    import socket

    from django.conf import settings
    from django.contrib.staticfiles.testing import StaticLiveServerTestCase
    from selenium import webdriver
    from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

    # Bind the live server to all interfaces so the selenium container can reach it.
    os.environ['DJANGO_LIVE_TEST_SERVER_ADDRESS'] = '0.0.0.0:8000'


    class IntegrationTests(StaticLiveServerTestCase):
        # Advertise the container's own IP address to the remote browser.
        live_server_url = 'http://{}:8000'.format(
            socket.gethostbyname(socket.gethostname())
        )

        def setUp(self):
            settings.DEBUG = True  # see the note above about Bad Request
            self.browser = webdriver.Remote(
                command_executor="http://selenium:4444/wd/hub",
                desired_capabilities=DesiredCapabilities.CHROME
            )

        def tearDown(self):
            self.browser.quit()
            super().tearDown()

        def test_home(self):
            self.browser.get(self.live_server_url)

My docker-compose file has this for selenium, and it extends the web service (where Django runs). Port 5900 is exposed for VNC. I like to keep this isolated in something like docker-compose.selenium.yml:

    version: '2'
    services:
      web:
        environment:
          SELENIUM_HOST: http://selenium:4444/wd/hub
          TEST_SELENIUM: 'yes'
        depends_on:
          - selenium
      selenium:
        image: selenium/standalone-chrome-debug
        ports:
          - "5900:5900"

I can then run the tests like this:

 docker-compose run --rm web ./manage.py test 

So my web container reaches Selenium through the host "selenium", and Selenium reaches the web container by its IP address, which is determined on the fly.
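The compose file above also injects SELENIUM_HOST and TEST_SELENIUM into the web container. The snippet below is only a hedged sketch of how those two variables could be consumed; it is not part of the original answer, which hardcodes the hub URL in setUp():

    import os
    import unittest

    from django.contrib.staticfiles.testing import StaticLiveServerTestCase
    from selenium import webdriver
    from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

    # Values injected by docker-compose.selenium.yml; the defaults here are
    # assumptions, not something the original answer defines.
    SELENIUM_HUB = os.environ.get('SELENIUM_HOST', 'http://selenium:4444/wd/hub')
    RUN_SELENIUM = os.environ.get('TEST_SELENIUM', 'no') == 'yes'


    @unittest.skipUnless(RUN_SELENIUM, 'TEST_SELENIUM is not set to "yes"')
    class SeleniumSmokeTest(StaticLiveServerTestCase):
        """Hypothetical test case; the live-server address setup from the
        IntegrationTests snippet earlier in this answer still applies."""

        def setUp(self):
            self.browser = webdriver.Remote(
                command_executor=SELENIUM_HUB,
                desired_capabilities=DesiredCapabilities.CHROME,
            )

        def tearDown(self):
            self.browser.quit()
            super().tearDown()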

One other gotcha: it is tempting to simply use the service name ("web") as the host name. If your docker run command starts a separate web container, this will appear to work. However, that container will not be using your test database, which makes this not a great test.

+6

I just specified host = 'web' on the LiveServerTestCase. Here is my working solution.

test.py

    from django.test import LiveServerTestCase
    from selenium import webdriver
    from selenium.webdriver.common.desired_capabilities import DesiredCapabilities


    class FunctionalTestCase(LiveServerTestCase):
        host = 'web'

        def setUp(self):
            self.browser = webdriver.Remote(
                command_executor="http://selenium:4444/wd/hub",
                desired_capabilities=DesiredCapabilities.FIREFOX
            )

        def test_user_registration(self):
            self.browser.get(self.live_server_url)
            self.assertIn('Django', self.browser.title)

        def tearDown(self):
            self.browser.close()

docker-compose.yml

    version: '3'
    services:
      db:
        image: postgres
      web:
        build: .
        ports:
          - "8000:8000"
        depends_on:
          - db
          - selenium
      selenium:
        image: selenium/standalone-firefox
0

In my case, the web container runs only one command, which is bash -c "sleep infinity".
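In compose-file terms that can look something like the sketch below; the surrounding service layout is assumed from the other answers, and only the command line is the point here:

    version: '3'
    services:
      web:
        build: .
        command: bash -c "sleep infinity"
        depends_on:
          - selenium
      selenium:
        image: selenium/standalone-firefox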

Then I start the whole stack with docker-compose up -d .

Then I use docker-compose exec web bash -c "cd /usr/src/app && tox", for example.

This way, my web host is reachable from selenium, always under the same name.

Using docker-compose run web ... generates a new host name every time (a predictable one, but still new).

0

For everyone who runs pytest, and possibly pytest-splinter (a Selenium wrapper):

    version: '3'
    services:
      db:
        image: postgres
      django:
        build: .
        ports:
          - "8000:8000"
        depends_on:
          - db
          - selenium
      selenium:
        image: selenium/standalone-firefox-debug:latest
        ports:
          - "4444:4444"  # Selenium
          - "5900:5900"  # VNC

Define a conftest.py file in the root directory so that these fixtures are available to all your tests:

    import socket

    import pytest
    from pytest_django.live_server_helper import LiveServer


    @pytest.fixture(scope='session')
    def test_server() -> LiveServer:
        addr = socket.gethostbyname(socket.gethostname())
        server = LiveServer(addr)
        yield server
        server.stop()


    @pytest.fixture(autouse=True, scope='function')
    def _test_server_helper(request):
        """
        Configures the test_server fixture so you don't have to mark tests
        with @pytest.mark.django_db
        """
        if "test_server" not in request.fixturenames:
            return
        request.getfixturevalue("transactional_db")


    # Settings below here are exclusive to splinter;
    # I'm just overriding the default browser fixture settings.
    # If you just use selenium, no worries, just take note of the remote url
    # and use it wherever you define your selenium browser.

    @pytest.fixture(scope='session')
    def splinter_webdriver():
        return 'remote'


    @pytest.fixture(scope='session')
    def splinter_remote_url():
        return 'http://selenium:4444/wd/hub'

Remember to set ALLOWED_HOSTS in your configuration file:

    # env() here is whatever your settings already use to read environment
    # variables (e.g. django-environ).
    if env('USE_DOCKER') == 'yes':
        import socket
        ALLOWED_HOSTS = [socket.gethostbyname(socket.gethostname())]
        # or just ALLOWED_HOSTS = ['*']

Then just test!

    from django.urls import reverse


    def test_site_loads(browser, test_server):
        browser.visit(test_server.url + reverse('admin:index'))
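The answer does not spell out the invocation; with the compose file above, running the suite could look something like this (the django service name comes from that file, the exact command is an assumption):

    docker-compose run --rm django pytest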
0
