Python tox: creating an RPM with a virtualenv as part of a CI pipeline, not sure where it fits in the workflow

I am exploring how Python applications can use a CI pipeline too, but I'm not sure how to set up a standard workflow.

Jenkins handles the initial repository clone and then invokes tox. This is basically the stage where Maven and/or MSBuild would fetch dependency packages and build; tox does the equivalent through pip, so everything is fine here.

Now for the tricky part: the last stage of the pipeline is building and uploading packages. Developers will probably upload the built packages to a local repository, but it should also be possible to create a deployment package. In this case it has to be an RPM containing a virtualenv of the application. I have built one manually using rpmvenv, but however it is done, how can I add such a step to the tox configuration? rpmvenv creates its own virtualenv, so it is a standalone command, so to speak.
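For reference, a minimal sketch of what such a tox environment might look like; the environment name and the rpmvenv JSON config filename are placeholders, not anything prescribed by tox or rpmvenv:

```ini
[tox]
envlist = py27,py35

[testenv]
deps =
    -rrequirements.txt
    pytest
commands = pytest

; Hypothetical packaging environment, invoked explicitly
; with `tox -e package` rather than as part of envlist.
[testenv:package]
skip_install = true
deps = rpmvenv
commands = rpmvenv rpm.json
```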

1 answer

I like to follow the Unix philosophy for this problem: have tools that each do one thing incredibly well, and glue them together. Tox is designed to run your tests in several different Python environments, so using it to then build a deb/rpm/etc. for you feels like a misuse. It is probably easier to use tox only to run all your tests, and then, depending on the results, have another step in your pipeline that builds a package from what was just tested.

Jenkins 2.x, which is quite recent at the time of this writing, seems to be much better suited to building pipelines. BuildBot is also under a decent amount of development, and it is already fairly easy to build a good pipeline with it.

What we do at my work:

  • Buildbot in AWS receives push notifications from GitHub on PRs
  • That launches a Docker container that pulls in the current code and runs tox (py.test, flake8, plus Protractor and Jasmine tests)
  • If the tox step passes, another Docker container is spun up to build a deb package
  • That deb package is pushed to S3, and Salt is told to update those machines
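The package-and-push step above can be sketched roughly as follows. The package name, version, and bucket are placeholders, and fpm's virtualenv source type is assumed; the output filename may differ on your system:

```shell
# Build a deb that bundles the application inside a virtualenv.
# -s virtualenv tells fpm to create a virtualenv from the given
# requirements file and package it (names below are placeholders).
fpm -s virtualenv -t deb \
    -n myapp -v 1.0.0 \
    --prefix /opt/myapp \
    requirements.txt

# Push the resulting package to S3 (bucket name is a placeholder).
aws s3 cp myapp_1.0.0_amd64.deb s3://my-bucket/packages/
```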

The deb package is also available as a build artifact, similar to what Jenkins 1.x would do. Once we are ready to go to staging, we just take that package and push it to a staging Debian repository manually. The same goes for pushing it to production.

The tools I found useful for all of this:

  • Buildbot, because it is in Python and easier for us to work with, but Jenkins would work just as well. Either way, it is the controller for the entire pipeline
  • Docker, because each assembly must be completely isolated from any other assembly
  • Tox, as a glorified test runner to handle all of those details
  • fpm builds the package. RPM, DEB, tar.gz, whatever. Very customizable and easy to script
  • Aptly simplifies managing Debian repositories and, in particular, publishing them to S3
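Publishing with Aptly might look roughly like this; the repo name, package filename, and S3 endpoint are placeholders, and the S3 endpoint has to be configured in aptly's own config file first:

```shell
# Create a local repo, add the built package, and publish it
# to a pre-configured S3 endpoint (all names are placeholders).
aptly repo create -distribution=stable myrepo
aptly repo add myrepo myapp_1.0.0_amd64.deb
aptly publish repo myrepo s3:my-endpoint:
```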