When should I use a requirements file versus install_requires in setup.py?

I am using pip with virtualenv to package and install some Python libraries.

I believe mine is a fairly common scenario. I maintain several libraries for which I can specify dependencies explicitly. Some of my libraries depend on third-party libraries that have transitive dependencies over which I have no control.

What I'm trying to achieve is for pip install of one of my libraries to download and install all of its upstream dependencies. What I can't work out from the pip documentation is whether requirements files can do this on their own, or whether they are really just a supplement to using install_requires.

Would I use install_requires in all my libraries to specify dependencies and version ranges, and then use a requirements file to resolve a conflict and/or freeze versions for a production build?

Assume that I live in an imaginary world (I know, I know) where my upstream dependencies are straightforward and guaranteed never to conflict or break backward compatibility. Would I be compelled to use a pip requirements file at all, or could I just let pip/setuptools/distribute install everything based on install_requires?

There are many similar questions here, but I couldn't find one that addresses something as basic as when to use one or the other, or how to use them together harmoniously.

+72
python pip setuptools setup.py distribute
4 answers

My philosophy is that install_requires should indicate the minimum of what you need. It may include version requirements if you know that some versions will not work; but it should not have version requirements where you aren't sure (for example, when you can't tell whether a future release of a dependency will break your library or not).

Requirements files, on the other hand, should indicate what you know does work, and may include optional dependencies that you recommend. For example, you might use SQLAlchemy but suggest MySQL, and so put MySQLdb in the requirements file.

So, in a nutshell: install_requires is there to keep people away from things that you know don't work, while requirements files lead people to versions that you know do work. One reason for this distinction is that install_requires requirements are always checked and cannot be disabled without actually changing the package metadata, so you cannot easily try a new combination. Requirements files are only checked at install time.
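As a concrete sketch of that split (the package name, dependencies, and version numbers here are invented for illustration), the package metadata stays loose while the requirements file pins versions and adds recommendations:

```python
# setup.py -- ships with the package, so keep constraints minimal
from setuptools import setup

setup(
    name="mylib",             # hypothetical package
    version="1.0",
    install_requires=[
        "sqlalchemy",         # required; no versions are known to be broken
        "simplejson>=2.1",    # older versions are known not to work
    ],
)

# requirements.txt -- checked only at install time, so it can be opinionated:
#     sqlalchemy==0.7.8
#     simplejson==2.6.2
#     MySQL-python==1.2.3    # optional backend you recommend
```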

+55

Here is what I put in my setup.py:

```python
# this grabs the requirements from requirements.txt
REQUIREMENTS = [i.strip() for i in open("requirements.txt").readlines()]

setup(
    ...
    install_requires=REQUIREMENTS,
)
```
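Note that the one-liner above will pass blank lines, comments, and pip-specific options (such as `-r other.txt`) straight into install_requires. A slightly more defensive sketch (the function name is mine, not from the answer):

```python
def parse_requirements(path="requirements.txt"):
    """Return requirement strings from a pip requirements file,
    skipping blank lines, comments, and pip options like '-r' or '-e'."""
    requirements = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip empties, '# comment' lines, and '-' prefixed pip options
            if not line or line.startswith("#") or line.startswith("-"):
                continue
            requirements.append(line)
    return requirements
```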
+12
Mar 11 '13 at 14:26

The Python Packaging User Guide has a page about this very topic; I highly recommend you read it:

Summary:

install_requires is a list of the packages that must be installed for your package to work. It is not meant to pin dependencies to specific versions, but ranges are accepted, for example install_requires=['django>=1.8'] . install_requires is observed by pip install name-on-pypi and other tools.
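As a toy illustration of why a range like django>=1.8 must be compared numerically rather than as a string (this handles only dotted numeric versions and is not pip's real PEP 440 specifier logic):

```python
def satisfies_min(version, minimum):
    """True if `version` meets a '>=' style minimum.
    Sketch for purely numeric dotted versions like '1.10';
    real resolvers follow the full PEP 440 rules."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(version) >= as_tuple(minimum)

# Lexically "1.10" < "1.8", yet Django 1.10 satisfies django>=1.8 --
# which is why versions are compared component by component.
```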

requirements.txt is just a text file that you can choose to run pip install -r requirements.txt against. It is meant to have the versions of all dependencies and sub-dependencies pinned, for example: django==1.8.1 . You can create it with pip freeze > requirements.txt . (Some services, such as Heroku, automatically run pip install -r requirements.txt for you.) pip install name-on-pypi does not look at requirements.txt , only at install_requires .
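You can approximate what pip freeze emits using only the standard library (importlib.metadata, available since Python 3.8); this is a rough sketch that ignores editable and URL-based installs:

```python
from importlib.metadata import distributions

def freeze():
    """Return sorted 'name==version' pins for every installed distribution,
    roughly the content of `pip freeze > requirements.txt`."""
    return sorted(
        f"{dist.metadata['Name']}=={dist.version}" for dist in distributions()
    )
```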

+5
Oct 12 '16 at 13:41

I use only setup.py and install_requires because then there is only one place to look. It is just as powerful as a requirements file, and there is no duplication to maintain.

+4
Nov 13 '13 at 16:43


