I am using pip with virtualenv to package and install some Python libraries.
I'd imagine what I'm trying to do is a fairly common scenario. I maintain several libraries for which I can specify dependencies explicitly. A few of my libraries depend on third-party libraries that have transitive dependencies over which I have no control.
What I'm trying to achieve is for a pip install of one of my libraries to download/install all of its upstream dependencies. What I'm struggling with in the pip documentation is whether/how requirements files can do this on their own, or whether they're really just a supplement to using install_requires.
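To make that concrete, here's a sketch of the kind of setup.py I have in mind (the package name mylib and the dependencies are made-up placeholders):

    # setup.py -- a minimal sketch; "mylib" and the dependency
    # names/versions below are hypothetical placeholders
    from setuptools import setup, find_packages

    setup(
        name="mylib",
        version="1.0.0",
        packages=find_packages(),
        # Abstract dependencies with version ranges; pip resolves and
        # installs these (and their transitive dependencies) when
        # someone runs `pip install mylib`.
        install_requires=[
            "requests>=2.0,<3.0",
            "some-third-party-lib>=1.4",
        ],
    )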
Would I use install_requires in all of my libraries to specify dependencies and version ranges, and then only use a requirements file to resolve conflicts and/or freeze them for a production build?
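For the freeze case, I'm imagining something like the following requirements file, e.g. captured from a working environment (the exact pins are hypothetical):

    # requirements.txt -- hypothetical pinned versions for a
    # reproducible build, e.g. generated with:
    #   pip freeze > requirements.txt
    requests==2.31.0
    some-third-party-lib==1.4.2
    transitive-dep==0.9.3

    # and later installed with:
    #   pip install -r requirements.txt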
Let's pretend I live in an imaginary world (I know, I know) where my upstream dependencies are straightforward and guaranteed never to conflict or break backward compatibility. Would I need to use a pip requirements file at all, or could I just let pip/setuptools/distribute install everything based on install_requires?
There are a lot of similar questions on here, but I couldn't find one that addresses something as basic as when to use one or the other, or how to use them both together harmoniously.
Tags: python, pip, setuptools, setup.py, distribute
Joe Holloway