
[Proposal] Include abstract dependencies in install_requires format in Pipfile.lock #98

Open
taion opened this issue Nov 27, 2017 · 6 comments

Comments

@taion
Contributor

taion commented Nov 27, 2017

Summary: It would be nice if Pipfile.lock contained the abstract dependencies in a way that could be used in install_requires in setup.py, as this would allow better tooling in the near term for library developers.


This is intended as an actionable follow-up to #27, https://github.com/kennethreitz/pipenv/issues/209.

To summarize many earlier discussions, across applications and libraries we can taxonomize the dependencies involved into five kinds: three abstract and two concrete.

  1. For applications, the abstract dependencies required to run the application
  2. For libraries, the abstract dependencies that consumers of the library must also add
  3. For both applications and libraries, the abstract dependencies required for a development environment; these almost always include (1) or (2) respectively, since you usually can't run tests without the runtime dependencies
  4. The concrete dependencies required to run an application, corresponding to (1)
  5. The concrete dependencies to develop an application or a library, corresponding to (3)

Currently, (1) and (3) are handled via Pipfile (or requirements.in). (2) is handled by setup.py. (4) and (5) are handled by Pipfile.lock (or requirements.txt).

In practice, however, (1) and (2) above are often managed in very similar ways. Also, the requirements around (5) are nearly identical for both libraries and applications.

This mostly works, but there is immediate room for improvement in tooling support. Specifically, my workflow for adding a dependency per (2) to a library closely resembles that for adding a dependency per (1) to an application. I want to add the dependency to the abstract specification, then lock down a version for my development environment per (5) to get a reproducible build environment (so I can e.g. distinguish test failures caused by dependency version bumps from those caused by my own code). While this is easy for an application with Pipfile, it's not really possible for a library using setup.py, since setup.py is arbitrary Python rather than structured data.

In an ideal world, this would be a non-issue if we had a setuptools entry in pyproject.toml, but we don't have that right now. This would also be less of an issue if we could just include packages from Pipfile in setup.py, but barring near-universal adoption of pyproject.toml, there's no real way to access a TOML parser in setup.py.

That leaves a last option, which is realizable – as Pipfile.lock is just JSON, if the abstract dependencies from Pipfile are made available in Pipfile.lock, then it would be straightforward to just forward through those dependencies from setup.py with an appropriately configured package manifest.
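
As a rough illustration of what that could look like (purely a sketch, with a made-up "abstract-dependencies" key that does not exist in Pipfile.lock today), setup.py could forward those entries into install_requires along these lines:

```python
# setup.py -- sketch only, not current pipenv/Pipfile behaviour.
# Assumes Pipfile.lock gained a hypothetical "abstract-dependencies" section
# that mirrors the [packages] table of Pipfile; no such section exists today.
import json
import os

from setuptools import find_packages, setup

here = os.path.abspath(os.path.dirname(__file__))

with open(os.path.join(here, "Pipfile.lock")) as f:
    lock = json.load(f)

install_requires = []
for name, spec in lock["abstract-dependencies"].items():  # hypothetical key
    if not isinstance(spec, str):
        continue  # VCS/path/extras entries would need richer handling
    install_requires.append(name if spec == "*" else name + spec)

setup(
    name="mylib",  # placeholder metadata
    version="0.1.0",
    packages=find_packages(),
    install_requires=install_requires,
)
```

The only other requirement is that Pipfile.lock actually ships with the package, e.g. by listing it in MANIFEST.in, which is what I mean by an appropriately configured package manifest.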

This would mitigate the gaps in the tooling currently available for Python library development. Since the forwarding would be handled within each package, it would be straightforward to remove this section from Pipfile.lock and move people to newer tooling once better solutions become available.

@taion
Contributor Author

taion commented Nov 27, 2017

I should list the alternatives here that I see:

  • Keep doing what we're doing right now – to add a dep for a library, add it by hand to setup.py, then add it separately to Pipfile, so that it shows up both as a transitive dependency and in my locked dev environment
    • It's what we already have, but it's cumbersome
  • As a tooling-only concern, generate another output file (like a pip-tools-style requirements.in) from Pipfile
    • This works, but requires carrying around another environment-ish file; it's already not ideal to have setup.py, setup.cfg, Pipfile, and Pipfile.lock – adding yet another one seems even worse
    • This is my second-best solution (see the sketch at the end of this comment)
  • Using pyproject.toml, make a TOML parser a build-system-level requirement, and pull from Pipfile directly
    • I don't think pyproject.toml is ubiquitous enough to make this safe
    • The raw Pipfile format isn't exactly what you want in setup.py anyway

Also, while in principle Pipfiles can include dependencies that can't be represented nicely in this manner (e.g. a GitHub repo or something), in practice for libraries this is unlikely to be a major concern.
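
To make the second alternative above concrete, the sketch could be a tiny dev-time script like the following; it assumes the third-party toml package is installed in the tooling environment, which is fine outside of setup.py:

```python
# gen_requirements_in.py -- sketch of the "generate another output file" alternative:
# regenerate a pip-tools-style requirements.in from Pipfile as a dev-time step.
# Assumes the third-party `toml` package is available in the tooling environment.
import toml

pipfile = toml.load("Pipfile")

with open("requirements.in", "w") as out:
    for name, spec in pipfile.get("packages", {}).items():
        if not isinstance(spec, str):
            continue  # VCS/path/extras entries would need richer handling
        out.write(name + ("" if spec == "*" else spec) + "\n")
```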

@wpietri

wpietri commented Dec 8, 2017

Hopefully this is the right place to add another request: a bit of documentation explaining in clear terms what should be done when converting some code into a library. I'm far from a Python expert, and I used Pipfile and pipenv to manage dependencies for an app. People requested I turn the app into a library, and it's been a struggle for me to figure out The Right Way to manage dependencies.

@tailhook

I also think that Pipfile.lock should contain the original version specifiers of the packages. Not just so they can be read from setup.py, but also so it's possible to quickly check whether the lockfile is out of date compared to Pipfile; in other words, to find out whether the user added something to Pipfile without updating Pipfile.lock yet.

I can elaborate on the use case more if needed. This is the same check yarn uses to tell whether yarn.lock is up to date (and we take advantage of it in vagga).
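
Roughly what I have in mind, assuming the abstract [packages] table were copied into a hypothetical "abstract-dependencies" section of Pipfile.lock (as far as I know, today's lockfile only records a hash of the Pipfile, so this exact check isn't possible yet):

```python
# check_lock.py -- sketch of a yarn.lock-style freshness check.
# Assumes Pipfile.lock carried a hypothetical "abstract-dependencies" copy of
# Pipfile's [packages] table; the key name is made up for this example.
import json
import sys

import toml

pipfile_packages = toml.load("Pipfile").get("packages", {})
with open("Pipfile.lock") as f:
    locked_abstract = json.load(f).get("abstract-dependencies", {})

if pipfile_packages != locked_abstract:
    sys.exit("Pipfile.lock is out of date with respect to Pipfile.")
print("Pipfile.lock matches Pipfile.")
```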

@tailhook

Any updates on this issue? It should be pretty trivial to do, and the maintenance cost should be minimal. Should I make a PR?

@bennofs

bennofs commented Aug 6, 2018

This feels to me like it is adding another quirk to Python packaging. If we do this, setup.py will depend on Pipfile.lock, which is very weird: abstract dependencies in setup.py would be generated from a file that is supposed to contain concrete dependencies (Pipfile.lock).

As the use case is for libraries, doesn't this also mean that libraries would now have to publish their Pipfile.lock so that setup.py works? But that's against the general principle that libraries are not supposed to publish their lockfile, since they need to be able to work with many different versions of their dependencies.

To me, a much better way to handle this would be to extend the Pipfile with enough information to automatically generate a setup.py from it. But maybe I am not understanding the distinction between setup.py and Pipfile well enough? Is there a case where you'd want to have both a Pipfile and a setup.py with different information? In all the cases I can imagine, these two files would contain the same information (with setup.py holding additional metadata, and Pipfile specifying extra deps for other environments): the abstract dependencies that my application/library depends on to run, i.e. (1) and (2) in the classification above.

@tailhook

tailhook commented Aug 6, 2018

> This feels to me like it is adding another quirk to Python packaging. If we do this, setup.py will depend on Pipfile.lock, which is very weird: abstract dependencies in setup.py would be generated from a file that is supposed to contain concrete dependencies (Pipfile.lock).

Sorry, but one of us is misunderstanding this issue.

My understanding is that Pipfile.lock would contain a copy of the requirements from either Pipfile or setup.py, so that it's easy to find out whether Pipfile (or setup.py) was changed without the lockfile being updated yet.

Am I misunderstanding something?
