Description
I'm trying to understand Python dependency management in general, and pipgrip's approach specifically. In particular, I'd like to understand whether it is possible, at all, to determine a package's dependencies without ever having to build wheels.
For example, running pipgrip to see the dependency tree of a package containing non-Python code gives different results depending on the version I want to analyze:
pipgrip --tree numpy==1.9.2
fails (clang compilation fails with a massive stack trace), whereas
pipgrip --tree numpy==1.19.1
succeeds. This appears to be because the latter (1.19.1) has a readily available wheel on PyPI that matches my Python interpreter (version, ABI, platform) [1], whereas the former (1.9.2) has no pre-built bdist wheel matching my interpreter and therefore needs to be built from the sdist (which requires a ton of packages to be installed on my machine).
Now, my question is whether it is strictly necessary for pipgrip to build wheels in order to determine dependencies.
Would it be possible to determine the dependency tree by only looking at sdist distributions (and hence never having to build wheels)? I'm probably missing something in my understanding of Python's dependency management, so feel free to enlighten me! :)
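For context on what metadata is available without building anything: PyPI's JSON API exposes a `requires_dist` list in a release's `info` block, but that field is only populated when an uploaded distribution actually carried the metadata (typically a wheel); for many old sdist-only releases it comes back as null, which is exactly the case where a resolver has to fall back to building. A minimal sketch of reading it, where the `sample` payload below is illustrative of the response shape only, not real PyPI data:

```python
import json
import urllib.request


def requires_dist(payload: dict):
    """Extract the requires_dist list from a PyPI JSON API payload.

    Returns None when the metadata was never uploaded (common for
    old sdist-only releases) -- the case where a tool would have to
    build the package to learn its dependencies.
    """
    return payload.get("info", {}).get("requires_dist")


def fetch_release(name: str, version: str) -> dict:
    # Live lookup against https://pypi.org/pypi/<name>/<version>/json
    url = f"https://pypi.org/pypi/{name}/{version}/json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


# Illustrative payload (shape only, not real PyPI data):
sample = {"info": {"requires_dist": ['pytest ; extra == "test"']}}
print(requires_dist(sample))
```

The entries in `requires_dist` are PEP 508 requirement strings, so any that carry environment markers still need to be evaluated against the target interpreter's marker environment (like the one in [1]) before they count as actual dependencies.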
[1] {'implementation_name': 'cpython', 'implementation_version': '3.8.2', 'os_name': 'posix', 'platform_machine': 'x86_64', 'platform_release': '5.4.0-42-generic', 'platform_system': 'Linux', 'platform_version': '#46-Ubuntu SMP Fri Jul 10 00:24:02 UTC 2020', 'python_full_version': '3.8.2', 'platform_python_implementation': 'CPython', 'python_version': '3.8', 'sys_platform': 'linux'}
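The dictionary in [1] is the PEP 508 marker environment for the interpreter. As a sketch, it can be reproduced with only the standard library (the `packaging` library's `markers.default_environment()` is the canonical implementation):

```python
import os
import platform
import sys


def format_version(info) -> str:
    # Render sys.implementation.version the way PEP 508 specifies;
    # for final CPython releases this matches python_full_version.
    version = f"{info.major}.{info.minor}.{info.micro}"
    if info.releaselevel != "final":
        version += info.releaselevel[0] + str(info.serial)
    return version


# Rebuild the marker environment shown in [1] for the current interpreter.
env = {
    "implementation_name": sys.implementation.name,
    "implementation_version": format_version(sys.implementation.version),
    "os_name": os.name,
    "platform_machine": platform.machine(),
    "platform_release": platform.release(),
    "platform_system": platform.system(),
    "platform_version": platform.version(),
    "python_full_version": platform.python_version(),
    "platform_python_implementation": platform.python_implementation(),
    "python_version": ".".join(platform.python_version_tuple()[:2]),
    "sys_platform": sys.platform,
}
print(env)
```

A wheel only has to match the tag-relevant parts of this environment (interpreter version, ABI, platform), which is why 1.19.1 resolved cleanly on the interpreter in [1] while 1.9.2 did not.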