Most RAPIDS wheels contain extension modules. However, after #33 we will have a number of pure C++ wheels that contain no Python code at all. We also have a handful of pure Python packages, namely dask_cudf and the wheels in the cugraph repo aside from cugraph and pylibcugraph. Those packages are handled in a somewhat specialized manner in the wheels workflows in order to produce pure Python wheels, but we do not handle this correctly for conda packages, where we still produce a package per minor version of Python. We should address this issue more holistically.
There are two parts to this request:
- Updating our shared workflows to support building pure wheels. The most important thing to do here is to create new workflows based on `wheels-build` and `conda-python-build` that only use a single Python version. We already do this manually in a few places (especially in the new jobs added while addressing Support dynamic linking between RAPIDS wheels #33), so the simplest solution I see here is creating workflows that wrap those preexisting workflows but pass in a matrix filter containing a `max_by(py_ver)`. The other thing that we may want to do here is forward along any other information specific to pure wheel builds. One example is the need to specify the `RAPIDS_PY_WHEEL_PURE` variable for various gha-tools to work correctly. We could set that appropriately in the environment of all jobs using this shared workflow. A rough sketch of such a wrapper follows this list.
- Updating conda recipes to properly produce packages without a Python ABI dependence. This will require that we remove the Python component of the build string, that we specify the packages as `noarch: python`, and that we ensure that the Python dependency becomes a `>=min_version` constraint instead of pinning to a specific version (this should be handled automatically if the package is built as `noarch: python`). A recipe sketch also follows this list.
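For the first item, a wrapper workflow could look roughly like the sketch below. The workflow file name, the `matrix_filter` input, and the `PY_VER` matrix key are assumptions for illustration; the actual inputs exposed by the shared workflows may differ.

```yaml
# .github/workflows/wheels-build-pure.yaml (hypothetical name)
# Wraps the existing wheels-build workflow but restricts the matrix to a
# single Python version, since a pure wheel only needs to be built once.
name: wheels-build-pure

on:
  workflow_call:
    inputs:
      build_type:
        required: true
        type: string
      script:
        required: true
        type: string

jobs:
  build:
    uses: rapidsai/shared-workflows/.github/workflows/wheels-build.yaml@main
    with:
      build_type: ${{ inputs.build_type }}
      script: ${{ inputs.script }}
      # Assumed matrix_filter input and PY_VER key: keep only the matrix entry
      # with the highest Python version. A real filter may need version-aware
      # comparison rather than plain string ordering.
      matrix_filter: "[max_by(.PY_VER)]"
    secrets: inherit
```

Note that environment variables like `RAPIDS_PY_WHEEL_PURE` cannot be set directly on a reusable-workflow call, so they would likely need to be set inside the wrapped workflow or forwarded through a dedicated input.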
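For the second item, the recipe changes amount to something like the following `meta.yaml` fragment. The package name, version, and Python lower bound here are placeholders, not the actual pins any given repo should use.

```yaml
# meta.yaml sketch for a pure Python package (name, version, and bounds are placeholders)
package:
  name: dask-cudf
  version: 0.0.0

build:
  # noarch: python produces a single package usable across Python versions,
  # so the py{minor} component disappears from the build string.
  noarch: python
  number: 0

requirements:
  host:
    - python >=3.9
    - pip
  run:
    # a lower bound rather than an exact pin; conda-build applies this
    # automatically for noarch: python packages
    - python >=3.9
```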