Improve variant prioritisation for mamba #43

@wolfv

Description
We're currently trying to fix mamba resolutions when variant packages are involved. It already works reasonably well when (directly) requiring a package that has a track feature applied. However, it stops working well when we have variants that do not directly expose any track_feature information.

For example, let's say we have 5 numpy builds:

  • numpy=1.20=py36
  • numpy=1.20=py37
  • numpy=1.20=py38
  • numpy=1.20=pypy36
  • numpy=1.20=pypy37

The metadata for numpy=1.20=py37 contains a dependency on python >=3.7,<3.8. However, the metadata for numpy=1.20=pypy37 contains exactly the same python dependency.
The dependencies differ only in python_abi, and the pypy37 variant package has an additional pypy3.7 dependency.
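To make the problem concrete, here is a sketch of what the two variants' repodata entries might look like. The exact dependency strings are illustrative assumptions, not real conda-forge metadata; the point is only that the plain python constraint is identical in both:

```python
# Hypothetical repodata entries (illustrative, not actual conda-forge metadata)
# for the CPython and PyPy variants of the same numpy build.
py37_variant = {
    "name": "numpy",
    "version": "1.20",
    "build": "py37",
    "depends": ["python >=3.7,<3.8", "python_abi 3.7.* *_cp37"],
}
pypy37_variant = {
    "name": "numpy",
    "version": "1.20",
    "build": "pypy37",
    "depends": ["python >=3.7,<3.8", "python_abi 3.7.* *_pypy37", "pypy3.7 >=7.3"],
}

# The plain python constraint is shared between the two variants, so a
# solver looking only at this dependency cannot tell which one to prefer.
shared = set(py37_variant["depends"]) & set(pypy37_variant["depends"])
```

Only the python_abi pin and the extra pypy3.7 dependency distinguish the variants; the track_feature that would down-weight PyPy lives on the python package itself, one level deeper.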

However, the de-prioritisation via track_features is applied to the python package itself. This makes it hard to pick the preferred solution right away, since we cannot figure it out without doing a full resolution.

In my opinion it would be preferable to either:

  • inherit track_features into the numpy variants to make sure that pypy packages are down-weighted. Nothing prevents us from adding track_features into the numpy-variant package as well, and I would say that it would be "more correct" to do so. However, a downside is that this might increase complexity, especially for the conda solver which does additional computations with the track_features that we do not take into account in mamba.
  • We could specifically pin the python package via build string for the pypy variant. If we exported a stricter dependency (e.g. python >=3.7,<3.8 *pypy) for numpy's pypy build, we would be able to "inherit" the de-prioritisation by inspecting only the first level of dependencies. This should be a fairly quick process. If we reached consensus on this idea, we could have a rule that a variant package must indicate its de-prioritisation in the first level of dependencies via build-string pinning (or on the package itself if the variants depend on differently named packages).
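The second option above could be sketched roughly as follows. This is not mamba's actual implementation; the function name and the simplified spec parsing (name, version spec, build spec split on whitespace) are assumptions for illustration:

```python
# Sketch of idea (2): detect a de-prioritized variant by inspecting only the
# first level of its dependencies for a build-string pin such as "*pypy".
# Real conda MatchSpec parsing is more involved; this split is simplified.

def is_deprioritized(depends, marker="pypy"):
    """Return True if any first-level dependency carries a build-string
    pin marking a down-weighted variant, e.g. 'python >=3.7,<3.8 *pypy'."""
    for dep in depends:
        parts = dep.split()  # [name, version_spec, build_spec?]
        if len(parts) == 3 and marker in parts[2]:
            return True
    return False
```

With such a convention, a solver could down-weight numpy=1.20=pypy37 after a single pass over its direct dependencies, without running a full resolution to discover the track_feature on python.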

Does conda-forge think that's reasonable? I think for the case of pypy we could do some pretty straightforward repodata patches to add *pypy to all packages that also require pypy3X.
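Such a repodata patch might look something like the sketch below. The helper name and the whitespace-based spec handling are hypothetical simplifications, not the actual conda-forge patch tooling:

```python
# Sketch of the proposed repodata patch: for every package that already
# depends on some pypy3.X package, tighten its plain "python <version>"
# dependency to also pin the pypy build string.

def patch_pypy_builds(repodata):
    for pkg in repodata.get("packages", {}).values():
        depends = pkg.get("depends", [])
        if any(d.split()[0].startswith("pypy3") for d in depends):
            pkg["depends"] = [
                # "python >=3.7,<3.8" (name + version spec only) -> add "*pypy"
                d + " *pypy" if len(d.split()) == 2 and d.split()[0] == "python" else d
                for d in depends
            ]
    return repodata
```

Applied once per channel, this would push the build-string pin into every pypy variant's first-level dependencies without rebuilding any packages.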
