
pantsbuild / pants / build 20147226056 (push via github, committer: web-flow)

11 Dec 2025 08:58PM UTC coverage: 78.827% (-1.5%) from 80.293%
Forwarded the `style` and `complete-platform` args from pants.toml to PEX (#22910)

## Context

After Apple moved Macs to the `arm64` architecture, some package
publishers stopped releasing `x86_64` variants of their packages for
`darwin`. As a result, generating a universal lockfile now fails: no
single package version is compatible with both `x86_64` and `arm64`
on `darwin`.

The solution is to use the `--style` and `--complete-platform` flags
with PEX. For example:
```
pex3 lock create \
    --style strict \
    --complete-platform 3rdparty/platforms/manylinux_2_28_aarch64.json \
    --complete-platform 3rdparty/platforms/macosx_26_0_arm64.json \
    -r 3rdparty/python/requirements_pyarrow.txt \
    -o python-pyarrow.lock
```

See the Slack discussion here:
https://pantsbuild.slack.com/archives/C046T6T9U/p1760098582461759
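
The change is about exposing those PEX flags through `pants.toml`. The option names in the sketch below are assumptions for illustration only (the excerpt above does not show them); consult the PR diff for the real names:
```toml
# Hypothetical sketch only -- the actual option names come from the PR diff,
# not from this excerpt.
[python.resolves_to_lock_style]
pyarrow = "strict"

[python.resolves_to_complete_platforms]
pyarrow = [
    "3rdparty/platforms/manylinux_2_28_aarch64.json",
    "3rdparty/platforms/macosx_26_0_arm64.json",
]
```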

## Reproduction

* `BUILD`
```
python_requirement(
    name="awswrangler",
    requirements=["awswrangler==3.12.1"],
    resolve="awswrangler",
)
```
* Run `pants generate-lockfiles --resolve=awswrangler` on macOS with an
`arm64` CPU; it fails with:
```
pip: ERROR: Cannot install awswrangler==3.12.1 because these package versions have conflicting dependencies.
pip: ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
pip:  
pip:  The conflict is caused by:
pip:      awswrangler 3.12.1 depends on pyarrow<18.0.0 and >=8.0.0; sys_platform == "darwin" and platform_machine == "x86_64"
pip:      awswrangler 3.12.1 depends on pyarrow<21.0.0 and >=18.0.0; sys_platform != "darwin" or platform_machine != "x86_64"
pip:  
pip:  Additionally, some packages in these conflicts have no matching distributions available for your environment:
pip:      pyarrow
pip:  
pip:  To fix this you could try to:
pip:  1. loosen the range of package versions you've specified
pip:  2. remove package versions to allow pip to attempt to solve the dependency conflict
```

## Implementation
... (continued)

77 of 100 new or added lines in 6 files covered. (77.0%)

868 existing lines in 42 files now uncovered.

74471 of 94474 relevant lines covered (78.83%)

3.18 hits per line

Source File: /src/python/pants/util/collections.py (88.06% of lines covered)
```python
# Copyright 2017 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).

from __future__ import annotations

import collections
import collections.abc
import gc
import math
from collections.abc import Callable, Iterable, Iterator, MutableMapping
from sys import getsizeof
from typing import Any, TypeVar

from pants.engine.internals import native_engine
from pants.util.strutil import softwrap


def recursively_update(d: MutableMapping, d2: MutableMapping) -> None:
    """dict.update but which merges child dicts (dict2 takes precedence where there's conflict)."""
    for k, v in d2.items():
        if k in d:
            if isinstance(v, dict):
                recursively_update(d[k], v)
                continue
        d[k] = v
```
```python
def deep_getsizeof(o: Any, ids: set[int]) -> int:
    """Find the memory footprint of the given object.

    To avoid double-counting, `ids` should be a set of object ids which have been visited by
    previous calls to this method.
    """
    # NB: This function is uncovered in this build.
    if id(o) in ids:
        return 0

    d = deep_getsizeof
    r = getsizeof(o)
    ids.add(id(o))

    return r + sum(d(x, ids) for x in gc.get_referents(o))
```
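
A standalone restatement shows why the recursion must pass `o` to `gc.get_referents`: that call returns the objects `o` directly refers to, which is what makes nested contents count toward the total:

```python
import gc
from sys import getsizeof
from typing import Any


def deep_getsizeof(o: Any, ids: set[int]) -> int:
    """Recursively sum getsizeof over `o` and everything it references."""
    if id(o) in ids:
        return 0
    ids.add(id(o))
    return getsizeof(o) + sum(deep_getsizeof(x, ids) for x in gc.get_referents(o))


nested = [1, [2, 3]]
deep = deep_getsizeof(nested, set())
# The deep size includes the inner list and the ints, so it exceeds the shallow size.
assert deep > getsizeof(nested)
```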
```python
_T = TypeVar("_T")


def assert_single_element(iterable: Iterable[_T]) -> _T:
    """Get the single element of `iterable`, or raise an error.

    :raise: :class:`StopIteration` if there is no element.
    :raise: :class:`ValueError` if there is more than one element.
    """
    it = iter(iterable)
    first_item = next(it)

    try:
        next(it)
    except StopIteration:
        return first_item

    # NB: This line is uncovered in this build.
    raise ValueError(f"iterable {iterable!r} has more than one element.")
```
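
A quick standalone check of the contract: exactly one element returns it, zero elements propagates `StopIteration` from `next`, and two or more raise `ValueError`:

```python
def assert_single_element(iterable):
    """Return the sole element; StopIteration if empty, ValueError if more than one."""
    it = iter(iterable)
    first_item = next(it)  # raises StopIteration on an empty iterable
    try:
        next(it)
    except StopIteration:
        return first_item
    raise ValueError(f"iterable {iterable!r} has more than one element.")


assert assert_single_element([42]) == 42

try:
    assert_single_element([1, 2])
    multi_error = ""
except ValueError as e:
    multi_error = str(e)
assert "more than one element" in multi_error
```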
```python
def ensure_list(
    val: Any | Iterable[Any], *, expected_type: type[_T], allow_single_scalar: bool = False
) -> list[_T]:
    """Ensure that every element of an iterable is the expected type and convert the result to a
    list.

    If `allow_single_scalar` is True, a single value T will be wrapped into a `List[T]`.
    """
    if isinstance(val, expected_type):
        if not allow_single_scalar:
            raise ValueError(f"The value {val} must be wrapped in an iterable (e.g. a list).")
        return [val]
    if not isinstance(val, collections.abc.Iterable):
        # NB: This branch is uncovered in this build.
        raise ValueError(
            f"The value {val} (type {type(val)}) was not an iterable of {expected_type}."
        )
    result: list[_T] = []
    for i, x in enumerate(val):
        if not isinstance(x, expected_type):
            raise ValueError(
                softwrap(
                    f"""
                    Not all elements of the iterable have type {expected_type}. Encountered the
                    element {x} of type {type(x)} at index {i}.
                    """
                )
            )
        result.append(x)
    return result


def ensure_str_list(val: str | Iterable[str], *, allow_single_str: bool = False) -> list[str]:
    """Ensure that every element of an iterable is a string and convert the result to a list.

    If `allow_single_str` is True, a single `str` will be wrapped into a `List[str]`.
    """
    return ensure_list(val, expected_type=str, allow_single_scalar=allow_single_str)
```
```python
def partition_sequentially(
    items: Iterable[_T],
    *,
    key: Callable[[_T], str],
    size_target: int,
    size_max: int | None = None,
) -> Iterator[list[_T]]:
    """Stably partitions the given items into batches of around `size_target` items.

    The "stability" property refers to avoiding adjusting all batches when a single item is added,
    which could happen if the items were trivially windowed using `itertools.islice` and an
    item was added near the front of the list.

    Batches will optionally be capped to `size_max`, but note that this can weaken the stability
    properties of the bucketing, by forcing bucket boundaries to be created where they otherwise
    might not.
    """

    # To stably partition the arguments into ranges of approximately `size_target`, we sort them,
    # and create a new batch sequentially once we encounter an item hash prefixed with a threshold
    # of zeros.
    #
    # The hashes act like a (deterministic) series of rolls of an evenly distributed die. The
    # probability of a hash prefixed with Z zero bits is 1/2^Z, and so to break after N items on
    # average, we look for `Z == log2(N)` zero bits.
    #
    # Breaking on these deterministic boundaries reduces the chance that adding or removing items
    # causes multiple buckets to be recalculated. But when a `size_max` value is set, it's possible
    # for adding items to cause multiple sequential buckets to be affected.
    zero_prefix_threshold = math.log(max(1, size_target), 2)

    batch: list[_T] = []

    def emit_batch() -> list[_T]:
        assert batch
        result = list(batch)
        batch.clear()
        return result

    keyed_items = []
    for item in items:
        keyed_items.append((key(item), item))
    keyed_items.sort()

    for item_key, item in keyed_items:
        batch.append(item)
        prefix_zero_bits = native_engine.hash_prefix_zero_bits(item_key)
        if prefix_zero_bits >= zero_prefix_threshold or (size_max and len(batch) >= size_max):
            yield emit_batch()
    if batch:
        yield emit_batch()
```
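
The partitioning trick depends on `native_engine.hash_prefix_zero_bits`, which lives in the Rust engine. The stand-in below is an assumption for illustration: it counts leading zero bits of a `hashlib.sha256` digest, which has the same "deterministic die roll" distribution and lets the batching logic run in pure Python:

```python
import hashlib
import math


def _hash_prefix_zero_bits(s: str) -> int:
    # Stand-in (assumption) for native_engine.hash_prefix_zero_bits:
    # count leading zero bits of a stable hash of the key.
    digest = hashlib.sha256(s.encode()).digest()
    return 256 - int.from_bytes(digest, "big").bit_length()


def partition_sequentially(items, *, key, size_target, size_max=None):
    """Pure-Python sketch of the batching loop above."""
    zero_prefix_threshold = math.log(max(1, size_target), 2)
    batch = []
    for item_key, item in sorted((key(item), item) for item in items):
        batch.append(item)
        if _hash_prefix_zero_bits(item_key) >= zero_prefix_threshold or (
            size_max and len(batch) >= size_max
        ):
            yield list(batch)
            batch.clear()
    if batch:
        yield list(batch)


items = [f"item-{i}" for i in range(100)]
batches = list(partition_sequentially(items, key=str, size_target=10, size_max=20))
# Concatenating the batches reproduces the sorted input, and size_max is respected.
assert [x for b in batches for x in b] == sorted(items)
assert all(len(b) <= 20 for b in batches)
```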

© 2025 Coveralls, Inc