
pantsbuild / pants — build 19015773527

02 Nov 2025 05:33PM UTC — coverage: 17.872% (-62.4% from 80.3%)

Pull Request #22816: Update Pants internal Python to 3.14
Merge a12d75757 into 6c024e162 (github / web-flow)

4 of 5 new or added lines in 3 files covered (80.0%).
28452 existing lines in 683 files now uncovered.
9831 of 55007 relevant lines covered (17.87%).
0.18 hits per line.
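As a quick sanity check (a sketch, not part of the report), the headline percentage follows directly from the covered/relevant counts above, and the hits-per-line figure is consistent with each covered line having been hit roughly once:

```python
# Numbers copied from the report above.
covered_lines = 9831
relevant_lines = 55007

# Reported overall coverage: 17.872%.
coverage_pct = round(covered_lines / relevant_lines * 100, 3)
print(coverage_pct)  # 17.872

# Reported "0.18 hits per line": with ~one hit per covered line,
# total hits / relevant lines rounds to the same figure.
hits_per_line = round(covered_lines / relevant_lines, 2)
print(hits_per_line)  # 0.18
```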

Source File

/src/python/pants/util/requirements.py — 25.0% of lines covered
(1✔ = line covered; × = line uncovered in this build)

 1        # Copyright 2023 Pants project contributors (see CONTRIBUTORS.md).
 2        # Licensed under the Apache License, Version 2.0 (see LICENSE).
 3  1✔    from collections.abc import Iterator
 4
 5  1✔    from pants.util.pip_requirement import PipRequirement
 6
 7
 8  1✔    def parse_requirements_file(content: str, *, rel_path: str) -> Iterator[PipRequirement]:
 9            """Parse all `PipRequirement` objects from a requirements.txt-style file.
10
11            This will safely ignore any options starting with `--` and will ignore comments. Any pip-style
12            VCS requirements will fail, with a helpful error message describing how to use PEP 440.
13            """
14  ×       for i, line in enumerate(content.splitlines(), start=1):
15  ×           line, _, _ = line.partition("--")
16  ×           line = line.strip().rstrip("\\")
17  ×           if not line or line.startswith(("#", "-")):
18  ×               continue
19
20                # Strip comments which are otherwise on a valid requirement line.
21  ×           comment_pos = line.find("#")
22  ×           if comment_pos != -1:
23  ×               line = line[0:comment_pos].strip()
24
25  ×       yield PipRequirement.parse(line, description_of_origin=f"{rel_path} at line {i}")
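To illustrate what the now-uncovered loop does, here is a self-contained sketch of the same line-filtering logic. Since `pants.util.pip_requirement.PipRequirement` is internal to Pants, this sketch yields the cleaned requirement strings (with their 1-based line numbers) instead of parsed objects; the filtering steps mirror the function above.

```python
def iter_requirement_lines(content: str):
    """Yield (line_number, requirement_string) from requirements.txt-style text.

    Mirrors the filtering in parse_requirements_file: drops `--` options,
    blank lines, comment lines, and other `-`-prefixed option lines.
    """
    for i, line in enumerate(content.splitlines(), start=1):
        line, _, _ = line.partition("--")  # drop pip options such as --hash=...
        line = line.strip().rstrip("\\")   # trim whitespace and line continuations
        if not line or line.startswith(("#", "-")):
            continue                       # skip comments and -r/-c style options
        comment_pos = line.find("#")       # strip trailing comments on valid lines
        if comment_pos != -1:
            line = line[0:comment_pos].strip()
        yield i, line


# Hypothetical input for illustration.
sample = """\
# pinned deps
requests==2.31.0  # HTTP client
--no-binary :all:
-r other.txt
flask>=2.0
"""
print(list(iter_requirement_lines(sample)))
# [(2, 'requests==2.31.0'), (5, 'flask>=2.0')]
```

Note that option lines, comment lines, and the trailing comment on line 2 are all filtered out before anything would reach `PipRequirement.parse`.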