
kivy / python-for-android / 6215290912

17 Sep 2023 06:49PM UTC coverage: 59.095% (+1.4%) from 57.68%
Build 6215290912 · push · github · web-flow
Merge pull request #2891 from misl6/release-2023.09.16

* Update `cffi` recipe for Python 3.10 (#2800)

* Update __init__.py

version bump to 1.15.1

* Update disable-pkg-config.patch

adjust patch for 1.15.1

* Use build rather than pep517 for building (#2784)

pep517 has been renamed to pyproject-hooks, and as a consequence all of
the deprecated functionality has been removed. build now provides the
required functionality, and since we are only interested in the
metadata, we can leverage a helper function for that. I have also
removed all of the subprocess machinery for calling the wrapping
function, since build appears to be less noisy than pep517.
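
A minimal sketch of the new approach (the helper in question is `build.util.project_wheel_metadata`, which pythonpackage.py now calls; the path below is only a placeholder):

    import build.util

    # Build the project's wheel metadata in an isolated environment and
    # return it as a PackageMetadata (email.message-style) object:
    metadata = build.util.project_wheel_metadata("/path/to/source/tree")
    print(metadata["Name"])
    print(metadata.get_all("Requires-Dist"))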

* Bump actions/setup-python and actions/checkout versions, as old ones are deprecated (#2827)

* Removes `mysqldb` recipe, as it does not support Python 3 (#2828)

* Removes `Babel` recipe as it's not needed anymore. (#2826)

* Remove dateutil recipe, as it's not needed anymore (#2829)

* Optimize CI runs, by avoiding unnecessary rebuilds (#2833)

* Remove `pytz` recipe, as it's not needed anymore (#2830)

* `freetype` recipe: Changed the URL to use https, as http doesn't work (#2846)

* Fix `vlc` recipe build (#2841)

* Correct sys_platform (#2852)

On Windows, sys.platform = "win32".

I think "nt" is a reference to os.name.
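
For reference, the relevant standard-library values (sys.platform is what PEP 508 environment markers compare against as sys_platform):

    import os
    import sys

    # On Windows:  sys.platform == "win32",  os.name == "nt"
    # On Linux:    sys.platform == "linux",  os.name == "posix"
    print(sys.platform, os.name)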

* Fix code string - quickstart.rst

* Bump `kivy` version to `2.2.1` (#2855)

* Use a pinned version of `Cython` for now, as most of the recipes are incompatible with `Cython==3.x.x` (#2862)

* Automatically generate required pre-requisites (#2858)

`get_required_prerequisites()` maintains a list of Prerequisites required by each platform.

But that same information is already stored in each Prerequisite class.

Rather than maintaining two lists which might become inconsistent, auto-generate one.
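
A rough sketch of the idea (illustrative only; the actual class and attribute names in python-for-android's prerequisites module may differ):

    # Hypothetical: derive the per-platform list from the Prerequisite
    # classes themselves instead of maintaining a second, hand-written list.
    def get_required_prerequisites(platform="darwin"):
        return [
            prerequisite() for prerequisite in ALL_PREREQUISITE_CLASSES  # assumed registry
            if platform in prerequisite.supported_platforms              # assumed class attribute
        ]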

* Use `platform.uname` instead of `os.uname` (#2857)

Advantages:

- Works cross platform, not just Unix.
- Is a namedtuple, ... (continued)
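
Example of the cross-platform replacement (standard library only):

    import platform

    uname = platform.uname()  # works on Linux, macOS and Windows
    print(uname.system, uname.release, uname.machine)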

944 of 2241 branches covered (42.12%)

Branch coverage included in aggregate %.

174 of 272 new or added lines in 32 files covered. (63.97%)

9 existing lines in 5 files now uncovered.

4725 of 7352 relevant lines covered (64.27%)

2.56 hits per line
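
(The aggregate 59.095% pools covered lines and branches: (4725 + 944) / (7352 + 2241) = 5669 / 9593 ≈ 59.095%.)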

Source File: /pythonforandroid/pythonpackage.py (81.22% covered)
""" This module offers highlevel functions to get package metadata
    like the METADATA file, the name, or a list of dependencies.

    Usage examples:

       # Getting package name from pip reference:
       from pythonforandroid.pythonpackage import get_package_name
       print(get_package_name("pillow"))
       # Outputs: "Pillow" (note the spelling!)

       # Getting package dependencies:
       from pythonforandroid.pythonpackage import get_package_dependencies
       print(get_package_dependencies("pep517"))
       # Outputs: "['pytoml']"

       # Get package name from arbitrary package source:
       from pythonforandroid.pythonpackage import get_package_name
       print(get_package_name("/some/local/project/folder/"))
       # Outputs package name

    NOTE:

    Yes, this module doesn't fit well into python-for-android, but this
    functionality isn't available ANYWHERE ELSE, and upstream (pip, ...)
    currently has no interest in taking this over, so it has no other place
    to go.
    (Unless someone reading this puts it into yet another packaging lib)

    Reference discussion/upstream inclusion attempt:

    https://github.com/pypa/packaging-problems/issues/247

"""


import functools
from io import open  # needed for python 2
import os
import shutil
import subprocess
import sys
import tarfile
import tempfile
import time
from urllib.parse import unquote as urlunquote
from urllib.parse import urlparse
import zipfile

import toml
import build.util

from pythonforandroid.util import rmdir, ensure_dir


def transform_dep_for_pip(dependency):
    if dependency.find("@") > 0 and (
            dependency.find("@") < dependency.find("://") or
            "://" not in dependency
            ):
        # WORKAROUND FOR UPSTREAM BUG:
        # https://github.com/pypa/pip/issues/6097
        # (Please REMOVE workaround once that is fixed & released upstream!)
        #
        # Basically, setup_requires() can contain a format pip won't install
        # from a requirements.txt (PEP 508 URLs).
        # To avoid this, translate to an #egg= reference:
        if dependency.endswith("#"):
            dependency = dependency[:-1]
        url = (dependency.partition("@")[2].strip().partition("#egg")[0] +
               "#egg=" +
               dependency.partition("@")[0].strip()
              )
        return url
    return dependency


def extract_metainfo_files_from_package(
        package,
        output_folder,
        debug=False
        ):
    """ Extracts metadata files from the given package to the given folder,
        which may be referenced in any way that is permitted in
        a requirements.txt file or install_requires=[] listing.

        Currently supported metadata files that will be extracted:

        - pytoml.yml  (only if package wasn't obtained as wheel)
        - METADATA
    """

    if package is None:
        raise ValueError("package cannot be None")

    if not os.path.exists(output_folder) or os.path.isfile(output_folder):
        raise ValueError("output folder needs to be existing folder")

    if debug:
        print("extract_metainfo_files_from_package: extracting for " +
              "package: " + str(package))

    # A temp folder for making a package copy in case it's a local folder,
    # because extracting metadata might modify files
    # (creating sdists/wheels...)
    temp_folder = tempfile.mkdtemp(prefix="pythonpackage-package-copy-")
    try:
        # Package is indeed a folder! Get a temp copy to work on:
        if is_filesystem_path(package):
            shutil.copytree(
                parse_as_folder_reference(package),
                os.path.join(temp_folder, "package"),
                ignore=shutil.ignore_patterns(".tox")
            )
            package = os.path.join(temp_folder, "package")

        _extract_metainfo_files_from_package_unsafe(package, output_folder)
    finally:
        rmdir(temp_folder)


def _get_system_python_executable():
    """ Returns the path to the system-wide python binary.
        (In case we're running in a virtualenv or venv)
    """
    # This function is required by get_package_as_folder() to work
    # inside a virtualenv, since venv creation will fail with
    # the virtualenv's local python binary.
    # (venv/virtualenv incompatibility)

    # Abort if not in virtualenv or venv:
    if not hasattr(sys, "real_prefix") and (
            not hasattr(sys, "base_prefix") or
            os.path.normpath(sys.base_prefix) ==
            os.path.normpath(sys.prefix)):
        return sys.executable

    # Extract prefix we need to look in:
    if hasattr(sys, "real_prefix"):
        search_prefix = sys.real_prefix  # virtualenv
    else:
        search_prefix = sys.base_prefix  # venv

    def python_binary_from_folder(path):
        def binary_is_usable(python_bin):
            """ Helper function to see if a given binary name refers
                to a usable python interpreter binary
            """

            # Abort if path isn't present at all or a directory:
            if not os.path.exists(
                os.path.join(path, python_bin)
            ) or os.path.isdir(os.path.join(path, python_bin)):
                return
            # We should check file not found anyway trying to run it,
            # since it might be a dead symlink:
            try:
                filenotfounderror = FileNotFoundError
            except NameError:  # Python 2
                filenotfounderror = OSError
            try:
                # Run it and see if version output works with no error:
                subprocess.check_output([
                    os.path.join(path, python_bin), "--version"
                ], stderr=subprocess.STDOUT)
                return True
            except (subprocess.CalledProcessError, filenotfounderror):
                return False

        python_name = "python" + sys.version
        while (not binary_is_usable(python_name) and
               python_name.find(".") > 0):
            # Try less specific binary name:
            python_name = python_name.rpartition(".")[0]
        if binary_is_usable(python_name):
            return os.path.join(path, python_name)
        return None

    # Return from sys.real_prefix if present:
    result = python_binary_from_folder(search_prefix)
    if result is not None:
        return result

    # Check out all paths in $PATH:
    bad_candidates = []
    good_candidates = []
    ever_had_nonvenv_path = False
    ever_had_path_starting_with_prefix = False
    for p in os.environ.get("PATH", "").split(":"):
        # Skip if not possibly the real system python:
        if not os.path.normpath(p).startswith(
                os.path.normpath(search_prefix)
                ):
            continue

        ever_had_path_starting_with_prefix = True

        # First folders might be virtualenv/venv we want to avoid:
        if not ever_had_nonvenv_path:
            sep = os.path.sep
            if (
                ("system32" not in p.lower() and
                 "usr" not in p and
                 not p.startswith("/opt/python")) or
                {"home", ".tox"}.intersection(set(p.split(sep))) or
                "users" in p.lower()
            ):
                # Doesn't look like bog-standard system path.
                if (p.endswith(os.path.sep + "bin") or
                        p.endswith(os.path.sep + "bin" + os.path.sep)):
                    # Also ends in "bin" -> likely virtualenv/venv.
                    # Add as unfavorable / end of candidates:
                    bad_candidates.append(p)
                    continue
            ever_had_nonvenv_path = True

        good_candidates.append(p)

    # If we have a bad env with PATH not containing any reference to our
    # real python (travis, why would you do that to me?) then just guess
    # based from the search prefix location itself:
    if not ever_had_path_starting_with_prefix:
        # ... and yes we're scanning all the folders for that, it's dumb
        # but i'm not aware of a better way: (@JonasT)
        for root, dirs, files in os.walk(search_prefix, topdown=True):
            for name in dirs:
                bad_candidates.append(os.path.join(root, name))

    # Sort candidates by length (to prefer shorter ones):
    def candidate_cmp(a, b):
        return len(a) - len(b)
    good_candidates = sorted(
        good_candidates, key=functools.cmp_to_key(candidate_cmp)
    )
    bad_candidates = sorted(
        bad_candidates, key=functools.cmp_to_key(candidate_cmp)
    )

    # See if we can now actually find the system python:
    for p in good_candidates + bad_candidates:
        result = python_binary_from_folder(p)
        if result is not None:
            return result

    raise RuntimeError(
        "failed to locate system python in: {}"
        " - checked candidates were: {}, {}"
        .format(sys.real_prefix, good_candidates, bad_candidates)
    )


def get_package_as_folder(dependency):
    """ This function downloads the given package / dependency and extracts
        the raw contents into a folder.

        Afterwards, it returns a tuple with the type of distribution obtained,
        and the temporary folder it extracted to. It is the caller's
        responsibility to delete the returned temp folder after use.

        Examples of returned values:

        ("source", "/tmp/pythonpackage-venv-e84toiwjw")
        ("wheel", "/tmp/pythonpackage-venv-85u78uj")

        What the distribution type will be depends on what pip decides to
        download.
    """

    venv_parent = tempfile.mkdtemp(
        prefix="pythonpackage-venv-"
    )
    try:
        # Create a venv to install into:
        try:
            if int(sys.version.partition(".")[0]) < 3:
                # Python 2.x has no venv.
                subprocess.check_output([
                    sys.executable,  # no venv conflict possible,
                                     # -> no need to use system python
                    "-m", "virtualenv",
                    "--python=" + _get_system_python_executable(),
                    os.path.join(venv_parent, 'venv')
                ], cwd=venv_parent)
            else:
                # On modern Python 3, use venv.
                subprocess.check_output([
                    _get_system_python_executable(), "-m", "venv",
                    os.path.join(venv_parent, 'venv')
                ], cwd=venv_parent)
        except subprocess.CalledProcessError as e:
            output = e.output.decode('utf-8', 'replace')
            raise ValueError(
                'venv creation unexpectedly ' +
                'failed. error output: ' + str(output)
            )
        venv_path = os.path.join(venv_parent, "venv")

        # Update pip and wheel in venv for latest feature support:
        try:
            filenotfounderror = FileNotFoundError
        except NameError:  # Python 2.
            filenotfounderror = OSError
        try:
            subprocess.check_output([
                os.path.join(venv_path, "bin", "pip"),
                "install", "-U", "pip", "wheel",
            ])
        except filenotfounderror:
            raise RuntimeError(
                "venv appears to be missing pip. "
                "did we fail to use a proper system python??\n"
                "system python path detected: {}\n"
                "os.environ['PATH']: {}".format(
                    _get_system_python_executable(),
                    os.environ.get("PATH", "")
                )
            )

        # Create download subfolder:
        ensure_dir(os.path.join(venv_path, "download"))

        # Write a requirements.txt with our package and download:
        with open(os.path.join(venv_path, "requirements.txt"),
                  "w", encoding="utf-8"
                 ) as f:
            def to_unicode(s):  # Needed for Python 2.
                try:
                    return s.decode("utf-8")
                except AttributeError:
                    return s
            f.write(to_unicode(transform_dep_for_pip(dependency)))
        try:
            subprocess.check_output(
                [
                    os.path.join(venv_path, "bin", "pip"),
                    "download", "--no-deps", "-r", "../requirements.txt",
                    "-d", os.path.join(venv_path, "download")
                ],
                stderr=subprocess.STDOUT,
                cwd=os.path.join(venv_path, "download")
            )
        except subprocess.CalledProcessError as e:
            raise RuntimeError("package download failed: " + str(e.output))

        if len(os.listdir(os.path.join(venv_path, "download"))) == 0:
            # No download. This can happen if the dependency has a condition
            # which prohibits install in our environment.
            # (the "package ; ... conditional ... " type of condition)
            return (None, None)

        # Get the result and make sure it's an extracted directory:
        result_folder_or_file = os.path.join(
            venv_path, "download",
            os.listdir(os.path.join(venv_path, "download"))[0]
        )
        dl_type = "source"
        if not os.path.isdir(result_folder_or_file):
            # Must be an archive.
            if result_folder_or_file.endswith((".zip", ".whl")):
                if result_folder_or_file.endswith(".whl"):
                    dl_type = "wheel"
                with zipfile.ZipFile(result_folder_or_file) as f:
                    f.extractall(os.path.join(venv_path,
                                              "download", "extracted"
                                             ))
                    result_folder_or_file = os.path.join(
                        venv_path, "download", "extracted"
                    )
            elif result_folder_or_file.find(".tar.") > 0:
                # Probably a tarball.
                with tarfile.open(result_folder_or_file) as f:
                    f.extractall(os.path.join(venv_path,
                                              "download", "extracted"
                                             ))
                    result_folder_or_file = os.path.join(
                        venv_path, "download", "extracted"
                    )
            else:
                raise RuntimeError(
                    "unknown archive or download " +
                    "type: " + str(result_folder_or_file)
                )

        # If the result is hidden away in an additional subfolder,
        # descend into it:
        while os.path.isdir(result_folder_or_file) and \
                len(os.listdir(result_folder_or_file)) == 1 and \
                os.path.isdir(os.path.join(
                    result_folder_or_file,
                    os.listdir(result_folder_or_file)[0]
                )):
            result_folder_or_file = os.path.join(
                result_folder_or_file,
                os.listdir(result_folder_or_file)[0]
            )

        # Copy result to new dedicated folder so we can throw away
        # our entire virtualenv nonsense after returning:
        result_path = tempfile.mkdtemp()
        rmdir(result_path)
        shutil.copytree(result_folder_or_file, result_path)
        return (dl_type, result_path)
    finally:
        rmdir(venv_parent)


def _extract_metainfo_files_from_package_unsafe(
        package,
        output_path
        ):
    # This is the unwrapped function that will
    # 1. make lots of stdout/stderr noise
    # 2. possibly modify files (if the package source is a local folder)
    # Use extract_metainfo_files_from_package instead, which avoids
    # these issues.

    clean_up_path = False
    path_type = "source"
    path = parse_as_folder_reference(package)
    if path is None:
        # This is not a path. Download it:
        (path_type, path) = get_package_as_folder(package)
        if path_type is None:
            # Download failed.
            raise ValueError(
                "cannot get info for this package, " +
                "pip says it has no downloads (conditional dependency?)"
            )
        clean_up_path = True

    try:
        metadata_path = None

        if path_type != "wheel":
            # Use a build helper function to fetch the metadata directly
            metadata = build.util.project_wheel_metadata(path)
            # And write it to a file
            metadata_path = os.path.join(output_path, "built_metadata")
            with open(metadata_path, 'w') as f:
                for key in metadata.keys():
                    for value in metadata.get_all(key):
                        f.write("{}: {}\n".format(key, value))
        else:
            # This is a wheel, so metadata should be in *.dist-info folder:
            metadata_path = os.path.join(
                path,
                [f for f in os.listdir(path) if f.endswith(".dist-info")][0],
                "METADATA"
            )

        # Store type of metadata source. Can be "wheel", "source" for source
        # distribution, and others get_package_as_folder() may support
        # in the future.
        with open(os.path.join(output_path, "metadata_source"), "w") as f:
            try:
                f.write(path_type)
            except TypeError:  # in python 2 path_type may be str/bytes:
                f.write(path_type.decode("utf-8", "replace"))

        # Copy the metadata file:
        shutil.copyfile(metadata_path, os.path.join(output_path, "METADATA"))
    finally:
        if clean_up_path:
            rmdir(path)


def is_filesystem_path(dep):
    """ Convenience function around parse_as_folder_reference() to
        check if a dependency refers to a folder path or something remote.

        Returns True if local, False if remote.
    """
    return (parse_as_folder_reference(dep) is not None)


def parse_as_folder_reference(dep):
    """ See if a dependency reference refers to a folder path.
        If it does, return the folder path (which parses and
        resolves file:// urls in the process).
        If it doesn't, return None.
    """
    # Special case: pep508 urls
    if dep.find("@") > 0 and (
            (dep.find("@") < dep.find("/") or "/" not in dep) and
            (dep.find("@") < dep.find(":") or ":" not in dep)
            ):
        # This should be a 'pkgname @ https://...' style path, or
        # 'pkgname @ /local/file/path'.
        return parse_as_folder_reference(dep.partition("@")[2].lstrip())

    # Check if this is either not an url, or a file URL:
    if dep.startswith(("/", "file://")) or (
            dep.find("/") > 0 and
            dep.find("://") < 0) or (dep in ["", "."]):
        if dep.startswith("file://"):
            dep = urlunquote(urlparse(dep).path)
        return dep
    return None


def _extract_info_from_package(dependency,
                               extract_type=None,
                               debug=False,
                               include_build_requirements=False
                               ):
    """ Internal function to extract metainfo from a package.
        Currently supported info types:

        - name
        - dependencies  (a list of dependencies)
    """
    if debug:
        print("_extract_info_from_package called with "
              "extract_type={} include_build_requirements={}".format(
                  extract_type, include_build_requirements,
              ))
    output_folder = tempfile.mkdtemp(prefix="pythonpackage-metafolder-")
    try:
        extract_metainfo_files_from_package(
            dependency, output_folder, debug=debug
        )

        # Extract the type of data source we used to get the metadata:
        with open(os.path.join(output_folder,
                               "metadata_source"), "r") as f:
            metadata_source_type = f.read().strip()

        # Extract main METADATA file:
        with open(os.path.join(output_folder, "METADATA"),
                  "r", encoding="utf-8"
                 ) as f:
            # Get metadata and cut away description (is after 2 linebreaks)
            metadata_entries = f.read().partition("\n\n")[0].splitlines()

        if extract_type == "name":
            name = None
            for meta_entry in metadata_entries:
                if meta_entry.lower().startswith("name:"):
                    return meta_entry.partition(":")[2].strip()
            if name is None:
                raise ValueError("failed to obtain package name")
            return name
        elif extract_type == "dependencies":
            # First, make sure we don't attempt to return build requirements
            # for wheels since they usually come without pyproject.toml
            # and we haven't implemented another way to get them:
            if include_build_requirements and \
                    metadata_source_type == "wheel":
                if debug:
                    print("_extract_info_from_package: was called "
                          "with include_build_requirements=True on "
                          "package obtained as wheel, raising error...")
                raise NotImplementedError(
                    "fetching build requirements for "
                    "wheels is not implemented"
                )

            # Get build requirements from pyproject.toml if requested:
            requirements = []
            if os.path.exists(os.path.join(output_folder,
                                           'pyproject.toml')
                              ) and include_build_requirements:
                # Read build system from pyproject.toml file: (PEP518)
                with open(os.path.join(output_folder, 'pyproject.toml')) as f:
                    build_sys = toml.load(f)['build-system']
                    if "requires" in build_sys:
                        requirements += build_sys["requires"]
            elif include_build_requirements:
                # For legacy packages with no pyproject.toml, we have to
                # add setuptools as default build system.
                requirements.append("setuptools")

            # Add requirements from metadata:
            requirements += [
                entry.rpartition("Requires-Dist:")[2].strip()
                for entry in metadata_entries
                if entry.startswith("Requires-Dist")
            ]

            return list(set(requirements))  # remove duplicates
    finally:
        rmdir(output_folder)


package_name_cache = dict()


def get_package_name(dependency,
                     use_cache=True):
    def timestamp():
        try:
            return time.monotonic()
        except AttributeError:
            return time.time()  # Python 2.
    try:
        value = package_name_cache[dependency]
        if value[0] + 600.0 > timestamp() and use_cache:
            return value[1]
    except KeyError:
        pass
    result = _extract_info_from_package(dependency, extract_type="name")
    package_name_cache[dependency] = (timestamp(), result)
    return result


def get_package_dependencies(package,
                             recursive=False,
                             verbose=False,
                             include_build_requirements=False):
    """ Obtain the dependencies from a package. Please note this
        function is possibly SLOW, especially if you enable
        the recursive mode.
    """
    packages_processed = set()
    package_queue = [package]
    reqs = set()
    reqs_as_names = set()
    while len(package_queue) > 0:
        current_queue = package_queue
        package_queue = []
        for package_dep in current_queue:
            new_reqs = set()
            if verbose:
                print("get_package_dependencies: resolving dependency "
                      f"to package name: {package_dep}")
            package = get_package_name(package_dep)
            if package.lower() in packages_processed:
                continue
            if verbose:
                print("get_package_dependencies: "
                      "processing package: {}".format(package))
                print("get_package_dependencies: "
                      "Packages seen so far: {}".format(
                          packages_processed
                      ))
            packages_processed.add(package.lower())

            # Use our regular folder processing to examine:
            new_reqs = new_reqs.union(_extract_info_from_package(
                package_dep, extract_type="dependencies",
                debug=verbose,
                include_build_requirements=include_build_requirements,
            ))

            # Process new requirements:
            if verbose:
                print('get_package_dependencies: collected '
                      "deps of '{}': {}".format(
                          package_dep, str(new_reqs),
                      ))
            for new_req in new_reqs:
                try:
                    req_name = get_package_name(new_req)
                except ValueError as e:
                    if new_req.find(";") >= 0:
                        # Conditional dep where condition isn't met?
                        # --> ignore it
                        continue
                    if verbose:
                        print("get_package_dependencies: " +
                              "unexpected failure to get name " +
                              "of '" + str(new_req) + "': " +
                              str(e))
                    raise RuntimeError(
                        "failed to get " +
                        "name of dependency: " + str(e)
                    )
                if req_name.lower() in reqs_as_names:
                    continue
                if req_name.lower() not in packages_processed:
                    package_queue.append(new_req)
                reqs.add(new_req)
                reqs_as_names.add(req_name.lower())

            # Bail out here if we're not scanning recursively:
            if not recursive:
                package_queue[:] = []  # wipe queue
                break
    if verbose:
        print("get_package_dependencies: returning result: {}".format(reqs))
    return reqs


def get_dep_names_of_package(
        package,
        keep_version_pins=False,
        recursive=False,
        verbose=False,
        include_build_requirements=False
        ):
    """ Gets the dependencies from the package in the given folder,
        then attempts to deduce the actual package name resulting
        from each dependency line, stripping away everything else.
    """

    # First, obtain the dependencies:
    dependencies = get_package_dependencies(
        package, recursive=recursive, verbose=verbose,
        include_build_requirements=include_build_requirements,
    )
    if verbose:
        print("get_dep_names_of_package_folder: " +
              "processing dependency list to names: " +
              str(dependencies))

    # Transform dependencies to their stripped down names:
    # (they can still have version pins/restrictions, conditionals, ...)
    dependency_names = set()
    for dep in dependencies:
        # If we are supposed to keep exact version pins, extract first:
        pin_to_append = ""
        if keep_version_pins and "(==" in dep and dep.endswith(")"):
            # This is a dependency of the format: 'pkg (==1.0)'
            pin_to_append = "==" + dep.rpartition("==")[2][:-1]
        elif keep_version_pins and "==" in dep and not dep.endswith(")"):
            # This is a dependency of the format: 'pkg==1.0'
            pin_to_append = "==" + dep.rpartition("==")[2]
        # Now get true (and e.g. case-corrected) dependency name:
        dep_name = get_package_name(dep) + pin_to_append
        dependency_names.add(dep_name)
    return dependency_names
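
For quick reference, a minimal usage sketch of the helpers above (package names are only examples):

    from pythonforandroid.pythonpackage import (
        get_package_name,
        get_package_dependencies,
        get_dep_names_of_package,
    )

    # Case-corrected package name from a pip reference:
    print(get_package_name("pillow"))  # e.g. "Pillow"

    # Direct (non-recursive) dependencies as raw requirement strings:
    print(get_package_dependencies("pep517"))

    # The same, reduced to plain dependency names:
    print(get_dep_names_of_package("pep517"))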