
FAIRmat-NFDI / pynxtools-apm / build 23415808064 (coverage: 17%)

Build
Default branch: main
Ran: 23 Mar 2026 12:09AM UTC
Jobs: 1
Files: 35
Run time: 1min
Build 23415808064 (push via github, committed by web-flow)
23 Mar 2026 12:07AM UTC coverage: 16.768% (-32.0% from 48.75%)
Apm prepare nomad v142 (#93)

* Code for generic batch processing of collections of legacy atom probe data, in the various formats used worldwide, into NeXus/HDF5 files

* Utilities for dealing with datasets from multiple machines and from countries around the world

* Utility tools to inspect the content of single files and of zip, tar, rar, and 7z archives

* linting

* Hashing queue operational and running; it is required to generate unique object names, which is especially relevant for ranging-definition files

* Batch queues used for hashing content

* clean-up

* Workflow utilities

* Added a draft of the extraction code and extraction pipeline

* Further work on the extraction given new examples

* Identify workflows for new datasets

* linting

* Examples are clean now; remaining work is removing duplicates, after which we can decompress

* duplicates removed

* New formats

* Updates on batch processing preparation code

* Store the state of the scripts that were used for unpacking the original files from the scientific community. A total of 340 GB in 1240 files was prepared for batch processing; these include typical open-source formats such as pos, epos, rrng, rng, and apt, but also CamecaROOT-based formats such as rhit and hits, and community formats

* Add batch-processing script; reduce redundant code and rename variables to better explain what the scripts do

* work on linting

* work on linting

* Attempt to fix file paths

* linting

* Refactor version resolution and add it to the package

* Clean-up; parser scripts seem to work. Next steps: i) modify chunking in the actual reader, ii) hook in the newly developed parsers from the ifes library, iii) run through

* Suggestion for a different chunking strategy

* source code documentation

* Modify the reader functions to use the ifes library, with chunking, and raise the default compression strength to 9; note that this will make writing slower but ought to result in smaller files

* Moving out other default... (continued)
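The hashing step mentioned in the bullets above (unique object names, especially relevant for ranging-definition files) could look roughly like the following sketch. The helper name `content_hash` and the choice of SHA-256 are assumptions for illustration, not the project's actual implementation.

```python
import hashlib


def content_hash(path: str, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 hex digest of a file's content, read in chunks.

    Hypothetical helper: hashing the content (not the filename) yields
    the same unique object name for identical ranging-definition files
    even when they were uploaded under different names.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        # Read in fixed-size chunks so large files do not need to fit in memory.
        for block in iter(lambda: handle.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()
```

Content-based names also make deduplication straightforward: two uploads with the same digest are the same file.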

81 of 532 new or added lines in 22 files covered. (15.23%)

357 of 2129 relevant lines covered (16.77%)

0.17 hits per line

New Missed Lines in Diff

Lines  Coverage ∆  File
1      76.92       src/pynxtools_apm/concepts/nxs_concepts.py
1      15.09       src/pynxtools_apm/examples/usa_madison_cameca_eln.py
2      66.67       src/pynxtools_apm/__init__.py
2      13.76       src/pynxtools_apm/parsers/oasis_eln.py
2      86.67       tests/nomad/test_nomad_example_uploads.py
2      77.78       tests/test_utils_versioning.py
3      13.33       src/pynxtools_apm/utils/io_case_logic.py
3      45.45       src/pynxtools_apm/utils/pint_custom_unit_registry.py
4      19.12       src/pynxtools_apm/parsers/oasis_config.py
4      13.33       src/pynxtools_apm/utils/string_conversions.py
6      23.81       src/pynxtools_apm/utils/hfive_concepts.py
7      29.33       src/pynxtools_apm/reader.py
16     8.4         src/pynxtools_apm/utils/create_nx_default_plots.py
38     17.01       src/pynxtools_apm/parsers/ifes_ranging.py
360    6.93        src/pynxtools_apm/parsers/ifes_reconstruction.py
Jobs

ID  Job ID         Ran                      Files  Coverage
1   23415808064.1  23 Mar 2026 12:09AM UTC  35     16.77
GitHub Action Run
Source Files on build 23415808064
  • 1a73fd8a on github
  • Prev Build on main (#21910403400)
  • Next Build on main (#23491032720)
© 2026 Coveralls, Inc