
globus-labs / mof-generation-at-scale / job 8522963786.1

Build:
Default branch: main (48% coverage)
Ran: 02 Apr 2024 12:51PM UTC
Files: 80
Run time: 2s

02 Apr 2024 12:42PM UTC · coverage: 42.034% (+0.2%) from 41.787%

Job 8522963786.1 · push · github · web-flow
Add retraining DiffLinker to the workflow (#81)

* Move difflinker training item generation to model

* First pass at using examples in training

1. Create a JSON.gz file before calling the train script
2. Set the dataset override flag
3. Fix a few iterations
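The first step above (writing a JSON.gz file for the train script to consume) can be sketched with the standard library. The record schema, file name, and line-per-record layout here are illustrative assumptions, not the workflow's actual format.

```python
import gzip
import json
import os
import tempfile

# Hypothetical stand-ins for MOF linker training examples; the real
# workflow's record schema is not shown on this page.
examples = [
    {"id": f"mof-{i}", "smiles": "C1=CC=CC=C1"}  # placeholder entries
    for i in range(4)
]

# Write the examples as gzip-compressed JSON, one object per line,
# so the consumer can stream them without loading the whole file.
path = os.path.join(tempfile.gettempdir(), "train-examples.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as fp:
    for record in examples:
        fp.write(json.dumps(record) + "\n")

# Read the file back the way a train script might.
with gzip.open(path, "rt", encoding="utf-8") as fp:
    loaded = [json.loads(line) for line in fp]
```

Opening with mode `"wt"`/`"rt"` lets `gzip.open` handle text encoding on top of compression, so the rest of the code sees ordinary strings.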

* Use a model from checkpoint

* Start a component test

* Update to use Xiaoli's latest model

* First attempt at XPU support

* Implement abstract methods

* Optimize the optimizer

* Add a hook for moving optimizer and model to XPU

* Use the GPU, but only one worker

Ensures that we don't get two PyTorch tasks on the GPU at the same time
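The single-worker constraint described above can be sketched with a one-worker executor: queued tasks still all run, but never concurrently. The `gpu_task` function and the concurrency counters are illustrative stand-ins, not the repository's actual task-submission code.

```python
from concurrent.futures import ThreadPoolExecutor

busy = 0      # how many tasks are running right now
max_busy = 0  # peak concurrency observed across the run

def gpu_task(x):
    """Pretend GPU work; records how many tasks overlap."""
    global busy, max_busy
    busy += 1
    max_busy = max(max_busy, busy)
    result = x * x
    busy -= 1
    return result

# A single-worker executor serializes execution, so at most one
# "GPU" task runs at a time even though several are queued at once.
with ThreadPoolExecutor(max_workers=1) as pool:
    results = list(pool.map(gpu_task, range(8)))
```

With `max_workers=1` every submitted task runs on the same worker thread in submission order, which is the property the commit relies on: no two PyTorch tasks share the GPU simultaneously.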

* Add retraining to the workflow

* Add test file with 1024 example MOFs

* Reduce training set size
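The last two items (a test file with 1024 example MOFs, then a reduced training set) suggest subsampling a fixed-size subset. A minimal sketch; the subset size, seed, and identifier names are hypothetical, not values from the workflow.

```python
import random

# 1024 placeholder MOF identifiers standing in for the test file's examples.
all_mofs = [f"mof-{i:04d}" for i in range(1024)]

# Reduce the training set by drawing a fixed-size random subset.
# random.sample draws without replacement, so entries are unique.
rng = random.Random(42)  # seeded for reproducibility
train_subset = rng.sample(all_mofs, k=256)
```

Sampling without replacement keeps every retained example distinct, so shrinking the set never duplicates training data.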

4229 of 10061 relevant lines covered (42.03%)

0.42 hits per line

Source Files on job 8522963786.1: 9 files changed (coverage changed in 9, source changed in 0).