
googledatalab / pydatalab / build 2872 / job 4

Coverage: 78% (master: 78%)
Default branch: master
Ran: 21 Nov 2017 11:15PM UTC
Files: 101
Run time: 4s


21 Nov 2017 11:01PM UTC · coverage: 77.773% (remained the same) · TOX_ENV=coveralls
Trigger: push · CI: travis-ci · Committed via: web-flow
Delete composer and airflow scripts (Part 1 of move to googledatalab/notebooks) (#617)

* VM create script and VM startup script for airflow

* Removing relative paths and references to /usr/bin from vm create script.

* Use default service account

* Make DATALAB_TAR

* Rename temp variables, add sleeps

* Hard-code bucket and dag-path. Cleanup and parameterize script.

* Default values for project-id and vm-name to make testing easier.

* Adding comments to the scripts

* Fixing bug in script, some cleanup

* Explicitly setting permissions to everybody for the airflow folder

* Introducing macro for gsutil

* Removing extra / in startup script

* Simplifying release scripts

* Needed to use datalab-pipeline as bucket name in startup script

* Incrementing VM number

* Add crazy comments

* Explicitly setting the airflow path to /usr/local/bin

* Forgot '.' for invoking airflow executable

* Removing '.' - shouldn't use this for executables

* Removing leading path for airflow

* Trying some stuff from https://lemag.sfeir.com/installing-and-using-apache-airflow-on-the-google-cloud-platform/

* Incrementing message number

* id

* Increasing instance number

* Adding comments, and a bigger sleep.

* Adding 'airflow scheduler' and 'airflow worker' commands

* Upping instance number and trying airflow scheduler and airflow worker commands again since they didn't work the first time.

* Running airflow scheduler in the background, and incrementing the instance count.

* Make 'airflow worker' a non-blocking call with an & at the end of the command

* worker & and scheduler & resulted in airflow not even pip installing. Retrying.

* Looks like datalab is not getting installed on the worker. Just trying to remove the call to 'airflow worker &' and seeing if there is a default worker that will work.

* Upping VM instance count

* Swapping the order of airflow and datalab to see if it fixes the e... (continued)
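Several commits in the log above iterate on keeping `airflow scheduler` and `airflow worker` from blocking the VM startup script by appending `&`. A minimal, self-contained sketch of that backgrounding pattern (with `sleep` standing in for the long-running `airflow` commands, since this snippet assumes nothing about an airflow install; paths and variable names here are illustrative, not taken from the actual script):

```shell
#!/bin/sh
# Stand-ins for long-running daemons; in the real startup script these
# would be commands like '/usr/local/bin/airflow scheduler'.
sleep 1 &            # e.g. airflow scheduler &
SCHED_PID=$!
sleep 1 &            # e.g. airflow worker &
WORKER_PID=$!

# Because both commands were backgrounded with '&', control returns
# here immediately and the rest of the startup script can proceed
# instead of blocking forever on the first daemon.
echo "startup script continues"

wait "$SCHED_PID" "$WORKER_PID"   # optionally reap the daemons on exit
```

The trade-off the commit log wrestles with: without the trailing `&` the first daemon occupies the foreground and later steps never run, while backgrounding changes the ordering of the remaining setup steps, which can surface its own failures.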

5378 of 6915 relevant lines covered (77.77%)

0.78 hits per line
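The coverage figures reported on this page are mutually consistent; a quick arithmetic check using only the numbers shown above:

```python
# 5378 covered lines out of 6915 relevant lines.
covered, relevant = 5378, 6915

# Matches the job-level figure (77.773%) and the summary figure (77.77%).
print(round(covered / relevant * 100, 3))  # 77.773
print(round(covered / relevant * 100, 2))  # 77.77
```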

Source Files on job 2872.4 (TOX_ENV=coveralls)
  • Back to Build 2872
  • Travis Job 2872.4
  • 16fd2fc7 on github
  • Prev Job for TOX_ENV=coveralls on master (#2785.4)
  • Next Job for TOX_ENV=coveralls on master (#2893.4)