Commit a475118: Migrate to GitHub Actions (#199)

1 parent: 2b9db92

File tree: 3 files changed (+50, -54 lines)


.github/workflows/ci-build.yaml

Lines changed: 38 additions & 0 deletions (new file)

```yaml
name: CI
on: [push, pull_request]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source
        uses: actions/checkout@v2

      - name: Setup Conda Environment
        uses: goanpeca/setup-miniconda@v1
        with:
          miniconda-version: "latest"
          python-version: "3.7"
          environment-file: binder/environment.yml
          activate-environment: dask-tutorial
          auto-activate-base: false

      - name: Install testing and docs dependencies
        shell: bash -l {0}
        run: |
          conda install -c conda-forge nbconvert nbformat jupyter_client ipykernel
          pip install nbsphinx dask-sphinx-theme sphinx

      - name: Build
        shell: bash -l {0}
        run: |
          python prep.py --small
          sphinx-build -M html . _build -v

      - name: Deploy
        if: ${{ github.ref == 'refs/heads/master' && github.event_name != 'pull_request' }}
        uses: JamesIves/[email protected]
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BRANCH: gh-pages
          FOLDER: _build/html
          CLEAN: true
```
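The `if:` guard on the Deploy step encodes one rule: publish to `gh-pages` only for direct pushes to `master`, never for pull requests. A plain-Python sketch of that gate (the function name `should_deploy` is illustrative, not part of the workflow):

```python
def should_deploy(ref: str, event_name: str) -> bool:
    """Mirror the Deploy step's `if:` expression: deploy only when
    the triggering event is a push to the master branch."""
    return ref == "refs/heads/master" and event_name != "pull_request"

# A push to master deploys; a PR targeting master builds but does not deploy.
print(should_deploy("refs/heads/master", "push"))          # True
print(should_deploy("refs/heads/master", "pull_request"))  # False
print(should_deploy("refs/heads/feature", "push"))         # False
```

Because `on: [push, pull_request]` triggers the whole job for both event types, this guard is what keeps PR builds from overwriting the published site.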

.travis.yml

Lines changed: 0 additions & 43 deletions
This file was deleted.

README.md

Lines changed: 12 additions & 11 deletions
```diff
@@ -4,6 +4,7 @@ This tutorial was last given at SciPy 2020 which was a virtual conference.
 [A video of the SciPy 2020 tutorial is available online](https://www.youtube.com/watch?v=EybGGLbLipI).
 
 [![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/dask/dask-tutorial/master?urlpath=lab)
+[![Build Status](https://github.com/dask/dask-tutorial/workflows/CI/badge.svg)](https://github.com/dask/dask-tutorial/actions?query=workflow%3ACI)
 
 Dask provides multi-core execution on larger-than-memory datasets.
```

```diff
@@ -35,13 +36,13 @@ schedulers (odd sections.)
 
 and then install necessary packages.
 There are three different ways to achieve this, pick the one that best suits you, and ***only pick one option***.
-They are, in order of preference:
+They are, in order of preference:
 
 #### 2a) Create a conda environment (preferred)
 
 In the main repo directory
 
-    conda env create -f binder/environment.yml
+    conda env create -f binder/environment.yml
     conda activate dask-tutorial
     jupyter labextension install @jupyter-widgets/jupyterlab-manager
     jupyter labextension install @bokeh/jupyter_bokeh
```
```diff
@@ -55,10 +56,10 @@ You will need the following core libraries
 You may find the following libraries helpful for some exercises
 
     conda install python-graphviz -c conda-forge
-
-Note that this options will alter your existing environment, potentially changing the versions of packages you already
-have installed.
-
+
+Note that this options will alter your existing environment, potentially changing the versions of packages you already
+have installed.
+
 #### 2c) Use Dockerfile
 
 You can build a docker image out of the provided Dockerfile.
```
```diff
@@ -69,7 +70,7 @@ Run a container, replacing the ID with the output of the previous command
 
     $ docker run -it -p 8888:8888 -p 8787:8787 <container_id_or_tag>
 
-The above command will give an URL (`Like http://(container_id or 127.0.0.1):8888/?token=<sometoken>`) which
+The above command will give an URL (`Like http://(container_id or 127.0.0.1):8888/?token=<sometoken>`) which
 can be used to access the notebook from browser. You may need to replace the given hostname with "localhost" or
 "127.0.0.1".
```
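The hostname substitution the README describes (container ID to `localhost` or `127.0.0.1`) can also be sketched programmatically; `localhost_url` below is a hypothetical helper for illustration, not part of the tutorial or this commit:

```python
from urllib.parse import urlsplit, urlunsplit

def localhost_url(url: str) -> str:
    # Swap the container hostname for 127.0.0.1, keeping the port
    # and the ?token=... query string intact.
    parts = urlsplit(url)
    netloc = "127.0.0.1" + (f":{parts.port}" if parts.port else "")
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

print(localhost_url("http://f3a9c1b2d4e5:8888/?token=abc123"))
# http://127.0.0.1:8888/?token=abc123
```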

```diff
@@ -79,7 +80,7 @@ can be used to access the notebook from browser. You may need to replace the giv
 
 From the repo directory
 
-    jupyter notebook
+    jupyter notebook
 
 Or
```

```diff
@@ -110,8 +111,8 @@ This was already done for method c) and does not need repeating.
 
 2. [Bag](02_bag.ipynb) - the first high-level collection: a generalized iterator for use
 with a functional programming style and to clean messy data.
-
-3. [Array](03_array.ipynb) - blocked numpy-like functionality with a collection of
+
+3. [Array](03_array.ipynb) - blocked numpy-like functionality with a collection of
 numpy arrays spread across your cluster.
 
 7. [Dataframe](04_dataframe.ipynb) - parallelized operations on many pandas dataframes
```
```diff
@@ -120,7 +121,7 @@ spread across your cluster.
 5. [Distributed](05_distributed.ipynb) - Dask's scheduler for clusters, with details of
 how to view the UI.
 
-6. [Advanced Distributed](06_distributed_advanced.ipynb) - further details on distributed
+6. [Advanced Distributed](06_distributed_advanced.ipynb) - further details on distributed
 computing, including how to debug.
 
 7. [Dataframe Storage](07_dataframe_storage.ipynb) - efficient ways to read and write
```
