
Commit

Add colab links (#939)
* Adds Google Colab links where it makes sense

We need to fix up our notebooks and redo them with the
new magics -- and then also add notebooks to examples
that don't have them.

* Fixes docs typo

Fixes code typo

* Apply suggestions from code review
skrawcz authored Jun 13, 2024
1 parent b89c019 commit 6385a82
Showing 24 changed files with 1,003 additions and 1,199 deletions.
@@ -14,3 +14,8 @@ Open it in google collab:
* simple_pipeline.ipynb - this contains documentation and code. Read this.
* pipeline.py - what the code in simple_pipeline.ipynb creates for easy reference
* requirements.txt - python dependencies required (outside of jupyter lab)

To exercise this example, you can run it in Google Colab:

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/LLM_Workflows/RAG_document_extract_chunk_embed/simple_pipeline.ipynb)
5 changes: 5 additions & 0 deletions examples/caching_nodes/caching_graph_adapter/README.md
@@ -16,3 +16,8 @@ For iterating during development, the general process would be:
its name to the adapter in the `force_compute` argument. Then, this node and its downstream
nodes will be computed instead of loaded from cache.
4. When no longer required, you can just skip (3) and any caching behavior will be skipped.

To exercise this example, you can run it in Google Colab:

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/caching_nodes/caching_graph_adapter/caching_nodes.ipynb)
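
For orientation only (this is not part of the commit's diff), the iteration loop that README describes might look roughly like the sketch below. The import path `hamilton.experimental.h_cache`, the `force_compute` argument, and the module/node names are assumptions for illustration; defer to the example's own code for the real imports.

```python
# Illustrative sketch only -- paths, names, and signatures are assumptions.
from hamilton import driver
from hamilton.experimental import h_cache  # assumed home of CachingGraphAdapter

import my_functions  # hypothetical module of Hamilton functions

# Cache node results under ./cache; on later runs, cached nodes load from disk.
adapter = h_cache.CachingGraphAdapter("./cache")
dr = driver.Driver({}, my_functions, adapter=adapter)
dr.execute(["final_output"])

# While iterating on a node, force it (and its downstream nodes) to recompute
# instead of being loaded from cache:
adapter = h_cache.CachingGraphAdapter("./cache", force_compute={"node_being_edited"})
dr = driver.Driver({}, my_functions, adapter=adapter)
dr.execute(["final_output"])
```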
@@ -7,6 +7,13 @@
"# Caching Nodes with Hamilton"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "#! pip install pandas pyarrow sf-hamilton"
},
{
"cell_type": "code",
"execution_count": 1,
4 changes: 4 additions & 0 deletions examples/contrib/README.md
@@ -9,6 +9,10 @@ For the purpose of this example, we will create a virtual environment with hamil
2. `. venv/bin/activate` (on MacOS / Linux) or `. venv/bin/Scripts` (Windows)
3. `pip install -r requirements.txt`

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/contrib/notebook.ipynb)


# 3 ways to import
There are 3 main ways to use community dataflows: static installation, dynamic installation, and local copy (see [documentation](https://github.com/DAGWorks-Inc/hamilton/tree/main/contrib)). We present each of them in this example:
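
As a quick illustration of the dynamic-installation route mentioned in that README (not part of the commit's diff), the general shape is sketched below; the `import_module` signature and the dataflow/author names are placeholders, so check the contrib documentation for the exact call.

```python
# Illustrative sketch only -- the dataflow name and author are placeholders.
from hamilton import dataflows, driver

# Dynamic installation: pull a community dataflow at runtime and load it
# as a regular Python module.
some_dataflow = dataflows.import_module("some_dataflow", "some_author")

dr = driver.Builder().with_modules(some_dataflow).build()
print(dr.list_available_variables())
```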
7 changes: 7 additions & 0 deletions examples/contrib/notebook.ipynb
@@ -20,6 +20,13 @@
"collapsed": false
}
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "# !pip install sf-hamilton-contrib"
},
{
"cell_type": "markdown",
"metadata": {},
7 changes: 7 additions & 0 deletions examples/dagster/hamilton_code/notebook.ipynb
@@ -9,6 +9,13 @@
"[Tips on Hamilton + notebooks in the docs](https://hamilton.dagworks.io/en/latest/how-tos/use-in-jupyter-notebook/)"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": ""
},
{
"cell_type": "markdown",
"metadata": {},
6 changes: 6 additions & 0 deletions examples/dask/community_demo/README.md
@@ -19,6 +19,12 @@ pip install -r requirements.txt
jupyter notebook
```

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dask/community_demo/demo_day_notebook.ipynb)



If you have questions, or need help with this example,
join us on [slack](https://join.slack.com/t/hamilton-opensource/shared_invite/zt-1bjs72asx-wcUTgH7q7QX1igiQ5bbdcg), and we'll try to help!

19 changes: 9 additions & 10 deletions examples/dask/community_demo/demo_day_notebook.ipynb
@@ -1,18 +1,17 @@
{
"cells": [
{
"metadata": {},
"cell_type": "code",
"execution_count": 1,
"metadata": {
"ExecuteTime": {
"end_time": "2023-05-22T22:27:16.952106Z",
"start_time": "2023-05-22T22:27:15.116241Z"
},
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"execution_count": null,
"source": "#!pip install pandas \"sf-hamilton[dask,visualization]\""
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": 1,
"source": [
"# Cell 1 - import the things you need\n",
"import logging\n",
Expand Down
6 changes: 6 additions & 0 deletions examples/dask/hello_world/README.md
@@ -19,6 +19,12 @@ idea is that you'd swap this module out for other ways of loading data or use @c
* `run_with_delayed_and_dask_objects.py` shows the combination of the above. It is slightly nonsensical, since we're
effectively operating entirely on dask objects, but it otherwise shows the code pattern for using both.

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dask/hello_world/notebook.ipynb)



# Visualization of execution
Here is the graph of execution:

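
For reference (again, not part of the commit's diff), the "hello world" pattern that README describes might be sketched as follows; the plugin module path `hamilton.plugins.h_dask` and the function module name are assumptions, so defer to the example's `run_*.py` scripts listed above for the real imports.

```python
# Illustrative sketch only -- module paths and names are assumptions.
from dask.distributed import Client

from hamilton import driver
from hamilton.plugins import h_dask  # assumed home of DaskGraphAdapter

import my_functions  # hypothetical module of Hamilton functions

client = Client(processes=False)           # local Dask "cluster"
adapter = h_dask.DaskGraphAdapter(client)  # delegate node execution to Dask
dr = driver.Driver({}, my_functions, adapter=adapter)
print(dr.execute(["final_output"]))
client.close()
```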