
Add colab links #939

Merged 3 commits on Jun 13, 2024
@@ -14,3 +14,8 @@ Open it in Google Colab:
* simple_pipeline.ipynb - this contains documentation and code. Read this.
* pipeline.py - what the code in simple_pipeline.ipynb creates for easy reference
* requirements.txt - python dependencies required (outside of jupyter lab)

To exercise this example you can run it in Google Colab:

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/LLM_Workflows/simple_pipeline.ipynb)
5 changes: 5 additions & 0 deletions examples/caching_nodes/caching_graph_adapter/README.md
@@ -16,3 +16,8 @@ For iterating during development, the general process would be:
its name to the adapter in the `force_compute` argument. Then, this node and its downstream
nodes will be computed instead of loaded from cache.
4. When no longer required, you can just skip (3) and any caching behavior will be skipped.

To exercise this example you can run it in Google Colab:

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/caching_nodes/caching_graph_adapter/caching_nodes.ipynb)
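The caching workflow described in that README (cache node results on disk, force recomputation of selected nodes and their downstream dependents) can be illustrated with a small stdlib sketch. This is a hypothetical stand-in to show the idea, not Hamilton's actual `CachingGraphAdapter` API; the `cached` decorator, the `force_compute` argument, and the cache layout are all invented for illustration:

```python
import pickle
from pathlib import Path

CACHE_DIR = Path("./node_cache")
CACHE_DIR.mkdir(exist_ok=True)

def cached(name, force_compute=frozenset()):
    """Cache a node's result on disk, keyed by node name.

    Names listed in `force_compute` are recomputed and re-cached even
    when a cached value exists -- mirroring step (3) in the README.
    """
    def decorator(fn):
        def wrapper(*args, **kwargs):
            path = CACHE_DIR / f"{name}.pkl"
            if path.exists() and name not in force_compute:
                return pickle.loads(path.read_bytes())  # load from cache
            result = fn(*args, **kwargs)  # compute: first run, or forced
            path.write_bytes(pickle.dumps(result))
            return result
        return wrapper
    return decorator

@cached("expensive_node")
def expensive_node(x: int) -> int:
    return x * 2
```

Dropping the `force_compute` argument restores plain cache-hit behavior, which corresponds to step (4).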
@@ -7,6 +7,13 @@
"# Caching Nodes with Hamilton"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "#! pip install pandas pyarrow sf-hamilton"
},
{
"cell_type": "code",
"execution_count": 1,
4 changes: 4 additions & 0 deletions examples/contrib/README.md
@@ -9,6 +9,10 @@ For the purpose of this example, we will create a virtual environment with hamil
2. `. venv/bin/activate` (on MacOS / Linux) or `venv\Scripts\activate` (Windows)
3. `pip install -r requirements.txt`

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/contrib/notebook.ipynb)


# 3 ways to import
There are 3 main ways to use community dataflows: static installation, dynamic installation, and local copy (see [documentation](https://github.com/DAGWorks-Inc/hamilton/tree/main/contrib)). We present each of them in this example:
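Of the three routes above, the "local copy" one boils down to importing a module from a copied file path. A minimal stdlib sketch of that mechanism follows; the function name and file layout are illustrative, not part of hamilton's `dataflows` API:

```python
import importlib.util
import sys
from pathlib import Path

def import_local_dataflow(path: str):
    """Load a locally copied dataflow module from an arbitrary file path."""
    p = Path(path)
    spec = importlib.util.spec_from_file_location(p.stem, p)
    module = importlib.util.module_from_spec(spec)
    sys.modules[p.stem] = module  # register so later imports resolve to it
    spec.loader.exec_module(module)
    return module
```

The returned module object can then be handed to the Hamilton driver like any hand-written dataflow module.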
7 changes: 7 additions & 0 deletions examples/contrib/notebook.ipynb
@@ -20,6 +20,13 @@
"collapsed": false
}
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": "# !pip install sf-hamilton-contrib"
},
{
"cell_type": "markdown",
"metadata": {},
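Each notebook change in this PR has the same shape: a leading code cell whose source is a commented-out `pip install`. A hedged sketch of generating such a cell programmatically with only the stdlib follows; the helper name and parameters are invented for illustration, and the PR's edits may well have been made by hand:

```python
import json

def prepend_install_cell(notebook_path: str, packages: str) -> None:
    """Insert a commented-out pip-install code cell at the top of a notebook.

    The cell dict mirrors the ones added in this diff.
    """
    with open(notebook_path) as f:
        nb = json.load(f)
    cell = {
        "metadata": {},
        "cell_type": "code",
        "outputs": [],
        "execution_count": None,  # serialized as null in the .ipynb JSON
        "source": f"# !pip install {packages}",
    }
    nb["cells"].insert(0, cell)
    with open(notebook_path, "w") as f:
        json.dump(nb, f, indent=1)
```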
6 changes: 6 additions & 0 deletions examples/dagster/hamilton_code/README.md
@@ -18,6 +18,12 @@ The directory also includes `mock_api.py`, which simulates a `resource` in the Dagster
python run.py
```

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dagster/hamilton_code/notebook.ipynb)



## Going further
- Learn the basics of Hamilton via the `Concepts/` [documentation section](https://hamilton.dagworks.io/en/latest/concepts/node/)
- Visit [tryhamilton.dev](https://www.tryhamilton.dev) for an interactive tutorial in your browser
7 changes: 7 additions & 0 deletions examples/dagster/hamilton_code/notebook.ipynb
@@ -9,6 +9,13 @@
"[Tips on Hamilton + notebooks in the docs](https://hamilton.dagworks.io/en/latest/how-tos/use-in-jupyter-notebook/)"
]
},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": null,
"source": ""
},
{
"cell_type": "markdown",
"metadata": {},
6 changes: 6 additions & 0 deletions examples/dask/community_demo/README.md
@@ -19,6 +19,12 @@ pip install -r requirements.txt
jupyter notebook
```

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dask/community_demo/demo_day_notebook.ipynb)



If you have questions, or need help with this example,
join us on [slack](https://join.slack.com/t/hamilton-opensource/shared_invite/zt-1bjs72asx-wcUTgH7q7QX1igiQ5bbdcg), and we'll try to help!

19 changes: 9 additions & 10 deletions examples/dask/community_demo/demo_day_notebook.ipynb
@@ -1,18 +1,17 @@
{
"cells": [
{
"metadata": {},
"cell_type": "code",
"execution_count": 1,
"metadata": {
"ExecuteTime": {
"end_time": "2023-05-22T22:27:16.952106Z",
"start_time": "2023-05-22T22:27:15.116241Z"
},
"pycharm": {
"name": "#%%\n"
}
},
"outputs": [],
"execution_count": null,
"source": "#!pip install pandas \"sf-hamilton[dask,visualization]\""
Collaborator: Do we need to escape the `\"` quotation marks? Also, we might not need the `#` comment marker.

Collaborator (Author): This is notebook escaping, because this is JSON.

},
{
"metadata": {},
"cell_type": "code",
"outputs": [],
"execution_count": 1,
"source": [
"# Cell 1 - import the things you need\n",
"import logging\n",
6 changes: 6 additions & 0 deletions examples/dask/hello_world/README.md
@@ -19,6 +19,12 @@ idea is that you'd swap this module out for other ways of loading data or use @c
* `run_with_delayed_and_dask_objects.py` shows the combination of the above. It is slightly nonsensical, since we're
effectively operating entirely on Dask objects, but it shows the code pattern for using both.

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dask/hello_world/notebook.ipynb)



# Visualization of execution
Here is the graph of execution:

6 changes: 6 additions & 0 deletions examples/dlt/README.md
@@ -29,6 +29,12 @@ It includes a pipeline to ingest messages from Slack channels and generate threa
python run.py general dlt
```

Or run it in Google Colab:
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)
](https://colab.research.google.com/github/dagworks-inc/hamilton/blob/main/examples/dlt/notebook.ipynb)



# References
- dlt to ingest [Slack data](https://dlthub.com/docs/dlt-ecosystem/verified-sources/slack)
- Reconstructing Slack threads [docs](https://api.slack.com/messaging/retrieving#finding_threads)