
Prototype: Compile Hamilton onto an Orchestration Framework #16

Open
HamiltonRepoMigrationBot opened this issue Feb 26, 2023 · 0 comments
Labels
enhancement (New feature or request), help wanted (Extra attention is needed), migrated-from-old-repo (Migrated from old repository), product idea

Comments

@HamiltonRepoMigrationBot

Issue by skrawcz
Wednesday Feb 02, 2022 at 01:35 GMT
Originally opened as stitchfix/hamilton#44


Is your feature request related to a problem? Please describe.
Another way to scale a Hamilton DAG is to break it up into stages and have some other orchestrator handle execution. Hamilton need not implement these capabilities itself -- it could just compile the DAG and delegate execution to these frameworks.

E.g., I have a Hamilton DAG but I want to use my in-house Metaflow system -- the user should be able to generate code that runs on Metaflow.
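For context, a Hamilton DAG is just a module of plain functions whose parameter names declare their upstream dependencies -- a minimal sketch (module and function names here are illustrative):

```python
# my_dag.py -- a minimal Hamilton DAG module.
# Each function is a node; parameter names bind to upstream node outputs.
import pandas as pd

def signups() -> pd.Series:
    return pd.Series([1, 10, 50])

def doubled_signups(signups: pd.Series) -> pd.Series:
    return signups * 2
```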

Describe the solution you'd like
A prototype to show how you could go from a Hamilton DAG to a DAG/Pipeline of some orchestration framework.

You'd have to think through the flow to do this, e.g.:

define Hamilton DAG -> compile to framework X -> commit generated code -> run code on framework X
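A rough sketch of what that flow could look like in code -- `driver.Driver` is real Hamilton API, but `compile_to` is hypothetical and does not exist today:

```python
import pathlib

from hamilton import driver

import my_dag  # the DAG module sketched above

dr = driver.Driver({}, my_dag)  # builds the Hamilton function graph

# Hypothetical compile step: emit e.g. a Metaflow FlowSpec whose steps wrap
# groups of Hamilton nodes, then commit the generated file and run it there.
# source = compile_to(dr, target="metaflow")          # hypothetical API
# pathlib.Path("signups_flow.py").write_text(source)  # commit & run on Metaflow
```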

We should prototype at least two implementations to see how we'd need to structure the code so it stays maintainable.

Describe alternatives you've considered
Hamilton could implement what these orchestration frameworks do itself, but that seems like a heavy lift. Better to try compiling to an existing framework.

Additional context
N/A

@elijahbenizzy added the migrated-from-old-repo (Migrated from old repository) label Feb 26, 2023
@skrawcz added the help wanted (Extra attention is needed) label Jul 18, 2024
elijahbenizzy added a commit that referenced this issue Sep 9, 2024
# This is the 1st commit message:

Update graph_functions.py

Describes what to do in `graph_functions.py`
# This is the commit message #2:

Adds comments to lifecycle base
# This is the commit message #3:

Update h_ray.py with comments for ray tracking compatibility
# This is the commit message #4:

Replicate previous error

# This is the commit message #5:

Inline function; unsure if catching errors and exceptions should be handled differently

# This is the commit message #6:

BaseDoRemoteExecute has the added Callable that sandwiches the lifecycle hooks

# This is the commit message #7:

method fails, says AssertionError about ray.remote decorator

# This is the commit message #8:

simple script for now to check telemetry; execution yields the ray.remote AssertionError

# This is the commit message #9:

passing pointer through and arguments to lifecycle wrapper into ray.remote

# This is the commit message #10:

post-execute hook for node not called

# This is the commit message #11:

`finally` executed only when an exception occurs; Hamilton tracker not executed

# This is the commit message #12:

atexit.register does not work; node keeps running in UI

# This is the commit message #13:

added stop() method, but doesn't get called

# This is the commit message #14:

Ray telemetry works for single node; problem with connected nodes

# This is the commit message #15:

Ray telemetry works for single node; problem with connected nodes

# This is the commit message #16:

Ray telemetry works for single node; problem with connected nodes

# This is the commit message #17:

Fixes ray object dereferencing

Ray does not resolve nested arguments:
https://docs.ray.io/en/latest/ray-core/objects.html#passing-object-arguments

So one option is to make them all top level:

- One way to do that is to make the other arguments not clash with any
possible user parameters -- hence the `__` prefix. This is what I did.
- Another way would be, in the ray adapter, to wrap the incoming function
and explicitly do a ray.get() on any Ray object references in the
kwargs, i.e. keep the nested structure, but have the ray task wait
for all inputs when it starts. Not sure which is best, but this
now works correctly.
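A minimal sketch of the Ray behavior described above (function names illustrative; requires a running Ray instance):

```python
import ray

ray.init()

@ray.remote
def produce() -> int:
    return 41

@ray.remote
def top_level(x: int) -> int:
    # Ray resolves top-level ObjectRef arguments before the task body runs.
    return x + 1

@ray.remote
def nested(kwargs: dict) -> int:
    # Refs nested inside containers are NOT resolved automatically;
    # the task must ray.get() them itself (the second option above).
    x = kwargs["x"]
    return (ray.get(x) if isinstance(x, ray.ObjectRef) else x) + 1

ref = produce.remote()
assert ray.get(top_level.remote(ref)) == 42      # auto-dereferenced
assert ray.get(nested.remote({"x": ref})) == 42  # manual ray.get
```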

# This is the commit message #18:

ray works checkpoint, pre-commit fixed

# This is the commit message #19:

fixed graph-level telemetry proposal

# This is the commit message #20:

pinned ruff

# This is the commit message #21:

Correct output, added option to start ray cluster

# This is the commit message #22:

Unit test mimics the DoNodeExecute unit test