Handle importing relocated dispatch functions #623

Merged (5 commits, May 26, 2021)
11 changes: 8 additions & 3 deletions dask_cuda/proxy_object.py
@@ -18,6 +18,11 @@
 from dask.sizeof import sizeof
 from distributed.worker import dumps_function, loads_function
 
+try:
+    from dask.dataframe.backends import concat_pandas
+except ImportError:
+    from dask.dataframe.methods import concat_pandas
+
 from .get_device_memory_objects import get_device_memory_objects
 from .is_device_object import is_device_object
@@ -726,10 +731,10 @@ def wrapper(*args, **kwargs):
 
 # Register dispatch of ProxyObject on all known dispatch objects
 for dispatch in (
-    dask.dataframe.utils.hash_object_dispatch,
+    dask.dataframe.core.hash_object_dispatch,
     dask.dataframe.utils.make_meta,
     dask.dataframe.utils.make_scalar,
-    dask.dataframe.utils.group_split_dispatch,
+    dask.dataframe.core.group_split_dispatch,
     dask.array.core.tensordot_lookup,
     dask.array.core.einsum_lookup,
     dask.array.core.concatenate_lookup,
@@ -745,5 +750,5 @@ def wrapper(*args, **kwargs):
 # deserialize all ProxyObjects before concatenating
 dask.dataframe.methods.concat_dispatch.register(
     (pandas.DataFrame, pandas.Series, pandas.Index),
-    unproxify_input_wrapper(dask.dataframe.methods.concat_pandas),
+    unproxify_input_wrapper(concat_pandas),
 )
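For context, the registration above routes every input through unproxify_input_wrapper before the real pandas concat runs. A minimal sketch of that pattern follows; it is not dask_cuda's actual implementation, and Proxy, materialize, and unproxify are hypothetical stand-ins for the real ProxyObject machinery.

import functools

class Proxy:
    """Hypothetical stand-in for dask_cuda's ProxyObject."""

    def __init__(self, obj):
        self._obj = obj

    def materialize(self):
        # Return the wrapped object (the real class deserializes on demand).
        return self._obj

def unproxify(x):
    # Materialize proxies; pass ordinary objects through unchanged.
    return x.materialize() if isinstance(x, Proxy) else x

def unproxify_input_wrapper(func):
    # Wrap a dispatch target so it never sees Proxy instances.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(
            *(unproxify(a) for a in args),
            **{k: unproxify(v) for k, v in kwargs.items()},
        )
    return wrapper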
1 change: 1 addition & 0 deletions dask_cuda/tests/test_ucx_options.py
@@ -50,6 +50,7 @@ def _test_global_option(seg_size):
     assert conf["SEG_SIZE"] == seg_size
 
 
+@pytest.mark.xfail(reason="https://github.com/rapidsai/dask-cuda/issues/627")
 def test_global_option():
     for seg_size in ["2K", "1M", "2M"]:
         p = mp.Process(target=_test_global_option, args=(seg_size,))
Comment on lines +53 to 56

Member Author:
OK, I marked this test with xfail so we can get these changes out the door and fix other CI issues. I raised issue #627 to track/investigate it.
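For reference, a minimal sketch of pytest's xfail semantics (a hypothetical test, not part of this PR): the test still runs, but a failure is reported as XFAIL rather than failing the suite, and an unexpected pass is reported as XPASS.

import pytest

@pytest.mark.xfail(reason="tracked in an upstream issue")
def test_known_failure():
    assert False  # reported as XFAIL, not a suite failure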

Member:
I think this is OK. @pentschev, do you have any concerns? We are still in burn-down, so if something does crop up we have time to handle it.

Member Author:
For clarity, cuDF's and RAFT's CI still fails without these changes, which is blocking a fair number of people. No objection to revisiting, but I do think we need to do this in the short term.

Member:
This is weird; it never failed before, including in my nightly runs with both UCX 1.9 and UCX master. I'm actually concerned that this may be an error spilling over from a previous test that pytest ran. I opened a PR in #628 to trigger CI; let's hold on and see if that fails too before merging this.

Member Author:
Agree it is weird. I don't recall seeing this error on this PR before that build either.

OK, though JFYI: before this came up, Ben had used a merge command upthread ( #623 (comment) ), so the bot may wind up merging this anyway if CI passes.

Member:
Haha, yeah, it was too quick, so no worries. I'm now wondering whether this isn't a bit dangerous: one could make any change to the PR after someone told the bot to merge, and unless somebody sees it in time, it would still be merged.

Member Author:
Yeah, the merge command probably should be associated with a commit hash.

Member:
Or just invalidate it based on timestamp: if the commit timestamp is newer than the gpucibot merge comment, invalidate the automerge.

Member:
I mean the time the commit was pushed, not the local commit timestamp.
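A hedged sketch of the guard being proposed here (not the bot's actual logic; automerge_still_valid, last_push_time, and merge_comment_time are hypothetical names): compare the server-side push time against the time the merge comment was posted, and invalidate the automerge if anything was pushed afterwards.

from datetime import datetime

def automerge_still_valid(
    last_push_time: datetime, merge_comment_time: datetime
) -> bool:
    # Use the time the commits were pushed (server-side), not the local
    # commit timestamp, since the latter can be set to anything.
    return last_push_time <= merge_comment_time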

Member Author:
Submitted PR #630 to revert this piece. It probably still fails, but at least that gives us something to work from.
