
Partially implement LTC Backend interface for PyTorch/XLA. #3571

Merged
merged 1 commit into master from backend_interface on May 27, 2022

Conversation

Collaborator

@JackCaoG JackCaoG commented May 17, 2022

Partially implement LTC Backend interface for PyTorch/XLA.

@JackCaoG JackCaoG added the ltc label May 17, 2022
@JackCaoG JackCaoG force-pushed the xla_backend_data branch 3 times, most recently from 568024c to 7ec217c on May 19, 2022 22:14
@JackCaoG JackCaoG force-pushed the backend_interface branch from bdaedc2 to 7626a74 on May 20, 2022 00:18
@JackCaoG JackCaoG changed the base branch from xla_backend_data to xla_lowering_context on May 20, 2022 00:18
@JackCaoG JackCaoG force-pushed the backend_interface branch from 7626a74 to 457ea3e on May 20, 2022 01:58
@JackCaoG JackCaoG force-pushed the xla_lowering_context branch from 39158bb to 0a1b78b on May 20, 2022 18:36
@JackCaoG JackCaoG force-pushed the backend_interface branch from 457ea3e to f55ac29 on May 20, 2022 20:11
@JackCaoG JackCaoG mentioned this pull request May 20, 2022
@JackCaoG JackCaoG force-pushed the backend_interface branch from f55ac29 to 6d0a2d4 on May 21, 2022 00:01
@JackCaoG JackCaoG changed the title from "[WIP] Add backend interface" to "Add backend interface" on May 21, 2022
@JackCaoG JackCaoG requested a review from wonjoolee95 on May 21, 2022 00:02
@JackCaoG JackCaoG force-pushed the backend_interface branch from 6d0a2d4 to e142cdb on May 21, 2022 00:15
Base automatically changed from xla_lowering_context to master May 25, 2022 18:36
@JackCaoG JackCaoG force-pushed the backend_interface branch 2 times, most recently from 86cd80c to 251520b on May 26, 2022 21:59
@JackCaoG JackCaoG force-pushed the backend_interface branch from 251520b to 49198a2 on May 26, 2022 22:02
@JackCaoG JackCaoG changed the title from "Add backend interface" to "Partially implement LTC Backend interface for PyTorch/XLA." on May 26, 2022
@JackCaoG
Collaborator Author

@wonjoolee95 I think this PR is ready to be reviewed. This PR only partially implements the LTC Backend interface; I will have follow-up PR(s) for the remaining pieces.

std::vector<std::string> GetCompilationDevices(
    const std::string& device,
    c10::ArrayRef<std::string> devices) const override {
  // Delegate device resolution to the XLA computation client.
  return xla::ComputationClient::Get()->GetCompilationDevices(device,
                                                              devices);
}
Collaborator Author
FYI @yeounoh, we currently call it in xla::ComputationClient::Get()->GetCompilationDevices(...).
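
For context, here is a minimal sketch of how this override could be reached once the backend is registered with lazy tensor core. It assumes the upstream torch::lazy::getBackend() accessor; the device strings and the helper name ResolveCompilationDevices are illustrative placeholders, not code from this PR.

#include <string>
#include <vector>
#include <torch/csrc/lazy/backend/backend_interface.h>

// Hypothetical call site: resolve compilation devices through the
// registered lazy-tensor backend, which would dispatch to the
// XlaBackendImpl override shown above.
std::vector<std::string> ResolveCompilationDevices() {
  return torch::lazy::getBackend()->GetCompilationDevices(
      /*device=*/"xla:0", /*devices=*/{"xla:0", "xla:1"});
}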

Collaborator

@wonjoolee95 wonjoolee95 left a comment

Thanks! So this XlaBackendImpl won't be used yet in XLA until we implement the InitXlaBackend function, correct?

Collaborator Author

JackCaoG commented May 26, 2022

@wonjoolee95 It won't be used for the most part even if we implement InitXlaBackend. Some functions need to call getBackend()->xxxx to execute the functions we defined here. Currently most of the Backend functions are used in LTC's LazyGraphExecutor.cpp; until we fully migrate to that, this file won't be very relevant.

One thing we can do incrementally is to start using the functions in XlaBackend in our torch_xla/csrc/tensor.cpp. That way we can get some early signal on the correctness of these methods.
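
As a rough illustration of that wiring (a sketch, not code from this PR): assuming the upstream torch::lazy BackendRegistrar/getBackend machinery and a default-constructible XlaBackendImpl, registration could look roughly like this. The helper name GetXlaBackendImpl and the commented include path are hypothetical.

// #include "torch_xla/csrc/xla_backend_impl.h"  // hypothetical header declaring XlaBackendImpl
#include <torch/csrc/lazy/backend/backend_interface.h>

// Expose a singleton XlaBackendImpl (hypothetical helper name).
torch::lazy::BackendImplInterface* GetXlaBackendImpl() {
  static XlaBackendImpl* backend_impl = new XlaBackendImpl();
  return backend_impl;
}

// Register the backend so torch::lazy::getBackend() returns it; after this,
// call sites such as torch_xla/csrc/tensor.cpp could start going through
// getBackend()->... incrementally to exercise these methods.
void InitXlaBackend() {
  static torch::lazy::BackendRegistrar registrar(GetXlaBackendImpl());
}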

@JackCaoG JackCaoG merged commit 73aa5c5 into master May 27, 2022
@JackCaoG JackCaoG deleted the backend_interface branch May 27, 2022 01:15