
Introduce spec.md #766

Merged
merged 1 commit into openxla:main from spec on Dec 14, 2022

Conversation

@burmako (Contributor) commented Dec 14, 2022

Earlier today, concluding the Q3/Q4 speccing marathon, we finished speccing the HLO semantics for the StableHLO ops.

This was a huge effort that involved writing 93 specs, including digging deep into the involved semantics of ops like batch_norm_grad, convolution, and dot_general. Congratulations to everyone who contributed to this important milestone!

The idea of this project was to create a baseline from which the StableHLO opset will evolve in the future. Our immediate next steps will be writing a dynamism RFC (#8) and speccing quantization (#588) on top of this baseline.

The speccing marathon has also uncovered a lot of future work, both in cleaning up the opset and in bringing the implementation into full conformance with the spec. We're aiming to address this over the next year.

@burmako merged commit 8007c55 into openxla:main on Dec 14, 2022
@burmako deleted the spec branch on Dec 14, 2022 at 02:03