
WIP: Use constant propagation instead of Val in linalg #25303

Closed
wants to merge 2 commits into from

Conversation

andreasnoack
Member

This is my first attempt to remove the uses of Val in the linear algebra code and instead rely on constant propagation. Currently, I think I have to use too many tricks to make inference work here, so it would be great if @vtjnash could take a look. E.g. I'd like to get rid of these definitions and these @inline annotations.

@vtjnash
Member

vtjnash commented Dec 28, 2017

I think that’s right for now. Our heuristics are currently tied to inlining, so I think we can’t do much better right now.

@andreasnoack
Member Author

Our heuristics are currently tied to inlining

Would it be technically possible to make constant propagation a guarantee instead of an optimization? For this application here, I'm literally using it instead of dispatch and right now I'm heavily abusing @inline to make it work.
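For context, a minimal sketch of the two styles being compared (function names here are illustrative, not taken from the PR): the Val-based code encodes the choice in the type domain so ordinary dispatch selects the method, while the constant-propagation style branches on a plain value and relies on the compiler folding the branch away when the argument is a compile-time constant — which currently only happens reliably when the function is inlined.

```julia
# Val-based style: the choice lives in the type domain,
# so multiple dispatch selects the method at compile time.
pick(A, ::Val{true})  = "pivoted"
pick(A, ::Val{false}) = "unpivoted"

# Constant-propagation style: the choice is a plain Bool, and the
# branch is expected to fold away for literal arguments; @inline is
# needed because the constprop heuristics are tied to inlining.
@inline pick(A, pivot::Bool) = pivot ? "pivoted" : "unpivoted"
```

Both definitions compute the same result; the difference is only in whether the compiler can prove a concrete return type when `pivot` arrives as a runtime value rather than a constant.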

@andyferris
Member

Is using small, redirecting @inline functions (thunks) that lift constants to Vals a decent approach? It would guarantee constant propagation while letting the inlining heuristic do its thing for the (possibly large) implementation functions.
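A minimal sketch of that thunk pattern (names here are illustrative, not from the PR): a tiny @inline entry point lifts the Symbol into a Val, and the implementation methods keep dispatching on the Val type as before.

```julia
# Small @inline thunk: with constant propagation, Val(pivot) becomes a
# concrete Val{:none}/Val{:colnorm} type when `pivot` is a literal.
@inline qrsketch(A, pivot::Symbol) = _qrsketch(A, Val(pivot))

# The (possibly large) implementations use ordinary type dispatch and
# remain subject to the normal inlining heuristics.
_qrsketch(A, ::Val{:none})    = "unblocked"
_qrsketch(A, ::Val{:colnorm}) = "pivoted"
```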

if pivot == :none
return qrfactUnblocked!(A)
elseif pivot == :colnorm
return qrfactPivotedUnblocked!(A)
Member

Transformations like these from explicit dispatch to dispatch-simulation-via-conditional seem a bit unfortunate. Perhaps some best-of-both-worlds approach exists, or could exist in the future? :)

Member

Perhaps dispatching to internal Val versions makes the code cleaner (like #24960 (comment)).

Not sure if there are any drawbacks.

Member

Nice! Thanks :). (Interestingly, that approach dovetails nicely with Andy's comment.)

Member

I think the explicit-list approach should be the preferred Julian style for these methods.

Member

Out of curiosity, why? :)

@andreasnoack
Member Author

Fixes JuliaLang/LinearAlgebra.jl#476

@dkarrasch
Member

I have a local rebase of this PR, including some minor adjustments to avoid breaking the current interface. Not surprisingly, the lu tests pass even with a few @inferred added, but the qr inference tests fail because inference finds a Union{LinearAlgebra.QRCompactWY{Float64, Matrix{Float64}}, QRPivoted{Float64, Matrix{Float64}}}, which may not be too bad after all. Is this still of interest?

@dkarrasch
Member

Aha, if I force inlining more, then the tests actually pass. Since I have it anyway, I'll open a new PR for discussion.

@KristofferC
Member

#40623

@DilumAluthge DilumAluthge deleted the anj/novals branch August 24, 2021 05:20
Labels
linear algebra Linear algebra
6 participants