Add `vectorize` method for `LKJCholesky` (#485)

Conversation
Coveralls: Pull Request Test Coverage Report for Build 5356891002.
Codecov Report (coverage diff, master vs. #485):

|          | master | #485   | +/-    |
|----------|--------|--------|--------|
| Coverage | 76.45% | 76.40% | -0.05% |
| Files    | 21     | 21     |        |
| Lines    | 2514   | 2522   | +8     |
| Hits     | 1922   | 1927   | +5     |
| Misses   | 592    | 595    | +3     |
Oh wait,

Thanks, @harisorgn.
Does this require updating the Bijectors compat entry?
That should already be covered by the implementation for

EDIT: Aaah nah, this is a

EDIT 2: But in this case we just need

`reconstruct(::Distribution{CholeskyVariate}, val::Cholesky) = copy(val)`

and you should be good :)
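As a self-contained illustration of what that no-op reconstruction amounts to (the `reconstruct` defined here is a local stand-in mirroring the one-liner above, not an import of DynamicPPL's function):

```julia
using Distributions, LinearAlgebra

# Local stand-in for the suggested method: the value already is a `Cholesky`,
# so there is nothing to reshape and we just return a copy.
# (Assumes `copy(::Cholesky)` is available, as the suggestion above does.)
reconstruct(::Distribution{CholeskyVariate}, val::Cholesky) = copy(val)

d = LKJCholesky(3, 1.0)
x = rand(d)               # already a `Cholesky`
y = reconstruct(d, x)     # an independent copy with the same factor
y.UL == x.UL              # true
```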
You mean because the value that will hit

So technically the change in this PR isn't dependent on fixing Bijectors.jl, but it's not particularly useful without the most recent version of Bijectors.jl 🤷 Personally, I don't have any strong opinions about this.
Looks good to me!
Happy to merge once the following has been done:
- Add the `reconstruct` implementation mentioned above.
- Bump the patch version so we can easily release.
- Maybe bump the compat entry of Bijectors.jl to 0.12.7.
If I understand this correctly, by also including the zero elements, this will cause the induced distribution to be improper. It really is quite important that in unconstrained space all parameters map bijectively to the target space (here, Cholesky factors of correlation matrices).
It shouldn't :) This is just a default (see DynamicPPL.jl/src/abstract_varinfo.jl, lines 618 to 629 at 7b01d25).
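To make the distinction concrete (a sketch for illustration, not code from this PR): the flattened storage of a Cholesky factor has n² entries, but only n(n-1)/2 of them are free parameters, because the rows of the factor of a correlation matrix have unit norm. The bijectivity requirement applies to the link transform into unconstrained space, not to this storage-level flattening.

```julia
using Distributions, LinearAlgebra

n = 3
d = LKJCholesky(n, 1.0)
r = rand(d)                    # `Cholesky` factor of a random correlation matrix
L = Matrix(r.L)

# Rows of L have unit norm since L*L' has a unit diagonal, so the diagonal of L
# is determined by the off-diagonal entries.
[norm(L[i, :]) for i in 1:n]   # ≈ [1.0, 1.0, 1.0]

length(vec(Matrix(r.UL)))      # 9  entries in the flattened storage (zeroes included)
n * (n - 1) ÷ 2                # 3  free parameters in the unconstrained space
```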
@harisorgn Actually, it would be nice to have a test for a model using this!
Ah ok, so this would e.g. be used when converting
So this isn't necessarily related to how it's represented in the chain. This is mainly just about converting from the (usually) linearized representation in

As for representing in a
But in general (see line 240 at 7b01d25).
Just to check my understanding: there is an issue even after adding this method. It seems that when using Turing

```julia
@model function lkj_demo()
    R ~ LKJ(2,1)
end

sample(lkj_demo(), HMC(0.05,10), 10)
```

it looks like differentiation is w.r.t. a univariate distribution with a single value and a single partial. Same thing when

EDIT: To be clear, because of the shape of the
Should this test be in
Ah yes, I completely forgot! My previous suggestion of just implementing

We need to implement

`reconstruct(f::Inverse{VecCorrBijector}, dist::LKJ, val::AbstractVector) = val`

Basically, all of our linking and invlinking methods now also include a call to `reconstruct` (see DynamicPPL.jl/src/abstract_varinfo.jl, lines 589 to 603 at 7b01d25), and so for the transformations which completely change the type, we need to implement the corresponding `reconstruct` methods.
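A toy sketch of that dispatch pattern (entirely illustrative; the names below are made up and this is not DynamicPPL's actual code): a generic reconstruct-style method reshapes a flat vector back to the variate's shape, while a method dispatching on the inverse transform leaves the vector alone, because that transform itself consumes the linearised form and produces the structured value.

```julia
using Distributions

# Stand-in for something like `Inverse{VecCorrBijector}`; purely illustrative.
struct ToyInverseVecTransform end

# Generic fallback: assume `val` is a flat vector and reshape it to the variate's size.
toy_reconstruct(d::MatrixDistribution, val::AbstractVector) = reshape(val, size(d))

# Transform-aware method: the inverse transform works on the linearised form,
# so pass the vector through unchanged.
toy_reconstruct(::ToyInverseVecTransform, d::MatrixDistribution, val::AbstractVector) = copy(val)

d = LKJ(3, 1.0)
flat = vec(rand(d))                                          # 9-element vector

size(toy_reconstruct(d, flat))                               # (3, 3): reshaped to a matrix
length(toy_reconstruct(ToyInverseVecTransform(), d, flat))   # 9: left as a flat vector
```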
Yeah, that's probably a good idea.
Just reversing my previous approval.
E.g. the following is working nicely for me locally:

`reconstruct(f::Inverse{Bijectors.VecCorrBijector}, dist::LKJ, val::AbstractVector) = copy(val)`

EDIT: Note that when sampling with NUTS, it still eventually breaks, but this seems to be due to numerical instability.

EDIT 2: So
There are some issues with positive definiteness when I use it in a more complicated model (EDIT: this is me trying to write a tutorial for LKJ/LKJCholesky).
Yes, same. The bundling issue I guess needs to be addressed elsewhere.
This might have been part of the problem, but not the whole thing, as

Now
@torfjelde good to go? :D
@harisorgn Again, really awesome work :) It's going to be great to finally have this.

It looks good to me! Just a few minor suggestions, but these should be simple to apply via "commit suggestion", so I'll approve :)

Feel free to hit the merge button once you've gone over these few last comments 👍
I'm not authorised to merge. @torfjelde @yebai feel free to do it.
@harisorgn, you're a member of the TuringLang org, so you should have all the developer privileges. The setup of DynamicPPL prevents us from directly merging PRs; instead, you can click the "merge when ready" button. When CI tests pass, it will get automatically merged.
The excitement is tangible 👀
I opted for the simplest way to vectorize `LKJCholesky` samples, that is, to keep the entire matrix with the extra zeroes, not just the triangular part that is sampled.

EDIT: Do you think it's worth keeping just the relevant triangular part?
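A rough sketch of the trade-off being asked about (the code is illustrative only, not the PR's implementation): flattening the full factor keeps n² entries, structural zeroes included, whereas keeping only the sampled triangle would store n(n+1)/2.

```julia
using Distributions, LinearAlgebra

d = LKJCholesky(3, 1.0)
r = rand(d)                                   # a `Cholesky` factor

# Approach described above: keep the whole factor, zeroes and all.
full = vec(Matrix(r.UL))                      # 9 entries

# Alternative floated in the EDIT: keep only the sampled (lower) triangle.
L = r.L
tri = [L[i, j] for j in 1:3 for i in j:3]     # 6 entries
```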