sparse \ polyalgorithm #16
I did some tests for the dense `\` …
Is http://www.cise.ufl.edu/research/sparse/SuiteSparse/current/SuiteSparse/MATLAB_Tools/Factorize/Doc/factorize_demo.html covered by a license and, if so, what license? I haven't read it yet because of the possibility that it is GPL'd or LGPL'd.
It would be nice if the intelligence which decides which factorization to use were coded into a `factorize` function. Comparing this to Octave (http://www.gnu.org/software/octave/doc/interpreter/Sparse-Linear-Algebra.html#Sparse-Linear-Algebra), I see there is only one place where they reference the properties of the RHS in deciding how to factorize, and that is just whether the RHS is sparse or not. Finally, it could be a nice feature to allow …
Yes, I think we will want to follow …
@bsxfan I think most of your suggestions are already implemented in Julia, but in a different way than MATLAB. We already have factorizations which are used in the way you are suggesting; see e.g. the GLM package. If I know that the matrix is Hermitian, I can write `Hermitian(A)\b`, and if I want to reuse a positive definite left hand side, I can factorize it once with `cholfact`. We could write a `factorize` …
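To make the existing pattern concrete, here is a minimal sketch using the names of the time (`cholfact` is now `cholesky`, and `eye` is gone in modern Julia; the matrices are illustrative only):

```julia
# Sketch only: declare structure explicitly, or factor once and reuse.
A = randn(100, 100)
A = A'A + eye(100)     # symmetric positive definite by construction
b = randn(100)

x1 = Hermitian(A)\b    # dispatch on the declared structure
F  = cholfact(A)       # factor the positive definite matrix once...
x2 = F\b               # ...then reuse F for many right hand sides
```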
I think this is a case where we should violate Julia's constant return type convention. Requiring users to fully understand the factorizations to get efficient computation seems like a total failure on our part.
I agree, yes. I would vote for a generic `factorize` …
I think this may be one of those places where it's ok for the return type to depend on the input values – the return type is part of the essential information that the function returns. We'll need to consider whether we are ok with using the very generic name `factorize` …
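To make the trade-off concrete, here is a hypothetical illustration of the value-dependent return type (the exact types have varied across Julia versions, so take the results as indicative):

```julia
A = [2.0 1.0; 1.0 2.0]    # symmetric positive definite
B = [0.0 1.0; 1.0 0.0]    # symmetric but indefinite

typeof(factorize(A))      # e.g. Cholesky
typeof(factorize(B))      # e.g. BunchKaufman; same call, different type
```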
As @andreasnoackjensen said, we already have methods for factorizations. Whether one likes it or not, a lot of people are accustomed to doing …
I'd be happy with a name that's more targeted -- something like …
@bsxfan I am still not convinced by your example with complex differentiation. Why wouldn't the matrix be symmetric when evaluated as a complex matrix? If it were, you would use the Bunch-Kaufman for both. However, I can see the convenience in having a …
@andreasnoackjensen I have indeed been looking at your factorize, which looks very nice at a first glance. Sorry, my previous explanation may have been confusing. Let me give more details. A typical use case could look like this. In practice, you would have some real-valued rectangular matrix, say `D`, from which you form a matrix `C`. For real `D`, `C` would usually be positive definite, so you might be tempted to hard-code Cholesky into your log-likelihood implementation. But if you're doing complex-step differentiation, `D` would be complex, with very small imaginary values, so that `C` would end up being symmetric, but not Hermitian. The complex step, which just executes the same code, would then also do Cholesky. Since LAPACK accesses just one triangle, it does not see that `C` is not Hermitian, so the Cholesky returns with `info == 0` and your code will quietly continue to supply the wrong answer. If instead you had originally called an intelligent `factorize` instead of Cholesky, then different factorizations would be used for real and complex `C` and everything should work out. Still not implemented in your pull request is some additional wiring of `logdet` for some of the other factorizations. I have just been playing around with supplying LU with a `logdet`, and it seems to be working out nicely (complex step is giving correct results). Here is what I did to wire up LU for …
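For readers unfamiliar with the technique, here is a hedged sketch of complex-step differentiation itself; the test function and step size are illustrative, not from the thread:

```julia
# For analytic f, f'(x) ≈ imag(f(x + im*h))/h with tiny h. Unlike finite
# differences there is no subtractive cancellation, so h can be 1e-20.
complex_step(f, x, h=1e-20) = imag(f(x + im*h))/h

f(x) = exp(x)/sqrt(x)
complex_step(f, 1.5)   # agrees with the analytic derivative to full precision
```

This is why the code path matters: inside `f`, every matrix built from `x + im*h` becomes complex symmetric rather than Hermitian, and a hard-coded `cholfact` silently computes the wrong thing.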
oh, OK, my long reply was not lost; it is getting late this side of the Atlantic ... Thanks @andreasnoackjensen, your factorize looks like it will make my use case for complex-step differentiation work out very nicely. I have just typed out a long explanation of my use case, but lost it by navigating away before clicking commit --- aargh! I could add that I would also need …
This makes `logdet` applicable to more general arguments, while maintaining type stability.
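The snippet itself was lost in this extraction, but a hedged reconstruction of what wiring up `logdet` for LU could look like follows, using the `LU` field and indexing names of the time (`lufact` and `F[:U]` are now `lu` and `F.U`, and vectorized `log` is now `log.`):

```julia
# Sketch: log(det(A)) from PA = LU with unit-diagonal L, so
# det(A) = det(P) * prod(diag(U)) and det(P) = ±1.
function logdet_lu(A::AbstractMatrix)
    F = lufact(A)
    d = complex(diag(F[:U]))               # promote so log is defined for any pivot
    nswaps = sum(F.ipiv .!= 1:size(A, 1))  # each off-diagonal pivot is one row swap
    ld = sum(log(d))                       # sum of logs avoids over/underflow
    return isodd(nswaps) ? ld + log(complex(-1.0)) : ld
end
```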
@bsxfan Thank you for the explanation. I see your point now and have also managed to get complex numerical derivatives from my own likelihood function. I'll add your code to the pull request later today.
See: https://groups.google.com/forum/#!topic/julia-users/FagaqWVkBGU

Currently, …
Instead, we should do as Tim Davis suggested in the thread on the mailing list:
A quick skim of the cited paper turns up Section 3.2 (digital page 15), which can be translated into code of this form:

```julia
function factorize{T}(A::AbstractSparseArray{T,2})
    n, m = size(A)
    if n != m # A is rectangular
        # Compute a QR factorization of A
        # TODO: return qrfact of A or A', whichever is tall and thin
        return qrfact(A) # (n > m ? A : A')
    end
    # At this point A is square
    if issym(A) # A is symmetric
        if all(diag(A) .!= 0) && all(imag(diag(A)) .== 0)
            # A has an all-real, zero-free diagonal
            try
                return cholfact(A)
            catch e
                isa(e, Base.LinAlg.PosDefException) || rethrow(e)
            end
        end
        # cholfact failed or the condition on the diagonal does
        # not hold; do LDLt
        try
            return ldltfact(A)
        catch
            # TODO: catch some specific failure?
        end
    end
    # Above conditions do not hold, or chol and ldl failed
    return lufact(A)
end
```

The specific implementations of …
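For context, a hypothetical usage of the sketch above; the matrix construction is illustrative, and `speye` dates it to the Julia of the thread:

```julia
A = sprandn(1000, 1000, 0.005)
A = A + A' + 100*speye(1000)   # symmetric, zero-free dominant diagonal
b = randn(1000)

F = factorize(A)   # the sketch should select cholfact here
x = F\b            # whatever its type, F supports \ like any factorization
```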
Thanks @jiahao.
It is a good idea to follow Tim Davis' suggestion for the …
Actually, I'm not sure why …
Here is our current status: …
I'm on that. Actually, I've already wrapped SPQR, but I have to write tests.
Wow, you are amazing!
We need a polyalgorithm for sparse `\`.
This is the discussion from the mailing list.
https://groups.google.com/forum/?fromgroups#!topic/julia-users/FagaqWVkBGU
We may need enhancements to the dense `\` as well.
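For illustration, a hedged sketch of the kind of checks a dense `\` polyalgorithm performs; names are approximate for the Julia of the time, and the real Base implementation differs in detail:

```julia
function dense_solve(A::Matrix, b::Vector)
    m, n = size(A)
    m == n || return qrfact(A)\b             # least squares for rectangular A
    istriu(A) && return Triangular(A, :U)\b  # cheap O(n^2) triangular solve
    istril(A) && return Triangular(A, :L)\b
    if ishermitian(A)
        try
            return cholfact(A)\b             # fastest when positive definite
        catch
            # fall through to Bunch-Kaufman
        end
        return bkfact(A)\b                   # symmetric indefinite case
    end
    return lufact(A)\b                       # general square fallback
end
```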
Cc: @dmbates, @andreasnoackjensen