[ITensors] [BUG] Inconsistent Behavior of mindim #1207
Comments
I think this may be because we don't really have a full QR/SVD, only thin versions, so the maximum dimension is set by the smaller matrix dimension and you get these boundary effects depending on the sweep direction.
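For concreteness, here is a small sketch of that thin-factorization limit (the index dimensions are borrowed from the minimal example later in this thread; the tags are just labels I added):

```julia
using ITensors

# "Fat" left side (64 * 2 = 128) and thin right side (2 * 16 = 32)
l = Index(64, "l")
s = Index(2, "s")
s′ = Index(2, "s'")
r = Index(16, "r")
T = randomITensor(l, s, s′, r)

# A thin SVD over (l, s) can only produce a link of dimension
# min(dim(l) * dim(s), dim(s′) * dim(r)) = min(128, 32) = 32,
# no matter how large a mindim is requested.
U, S, V = svd(T, l, s)
@show dim(commonind(U, S))
```

Whichever side of the factorization is smaller caps the new link dimension, which is why the pattern depends on the sweep direction.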
Though I'm a bit confused about what behavior you are hoping for: do you want the bonds to be […]? Because of the thin QR/SVD, the bond dimension pattern will depend on the gauge center, which isn't really something particular to the DMRG function (i.e. try […]).
My personal preference would be forcing the bulk tensors to be the […].
I thought in your example above […].
To get the output you expected (based on your first post), does it work to call […]?
Yeah, I can call […].
Is this the bond dimension after each sweep or each 2-site update? If it's the latter, that's what I would expect based on how MPS gauging works.
It's after each 2-site update (in […]).
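For reference, one way to watch the bond dimensions per update is a small custom observer. This is only a sketch, assuming the `measure!` callback gets `psi` and `bond` keywords from `dmrg` (worth double-checking against your ITensors version):

```julia
using ITensors

# Prints all bond dimensions after every local (2-site) update.
struct LinkDimObserver <: AbstractObserver end

function ITensors.measure!(::LinkDimObserver; psi, bond, kwargs...)
  println("updated bond $bond, linkdims = ", linkdims(psi))
end

# Usage (H, psi0, sweeps as in the bug report):
# energy, psi = dmrg(H, psi0, sweeps; observer=LinkDimObserver())
```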
So I guess I'm not seeing an issue here, though happy to discuss offline why this is expected behavior.
Basically, if you imagine doing a sweep without doing any actual updates (say DMRG on an identity MPO), it should act like calling […].
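To make that concrete, here is a sketch (mine, not from the thread) that just moves the gauge center with `orthogonalize!` and prints the resulting bond dimensions; the exact numbers depend on how the initial random MPS is padded, so treat them as illustrative:

```julia
using ITensors

sites = siteinds("S=1/2", 6)
psi = randomMPS(sites; linkdims=8)
@show linkdims(psi)        # before any gauging

orthogonalize!(psi, 1)     # gauge center at the left edge
@show linkdims(psi)

orthogonalize!(psi, 6)     # gauge center at the right edge
@show linkdims(psi)
```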
Quick update, a more minimal example. Say we have a two-site tensor that we want to factorize:

```julia
s = Index(2)
s′ = Index(2)
l = Index(64)
r = Index(16)
T = randomITensor(l,s,s′,r)
```

The inconsistent behavior manifests as:

```julia
> L,R,spec = factorize(T,l,s; cutoff=0,mindim=64)
> @show inds(L)
inds(L) = ((dim=64|id=719), (dim=2|id=499), (dim=32|id=847|"Link,fact"))
```

where we get the 64 -> 32 -> 16 stepdown, versus:

```julia
> L,R,spec = factorize(T,l,s; cutoff=1e-10,mindim=64)
> @show inds(L)
inds(L) = ((dim=64|id=444), (dim=2|id=941), (dim=64|id=961|"Link,fact"))
```

where the factorize respects the mindim by padding the link dimension. I think the […]
Thanks. So when you call `factorize` with a large enough cutoff, it ends up using an eigendecomposition (`factorize_eigen`) rather than the thin QR/SVD, and that path can pad the link dimension up to `mindim`. You can see that here:

```julia
julia> L, R = ITensors.factorize_eigen(T, l, s; cutoff=1e-4, mindim=64);

julia> R
ITensor ord=3 (dim=64|id=630|"Link,eigen")' (dim=2|id=791) (dim=16|id=387)
NDTensors.Dense{Float64, Vector{Float64}}

julia> L
ITensor ord=3 (dim=64|id=131) (dim=2|id=103) (dim=64|id=630|"Link,eigen")'
NDTensors.Dense{Float64, Vector{Float64}}
```

This helps explain some of the confusion I had about this behavior, since using an eigendecomposition circumvents the thin QR/SVD behavior. It also explains why it occurs only for larger cutoff values. A quick fix could be to set the […].
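If it helps while debugging, `factorize` also takes a `which_decomp` keyword, so the backend can be pinned explicitly instead of being chosen automatically from the cutoff. A sketch of comparing the two paths on the same tensor (not a confirmed fix, and I haven't checked what dimensions each returns here):

```julia
# T, l, s as in the minimal example above
Lsvd, Rsvd, _ = factorize(T, l, s; cutoff=0, mindim=64, which_decomp="svd")
Leig, Reig, _ = factorize(T, l, s; cutoff=0, mindim=64, which_decomp="eigen")
@show dim(commonind(Lsvd, Rsvd))
@show dim(commonind(Leig, Reig))
```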
Description of bug
When using `mindim!`, the link dimension behavior at the "edge" is different depending on the value of `cutoff!`. If there is a cutoff, some tensors get set to the `mindim` while others are not, and if there isn't a cutoff (`cutoff!(sweeps, 0)`) the link dimensions are symmetric. I'm not sure which output is preferred, but I suspect it's the latter?
Minimal code demonstrating the bug or unexpected behavior
Minimal runnable code
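The collapsed code block above wasn't captured here, so the following is only a hedged reconstruction of what a minimal reproduction could look like; the Heisenberg MPO, chain length, and sweep parameters are illustrative choices, not necessarily the reporter's:

```julia
using ITensors

N = 8
sites = siteinds("S=1/2", N)

# Nearest-neighbor Heisenberg Hamiltonian as an MPO
os = OpSum()
for j in 1:(N - 1)
  os += "Sz", j, "Sz", j + 1
  os += 0.5, "S+", j, "S-", j + 1
  os += 0.5, "S-", j, "S+", j + 1
end
H = MPO(os, sites)

psi0 = randomMPS(sites; linkdims=10)

sweeps = Sweeps(5)
maxdim!(sweeps, 64)
mindim!(sweeps, 64)     # ask every bond to be padded up to 64 where possible
cutoff!(sweeps, 1e-10)  # compare the link dimensions against cutoff!(sweeps, 0)

energy, psi = dmrg(H, psi0, sweeps)
@show linkdims(psi)
```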
Expected output or behavior
Expected Output
If `cutoff!(sweeps, 0)`:
Actual output or behavior
Output of minimal runnable code
Version information
`versioninfo()`:
`using Pkg; Pkg.status("ITensors")`: