Some bugfixes and improvements #3672
Conversation
src/Rings/mpoly-ideals.jl
Outdated
# Try to eliminate variables first. This will often speed up computations significantly.
if simplify_ring
  Q, pr = quo(R, I)
  W, id, id_inv = simplify(Q)
  @assert domain(id) === codomain(id_inv) === Q
  @assert codomain(id) === domain(id_inv) === W
  res_simp = minimal_primes(modulus(W); algorithm, factor_generators, simplify_ring=false)
  result = [I + ideal(R, lift.(id_inv.(W.(gens(j))))) for j in res_simp]
  for p in result
  end
  return result
end
# This will in many cases lead to an easy simplification of the problem
if factor_generators
  J = typeof(I)[ideal(R, elem_type(R)[])]
  for g in gens(I)
    K = typeof(I)[]
    is_zero(g) && continue
    for (b, k) in factor(g)
      for j in J
        push!(K, j + ideal(R, b))
      end
    end
    J = K
  end
  result = unique!(filter!(!is_one, vcat([minimal_primes(j; algorithm, factor_generators=false) for j in J]...)))
  # The list might not consist of minimal primes only. We have to discard the embedded ones
  final_list = typeof(I)[]
  for p in result
    any((q !== p && is_subset(q, p)) for q in result) && continue
    push!(final_list, p)
  end
  for p in final_list
    set_attribute!(p, :is_prime=>true)
  end
  return final_list
end
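The combinatorial core of the factor-generators strategy above — branch on one irreducible factor per generator, decompose each branch, then keep only the inclusion-minimal components — can be sketched abstractly. The sketch below is plain Python for illustration only: frozensets of factor names stand in for the branch ideals, and the function names `split_by_factors` and `discard_embedded` are invented here; the real ideal arithmetic, `factor`, and `minimal_primes` live in Oscar/Julia.

```python
from itertools import product

def split_by_factors(generator_factors):
    # Each generator contributes exactly one of its irreducible factors;
    # a branch is the set of chosen factors (one choice per generator).
    # This mirrors the nested loops building K from J in the Julia code.
    return {frozenset(choice) for choice in product(*generator_factors)}

def discard_embedded(components):
    # Keep only inclusion-minimal components, mirroring the
    # `is_subset(q, p) && continue` filter in the Julia code:
    # p is discarded when some strictly smaller q is contained in it.
    return {p for p in components
            if not any(q < p for q in components)}

# Two generators with factors {x, y} and {x, z}:
gens = [["x", "y"], ["x", "z"]]
branches = split_by_factors(gens)          # {x}, {x,z}, {x,y}, {y,z}
minimal = discard_embedded(branches)       # {x} swallows {x,z} and {x,y}
```

In this toy run, the branch `{x}` makes the branches `{x, z}` and `{x, y}` redundant, leaving `{x}` and `{y, z}` — the same kind of pruning the `final_list` loop performs on the decomposed ideals.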
@afkafkafk13, @hannes14: Could you let me know what you think of these? I tried to run a script again from about 5 months ago and it broke down on some primary decompositions (which used to go through at some point). Now I'm trying to get back there with these tweaks. And I thought that they might be useful in general.
@hannes14: This was also used to prepare the example I sent you. Interestingly, it is solved rather quickly with algorithm=:charSets, but never terminated for me with :GTZ.
src/Rings/mpoly-ideals.jl
Outdated
for p in result
end
This looks a bit weird: either some line was lost or some leftovers were not deleted
Indeed. I will clean that up.

General question: Do you have any idea what increased the cost of these older scripts? I find this a bit scary!

The improvements make sense in any case. I have used such tweaks many times for large settings in Singular, where primary decomposition as a black box did not go through. But you need to be aware that you are deliberately circumventing the heuristics of primdec.lib, and this should only be necessary in nasty examples. Is your offending example really sufficiently nasty?
Thanks for your comments. No, in my opinion the offending example is not nasty. I sent it to @hannes14 via email, but I can put it here again:
Be aware that …

To me this did not look scary at all. But …
Does anyone have an explanation for the failing tests? I suppose the same things are tested in other threads, aren't they? Why do they fail on ubuntu 1.10 long?
ubuntu-1.10-long spent 2.5 hours in …; without experimental it spent about 2 hours in …; ubuntu-1.10-book failed due to some change in output in decker-schmitt-invariant-theory: https://github.com/oscar-system/Oscar.jl/actions/runs/8927137202/job/24519841523?pr=3672#step:8:3632
I entered the computation in Singular and made the following observations:
So the difference is in the internal choice of the "maximal independent set" in this primary decomposition algorithm, and of course in the difficulty of the lp Gröbner basis to be computed internally. This choice is not made by an elaborate heuristic; hence the situation is brittle whenever the different choices have significantly different runtimes. So probably some change further up in the hierarchy changed the sequence of the variables, which in turn triggers a different choice of a maximal independent set.
I changed the default algorithm for the place where primary decomposition is called in the elliptic surfaces. Maybe that already helps to have the tests run more reliably (#3676).
Thank you.
Concerning the long-running tests: it makes sense to keep them, but disable them. We should, however, remember to run them locally for all PRs concerning optimizations in the basic functionality for IdealSheaf, ..Divisor etc. to detect regressions.
L, loc_map = localization(W, U)

I = ideal(L, [y^2, 0]) # Dealing with zero generators lead to a bug once, so this is a real test.
Suggested change:
- I = ideal(L, [y^2, 0]) # Dealing with zero generators lead to a bug once, so this is a real test.
+ I = ideal(L, [y^2, 0]) # Dealing with zero generators led to a bug once, so this is a real test.
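The bug class referenced by this test — zero generators slipping into an ideal's generator list — is easy to guard against defensively. A hypothetical sketch in plain Python (not the Oscar implementation; `make_ideal_gens` is an invented name):

```python
def make_ideal_gens(gens):
    # Drop zero generators up front; a zero generator in the list once
    # triggered a downstream bug, which the test above now covers.
    nonzero = [g for g in gens if g != 0]
    # An empty generator list would be ambiguous, so represent the
    # zero ideal explicitly by a single zero generator.
    return nonzero if nonzero else [0]

print(make_ideal_gens([7, 0, 3]))  # zeros removed
print(make_ideal_gens([0, 0]))     # zero ideal keeps one generator
```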
The change to …
You can change the output line in the jlcon file (…).
function _maximal_associated_points(
    I::AbsIdealSheaf;
    covering=default_covering(scheme(I)),
    use_decomposition_info::Bool=true,
    algorithm::Symbol=:GTZ
  )
Quoting the Singular manual:
https://www.singular.uni-kl.de/Manual/latest/sing_1426.htm#SEC1507
"In small characteristic and for algebraic extensions, the procedures via Gianni, Trager, Zacharias may not terminate."
The algebraic extensions only refer to those implemented as an extended base field in Singular. If you encode the algebraic extension by hand, the problem does not occur (hence the encoding by hand). This is the reason why these particular calls to Singular should not use the algebraic extensions in Singular, but do the encoding themselves (this was the subject of discussion in another PR).
The restriction on the characteristic results from the sufficiently general linear coordinate change which is used (the open set of admissible choices needs to be non-empty, which might be a problem in very small characteristic). This should be stated as a warning somewhere in the docstring, but allowing characteristics like 101 or 32003 is vital for many practical applications where coefficients are known to explode in characteristic zero.
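A back-of-the-envelope illustration of the small-characteristic caveat (plain Python, not Oscar code): a "sufficiently general" choice must avoid a proper closed subset, e.g. a hyperplane. Over F_p the fraction of points of F_p^n that avoid a fixed hyperplane is 1 - 1/p, independent of n — so a random choice fails half the time for p = 2, but almost never for p = 32003.

```python
def fraction_outside_hyperplane(p):
    # F_p^n has p**n points and a hyperplane has p**(n-1) of them,
    # so the fraction avoiding it is (p**n - p**(n-1)) / p**n = 1 - 1/p.
    return 1 - 1 / p

for p in (2, 3, 101, 32003):
    print(p, fraction_outside_hyperplane(p))
```

This is why "generic" constructions that are unproblematic over large prime fields can break down in characteristic 2 or 3: the non-empty open set of good choices may contain very few (or, for several simultaneous conditions, no) rational points.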
I adjusted the output of the failing book test.
Codecov Report

Attention: Patch coverage is …

Additional details and impacted files:

@@ Coverage Diff @@
##           master    #3672      +/-   ##
==========================================
- Coverage   82.89%   81.31%   -1.59%
==========================================
  Files         577      577
  Lines       78308    78568     +260
==========================================
- Hits        64916    63886    -1030
- Misses      13392    14682    +1290
# We need to avoid creating a polynomial ring with zero variables
dim(X) == 0 && vector_space_dimension(OO(X)) == 0 && return SimplifiedAffineScheme(X, X, identity_map(X), identity_map(X), check=false)
May I suggest moving the point to 0? There is the function rational_point_coordinates.
It will push some of the complexity into the gluings ... but things like saturation should be easier. Not sure though how exactly simplify is hooked into everything.
That's actually not that easy to program. I don't think it pays off, since these schemes are already very simple and stuff like saturation will not be applied here.
I would suggest merging this and relegating further improvements to a new PR.

Good to go from my side.
This reverts commit 85594b1. The commit is made obsolete by PR oscar-system#3672 (85594b1).
This is work in progress for all the small fixes necessary to make some old scripts of @simonbrandhorst and myself run again.