
Pickling with __reduce__() loops creates invalid pickles #15156

Open
vbraun opened this issue Sep 4, 2013 · 24 comments

Comments

@vbraun
Member

vbraun commented Sep 4, 2013

cPickle creates invalid pickles when __reduce__() loops are present, but no error message is raised; the only symptom is a corrupt pickle. For example (thanks to Jan for isolating this):

class a_class(SageObject): 
    def __init__(self, morphism):
        self._fan_morphism = morphism.fan_morphism()
        self._morphism = morphism
        self._fan_morphism.kernel_fan() # This line is crucial
        self._a1 = None
        self._a2 = None
        self._a3 = None
        self._a4 = None
        self._a5 = None
        self._a6 = None
        self._a7 = None
        self._a8 = None
        self._a9 = None
        self._a10 = None
        self._a11 = None
        self._a12 = None
        self._a13 = None
        self._a14 = None
        self._a15 = None
        self._a16 = None
        self._a17 = None
        self._a18 = None
        self._a19 = None
        self._a20 = None
        # You can add more private variables and the bug still occurs
        # but less of them will make it go away.

P1 = toric_varieties.P(1)
fm = FanMorphism(identity_matrix(1), P1.fan(), P1.fan())        
f = P1.hom(fm, toric_varieties.P(1))
Y = a_class(f)
Yp = loads(dumps(Y))

Although there is no error, the object is not correctly reconstructed:

sage: Yp._morphism.fan_morphism()
'_a20'

Using pickle instead of cPickle raises an AssertionError:

sage: import pickle
sage: pickle.dumps(Y, pickle.HIGHEST_PROTOCOL)
...

/home/vbraun/Code/sage.git/local/lib/python/pickle.pyc in memoize(self, obj)
    242         if self.fast:
    243             return
--> 244         assert id(obj) not in self.memo
    245         memo_len = len(self.memo)
    246         self.write(self.put(memo_len))

AssertionError: 

This is correct: as the objects are traversed, we eventually hit the FanMorphism object again, since it is saved by CachedMethodCallerNoArgs.__reduce__ in a circular manner.

This then leads to a corrupt pickle, as the memo keys (BINPUT/BINGET) get out of sync with the objects:

sage: sage.misc.explain_pickle.test_pickle(Y)
    0: \x80 PROTO      2
    2: c    GLOBAL     '__main__ a_class'
   20: q    BINPUT     1
   22: )    EMPTY_TUPLE
   23: \x81 NEWOBJ
...
 3906: }            EMPTY_DICT
 3907: q            BINPUT     227
 3909: (            MARK
 3910: h                BINGET     37
 3912: h                BINGET     45
 3914: h                BINGET     59
 3916: h                BINGET     45
 3918: h                BINGET     197
 3920: N                NONE
 3921: u                SETITEMS   (MARK at 3909)
 3922: t            TUPLE      (MARK at 147)
 3923: R        REDUCE
 3924: q        BINPUT     228
 3926: U        SHORT_BINSTRING '_a20'
 3932: q        BINPUT     228
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-15-52c979f92d15> in <module>()
----> 1 sage.misc.explain_pickle.test_pickle(Y)

/home/vbraun/Code/sage.git/local/lib/python2.7/site-packages/sage/misc/explain_pickle.pyc in test_pickle(p, verbose_eval, pedantic, args)
   2583         p = dumps(p, compress=False)
   2584 
-> 2585     pickletools.dis(p)
   2586 
   2587     current = explain_pickle(p, compress=False, in_current_sage=True, pedantic=pedantic, preparse=False)

/home/vbraun/Code/sage.git/local/lib/python/pickletools.pyc in dis(pickle, out, memo, indentlevel)
   2007             # Note that we delayed complaining until the offending opcode
   2008             # was printed.
-> 2009             raise ValueError(errormsg)
   2010 
   2011         # Emulate the stack effects.

ValueError: memo key 228 already defined

This is a known upstream bug, see http://bugs.python.org/issue1062277

Depends on #15692

Component: misc

Author: Simon King, Jan Keitel, Andrey Novoseltsev, Nils Bruin

Branch/Commit: u/saraedum/ticket/15156 @ 82ba456

Issue created by migration from https://trac.sagemath.org/ticket/15156

@vbraun vbraun added this to the sage-6.1 milestone Sep 4, 2013
@vbraun
Member Author

vbraun commented Sep 4, 2013

comment:1

I've posted on https://groups.google.com/d/msg/sage-devel/8EX3H7UKHcU/Z6MLdQCBCVUJ to make more people aware of this issue.

@nbruin
Contributor

nbruin commented Sep 9, 2013

comment:2

I think the circularity in this example arises from:

sage: cf=fm.kernel_fan
sage: cf in cf.__reduce_ex__()[1][0].__reduce_ex__()[1][2].values()
True	
sage: cf
Cached version of <function kernel_fan at 0x5ad0cf8>

Note that by selecting __reduce_ex__()[1], we obtain the arguments required upon construction, so cycles here are problematic. In particular, we see that cf is required to construct cf.

The problem here is in /usr/local/sage/5.7/devel/sage/sage/misc/cachefunc.pyx:

def __reduce__(self):
    return CachedMethodPickle,(self._instance,self.__name__,self.cache)

As you see, the cache is fed into the construction tuple, but the cache can (and in this case does) easily lead to circularities. If the cache is to be pickled at all, it should be reinstated via a __setstate__ method.
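A minimal sketch of that fix, with toy classes (the names CachedCaller/Thing are hypothetical, not Sage's actual CachedMethodCallerNoArgs): keep the construction tuple acyclic and ship the cache as state, so the pickler can close the cycle during the BUILD/__setstate__ phase:

```python
import pickle

class CachedCaller(object):
    """Toy stand-in for a cached method caller (hypothetical names)."""
    def __init__(self, instance, name):
        self._instance = instance
        self._name = name
        self.cache = None

    def __reduce__(self):
        # Only acyclic data goes into the construction tuple; the cache,
        # which may refer back to self._instance, travels as state and
        # is reinstated by __setstate__ after the instance exists.
        return CachedCaller, (self._instance, self._name), self.cache

    def __setstate__(self, cache):
        self.cache = cache

class Thing(object):
    def __init__(self):
        self.kernel_fan = CachedCaller(self, 'kernel_fan')

t = Thing()
t.kernel_fan.cache = t  # a cycle through the cache, as in the fan example
t2 = pickle.loads(pickle.dumps(t, 2))
print(t2.kernel_fan.cache is t2)  # True: the cycle is reconstructed
```

With the cache in the construction tuple instead, the same roundtrip would hit exactly the circularity described above.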

@nbruin
Contributor

nbruin commented Sep 9, 2013

comment:3

Unfortunately, this is not the only source of circularity. There is also fm.__reduce__ in
/usr/local/sage/5.7/devel/sage/sage/categories/map.pyx (starting at line 234):

def __reduce__(self):
    if HAS_DICTIONARY(self):
        _dict = self.__dict__
    else:
        _dict = {}
    return unpickle_map, (self.__class__, self._parent, _dict, self._extra_slots({}))

which indiscriminately puts a whole dictionary into the construction parameters; in this case the dictionary contains the cached form of kernel_fan, which refers back to the object itself.

Basically, a dict can be modified (and here really IS modified, for caching purposes!), so it should never be included in construction parameters. If specific parameters are needed upon construction, they need to be extracted separately.

@nbruin
Contributor

nbruin commented Sep 9, 2013

Attachment: trac15156_preliminary.patch.gz

preliminary step towards more setstate

@nbruin
Contributor

nbruin commented Sep 9, 2013

comment:4

With these changes in place, it seems that the example does pickle and unpickle OK.

The real problem here is that cPickle doesn't raise an error on these circularities. Given the definition of the protocol, it really seems this shouldn't work: circular references need to be put in place by __setstate__.

@nbruin
Contributor

nbruin commented Sep 9, 2013

comment:5

We should probably check:

sage/categories/functor.pyx:        return _Functor_unpickle, (self.__class__, self.__dict__.items(), self.__domain, self.__codomain)

sage/matrix/matrix0.pyx:        return unpickle, (self.__class__, self._parent, self._is_immutable,
sage/matrix/matrix0.pyx-                                          self._cache, data, version)

sage/structure/factory.pyx:        return generic_factory_unpickle, obj._factory_data

sage/symbolic/function.pyx:        return (unpickle_function, (name, nargs, latex_name, conversions,
sage/symbolic/function.pyx-            evalf_params_first, pickled_functions))

This list is not complete; it was obtained by grepping for 'return .*pickle', assuming that most __reduce__ methods in Sage follow the same pattern.

@sagetrac-vbraun-spam sagetrac-vbraun-spam mannequin modified the milestones: sage-6.1, sage-6.2 Jan 30, 2014
@sagetrac-vbraun-spam sagetrac-vbraun-spam mannequin modified the milestones: sage-6.2, sage-6.3 May 6, 2014
@saraedum
Member

comment:8

pickle.dumps seems to detect such invalid references. I added a check to _test_pickling and it detects the following doctest failures:

sage -t src/sage/geometry/fan_morphism.py  # 1 doctest failed
sage -t src/sage/combinat/affine_permutation.py  # 3 doctests failed
sage -t src/sage/combinat/root_system/ambient_space.py  # 1 doctest failed
sage -t src/sage/modular/abvar/homspace.py  # 1 doctest failed
sage -t src/sage/combinat/root_system/type_affine.py  # 1 doctest failed
sage -t src/sage/tests/cmdline.py  # 3 doctests failed
sage -t src/sage/combinat/root_system/weight_space.py  # 2 doctests failed
sage -t src/sage/combinat/root_system/root_space.py  # 1 doctest failed
sage -t src/sage/modules/free_module.py  # 5 doctests failed
sage -t src/sage/combinat/rigged_configurations/rigged_configurations.py  # 3 doctests failed
sage -t src/sage/combinat/alternating_sign_matrix.py  # 1 doctest failed
sage -t src/sage/combinat/gelfand_tsetlin_patterns.py  # 6 doctests failed
sage -t src/sage/combinat/rigged_configurations/rigged_configuration_element.py  # 2 doctests failed
sage -t src/sage/quivers/homspace.py  # 1 doctest failed
sage -t src/sage/combinat/integer_list.py  # 1 doctest failed
sage -t src/sage/algebras/finite_dimensional_algebras/finite_dimensional_algebra_morphism.py  # 1 doctest failed
sage -t src/sage/combinat/root_system/cartan_type.py  # 1 doctest failed

I pushed one commit which contains an example of how I think the problem can be fixed.

Is this how this problem could be addressed? What do you think?

@saraedum
Member

Branch: u/saraedum/ticket/15156

@saraedum
Member

comment:10

I found the following snippet useful for finding out which object causes the cyclic reference. It traverses the graph that __reduce__ creates and looks for cycles. It is certainly not complete, so I do not want to include it in Sage in this form.

# `o` is the object to inspect for cycles in its __reduce__ graph
graph = {}
decode_graph = {}
queue = [o]
while queue:
    o = queue.pop()
    if id(o) in graph:
        continue
    decode_graph[id(o)] = o 
    if isinstance(o, (list, tuple)):
        reduction = (None, list(o))
    elif isinstance(o, dict):
        reduction = (None, list(o.keys()) + list(o.values()))
    elif isinstance(o, (str, int, float)):
        reduction = (None, [])
    else:
        try:
            reduction = o.__reduce_ex__(2)
        except TypeError:
            reduction = (None, [])
    args = reduction[1]
    graph[id(o)] = [id(a) for a in args]
    queue.extend(args)    
    if len(reduction) >= 3:
        queue.append(reduction[2])
    if len(reduction) >= 4:
        queue.append(reduction[3])
    if len(reduction) >= 5:
        queue.append(reduction[4])
from sage.graphs.digraph import DiGraph
graph = DiGraph(graph)
for cycle in graph.all_simple_cycles():
    print [decode_graph[i] for i in cycle]

New commits:

98d17b4: Check for invalid pickles in _test_pickling()
9e449c3: Fix a circular reference in PermutationGroupElement.__reduce__()

@saraedum
Member

Commit: 9e449c3

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Jun 21, 2014

Changed commit from 9e449c3 to 73dc5c9

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Jun 21, 2014

Branch pushed to git repo; I updated commit sha1. New commits:

ce27b84: Removed sage.misc.cachefunc.ClearCacheOnPickle
94fcddb: Merge branch 'u/saraedum/ticket/16337' of git://trac.sagemath.org/sage into ticket/15692
72fce8b: Added a pickle parameter for @cached_method
0717849: Enable pickling of the cache for groebner_basis()
c0f4713: Merge branch 'u/saraedum/ticket/15692' of git://trac.sagemath.org/sage into ticket/15156
73dc5c9: Fixed the circular references of pickles of cached methods.

@saraedum
Member

Dependencies: #15692

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Jun 21, 2014

Changed commit from 73dc5c9 to 82ba456

@sagetrac-git
Mannequin

sagetrac-git mannequin commented Jun 21, 2014

Branch pushed to git repo; I updated commit sha1. New commits:

82ba456: Fixed pickling of Map

@saraedum
Member

comment:14

cPickle gets things wrong if any circularity runs through the construction arguments returned by __reduce__. Consider the following code, which creates two objects that reference each other:

class A(object):
    def __reduce__(self):
        return A,(),{'b':self.b}     
class B(object):
    def __init__(self,a):
        self.a = a
    def __reduce__(self):
        return B,(self.a,) 
b=B(A())
b.a.b = b

With cPickle (pickle raises an AssertionError instead):

sage: b.a.b is b
True
sage: b = loads(dumps(b))
sage: b.a.b is b
False

This is in particular a problem with Homsets. The __reduce__ of Hom does:

return Hom, (self._domain, self._codomain, self.__category, False)

Often the domain or codomain references back to the homset somehow. I have no idea how to fix this. The domain is really needed here because it determines the class of the result and is used for a lookup in a cache. Any ideas?

@nbruin
Contributor

nbruin commented Jun 21, 2014

comment:15

Replying to @saraedum:

cPickle gets things wrong if any circularity is created in the first parameter returned by __reduce__.

I think that is more appropriately phrased as "the pickle protocol is not defined when construction parameters have circularities". You'd basically be asking for a __new__ call to have an argument that includes the to-be-instantiated object. The pickle protocol does have a way of introducing circularities: via the __getstate__ and __setstate__ routines (i.e., the state element of the __reduce__ return value).

Consider the following code, which creates two objects which reference each other:

class A(object):
    def __reduce__(self):
        return A,(),{'b':self.b}     
class B(object):
    def __init__(self,a):
        self.a = a
    def __reduce__(self):
        return B,(self.a,) 
b=B(A())
b.a.b = b

Note that the circular reference wasn't put there in the corresponding __new__ call. It was a later change of state that caused the circularity (this is always necessarily the case). So the circularity should be recreated during the __setstate__ phase of unpickling.

This is in particular a problem with Homsets. The __reduce__ of Hom does:

return Hom, (self._domain, self._codomain, self.__category, False)

Often the domain or codomain reference back to the Homset somehow.

Domain and codomain cannot possibly refer to the homset in their __new__ call. This is state introduced later, so if domain and codomain want to hold references to homsets, they should install them in the __setstate__ phase.

I have no idea how to fix this. The domain is really needed here because it determines the class of the result and it is used for a lookup in a cache. Any ideas?

Yes, domain and codomain are really part of a homset, so it's not unreasonable to make them part of the __new__ parameters. If circularities arise here, then I'd say the domain and codomain are at fault for referencing a homset in their instantiation phase. They can't possibly have done that when they were originally created either!

We've seen problems before where __setstate__ values (e.g., Python class attributes stored in __dict__ and pickled by default) were needed for hash and equality, and the circularities arose on keys in to-be-reconstructed dictionaries. Then __hash__ can be requested on an object that hasn't completed its __setstate__ yet, and __hash__ fails. However, I can't see how homsets could affect the hash of domain and codomain, so that problem shouldn't arise here.

@saraedum
Member

comment:16

Replying to @nbruin:

Replying to @saraedum:

Consider the following code, which creates two objects which reference each other:
[...]

Note that the circular reference wasn't put there in the corresponding __new__ call. It was a later change of state that caused the circularity (this is always necessarily the case). So the circularity should be recreated during the __setstate__ phase of unpickling.

Right. But this is hard to do automatically.

I have no idea how to fix this. The domain is really needed here because it determines the class of the result and it is used for a lookup in a cache. Any ideas?

Yes, domain and codomain are really part of a homset, so it's not unreasonable to make them part of the __new__ parameters. If circularities arise here, then I'd say the domain and codomain are at fault referencing a homset in their instantiation phase.

I'm not sure what you mean by "instantiation phase".

They can't possibly have done that when they were originally created either!

Sure. But when pickling you do not only want to restore the initial state of an object but also to some extent the current state.

Here is an idea for how to solve the problem with restoring the state. Say we have Hom(A,A) and A somehow circles back to Hom(A,A) through its state (e.g. its __dict__). The problem is that to unpickle Hom(A,A), A is created, A's state is restored (which fails), and then Hom(A,A) is created. What if Hom(A,A) delayed restoring the state of A until Hom(A,A) itself exists?
I implemented a sketch of this idea and it at least solves this problem for the cases that I had a look at. I changed Hom's __reduce__ to

args = (DelayedStatePickle(self._domain), DelayedStatePickle(self._codomain), self.__category, False)
states = [a.inner_getstate() if isinstance(a,DelayedStatePickle) else None for a in args]
return unpickle_from_DelayedStatePickles, (Hom, args), states

and make __setstate__ set the state of domain and codomain:

        for a,s in zip(self._args_delayed, states):
            if s is not None:
                a.__setstate__(s)

With

def unpickle_from_DelayedStatePickles(callable, args):
    ret = callable(*args)
    ret._args_delayed = args
    return ret

class DelayedStatePickle(object):
    def __init__(self, obj):
        self.obj = obj

    def inner_getstate(self):
        reduction = self.obj.__reduce__()
        assert all([r is None for r in reduction[3:]])
        self.callable, self.args, self.state = reduction[:3]
        return self.state

    def __reduce__(self):
        return self.callable, self.args

There is of course lots of space for improvement, but what do you think about this idea?

@nbruin
Contributor

nbruin commented Jun 22, 2014

comment:17

Replying to @saraedum:

I'm not sure what you mean by "instantiation phase".

Unpickling an object happens in two phases: first, the object is instantiated using the first set of parameters given by __reduce__. There can't be circularities here, because that would mean passing an object that hasn't been instantiated yet to the routine that does the instantiating.

The second phase is done by __setstate__ and allows further modification of the state to obtain the desired one. Here, circularities are not a problem. The pickler carefully ensures that all objects involved in circularities have been instantiated (how would you pass them otherwise?)
The trick here is that during __setstate__ you may encounter objects that haven't had a chance to run their own __setstate__ yet.

Pure Python classes pickle great. They can have all kinds of circularities, but these are all stored in __dict__, which is restored during __setstate__. The catch is that if you have a custom __hash__ that depends on attributes stored in __dict__, you may have trouble.
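A quick illustration of that point: with the default reduction, the cycle lives in the instance __dict__, which is shipped as state and restored during the __setstate__ phase, so the roundtrip works:

```python
import pickle

class Node(object):
    pass  # default pickling: the instance __dict__ travels as state

n = Node()
n.me = n  # self-cycle through __dict__
n2 = pickle.loads(pickle.dumps(n, 2))
print(n2.me is n2)  # True: the cycle survives the roundtrip
```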

There is of course lots of space for improvement, but what do you think about this idea?

I think it's a bad idea, because it doesn't solve the problem where it needs to be solved. Homsets fundamentally need domain and codomain for their construction; they are part of their hash. If a parent wants to include a homset in its construction parameters rather than in its __setstate__ data, then it is introducing an unavoidable cycle. It is trying to pickle itself in a way that can never work, and it's not the responsibility of the homset pickler to hide that error.

If the domain and codomain want to store a homset involving them somewhere, then their unpickling should do so via __setstate__, i.e., their __reduce__ should include the homset in the state element of its return value, not in the construction arguments. All pickling should put only the absolute minimum in the construction arguments (basically, the parameters that can be passed to __new__). That's what Python class pickling does too: the entire instance __dict__ goes into __setstate__ for exactly that reason.
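A toy version of that recommendation (the Homset/Space classes here are hypothetical stand-ins, not Sage's): the homset keeps domain and codomain in its construction arguments, while the parent ships its cached homset as state:

```python
import pickle

class Homset(object):
    """Toy homset: domain/codomain are genuine construction arguments."""
    def __init__(self, domain, codomain):
        self.domain = domain
        self.codomain = codomain
    def __reduce__(self):
        return Homset, (self.domain, self.codomain)

class Space(object):
    """Toy parent: its cached homset goes into the state, not the args."""
    def __init__(self):
        self.homset = None
    def __reduce__(self):
        return Space, (), {'homset': self.homset}
    def __setstate__(self, state):
        self.__dict__.update(state)

P = Space()
P.homset = Homset(P, P)
P2 = pickle.loads(pickle.dumps(P, 2))
print(P2.homset.domain is P2)  # True: the cycle is rebuilt via state
```

Pickling the parent P round-trips cleanly; pickling the homset itself still enters the cycle through P's state, which is the unresolved case discussed in this thread.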

Circularities in __setstate__ data are not a problem. Do you actually have a concrete example where you have observed this causing problems? I would expect that most parents would already do the right thing, because you'd have to work pretty hard (e.g., write a custom __reduce__) to mess this up.

@saraedum
Member

comment:18

Replying to @nbruin:

Replying to @saraedum:

I'm not sure what you mean by "instantiation phase".

Unpickling an object happens in two phases: first, the object is instantiated using the first set of parameters given by __reduce__. There can't be circularities here, because that would mean passing an object that hasn't been instantiated yet to the routine that does the instantiating.

The second phase is done by __setstate__ and allows further modification of the state to obtain the desired one. Here, circularities are not a problem. The pickler carefully ensures that all objects involved in circularities have been instantiated (how would you pass them otherwise?)
The trick here is that during __setstate__ you may encounter objects that haven't had a chance to run their own __setstate__ yet.

I don't think that's really what happens. Maybe I misunderstand what you are saying, but I don't think that all objects in the pickle are instantiated first and then all objects' __setstate__ methods are called.
Have a look at the example I posted above. There, to unpickle b, a is created, its __setstate__ is called (which fails), and only then b is created and its __setstate__ is called.

There is of course lots of space for improvement, but what do you think about this idea?

I think it's a bad idea because it doesn't solve the problem where it needs to be solved. Homsets fundamentally need domain and codomain for their construction. It's part of their hash. If a parent wants to include a homset in its construction parameter rather than in the __setstate__ then it is introducing an unavoidable cycle.

This is not what happens. Here is a real-world example:

sage: class A(Parent):
    def __reduce__(self):
        return A,(),self.homset
    def __setstate__(self, homset):
        self.homset = homset
....: 
sage: P=A()
sage: H=Hom(P,P)
sage: P.homset = H
sage: import pickle
sage: pickle.dumps(H)
AssertionError

If the domain and codomain want to store a homset involving them somewhere, then their unpickling should do so via __setstate__, i.e., their __reduce__ should include the homset in the second return value, not the first.

That's what happens above and does not work. I'm also confused because I always assumed that it would be enough to break the circularity on the first parameter in a single place. Somehow this does not seem to work.

@nbruin
Contributor

nbruin commented Jun 22, 2014

comment:19

Replying to @saraedum:

I don't think that's really what happens. Maybe I misunderstand what you are saying but I don't think that all objects in the pickle are instantiated and then all objects' __setstate__ is called.

Not all objects: just the objects that are involved in making the relevant __setstate__ call. And of course this has to happen; otherwise there are no objects to pass.

Have a look at the example I posted above. There, to unpickle b, a is created, its __setstate__ is called (which fails), and only then b is created and its __setstate__ is called.

There is of course lots of space for improvement, but what do you think about this idea?

I think it's a bad idea because it doesn't solve the problem where it needs to be solved. Homsets fundamentally need domain and codomain for their construction. It's part of their hash. If a parent wants to include a homset in its construction parameter rather than in the __setstate__ then it is introducing an unavoidable cycle.

This is not what happens. Here is a real-world example:

...
sage: pickle.dumps(H)
AssertionError

That is indeed puzzling. Would Python's pickle be unnecessarily strict compared to cPickle? It seems to me that cPickle produces a perfectly workable pickle from this:

sage: p=dumps(H)
sage: explain_pickle(p)
pg_Hom = unpickle_global('sage.categories.homset', 'Hom')
pg_A = unpickle_global('__main__', 'A')
pg = unpickle_instantiate(pg_A, ())
pg_unreduce = unpickle_global('sage.structure.unique_representation', 'unreduce')
pg_Sets = unpickle_global('sage.categories.sets_cat', 'Sets')
si = pg_unreduce(pg_Sets, (), {})
unpickle_build(pg, pg_Hom(pg, pg, si, False))
pg_Hom(pg, pg, si, False)

Also for your earlier example, it looks like the pickle produced by dumps should be just fine:

sage: explain_pickle(dumps(b))
pg_B = unpickle_global('__main__', 'B')
pg_A = unpickle_global('__main__', 'A')
pg = unpickle_instantiate(pg_A, ())
unpickle_build(pg, {'b':pg_B(pg)})
pg_B(pg)
sage: explain_pickle(dumps(b.a))
pg_A = unpickle_global('__main__', 'A')
pg = unpickle_instantiate(pg_A, ())
pg_B = unpickle_global('__main__', 'B')
unpickle_build(pg, {'b':pg_B(pg)})
pg

That's what happens above and does not work. I'm also confused because I always assumed that it would be enough to break the circularity on the first parameter in a single place. Somehow this does not seem to work.

Well, it does not work in the sense that pickle.dumps raises an AssertionError. Since that defaults to protocol 0, while Sage has already settled on protocol 2, it's not an entirely fair test. However, specifying protocol=2 doesn't help, so it's really a problem in pickle and possibly also in cPickle. On the other hand, the code produced by explain_pickle seems sane, and it is interpreted directly from the produced pickle, so it could well be that cPickle does perform as we hope.

I did a quick search on whether this is a known problem, but I didn't find compelling evidence. The closest was this discussion.

@saraedum
Member

comment:20

Replying to @nbruin:

Replying to @saraedum:
That is indeed puzzling. Would python pickle be unnecessarily strict compared to cPickle? It seems to me that cPickle produces a perfectly workable pickle from this:
[...]
Also for your earlier example, it looks like the pickle produced by dumps should be just fine:
[...]
Well -- it does not work in the sense that pickle.dumps raises an AssertionError. Since that defaults to protocol 0, while sage has already settled on protocol 2, it's not an entirely fair test. However, specifying protocol=2 doesn't help, so it's really a problem in pickle and possibly also in cPickle. However, the code produced by explain_pickle seems sane and that is directly interpreted from the produced pickle, so it could well be that cPickle does perform as we hope.

Thanks for pointing this out. I just assumed that the AssertionError was pointing at an invalid pickle but you are right, my example actually works with cPickle. I'll see if this explains the errors I've seen while working on this ticket.

@sagetrac-vbraun-spam sagetrac-vbraun-spam mannequin modified the milestones: sage-6.3, sage-6.4 Aug 10, 2014
@fchapoton
Contributor

Changed author from SimonKing, jkeitel, novoselt, nbruin to SimonKing, Jan Keitel, Andrey Novoseltsev, Nils Bruin

@fchapoton
Contributor

Changed author from SimonKing, Jan Keitel, Andrey Novoseltsev, Nils Bruin to Simon King, Jan Keitel, Andrey Novoseltsev, Nils Bruin

@mkoeppe mkoeppe removed this from the sage-6.4 milestone Dec 29, 2022