Doesn't serialize custom types in list/dict #28
Yep, definitely agree about the error message. Pros are:
Cons are:
An alternative is to simply convert everything to a json-like thing and just store it in a single column. In addition:
Still not sure about this, what do you think?
Yeah, the only thing I'm unsure of here is whether recursive NT/classes would be converted back? I've had a few cases where I have a list of very basic namedtuples and it converts back to a list of lists (the namedtuple gets converted to a tuple, and since JSON doesn't have the concept of a tuple I guess it's just stored as a list) instead of a list of namedtuples - here are those definitions

Seems like there isn't a ton of benefit to using a sqlite database then, though? Since it's just storing JSON blobs with no type information, it's just the sqlalchemy machinery you're using to convert to NTs and back. Though I don't know if mapping onto the columns is slower than doing the conversion from a dict back to the namedtuple; that will just have to be tested...

I do this manually and with a lot more error checking here, by inspecting the signature of the namedtuple, but I assume the sqlalchemy bindings are better for this
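(For illustration, a tiny round trip showing the behaviour described above - the class here is made up, not one of the linked definitions:)

```python
import json
from typing import List, NamedTuple


class Point(NamedTuple):
    x: int
    y: int


points: List[Point] = [Point(1, 2), Point(3, 4)]

# NamedTuples are tuples, and JSON has no tuple type, so they get written out as arrays...
blob = json.dumps(points)
print(blob)  # [[1, 2], [3, 4]]

# ...and on the way back there is no type information left, so we get plain lists
restored = json.loads(blob)
print(type(restored[0]))  # <class 'list'>, not Point
```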
As long as we can rely on schemas (i.e. the fully annotated type) being passed both on encoding and decoding, this should work? It's not different from laying it out over sqlite columns, e.g.
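(A rough sketch of what relying on the fully annotated type during decoding could look like - just an illustration with invented classes, not cachew's actual machinery:)

```python
import json
from typing import List, NamedTuple, get_args, get_origin, get_type_hints


class Inner(NamedTuple):
    a: int
    b: str


class Outer(NamedTuple):
    name: str
    items: List[Inner]


def from_json(tp, value):
    """Rebuild an instance of `tp` from plain json data, using only the type annotations."""
    if isinstance(tp, type) and issubclass(tp, tuple) and hasattr(tp, '_fields'):
        # NamedTuple: decode each field according to its annotation
        hints = get_type_hints(tp)
        return tp(*(from_json(hints[f], v) for f, v in zip(tp._fields, value)))
    if get_origin(tp) is list:
        # List[X]: decode each element as X
        (item_tp,) = get_args(tp)
        return [from_json(item_tp, v) for v in value]
    return value  # primitives pass through unchanged


blob = json.dumps(Outer('demo', [Inner(1, 'x'), Inner(2, 'y')]))
restored = from_json(Outer, json.loads(blob))
print(restored)
# Outer(name='demo', items=[Inner(a=1, b='x'), Inner(a=2, b='y')])
```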
Yeah, not sure either
Yeah, good question. And it's not in the FAQ, so I wonder if I seriously considered it :) Let's think... Cons of having a JSONL file over sqlite with a single-column json blob:
Pros:
?
Hmm.
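(For concreteness, a minimal sketch of the JSONL side of this comparison - the file name and record shape are purely illustrative:)

```python
import json
from pathlib import Path

# hypothetical cache file, one json object per line
CACHE = Path('cache.jsonl')


def append_items(items):
    # appending new items is a plain file append; no schema or transactions involved
    with CACHE.open('a') as f:
        for item in items:
            f.write(json.dumps(item) + '\n')


def iter_items():
    # reading back is a sequential scan over the whole file
    with CACHE.open() as f:
        for line in f:
            yield json.loads(line)


append_items([{'id': 1, 'value': 'a'}, {'id': 2, 'value': 'b'}])
print(list(iter_items()))  # [{'id': 1, 'value': 'a'}, {'id': 2, 'value': 'b'}]
```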
Somewhat related https://github.com/jsonpickle/jsonpickle
Wanted to simplify cachew a bit by potentially switching to third party serialization -- tried a few options, but decided to optimize the one we have instead, and as an experiment tried just serializing as json in a single column (instead of flattening it over multiple columns like we do currently).

And it actually seems quite a bit faster (and the code is much easier to read!); not sure why I settled on the flattening back then, perhaps I wasn't benchmarking properly.
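(A minimal sketch of the single-column approach described above, using the stdlib sqlite3 module - table and column names are invented, this isn't the code from the experiment branch:)

```python
import json
import sqlite3
from typing import List, NamedTuple


class Item(NamedTuple):
    name: str
    tags: List[str]


conn = sqlite3.connect(':memory:')
# instead of one column per NamedTuple field, the whole object goes
# into a single TEXT column as a json blob
conn.execute('CREATE TABLE cache (data TEXT)')

items = [Item('foo', ['a', 'b']), Item('bar', [])]
conn.executemany(
    'INSERT INTO cache (data) VALUES (?)',
    [(json.dumps(item._asdict()),) for item in items],
)

restored = [Item(**json.loads(row[0])) for row in conn.execute('SELECT data FROM cache')]
print(restored)  # [Item(name='foo', tags=['a', 'b']), Item(name='bar', tags=[])]
```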
We won't necessarily lose on individual column mappings either: it's possible to define virtual columns for accessing fields inside the JSON, like described here https://antonz.org/json-virtual-columns/

Using a text file with json (

For now it's just an isolated experiment here https://github.com/karlicoss/cachew/tree/nosql, but I'll transplant it to cachew soon
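(And a sketch of the virtual-column idea from the linked article, again via sqlite3 - this assumes a reasonably recent SQLite with generated-column and JSON support, and the names are again invented:)

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE cache (data TEXT)')

# a generated ("virtual") column that pulls a field out of the json blob,
# so it can be selected and indexed like a regular column
conn.execute("""
    ALTER TABLE cache
    ADD COLUMN name TEXT
    AS (json_extract(data, '$.name'))
""")
conn.execute('CREATE INDEX idx_cache_name ON cache (name)')

conn.execute(
    'INSERT INTO cache (data) VALUES (?)',
    (json.dumps({'name': 'foo', 'tags': ['a']}),),
)
print(conn.execute('SELECT name FROM cache').fetchall())  # [('foo',)]
```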
Wrote up a comparison here, might be interesting for other projects as well: https://github.com/karlicoss/cachew/blob/master/doc/serialization.org
Should be fixed in the latest PyPI version!
Only recently realized that I could put `Dict`/`List` items as values on a NamedTuple cached by cachew; previously I was doing some weird stuff.

So, in the process of switching more things to support cachew, I ran into an issue here, when one of the items in the `List` being serialized by cachew couldn't be converted to json, which generates quite the error:
I'm not expecting this to be fixed, as I understand the problems with serializing/deserializing from JSON; I was just stuck here for a while, since the readme says it should work for `list`s and `datetime`, so I wasn't sure why there was an error (it's being thrown here).

Makes sense that it's using JSON, as otherwise you would have to maintain an intersection table and map ids from another table onto list/dict items, which sounds like a pain.
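(For reference, a minimal sketch of the kind of setup described above - the class names are invented, and the bare `@cachew` decorator usage is an assumption based on the readme:)

```python
from typing import Iterator, List, NamedTuple

from cachew import cachew


class Custom:
    # a plain class -- not a NamedTuple, dataclass or primitive,
    # so there's no obvious way to turn it into json
    def __init__(self, value: str) -> None:
        self.value = value


class Entry(NamedTuple):
    name: str
    extras: List[Custom]  # custom type inside a container -- the case from this issue


@cachew  # exact decorator arguments (e.g. an explicit cache path) omitted here
def entries() -> Iterator[Entry]:
    yield Entry('example', [Custom('not json-serializable')])


# iterating forces serialization into the cache, which is where the cryptic error surfaces
list(entries())
```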
Anyways, as far as changes go, a better warning message could be raised and/or the documentation could be updated to better reflect this pitfall, so that no one else runs into the cryptic error in the future?