patch: first draft. #350
Conversation
This now appears to work with all six operations: add, replace, remove, copy, move, and test. Fixtures exist in the ipld/ipld repo for each.
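For orientation, here is a minimal sketch (not one of the actual fixtures) of what a patch document exercising these operations might look like when decoded with dag-json. The field names ("op", "path", "value", "from") are assumptions here, borrowed from the JSON Patch shape this work roughly follows:

```go
// Sketch only: an illustrative patch document covering the six operations,
// decoded into a plain data-model node with the dag-json codec. The field
// names mirror JSON Patch (RFC 6902) and are assumptions, not the fixtures.
package main

import (
	"fmt"
	"strings"

	"github.com/ipld/go-ipld-prime/codec/dagjson"
	"github.com/ipld/go-ipld-prime/node/basicnode"
)

func main() {
	patchDoc := `[
		{"op": "add",     "path": "/friends/-", "value": "robin"},
		{"op": "replace", "path": "/name",      "value": "Sam"},
		{"op": "remove",  "path": "/obsolete"},
		{"op": "copy",    "from": "/name",      "path": "/displayName"},
		{"op": "move",    "from": "/tmp",       "path": "/final"},
		{"op": "test",    "path": "/name",      "value": "Sam"}
	]`
	nb := basicnode.Prototype.Any.NewBuilder()
	if err := dagjson.Decode(nb, strings.NewReader(patchDoc)); err != nil {
		panic(err)
	}
	fmt.Println("parsed operations:", nb.Build().Length()) // parsed operations: 6
}
```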
Probable future work:
I'm not sure if these are worth doing immediately, or if this PR is worth landing as-is and iterating more in future PRs. (Leaning towards the latter, especially because having long-running branches in the ipld/ipld specs&meta repo becomes a bit frictional.)
Status of this: I think I call this "alpha". The current behavior appears to be correct so far. The tests we have cover key features. I'd still consider it relatively experimental and the odds of bugs relatively high -- "use at your own risk". (I want many more test fixtures probing corner cases, in the long run, before promising complete stability.) I think we may be able to merge this, with those caveats.
What are the next steps here? Can this be merged after feedback is incorporated?
TODO post merge: Update
Two questions:
@RangerMauve we have a git submodule sync problem here; both ipld/ipld and If you run I think what you probably should do is rebase both ipld/ipld#187 and then here against current master versions and get them both in sync. Then, if you replace your enum in the schema with:
it should work OK, bindnode is using the strings. Also, as an aside, now I look at this conversion code in parse.go, I reckon I can make
Also, the import
The JSON Patch specification is pretty decent; this roughly follows it.

Most of the heavy lifting is done by shelling out to the existing traversal.FocusedTransform functions. We're just gluing that to a more declarative API, which is more ready to be used via a serial API.

The performance of this system in the face of multiple patch operations on the same target is probably abysmal. This implementation is an MVP. It's likely that a smarter implementation could combine more than one operation application onto the same data build process, and thus induce a *lot* fewer copies. This also gets much trickier to implement (maybe you need some patricia trees for ordering operations? or more mutable nodes for the duration of the computation, then ideally a way to freeze them again at the end so we don't break other consistency?), so it's "future work". The goal of this changeset is probably going to be limited in scope to getting the declarative API part hammered out.

As of this commit, there are still several TODOs, regarding validation of the patch instructions during their parse, generalizing the API to support more codecs, and testing around upserts and moves, etc.
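To make that gluing concrete, here is a rough sketch (not the code in this changeset) of how a single declarative "replace" operation could be handed to traversal.FocusedTransform; a real evaluator would loop over the whole operation list, running one transform per op, which is exactly where the copy amplification described above comes from:

```go
// Sketch only: one hypothetical "replace" op at path "name", applied by
// focusing on that path and letting FocusedTransform rebuild the document
// spine around the returned node.
package main

import (
	"fmt"

	"github.com/ipld/go-ipld-prime/datamodel"
	"github.com/ipld/go-ipld-prime/fluent"
	"github.com/ipld/go-ipld-prime/node/basicnode"
	"github.com/ipld/go-ipld-prime/traversal"
)

func main() {
	// A starting document: {"name": "Sam", "age": 30}.
	doc := fluent.MustBuildMap(basicnode.Prototype.Map, 2, func(ma fluent.MapAssembler) {
		ma.AssembleEntry("name").AssignString("Sam")
		ma.AssembleEntry("age").AssignInt(30)
	})

	// The "replace" op: return the new value at the focused path.
	out, err := traversal.FocusedTransform(
		doc,
		datamodel.ParsePath("name"),
		func(_ traversal.Progress, _ datamodel.Node) (datamodel.Node, error) {
			return basicnode.NewString("Robin"), nil
		},
		false, // createParents: "replace" expects the target path to already exist
	)
	if err != nil {
		panic(err)
	}
	name, _ := out.LookupByString("name")
	s, _ := name.AsString()
	fmt.Println(s) // Robin
}
```

Each operation handled this way rebuilds the document from the target back up to the root, so n operations mean n rebuilds; that is the performance caveat noted above, and combining operations into one build pass is the smarter future work.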
I'm thinking of trying to loosely (loosely!) standardize the idea of some phases for features like this one: parse, compile, eval, report; these seem to be somewhat recurrent. (Conventionalizing function names could actually save headaches for the package author, because it removes some decisions that must be made but are not interesting. It can also help users, for obvious reasons.)
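Purely as an illustration of that idea (hypothetical names and signatures, not anything this package actually exports), the phases might surface as function names like this:

```go
// Hypothetical sketch of the parse / compile / eval / report convention.
// None of these declarations are real exports; they only illustrate the
// naming idea discussed above.
package patchsketch

import (
	"io"

	"github.com/ipld/go-ipld-prime/datamodel"
)

// Operation is a placeholder for one declarative patch instruction.
type Operation struct {
	Op   string // e.g. "add", "replace", "remove", "copy", "move", "test"
	Path datamodel.Path
	Node datamodel.Node // value payload, for ops that carry one
}

// Parse turns a serial patch document into declarative operations.
func Parse(r io.Reader) ([]Operation, error) { panic("sketch only") }

// Eval applies the operations to a document and returns the updated node.
// (A "compile" step could sit between Parse and Eval, and a "report" step
// could summarize what Eval actually changed.)
func Eval(n datamodel.Node, ops []Operation) (datamodel.Node, error) { panic("sketch only") }
```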
I've cleaned this up a bit:
I reckon this might be mergeable along with ipld/ipld#187; what do you reckon, @RangerMauve?
Thanks @rvagg! I was away last week so I'm just getting into this now. I was thinking some link traversal tests would be useful for patch before folks start building on it, but I guess that could be done in a separate PR so we can get this out the door. 🤔
Also, I'm seeing a bunch of warnings saying that there are lines not covered by tests. Is that something we should address, or is it expected?
The codecov report is new; it comes with the rebase to master, where we now have the unified-ci workflow! So we get to feel that little bit more guilty about poor test coverage. The uncovered bits are all for error cases, which are pretty annoying to cover in their entirety; and for a first pass at an experimental feature I don't think it's worth chasing those down here - at least that's not been the past practice in this repo, where coverage hasn't been a super high priority. The one case where it's not for errors is the
But no tests for it. We maybe should add one for that since we have two codepaths that are relevant.
Tests pass, merging. |