Start defining the Schema type. #11
Conversation
Alright, I'm calling this "ready for review", but I don't think the code is what we'll end up merging. It still needs tests, it needs more basic, built-in types added, and it needs better docstrings on a lot of different things. I mostly just wanted to get enough down to give the basic idea. I'd like to get some reviews on this direction before I put too much time or energy into polishing it. Is this something y'all are comfortable with? Do you like the way this is shaping up?

As an example, here's a schema I think a provider may write:

```go
mySchema := tf.Schema{
	Attributes: map[string]tf.Attribute{
		"my_string": {
			Type:      types.String{},
			Required:  true,
			Sensitive: true,
		},
		"my_list_of_strings": {
			Type:               types.List{ElemType: types.String{}},
			Optional:           true,
			Computed:           true,
			DeprecationMessage: "my_list_of_strings is deprecated and will be removed in the next major version. Please use my_nested_attributes instead.",
		},
		"my_nested_attributes": {
			Attributes: map[string]tf.Attribute{
				"my_nested_string": {
					Type:     types.String{},
					Optional: true,
					Computed: true,
				},
			},
			AttributesNestingMode: tf.NestingModeList,
			Optional:              true,
			Computed:              true,
		},
	},
}
```

This overlaps a bit with the work that @kmoe is doing, so I'd love early feedback on it. My goal was mostly "how can I support user-defined types like Timestamp or ARN or CIDR or whatever?" and I kinda fell backwards into "how can we expose data to users in a non-frustrating way?" I think there is a downside here in that we would probably still need users to assert the type of the …

Finally, this PR makes the aggressive choice of … Interested in y'all's thoughts on that, too.
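To make the "user-defined types like Timestamp" goal concrete, here is a minimal sketch of what such a type could look like. The `AttrValue` interface and the `FromString` method here are illustrative assumptions, not the PR's actual API; the real framework interfaces have more methods and convert from wire values rather than raw strings.

```go
package main

import (
	"fmt"
	"time"
)

// AttrValue is a hypothetical stand-in for the framework's AttributeValue
// interface; the real interface in the PR has more methods.
type AttrValue interface {
	Equal(AttrValue) bool
}

// Timestamp is an example user-defined type: a string attribute that
// providers can work with as a time.Time instead of a raw string.
type Timestamp struct {
	Value time.Time
}

// FromString validates and parses the wire value, the kind of conversion
// a user-defined type would perform when reading config or state.
func (t *Timestamp) FromString(s string) error {
	parsed, err := time.Parse(time.RFC3339, s)
	if err != nil {
		return fmt.Errorf("expected RFC 3339 timestamp: %w", err)
	}
	t.Value = parsed
	return nil
}

// Equal compares two Timestamps by their parsed time values.
func (t Timestamp) Equal(other AttrValue) bool {
	o, ok := other.(Timestamp)
	return ok && t.Value.Equal(o.Value)
}

func main() {
	var ts Timestamp
	if err := ts.FromString("2021-04-01T12:00:00Z"); err != nil {
		panic(err)
	}
	fmt.Println(ts.Value.Year()) // 2021
}
```

The point of the design is that validation and conversion live on the type itself, so `Timestamp`, `ARN`, or `CIDR` need no special support from the framework.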
I like this direction. Some initial partial feedback:

```go
"my_string": {
	Type:      types.StringType{},
	Required:  true,
	Sensitive: true,
},
```

(instead of …)

If the user is calling …
Also, regarding package name, I'm in favour of …

Currently the root package …
Some initial thoughts/comments -- happy to chat about them!
Yeah, gonna need to do some package hierarchy work here. I think the only intentional design decision I made here is that I wanted types to be implemented in a separate package, to let us build out a reasonable standard library of them, and to reinforce that our built-in types weren't doing anything Special.
The `obj` implementation was slightly broken; should also cherry-pick 0c41edf.
Start building out the schema type, leaving some TODOs for things I'm still mulling over and thinking about. This also brings in the StringKind type, which is a pattern I'm not thrilled about and we should probably think about it some more. I also included the AttributeType and AttributeValue types, which I'll add a package with some builtins for in a future commit, to show off how they're used.
Add implementations of lists and strings, so we can see what an actual type may look like under this system, and to make sure that complex types still work under this system.
Try tweaking the package architecture a bit. Add more types. Make the AttributeTypes more composable. Make nested attributes and descriptions a bit less verbose to work with. Rename our root package to tfsdk.
Add an internal reflect package, with one exported function, Into. This lets us use common reflection rules to map a tftypes.Value onto a Go type. This doesn't handle unknown or null flags yet. My current thoughts are that we'll have flags for different options re: handling those situations:

* we can error if an unknown or null value is encountered
* we can treat unknown and null as empty values, flattening out some context
* we can see if there's an optional `SetKnown` and `SetNull` method on the Go type being targeted, and call it if there is, letting users opt into caring about these details if they want.

Also updated the List type to use this new package.

Committing to share, going to think a bit more on the design of the thing:

* Are we sure we want to go from tftypes.Value instead of from AttributeValue?
* Let's get null and unknown values taken care of
* teeeeeeessssts
* the godoc could honestly be much more helpful
Make it configurable whether null and unknown values can be substituted for empty values when they're not supported by the type they're being put into. Make it so types can have a SetUnknown and SetNull method that's called to handle null and unknown values. Let pointers to pointers, pointers to maps, pointers to slices, and pointers to interfaces be set to nil when there's a null value. Fix a potential panic in reflectPrimitive by making sure the value can be set before we try to set it.
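The opt-in `SetNull` idea from this commit can be sketched as follows. The `setNullable` name comes from the commit message, but the exact signature and the `assignNull` helper are assumptions for illustration.

```go
package main

import "fmt"

// setNullable is the opt-in interface: types that implement SetNull get
// told about null values; others fall back to configured behavior.
type setNullable interface {
	SetNull(bool) error
}

// NullableString is an example type that opts in to tracking nullness.
type NullableString struct {
	Null  bool
	Value string
}

func (n *NullableString) SetNull(null bool) error {
	n.Null = null
	return nil
}

// assignNull applies a null value to target: call SetNull if supported,
// otherwise either substitute an empty value or error, per the option.
func assignNull(target interface{}, allowEmpty bool) error {
	if s, ok := target.(setNullable); ok {
		return s.SetNull(true)
	}
	if allowEmpty {
		return nil // caller keeps the type's zero value
	}
	return fmt.Errorf("%T cannot represent null", target)
}

func main() {
	var ns NullableString
	if err := assignNull(&ns, false); err != nil {
		panic(err)
	}
	fmt.Println(ns.Null) // true

	var plain string
	fmt.Println(assignNull(&plain, false)) // error: *string cannot represent null
}
```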
Add test for reflection helpers. Simplify canBeNil. It used to check that a type can hold a nil value _and_ can be set, but we already have a method to check if something can be set: reflect.Value.CanSet. So we don't need to mirror that, which lets us simplify to just "is this of a type that can be set to nil?" This also fixes bugs in the canBeNil implementation, which wouldn't consider things like slices on a struct pointer to be nillable. Also, fix some bugs in setting things to zero; we were wrapping reflect.Values in another reflect.Value (oops) and we weren't checking if dereferencing the pointer would let us set it, which seems like a nice thing to do.
Rather than passing around interface{}s everywhere, convert from interface{} once and use a reflect.Value. This ensures we don't lose any information about whether a value can be set or not, because reflect.ValueOf(pointerToStruct).FieldByName("NonPointerField") can be set but reflect.ValueOf(pointerToStruct.NonPointerField) cannot be. Easiest just to use reflect.Values everywhere internally.

Numbers were panicking when trying to set them because *big.Float was a nil pointer. Special-case numbers, allowing provider devs to use any of the built-in number types, and manually set them, rather than letting tftypes.Value.As do it for us. This took a lot of work and I am in a blood feud with the math/big package and also architecture-specific int sizes (why is the size of an int accessible through the strconv package?)

Options gained an AllowRoundingNumbers option so we can decide later whether we want to round or throw an error when provider devs try to put a value in a type that it won't fit in. Rounding seems nice until you remember that Terraform hates rounding with a fiery passion that eats away at its soul. Errors seem nice until you remember my blood feud with the math/big package.

Primitives got tests. Well, strings. Same difference, really. Objects got tests. Started getting tests. Then I got distracted with numbers.

Into is now into, with Into calling into. Recursion hurts my brain. This is so we can accept an interface{} but recurse on reflect.Values.
Add some more tests for object to struct conversion. Remove a log line. Oops.

Simplify primitives by passing a concrete type to tftypes.Value.As and then setting a reflect.Value from the result. This avoids shenanigans around pointers on reflect.Value types.

Make elements of a slice always settable by adding the reflect.Value to the slice, then retrieving it by index, before recursing into it with the `into` function. This means even when we add a new reflect.Value to a slice, we can still set its value.

Add TODOs for the object tests we should probably write and the fact that we've just ignored maps entirely so far.
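The append-then-index trick for settable slice elements can be shown in isolation. `fillSlice` is a hypothetical helper, but the underlying reflect behavior is real: elements reached through `Index` on a slice are addressable, so they can be set in place.

```go
package main

import (
	"fmt"
	"reflect"
)

// fillSlice appends a zero element first, then takes Index(i), which is
// settable because it lives inside the slice's backing array, and fills
// it in; a standalone reflect.Value would not be settable like this.
func fillSlice(n int) []string {
	slice := reflect.MakeSlice(reflect.TypeOf([]string{}), 0, n)
	for i := 0; i < n; i++ {
		slice = reflect.Append(slice, reflect.Zero(slice.Type().Elem()))
		elem := slice.Index(i) // addressable and settable
		elem.SetString(fmt.Sprintf("item-%d", i))
	}
	return slice.Interface().([]string)
}

func main() {
	fmt.Println(fillSlice(3)) // [item-0 item-1 item-2]
}
```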
Add support for maps. Maps, of course, have to be an entirely unaddressable type, meaning we can't use `Set` on the values of a map. This is incredibly annoying and basically means all the code we had already written wouldn't work for map values.

That's obviously not going to work, so I changed the paradigm a bit. Instead of passing around a reflect.Value that these functions are setting values on, these functions are now _constructing_ reflect.Values of the same type as the reflect.Value they're given, filled with the data from the tftypes.Value. This means we don't need to worry about the Set semantics anymore. We'll build up a Value, then the entrypoint Into function can set that Value on whatever is being passed in.

Also, instead of using trueReflectValue everywhere to let us ignore pointers, the fact that we're constructing reflect.Values means we need to actually care about pointers and interfaces, because you can't set a string on a *string, they're different types. So we need to handle the interfaces and pointers explicitly now, which I've left TODO comments for. But hey, the tests all pass, so, that's cool.
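The unaddressability problem and the construct-then-store workaround look like this in miniature. `buildMap` is a hypothetical helper; the real code builds values from a tftypes.Value, but the `MakeMap`/`SetMapIndex` pattern is the same.

```go
package main

import (
	"fmt"
	"reflect"
)

// buildMap constructs a new map value rather than mutating entries:
// out.MapIndex(key).Set(...) would panic, because map values are not
// addressable, so whole entries are replaced via SetMapIndex instead.
func buildMap(src map[string]int) map[string]int {
	out := reflect.MakeMap(reflect.TypeOf(src))
	for k, v := range src {
		// Build the value separately, then store it whole.
		out.SetMapIndex(reflect.ValueOf(k), reflect.ValueOf(v*2))
	}
	return out.Interface().(map[string]int)
}

func main() {
	fmt.Println(buildMap(map[string]int{"a": 1})) // map[a:2]
}
```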
Add support for our setUnknownable and setNullable interfaces, as well as for the AttributeValue interface and the tftypes.ValueConverter interface.

This involved some sleight of hand. We want to have an empty version of the type implementing the interface, but that type may be a pointer, so an empty version would be nil, which leads to accidental panics. Which we don't want. Worse, it could, in theory, be a pointer to a pointer to a pointer to a pointer to a pointer to a pointer... you get the idea. So we needed a new `pointerSafeZeroValue` that would find whatever type underlies all the pointers, create a zero value of it, and then create a pointer to that zero value, matching the level of indirection.

We also stopped converting values to interfaces to call the method directly, letting us avoid nil pointer panics. It's more verbose, but using reflection lets us have zero values and just not worry about things that are nil. Much better.

Add support for pointers, dereferencing them and reflecting on the type they implement, then wrapping that in a pointer on the way back out. This also involves some sleight of hand to make the pointer settable, but with some ingenuity and some cursed convoluted code, we manage it. See comments in the reflectPointer function for more details.

Don't handle interface kinds, because there's no solid type underlying them, and I don't feel like picking one. People can set real types, it's ok.

Add tests for setUnknownable, setNullable, and AttributeValue values, as well as for maps and pointers.

Supporting AttributeValue as targets required AttributeValues to gain a SetTerraformValue method, updating the contents to match a tftypes.Value. This required our AttributeValues to become pointers. Annoying. It also required our composite AttributeValues to have AttributeTypes as the ElemType, so they could create new AttributeValues in the AttributeValue.SetTerraformValue method, if needed. This, in turn, required AttributeType to gain an Equal method, so the AttributeValue Equal method could compare AttributeTypes. Exhausting.

Also, we're actually testing that complex structs are having their properties set correctly now, which is novel and neat. (They are.)
Test support for the tftypes.ValueConverter interface.
Fix ObjectType to be an AttributeType by implementing the Equal method. Add GoDoc comments everywhere to make golint happy.
This is another attempt at allowing providers to define schemas. It's born out of the ashes of #11, which gradually grew from baking an apple pie from scratch into creating the universe. Or however that saying goes. This is intentionally limited in scope to just setting up the types for declaring schemas and the types required by that, namely our attribute interfaces. Unlike #11, it makes no attempt to use these types for anything or prove they're the right types; the work done with #11 gives me confidence that they're a worthwhile direction to pursue. I'm submitting this as a separate PR to make review easier and to optimize for mergeability, letting us get some shared types established while still taking an appropriate amount of time to review the reflection code that is in our future.