Proposal: Bounded Field Functions / Premodifiers #696
We decided we are probably better off, for now, developing a less versatile version of this independently without attempting to upstream it. But do let us know if the idea seems reasonable for upstreaming; maybe at some point we will be in a better place to do that.
Depends, you can generate and save on the fly like Minecraft does. It just takes more space, but nowadays space is relatively cheap.
Hmmm I don't feel very good seeing this coming ^^" Modifiers haven't been on my maintenance list for a while, and I have no plans for them.
Arghhh
They can't do that because they were not meant to (and they were more a proof of concept than anything else). If you want a generation system that mixes up modifiers and global stuff, you pretty much need to implement your own generator, which would then have a collection of your own kind of modifiers.
Nope... those are still modifiers. At least if they were integrated into the module... there are no pre/post "types" of modifiers, only one, with an "order of application" or "layers". In short: everything is modifiers. "Generators" just happen to pre-date modifiers and are global, while modifiers arrived later as an experiment to apply on top. But the evolution of such a system would pretty much be "everything is a modifier with either local or global bounds", including generators, and maybe even edits.
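To make that idea concrete, here is a minimal sketch of what such a unified layer could look like. The names (`LayeredModifier`, `AABB3`) are hypothetical and not part of the module; this is only an illustration of "everything is a modifier with either local or global bounds":

```cpp
// Hypothetical sketch of "everything is a modifier": one base type with an
// application order (layers) and optional local bounds; no bounds means global.
#include <optional>

struct AABB3 {
    float min_x, min_y, min_z;
    float max_x, max_y, max_z;
};

struct LayeredModifier {
    int order = 0; // Lower layers are applied first, like layers in a painting program.
    std::optional<AABB3> bounds; // Empty means global (what "generators" are today).

    virtual ~LayeredModifier() = default;
    // Applies this layer on top of the value produced by the layers below it.
    virtual float apply(float previous_sdf, float x, float y, float z) const = 0;
};
```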
Another multiplication of complexity: I have no idea how that would even work with it, but it sounds like a ton of work (and yet, only if you want to use such a system! Plenty of games don't need that).
There is a lexical conflict here, because "channel" is already used in
Sounds like a rectangular seam generator. I would expect them to have a falloff eventually.
Sounds easy, but I have a feeling that at some point something different will be needed by someone. "Modifiers" were meant to interact in a way a bit similar to layers in a painting program, where each layer can have different blending modes. So depending on those modes and their order, the landscape could be shaped manually ("add" being a possible choice).
Depends on what you do with them. A lot of edits done in the same place become increasingly expensive to sample, and memory does increase endlessly in the long run, if you consider doing edits with them. When I added modifiers, I meant them to be an off-line tool to manually place landscape features. I stopped working on them early on though.
I would use a spline modifier. Overall, you pretty much expanded on modifiers, which I already had ideas about. However, as you might guess, I lost motivation in working on that because these are tools meant for manual landscape editing, while my main interests were fully-procedural generation with sampled in-game edits (in addition to things that are not specifically terrain stuff). And generally, a lot of work I don't have time to invest into currently (I'll say this often, because I usually do one thing at a time, and say that to everyone else xD).
A bit curious how you're going to implement this (notably the resource/node awkwardness).
We estimate the space required for this at our scale would be, like, on the order of a terabyte. It is entirely possible I did my math wrong, but for a 35000 x 35000 x 1000 terrain, let's first suppose that throwing out empty chunks reduces that to more like a
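(For reference, a rough back-of-the-envelope check of the raw size before discarding empty chunks, assuming about one byte of stored data per voxel; that per-voxel cost is an assumption for illustration only, not the module's actual storage format.)

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // 35000 x 35000 x 1000 voxels, assuming roughly 1 byte stored per voxel.
    const uint64_t voxels = 35000ull * 35000ull * 1000ull; // 1.225e12 voxels
    const double tib = double(voxels) / (1024.0 * 1024.0 * 1024.0 * 1024.0);
    std::printf("%.2f TiB\n", tib); // ~1.11 TiB, i.e. "on the order of a terabyte"
}
```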
Sure, sure, though I think this is more a matter of semantics. I've switched to calling them Fields, rather than Pre-Modifiers, to better distinguish them, since these do not output SDF data, which is the main distinction I think is important to grasp. Their value is fully defined before any SDF data gets operated on, which is really the main idea behind them (and they cannot have any knowledge of SDF data). So they bypass voxel storage completely.
Maybe so! But for other reasons. Remember, I mentioned these are not SDF, so 0 is the fully-fallen-off value. The best way to think about these is as non-destructive, floating-point spatial tagging data that the generator and shader have access to, and which is not stored anywhere in the voxel structure itself. The main thing that worries me right now, regarding seams, is that I really don't understand all this range-analysis code I keep seeing, and I may be forced to implement that for this feature. But I am trudging along, we'll see how it goes.
I imagine everyone will have slightly different uses, yeah. But I think if you go back to my description above: "Or, you could have an integer-valued version of this concept, which would not be very useful for me, but could be extremely useful for the biomes use-case." IDK. Or in the worst case, if you have four biomes, you could have four floating point fields representing the influence of each biome. But you're right, integrating this into the engine in a truly clean way will require a lot of extra work, and maybe this is too niche for that to be worth it. So that is why we decided to just do path of least resistance in our own fork. Pray for us that this decision does not come back to bite us xD
Do spline modifiers exist somewhere? Cause we will need them, for other scenarios. In this scenario, though, it would be nice to have a 2D path region within which the generator knows to turn down roughness contributors. We're faking it right now by using a cellular noise, but the key is that we need these paths to follow specific shapes. So a spline field (or premodifier as I was calling them earlier) would get this job done. But a spline modifier would just write new SDF data for a spline. To your credit, that's useful in other scenarios. For instance, should we move forward on this concept, we will want to add ravines, and these also need to meet very specific point-to-point guarantees. So in that case a spline modifier is 100% the right solution, with maybe a spline field alongside it to help the generator produce different weight paints in the area. Or perhaps we'll be forced to just implement weight paint outputs for modifiers, although I have a feeling that might turn into a slight headache. Really the choice between the two is gonna be path of least resistance for us.
Yeah, this is a valid concern. I think you could mitigate this by globalizing the So, for example, in But what if this function were passed And what if the generation logic were centralized? Like, right now, you have it implemented in at least two places: But what if you just had a single class,
Meanwhile, Most generators would just ignore VoxelFieldStack. The TransvoxelMesher could (optionally) pass that data to the shader via a similar mechanism to weight paints. Only in this case, it would be dynamically sampled during meshing, rather than stored and then loaded from the voxel data, thus avoiding quite a few of the pitfalls involved with weight paints. This would ideally be available in a similar sort of "optional context parameter" struct to what I describe with But, yeah, that's a pretty significant refactor. If you implement that refactor (just ignore the VoxelFieldStack aspect), we would of course build our system on top of it. But for us path of least resistance is to really just update the subsystems that we are using to support this feature, and not bother with refactoring everything else. So that is why we are developing this locally for now.
My current approach is this: I am essentially cloning your I will be adding the Then, it will be up to each In our local version, this is going to be quite gross, because I won't be implementing the In an upstreamable version of this feature, I would have implemented some kind of optional-attribute-context structure (such as the VoxelPipeline idea), and then added VoxelFieldStack as one of the properties. I would be happy to share with you what the final version ends up looking like, assuming we ever get it to work
Rereading this, I think I understand a bit better. Were this to be integrated, rather than implement a separate I did forget that this exists when replying above. The reason I chose not to go with this, at least for local implementation, is largely just simplicity. The sampling methodology is, IMO, better off being slightly different for It does create quite a bit of code duplication, although this could be worked around with some code sharing. In my mind, this would be via a "VoxelEffect" class which contains just the logic surrounding transform/AABB, leaving the sampling logic and data structures up to subclasses. So "VoxelField" and "VoxelModifier" would inherit from that, and then the two stack types would not worry about VoxelEffect, again due to the distinctions in how sampling logic probably should work. But, I think you are saying that "VoxelEffect" could just as easily be "VoxelModifier". In which case, I would rename "VoxelModifierStack" to "VoxelSDFModifierStack" and then I would create "VoxelFieldModifierStack". But for non-upstreamable implementation, it suffices for me, and is easier, to just clone the VoxelModifierStack system and not do the refactors required for a cleaner integration.
It occurs to me that there is no need for separated stacks, either. There could be a single stack type, and just multiple different implementations of I think I will switch to this approach, since it will be less violent than the approach I had been thinking. @Zylann Would you accept a PR renaming the existing This would allow our modifications to live more peacefully with upstream.
So you need LOD to see it all at once? Note that from what I said, you don't have to pre-generate all of that. But I guess you want your field system to manually author all this?
The graph system is actually more like an expression evaluator. It has 2 backends: CPU and GPU. They produce the same results, but use different approaches.
This is manual authoring of discrete items in the scene though. It requires user action to design this in the editor, and not every generator will want to work like that.
In Minecraft that's just a noise combination. Not something placed specifically at some XYZ coordinate with some specific size.
No, it's just a concept. The idea is that modifiers/fields could be anything in the scene that may or may not have bounds, and affect generation. In that sense, they can be a path with control points, which is usually called a spline, or a Bézier curve. The distance to the spline can be used as the field. Then a path could have a bounding box or be further subdivided to optimize things a bit (if you use a lot of fields you will eventually need a spatial partitioning system to look them up quickly and in a thread-safe manner). Easier to describe than to implement of course.
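As an illustration of the distance-to-spline idea, here is a minimal sketch (hypothetical code, not part of the module) of a polyline-based field with a simple linear falloff. The AABB of the control points, inflated by `radius`, bounds the region where the field can be non-zero:

```cpp
#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

static float distance_to_segment(Vec3 p, Vec3 a, Vec3 b) {
    const Vec3 ab{ b.x - a.x, b.y - a.y, b.z - a.z };
    const Vec3 ap{ p.x - a.x, p.y - a.y, p.z - a.z };
    const float ab_len2 = ab.x * ab.x + ab.y * ab.y + ab.z * ab.z;
    // Parameter of the closest point on the segment, clamped to [0, 1].
    const float t = ab_len2 > 0.f
            ? std::clamp((ap.x * ab.x + ap.y * ab.y + ap.z * ab.z) / ab_len2, 0.f, 1.f)
            : 0.f;
    const Vec3 c{ a.x + ab.x * t - p.x, a.y + ab.y * t - p.y, a.z + ab.z * t - p.z };
    return std::sqrt(c.x * c.x + c.y * c.y + c.z * c.z);
}

// Returns 1 on the path, fading linearly to 0 at `radius` away from it.
float path_field(Vec3 p, const std::vector<Vec3> &points, float radius) {
    float d = std::numeric_limits<float>::max();
    for (size_t i = 0; i + 1 < points.size(); ++i) {
        d = std::min(d, distance_to_segment(p, points[i], points[i + 1]));
    }
    return std::max(0.f, 1.f - d / radius);
}
```

That bound is what makes it cheap to skip such a field entirely when it does not intersect the chunk being generated.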
If you think about it, this is literally what VoxelGenerator is. You might not even need the existing modifiers system. The only issue as I said earlier, is the absence of connection to the scene tree, which is partly why I ended up passing it around separately. It's an arbitrary Godot rule that resources should not contain anything about stuff in the scene tree. But that rule is not a requirement if you have specific needs. With a custom generator and a collection of custom nodes that recognize it and expose the interaction with fields, you can get a similar result without having to modify everything.
Resource or node? That is the question. But so far everything discussed is about generation, so it's all been VoxelGenerator.
Why VoxelMesher all of a sudden 🤔 I thought this was about generation?
I don't think meshers should be aware of that system at all. Meshers should only work on the VoxelBuffer they are given, which represents the result of sampling an area. It's the work of the previous stage (generating/copying voxels) to pass this data along. It doesn't mean it gets stored; however, this is where things get very frustrating, because this disregards destructive edits, which are the main way player edits happen in this module. All that data would have to be stored persistently in this case. That would mean the huge amount of changes and work that fields represent would be unusable for that other approach, or you'd have to use fields as edits too at runtime, which brings yet another bunch of work (unless of course you don't want any editing at runtime, which I think defeats a lot of the point of this module compared to just using Blender or some other static terrain tools and their associated rendering optimizations).
Or like I said, just make your generator have a reference to it directly (or contain it). I might not do that from a plugin point of view, but for a dedicated approach in a game project that might do.
Let's say that what I mentioned was the ultimate goal at the time; fields are just another generalization. Though such a generalization means a lack of structure, like fields requiring tinkering with a generator, while modifiers were meant to work on their own.
I don't think it's any simpler, just different usage.
Not sure of that. It sounded to me like modifiers were doing their own thing inside
I don't know... also it's not the only thing you'd have to change to insert your system there. If you want nodes to derive from the same base you might; Godot likes OOP, so exposed classes may reflect that workflow, but the internals aim to be lighter and would have to be split, or largely rewritten. Again, I didn't mean to maintain this anymore, at least for now, and even having to review PRs touching it is similar to taking time working indirectly on it. I realize I took a whole hour going through all this; overall I'm not really available to work in this area (whether it's code or reviews). I might come back to it if one day I get more interest for this kind of general design.
Thank you for the explanation, it is helpful for orienting myself in your codebase. It is especially a relief to hear that range analysis is not required, as that will allow us to skip it (for now) for prototyping.
Heh, yeah, I saw that -- Skipping GPU support is one of the various shortcuts we will be taking, since we are not currently aiming to upstream this. That being said, I hope to implement this feature in such a way that incompatibilities with upstream are minimized. So at least having a basic understanding of your generation systems (GPU included) seems prudent. So thank you for going into detail on that as well.
I guess my intended meaning here is that "manual authoring of discrete items in the scene" may be a more common desire than you think. But this could just be my own bias speaking!
We are interested in meshers being aware, too, so that shader behavior can also be (optionally) affected by fields. Although we will probably save implementing that for when we are certain it makes artistic sense. But, for instance, fields could present an alternative solution to the same problem that currently motivates us to use weight-painting. So we are interested in exploring that, especially since it suffers fewer of the drawbacks. Actually, for that reason, I would have thought you would be very interested in this potential use-case!
I don't really see a problem here -- You already demonstrated the solution with everything else in this engine which is non-destructive. Destructive edits override non-destructive edits. Fields would/will be used only by non-destructive elements of the engine (generators and maybe meshers). As you point out, this makes this feature far more useful to workflows that emphasize non-destructive editing. I think that's fine, personally. Not every user of your engine needs or desires a destructive workflow.
Definitely an approach I considered, but having access to fields inside the graph editor would be useful, and taking advantage of the already existing infrastructure for defining, tracking, and selectively applying modifiers was enough to lure me into going with the "fields as a subtype of modifier" approach.
I agree, and this became more clear as I gained more familiarity with your existing implementation. Integrating this feature as a modifier subtype has (so far) been relatively painless.
Yep, this is what I mean. But I decided not to rename
I do like the big posts, because it creates a public, searchable record in a nice recognizable spot which can then be read by others who may have an interest in the issue, and further discussed. I actually am in the discord, but to me it is better to discuss in a forum format like this.
Very understandable. Right now, my primary interest is in minimizing the divergence between our custom solution and upstream. I have a double purpose in that -- One, it may lead to a semi-upstreamable solution (in case you change your mind), and two, it may save us from nasty refactoring if we decide to upmerge. Speaking of that, I will now turn my attention to what I actually returned here to discuss:
I see! This was not something I was aware of, I apologize. I did read a decent chunk of the Godot engine development docs (as your development docs recommend), but I missed this particular requirement, or perhaps I simply did not understand when I read it. That does complicate things. BUT! It seems to me you have worked around this problem already? Technically speaking, But the class itself has no knowledge. (I think? It is entirely possible something is hiding right under my nose, I am still getting familiar) So IMO, could I ask because I am thinking about making this change in my local copy to improve modularity. The modifier having direct access would mean I would not have to add it as an argument to I will admit, I am not yet 100% certain this is the best course to take. But your input would be valuable to me for deciding. This is the kind of tweak I could potentially upstream, for instance. The benefit to you, in that case, would be (hypothetically) a small incremental step toward better modularity. Would such a change seem desirable to you?
What I meant by that is that meshers should not be aware of how those "fields" get generated. They could access the results the same way they access voxels, without knowing how they got generated. Maybe one advantage of having access to evaluation would be to do it per vertex after the meshing process, but that depends a lot on what kind of transfer is being done.
Not sure what you mean here. By essence, a destructive edit has no choice but to "freeze and store" all the data it modified in place, which means if that happens to field data it will start blowing up the space taken (in particular if you layer tons and tons of fields). I guess field data would never be edited, but that might have undesired side effects depending on how it is used. Like, if you use fields for texturing, then a player wants to place voxels with a different texture; those voxels need to somehow override the whole process by which you make those textures show. Which usually means taking chunks intersecting with the edit, and starting to save/load their values so they replace all the procedural sources.
So you mean fields never get destructively edited I guess?
Maybe, but to me it's currently huge walls of debating text with very little synthesis for whoever wants to get a summary. That's why when posts start to become too long-winded with so much quoting I begin to think about voice chat because there is just too much going on.
This is not exactly what I wanted to highlight. Indeed,
As shown above, you can't do that, unless you enforce the breakage of resource rules. That is, you would decide that your generator must not be shared and must have a lifetime tied to the terrain node.
It doesn't improve modularity; it introduces coupling, with the constraints that come with it.
Yes, exactly. This would be an intentional limitation, and it would be the thing that makes fields distinct from, say, weight paints.
Precisely, precisely. This is what I have been trying to describe. And the ability to do this without those values ever being stored in the voxel is one of the key advantages of fields (given the right circumstances/use-case, at least)
I think this is covered by my above two responses, but just to reiterate the point: Fields, by design, should not be destructively editable. If a destructive edit happens, the only thing that has to be saved is the final SDF there. By the time destructive edits happen, field values are no longer relevant at all (except to the mesher, for per-vertex sampling to be passed off directly to the shader).
Yes, precisely. Those undesired side-effects are very much an intentional tradeoff to this proposed feature. It is not a feature for everyone, that is very much true.
Yes, exactly! Fields are simply not a good solution for this use-case. Generally, in this situation, you will be better off using weight paints (and dealing with the associated tradeoffs there). If, on the other hand, you do not intend to allow destructive editing, or if you are okay with field values not being destructively editable, then fields might be quite useful to you, as is the case for us.
Ahh, I see, thank you. I did not understand until this last explanation, I apologize for my confusion. Thank you for taking the time to explain it to me, this will help me stick to a more maintainable implementation.
You're right, I see now that this would be a problem. I guess we are back to needing a more consistently used generator context of some kind. But for my personal purposes, it seems my best bet is just to modify the signature of the
I can definitely understand that perspective. Though I would point out that even if we take this to VC, anyone exterior to us will still not have a good summary. At least this way there is some record. And with discussions like these, engineering decisions can be understood, supposing for instance that this feature ever gets upstreamed. It's one of my go-to tricks when dealing with open-source projects to track down the PR or issue associated with a given commit, and see what the engineers were discussing. Those PRs also give me a good place to contact those engineers if I wind up having questions pertinent to the additions. But, there is another issue, just personal to me: I find it much harder to discuss these things in voice format. I appreciate the time that text-based formats give me to formulate my thoughts and try to make them concise and clear. So if you are willing to continue to discuss in this format, that would be very helpful to me. But I understand that I am using up your time, here, so if you'd prefer otherwise, I am happy to let it drop. You have already been greatly helpful and set us on the right track for our downstream implementation.
Is your feature request related to a problem?
Hi! We've quickly realized that it will be completely impractical for us to save our procedurally generated world to disk, so destructive methods for generating it are out of the question. That means we need more advanced tools for non-destructive worldgen changes.
Modifiers are already a brilliant tool for this, and we anticipate we will be using them extremely heavily (with our own custom modifier implementations).
However, they are limited by the fact that they apply after the terrain generation has finished. It would be very useful to us if modifiers could somehow influence the terrain gen step itself. I'll give some concrete examples of why a little later.
Describe the solution you'd like
The solution we arrived at is pre-modifiers. These would essentially be very simple modifier regions that non-destructively define custom channel fields accessible to terrain generators (such as the graph generator).
Each pre-modifier would have an associated channel and an associated Bounded Field Function. Essentially, BFFs are just SDFs, but with the association between the field value and an isosurface intentionally dropped.
BFFs instead simply define a real-valued field that can be sampled at any location in space, and used, for example, as an input to a voxel generator graph. The key thing that makes them bounded is that each BFF has a well-defined AABB region outside which it can be assumed to output the value zero.
So, extremely similar to SDFs. You can make a BFF from any SDF by negating the SDF and clamping the result to be non-negative.
Each channel is sample-able by the generator. When a channel is sampled within a chunk, the output values for all of the BFFs with that channel index are simply summed. This list can be filtered to only include BFFs whose AABB intersects the AABB of the chunk being processed, so I believe this could be quite efficient. And, being completely non-destructive, it would come with little to no memory cost.
This simple summation is why the "rest value" of each BFF should be zero, and why I want to distinguish BFFs from SDFs. But the idea overall is very similar.
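Here is a minimal sketch of the above. The names (`BoundedFieldFunction`, `sample_channel`, `AABB3`) are only illustrative, not existing module classes; it shows the negate-and-clamp construction from an SDF, the zero rest value outside the bounds, and the filtered summation per channel:

```cpp
#include <algorithm>
#include <cstdint>
#include <functional>
#include <vector>

struct AABB3 {
    float min_x, min_y, min_z;
    float max_x, max_y, max_z;

    bool contains(float x, float y, float z) const {
        return x >= min_x && x <= max_x && y >= min_y && y <= max_y && z >= min_z && z <= max_z;
    }

    bool intersects(const AABB3 &o) const {
        return min_x <= o.max_x && max_x >= o.min_x && min_y <= o.max_y && max_y >= o.min_y &&
                min_z <= o.max_z && max_z >= o.min_z;
    }
};

struct BoundedFieldFunction {
    uint32_t channel = 0;
    AABB3 bounds;
    std::function<float(float, float, float)> sdf; // Any SDF can be turned into a BFF.

    float sample(float x, float y, float z) const {
        if (!bounds.contains(x, y, z)) {
            return 0.f; // Rest value: zero everywhere outside the bounds.
        }
        return std::max(-sdf(x, y, z), 0.f); // Negate the SDF, clamp to non-negative.
    }
};

// Sampling a channel sums every BFF on that channel whose bounds intersect the chunk.
float sample_channel(const std::vector<BoundedFieldFunction> &bffs, uint32_t channel,
        const AABB3 &chunk_bounds, float x, float y, float z) {
    float sum = 0.f;
    for (const BoundedFieldFunction &bff : bffs) {
        if (bff.channel == channel && bff.bounds.intersects(chunk_bounds)) {
            sum += bff.sample(x, y, z);
        }
    }
    return sum;
}
```

In practice the AABB filtering would be done once per chunk rather than per sample; the summation and the zero rest value are the important parts here.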
So, one very simple example: Suppose I wanted to create a road from point A to point B in my terrain. I could create a path BFF that is zero anywhere outside the region where the road exists, and write that value to, say, channel 10.
Then in my generator, I could include logic which samples channel 10, and anywhere that channel 10 is above a threshold, blend away anything creating rough terrain, and place "rocky path" texture paints into the weight data.
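A sketch of what that generator-side blending could look like, using a hypothetical helper like `sample_channel` above (the real graph generator would express this with graph nodes instead):

```cpp
#include <algorithm>

// Hypothetical blend: given the base terrain height, its rough detail contribution,
// and the sampled value of the "road" channel (channel 10 in the example),
// fade out the rough detail near the road.
float blend_height(float base_height, float rough_detail, float road_field) {
    const float threshold = 0.5f; // Above this, the point counts as fully on the road.
    const float road_factor = std::clamp(road_field / threshold, 0.f, 1.f);
    // Near the road (road_factor -> 1), the rough detail fades to zero.
    return base_height + rough_detail * (1.f - road_factor);
}
```

The same `road_factor` could also drive the "rocky path" weight output mentioned above.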
Hopefully that simple example demonstrates why this could be a very powerful feature.
The nice thing is, with the architecture I've laid out here, there would be no practical limit to the number of channels, and the number of pre-modifiers could be quite high with minimal impact. So this could allow for very detailed non-destructive terrain customization!
Describe alternatives you've considered
We need to be able to arbitrarily and non-destructively define structures in the generated terrain that do not derive simply from parallelizable noise distributions, but which affect characteristics of the terrain generation.
This proposal is, IMO, the de-facto way of doing that efficiently. The key decision is really whether we should try to integrate it into godot_voxel directly, or if we would be better off implementing it as part of a custom generator class. And that really depends on whether you have interest in having it integrated.
So: What do you think of the idea? We'll absolutely be happy to implement it if you like it.