Suggestion for Redesign #158
Comments
Just wanted to comment on the previous suggestions and comments, primarily to better understand the cons.
What features would be general enough that they would have to be added to the general class?
Not sure I understand the problem. A mechanics problem does not have temperature boundary conditions, so would you then return NULL when getting those?
IMO, this is just a base class and everyone can derive from that. In the redesign, we might actually think about modularizing the tools (tool agnostic problem definition with experimental data, simulation sensor module with visualization capabilities), such that you do not need to hack the classes.
That could potentially be solved by the new approach.
I agree.
The question is what we mean by geometry; I guess this is the mesh (e.g. a .msh input) and the boundary conditions. For the sensor data (and potentially the definition of sensors) we might separate that, potentially into the same module as the simulation sensors, such that we create them only once and also activate some experimental sensors as relevant for the FE postprocessing.
I agree, the straightforward way is just function definitions. If they are defined as string, they could be converted to python functions and then executed, while still being serializable in JSON.
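A minimal sketch of this string-to-function idea (the field names and the expression are invented for illustration): the boundary-condition value stays a string, so the problem definition round-trips through JSON, and is compiled to a callable only when the solver needs it.

```python
import json

# Hypothetical example: a BC value stored as a string expression so the
# problem definition remains JSON-serializable. All names are illustrative.
bc_definition = {"name": "top_load", "expression": "lambda x, t: 10.0 * t * x[0]"}

# The definition round-trips through JSON unchanged...
serialized = json.dumps(bc_definition)
restored = json.loads(serialized)

# ...and is turned into a Python function only at solve time.
# (eval on untrusted input is unsafe; a real design would restrict the
# allowed names, or use a small expression parser instead.)
bc_func = eval(restored["expression"], {"__builtins__": {}})

print(bc_func([2.0, 0.0], 0.5))  # 10.0 * 0.5 * 2.0 = 10.0
```

The security caveat in the comment is the main design question here: a restricted expression grammar would keep the serializability benefit without the risks of raw `eval`.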
Here we would have to agree on an interface, or even see if there are advantages compared to just providing examples and then copy-pasting those. I have the impression the latter might be more user-friendly (and complies better with the flexibility that most people require when doing post-processing).
See above, otherwise I agree.
Would be nice, but I guess this is rather an additional add-on that most people would not use on a regular basis (maybe at the end, when writing their paper).
Thank you for your feedback.
For my case, that would be time-dependent boundary conditions (Neumann and Dirichlet) and initial conditions. I was vague here because I don't know the limitations of the other potential users.
An experiment for a temperature problem is expected to have a method for temperature bcs. But there is no interface that enforces this, and therefore I currently need to look into the specific FE problem to see which methods are needed.
Base classes can enforce methods. I would also be in favor of enforcing attributes of the class with
I think I mean mesh when I say geometry. I mainly want this because currently the definition of the mesh is always linked to a very specific set of bcs. If I want two experiments that have the same mesh but different bcs, I would need to write two classes for that (or have the bcs be parameters that are passed in the init).
I think we agree on the rest ;)
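The base-class enforcement mentioned above could look roughly like this with Python's `abc` module; the class and method names are assumptions for illustration, not the actual repo API.

```python
from abc import ABC, abstractmethod

# Sketch of enforcing methods via an abstract base class: every experiment
# must provide its boundary conditions, so one no longer has to inspect the
# concrete FE problem to find out which methods exist. Names are invented.
class Experiment(ABC):
    @abstractmethod
    def temperature_bcs(self):
        """Return the temperature boundary conditions of this experiment."""

class CureExperiment(Experiment):
    def temperature_bcs(self):
        return {"bottom": 293.15}  # dummy data for illustration

class IncompleteExperiment(Experiment):
    pass  # forgot to implement temperature_bcs

CureExperiment().temperature_bcs()  # works
try:
    IncompleteExperiment()  # abstractmethod makes instantiation fail early
except TypeError as err:
    print(err)
```

The advantage over duck typing is that a missing method fails loudly at construction time, not deep inside the FE solve.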
Could we do a merge request (without actually merging)? That would allow us to discuss single lines of code rather than abstract text in an issue.
Sure, I'll make a draft
In order to advance the discussion, it would be great to get feedback from everyone. Please (@aradermacher, @div-tyg, @danielandresarcones, @ajafarihub, @saifr68) have a look at the suggestions from @srosenbu in order to start a discussion. In particular, we want to build something that is useful and used within all related projects. If this is just interesting for a single person (and that is to be found out in the discussion), there is no need to develop an overarching solution. And if that solution only comprises minimal additional functionality, this would also be fine.
I just wanted to point out one observation/challenge and propose my suggestion. IMO, a tricky part of using/developing the interface is being able to flexibly handle the definition of BCs (boundary conditions) just in terms of the location/sub-domain where those BCs should be applied. In particular, when we deal with a real mesh that has been constructed, for example, from real CT data or any other meshing tool (and this is, at least for me, a relevant situation to be in), it will be complicated to define/use flexible routines/methods in the interface to identify specific surfaces or collections of nodes where a desired BC should be applied. My suggestion is to design the interface by relying on: (i) mesh files that essentially store mesh objects, (ii) other files that represent certain desired sub-domains of the mesh in terms of collections of nodes. IMO, the above scenario is the most flexible way, which is - I think - pretty much unavoidable if we deal with arbitrary real meshes. Besides, and to make the interface more user-friendly, we can also provide specific methods that return such lists of coordinates for certain sub-domains in certain geometries, especially if the mesh is nominal with known geometric features/dimensions.
I believe this is exactly what the proposed interface provides: the input of groups/sets of nodes or elements (to e.g. apply Dirichlet boundary conditions on nodes) can be done either using a functional representation (similar to fenics) or using e.g. the physical components from the mesh file. So what exactly would you change? The only open issue IMO is the correct definition for applying Neumann BCs, since that requires not only elements and nodes, but surfaces.
Thanks for your reply, @joergfunger .
Well, from your conversations above, I got the impression that a flexible way of handling BCs was still quite unclear, although it was clear that the BCs should be specified regardless of the FEM problem. One question is how you would expect those "physical components" to be represented. Are they specific features/components that are stored within the same mesh file? Or are they represented in other, separate files? My proposal was to go with the second approach and rely on files that store certain collections of nodes for every sub-domain (potentially sub-boundary) where Dirichlet BCs should be applied. The main reason I like this approach is in dealing with real meshes obtained from CT data. These meshes usually do not have regular/smooth surfaces, edges, and facets. In the end, they consist of a set of nodal coordinates and a set of cell connectivities. For such meshes, I believe the most practical way of specifying the domain of BCs is to define/extract sets of coordinates over which any BC should apply. Of course, the specification of such node collections is challenging by itself; nevertheless, at least I do not see any better way.
I am not sure what kind of changes the interface would need for the approach that I suggested (since I have not been involved in developing the interface). Or maybe I did not get your question.
I am not aware of how Neumann BCs are treated in dolfinX. From my experience in dolfin, however, I think we can still stay with collections of nodes (as I proposed above), because (at least in dolfin) we can define a sub-domain based on any arbitrary collection of nodes, and then we can use that sub-domain for specifying the integral measure (ds) over which a Neumann BC is applied. But this is just a rough idea and may be different in dolfinX than in dolfin.
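A rough sketch of the node-collection idea (coordinates and file names invented for illustration, assuming NumPy is available): sub-domains are plain index arrays that can either be computed from a nominal geometry or simply loaded from a separate file for a CT-based mesh.

```python
import numpy as np

# Toy mesh reduced to what the proposal assumes: nodal coordinates plus
# connectivity, with BC sub-domains stored as plain index arrays that could
# live in a separate file. All data here is invented for illustration.
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])

# For a nominal geometry, a helper can extract the collection directly...
left_boundary = np.flatnonzero(np.isclose(nodes[:, 0], 0.0))

# ...while for a CT-based mesh the same array would simply be loaded, e.g.:
# left_boundary = np.loadtxt("left_boundary_nodes.txt", dtype=int)

print(left_boundary)  # indices of the nodes with x = 0
```

Either way, the FE problem only ever sees an index array, which is the tool-agnostic part of the proposal; how that array becomes a dolfin sub-domain or a dolfinX entity list is a solver-specific conversion step.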
A follow-up:
Hi Abbas,
Hi @pdiercks, thank you for your reply. I agree in general. But I think this approach (using higher-order geometry in dolfinX) is optional and not the only way to follow. Often, a linear geometry (mesh) is quite enough, although we still wish to have higher-degree shape functions for representing specific fields. In such cases, we may need to define BC domains just as usual and directly from geometric features of the domain (like x[0]=0 or so). So, my suggestion of relying on "node collections" might be flexible for handling more complex geometries, but could be sub-optimal for simple linear geometries. Furthermore, even if we are going to use a higher-order geometry representation (for meshing), I am wondering about another challenge. It now seems that we should define/generate the mesh already in view of the interpolation degree of a function space that we would naturally determine at a later stage. However, I would actually prefer to have the mesh generation in a separate module and prior stage that is independent of any problem definition. So, I am not sure how to design the interface in this respect.
Hi @ajafarihub,
I think with your node-collection approach you already assume an isoparametric formulation, and this does not work, as you noticed, when the formulation is not isoparametric (or at least it would require additional information on whether the DOFs associated with the mid-nodes should also be constrained). So I think if you want to describe the boundary as a set of discrete entities, you should not limit that to topological dimension 0, but also consider edges, facets and cells. How this information is used to define boundary conditions would depend on the PDE solver, and some kind of conversion step might be necessary.
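The mid-node issue can be made concrete with a toy sketch (all data invented): a quadratic edge on the boundary carries a mid-node DOF that a fixed vertex list would silently miss, whereas an edge-based description plus a formulation-dependent expansion step catches it.

```python
# Toy illustration (invented data): the boundary described as edges rather
# than as a vertex list, expanded to DOFs depending on the formulation.
boundary_edges = [(0, 1)]        # an edge given by its end vertices
mid_nodes = {(0, 1): 4}          # mid-node index of each quadratic edge

def boundary_dofs(edges, mids, quadratic=True):
    """Expand a discrete boundary description to the constrained DOFs."""
    dofs = set()
    for edge in edges:
        dofs.update(edge)        # end-vertex DOFs, always present
        if quadratic:
            dofs.add(mids[edge]) # mid-node DOF, formulation-dependent
    return sorted(dofs)

print(boundary_dofs(boundary_edges, mid_nodes))         # quadratic: [0, 1, 4]
print(boundary_dofs(boundary_edges, mid_nodes, False))  # linear: [0, 1]
```

This expansion is exactly the solver-specific conversion step mentioned above; the interface would only store the entities.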
After our discussion, I thought again about what I like about this repo: ureg.Quantity.
What I don't like: __init__. This is i.m.o. too flexible, because it encourages you to "hack your way around the limitations of the design", but then your hack becomes unusable to others. There is of course no perfect design, but I would be more in favor of updating the interfaces from time to time to allow new features, which all new problems must then adhere to.
What I would want (in addition to what I like):
I have some code suggestions on the branch https://github.com/BAMresearch/FenicsXConcrete/tree/sjard_suggestion.
It's all WIP, but I am currently tending to make the description of the experiment fully independent of FEniCS, because the FEniCS objects might be different depending on the problem that is solved. All classes use pydantic.dataclasses in order to enforce the class variables that are being used and to enable serializability by default.
To be continued...
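For readers unfamiliar with the idea: the following is a stdlib approximation of what pydantic.dataclasses automates (the field names are invented, and a real pydantic dataclass would validate and coerce types automatically). Declared class variables are enforced at construction time, and the experiment description serializes to JSON without any FEniCS objects.

```python
import dataclasses
import json

# Stdlib sketch of the pydantic.dataclasses idea: the allowed class variables
# are declared once, wrong types fail immediately, and the description stays
# JSON-serializable. Field names are illustrative, not the actual interface.
@dataclasses.dataclass
class ExperimentParameters:
    length: float
    height: float
    dim: int = 2

    def __post_init__(self):
        # pydantic would do this validation (and coercion) automatically
        for field in dataclasses.fields(self):
            if not isinstance(getattr(self, field.name), field.type):
                raise TypeError(f"{field.name} must be {field.type.__name__}")

params = ExperimentParameters(length=2.0, height=0.3)
print(json.dumps(dataclasses.asdict(params)))
```

Unknown keyword arguments already raise with plain dataclasses, so the "hack your way around the design" pattern criticized above fails loudly instead of silently adding attributes.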