Learning parametric functions with PySR #623
Hi PySR community! We are currently stuck on a physics problem and it seems like PySR is just what we need. Instead of brute-forcing the problem with neural networks, we would like to know more about our system, so having some equations to look at would help us gain more insight for future research. The problem is very straightforward: we have 100 bodies with three Euclidean coordinates each, so a total of 300 parameters. Our questions essentially are:
I think that was it for now. We really look forward to trying PySR in a later project! 😃

EDIT: It is also worth noting that some of these bodies are of the same character. In that case it would definitely be unnecessary to find the same expression yet again, and very worrying if it did not!
Hi @pastaalfredo,

Great question! We have done this in https://github.com/MilesCranmer/symbolic_deep_learning (check out the colab demo here: https://colab.research.google.com/github/MilesCranmer/symbolic_deep_learning/blob/master/GN_Demo_Colab.ipynb – which goes through this exact sort of problem). However, for your problem you will also need to learn a "parameter dictionary", which we did in https://iopscience.iop.org/article/10.1088/2632-2153/acfa63/meta as a way to do "basis learning".

Basically you would first learn a deep learning model on your problem, and inside that model you would have a set of per-object parameters (these act as your function "parameters", such as $m_i$ and $m_j$ in your example):

```python
import torch
from torch import nn


class ParametricModel(nn.Module):
    def __init__(self, num_objects, num_params):
        super().__init__()  # needed so nn.Module registers the parameters below
        self.parameter_map = nn.Parameter(
            torch.randn(num_objects, num_params, dtype=torch.float32)
        )
        # Implement neural net model for learning the expression
        self.mlp = ...  # has (num_params + num_features) inputs

    def forward(self, x):
        object_index = x[:, 0].long()  # For example - you put in `i` here.
        parameters = self.parameter_map[object_index]
        features = x[:, 1:]
        features_and_parameters = torch.cat((parameters, features), dim=1)
        return self.mlp(features_and_parameters)
```

After training this, you can approximate `self.mlp` with PySR. That will be a parametric function, and the per-object parameters will be found in `self.parameter_map`. This essentially turns a 100-input problem into a parametrized 5-input problem, which is much easier for PySR to handle (see the sketch after this reply).

Cheers,
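For concreteness, here is a rough sketch of that distillation step (an editorial addition, not part of the original reply). It assumes the `ParametricModel` above has already been trained; `trained_model`, `X_train`, and the operator choices below are placeholders you would swap for your own.

```python
# Sketch only: `trained_model` is a fitted ParametricModel and `X_train` is the
# (num_samples, 1 + num_features) tensor of object index + features it was trained on.
import torch
from pysr import PySRRegressor

trained_model.eval()
with torch.no_grad():
    object_index = X_train[:, 0].long()                     # which body each row belongs to
    parameters = trained_model.parameter_map[object_index]  # learned per-object parameters
    features = X_train[:, 1:]
    inputs = torch.cat((parameters, features), dim=1)
    targets = trained_model.mlp(inputs)                     # outputs of the inner MLP

# Fit PySR to the inner MLP.  Its inputs are (per-object parameters, features),
# so the recovered expression is a parametric function of both.
regressor = PySRRegressor(
    niterations=100,
    binary_operators=["+", "-", "*", "/"],
    unary_operators=["square", "exp"],
)
regressor.fit(inputs.numpy(), targets.numpy().ravel())
print(regressor.sympy())

# The per-object constants that plug into this expression live in
# trained_model.parameter_map (one row of num_params values per body).
```

Because only the inner MLP is distilled, PySR sees a `(num_params + num_features)`-dimensional input rather than all 300 raw coordinates, which is exactly the reduction described above.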