
Implementation of FE-ANN EoS #115

Closed
gustavochm opened this issue Apr 1, 2024 · 1 comment · Fixed by #117

@gustavochm

Hi Ian,

So last year, we published an article about developing equations of state using artificial neural networks (FE-ANN EoS). This type of EoS could be easily implemented in teqp (I think). Here is some dummy code showing how to implement it using NumPy:

import numpy as np

# load the ANN parameters (e.g. from the attached feanneos_params.npz)
feanneos_params = dict(np.load(filename))

# Helper function to get the alpha parameter
def helper_get_alpha(lambda_r, lambda_a):
    """
    Helper function to get the alpha parameter

    Parameters
    ----------
    lambda_r : float or array
        lambda_r parameter
    lambda_a : float or array
        lambda_a parameter

    Returns
    -------
    alpha : float or array
        alpha parameter
    """
    c_alpha = (lambda_r / (lambda_r-lambda_a)) * (lambda_r/lambda_a)**(lambda_a/(lambda_r-lambda_a))
    alpha = c_alpha*(1./(lambda_a-3) - 1./(lambda_r-3))
    return alpha
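As a quick sanity check (my own example, not part of the original snippet): for the Lennard-Jones 12-6 case, lambda_r = 12 and lambda_a = 6 give c_alpha = 2 * 2 = 4 and alpha = 4 * (1/3 - 1/9) = 8/9. The helper is repeated here so the snippet runs standalone:

```python
def helper_get_alpha(lambda_r, lambda_a):
    # copy of the helper above so this check is self-contained
    c_alpha = (lambda_r / (lambda_r - lambda_a)) * (lambda_r / lambda_a)**(lambda_a / (lambda_r - lambda_a))
    return c_alpha * (1. / (lambda_a - 3) - 1. / (lambda_r - 3))

alpha_lj = helper_get_alpha(12.0, 6.0)
print(alpha_lj)  # 8/9 = 0.888...
```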


def feanneos(alpha, rhoad, Tad, feanneos_params):
    
    # inputs the alpha parameter, density and temperature
    # all inputs should be 1d
    alpha = np.atleast_1d(alpha)
    rhoad = np.atleast_1d(rhoad)
    Tad = np.atleast_1d(Tad)
    
    # check that all the inputs have the same length
    assert alpha.shape == rhoad.shape
    assert alpha.shape == Tad.shape

    rhoad0 = np.zeros_like(rhoad)
    x = np.stack([alpha, rhoad, 1./Tad]).T
    x_rhoad0 = np.stack([alpha, rhoad0, 1./Tad]).T
    
    # getting the number of hidden layers
    hidden_layers = int(feanneos_params['hidden_layers'])
    
    for i in range(hidden_layers):
        # getting kernel and biases
        kernel = feanneos_params[f'kernel_{i}']
        bias = feanneos_params[f'bias_{i}']
        x = np.matmul(x, kernel) + bias
        x_rhoad0 = np.matmul(x_rhoad0, kernel) + bias

        x = np.tanh(x)
        x_rhoad0 = np.tanh(x_rhoad0)
    
    # last layer doesn't have bias
    x = np.matmul(x, feanneos_params['kernel_helmoltz'])
    x_rhoad0 = np.matmul(x_rhoad0, feanneos_params['kernel_helmoltz'])

    # flatten the final array to 1d
    helmholtz = (x - x_rhoad0).flatten()  
    return helmholtz
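To illustrate the call pattern (again my own sketch, not from the article): the parameter dict only needs the keys used above ('hidden_layers', 'kernel_i', 'bias_i', 'kernel_helmoltz'), so random matrices with the right shapes are enough to exercise the forward pass. Note that subtracting the zero-density branch guarantees the residual Helmholtz energy vanishes at rhoad = 0:

```python
import numpy as np

def feanneos(alpha, rhoad, Tad, params):
    # minimal copy of the forward pass sketched above
    alpha = np.atleast_1d(alpha)
    rhoad = np.atleast_1d(rhoad)
    Tad = np.atleast_1d(Tad)
    rhoad0 = np.zeros_like(rhoad)
    x = np.stack([alpha, rhoad, 1. / Tad]).T
    x0 = np.stack([alpha, rhoad0, 1. / Tad]).T
    for i in range(int(params['hidden_layers'])):
        x = np.tanh(x @ params[f'kernel_{i}'] + params[f'bias_{i}'])
        x0 = np.tanh(x0 @ params[f'kernel_{i}'] + params[f'bias_{i}'])
    # last layer has no bias
    x = x @ params['kernel_helmoltz']
    x0 = x0 @ params['kernel_helmoltz']
    return (x - x0).flatten()

# hypothetical random parameters with the same key layout as the attached .npz
rng = np.random.default_rng(0)
params = {'hidden_layers': 2,
          'kernel_0': rng.normal(size=(3, 8)), 'bias_0': rng.normal(size=8),
          'kernel_1': rng.normal(size=(8, 8)), 'bias_1': rng.normal(size=8),
          'kernel_helmoltz': rng.normal(size=(8, 1))}

a_res = feanneos(alpha=8/9, rhoad=0.8, Tad=1.3, params=params)
print(a_res.shape)  # (1,)

# the reference-state subtraction makes a_res exactly zero at zero density
assert abs(feanneos(8/9, 0.0, 1.3, params)[0]) < 1e-12
```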

Here are the ANN's parameters.
feanneos_params.npz.zip

Gustavo

@ianhbell
Collaborator

ianhbell commented Apr 1, 2024

Thanks @gustavochm this is a good idea. It fits in nicely with the other Lennard-Jones EOS that have been implemented already. It looks like the amount of data required for the coefficients is a bit too much to store in a header so we will have to store it in a source file, but that is no practical problem. I think this is the first ML-based EOS implemented in any of the EOS libraries, so an exciting step into the future! I'll close this feature request when the code gets merged.

@ianhbell ianhbell added this to the 0.20.0 milestone Apr 1, 2024
@ianhbell ianhbell mentioned this issue Apr 2, 2024