Add tensor_ops::prelu and nn::PReLU #389
Can this really be implemented at the tensor level? LeakyReLU should be no problem, but PReLU, which effectively has learnable parameters, is not really possible purely on tensors, is it? Edit: Have been thinking about this for some time. Maybe we could make …
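For reference, PReLU does reduce to ordinary element-wise tensor ops plus a parameter tensor, since prelu(x, a) = relu(x) + a * min(x, 0). Below is a minimal plain-Rust sketch of that decomposition, using a single scalar slope for simplicity; the function name and signature are illustrative, not dfdx's actual API:

```rust
/// PReLU via the decomposition prelu(x, a) = relu(x) + a * min(x, 0),
/// so only existing element-wise primitives are needed. Here `slope`
/// is a plain scalar; in a real layer it would be a learnable parameter.
fn prelu(xs: &[f32], slope: f32) -> Vec<f32> {
    xs.iter()
        .map(|&x| x.max(0.0) + slope * x.min(0.0))
        .collect()
}

fn main() {
    let xs = [-2.0, -0.5, 0.0, 1.5];
    // With slope = 0.25, negatives are scaled and non-negatives pass through.
    assert_eq!(prelu(&xs, 0.25), vec![-0.5, -0.125, 0.0, 1.5]);
}
```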
This should be closed

Yep, thanks!
Lately I've tried porting various computer vision networks to dfdx, and one function that seems quite popular in that field but isn't supported yet is PReLU. For now I just stay away from networks that use it, but it would be nice to have it available in the future.
https://pytorch.org/docs/stable/generated/torch.nn.PReLU.html#torch.nn.PReLU
Parametric ReLU is basically a LeakyReLU where the negative slope is a learned parameter, so there is going to be some overlap with the LeakyReLU implementation (#287).
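To make the overlap concrete, here is a minimal plain-Rust sketch of a per-channel PReLU module in the spirit of torch.nn.PReLU; the `PRelu` struct and its fields are hypothetical, not dfdx's actual types:

```rust
/// Hypothetical per-channel PReLU: one learnable slope per channel,
/// in the spirit of torch.nn.PReLU(num_parameters = C).
struct PRelu {
    /// Learnable negative slopes, one per channel (trained like any weight).
    slopes: Vec<f32>,
}

impl PRelu {
    /// PyTorch initializes every slope to 0.25 by default.
    fn new(num_channels: usize) -> Self {
        Self { slopes: vec![0.25; num_channels] }
    }

    /// Forward pass over a [channels][width] activation: each channel
    /// uses its own slope, unlike LeakyReLU's single fixed constant.
    fn forward(&self, x: &[Vec<f32>]) -> Vec<Vec<f32>> {
        x.iter()
            .zip(&self.slopes)
            .map(|(row, &a)| {
                row.iter()
                    .map(|&v| if v >= 0.0 { v } else { a * v })
                    .collect()
            })
            .collect()
    }
}

fn main() {
    let layer = PRelu::new(2);
    let x = vec![vec![-1.0, 2.0], vec![-4.0, 0.5]];
    println!("{:?}", layer.forward(&x)); // [[-0.25, 2.0], [-1.0, 0.5]]
}
```

The only real difference from LeakyReLU is that the slope lives in a parameter that receives gradients, which is why the two implementations would share most of their element-wise logic.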