
Improve a few activation functions #347

Merged (8 commits) on Oct 18, 2021

Conversation

mcabbott (Member) commented on Aug 9, 2021

This makes `leakyrelu` 3× quicker and `relu` 10% quicker. I'm sure these weren't bottlenecks for anyone, but still. It also upgrades the gradients to use `InplaceableThunk`.
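
For context, here is a minimal Julia sketch of the two ideas mentioned above, written for illustration rather than taken from the PR's diff: a branch via `ifelse` avoids evaluating both arguments of `max(a*x, x)`, and the pullback is wrapped in an `InplaceableThunk` so a caller can accumulate the gradient in place. The names `myleakyrelu` and `leakyrelu_arr` are hypothetical; NNlib's real rules are attached to broadcasting the scalar activations.

```julia
using ChainRulesCore  # exports rrule, InplaceableThunk, NoTangent, @thunk

# Scalar sketch: a branch instead of max(a*x, x) skips the multiply on the
# positive side and keeps the output type stable.
myleakyrelu(x::Real, a::Real = oftype(float(x), 0.01)) =
    ifelse(x > 0, float(x), a * float(x))

# Hypothetical array-level wrapper, for illustration only.
leakyrelu_arr(x::AbstractArray, a::Real = 0.01) = myleakyrelu.(x, a)

function ChainRulesCore.rrule(::typeof(leakyrelu_arr), x::AbstractArray, a::Real)
    y = leakyrelu_arr(x, a)
    function leakyrelu_arr_pullback(ȳ)
        # An InplaceableThunk carries an in-place accumulator alongside a plain
        # thunk, so callers that can do `acc .+= ...` avoid one allocation.
        # (Assumes ChainRulesCore 1.x's constructor order: add! first, then the thunk.)
        dx = InplaceableThunk(
            acc -> acc .+= ȳ .* ifelse.(x .> 0, one(a), a),
            @thunk(ȳ .* ifelse.(x .> 0, one(a), a)),
        )
        return (NoTangent(), dx, NoTangent())
    end
    return y, leakyrelu_arr_pullback
end
```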

Co-authored-by: Carlo Lucibello <carlo.lucibello@gmail.com>
CarloLucibello merged commit eb2f248 into FluxML:master on Oct 18, 2021
mcabbott deleted the leakyrelu branch on October 18, 2021 at 07:57
Sleort added a commit to Sleort/NNlib.jl that referenced this pull request on Nov 18, 2022:
A few percent faster `softplus` using `relu(x)` instead of `max(x, 0)`. Ref. FluxML#347.
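
For reference, a small sketch of the numerically stable softplus this commit message refers to (assuming NNlib's usual `log1p`-based form; `softplus_sketch` is an illustrative name, not the library definition):

```julia
using NNlib: relu  # relu(x) = max(x, 0), as exported by NNlib

# Numerically stable softplus, log(1 + exp(x)), written so that exp never overflows.
# The commit above swaps the final max(x, 0) for relu(x), which it reports to be
# a few percent faster.
softplus_sketch(x::Real) = log1p(exp(-abs(x))) + relu(x)
```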
Sleort mentioned this pull request on Nov 18, 2022