FluxML/Flux.jl

Flux is an elegant approach to machine learning. It's a 100% pure-Julia stack, and provides lightweight abstractions on top of Julia's native GPU and AD support. Flux makes the easy things easy while remaining fully hackable.

Works best with Julia 1.9 or later. Here's a very short example to try it out:

using Flux, Plots
data = [([x], 2x-x^3) for x in -2:0.1f0:2]  # input/target pairs sampled from y = 2x - x^3

model = Chain(Dense(1 => 23, tanh), Dense(23 => 1, bias=false), only)

optim = Flux.setup(Adam(), model)  # optimiser state for the model's parameters
for epoch in 1:1000
  Flux.train!((m,x,y) -> (m(x) - y)^2, model, data, optim)  # squared-error loss
end

plot(x -> 2x-x^3, -2, 2, legend=false)  # the true curve
scatter!(x -> model([x]), -2:0.1f0:2)   # the model's predictions
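
Under the hood, train! uses automatic differentiation to get the gradient of the loss with respect to the model. As a minimal sketch (the names loss and grads below are illustrative, not part of the example above), the same gradient can be computed explicitly with Flux.gradient:

using Flux

model = Chain(Dense(1 => 23, tanh), Dense(23 => 1, bias=false), only)

loss(m) = (m([0.5f0]) - (2*0.5f0 - 0.5f0^3))^2  # squared error at a single point
grads = Flux.gradient(loss, model)              # tuple; grads[1] mirrors the model's layer structure

Gradients in this form can be passed to Flux.update! to adjust the parameters, which is essentially what train! does at each step.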

The quickstart page has a longer example. See the documentation for details, or the model zoo for examples. Ask questions on the Julia Discourse or Slack.

If you use Flux in your research, please cite our work.