# vim-autograd

Automatic differentiation library written in pure Vim script.


vim-autograd provides a foundation for automatic differentiation based on the define-by-run approach used by frameworks such as Chainer and PyTorch. Since it is written entirely in pure Vim script, it has no dependencies.

This library makes it possible to build next-generation plugins that perform numerical computation on multidimensional arrays, or even deep learning via gradient descent (a gradient-descent sketch appears at the end of the Usage section).
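
To make the define-by-run idea concrete, here is a toy, self-contained sketch of reverse-mode differentiation in legacy Vim script. It is not vim-autograd's actual implementation, and all names in it (`s:Val`, `s:Mul`, `s:Backward`) are illustrative only: each operation records its inputs and a local derivative rule as it executes, so the recorded graph can later be walked in reverse.

```vim
" Toy define-by-run sketch (not vim-autograd's internals).
function! s:Val(data) abort
  return {'data': a:data, 'grad': 0.0, 'parents': [], 'backward_fn': v:null}
endfunction

function! s:Mul(a, b) abort
  " Computing the value also records the graph edge.
  let out = s:Val(a:a.data * a:b.data)
  let out.parents = [a:a, a:b]
  " d(a*b)/da = b and d(a*b)/db = a; the lambda closes over the inputs.
  let out.backward_fn = {g -> [g * a:b.data, g * a:a.data]}
  return out
endfunction

function! s:Backward(y) abort
  let y = a:y
  let y.grad = 1.0
  let stack = [y]
  while !empty(stack)
    let node = remove(stack, -1)
    if empty(node.parents)
      continue
    endif
    " Chain rule: distribute the node's gradient to its inputs.
    let grads = node.backward_fn(node.grad)
    for i in range(len(node.parents))
      let node.parents[i].grad += grads[i]
      call add(stack, node.parents[i])
    endfor
  endwhile
endfunction

" y = x * x, so dy/dx = 2x, which is 6.0 at x = 3.
let s:x = s:Val(3.0)
call s:Backward(s:Mul(s:x, s:x))
echo s:x.grad
```

Because the graph is recorded while ordinary code runs, any Vim script control flow (loops, conditionals) can appear between operations; this is the defining property of the define-by-run style.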

## Installation

### Vim script

If you are using vim-plug, you can install it as follows.

```vim
Plug 'pit-ray/vim-autograd'
```

### Vim9 script

If you want to use the more efficient Vim9 script implementation, install from the experimental vim9 branch.

```vim
Plug 'pit-ray/vim-autograd', {'branch': 'vim9'}
```

## Usage

A computational graph is constructed by applying the provided differentiable functions to a Tensor object, and the gradient is calculated by backpropagating from the output.

```vim
function! s:f(x) abort
  " y = x^5 - 2x^3
  " x.p(n) raises x to the n-th power and x.m(c) multiplies it by c.
  let y = autograd#sub(a:x.p(5), a:x.p(3).m(2))
  return y
endfunction

function! s:example() abort
  let x = autograd#tensor(2.0)
  let y = s:f(x)

  " Backpropagate from the output to populate x.grad.
  call y.backward()
  echo x.grad.data
endfunction

call s:example()
```

Output:

```
[56.0]
```

This matches the analytical derivative: dy/dx = 5x^4 - 6x^2, which evaluates to 5·16 - 6·4 = 56 at x = 2.

The computational graph for this example is constructed automatically at run time.
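
Because the graph is rebuilt on every run, iterative optimization is just a loop. Below is a minimal gradient-descent sketch. It assumes only the calls shown above (autograd#tensor(), autograd#sub(), the .p() method, backward(), and x.grad.data); the names s:loss and s:descend are illustrative, and chaining .p() on autograd#sub()'s result is assumed by analogy with the example.

```vim
function! s:loss(x) abort
  " f(x) = (x - 2)^2, minimized at x = 2
  return autograd#sub(a:x, autograd#tensor(2.0)).p(2)
endfunction

function! s:descend() abort
  let xval = 10.0
  let lr = 0.1
  for i in range(100)
    " Rebuild the graph with a fresh tensor each step, so no
    " gradient-reset call is required.
    let x = autograd#tensor(xval)
    let y = s:loss(x)
    call y.backward()
    " Step against the gradient: x <- x - lr * df/dx
    let xval -= lr * x.grad.data[0]
  endfor
  " xval converges toward the minimum at 2.0
  echo xval
endfunction

call s:descend()
```

Tracing a fresh graph each iteration is the cost of the define-by-run style, but it is also what allows ordinary Vim script control flow inside the loss function.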

## Examples

## Related posts

## References

## License

This library is provided under the MIT License.

## Author

- pit-ray