
Version 2.0

@rahulsrma26 released this 11 Jul 19:35

Redesigned to support more powerful networks, including autoencoders. Small models are now very easy to write, with an API inspired by Keras.

Features:

  • Layers: Dense, Dropout, Flatten
  • Activators: Sigmoid, Tanh, Relu
  • Loss Functions: Quadratic, Hellinger, Cross-entropy
  • Variable Initializers: Zeros, Normal, Xavier
  • Optimizers: SGD, Momentum, RMSProp, AdaGrad, Adam
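To illustrate how the pieces above fit together, here is a minimal self-contained NumPy sketch (not this library's actual API) combining a Dense layer, Sigmoid activation, Xavier initialization, Quadratic loss, and plain SGD to learn the OR function:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(n_in, n_out):
    # Xavier/Glorot initialization: scale limits by fan-in + fan-out
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training data for the (linearly separable) OR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)

# One Dense layer: weights via Xavier init, zero bias
W = xavier(2, 1)
b = np.zeros((1, 1))
lr = 1.0

for _ in range(5000):
    a = sigmoid(X @ W + b)                 # forward pass
    grad_z = (a - y) * a * (1 - a)         # quadratic loss chained through sigmoid'
    W -= lr * (X.T @ grad_z) / len(X)      # SGD step (full batch)
    b -= lr * grad_z.mean(axis=0, keepdims=True)

pred = sigmoid(X @ W + b)
# after training, rounded predictions should match OR: [0. 1. 1. 1.]
print(np.round(pred).ravel())
```

A real model built with this release would use the library's layer and optimizer classes instead; the sketch only shows the math each listed component performs.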