A C++ implementation of a modular, object-based neural network framework (NeuralNetwork, three Layer wrapper classes, Layer, Neuron), built without external ML libraries.
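The repo's exact class layout isn't reproduced here, but below is a minimal sketch of how an object-based Neuron/Layer/NeuralNetwork hierarchy can be composed into a forward pass. All member names, the sigmoid activation, and the `make_layer` helper are illustrative assumptions, not this project's actual API.

```cpp
#include <cmath>
#include <cstddef>
#include <iostream>
#include <random>
#include <vector>

// Illustrative sketch only: names and internals are assumptions, not the repo's actual classes.
struct Neuron {
    std::vector<double> weights;  // one weight per input
    double bias = 0.0;

    // Weighted sum of inputs plus bias, squashed by a sigmoid (activation choice is an assumption).
    double forward(const std::vector<double>& inputs) const {
        double sum = bias;
        for (std::size_t i = 0; i < weights.size(); ++i) sum += weights[i] * inputs[i];
        return 1.0 / (1.0 + std::exp(-sum));
    }
};

struct Layer {
    std::vector<Neuron> neurons;

    // Apply every neuron in the layer to the same input vector.
    std::vector<double> forward(const std::vector<double>& inputs) const {
        std::vector<double> out;
        out.reserve(neurons.size());
        for (const Neuron& n : neurons) out.push_back(n.forward(inputs));
        return out;
    }
};

struct NeuralNetwork {
    std::vector<Layer> layers;

    // Chain the layers: the output of one layer is the input to the next.
    std::vector<double> forward(std::vector<double> x) const {
        for (const Layer& l : layers) x = l.forward(x);
        return x;
    }
};

// Hypothetical helper: build a layer of `n_out` neurons over `n_in` inputs with small random weights.
Layer make_layer(int n_in, int n_out, std::mt19937& rng) {
    std::uniform_real_distribution<double> dist(-0.5, 0.5);
    Layer layer;
    for (int j = 0; j < n_out; ++j) {
        Neuron n;
        for (int i = 0; i < n_in; ++i) n.weights.push_back(dist(rng));
        n.bias = dist(rng);
        layer.neurons.push_back(n);
    }
    return layer;
}

int main() {
    std::mt19937 rng(42);
    NeuralNetwork net;
    net.layers.push_back(make_layer(2, 3, rng));  // hidden layer: 2 inputs -> 3 neurons
    net.layers.push_back(make_layer(3, 1, rng));  // output layer: 3 inputs -> 1 neuron
    std::vector<double> y = net.forward({1.0, 0.5});
    std::cout << "output: " << y[0] << "\n";
    return 0;
}
```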
There are three versions of the codebase:

- `main`: basic version with no real performance additions
- `AdamW`: AdamW weight updates; converges better than `main` but can be slow to train (a sketch of the update rule follows this list)
- `AdamW+Batches`: AdamW weight updates with batch-based training; incomplete
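The AdamW update itself is not shown in this README; the following is a hedged sketch of a decoupled-weight-decay (AdamW) step for a single parameter vector. The struct `AdamWState`, the function `adamw_step`, and the default hyperparameters are assumptions chosen for illustration, not values taken from this repository.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Running optimizer state for one parameter vector (assumed layout, not the repo's).
struct AdamWState {
    std::vector<double> m, v;  // first- and second-moment estimates
    long t = 0;                // step counter for bias correction
};

// One AdamW step: Adam moment updates plus weight decay applied directly to the weights.
void adamw_step(std::vector<double>& w, const std::vector<double>& grad, AdamWState& s,
                double lr = 1e-3, double beta1 = 0.9, double beta2 = 0.999,
                double eps = 1e-8, double decay = 1e-2) {
    if (s.m.empty()) { s.m.assign(w.size(), 0.0); s.v.assign(w.size(), 0.0); }
    ++s.t;
    for (std::size_t i = 0; i < w.size(); ++i) {
        s.m[i] = beta1 * s.m[i] + (1.0 - beta1) * grad[i];
        s.v[i] = beta2 * s.v[i] + (1.0 - beta2) * grad[i] * grad[i];
        double mhat = s.m[i] / (1.0 - std::pow(beta1, s.t));  // bias-corrected first moment
        double vhat = s.v[i] / (1.0 - std::pow(beta2, s.t));  // bias-corrected second moment
        // Decoupled weight decay: applied to the weight itself, not folded into the gradient.
        w[i] -= lr * (mhat / (std::sqrt(vhat) + eps) + decay * w[i]);
    }
}
```

The key difference from plain Adam is the `decay * w[i]` term, which decouples weight decay from the gradient-based part of the update.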
The framework will eventually be recreated in CUDA C++ to improve performance and practicality.