Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance
JAX compilation of RDDL description files, and a differentiable planner in JAX.
Code snippets for a crash course in machine learning & biomedical engineering.
A simple deep learning library for training end-to-end fully-connected Artificial Neural Networks (ANNs), primarily based on numpy and autograd.
This project focuses on land use and land cover classification using Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). The classification task aims to predict the category of land based on satellite or aerial images.
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
Artificial neural network package written in Python
Retrieval based biomedical chatbot to answer questions related to diseases
deforce: Derivative-Free Algorithms for Optimizing Cascade Forward Neural Networks
Built a custom Adam optimizer with gradient clipping, learning-rate scheduling, and momentum updates, evaluated against two different loss functions
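As a rough illustration of how those pieces fit together in PyTorch, here is a minimal sketch; the model, data, loss choice, and schedule values are placeholders, not the repository's actual code:

```python
import torch
import torch.nn as nn

# Illustrative model and loss; nn.L1Loss() could serve as the second loss.
model = nn.Linear(10, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for epoch in range(30):
    x, y = torch.randn(64, 10), torch.randn(64, 1)  # dummy batch
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    # Clip gradients before the Adam step to bound the update magnitude.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    scheduler.step()  # decay the learning rate on a fixed schedule
```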
MetaPerceptron: Unleashing the Power of Metaheuristic-optimized Multi-Layer Perceptron - A Python Library
This repository contains code for the PhD thesis "A Study of Self-training Variants for Semi-supervised Image Classification" and related publications.
Stochastic gradient descent (often abbreviated SGD) is an iterative method for optimizing an objective function with suitable smoothness properties.
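The core update can be written in one line; a minimal sketch with placeholder names (w for the parameters, grad_fn for a stochastic gradient oracle evaluated on a mini-batch):

```python
def sgd_step(w, grad_fn, batch, lr=0.01):
    """One SGD update: w <- w - lr * g, where g is a stochastic gradient."""
    return w - lr * grad_fn(w, batch)
```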
Linear Regression SGD Optimization Implementation
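A minimal numpy sketch of what such an implementation typically looks like; the synthetic data and hyperparameters are illustrative, not taken from the repository:

```python
import numpy as np

# Synthetic data: y = X @ w_true + noise.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

w, b = np.zeros(3), 0.0
lr, epochs = 0.05, 20

for epoch in range(epochs):
    for i in rng.permutation(len(X)):   # shuffle examples each epoch
        err = X[i] @ w + b - y[i]       # residual on one example
        w -= lr * err * X[i]            # gradient of 0.5 * err**2 w.r.t. w
        b -= lr * err                   # gradient w.r.t. the bias
```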
TensorFlow/Keras callback implementing arXiv:1712.07628 (switching from Adam to SGD)
Hand-coded step-size optimizers (momentum and Adam) for deep neural networks
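For reference, hand-coded versions of the two update rules might look like this (a sketch, not the repository's code; v, m, and s start at zeros and t counts steps from 1):

```python
import numpy as np

def momentum_step(w, g, v, lr=0.01, beta=0.9):
    """Heavy-ball momentum: accumulate a velocity, then step along it."""
    v = beta * v + g
    return w - lr * v, v

def adam_step(w, g, m, s, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates scale the step."""
    m = b1 * m + (1 - b1) * g       # first moment (mean of gradients)
    s = b2 * s + (1 - b2) * g**2    # second moment (uncentered variance)
    m_hat = m / (1 - b1**t)         # bias correction for early steps
    s_hat = s / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps), m, s
```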
Object recognition AI using deep learning
Prevention of accidents in school zones using deep learning
Implementation of a neural network trained with backpropagation in Python
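A minimal sketch of a two-layer network trained with backpropagation on a toy regression task; all layer sizes, data, and hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = np.sin(X.sum(axis=1, keepdims=True))      # toy target

W1, b1 = rng.normal(scale=0.5, size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
lr = 0.05

for step in range(500):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    # Backward pass: chain rule from the MSE loss down to each weight
    d_pred = 2 * (pred - y) / len(X)          # dL/dpred for mean squared error
    dW2 = h.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_h = d_pred @ W2.T * (1 - h**2)          # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    # Gradient descent update (in-place on each parameter array)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g
```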
In compressed decentralized optimization settings, there are benefits to running multiple gossip steps between successive gradient iterations, even when the cost of doing so is appropriately accounted for, e.g., by reducing the precision of the compressed information.
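A toy sketch of the idea, assuming a ring topology with a doubly stochastic mixing matrix and a crude uniform quantizer standing in for compressed communication; every name and constant here is illustrative, not the paper's algorithm:

```python
import numpy as np

n, d = 8, 5                                   # nodes, parameter dimension
rng = np.random.default_rng(1)

W = np.zeros((n, n))                          # ring-topology mixing matrix
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def quantize(x, levels=16):
    """Crude uniform quantizer standing in for compressed messages."""
    scale = np.max(np.abs(x)) + 1e-12
    return np.round(x / scale * levels) / levels * scale

x = rng.normal(size=(n, d))                   # local iterates, one row per node
targets = rng.normal(size=(n, d))             # each node's local optimum

for it in range(100):
    grads = x - targets                       # gradient of 0.5*||x - target||^2
    x = x - 0.1 * grads                       # local gradient step
    for _ in range(3):                        # several low-precision gossip steps
        x = W @ quantize(x)                   # mix quantized neighbor values
```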