Implementation of (overlap) local SGD in PyTorch
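For context, the core idea behind local SGD is periodic parameter averaging: each worker runs several local SGD steps and only then synchronizes with the other workers. Below is a minimal PyTorch sketch of that idea, not the repository's actual code; `average_parameters`, `local_sgd_train`, and `sync_period` are illustrative names, and `torch.distributed` is assumed to be initialized elsewhere.

```python
import torch
import torch.distributed as dist

def average_parameters(model):
    # All-reduce and average model parameters across workers.
    world_size = dist.get_world_size()
    for p in model.parameters():
        dist.all_reduce(p.data, op=dist.ReduceOp.SUM)
        p.data /= world_size

def local_sgd_train(model, loader, loss_fn, lr=0.1, sync_period=8):
    # Each worker takes `sync_period` local SGD steps, then synchronizes.
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for step, (x, y) in enumerate(loader):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
        if (step + 1) % sync_period == 0:
            # Periodic averaging replaces the per-step all-reduce of standard data-parallel SGD.
            average_parameters(model)
```

Communication drops by roughly a factor of `sync_period` compared with synchronizing after every step, which is the trade-off local SGD exploits.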
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
Communication-efficient decentralized SGD (PyTorch)
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow
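The Lookahead rule referenced in the title is simple: take k "fast" optimizer steps, then move the "slow" weights a fraction alpha of the way toward the fast ones. The repository targets TensorFlow; the sketch below is a framework-agnostic NumPy illustration of that update, with `lookahead_sgd` and `grad_fn` as illustrative names rather than the repo's API.

```python
import numpy as np

def lookahead_sgd(grad_fn, w0, lr=0.1, k=5, alpha=0.5, outer_steps=100):
    # Lookahead with plain SGD as the inner ("fast") optimizer.
    slow = np.asarray(w0, dtype=float).copy()
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):                 # k steps forward
            fast -= lr * grad_fn(fast)
        slow += alpha * (fast - slow)      # 1 step back: interpolate toward the fast weights
    return slow

# Example: minimize f(w) = ||w||^2, whose gradient is 2w.
w_star = lookahead_sgd(lambda w: 2.0 * w, np.array([3.0, -2.0]))
```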
Implementation of a neural network trained with backpropagation in Python
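As a rough illustration of what such an implementation involves, here is a minimal NumPy sketch of a one-hidden-layer network trained with backpropagation on XOR; the architecture, hyperparameters, and function names are illustrative and not taken from the repository.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden=16, lr=0.5, epochs=2000, seed=0):
    # Tiny 1-hidden-layer MLP trained with backpropagation and squared-error loss.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1));          b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: chain rule applied layer by layer.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient descent updates.
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)
    return W1, b1, W2, b2

# XOR demo.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
params = train_mlp(X, y)
```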
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
MetaPerceptron: A Standardized Framework For Metaheuristic-Driven Multi-layer Perceptron Optimization
JAX compilation of RDDL description files, and a differentiable planner in JAX.
A PyTorch training framework based on a particle swarm optimization (PSO) + stochastic gradient descent (SGD) optimizer
ND-Adam is a tailored version of Adam for training DNNs.
Object recognition AI using deep learning
TensorFlow/Keras callback implementing arXiv:1712.07628
Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance
A project case study on nonlinear optimization: we implemented the Stochastic Quasi-Newton and Stochastic Proximal Gradient methods and applied both to a dictionary learning problem.
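As a concrete illustration of the stochastic proximal gradient method mentioned above, here is a NumPy sketch applied to a simpler lasso-type problem rather than dictionary learning; `soft_threshold`, `stochastic_prox_grad`, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_prox_grad(A, b, lam=0.1, lr=0.01, epochs=50, seed=0):
    # Stochastic proximal gradient for min_x (1/2n)||Ax - b||^2 + lam * ||x||_1.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = (A[i] @ x - b[i]) * A[i]              # stochastic gradient of the smooth term
            x = soft_threshold(x - lr * g, lr * lam)  # proximal step handles the L1 term
    return x
```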
In compressed decentralized optimization, there are benefits to running multiple gossip steps between consecutive gradient iterations, even when the extra communication cost is appropriately accounted for, e.g., by reducing the precision of the compressed information.
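To make that setup concrete, the sketch below interleaves several gossip (mixing) steps between gradient iterations in a plain decentralized SGD loop. It omits compression entirely and uses illustrative names (`gossip_rounds`, `decentralized_sgd`) and an assumed doubly stochastic mixing matrix `W`; it is not the paper's algorithm.

```python
import numpy as np

def gossip_rounds(X, W, rounds=3):
    # Each gossip round replaces every node's iterate with a weighted average of its neighbors'.
    # X: (n_nodes, dim) stacked local iterates; W: doubly stochastic mixing matrix.
    for _ in range(rounds):
        X = W @ X
    return X

def decentralized_sgd(grad_fn, X0, W, lr=0.05, iters=200, gossip_per_iter=3):
    # One local gradient step per node, followed by several gossip steps.
    X = X0.copy()
    for _ in range(iters):
        X = X - lr * np.stack([grad_fn(x) for x in X])   # local gradient steps
        X = gossip_rounds(X, W, rounds=gossip_per_iter)  # extra mixing between gradient iterations
    return X
```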
Prevention of accidents in school zones using deep learning
MNIST handwritten digit classification using a 3-layer neural net (98.7% accuracy)
This repository contains code for the PhD thesis "A Study of Self-training Variants for Semi-supervised Image Classification" and its associated publications.
This project focuses on land use and land cover classification using Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). The classification task aims to predict the category of land based on satellite or aerial images.
A simple deep learning library for training end-to-end fully-connected Artificial Neural Networks (ANNs), primarily based on numpy and autograd.