Title: Beyond backpropagation
DNr: Berzelius-2024-137
Project Type: LiU Berzelius
Principal Investigator: Rasmus Kjær Høier <hier@chalmers.se>
Affiliation: Chalmers tekniska högskola
Duration: 2024-04-01 – 2024-10-01
Classification: 10207
Keywords:

Abstract

Today, backpropagation is the primary learning algorithm used in deep learning. In recent years, however, interest in fully local learning algorithms has increased due to their potential for energy efficiency and speed on next-generation neuromorphic hardware. Among local learning algorithms, single-phase contrastive Hebbian learning algorithms are particularly promising, as they do not require computing derivatives explicitly and, in certain cases, can learn with neurons operating asynchronously. In this project we plan to apply such algorithms to challenging tasks. This work continues previous work carried out in our research group:
- Høier & Zach, Lifted Regression/Reconstruction Networks, BMVC 2020
- Zach, Bilevel Programs Meet Deep Learning: A Unifying View on Inference Learning Methods, 2021
- Le, Høier, Lin & Zach, AdaSTE: An Adaptive Straight-Through Estimator to Train Binary Neural Networks, CVPR 2022
- Høier, Staudt & Zach, Dual Propagation: Accelerating Contrastive Hebbian Learning with Dyadic Neurons, ICML 2023 (this paper was made possible by a previous resource allocation on the Berzelius cluster)
- Høier & Zach, A Lagrangian Perspective on Dual Propagation, MLNCP workshop @NeurIPS
- Høier & Zach, Two Tales of Single-Phase Contrastive Hebbian Learning, in review
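To illustrate the kind of local, derivative-free update discussed above, the following is a minimal sketch of a classic contrastive Hebbian weight update, not the project's dual-propagation method. All concrete choices (NumPy, a tanh nonlinearity, the random nudge standing in for the target-driven state, the learning rate) are illustrative assumptions: the weight change is driven only by the difference between "nudged" and "free" local activity correlations, with no explicit gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 4, 3
W = rng.normal(scale=0.1, size=(n_hid, n_in))  # one weight matrix for the sketch

x = rng.normal(size=n_in)                 # presynaptic activity
h_free = np.tanh(W @ x)                   # free-phase postsynaptic state
# Stand-in for the target-nudged state; in a real algorithm this would
# come from clamping or nudging the output toward the label.
h_nudged = h_free + 0.1 * rng.normal(size=n_hid)

eta = 0.01
# Local contrastive Hebbian rule: difference of outer products of
# post- and presynaptic activities in the two phases.
dW = eta * (np.outer(h_nudged, x) - np.outer(h_free, x))
W += dW
```

Each weight update uses only the activities of the two neurons it connects, which is what makes such rules attractive for asynchronous neuromorphic hardware.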