Elad Hoffer

PhD, Deep Learning Researcher


Short Bio

I hold Ph.D. (2019), M.Sc. (2016), and B.Sc. (2014) degrees in Electrical Engineering from the Technion - Israel Institute of Technology. The research leading to my thesis, titled “Deep Learning: Rethinking Common Practices”, was done under the guidance of Prof. Daniel Soudry and Prof. Nir Ailon.

My current research focuses on deep learning of representations and related topics in machine learning and computer vision.

CV is available here

Publications


Brian Chmiel, Liad Ben-Uri, Moran Shkolnik, Elad Hoffer, Ron Banner, Daniel Soudry - Neural gradients are near-lognormal: improved quantized and sparse training - ICLR 2021 [OpenReview]

Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry - Task-agnostic continual learning using online variational bayes with fixed-point updates - Neural Computation 2021 [ArXiv]

Elad Hoffer, Tal Ben-Nun, Itay Hubara, Niv Giladi, Torsten Hoefler, Daniel Soudry - Increasing batch size through instance repetition improves generalization - CVPR 2020 [ArXiv]

Matan Haroush, Itay Hubara, Elad Hoffer, Daniel Soudry - The Knowledge Within: Methods for Data-Free Model Compression - CVPR 2020 [ArXiv]

Niv Giladi, Mor Shpigel Nacson, Elad Hoffer, Daniel Soudry - At Stability’s Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks? - ICLR 2020, Spotlight (4.1% acceptance rate) [ArXiv]

Elad Hoffer, Ron Banner, Itay Golan, Daniel Soudry - Norm matters: efficient and accurate normalization schemes in deep networks - NeurIPS 2018, Spotlight (3.5% acceptance rate) [NeurIPS][ArXiv][Code]

Ron Banner, Itay Hubara, Elad Hoffer, Daniel Soudry - Scalable Methods for 8-bit Training of Neural Networks - NeurIPS 2018 [NeurIPS][ArXiv][Code]

Elad Hoffer, Itay Hubara, Daniel Soudry - Fix your classifier: the marginal value of training the last weight layer - ICLR 2018 [ArXiv]

Daniel Soudry, Elad Hoffer, Mor Shpigel Nacson, Nathan Srebro - The Implicit Bias of Gradient Descent on Separable Data - ICLR 2018 [ArXiv]

Daniel Soudry, Elad Hoffer - Exponentially vanishing sub-optimal local minima in multilayer neural networks - ICLR 2018 - workshop [ArXiv]

Elad Hoffer, Itay Hubara, Daniel Soudry - Train longer, generalize better: closing the generalization gap in large batch training of neural networks - NIPS 2017, Oral presentation (1.2% acceptance rate) [ArXiv][Code][Poster][Presentation][Video]

Elad Hoffer, Nir Ailon - Semi-supervised deep learning by metric embedding - ICLR 2017 - workshop [ArXiv][Code]

Elad Hoffer, Itay Hubara, Nir Ailon - Spatial contrasting for deep unsupervised learning - NIPS 2016 - Workshop on Interpretable Machine Learning in Complex Systems [ArXiv][Code]

Elad Hoffer, Nir Ailon - Deep metric learning using Triplet network - ICLR 2015 [ArXiv][Poster][Code]

Additional works

Elad Hoffer, Berry Weinstein, Itay Hubara, Tal Ben-Nun, Torsten Hoefler, Daniel Soudry - Mix & Match: training convnets with mixed image sizes for improved accuracy, speed and scale resiliency - SEDL NeurIPS Workshop [ArXiv]

Elad Hoffer, Berry Weinstein, Itay Hubara, Sergei Gofman, Daniel Soudry - Infer2Train: leveraging inference for better training of deep networks - NeurIPS 2018 Workshop on Systems for ML [PDF]

Chen Zeno, Itay Golan, Elad Hoffer, Daniel Soudry - Bayesian Gradient Descent: Online Variational Bayes Learning with Increased Robustness to Catastrophic Forgetting and Weight Pruning [ArXiv]

Elad Hoffer, Shai Fine, Daniel Soudry - On the Blindspots of Convolutional Networks [ArXiv]

Elad Hoffer, Itay Hubara, Nir Ailon - Deep unsupervised learning through spatial contrasting [ArXiv][Code]

Sparse Deep Learning - Merging double sparsity with Deep NN [Report] [Presentation]




Teaching

I taught a course on Deep Learning at the Technion:

Slides and tutorials are available here

Videos are available on YouTube (Thanks to Michael Zibulevsky):


Talks

Some talks I have given on my research and related topics:


You can contact me by email: elad.hoffer@gmail.com