ICLR2020: Compression based bound for non-compressed network: unified generalization error analysis of large compressible deep neural network
1) The document presents a new compression-based bound for analyzing the generalization error of large deep neural networks, even when the networks are not explicitly compressed.
2) It shows that if a trained network's weight matrices and covariance matrices exhibit approximately low-rank structure, then the network has a small intrinsic dimensionality and can be efficiently compressed.
3) This allows deriving a tighter generalization bound than existing approaches, providing insight into why overparameterized networks generalize well despite having more parameters than training examples.
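The low-rank-implies-compressible intuition behind point 2 can be illustrated with a minimal sketch (this is not the paper's actual bound or algorithm, just a toy example with synthetic weights): a weight matrix that is approximately low-rank can be replaced by a truncated SVD with far fewer parameters and little reconstruction error.

```python
import numpy as np

# Toy illustration of "small intrinsic dimensionality": a weight matrix that
# is low-rank plus small noise compresses well via truncated SVD.
rng = np.random.default_rng(0)

d_out, d_in, true_rank = 256, 256, 8
# Synthetic "trained" weights: rank-8 signal plus small noise.
W = rng.normal(size=(d_out, true_rank)) @ rng.normal(size=(true_rank, d_in))
W += 0.01 * rng.normal(size=(d_out, d_in))

# Keep only the top-r singular directions.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 8
W_hat = (U[:, :r] * s[:r]) @ Vt[:r]  # rank-r approximation

orig_params = d_out * d_in            # dense storage
compressed_params = r * (d_out + d_in + 1)  # two factors + singular values
rel_err = np.linalg.norm(W - W_hat) / np.linalg.norm(W)

print(f"params: {orig_params} -> {compressed_params}")
print(f"relative reconstruction error: {rel_err:.4f}")
```

Here the compressed representation uses roughly 6% of the original parameters while reconstructing the matrix almost exactly, which is the kind of compressibility the paper's bound exploits without ever actually compressing the network.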