Learning theory
(Image credit: Papersgraph)
These leaderboards are used to track progress in learning theory.
No benchmarks available.
Use these libraries to find learning theory models and implementations.
No datasets available.
No subtasks available.
This work enables fast, high-quality reconstruction of clinically accelerated multi-coil MR data by learning a variational network that combines the mathematical structure of variational models with deep learning.
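As a rough illustration of the unrolled-optimization idea in that summary, the sketch below treats reconstruction as learned gradient steps on a data-fidelity-plus-regularizer objective; the operator and regularizer interfaces (A, A_adj, reg_grad) are assumptions for illustration, not the paper's implementation.

```python
# Hedged sketch of a variational-network-style reconstruction: unrolled gradient
# steps on  ||A x - y||^2 + R_theta(x). The operator/regularizer interfaces
# are illustrative assumptions, not the paper's code.
def variational_net_recon(y, A, A_adj, reg_grad, steps=10, step_size=0.1):
    """y: undersampled multi-coil k-space data;
    A / A_adj: forward operator (coil sensitivities + FFT + sampling) and its adjoint;
    reg_grad(x, t): gradient of the learned regularizer at unrolled step t."""
    x = A_adj(y)                                  # zero-filled initial reconstruction
    for t in range(steps):
        data_grad = A_adj(A(x) - y)               # gradient of the data-fidelity term
        x = x - step_size * (data_grad + reg_grad(x, t))
    return x
```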
This work models personalized recommendation of news articles as a contextual bandit problem, a principled approach in which a learning algorithm sequentially selects articles to serve users based on contextual information about the users and articles, while simultaneously adapting its article-selection strategy based on user-click feedback to maximize total user clicks.
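For concreteness, here is a minimal LinUCB-style loop for the contextual-bandit formulation described above; the class and parameter names are illustrative, and the exact estimator used in the paper may differ.

```python
import numpy as np

# Minimal LinUCB-style sketch of the contextual-bandit recommendation loop.
class LinUCBArm:
    def __init__(self, dim, alpha=1.0):
        self.A = np.eye(dim)          # ridge-regression Gram matrix
        self.b = np.zeros(dim)        # accumulated reward-weighted features
        self.alpha = alpha            # exploration strength

    def ucb(self, x):
        A_inv = np.linalg.inv(self.A)
        theta = A_inv @ self.b                        # estimated click-rate model
        return theta @ x + self.alpha * np.sqrt(x @ A_inv @ x)

    def update(self, x, reward):
        self.A += np.outer(x, x)
        self.b += reward * x

def recommend(arms, context_features):
    """Pick the article whose upper confidence bound on expected clicks is largest."""
    scores = [arm.ucb(x) for arm, x in zip(arms, context_features)]
    return int(np.argmax(scores))
```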
This work derives a procedure that allows for learning from all available sources, yet automatically suppresses irrelevant or corrupted data, and shows that this method provides significant improvements over alternative approaches from robust statistics and distributed optimization.
A novel measure-theoretic theory for machine learning that does not require statistical assumptions is introduced, and a new regularization method in deep learning is derived and shown to outperform previous methods on CIFAR-10, CIFAR-100, and SVHN.
Sparse neural networks attract increasing interest as they exhibit comparable performance to their dense counterparts while being computationally efficient. Pruning dense neural networks is among the most widely used methods to obtain a sparse neural network. Driven by the high training cost of such methods, which can be unaffordable for a low-resource device, training sparse neural networks sparsely from scratch has recently gained attention. However, existing sparse training algorithms suffer from various issues, including poor performance in high sparsity scenarios, computing dense gradient information during training, or pure random topology search. In this paper, inspired by the evolution of the biological brain and the Hebbian learning theory, we present a new sparse training approach that evolves sparse neural networks according to the behavior of neurons in the network. Concretely, by exploiting the cosine similarity metric to measure the importance of the connections, our proposed method, "Cosine similarity-based and random topology exploration (CTRE)", evolves the topology of sparse neural networks by adding the most important connections to the network without calculating dense gradients in the backward pass. We carried out different experiments on eight datasets, including tabular, image, and text datasets, and demonstrated that our proposed method outperforms several state-of-the-art sparse training algorithms in extremely sparse neural networks by a large margin. The implementation code is available on GitHub.
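The prune-and-grow step based on cosine similarity could look roughly like the following; the evolve fraction, how activations are collected, and the zero-initialization of new connections are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

# Rough sketch of a cosine-similarity-driven prune-and-grow step in the spirit of CTRE.
def evolve_layer(mask, weights, pre_acts, post_acts, evolve_frac=0.1):
    """mask, weights: (n_in, n_out) connectivity and weight matrices;
    pre_acts: (batch, n_in) activations entering the layer;
    post_acts: (batch, n_out) activations leaving it."""
    n_evolve = int(evolve_frac * mask.sum())

    # 1) prune the weakest existing connections (smallest weight magnitude)
    scores = np.where(mask > 0, np.abs(weights), np.inf)
    drop = np.unravel_index(np.argsort(scores, axis=None)[:n_evolve], mask.shape)
    mask[drop] = 0

    # 2) grow connections between the most co-active neuron pairs (Hebbian-style),
    #    ranked by cosine similarity of their activation vectors over the batch
    pre_n = pre_acts / (np.linalg.norm(pre_acts, axis=0, keepdims=True) + 1e-8)
    post_n = post_acts / (np.linalg.norm(post_acts, axis=0, keepdims=True) + 1e-8)
    cos_sim = pre_n.T @ post_n                    # (n_in, n_out) similarity matrix
    cos_sim[mask > 0] = -np.inf                   # only consider absent connections
    grow = np.unravel_index(np.argsort(cos_sim, axis=None)[-n_evolve:], mask.shape)
    mask[grow] = 1
    weights[grow] = 0.0                           # new connections start at zero
    return mask, weights
```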
This paper argues that continual learning methods can benefit from splitting the capacity of the learner across multiple models, and develops a method named Model Zoo which grows an ensemble of small models, each of which is trained during one episode of continual learning.
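A bare-bones version of that model-splitting idea might look like the sketch below, with one small model added per episode and predictions averaged across the ensemble; the factory/fit/predict interface is assumed for illustration and omits the paper's task-selection and training details.

```python
# Hedged sketch of an ensemble that grows by one small model per continual-learning episode.
class ModelZoo:
    def __init__(self, make_model):
        self.make_model = make_model   # factory returning a small, trainable learner
        self.models = []

    def learn_episode(self, inputs, targets):
        model = self.make_model()
        model.fit(inputs, targets)     # train the new small model on this episode's data
        self.models.append(model)

    def predict(self, x):
        # ensemble the zoo: average the individual predictions
        preds = [m.predict(x) for m in self.models]
        return sum(preds) / len(preds)
```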
It is constructively proved that, with just an appropriate choice of activation function, any positive-semidefinite dot-product kernel can be realized as either the NNGP or neural tangent kernel of a fully-connected neural network with only one hidden layer.
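In standard infinite-width notation (assumed here, not quoted from the page), the kernel in question for a one-hidden-layer network is the expectation below; the cited result constructs an activation realizing a given positive-semidefinite dot-product kernel.

```latex
% NNGP kernel of a one-hidden-layer fully-connected network with activation \sigma
% and i.i.d. standard Gaussian input weights (standard notation, assumed here):
\[
  K_{\mathrm{NNGP}}(x, x') \;=\; \mathbb{E}_{w \sim \mathcal{N}(0, I_d)}
  \bigl[\, \sigma(w^{\top} x)\, \sigma(w^{\top} x') \,\bigr].
\]
% The constructive result chooses \sigma so that, for unit-norm inputs,
% K_{\mathrm{NNGP}}(x, x') = \kappa(\langle x, x' \rangle) for a prescribed
% positive-semidefinite dot-product kernel \kappa (and analogously for the NTK).
```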
This paper analyzes Boolean formulas associated with model-sampling benchmarks, combinatorial optimization problems, and random 3-CNFs with varying degrees of constrainedness, indicating that neural learning generalizes better than pure rule-based systems and purely symbolic approaches.
A unified PAC-Bayesian motivated informativeness measure, PABI, is proposed that characterizes the uncertainty reduction provided by incidental supervision signals, and its effectiveness is demonstrated by quantifying the value added by various types of incidental signals to sequence tagging tasks.