Probabilistic programming languages (PPLs) are designed to describe probabilistic models and then perform inference in those models. PPLs are closely related to graphical models and Bayesian networks, but are more expressive and flexible. (Image credit: Michael Betancourt)
The TensorFlow Distributions library implements a vision of probability theory adapted to the modern deep-learning paradigm of end-to-end differentiable computation and enables modular construction of high-dimensional distributions and transformations that were not possible with previous libraries.
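A minimal sketch of that modular design, using the tfp.distributions and tfp.bijectors namespaces (the particular model is illustrative): a base distribution is pushed through an invertible transformation, and densities follow by change of variables.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

# Build a distribution by pushing a base Gaussian through an
# invertible, differentiable transformation (a bijector).
base = tfd.MultivariateNormalDiag(loc=tf.zeros(3), scale_diag=tf.ones(3))
dist = tfd.TransformedDistribution(distribution=base, bijector=tfb.Exp())

x = dist.sample(10)    # draws from the transformed distribution
lp = dist.log_prob(x)  # density via the change-of-variables formula
```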
A dynamic mechanism is introduced for exploiting analytically tractable substructure in probabilistic programs, using conjugate priors and affine transformations to reduce variance in Monte Carlo estimators.
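As a toy illustration of why such substructure helps (this is a conjugate Normal-Normal example, not the paper's algorithm): the closed form that conjugacy provides replaces a noisy Monte Carlo estimate with an exact, zero-variance value.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu0, s0, s, y = 0.0, 1.0, 0.5, 1.2  # prior mean/std, likelihood std, observation

# Naive Monte Carlo: sample x ~ N(mu0, s0), average the likelihood N(y; x, s).
x = rng.normal(mu0, s0, size=10_000)
mc_estimate = norm.pdf(y, loc=x, scale=s).mean()

# Conjugacy: the Normal family is closed under this affine structure,
# so the marginal p(y) = N(y; mu0, sqrt(s0^2 + s^2)) is available exactly.
exact = norm.pdf(y, loc=mu0, scale=np.sqrt(s0**2 + s**2))
print(mc_estimate, exact)
```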
Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
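PyMC is one library that exposes ADVI behind roughly this interface; a minimal sketch, with an illustrative model and data:

```python
import numpy as np
import pymc as pm

data = np.random.default_rng(0).normal(1.0, 2.0, size=200)

# The scientist supplies only the model and the data ...
with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)

    # ... and ADVI derives the variational approximation automatically.
    approx = pm.fit(n=20_000, method="advi")
    idata = approx.sample(1_000)  # draws from the fitted approximation
```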
A method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages is introduced, establishing a framework that combines the strengths of probabilistic programming and deep learning methods.
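A hedged PyTorch sketch of the amortization idea (the model, network, and training setup are all illustrative): simulate latent/observation pairs from the generative model, then train a network to map observations to proposal parameters by maximizing the proposal's log-density of the true latent.

```python
import torch
import torch.nn as nn

# Illustrative generative model: x ~ N(0, 1), y ~ N(x, 0.5).
def simulate(n):
    x = torch.randn(n)
    y = x + 0.5 * torch.randn(n)
    return x, y

# Amortized proposal: a network maps an observation y to the
# parameters of a Gaussian q(x | y).
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2_000):
    x, y = simulate(256)              # training pairs from the prior
    out = net(y.unsqueeze(-1))
    mean, log_std = out[:, 0], out[:, 1]
    q = torch.distributions.Normal(mean, log_std.exp())
    loss = -q.log_prob(x).mean()      # maximize E_p(x,y)[log q(x|y)]
    opt.zero_grad(); loss.backward(); opt.step()
```

At test time the trained network produces a cheap proposal for any new observation, which is the amortization the summary describes.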
A novel probabilistic programming framework is presented that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose inference engines to record and control random number draws within simulators in a language-agnostic way.
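The actual protocol runs across processes and languages; the single-process Python sketch below (all names are hypothetical) only illustrates the core idea of an inference engine sitting between a simulator and its random draws, recording or overriding each one by address.

```python
import random

class Controller:
    """Stands in for an inference engine on the other side of the protocol."""
    def __init__(self, replay=None):
        self.trace, self.replay = {}, replay or {}

    def sample(self, address, sampler):
        # Either replay a value chosen by the inference engine,
        # or record the simulator's own draw at this address.
        value = self.replay.get(address, sampler())
        self.trace[address] = value
        return value

def simulator(ctl):
    # The simulator routes every random draw through the controller.
    rate = ctl.sample("rate", lambda: random.gammavariate(2.0, 1.0))
    count = ctl.sample("count", lambda: random.expovariate(rate))
    return count

ctl = Controller(replay={"rate": 1.5})  # the engine pins one draw, records the rest
simulator(ctl)
print(ctl.trace)
```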
This document starts with a discussion of model-based reasoning, explains why conditioning as a foundational computation is central to probabilistic machine learning and artificial intelligence, and introduces a simple first-order probabilistic programming language whose programs define static-computation-graph, finite-variable-cardinality models.
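A minimal sketch of the two primitives such a language rests on, sample and observe, under a likelihood-weighting handler (the names are hypothetical, not the tutorial's notation): conditioning enters by having observed data contribute to a trace's weight rather than being sampled.

```python
import math, random

class Weighted:
    """Run a program once, accumulating log-likelihood at observe statements."""
    def __init__(self):
        self.log_weight = 0.0

    def sample(self, sampler):
        return sampler()

    def observe(self, log_density, value):
        # Conditioning: the observation weights the trace instead of being drawn.
        self.log_weight += log_density(value)

def model(h):
    mu = h.sample(lambda: random.gauss(0.0, 1.0))
    h.observe(lambda y: -0.5 * (y - mu) ** 2 - 0.5 * math.log(2 * math.pi), 0.8)
    return mu

# Importance sampling: weighted traces approximate the posterior over mu.
traces = [(model(h), h.log_weight) for h in (Weighted() for _ in range(1_000))]
```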
A novel PPL framework is presented that couples directly to existing scientific simulators through a cross-platform probabilistic execution protocol and provides Markov chain Monte Carlo (MCMC) and deep-learning-based inference compilation (IC) engines for tractable inference.
The power of composing Pyro's effect handlers with the program transformations that enable hardware acceleration, automatic differentiation, and vectorization in JAX is demonstrated.
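A small sketch of that composition in NumPyro (the model is illustrative): effect handlers such as the documented handlers.seed and handlers.trace give semantics to sample statements, and the handled program is an ordinary JAX function that can be jit-compiled and vmapped.

```python
import jax
import numpyro
import numpyro.distributions as dist
from numpyro import handlers

def model():
    x = numpyro.sample("x", dist.Normal(0.0, 1.0))
    return numpyro.sample("y", dist.Normal(x, 1.0))

def draw(key):
    # seed supplies randomness; trace records every sample site.
    tr = handlers.trace(handlers.seed(model, key)).get_trace()
    return tr["y"]["value"]

keys = jax.random.split(jax.random.PRNGKey(0), 1_000)
ys = jax.jit(jax.vmap(draw))(keys)  # 1,000 vectorized, compiled executions
```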
A domain-specific language, Scenic, is designed as a probabilistic programming language for describing scenarios that are distributions over scenes; it allows assigning distributions to features of the scene, as well as declaratively imposing hard and soft constraints over the scene.
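Scenic has its own syntax; the plain-Python sketch below (entirely hypothetical, not Scenic code) only mimics the semantics just described: scene features drawn from distributions, hard constraints enforced by rejection, and soft constraints required to hold with high probability.

```python
import random

def propose_scene():
    # Features of the scene are random variables.
    return {
        "ego_speed": random.uniform(5.0, 15.0),
        "car_gap": random.gauss(8.0, 3.0),
    }

def hard_ok(scene):
    return scene["car_gap"] > 2.0      # hard constraint: never violated

def soft_ok(scene):
    return scene["ego_speed"] < 12.0   # soft constraint: holds with prob. ~0.9

def sample_scene():
    while True:
        scene = propose_scene()
        if hard_ok(scene) and (soft_ok(scene) or random.random() < 0.1):
            return scene               # rejection sampling over scenes
```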
It is proved that optimizing the encoder over any class of universal approximators, such as deterministic neural networks, is enough to come arbitrarily close to the optimum, and this framework, which holds for any metric space and prior, is advertised as a sweet spot of current generative autoencoding objectives.