3260 papers • 126 benchmarks • 313 datasets
The field research, design, and comparative deployment of a multimodal medical-imaging user interface for breast screening are described, and recommendations from the radiologists for guiding the future design of medical-imaging interfaces are summarized.
It is shown that, in the context of object detection, training variance networks with the negative log-likelihood (NLL) can lead to high-entropy predictive distributions regardless of the correctness of the output mean; the energy score is therefore proposed as a non-local proper scoring rule, and training with it is found to yield better-calibrated, lower-entropy predictive distributions than NLL.
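To illustrate the contrast between the two scoring rules, the following sketch Monte-Carlo-estimates the univariate energy score, ES(F, y) = E|X − y| − ½E|X − X′|, for a Gaussian predictive distribution and compares it with the Gaussian NLL. The means, scales, and sample count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_nll(y, mu, sigma):
    # Negative log-likelihood of observation y under N(mu, sigma^2)
    return 0.5 * np.log(2.0 * np.pi * sigma**2) + (y - mu)**2 / (2.0 * sigma**2)

def energy_score(y, mu, sigma, n_samples=4096):
    # Monte Carlo estimate of the univariate energy score:
    # ES = E|X - y| - 0.5 * E|X - X'|, with X, X' ~ N(mu, sigma^2)
    x = rng.normal(mu, sigma, n_samples)
    x_prime = rng.normal(mu, sigma, n_samples)
    return np.mean(np.abs(x - y)) - 0.5 * np.mean(np.abs(x - x_prime))

# A sharp prediction near the truth scores better (lower) than a diffuse one
print(energy_score(0.1, mu=0.0, sigma=0.5), energy_score(0.1, mu=0.0, sigma=5.0))
print(gaussian_nll(0.1, mu=0.0, sigma=0.5), gaussian_nll(0.1, mu=0.0, sigma=5.0))
```

Both rules are proper, but the energy score depends on the whole sample cloud rather than only the density at the observation, which is the "non-local" property the paper exploits.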
A probabilistic deep learning methodology is presented that enables the construction of predictive data-driven surrogates for stochastic systems, can accommodate high-dimensional inputs and outputs, and is able to return predictions with quantified uncertainty.
A large-scale benchmark of existing state-of-the-art methods on classification problems and the effect of dataset shift on accuracy and calibration is presented, finding that traditional post-hoc calibration does indeed fall short, as do several other previous methods.
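Calibration in benchmarks of this kind is commonly summarized with the expected calibration error (ECE). The following is a minimal sketch of the standard binned estimator; the bin count and equal-width binning scheme are generic conventions, not details taken from the paper.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    # ECE: weighted average gap between per-bin accuracy and
    # per-bin mean confidence, over equal-width confidence bins.
    confidences = probs.max(axis=1)
    correct = (probs.argmax(axis=1) == labels)
    err = 0.0
    for lo in np.linspace(0.0, 1.0, n_bins, endpoint=False):
        hi = lo + 1.0 / n_bins
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            err += mask.mean() * abs(correct[mask].mean() - confidences[mask].mean())
    return err

# Overconfident classifier: 95% confidence but only 50% accuracy -> ECE = 0.45
probs = np.array([[0.95, 0.05], [0.95, 0.05]])
labels = np.array([0, 1])
print(expected_calibration_error(probs, labels))  # 0.45
```

Dataset shift tends to increase this gap because confidence stays high while accuracy drops, which is the failure mode the benchmark measures.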
This paper proposes an augmented optimization objective that imposes a probabilistic structure on the latent data space and utilizes maximum mean discrepancy (MMD) to detect, during inference, potentially catastrophic misspecifications that would undermine the validity of the obtained results.
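A minimal sketch of the kind of RBF-kernel MMD estimator such a check could use, comparing latent samples against samples from the assumed prior. The kernel bandwidth, dimensions, and sample sizes here are illustrative, not the paper's settings.

```python
import numpy as np

def mmd_rbf(x, y, gamma=1.0):
    # Biased MMD^2 estimate with RBF kernel k(a, b) = exp(-gamma * ||a - b||^2):
    # large values indicate the two sample sets come from different distributions.
    def gram(a, b):
        sq_dists = (a**2).sum(1)[:, None] + (b**2).sum(1)[None, :] - 2.0 * a @ b.T
        return np.exp(-gamma * sq_dists)
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

rng = np.random.default_rng(0)
prior = rng.normal(size=(256, 2))              # samples from the assumed latent prior
latents_ok = rng.normal(size=(256, 2))         # latents from a well-specified model
latents_bad = rng.normal(3.0, 1.0, (256, 2))   # latents from a misspecified model
print(mmd_rbf(prior, latents_ok), mmd_rbf(prior, latents_bad))
```

A large MMD between inferred latents and the prior flags a model whose latent space no longer matches the imposed probabilistic structure.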
Density predictions for 24h-ahead load forecasting are shown to compare favorably against Gaussian and Gaussian-mixture densities and to outperform a non-parametric approach based on the pinball loss, especially in low-data scenarios.
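The pinball (quantile) loss that such a non-parametric baseline trains on can be written in a few lines; this is a generic sketch, not the paper's implementation, and the example values are invented.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    # Quantile loss at level tau: under-prediction (y > q_pred) is
    # penalized with weight tau, over-prediction with weight (1 - tau).
    diff = y - q_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# e.g. a 0.9-quantile forecast of 8 against an observed load of 10
print(pinball_loss(np.array([10.0]), np.array([8.0]), 0.9))  # 1.8
```

Minimizing this loss over a grid of tau values yields a set of quantile forecasts that together approximate the predictive density.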
Spectral-normalized Neural Gaussian Process (SNGP) is presented: a simple method that improves the distance-awareness of modern DNNs with two changes: applying spectral normalization to the hidden weights to enforce bi-Lipschitz smoothness in the representations, and replacing the last output layer with a Gaussian-process layer.
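The two changes can be sketched in numpy: spectral normalization rescales a weight matrix so its largest singular value is bounded, and a random-Fourier-feature layer approximates an RBF Gaussian-process output. This is a simplified illustration under assumed dimensions and iteration counts, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_normalize(W, n_iter=50):
    # Estimate the largest singular value of W by power iteration,
    # then rescale so the spectral norm is at most 1 (only shrink, never grow).
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v
    return W / max(sigma, 1.0)

class RFFGaussianProcessLayer:
    # Random Fourier features approximating an RBF-kernel GP output layer:
    # distances in representation space are preserved in the feature map.
    def __init__(self, in_dim, n_features=256):
        self.Omega = rng.normal(size=(in_dim, n_features))
        self.b = rng.uniform(0.0, 2.0 * np.pi, n_features)
        self.scale = np.sqrt(2.0 / n_features)

    def __call__(self, h):
        return self.scale * np.cos(h @ self.Omega + self.b)
```

The spectral bound keeps the hidden mapping from collapsing or stretching distances, so the GP layer's distance-based uncertainty remains meaningful in input space.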
A novel probabilistic deep learning model for angular regression is proposed that uses von Mises distributions to predict a distribution over object pose angle; on a number of challenging pose-estimation datasets, the model is demonstrated to produce calibrated probability predictions and point estimates competitive with or superior to the current state of the art.
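The von Mises negative log-likelihood that such a model would minimize has a closed form; a minimal sketch follows, using `np.i0` (the modified Bessel function of order zero), with illustrative parameter values.

```python
import numpy as np

def von_mises_nll(theta, mu, kappa):
    # NLL of angle theta under von Mises(mu, kappa):
    #   -log p(theta) = log(2 * pi * I0(kappa)) - kappa * cos(theta - mu)
    # kappa plays the role of a concentration (inverse-variance) parameter.
    return np.log(2.0 * np.pi * np.i0(kappa)) - kappa * np.cos(theta - mu)
```

Unlike a Gaussian on raw angles, this likelihood respects the wrap-around at 2π, and a predicted kappa near zero recovers a uniform (maximally uncertain) distribution over angles.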
DeepTract, a deep-learning framework for estimating white-matter fiber orientation and performing streamline tractography, adopts a data-driven approach to fiber reconstruction from diffusion-weighted images (DWI) that does not assume a specific diffusion model.
The hybrid model, despite the invertibility constraints, achieves similar accuracy to purely predictive models, and the generative component remains a good model of the input features despite the hybrid optimization objective.