3260 papers • 126 benchmarks • 313 datasets
This work proposes a technique that enables us to input topological signatures to deep neural networks and learn a task-optimal representation during training, realized as a novel input layer with favorable theoretical properties.
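The core idea — a fixed-size, differentiable vector built from a variable-size set of persistence points — can be illustrated with a minimal sketch. This is not the paper's actual layer: the structure-element parameterization, names, and values below are all hypothetical stand-ins for what would be learnable parameters in a real network.

```python
import math

def topo_input_layer(diagram, centers, sigmas):
    """Hypothetical sketch of an input layer for topological signatures:
    each (birth, death) point in the persistence diagram is evaluated
    against a set of parameterized Gaussians, and the responses are
    summed. The output has a fixed size (one entry per Gaussian)
    regardless of how many points the diagram contains, and is
    differentiable in `centers` and `sigmas`."""
    out = [0.0] * len(centers)
    for birth, death in diagram:
        for k, ((cx, cy), s) in enumerate(zip(centers, sigmas)):
            dist2 = (birth - cx) ** 2 + (death - cy) ** 2
            out[k] += math.exp(-dist2 / (2.0 * s ** 2))
    return out

# Example: a 2-point diagram mapped to a 3-dimensional vector.
vec = topo_input_layer(
    [(0.1, 0.4), (0.2, 0.9)],
    centers=[(0.1, 0.4), (0.5, 0.5), (0.9, 0.9)],
    sigmas=[0.2, 0.2, 0.2],
)
```

In a real implementation the centers and widths would be trained end-to-end with the rest of the network, which is what makes the representation task-optimal rather than fixed in advance.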
This work converts a persistence diagram (PD) to a finite-dimensional vector representation, called a persistence image, and proves the stability of this transformation with respect to small perturbations in the inputs.
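The construction can be sketched in a few lines: map each diagram point to (birth, persistence) coordinates, weight it so the diagonal contributes nothing, and rasterize with a Gaussian kernel. The grid resolution, bandwidth, and bounds below are illustrative defaults, not the paper's choices.

```python
import math

def persistence_image(diagram, resolution=10, sigma=0.1, lo=0.0, hi=1.0):
    """Minimal sketch of a persistence image: rasterize a persistence
    diagram (list of (birth, death) pairs) onto a resolution x resolution
    grid over [lo, hi]^2 in (birth, persistence) coordinates. Each point
    is weighted linearly by its persistence, so points near the diagonal
    (noise) contribute little, and smoothed with a Gaussian kernel."""
    step = (hi - lo) / resolution
    img = [[0.0] * resolution for _ in range(resolution)]
    norm = 1.0 / (2.0 * math.pi * sigma ** 2)
    for birth, death in diagram:
        pers = death - birth
        weight = pers  # linear weighting, vanishes on the diagonal
        for i in range(resolution):
            for j in range(resolution):
                x = lo + (j + 0.5) * step  # birth axis (pixel center)
                y = lo + (i + 0.5) * step  # persistence axis
                d2 = (x - birth) ** 2 + (y - pers) ** 2
                img[i][j] += weight * norm * math.exp(-d2 / (2.0 * sigma ** 2))
    return img

img = persistence_image([(0.1, 0.5), (0.2, 0.3)])
```

The resulting grid can be flattened into a fixed-length feature vector for any standard classifier, which is the point of the representation.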
This paper harnesses the state of the art in persistent homology computation by studying the problems of determining topological prevalence and matching cycles using a cohomological approach, which increases their feasibility and applicability across a wider variety of applications and contexts.
A variation of the mapper construction targeting weighted, undirected graphs is developed, called mapper on graphs, which generates homology-preserving skeletons of graphs; the work also shows how adjusting a single parameter enables multi-scale skeletonization of the input graph.
This work proposes neural persistence, a complexity measure for neural network architectures based on topological data analysis of weighted stratified graphs, and derives a neural persistence-based stopping criterion that shortens training while achieving accuracy comparable to early stopping based on validation loss.
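A per-layer score in this spirit can be sketched with union-find: treat the layer as a bipartite graph, process edges in order of decreasing normalized absolute weight, and record zero-dimensional persistence whenever an edge merges two components. This is a hedged simplification of the measure, not the paper's exact definition (which handles vertex filtration values and normalization more carefully).

```python
def neural_persistence(weights):
    """Sketch of a neural-persistence-style score for one dense layer,
    given as a rows x cols weight matrix. Absolute weights are normalized
    to [0, 1]; edges are added in descending order; each edge that merges
    two connected components at filtration value w contributes (1 - w)
    to the persistence, and the score is the 2-norm of these values."""
    rows, cols = len(weights), len(weights[0])
    wmax = max(abs(w) for row in weights for w in row) or 1.0
    edges = sorted(
        ((abs(weights[i][j]) / wmax, i, rows + j)
         for i in range(rows) for j in range(cols)),
        reverse=True)

    parent = list(range(rows + cols))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    total = 0.0
    for w, u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:  # this edge kills a connected component
            parent[ru] = rv
            total += (1.0 - w) ** 2
    return total ** 0.5

score = neural_persistence([[1.0, 0.0], [0.0, 1.0]])
```

Intuitively, a layer whose large weights quickly connect all units scores differently from one whose connectivity only emerges at small weights, which is what makes the quantity usable as a training diagnostic.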
This work presents the first scalable solution for exploring and analyzing high-dimensional functions often encountered in scientific data analysis pipelines, combining a new streaming neighborhood-graph construction, the corresponding topology computation, and a novel data-aggregation scheme: topology-aware datacubes.
This work proposes a novel framework, called Markov-Lipschitz deep learning (MLDL), to tackle geometric deterioration caused by collapse, twisting, or crossing in vector-based neural network transformations for manifold-based representation learning and manifold data generation.
A statistical model for stationary ergodic point processes is proposed, estimated from a single realization observed in a square window. Inspired by recent work on gradient descent algorithms for sampling maximum-entropy models, it allows fast sampling of new configurations that reproduce the statistics of the given observation.
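The sampling idea — descend from a random configuration toward one whose summary statistics match those of the observation — can be shown with a toy example. Everything here is a stand-in: the statistics are just the mean and variance of a 1-D configuration, whereas the paper matches much richer point-process statistics.

```python
import random

def sample_matching_stats(target_stats, n, steps=500, lr=0.1):
    """Toy sketch of gradient-descent sampling: start from a random
    configuration x and minimize the squared distance between its
    statistics (mean, variance) and the target statistics, by explicit
    gradient steps on each coordinate."""
    random.seed(0)  # fixed seed for reproducibility of the sketch
    x = [random.random() for _ in range(n)]
    tm, tv = target_stats
    for _ in range(steps):
        m = sum(x) / n
        v = sum((xi - m) ** 2 for xi in x) / n
        for i in range(n):
            # d/dx_i of (m - tm)^2 + (v - tv)^2
            grad = 2.0 * (m - tm) / n + 4.0 * (v - tv) * (x[i] - m) / n
            x[i] -= lr * grad
    return x

x = sample_matching_stats((0.5, 0.01), n=20)
```

Different random initializations yield different configurations with (approximately) the same statistics, which is the sense in which the procedure samples new realizations.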