Split data into groups while taking into account prior knowledge in the form of constraints on points, groups of points, or clusters.
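As a concrete illustration of the task, the sketch below implements a minimal COP-KMeans-style assignment step that respects must-link and cannot-link pairwise constraints. It is a simplified sketch, not code from any paper or library listed on this page; the function names and the greedy feasibility check are illustrative assumptions.

```python
# Minimal COP-KMeans-style sketch: a k-means assignment step that respects
# must-link (ML) and cannot-link (CL) pairwise constraints.
# Illustrative only -- names and simplifications are assumptions,
# not taken from any library or from the papers listed on this page.
import numpy as np

def _violates(i, c, labels, must_link, cannot_link):
    """True if assigning point i to cluster c would break a constraint."""
    for a, b in must_link:
        if i in (a, b):
            j = b if a == i else a
            if labels[j] not in (-1, c):      # ML partner already elsewhere
                return True
    for a, b in cannot_link:
        if i in (a, b):
            j = b if a == i else a
            if labels[j] == c:                # CL partner already in cluster c
                return True
    return False

def constrained_kmeans(X, k, must_link=(), cannot_link=(), n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        labels[:] = -1
        for i in rng.permutation(len(X)):
            # try clusters from nearest to farthest, keep the first feasible one;
            # a point with no feasible cluster stays unassigned (label -1),
            # where the original COP-KMeans would abort instead
            for c in np.argsort(((centers - X[i]) ** 2).sum(axis=1)):
                if not _violates(i, c, labels, must_link, cannot_link):
                    labels[i] = c
                    break
        for c in range(k):                    # recompute centroids
            if np.any(labels == c):
                centers[c] = X[labels == c].mean(axis=0)
    return labels
```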
These leaderboards are used to track progress in Constrained Clustering.
No benchmarks available.
Use these libraries to find Constrained Clustering models and implementations.
No datasets available.
RepCONC is a novel retrieval model that learns discrete Representations via CONstrained Clustering and substantially outperforms a wide range of existing retrieval models in terms of retrieval effectiveness, memory efficiency, and time efficiency.
This work proposes smoothing the Wasserstein distance with an entropic regularizer, recovering a strictly convex objective whose gradients can be computed at a considerably lower computational cost using matrix scaling algorithms.
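The matrix-scaling idea can be sketched in a few lines of NumPy. The following is a generic Sinkhorn iteration for the entropy-regularized transport problem, not the cited paper's implementation; the function name and example data are made up for illustration.

```python
# Entropy-regularized optimal transport via Sinkhorn matrix scaling
# (a generic sketch of the idea, not the cited paper's implementation).
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """a, b: histograms summing to 1; C: pairwise cost matrix."""
    K = np.exp(-C / reg)                 # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # scale columns to match marginal b
        u = a / (K @ v)                  # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]      # approximate transport plan
    return float(np.sum(P * C))          # smoothed transport cost

# Example: two histograms on a 1-D grid with squared-distance cost
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.5, 0.2, 0.1, 0.1, 0.1])
b = np.array([0.1, 0.1, 0.1, 0.2, 0.5])
print(sinkhorn(a, b, C))
```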
This work proposes a hybrid model of Structurally Regularized Deep Clustering, termed H-SRDC, which integrates the regularized discriminative clustering of target data with a generative one and outperforms all existing methods under both the inductive and transductive settings.
This paper presents a more natural and principled formulation of constrained spectral clustering, which explicitly encodes the constraints as part of a constrained optimization problem, and demonstrates an innovative use of encoding a large number of constraints: transfer learning via constraints.
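One common relaxation folds pairwise constraints into the spectral objective through a constraint matrix Q (+1 for must-link, -1 for cannot-link) and embeds points with the smallest eigenvectors of a penalized Laplacian L - γQ. The sketch below shows only this generic penalty relaxation, under assumed names, and is not the paper's exact constrained formulation.

```python
# Penalty-style constrained spectral clustering sketch. Illustrative only:
# a generic relaxation (smallest eigenvectors of L - gamma * Q),
# not the exact constrained optimization problem formulated in the paper.
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def constrained_spectral(W, must_link, cannot_link, k=2, gamma=1.0):
    """W: symmetric affinity matrix; constraints are lists of index pairs."""
    n = W.shape[0]
    Q = np.zeros((n, n))                      # pairwise constraint matrix
    for i, j in must_link:
        Q[i, j] = Q[j, i] = 1.0               # reward similar embeddings
    for i, j in cannot_link:
        Q[i, j] = Q[j, i] = -1.0              # push embeddings apart
    L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian
    _, vecs = eigh(L - gamma * Q)             # eigenvalues in ascending order
    emb = vecs[:, :k]                         # k smallest eigenvectors
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(emb)
```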
A novel method for transfer learning across domains and tasks, formulated as a learning-to-cluster problem, with state-of-the-art results on the challenging cross-task problem, applied to Omniglot and ImageNet.
This work introduces a principled and theoretically sound spectral method for k-way clustering in signed graphs, where the affinity measure between nodes takes either positive or negative values.
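For intuition, a standard signed-Laplacian baseline (not the paper's specific method) replaces the degree matrix with absolute degrees and clusters the smallest eigenvectors of the signed Laplacian, as in this sketch:

```python
# Signed-Laplacian spectral baseline for k-way clustering of a signed graph.
# A standard baseline shown for intuition, not the paper's proposed method.
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def signed_spectral(A, k):
    """A: symmetric adjacency matrix with positive and negative edge weights."""
    D_bar = np.diag(np.abs(A).sum(axis=1))    # absolute-degree matrix
    L_bar = D_bar - A                         # signed Laplacian (positive semidefinite)
    _, vecs = eigh(L_bar)                     # eigenvalues in ascending order
    emb = vecs[:, :k]                         # k smallest eigenvectors as embedding
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(emb)
```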
This work proposes an intriguing scheme, called deep constrained dominant sets (DCDS), which treats the person-image retrieval problem as a constrained clustering optimization problem, and shows that the proposed method can outperform state-of-the-art methods.
The experimental results show that the proposed method exploits the constraints to achieve perfect clustering performance, with improvements of $2-5\%$ in classical clustering metrics, e.g., the Adjusted Rand Index (ARI), Mirkin's Index (MI), and Hubert's Index (HI), outperforming all compared methods across the board.
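For reference, the Adjusted Rand Index used above is available in scikit-learn; Mirkin's and Hubert's indices are related pairwise-agreement measures and are omitted from this short example.

```python
# Evaluating a clustering against ground truth with the Adjusted Rand Index.
from sklearn.metrics import adjusted_rand_score

labels_true = [0, 0, 1, 1, 2, 2]
labels_pred = [1, 1, 0, 0, 2, 2]   # same partition, permuted cluster ids
print(adjusted_rand_score(labels_true, labels_pred))  # 1.0: perfect agreement
```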
Extensive simulations on realistic head geometries, as well as empirical results on various MEG datasets, demonstrate the high recovery performance of ecd-MTLasso and its primary practical benefit: offering a statistically principled way to threshold MEG/EEG source maps.