3260 papers • 126 benchmarks • 313 datasets
Instead of randomly dropping parts of the network as in MC-dropout, Masksembles relies on a fixed number of binary masks, parameterized in a way that allows the correlations between individual models to be controlled.
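The contrast with MC-dropout can be illustrated with a minimal sketch: a small, fixed set of binary masks is generated once and reused for every prediction, so each mask defines one deterministic ensemble member. The mask-generation scheme and the linear "sub-model" below are hypothetical simplifications, not the paper's actual parameterization.

```python
import random

random.seed(0)

def make_masks(n_masks, width, keep_prob=0.5):
    """Generate a FIXED set of binary masks; in the real method the
    parameterization controls how correlated the sub-models are
    (here keep_prob is just an illustrative knob)."""
    return [[1 if random.random() < keep_prob else 0 for _ in range(width)]
            for _ in range(n_masks)]

def masked_forward(weights, features, mask):
    # A single linear "sub-model": masked-out features are zeroed.
    return sum(w * f * m for w, f, m in zip(weights, features, mask))

weights = [0.2, -0.5, 0.8, 0.1]
features = [1.0, 2.0, 0.5, 3.0]
masks = make_masks(n_masks=4, width=4)

# Unlike MC-dropout, the same fixed masks are reused at every prediction,
# so each ensemble member is deterministic and repeatable.
predictions = [masked_forward(weights, features, m) for m in masks]
ensemble_mean = sum(predictions) / len(predictions)
```

Because the masks are fixed, calling the ensemble twice on the same input gives identical predictions, which MC-dropout's per-call random sampling does not.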
This work proposes gossip learning, a generic approach in which multiple models take random walks over the network in parallel, each applying an online learning algorithm to improve itself, with the models finally combined via ensemble learning methods.
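The scheme can be sketched as a toy simulation: each node holds local data, a model hops to a random neighbor after an online SGD pass, and several independent walks are averaged at the end. The ring topology, the 1-D regression task, and sequential (rather than truly parallel) walks are all simplifying assumptions for illustration.

```python
import random

random.seed(1)

# Each node holds local samples of the same 1-D relation y = 2*x
# (true slope 2.0); the topology is a simple 5-node ring.
nodes = [[(x, 2.0 * x) for x in (1.0, 2.0, 3.0)] for _ in range(5)]
neighbors = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}

def sgd_step(w, data, lr=0.05):
    # One online pass over the current node's local samples.
    for x, y in data:
        w -= lr * (w * x - y) * x
    return w

def random_walk(w, start, steps):
    node = start
    for _ in range(steps):
        w = sgd_step(w, nodes[node])           # learn from the current node
        node = random.choice(neighbors[node])  # hop to a random neighbor
    return w

# Several models walk the network "in parallel" (simulated sequentially
# here), then get ensembled by simple averaging.
models = [random_walk(0.0, start=s, steps=30) for s in range(3)]
ensemble = sum(models) / len(models)  # converges near the true slope 2.0
```

Averaging is the simplest combination rule; the paper's point is that any ensemble method can sit on top of the walking models.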
A novel way to minimize investment risk in the stock market by predicting a stock's returns with ensemble learning, here realized as an ensemble of multiple decision trees.
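The tree-ensemble idea can be sketched with bagged decision stumps: each depth-1 "tree" is fit on a bootstrap resample and the ensemble votes on whether the return is positive. The single-feature toy data below is invented for illustration and is not real market data or the paper's model.

```python
import random

random.seed(42)

# Toy dataset: feature = yesterday's return, label = 1 if the next
# return is positive (illustrative values only).
data = [(-2.0, 0), (-1.5, 0), (-1.0, 0), (-0.5, 0),
        (0.5, 1), (1.0, 1), (1.5, 1), (2.0, 1)]

def fit_stump(sample):
    """Depth-1 'tree': choose the threshold (among observed feature
    values) with the fewest training errors for the rule x > t -> 1."""
    best = (None, float("inf"))
    for t, _ in sample:
        errors = sum((x > t) != bool(y) for x, y in sample)
        if errors < best[1]:
            best = (t, errors)
    return best[0]

def bootstrap(sample):
    return [random.choice(sample) for _ in sample]

# Bagging: each stump sees its own bootstrap resample.
stumps = [fit_stump(bootstrap(data)) for _ in range(25)]

def predict(x):
    # Majority vote of the stumps.
    votes = sum(x > t for t in stumps)
    return 1 if votes > len(stumps) / 2 else 0
```

Bootstrap resampling decorrelates the stumps, so the majority vote is more stable than any single threshold rule, which is the core appeal of tree ensembles for noisy targets like returns.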
This paper explores the use of knowledge distillation to improve a Multi-Task Deep Neural Network (MT-DNN) (Liu et al., 2019) for learning text representations across multiple natural language understanding tasks, and shows that the distilled MT-DNN significantly outperforms the original MT-DNN on 7 out of 9 GLUE tasks.
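The core distillation term can be sketched generically: the student is trained to match the teacher's temperature-softened output distribution rather than just its hard label. This is a generic sketch of the standard soft-target loss, not the MT-DNN paper's exact multi-task training recipe (which distills from an ensemble of teachers per task).

```python
import math

def softmax(logits, temperature=1.0):
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's softened distribution -- the core soft-target term."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))

# The student is pushed toward the teacher's full output distribution,
# which carries more information than the argmax label alone.
teacher = [3.0, 1.0, 0.2]
aligned = distillation_loss([3.0, 1.0, 0.2], teacher)
misaligned = distillation_loss([0.2, 1.0, 3.0], teacher)  # higher loss
```

A temperature above 1 flattens both distributions, exposing the teacher's relative confidence across wrong classes (the "dark knowledge" the student learns from).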
This paper provides a holistic view of Etsy's promoted listings' CTR prediction system and proposes an ensemble learning approach which is based on historical or behavioral signals for older listings as well as content-based features for new listings, and compares the system to non-trivial baselines on a large-scale real world dataset from Etsy.
A new theoretical framework for binary classification, the Strategy for Unsupervised Multiple Method Aggregation (SUMMA), is developed to estimate the performances of base classifiers and an optimal strategy for ensemble learning from unlabeled data.
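The flavor of label-free aggregation can be sketched with a deliberately simplified stand-in: weight each base classifier by its average agreement with the others, computed without any ground-truth labels, then take a weighted vote. This is not SUMMA's actual rank-statistic/covariance estimator; it only illustrates that relative performance can be inferred from unlabeled predictions alone.

```python
def agreement_weights(predictions):
    """predictions[i][j] is classifier i's +/-1 vote on sample j.
    Weight = average pairwise agreement with the other classifiers
    (a crude, label-free proxy for accuracy)."""
    n = len(predictions)
    weights = []
    for i in range(n):
        agree = 0.0
        for k in range(n):
            if k == i:
                continue
            agree += sum(a == b for a, b in zip(predictions[i], predictions[k]))
        weights.append(agree / ((n - 1) * len(predictions[i])))
    return weights

def weighted_vote(predictions, weights):
    scores = [sum(w * p[j] for w, p in zip(weights, predictions))
              for j in range(len(predictions[0]))]
    return [1 if s > 0 else -1 for s in scores]

# Two fairly consistent classifiers and one near-random one, six samples.
preds = [[1, 1, -1, -1, 1, -1],
         [1, 1, -1, -1, -1, -1],
         [-1, 1, 1, -1, 1, 1]]
w = agreement_weights(preds)      # the outlier classifier gets less weight
labels = weighted_vote(preds, w)
```

The intuition this shares with SUMMA: when base classifiers are better than chance and make independent errors, their mutual agreement statistics reveal which ones to trust, with no labels required.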
The results suggest that models trained in the DebiasedDTA framework achieve improved generalizability in predicting the interactions of previously unseen biomolecules, as well as performance improvements on previously seen ones.
Imbalanced-learn is an open-source Python toolbox aiming to provide a wide range of methods for coping with the imbalanced datasets frequently encountered in machine learning and pattern recognition.
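The simplest of the resampling strategies such a toolbox offers is random oversampling of the minority class; a minimal stdlib sketch (not imbalanced-learn's implementation or API) looks like this:

```python
import random
from collections import Counter

random.seed(0)

def random_oversample(X, y):
    """Duplicate minority-class samples at random until every class
    reaches the size of the largest class."""
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    target = max(len(v) for v in by_class.values())
    X_res, y_res = [], []
    for label, samples in by_class.items():
        resampled = samples + [random.choice(samples)
                               for _ in range(target - len(samples))]
        X_res.extend(resampled)
        y_res.extend([label] * target)
    return X_res, y_res

X = [[0.1], [0.2], [0.3], [0.9], [1.0], [1.1], [1.2], [1.3]]
y = [1, 1, 1, 0, 0, 0, 0, 0]          # class 1 is the minority
X_bal, y_bal = random_oversample(X, y)  # both classes now have 5 samples
```

The library itself goes well beyond this, with methods such as SMOTE-style synthetic oversampling and various undersampling and ensemble strategies, all behind a scikit-learn-compatible interface.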
A novel multi-level network embedding framework BoostNE, which can learn multiple node embeddings of different granularity from coarse to fine without imposing the prevalent global low-rank assumption, is proposed.
An ensemble learning framework is applied to combine statistical features with the outputs of the deep classifiers, with the goal of utilizing complementary information to address the noisy-label problem within the framework.
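A minimal sketch of this stacking idea: concatenate hand-crafted statistical features with a deep classifier's output, then train a simple meta-learner on the combined vector. The feature functions, the stand-in "deep" output, and the perceptron-style meta-learner below are all hypothetical choices for illustration, not the paper's architecture.

```python
def stats_features(signal):
    # Hand-crafted statistical features: mean and variance.
    mean = sum(signal) / len(signal)
    var = sum((s - mean) ** 2 for s in signal) / len(signal)
    return [mean, var]

def deep_output(signal):
    # Stand-in for a trained deep classifier's class probability.
    return [min(1.0, max(0.0, 0.5 + 0.3 * signal[-1]))]

def meta_predict(weights, bias, features):
    score = bias + sum(w * f for w, f in zip(weights, features))
    return 1 if score > 0 else 0

def train_meta(samples, labels, epochs=20, lr=0.1):
    # Perceptron-style meta-learner over the stacked feature vectors.
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for feats, label in zip(samples, labels):
            err = label - meta_predict(weights, bias, feats)
            weights = [w + lr * err * f for w, f in zip(weights, feats)]
            bias += lr * err
    return weights, bias

signals = [[0.1, 0.2, 0.9], [0.0, 0.1, 0.8],
           [-0.2, -0.1, -0.9], [-0.1, 0.0, -0.8]]
labels = [1, 1, 0, 0]
# Stacking: statistical features and the classifier output are concatenated.
stacked = [stats_features(s) + deep_output(s) for s in signals]
weights, bias = train_meta(stacked, labels)
```

The meta-learner can lean on whichever source is more reliable per sample, which is how complementary information helps when some labels are noisy.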