3260 papers • 126 benchmarks • 313 datasets
Predicting one or more scalar values for an entire time-series example.
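A minimal illustration of the task (a hypothetical sketch, not the method of any paper listed here; all function names are illustrative): summarize each whole series into a few hand-crafted features, then fit an ordinary least-squares model mapping series to a scalar target.

```python
def featurize(series):
    """Summarize a whole series as (mean, last-minus-first trend, 1.0 bias)."""
    mean = sum(series) / len(series)
    trend = series[-1] - series[0]
    return [mean, trend, 1.0]

def fit_ols(X, y):
    """Solve the normal equations (X'X) w = X'y by Gaussian elimination."""
    d = len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(len(X))) for k in range(d)]
         for j in range(d)]
    b = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(d)]
    # Forward elimination with partial pivoting.
    for col in range(d):
        piv = max(range(col, d), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, d):
            f = A[r][col] / A[col][col]
            for c in range(col, d):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    w = [0.0] * d
    for r in range(d - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, d))) / A[r][r]
    return w

def predict(w, series):
    """Score a new, unseen series with the fitted weights."""
    return sum(wi * fi for wi, fi in zip(w, featurize(series)))
```

Most methods on this page replace the hand-crafted `featurize` step with learned representations, but the overall shape of the problem (whole series in, scalar out) is the same.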
(Image credit: Papersgraph)
These leaderboards are used to track progress in Time Series Regression.
Use these libraries to find Time Series Regression models and implementations.
A novel framework for multivariate time series representation learning based on the transformer encoder architecture, which can offer substantial performance benefits over fully supervised learning on downstream tasks, even without leveraging additional unlabeled data, i.e., by reusing the existing data samples.
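The unsupervised objective behind this style of pretraining masks parts of the input series and trains the encoder to reconstruct them, which is how existing samples can be reused without extra labels. A toy sketch of span masking and the reconstruction loss (helper names and hyperparameters here are illustrative, not from the paper):

```python
import random

def mask_spans(series, mask_ratio=0.3, mean_span=3, rng=None):
    """Zero out contiguous spans covering roughly mask_ratio of the series.
    A model pretrained this way must predict the original values at the
    masked positions from the surrounding context."""
    rng = rng or random.Random(0)
    n = len(series)
    mask = [False] * n
    target = int(mask_ratio * n)
    masked = 0
    while masked < target:
        start = rng.randrange(n)
        length = max(1, int(rng.expovariate(1.0 / mean_span)))
        for i in range(start, min(n, start + length)):
            if not mask[i]:
                mask[i] = True
                masked += 1
    masked_series = [0.0 if m else x for x, m in zip(series, mask)]
    return masked_series, mask

def reconstruction_loss(pred, original, mask):
    """Mean squared error, computed only at the masked positions."""
    errs = [(p - o) ** 2 for p, o, m in zip(pred, original, mask) if m]
    return sum(errs) / len(errs)
```

After pretraining on this objective, the encoder's representations can be reused for downstream regression or classification heads.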
A framework for general probabilistic multi-step time series regression that exploits the expressiveness and temporal nature of Recurrent Neural Networks, the nonparametric nature of Quantile Regression and the efficiency of Direct Multi-Horizon Forecasting is proposed.
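The nonparametric ingredient here is quantile (pinball) loss: minimizing it with a constant forecast recovers the empirical quantile rather than the mean, which is what makes the outputs probabilistic. A minimal sketch (function names are illustrative):

```python
def pinball_loss(y_true, y_pred, q):
    """Quantile (pinball) loss: under-prediction is penalized by q and
    over-prediction by (1 - q), so the minimizer is the q-quantile."""
    diff = y_true - y_pred
    return q * diff if diff >= 0 else (q - 1) * diff

def best_constant_forecast(values, q):
    """Brute-force the constant forecast minimizing average pinball loss;
    checking the observed values themselves as candidates is sufficient."""
    def avg_loss(c):
        return sum(pinball_loss(v, c, q) for v in values) / len(values)
    return min(values, key=avg_loss)
```

In the full framework, an RNN emits one such quantile estimate per horizon step and per quantile level, trained jointly on the summed pinball losses.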
This paper introduces the first TSER benchmarking archive, which contains 19 datasets from different domains with varying numbers of dimensions, unequal-length dimensions, and missing values, and provides an initial benchmark of existing models.
The optimal changepoint in RESPERM maximizes Cohen’s effect size with the parameters estimated by the permutation of residuals in a linear model, making it the method of choice in neuroscience, medicine and many other fields.
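The core selection criterion can be sketched as a scan over candidate split points, keeping the one with the largest Cohen's effect size between the two segments (an illustrative scan only; RESPERM additionally estimates the parameters by permuting residuals of a linear model, which is not shown here):

```python
import math

def cohens_d(a, b):
    """Cohen's effect size between two samples, pooled-SD version."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    pooled = math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb)
                       / (len(a) + len(b) - 2))
    return abs(ma - mb) / pooled

def best_changepoint(series, min_seg=2):
    """Return the split index maximizing Cohen's d between the two halves."""
    best_t, best_d = None, -1.0
    for t in range(min_seg, len(series) - min_seg + 1):
        d = cohens_d(series[:t], series[t:])
        if d > best_d:
            best_t, best_d = t, d
    return best_t, best_d
```

The scan is O(n) candidate splits, each scored in O(n), which is tractable for the series lengths typical of the neuroimaging data the method targets.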
The proposed multiobjective HFIT produced less complex and highly accurate models compared to those produced by most of the other methods, and is an efficient and competitive alternative to the other FISs for function approximation and feature selection.
The heterogeneous creation of HFNT proved efficient in building an ensemble system from the final population, and comprehensive tests over classification, regression, and time-series datasets demonstrated the efficiency of the proposed algorithm over other available prediction methods.
A new statistical model is proposed that borrows from digital signal processing by recasting the predictors and response as convolutionally-related signals, using recent advances in machine learning to fit latent impulse response functions (IRFs) of arbitrary shape.
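The signal-processing view is that the response at time t is a weighted sum of current and past predictor values, with the weights given by the impulse response function. A minimal sketch of that causal convolution (here with a fixed IRF; the paper's contribution is fitting IRFs of arbitrary shape as latent functions):

```python
def convolve(signal, irf):
    """Discrete causal convolution: out[t] = sum_k irf[k] * signal[t - k].
    The IRF says how strongly a predictor value k steps in the past still
    influences the response now."""
    out = []
    for t in range(len(signal)):
        s = 0.0
        for k, w in enumerate(irf):
            if t - k >= 0:
                s += w * signal[t - k]
        out.append(s)
    return out
```

A unit impulse simply reads the IRF back out, which is a handy sanity check when fitting such models.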
The debiased central limit theorem for low-dimensional groups of regression coefficients is established and the HAC estimator of the long-run variance based on the sg-LASSO residuals is studied, which leads to valid time-series inference for individual regression coefficients as well as groups, including Granger causality tests.
Disc-Opt methods can achieve similar performance as Opt-Disc at inference with drastically reduced training costs using neural ODEs for time-series regression and continuous normalizing flows (CNFs).
The results show that the state-of-the-art TSC algorithm Rocket, when adapted for regression, achieves the highest overall accuracy compared to adaptations of other TSC algorithms and state-of-the-art machine learning (ML) algorithms such as XGBoost, Random Forest, and Support Vector Regression.
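Adapting Rocket for regression keeps its transform and swaps the classifier head for a regressor. A simplified sketch of the transform (real Rocket also randomizes dilation, padding, and bias, and uses thousands of kernels; this toy version only keeps random weights):

```python
import random

def random_kernels(n_kernels, rng):
    """Draw short convolutional kernels with random Gaussian weights."""
    return [[rng.gauss(0, 1) for _ in range(rng.choice([3, 5]))]
            for _ in range(n_kernels)]

def transform(series, kernels):
    """Slide each kernel over the series and record two features per kernel:
    the maximum response and the proportion of positive values (PPV)."""
    feats = []
    for k in kernels:
        resp = [sum(w * series[i + j] for j, w in enumerate(k))
                for i in range(len(series) - len(k) + 1)]
        feats.append(max(resp))
        feats.append(sum(r > 0 for r in resp) / len(resp))
    return feats
```

The resulting fixed-length feature vectors are then fed to a linear model (a ridge regressor in the regression adaptation), which is what makes the method fast to train.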
Adding a benchmark result helps the community track progress.