Deep learning (DL) models can tackle time series analysis tasks with great success. However, their performance can degrade rapidly if the data are not appropriately normalized. This issue is even more apparent when DL is used for financial time series forecasting, where the nonstationary and multimodal nature of the data poses significant challenges and severely affects the performance of DL models. In this brief, a simple yet effective neural layer is proposed that adaptively normalizes the input time series while taking into account the distribution of the data. The proposed layer is trained end-to-end using backpropagation and leads to significant performance improvements over the other normalization schemes evaluated. Unlike traditional normalization methods, it learns how to perform normalization for a given task instead of applying a fixed scheme, while still being directly applicable to any new time series without retraining. Its effectiveness is demonstrated on a large-scale limit order book data set as well as a load forecasting data set.
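A minimal sketch of such an adaptive normalization layer, assuming a DAIN-style design in which per-feature summary statistics are passed through learnable linear maps before shifting and scaling the input (the weight matrices here are hypothetical stand-ins; in the paper they are trained end-to-end by backpropagation):

```python
import numpy as np

def adaptive_normalize(x, W_shift, W_scale, eps=1e-8):
    """Sketch of an adaptive input-normalization layer.

    x: array of shape (n_features, n_timesteps).
    W_shift, W_scale: (n_features, n_features) matrices that would be
    learned by backpropagation; here they are plain NumPy arrays.
    """
    # Adaptive shift: per-feature mean over time, transformed by a
    # learnable linear map, then subtracted from the input.
    mu = x.mean(axis=1, keepdims=True)                  # (n_features, 1)
    x = x - W_shift @ mu
    # Adaptive scale: per-feature RMS, again passed through a
    # learnable linear map before dividing.
    sigma = np.sqrt((x ** 2).mean(axis=1, keepdims=True))
    x = x / (W_scale @ sigma + eps)
    return x

# Identity-initialized weights reduce the layer to ordinary z-score
# normalization, a natural starting point before training adapts them.
rng = np.random.default_rng(0)
series = rng.normal(loc=5.0, scale=3.0, size=(2, 100))
I = np.eye(2)
out = adaptive_normalize(series, I, I)
```

With identity weights each feature comes out with zero mean and (approximately) unit scale; training would move the weights away from the identity wherever a task-specific normalization works better.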
This work focuses on feedforward and recurrent neural networks, sequence-to-sequence models, and temporal convolutional neural networks, along with architectural variants that are well known in the signal processing community but novel to the load forecasting community.
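The core building block of a temporal convolutional network is the causal dilated convolution, which can be sketched as follows (a minimal, hypothetical version; real TCNs stack many such layers with residual connections):

```python
import numpy as np

def causal_dilated_conv(x, kernel, dilation=1):
    """One causal dilated convolution over a 1-D series.

    The output at time t depends only on x[t], x[t-d], x[t-2d], ...,
    so no future values leak into the forecast.
    """
    k, d = len(kernel), dilation
    pad = (k - 1) * d
    xp = np.concatenate([np.zeros(pad), x])  # left-padding enforces causality
    return np.array([
        sum(kernel[j] * xp[t + pad - j * d] for j in range(k))
        for t in range(len(x))
    ])

# A 2-tap difference filter with dilation 2 compares each point with
# the value two steps earlier (zeros where no history exists yet).
y = causal_dilated_conv(np.arange(8, dtype=float),
                        np.array([1.0, -1.0]), dilation=2)
# → [0. 1. 2. 2. 2. 2. 2. 2.]
```

Stacking layers with exponentially growing dilations (1, 2, 4, ...) is what gives TCNs their long receptive field at modest depth.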
A Building-to-Building Transfer Learning framework was applied to the Transformer model, and the proposed approach improved forecasting accuracy by 56.8% compared with conventional deep learning trained from scratch.
The density predictions for 24 h-ahead load forecasting compare favorably against Gaussian and Gaussian-mixture densities and outperform a non-parametric approach based on the pinball loss, especially in low-data scenarios.
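The pinball (quantile) loss used by the non-parametric baseline penalizes under- and over-prediction asymmetrically, so that minimizing it at level q yields the q-th quantile of the load distribution. A minimal sketch (standard formula; variable names are illustrative):

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball loss at quantile level q in (0, 1).

    Under-prediction (y_true > y_pred) is weighted by q,
    over-prediction by (1 - q).
    """
    err = y_true - y_pred
    return np.mean(np.maximum(q * err, (q - 1) * err))

# At the median (q = 0.5) the pinball loss is half the mean absolute error.
y = np.array([10.0, 12.0, 14.0])
yhat = np.array([11.0, 12.0, 12.0])
loss = pinball_loss(y, yhat, 0.5)
# → 0.5 (MAE here is 1.0)
```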
On the popular UCR benchmark of 85 time series datasets, WEASEL is more accurate than the best current non-ensemble algorithms at orders-of-magnitude lower classification and training times, and it is almost as accurate as ensemble classifiers, whose computational complexity makes them inapplicable even to mid-size datasets.
The proposed model can integrate domain knowledge and researchers' understanding of the task through different neural network building blocks and has high generalization capability.
A novel bio-inspired metaheuristic that simulates how the coronavirus spreads and infects healthy people is proposed and applied to electricity load time series forecasting, showing remarkable performance.
The proposed deep neural network modelling approach, based on the N-BEATS architecture, is very effective at solving the mid-term load forecasting (MTLF) problem and provides statistically significant relative gains in terms of the MAPE error metric.
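MAPE, the metric in which those relative gains are reported, is the mean absolute error expressed as a percentage of the actual load. A short sketch of the standard formula (variable names are illustrative):

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent.

    Undefined when y_true contains zeros, which is rarely an issue
    for electricity load series.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# A forecast that is 5% high at every step scores a MAPE of 5.
actual = np.array([100.0, 200.0, 400.0])
forecast = actual * 1.05
score = mape(actual, forecast)
# → 5.0
```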
This paper introduces mobility as a measure of economic activity to complement existing building blocks of forecasting algorithms and designs a transfer learning scheme that enables knowledge transfer between several different geographical regions.