These leaderboards are used to track progress in solar irradiance forecasting.
Generative and discriminative models perform comparably when preprocessing is applied to the infrared images, and Markov random fields achieve the best performance among the generative models, both unsupervised and supervised.
The predictions demonstrate that the proposed unified-architecture approach is effective for multi-time-scale solar forecasts, achieving a lower root-mean-square prediction error than the best-performing methods documented in the literature.
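Root-mean-square error is the comparison metric named above. A minimal sketch of how it is typically computed for irradiance forecasts (the sample values are illustrative only, not from the paper):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error between observed and forecast irradiance (W/m^2)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Illustrative values only (not from the paper).
observed = [820.0, 640.0, 455.0]
forecast = [800.0, 660.0, 450.0]
print(round(rmse(observed, forecast), 2))  # 16.58
```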
This investigation studies how well unsupervised learning techniques detect the number of cloud layers in infrared sky images acquired with an innovative infrared sky imager mounted on a solar tracker, and finds that the sequential hidden Markov model achieves higher detection accuracy than the Bayesian metrics.
The Transformer deep neural network, whose attention mechanism is typically applied to NLP or vision problems, is extended to solar irradiance prediction by combining features according to their spatiotemporal properties, giving better results than the directly competing methods.
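The attention mechanism referred to above is, in its standard form, scaled dot-product attention. A minimal numpy sketch (the 4-step, 8-feature input is a hypothetical stand-in for spatiotemporal irradiance features, not the paper's setup):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted sum of values

rng = np.random.default_rng(0)
# Hypothetical input: 4 time steps of 8-dimensional spatiotemporal features.
X = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(X, X, X)         # self-attention
print(out.shape)  # (4, 8)
```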
A novel solar irradiance forecasting model is proposed that represents atmospheric parameters observed at multiple stations as an attributed dynamic network and analyzes temporal changes in the network by extending existing spatio-temporal graph convolutional network (ST-GCN) models.
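The spatial half of an ST-GCN reduces to repeated graph-convolution steps over the station network. A minimal sketch of one such step, assuming a hypothetical 3-station graph with two features (irradiance, temperature) per station, using the common symmetric normalization; this is a generic GCN layer, not the paper's exact extension:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution step: normalized adjacency @ features @ weights.

    A: (N, N) station adjacency (1 if two stations are linked),
    X: (N, F) node features, W: (F, F_out) learnable weights.
    """
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))          # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)  # ReLU

# Hypothetical 3-station network, 2 features per station.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[650., 25.], [480., 22.], [710., 27.]])  # irradiance, temperature
W = np.full((2, 2), 0.1)
print(gcn_layer(A, X, W).shape)  # (3, 2)
```

Stacking such layers along the time axis (one graph per observation step) is what lets the network track temporal changes in station relationships.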
The proposed anticipative transformer-based model effectively learns to attend only to the relevant features in sky images in order to forecast irradiance, capturing long-range dependencies between sky images to achieve a forecasting skill of 21.45% for 15-minute-ahead prediction.
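Forecasting skill, as commonly defined in the irradiance literature, is the relative RMSE improvement over a reference forecast (often smart persistence). A sketch of that common definition with illustrative numbers (not the paper's data or its exact reference model):

```python
import numpy as np

def forecast_skill(y_true, y_model, y_reference):
    """Forecast skill in percent: 100 * (1 - RMSE_model / RMSE_reference).

    A (smart-)persistence forecast is a common choice of reference.
    """
    rmse = lambda y, p: np.sqrt(np.mean((np.asarray(y) - np.asarray(p)) ** 2))
    return 100.0 * (1.0 - rmse(y_true, y_model) / rmse(y_true, y_reference))

# Illustrative values only (not from the paper).
obs         = [700., 650., 600., 550.]
model       = [690., 655., 590., 560.]
persistence = [700., 700., 650., 600.]
print(round(forecast_skill(obs, model, persistence), 2))  # 79.18
```

A skill of 21.45% thus means the model's RMSE is 21.45% lower than the reference forecast's RMSE at that horizon.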
This work introduces a novel network architecture, termed Input Convex Lipschitz Recurrent Neural Networks (ICLRNNs), which seamlessly integrates the benefits of convexity and Lipschitz continuity, enabling fast and robust neural network-based modeling and optimization.
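One simple way to combine the two properties named here is to keep layer weights nonnegative (so a convex, nondecreasing activation preserves convexity in the input) and to rescale weights so their spectral norm stays bounded (giving a Lipschitz guarantee). The sketch below is my illustrative construction under those assumptions, not the paper's exact ICLRNN architecture:

```python
import numpy as np

def icl_layer(W, b, x, lip_bound=1.0):
    """Hypothetical input-convex, Lipschitz-bounded layer (sketch only).

    Convexity in x: nonnegative weights + convex nondecreasing ReLU.
    Lipschitz bound: rescale W so its spectral norm is at most lip_bound.
    """
    W = np.maximum(W, 0.0)                 # nonnegative weights preserve convexity
    sigma = np.linalg.norm(W, 2)           # largest singular value of W
    if sigma > lip_bound:
        W = W * (lip_bound / sigma)        # enforce ||W||_2 <= lip_bound
    return np.maximum(W @ x + b, 0.0)      # ReLU is convex and 1-Lipschitz

rng = np.random.default_rng(1)
W = rng.standard_normal((3, 3))
b = np.zeros(3)
x1, x2 = rng.standard_normal(3), rng.standard_normal(3)
y1, y2 = icl_layer(W, b, x1), icl_layer(W, b, x2)
# With lip_bound=1, output change is bounded by input change:
print(np.linalg.norm(y1 - y2) <= np.linalg.norm(x1 - x2) + 1e-9)  # True
```

The Lipschitz bound is what makes downstream optimization over such a network robust to input perturbations, which is the benefit the abstract highlights.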