Abstract
The increasing integration of photovoltaic (PV) power into energy grids necessitates accurate forecasting to enhance grid stability and optimize energy management. However, traditional machine learning models often lack interpretability, raising concerns about the reliability of and trust in AI-driven predictions. This study applies Explainable AI (XAI) techniques, including SHapley Additive exPlanations (SHAP), Local Interpretable Model-agnostic Explanations (LIME), and feature importance plots, to improve the transparency of a hybrid Least Absolute Shrinkage and Selection Operator-Random Forest Regression (LASSO-RFR) model for PV power forecasting in Abuja, Nigeria. The hybrid model was trained on meteorological datasets from 2020–2022 and evaluated using key performance metrics, namely Root Mean Squared Error (RMSE) and R-squared (R²). The results indicate that the hybrid LASSO-RFR framework enhances prediction accuracy while maintaining model interpretability. XAI analyses revealed that surface pressure, dew point temperature, and air temperature were the most influential variables in forecasting solar radiation. The findings underscore the importance of integrating XAI tools into AI-driven energy forecasting models to improve transparency, stakeholder confidence, and practical deployment in real-world scenarios.
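To make the pipeline described in the abstract concrete, the sketch below shows one plausible reading of a hybrid LASSO-RFR workflow with SHAP explanations, using scikit-learn and the shap package: LASSO prunes the meteorological inputs, a Random Forest regressor is fitted on the retained features, performance is scored with RMSE and R², and SHAP attributes predictions to individual inputs. This is a minimal illustration, not the authors' code; the feature names and synthetic data are hypothetical stand-ins for the Abuja dataset.

```python
# Hedged sketch of a LASSO -> Random Forest -> SHAP pipeline (not the paper's code).
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.feature_selection import SelectFromModel
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score
import shap

rng = np.random.default_rng(42)
# Illustrative meteorological features; synthetic data stands in for the 2020-2022 records.
features = ["surface_pressure", "dew_point_temp", "air_temperature",
            "relative_humidity", "wind_speed"]
X = pd.DataFrame(rng.normal(size=(1000, len(features))), columns=features)
y = 0.6 * X["air_temperature"] - 0.4 * X["dew_point_temp"] + rng.normal(0, 0.1, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1: LASSO shrinks uninformative coefficients to zero, yielding a feature subset.
lasso = LassoCV(cv=5).fit(X_train, y_train)
selector = SelectFromModel(lasso, prefit=True)
kept = X.columns[selector.get_support()]

# Stage 2: Random Forest regression on the LASSO-selected features.
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X_train[kept], y_train)

# Evaluation with the metrics named in the abstract: RMSE and R-squared.
pred = rf.predict(X_test[kept])
rmse = np.sqrt(mean_squared_error(y_test, pred))
r2 = r2_score(y_test, pred)
print(f"RMSE={rmse:.3f}  R2={r2:.3f}")

# SHAP attributes each forecast to the selected meteorological inputs.
explainer = shap.TreeExplainer(rf)
shap_values = explainer.shap_values(X_test[kept])
shap.summary_plot(shap_values, X_test[kept], show=False)
```

A summary plot of this kind is one way the influence of variables such as surface pressure, dew point temperature, and air temperature could be visualized; LIME would be applied analogously for local, per-prediction explanations.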
| Original language | English |
| --- | --- |
| Title of host publication | 2025 International Russian Smart Industry Conference (SmartIndustryCon) |
| Publisher | IEEE |
| Pages | 1117-1122 |
| Number of pages | 6 |
| ISBN (Electronic) | 9798331511241 |
| DOIs | |
| Publication status | Published - 9 May 2025 |
Keywords
- LASSO
- LIME
- Random Forest Regression
- SHAP
- explainable AI
- photovoltaic power forecasting
- renewable energy
ASJC Scopus subject areas
- Modelling and Simulation
- Artificial Intelligence
- Computer Science Applications
- Decision Sciences (miscellaneous)
- Information Systems and Management
- Civil and Structural Engineering
- Industrial and Manufacturing Engineering