Adaptive Explainable Continual Learning Framework for Regression Problems with Focus on Power Forecasts
Y. He. Organic Computing -- Doctoral Dissertation Colloquium 2021, kassel university press, (2022)
Compared with traditional deep learning techniques, continual learning enables deep neural networks to learn continually and adaptively.
As the amount of data in applications keeps increasing, deep neural networks must learn unseen tasks while avoiding catastrophic forgetting of the knowledge obtained from previously learned tasks.
This article proposes two continual learning application scenarios, namely the target-domain incremental scenario and the data-domain incremental scenario, to describe the potential challenges in this context.
Building on our previous work on the CLeaR (Continual Learning for Regression) framework, models will be enabled to extend themselves and to learn from data successively.
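To make the idea of such successive learning concrete, the following is a minimal, hypothetical sketch of a CLeaR-style update cycle: a regression model predicts on streaming samples, samples whose prediction error exceeds a threshold are treated as novelties and buffered, and a full buffer triggers a finetuning update. The class name, the one-dimensional linear model, and all thresholds are illustrative placeholders, not the actual CLeaR implementation.

```python
class CLeaRSketch:
    """Illustrative continual-learning loop for regression (hypothetical, simplified)."""

    def __init__(self, threshold=1.0, buffer_size=5, lr=0.01):
        self.w = 0.0                 # weight of a 1-D linear model y = w * x + b
        self.b = 0.0
        self.threshold = threshold   # error threshold for novelty detection
        self.buffer = []             # stores detected novelties until an update
        self.buffer_size = buffer_size
        self.lr = lr

    def predict(self, x):
        return self.w * x + self.b

    def step(self, x, y):
        """Process one streaming sample; return True if an update was triggered."""
        err = abs(self.predict(x) - y)
        if err > self.threshold:     # novelty detected -> buffer the sample
            self.buffer.append((x, y))
        if len(self.buffer) >= self.buffer_size:
            self._update()
            return True
        return False

    def _update(self):
        # Finetune on the buffered novelties with plain stochastic gradient
        # descent on the squared error, then clear the buffer.
        for _ in range(100):
            for x, y in self.buffer:
                e = self.predict(x) - y
                self.w -= self.lr * e * x
                self.b -= self.lr * e
        self.buffer.clear()
```

In a realistic setting the linear model would be replaced by a deep network and the fixed error threshold by a learned novelty detector; the two-phase structure (detect, then update) is the part this sketch is meant to convey.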
Research topics include, but are not limited to, developing continual deep learning algorithms, strategies for non-stationarity detection in data streams, and explainable and visualizable artificial intelligence.
Moreover, the framework- and algorithm-related hyperparameters should be dynamically updated in applications.
Forecasting experiments will be conducted based on power generation and consumption data collected from real-world applications.
A series of comprehensive evaluation metrics and visualization tools are applied to assess the experimental results.
The proposed framework is expected to be generally applicable to other constantly changing scenarios.