Abstract
The importance of proper data normalization for deep neural networks is well known. However, in continuous-time state-space model estimation, it has been observed that improper normalization of the hidden state, of the hidden state derivative of the model estimate, or even of the time interval can lead to numerical and optimization challenges with deep-learning-based methods, resulting in reduced model quality. In this contribution, we show that these three normalization tasks are inherently coupled. Due to this coupling, we propose a solution to all three normalization challenges by introducing a normalization constant at the state derivative level. We show that the appropriate choice of the normalization constant is related to the dynamics of the system to be identified, and we derive multiple methods for obtaining an effective normalization constant. We compare and discuss all the normalization strategies on a benchmark problem based on experimental data from a cascaded tanks system and compare our results with other methods from the identification literature.
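To illustrate the idea of derivative-level normalization described in the abstract, the sketch below simulates a continuous-time neural state-space model dx/dt = (1/tau) f_theta(x, u), where the network output is interpreted as a normalized state derivative. This is a minimal NumPy sketch under our own assumptions; the network `f_theta`, the heuristic `estimate_tau`, and the forward-Euler simulation are illustrative placeholders, not the procedure or implementation from the paper.

```python
# Minimal sketch (NumPy only) of a derivative-level normalization constant in a
# continuous-time neural state-space model  dx/dt = (1 / tau) * f_theta(x, u).
# The constant `tau` and the way it is estimated below are illustrative
# assumptions, not the method proposed in the paper.
import numpy as np

def f_theta(x, u, W1, b1, W2, b2):
    """Small MLP acting as the (normalized) state-derivative network."""
    h = np.tanh(W1 @ np.concatenate([x, u]) + b1)
    return W2 @ h + b2

def estimate_tau(t, y):
    """Heuristic normalization constant: output range divided by a crude
    finite-difference estimate of the derivative magnitude, i.e. a rough
    dominant time scale of the measured data."""
    dy = np.gradient(y, t, axis=0)
    return (y.max(0) - y.min(0)).mean() / (np.abs(dy).mean() + 1e-12)

def simulate(x0, u_seq, dt, tau, params):
    """Forward-Euler simulation where the network output is scaled by 1/tau,
    coupling the state, state-derivative, and time normalizations."""
    x = x0.copy()
    xs = [x.copy()]
    for u in u_seq:
        x = x + dt * (1.0 / tau) * f_theta(x, u, *params)
        xs.append(x.copy())
    return np.stack(xs)

# Usage example with random (untrained) weights and a stand-in output signal.
rng = np.random.default_rng(0)
nx, nu, nh = 2, 1, 16
params = (0.1 * rng.standard_normal((nh, nx + nu)), np.zeros(nh),
          0.1 * rng.standard_normal((nx, nh)), np.zeros(nx))
t = np.linspace(0.0, 100.0, 1001)
y = np.sin(0.05 * t)[:, None]            # stand-in measured output
tau = estimate_tau(t, y)                 # data-driven normalization constant
x_traj = simulate(np.zeros(nx), np.zeros((1000, nu)), dt=0.1, tau=tau, params=params)
```

A larger `tau` slows the simulated dynamics relative to the integration time step, which is why a well-chosen constant keeps the normalized derivative, the state magnitude, and the time grid on comparable scales during training.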
| Original language | English |
| --- | --- |
| Pages (from-to) | 253-258 |
| Number of pages | 6 |
| Journal | IFAC-PapersOnLine |
| Volume | 58 |
| Issue number | 15 |
| DOIs | |
| Publication status | Published - 1 Jul 2024 |
| Event | 20th IFAC Symposium on System Identification, SYSID 2024, Boston, United States; Duration: 17 Jul 2024 to 19 Jul 2024; Conference number: 20 |
Bibliographical note
Publisher Copyright: © 2024 The Authors.
Keywords
- Continuous-Time
- Neural Ordinary Differential Equations
- Nonlinear State-Space
- System Identification