How neural networks learn to classify chaotic time series

Alessandro Corbetta, Thomas Geert de Jong (Corresponding author)

Research output: Contribution to journal › Article › Academic › peer-review

1 Citation (Scopus)
95 Downloads (Pure)

Abstract

We tackle the outstanding issue of analyzing the inner workings of neural networks trained to classify regular-vs-chaotic time series. This setting, well studied in dynamical systems, enables thorough formal analyses. We focus specifically on a family of networks dubbed large kernel convolutional neural networks (LKCNNs), recently introduced by Boullé et al. [403, 132261 (2021)]. These non-recursive networks have been shown to outperform other established architectures (e.g., residual networks, shallow neural networks, and fully convolutional networks) at this classification task. Furthermore, they outperform "manual" classification approaches based on direct reconstruction of the Lyapunov exponent. We find that LKCNNs use qualitative properties of the input sequence. We show that LKCNN models trained from random weight initialization most commonly end up in one of two performance groups: one with relatively low performance (0.72 average classification accuracy) and one with high performance (0.94 average classification accuracy). Notably, the models in the low-performance class display periodic activations that are qualitatively similar to those exhibited by LKCNNs with random weights. This could provide very general criteria for identifying, a priori, trained weights that yield poor accuracy.
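To make the classification setting concrete, below is a minimal, hypothetical sketch in Python/PyTorch: a logistic-map generator producing regular and chaotic sequences, and a small large-kernel 1D convolutional classifier. The data source, kernel size, layer layout, and names (logistic_series, LargeKernelCNN) are illustrative assumptions, not the authors' exact LKCNN architecture or dataset.

```python
# Hypothetical sketch of regular-vs-chaotic time-series classification.
# NOT the paper's LKCNN: architecture and data generation are assumptions.
import numpy as np
import torch
import torch.nn as nn

def logistic_series(r, length=500, burn_in=100, x0=0.5):
    """Iterate x_{n+1} = r * x_n * (1 - x_n); chaotic near r = 4, regular for smaller r."""
    x, out = x0, []
    for i in range(burn_in + length):
        x = r * x * (1.0 - x)
        if i >= burn_in:
            out.append(x)
    return np.array(out, dtype=np.float32)

class LargeKernelCNN(nn.Module):
    """Assumed layout: one wide 1D convolution, global max pooling, linear head."""
    def __init__(self, kernel_size=100, channels=4):
        super().__init__()
        self.conv = nn.Conv1d(1, channels, kernel_size=kernel_size)
        self.pool = nn.AdaptiveMaxPool1d(1)   # collapse the time axis
        self.head = nn.Linear(channels, 2)    # logits: regular vs chaotic

    def forward(self, x):                     # x: (batch, 1, length)
        h = torch.relu(self.conv(x))
        h = self.pool(h).squeeze(-1)          # (batch, channels)
        return self.head(h)

if __name__ == "__main__":
    # One regular sample (r = 3.2, period-2 orbit) and one chaotic sample (r = 4.0).
    xs = np.stack([logistic_series(3.2), logistic_series(4.0)])[:, None, :]
    logits = LargeKernelCNN()(torch.from_numpy(xs))
    print(logits.shape)  # torch.Size([2, 2])
```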
Original language: English
Article number: 123101
Number of pages: 14
Journal: Chaos
Volume: 33
Issue number: 12
DOIs
Publication status: Published - 4 Dec 2023

Funding

During this research, Thomas de Jong was also affiliated with the University of Groningen and Xiamen University. Many thanks to Alef Sterk for his helpful comments and literature recommendations. Also, many thanks to Klaas Huizenga for providing hardware during Thomas de Jong's stay in Groningen. This research was partially supported by JST CREST Grant No. JPMJCR2014.

Funders
University of Groningen
