Sponsor
Funding: This work was funded in part by NIH R01 EY032284, NIH R01 AG027161, and NSF Campus Cyberinfrastructure 2019216
Document Type
Pre-Print
Publication Date
6-30-2022
Subjects
Differential Equations -- Applications
Abstract
Learning nonparametric systems of Ordinary Differential Equations (ODEs) ẋ = f(t, x) from noisy and sparse data is an emerging machine learning topic. We use the well-developed theory of Reproducing Kernel Hilbert Spaces (RKHS) to define candidates for f for which the solution of the ODE exists and is unique. Learning f consists of solving a constrained optimization problem in an RKHS. We propose a penalty method that iteratively uses the Representer theorem and Euler approximations to provide a numerical solution. We prove a generalization bound for the L2 distance between x and its estimator. Experiments are provided for the FitzHugh-Nagumo oscillator and for the prediction of the amyloid level in the cortex of aging subjects. In both cases, we show competitive results when compared with the state of the art.
Rights
© the author(s)
Locate the Document
DOI
10.48550/arXiv.2206.15215
Persistent Identifier
https://archives.pdx.edu/ds/psu/38047
Citation Details
Published as: Lahouel, K., Wells, M., Lovitz, D., Rielly, V., Lew, E., & Jedynak, B. (2022). Learning Nonparametric Ordinary Differential Equations: Application to Sparse and Noisy Data. arXiv preprint arXiv:2206.15215.
Description
This is the author’s version of a work that was posted as a preprint on arXiv. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently posted on arXiv.