
Influence of Ordinary Differential Equations on Neural Networks

  • In this thesis, I analyze how neural networks trained to approximate a given function from sample values react to additional knowledge about the function, supplied in the form of a differential equation that the function satisfies. In the first chapter, I recall the definition of initial value problems, discuss the existence and uniqueness of their solutions, and touch on a method for approximating them numerically. I also give a mathematical introduction to neural networks. I then consider a system of two first-order differential equations and test how well its solution is approximated by a neural network. Next, I examine how noise added to the training data, and the inclusion of the differential equation in the loss functional of the network, affect the error of the predicted solution relative to the exact data. Finally, I test whether this additional information can reduce the number of data points required to reach a similar degree of accuracy.
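A minimal sketch of the kind of loss construction the abstract describes, under stated assumptions: the network's loss combines a data term on noisy sampled values with the residual of the differential equation at collocation points. The example system x' = y, y' = -x with exact solution (cos t, sin t), the use of PyTorch, the network size, and all hyperparameters are illustrative assumptions, not the thesis' actual setup.

import math

import torch
import torch.nn as nn

torch.manual_seed(0)

# Small fully connected network mapping t -> (x(t), y(t)).
net = nn.Sequential(
    nn.Linear(1, 32), nn.Tanh(),
    nn.Linear(32, 32), nn.Tanh(),
    nn.Linear(32, 2),
)

# Training data: a few noisy samples of the exact solution (cos t, sin t).
t_data = torch.linspace(0.0, 2 * math.pi, 20).unsqueeze(1)
u_data = torch.cat([torch.cos(t_data), torch.sin(t_data)], dim=1)
u_data = u_data + 0.01 * torch.randn_like(u_data)  # additive noise

# Collocation points at which the ODE residual is enforced.
t_coll = torch.linspace(0.0, 2 * math.pi, 100).unsqueeze(1).requires_grad_(True)

optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    optimizer.zero_grad()

    # Data term: mean squared error against the sampled values.
    loss_data = ((net(t_data) - u_data) ** 2).mean()

    # ODE term: residuals of x' - y and y' + x at the collocation points.
    u = net(t_coll)
    x, y = u[:, :1], u[:, 1:]
    dx = torch.autograd.grad(x, t_coll, torch.ones_like(x), create_graph=True)[0]
    dy = torch.autograd.grad(y, t_coll, torch.ones_like(y), create_graph=True)[0]
    loss_ode = ((dx - y) ** 2 + (dy + x) ** 2).mean()

    # Relative weighting of the two terms is a free modelling choice.
    loss = loss_data + loss_ode
    loss.backward()
    optimizer.step()

Dropping loss_ode recovers plain regression on the sampled values, which allows a direct comparison of the prediction error with and without the differential-equation term, or with fewer data points.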

Metadata
Author: Felix Meitzner
Document Type: Bachelor's Thesis
Granting Institution: Freie Universität Berlin
Advisors: Christof Schütte, Tim Conrad
Date of final exam: 2019/11/04
Year of first publication: 2019
Page Number: 26