2023-04-11-bishop23a.md

File metadata and controls

46 lines (46 loc) · 1.73 KB
---
title: Recurrent Neural Networks and Universal Approximation of Bayesian Filters
abstract: 'We consider the Bayesian optimal filtering problem: i.e. estimating some conditional statistics of a latent time-series signal from an observation sequence. Classical approaches often rely on the use of assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to learn directly a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capabilities of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also consider strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.'
section: Regular Papers
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: bishop23a
month: 0
tex_title: Recurrent Neural Networks and Universal Approximation of Bayesian Filters
firstpage: 6956
lastpage: 6967
page: 6956-6967
order: 6956
cycles: false
bibtex_author: Bishop, Adrian N. and Bonilla, Edwin V.
author:
- given: Adrian N.
  family: Bishop
- given: Edwin V.
  family: Bonilla
date: 2023-04-11
address:
container-title: Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
volume: '206'
genre: inproceedings
issued:
  date-parts:
  - 2023
  - 4
  - 11
pdf:
extras:
---
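The abstract describes learning a recursive mapping from observational inputs directly to the desired estimator statistics, rather than filtering with an explicit transition/observation model. The sketch below is not the authors' code; it is a minimal illustration of that idea under assumed choices (PyTorch, a GRU, a simulated scalar linear-Gaussian state-space model, and illustrative sizes): minimising mean-squared error against the latent state drives the network's output toward the conditional mean E[x_t | y_{1:t}], i.e. the optimal filter estimate.

```python
# Minimal sketch (illustrative, not the paper's implementation): train a
# recurrent network to map an observation sequence y_{1:t} to an estimate
# of E[x_t | y_{1:t}] without using the model equations at inference time.
import torch
import torch.nn as nn

torch.manual_seed(0)

def simulate(batch, steps, a=0.9, q=0.1, r=0.5):
    """Simulate a scalar linear-Gaussian state-space model (ground truth)."""
    x = torch.zeros(batch, steps)
    y = torch.zeros(batch, steps)
    xt = torch.zeros(batch)
    for t in range(steps):
        xt = a * xt + q ** 0.5 * torch.randn(batch)   # latent transition
        x[:, t] = xt
        y[:, t] = xt + r ** 0.5 * torch.randn(batch)  # noisy observation
    return x, y

class RNNFilter(nn.Module):
    """Recursive map from observations to filter-statistic estimates."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # outputs estimate of E[x_t | y_{1:t}]

    def forward(self, y):
        h, _ = self.rnn(y.unsqueeze(-1))  # (batch, steps, hidden)
        return self.head(h).squeeze(-1)   # (batch, steps)

model = RNNFilter()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# MSE regression onto the latent state makes the network's output
# approximate the conditional mean, i.e. the Bayesian filter estimate.
for step in range(2000):
    x, y = simulate(batch=64, steps=50)
    opt.zero_grad()
    loss = loss_fn(model(y), x)
    loss.backward()
    opt.step()
```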