Abstract
The synthesis of controllers that guarantee linear temporal logic specifications on partially observable Markov decision processes (POMDPs) via their belief models raises computational issues due to the continuous belief spaces. In this work, we construct a finite-state abstraction on which a control policy is synthesized and then refined back to the original belief model. We introduce a new notion of label-based approximate stochastic simulation to quantify the deviation between belief models. We develop a robust synthesis methodology that yields a lower bound on the satisfaction probability by compensating for these deviations a priori, and that utilizes a less conservative control refinement.
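As a rough illustration of the pipeline sketched in the abstract, the Python snippet below discretizes the belief space of a toy two-state POMDP into a finite abstraction, runs value iteration on the abstract model for a reachability objective, and subtracts a fixed deviation term as a stand-in for the a priori compensation of the abstraction error. The model matrices, grid resolution, goal region, and deviation bound `eps` are illustrative assumptions only and are not taken from the paper; the actual construction uses the label-based approximate stochastic simulation introduced there.

```python
# Minimal sketch, assuming a hypothetical two-state POMDP; all numbers below
# are illustrative and not taken from the paper.
import numpy as np

# POMDP model: 2 hidden states, 2 actions, 2 observations.
T = np.array([[[0.9, 0.1], [0.2, 0.8]],   # T[a, s, s'] for action 0
              [[0.6, 0.4], [0.5, 0.5]]])  # T[a, s, s'] for action 1
O = np.array([[0.8, 0.2],                 # O[s', o]: observation likelihoods
              [0.3, 0.7]])

def belief_update(b, a, o):
    """Bayes update of the belief over hidden states after action a, observation o."""
    pred = b @ T[a]                 # predicted state distribution
    post = pred * O[:, o]           # weight by observation likelihood
    return post / post.sum()

# Finite abstraction: partition the belief simplex (here 1-D: b[0] in [0, 1])
# into N cells and represent each cell by its centre.
N = 50
centres = (np.arange(N) + 0.5) / N
to_cell = lambda b: min(int(b[0] * N), N - 1)

# Abstract transitions P[a, i, j]: push each cell centre through the belief
# update for every observation and weight by the observation probability.
P = np.zeros((2, N, N))
for i, c in enumerate(centres):
    b = np.array([c, 1.0 - c])
    for a in range(2):
        for o in range(2):
            p_o = (b @ T[a]) @ O[:, o]       # probability of observing o
            j = to_cell(belief_update(b, a, o))
            P[a, i, j] += p_o

# Robust-style synthesis sketch: value iteration for the maximal probability of
# reaching the (assumed) goal region b[0] >= 0.9, with a per-step deviation
# penalty eps standing in for the a priori compensation.
eps = 0.01
goal = centres >= 0.9
V = goal.astype(float)
for _ in range(200):
    Q = P @ V                                        # Q[a, i]: value of action a in cell i
    V = np.maximum(np.where(goal, 1.0, Q.max(axis=0) - eps), 0.0)
policy = np.argmax(P @ V, axis=0)                    # abstract policy, to be refined back
print("lower bound from uniform initial belief:", V[to_cell(np.array([0.5, 0.5]))])
```

The abstract policy computed this way is defined on grid cells; refining it back to the continuous belief model (and doing so less conservatively) is the subject of the paper.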
Original language | English |
---|---|
Pages (from-to) | 271-276 |
Number of pages | 6 |
Journal | IFAC-PapersOnLine |
Volume | 51 |
Issue number | 16 |
DOIs | |
Publication status | Published - 1 Jan 2018 |
Event | 6th IFAC Conference on Analysis and Design of Hybrid Systems (ADHS 2018), Oxford, United Kingdom. Duration: 11 Jul 2018 → 13 Jul 2018 |
Keywords
- control synthesis
- Markov decision processes
- partially observable
- temporal properties