Verification of general Markov decision processes by approximate similarity relations and policy refinement

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

In this work we introduce new approximate similarity relations that are shown to be key for policy (or control) synthesis over general Markov decision processes. The models of interest are discrete-time Markov decision processes, endowed with uncountably-infinite state spaces and metric output (or observation) spaces. The new relations, underpinned by the use of metrics, allow in particular for a useful trade-off between deviations over probability distributions on states, and distances between model outputs. We show that the new probabilistic similarity relations can be effectively employed over general Markov decision processes for verification purposes, and specifically for control refinement from abstract models.
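The trade-off the abstract describes can be illustrated on finite abstractions. Below is a minimal, simplified sketch (not the paper's construction): two finite Markov chains with real-valued outputs, where states are related when their outputs are within ε, and every related pair of states must admit a coupling of their transition distributions that places at most δ probability mass outside the relation. The coupling step is solved as a small transportation LP; all names (`unmatched_mass`, `is_eps_delta_simulation`) and the example chains are illustrative assumptions.

```python
from itertools import product

import numpy as np
from scipy.optimize import linprog


def unmatched_mass(p, q, related):
    """Minimum probability mass that any coupling of distributions p and q
    must place outside the relation `related` (a boolean matrix).
    Computed as the optimal value of a small transportation LP with
    0/1 costs (0 for related pairs, 1 otherwise)."""
    n, m = len(p), len(q)
    cost = np.array([0.0 if related[i][j] else 1.0
                     for i in range(n) for j in range(m)])
    # Marginal constraints: row sums equal p, column sums equal q.
    A_eq = []
    for i in range(n):
        row = np.zeros(n * m)
        row[i * m:(i + 1) * m] = 1.0
        A_eq.append(row)
    for j in range(m):
        col = np.zeros(n * m)
        col[j::m] = 1.0
        A_eq.append(col)
    b_eq = np.concatenate([p, q])
    res = linprog(cost, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
    return res.fun


def is_eps_delta_simulation(out1, out2, P1, P2, eps, delta):
    """Simplified (eps, delta)-approximate similarity check between two
    finite Markov chains: outputs of related states differ by at most
    eps (the metric part), and each related pair's transition
    distributions leave at most delta mass outside the relation
    (the probabilistic part)."""
    n, m = len(out1), len(out2)
    related = [[abs(out1[i] - out2[j]) <= eps for j in range(m)]
               for i in range(n)]
    for i, j in product(range(n), range(m)):
        if related[i][j] and unmatched_mass(P1[i], P2[j], related) > delta + 1e-9:
            return False
    return True


# Hypothetical example: a 2-state chain and a slightly perturbed variant.
out1, out2 = [0.0, 1.0], [0.05, 1.1]
P1 = [[0.5, 0.5], [0.1, 0.9]]
P2 = [[0.6, 0.4], [0.0, 1.0]]
print(is_eps_delta_simulation(out1, out2, P1, P2, eps=0.2, delta=0.15))  # True
print(is_eps_delta_simulation(out1, out2, P1, P2, eps=0.2, delta=0.05))  # False
```

Note the trade-off: loosening the output tolerance ε enlarges the relation, which can only decrease the unmatched mass and hence the δ needed; tightening ε shrinks the relation and pushes δ up. The actual relations in the paper are directed (simulation rather than this symmetric check) and defined over general, uncountable state spaces.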
Original language: English
Title of host publication: Quantitative Evaluation of Systems, QEST 2016
Subtitle of host publication: 13th International Conference, QEST 2016, Quebec City, QC, Canada, August 23-25, 2016, Proceedings
Editors: G. Agha, B. Van Houdt
Place of Publication: Dordrecht
Publisher: Springer
Pages: 227-243
ISBN (Electronic): 978-3-319-43425-4
ISBN (Print): 978-3-319-43424-7
DOIs
Publication status: Published - 2016

Publication series

Name: Lecture Notes in Computer Science
Volume: 9826
