Do we have enough data? Robust reliability via uncertainty quantification

Roberto Rocchetta, Matteo Broggi, Edoardo Patelli

Research output: Contribution to journal › Article › Academic › peer-review

31 Citations (Scopus)


A generalised probabilistic framework is proposed for reliability assessment and uncertainty quantification under a lack of data. The developed computational tool quantifies the effect of epistemic uncertainty and has been applied to assess the reliability of an electronic circuit and a power transmission network. The strengths and weaknesses of the proposed approach are illustrated by comparison with traditional probabilistic approaches. In the presence of both aleatory and epistemic uncertainty, classical probabilistic approaches may lead to misleading conclusions and a false sense of confidence that does not fully represent the quality of the available information. In contrast, generalised probabilistic approaches are versatile and powerful when coupled with a computational tool that makes them applicable to realistic engineering problems.
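To make the abstract's distinction concrete, the sketch below illustrates one common generalised probabilistic technique, a parameterised probability box: the aleatory part is a known distribution shape, while the epistemic part is an interval on a distribution parameter, yielding bounds on the failure probability instead of a single value. This is a minimal illustration, not the authors' tool; the limit state, parameter interval, and sample sizes are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical limit state: failure when the load exceeds a fixed capacity.
# Aleatory uncertainty: load ~ Normal(mu, 1.0).
# Epistemic uncertainty (lack of data): mu is only known to lie in
# [9.0, 9.6], i.e. a parameterised probability box on the load.
CAPACITY = 12.0
N_SAMPLES = 100_000

def failure_probability(mu):
    """Monte Carlo estimate of P(load > CAPACITY) for a given mu."""
    load = rng.normal(mu, 1.0, N_SAMPLES)
    return float(np.mean(load > CAPACITY))

# Outer (epistemic) loop: scan the parameter interval;
# inner (aleatory) loop: Monte Carlo over the distribution.
mus = np.linspace(9.0, 9.6, 13)
pfs = [failure_probability(m) for m in mus]

print(f"Failure probability bounds: [{min(pfs):.4f}, {max(pfs):.4f}]")
```

A classical probabilistic analysis would pick one value of `mu` and report a single failure probability; the interval output above makes explicit how much of the answer depends on information that the data do not actually pin down.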

Original language: English
Pages (from-to): 710-721
Number of pages: 12
Journal: Applied Mathematical Modelling
Publication status: Published - 1 Feb 2018
Externally published: Yes


  • Computational tool
  • Dempster–Shafer
  • Information quality
  • Probability boxes
  • Reliability
  • Uncertainty quantification


