Latent Gaussian and Hüsler–Reiss graphical models with Golazo penalty
Abstract
Latent variables are common in practical problems, for example when some variables are difficult or expensive to measure, or simply unknown. When latent variables are unaccounted for, structure learning for Gaussian graphical models can be blurred by additional correlation between the observed variables that is induced by the latent variables. A standard approach to this problem is a latent version of the graphical lasso that splits the inverse covariance matrix into a sparse part and a low-rank part that are penalized separately. This approach has recently been extended successfully to Hüsler–Reiss graphical models, which can be considered an analogue of Gaussian graphical models in extreme value statistics. In this paper we propose a generalization of structure learning for Gaussian and Hüsler–Reiss graphical models via the flexible Golazo penalty. This allows us to introduce latent versions of, for example, the adaptive lasso, positive dependence constraints, or predetermined sparsity patterns, as well as combinations of these. We develop algorithms for both latent graphical models with the Golazo penalty and demonstrate them on simulated and real data.
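The abstract describes splitting the inverse covariance matrix K = S − L into a sparse part S and a low-rank PSD part L, with the sparse part penalized by the Golazo penalty (an entrywise asymmetric penalty with lower and upper weight matrices) and the low-rank part by a nuclear-norm penalty. The following is a minimal sketch of evaluating such a penalized objective; all function names and the exact signature are illustrative assumptions, not the paper's actual algorithm or API.

```python
import numpy as np

def golazo_penalty(K, Lmat, Umat):
    """Golazo-style penalty on the off-diagonal entries of K.

    Entrywise max(Lmat*K, Umat*K) with Lmat <= 0 <= Umat. Choosing
    Lmat = -lam, Umat = lam recovers the ordinary lasso penalty, while
    asymmetric bounds encode one-sided constraints (e.g. penalizing
    positive off-diagonal entries heavily to favor positive dependence).
    (Illustrative formulation, assumed from the Golazo literature.)
    """
    off = ~np.eye(K.shape[0], dtype=bool)
    return np.sum(np.maximum(Lmat * K, Umat * K)[off])

def latent_objective(Sigma_hat, S, L, Lmat, Umat, gamma):
    """Penalized negative Gaussian log-likelihood for K = S - L.

    S is the sparse part, L the low-rank PSD part; for PSD L the
    nuclear norm reduces to the trace. gamma weighs the low-rank
    penalty. (Hypothetical helper for illustration only.)
    """
    K = S - L
    sign, logdet = np.linalg.slogdet(K)
    assert sign > 0, "S - L must be positive definite"
    nll = -logdet + np.trace(Sigma_hat @ K)
    return nll + golazo_penalty(S, Lmat, Umat) + gamma * np.trace(L)
```

With symmetric bounds Lmat = −λ, Umat = λ, `golazo_penalty` reduces to λ times the off-diagonal ℓ1 norm, so the objective specializes to the familiar latent graphical lasso.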
| Original language | English |
|---|---|
| Article number | 109468 |
| Number of pages | 24 |
| Journal | International Journal of Approximate Reasoning |
| Volume | 185 |
| DOIs | |
| Publication status | Published - Oct 2025 |
Bibliographical note
Publisher Copyright: © 2025 The Authors
Keywords
- Golazo penalty
- Graphical models
- Latent variables
- Sparse estimators