In this article, we consider the identification of linear models from quantized output data. We develop a variational approximation of the likelihood function, which allows us to find variationally optimal approximations of the maximum-likelihood and maximum a posteriori estimates. We show that these estimates are obtained by projecting the midpoint of the quantization interval of each output measurement onto the column space of the input regression matrix. Interpreting the quantized output as a random variable, we derive its moments for generic noise distributions. For the case of Gaussian noise and independent, identically distributed Gaussian input, we give an analytical characterization of the bias, which we use to build a bias-compensation scheme that leads to consistent estimates.
- Expectation-maximization (EM)
- finite impulse response (FIR)
- quantization interval
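The central estimator described in the abstract can be sketched in a few lines: replace each quantized measurement by the midpoint of its quantization interval, then project those midpoints onto the column space of the regression matrix (i.e., solve an ordinary least-squares problem). The sketch below is illustrative only, under assumed specifics not given in the abstract: a FIR-like linear regression with an i.i.d. Gaussian input matrix, Gaussian noise, and a uniform quantizer of step `Delta`; all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: linear model y = Phi @ theta + noise,
# uniform quantizer with step Delta (not specified in the abstract).
n, d = 500, 3
Delta = 0.5
theta_true = np.array([1.0, -0.5, 0.25])
Phi = rng.standard_normal((n, d))        # i.i.d. Gaussian input regressors
y = Phi @ theta_true + 0.1 * rng.standard_normal(n)

# Uniform quantization: each measurement is known only up to an
# interval of width Delta containing it.
lower_edge = Delta * np.floor(y / Delta)
midpoints = lower_edge + Delta / 2       # midpoint of each quantization interval

# Estimate: project the interval midpoints onto the column space of Phi,
# i.e., ordinary least squares with the midpoints as pseudo-measurements.
theta_hat, *_ = np.linalg.lstsq(Phi, midpoints, rcond=None)
```

With enough samples, `theta_hat` lands close to `theta_true`; the residual discrepancy is the quantization-induced bias that the abstract's compensation scheme is designed to remove.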