Prematurely born infants are observed in a Neonatal Intensive Care Unit (NICU) for medical treatment. While their vital body functions are continuously monitored, the incubator is covered by a blanket for medical reasons. This prevents visual observation of the newborns during most of the day, although facial expression is known to give valuable information about the presence of discomfort. This prompted the authors to develop a prototype of an automated video survey system for the detection of discomfort in newborn babies by analysis of their facial expression. Since only a reliable and situation-independent system is useful, we focus on robustness against non-ideal viewpoints and lighting conditions. Our proposed algorithm automatically segments the face from the background and localizes the eye, eyebrow and mouth regions. Based on measurements in these regions, a hierarchical classifier discriminates between the behavioral states sleep, awake and cry. We have evaluated the described prototype system on recordings of three healthy newborns, and we show that our algorithm operates with approximately 95% accuracy. Small changes in viewpoint and lighting conditions are tolerated, but when there is a major reduction in light, or when the viewpoint is far from frontal, the algorithm fails. © 2009 Springer Berlin Heidelberg.
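The abstract describes a hierarchical classifier that maps facial-region measurements to the states sleep, awake and cry, but does not specify the decision structure. A minimal sketch of one plausible two-stage hierarchy is shown below; the feature names (eye openness, mouth opening ratio, brow furrowing) and all thresholds are illustrative assumptions, not the measurements used in the paper.

```python
def classify_state(eyes_open: bool, mouth_open_ratio: float, brow_furrow: float) -> str:
    """Toy two-stage hierarchical classifier over the behavioral states
    'sleep', 'awake' and 'cry'. Features and thresholds are hypothetical,
    chosen only to illustrate the hierarchical decision structure."""
    # Stage 1: closed eyes together with a relaxed mouth and brow suggest sleep.
    if not eyes_open and mouth_open_ratio < 0.3 and brow_furrow < 0.5:
        return "sleep"
    # Stage 2: among non-sleep states, a wide-open mouth with furrowed
    # brows suggests crying; everything else is classified as awake.
    if mouth_open_ratio > 0.6 and brow_furrow > 0.5:
        return "cry"
    return "awake"


if __name__ == "__main__":
    print(classify_state(False, 0.1, 0.2))  # closed eyes, relaxed face
    print(classify_state(True, 0.8, 0.9))   # open mouth, furrowed brows
```

In a real system, each stage would be a trained classifier over image measurements rather than fixed thresholds; the hierarchy first separates sleep from the waking states, then distinguishes awake from cry.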
Title of host publication: Proceedings of the 11th International Conference on Advanced Concepts for Intelligent Vision Systems (ACIVS 2009), 28 September - 2 October 2009, Bordeaux
Editors: J. Blanc-Talon, W. Philips, D. Popescu, P. Scheunders
Place of publication: Berlin
Publication status: Published - 2009
Series: Lecture Notes in Computer Science