This paper introduces a new method to automate heart-rate detection using remote photoplethysmography (rPPG). The method replaces the commonly used region of interest (RoI) detection and tracking, and requires no initialization. Instead, it combines a number of candidate pulse signals computed in parallel, each biased towards differently colored objects in the scene. The method builds on the observation that the temporally averaged colors of video objects (skin and background) are usually quite stable over time in typical application-driven scenarios, such as monitoring a subject sleeping in bed or an infant in an incubator. The resulting system, called full video pulse extraction (FVP), allows the direct use of raw video streams for pulse extraction. Our benchmark set of diverse videos shows that FVP enables long-term sleep monitoring in visible light and in infrared, and works for both adults and neonates. Although we demonstrate the concept only for heart-rate monitoring, we foresee its adaptation to a range of vital signs, benefiting the broader field of video health monitoring.