Incremental placement of interactive perception applications

  • M.N. Yigitbasi
  • L.B. Mummert
  • P. Pillai
  • D.H.J. Epema

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    Interactive perception applications, such as gesture recognition and vision-based user interfaces, process high-data-rate streams with compute-intensive computer vision and machine learning algorithms. These applications can be represented as data flow graphs comprising several processing stages. Such applications require low latency to be interactive, so that the results are immediately available to the user. To achieve low latency, we exploit the inherent coarse-grained task and data parallelism of these applications by running them on clusters of machines. This paper addresses an important problem that arises: how to place the stages of these applications on machines to minimize the latency, and in particular, how to adjust an existing schedule in response to changes in the operating conditions (perturbations) while minimizing the disruption in the existing placement (churn). To this end, we propose four incremental placement heuristics which use the HEFT scheduling algorithm as their primary building block. Through simulations and experiments on a real implementation, using diverse workloads and a range of perturbation scenarios, we demonstrate that dynamic adjustment of the schedule can improve latency by as much as 36%, while producing little churn.
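    The abstract's building block, the HEFT (Heterogeneous Earliest Finish Time) list-scheduling algorithm, ranks tasks by average cost along the longest path to the exit task and then greedily places each task on the processor giving the earliest finish time. The sketch below illustrates that general idea on a toy 4-stage, 2-processor example; the task graph, costs, and all names here are hypothetical and not taken from the paper (which builds incremental heuristics on top of HEFT rather than re-implementing it).

    ```python
    from functools import cache

    # Hypothetical toy instance: comp[t][p] is the cost of stage t on
    # processor p; succ gives the DAG edges; comm[(t, s)] is the data
    # transfer cost, paid only if t and s run on different processors.
    comp = {"A": [3, 5], "B": [4, 4], "C": [2, 6], "D": [5, 3]}
    succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
    comm = {("A", "B"): 2, ("A", "C"): 1, ("B", "D"): 3, ("C", "D"): 2}

    @cache
    def upward_rank(task):
        """Average cost of `task` plus the costliest path to the exit task."""
        w = sum(comp[task]) / len(comp[task])
        return w + max(
            (comm[(task, s)] + upward_rank(s) for s in succ[task]),
            default=0.0,
        )

    def heft():
        """Place tasks in decreasing rank order, each on its min-EFT processor."""
        order = sorted(comp, key=upward_rank, reverse=True)
        proc_free = [0.0, 0.0]          # next free time of each processor
        placement, finish = {}, {}
        for t in order:
            best = None
            for p in range(len(proc_free)):
                # A task may start only once all its inputs have arrived.
                ready = max(
                    (finish[u] + (comm[(u, t)] if placement[u] != p else 0.0)
                     for u in comp if t in succ[u]),
                    default=0.0,
                )
                eft = max(ready, proc_free[p]) + comp[t][p]
                if best is None or eft < best[0]:
                    best = (eft, p)
            eft, p = best
            placement[t], finish[t] = p, eft
            proc_free[p] = eft
        return placement, finish
    ```

    On this instance the ranks are A: 17, B: 11, C: 10, D: 4, so A is placed first and the schedule finishes at time 14. The paper's incremental heuristics address what this static version does not: re-running placement after a perturbation while keeping the new schedule close to the old one to limit churn.
    
    
    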
    Original language: English
    Title of host publication: Proceedings of the 20th International Symposium on High Performance Distributed Computing (HPDC'11, San Jose CA, USA, June 8-11, 2011)
    Place of publication: New York NY
    Publisher: Association for Computing Machinery, Inc.
    Pages: 123-134
    ISBN (Print): 978-1-4503-0552-5
    Publication status: Published - 2011
