Direct view manipulation for drone photography

Yi-Ling Chen, Wei-Tse Lee, Liwei Chan, Rong-Hao Liang, Bing-Yu Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

5 Citations (Scopus)


For a long time, photographers have held and moved their cameras while simultaneously considering how to frame a good shot. With the emergence of drones, people have started to let these flying carriers hold their cameras in order to take more compelling pictures. However, the viewpoints of the photographer and the device become decoupled, and every single movement must be explicitly instructed via a remote controller. Even with first-person-view video streaming, users still have to be very skillful to pilot the drone fluently without being distracted from photo composition. Inspired by the concept of viewfinder editing [Baek et al. 2013], we propose a more intuitive interface for controlling the flying camera (i.e., the drone) by direct view manipulation embodied in multi-touch gestures, which allows users to directly alter and rearrange the visual elements in the picture prior to image capture. In our proof-of-concept implementation, the viewfinder of the flying camera is mapped to the screen of a mobile device, and physical camera movements are encoded as common photo-manipulation operations, such as translation and scaling, performed with multi-touch gestures.
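The abstract describes encoding drone movements as photo-manipulation gestures: dragging the image pans the camera, and pinching to scale moves the drone toward or away from the subject. A minimal sketch of such a mapping is given below; this is not the authors' implementation, and the function names, sign conventions, and gain values are all assumptions for illustration.

```python
# Hypothetical sketch: map multi-touch viewfinder gestures to drone velocities.
# The gains (px_to_mps, zoom_gain) and sign conventions are assumptions.
from dataclasses import dataclass


@dataclass
class DroneCommand:
    vx: float  # forward/backward (m/s): approach or retreat (pinch scaling)
    vy: float  # left/right (m/s): horizontal pan (horizontal drag)
    vz: float  # up/down (m/s): vertical pan (vertical drag)


def gesture_to_command(drag_dx_px: float, drag_dy_px: float,
                       pinch_scale: float,
                       px_to_mps: float = 0.005,
                       zoom_gain: float = 1.0) -> DroneCommand:
    """Translate screen-space gestures into body-frame drone velocities.

    Dragging the image to the right should shift the scene content rightward
    in the viewfinder, so the drone moves left (inverted sign). Screen y grows
    downward, so dragging down raises the camera. A pinch-out (scale > 1)
    enlarges the subject, so the drone flies forward.
    """
    vy = -drag_dx_px * px_to_mps          # view moves opposite to camera
    vz = drag_dy_px * px_to_mps           # drag down -> climb
    vx = (pinch_scale - 1.0) * zoom_gain  # pinch-out -> approach subject
    return DroneCommand(vx=vx, vy=vy, vz=vz)
```

In a real controller these velocities would be sent to the drone's velocity-control loop at the gesture sampling rate, with the gains tuned so small drags produce gentle, composition-scale motion.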
Original language: English
Title of host publication: SIGGRAPH Asia 2015 Posters
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery, Inc.
ISBN (Print): 978-1-4503-3926-1
Publication status: Published - 2015
Externally published: Yes

