New motion control technologies such as drones, gimbals, or industrial robots provide smooth, repeatable camera motion and image stabilization. These tools are usually motor-driven and therefore operated remotely. The user interfaces for this remote operation are often implemented on mobile touch-screen devices. With touch-screens, however, a general decrease in input precision can be observed. One way to address this issue through system design is to provide assistance at a higher control level. In cinematography, such assistance is implemented through computer vision and visual servoing, which allow image-based motion control directly on the video stream. For such image-based control, systems often use a touch-to-track (TTT) approach for selecting an object or person to be tracked and followed. With TTT, users tap on the object or person they see in the video stream, and the system then continuously adjusts its position to keep the selected object at the same position within the frame. As TTT relies on the manual selection of a moving object, performance in such goal-directed aiming can be expected to decrease for faster-moving targets, as in car commercials, sports broadcasting, or high-speed recordings. We developed an alternative concept that addresses this issue by letting operators define the desired tracking position in advance and delegating the correct timing to an assisting system. TrackLine allows operators to define a motion trigger within the image frame that is represented by a line displayed on top of the video stream.
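To make the contrast between the two selection schemes concrete, the sketch below pairs a plain touch-to-track follow loop (a proportional controller that keeps the tapped target at a fixed frame position) with a TrackLine-style trigger that holds still until the target crosses a pre-set line and only then starts following. This is a minimal illustration under simplifying assumptions, not the authors' implementation; the class names, the gain value, and the idea that an external object tracker supplies normalized target positions are all hypothetical.

```python
from typing import Optional

# Minimal sketch contrasting touch-to-track following with a
# TrackLine-style trigger. Positions are normalized horizontal image
# coordinates in [0, 1]; the tracker supplying them is assumed to
# exist. All names and gains are illustrative, not from the paper.

class FollowController:
    """Touch-to-track: keep the selected target at a desired frame
    position via a simple proportional controller."""

    def __init__(self, desired_x: float = 0.5, kp: float = 1.5):
        self.desired_x = desired_x  # where the target should sit in the frame
        self.kp = kp                # proportional gain (illustrative value)

    def pan_velocity(self, target_x: float) -> float:
        # Positive error -> pan toward the target to re-center it.
        return self.kp * (target_x - self.desired_x)


class TrackLineTrigger:
    """TrackLine: the operator places a vertical trigger line in
    advance; following starts automatically when the target crosses
    it, so the timing is delegated to the assisting system."""

    def __init__(self, line_x: float, controller: FollowController):
        self.line_x = line_x
        self.controller = controller
        self.armed = True
        self.prev_x: Optional[float] = None

    def update(self, target_x: float) -> float:
        if self.armed:
            # The line is crossed when consecutive samples lie on
            # opposite sides of it (sign change of the offset).
            crossed = (
                self.prev_x is not None
                and (self.prev_x - self.line_x) * (target_x - self.line_x) <= 0
            )
            self.prev_x = target_x
            if not crossed:
                return 0.0  # hold the camera until the line is crossed
            self.armed = False
            # After triggering, keep the target at the operator-chosen line.
            self.controller.desired_x = self.line_x
        return self.controller.pan_velocity(target_x)


# Example: a target moving left to right crosses a line at x = 0.4;
# the camera stays still until the crossing, then starts following.
trigger = TrackLineTrigger(line_x=0.4, controller=FollowController())
for x in (0.10, 0.25, 0.38, 0.45, 0.52):
    print(f"target at {x:.2f} -> pan velocity {trigger.update(x):+.3f}")
```

One design point this makes visible: with plain TTT the operator must time the tap on a moving target, whereas the trigger moves that timing decision to the system, which only needs a sign change between consecutive tracker samples to fire.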
TrackLine (Poster)
Refining Touch-to-Track Interaction for Camera Motion Control on Mobile Devices
In Proceedings of the 13th European Conference on Visual Media Production (CVMP ’16), London, United Kingdom
Authors
- Axel Hoesl
- Sarah Aragon Bartsch
- Philipp Burgdorf
- Andreas Butz