Tables are focal points for social interaction and support everyday activities such as learning, crafting, or dining. These physical interactions on and around the table can be augmented with digital information and tools projected onto the tabletop. For interacting with such projected information, touch input suffers from technical and interactional limitations. Pen input is a more robust alternative that does not suffer from the Midas-touch problem. We developed a system for tracking the position of an IR-emitting pen tip on a planar surface with sub-millimeter resolution and an end-to-end latency of less than 30 ms. The system distinguishes between drawing and hovering states by combining a stereoscopic camera setup with a machine-learning classifier. We demonstrate practical performance, use cases, and limitations through multiple studies and examples.
Status: ongoing
Runtime: 2022 -
Participants: Vitus Maierhöfer, Andreas Schmid, Raphael Wimmer
Keywords: projected augmented reality, input device, pen input, computer vision
Source Code: https://github.com/PDA-UR/TipTrack
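
To make the approach described above concrete, here is a minimal, hypothetical sketch of the two core steps: extracting the IR pen tip from a camera frame with sub-pixel precision, and deciding between drawing and hovering from a rectified stereo pair. It substitutes a plain disparity threshold for the machine-learning classifier used in the actual system; all names, thresholds, and calibration values are illustrative assumptions, not TipTrack's API. See the linked source code for the real implementation.

```python
# Hypothetical sketch, not the TipTrack implementation.
import cv2
import numpy as np

BRIGHTNESS_THRESHOLD = 200   # assumed: the pen tip is the brightest IR blob
SURFACE_DISPARITY = 42.0     # assumed: disparity of a tip touching the surface
TOUCH_TOLERANCE = 1.5        # assumed: allowed deviation (in pixels) at contact


def find_tip(gray):
    """Return the sub-pixel centroid of the brightest blob, or None."""
    _, mask = cv2.threshold(gray, BRIGHTNESS_THRESHOLD, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None  # no sufficiently bright blob in this frame
    # The intensity-weighted centroid yields sub-pixel precision; a calibrated
    # camera-to-surface homography then maps it to sub-millimeter positions
    # on the tabletop.
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])


def classify(left_gray, right_gray):
    """Return (tip_position, state), where state is 'draw', 'hover', or None."""
    left, right = find_tip(left_gray), find_tip(right_gray)
    if left is None or right is None:
        return None, None
    # In a rectified stereo pair, horizontal disparity encodes distance: a tip
    # hovering above the table is closer to the cameras and therefore shows a
    # larger disparity than one touching the surface. (In practice the
    # reference disparity varies across the surface and would be calibrated
    # per location; TipTrack uses a trained classifier instead.)
    disparity = abs(left[0] - right[0])
    state = "draw" if abs(disparity - SURFACE_DISPARITY) < TOUCH_TOLERANCE else "hover"
    return left, state
```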