This work develops design principles and an interaction framework for shareable, interactive public ambient displays that support the transition from implicit to explicit interaction with both public and private information. A prototype system embodying these design principles is described, along with novel display and interaction techniques: simple hand gestures and touch-screen input for explicit interaction, and contextual cues from body orientation and position for implicit interaction. Techniques are also presented for subtle notification, self-revealing help, privacy controls, and shared use by multiple people, each in their own context.
This was originally published at UIST and later expanded into my MSc thesis. The thesis adds more background, further framework and system details, and a second user study.