Thursday 26 February 2009

Surface User Interfaces

Large horizontal or vertical shared displays are an aspect of HCI and UbiComp in which computation is everywhere and computer functions are integrated into everything. Computationally enhanced everyday objects such as whiteboards, tables, benches and desks allow for enhanced forms of face-to-face, computer-supported interaction and collaboration not possible with conventional desktop or mobile computing [1,2]. The interaction can be with a stand-alone system [1] or with an ecosystem of surfaces [2].

Figure 1: SharePic photo sharing system [1]

For example, the projected collaborative tabletop photo-sharing system shown in Figure 1 (SharePic [1]) allows older people to enjoy, with digital photos, the social sharing and storytelling common with physical photos. In SharePic, users do not feel they are using a computer to collaborate; instead, the medium supports their actions directly. Ultimately, computer-supported face-to-face collaboration will become so commonplace in bars, restaurants, schools, offices and homes that no one will notice its presence. It will be powerful and will enhance our lives, but it will also be commonplace, obvious and boring.

Generally speaking, a Surface User Interface (SUI) is a class of user interface that relies on a self-illuminated (e.g. LCD) or projected horizontal, vertical or spherical interactive surface, with control of the computation coupled into that same physical surface (e.g. a touch screen). As with a tangible user interface, the inputs and outputs of a SUI are tightly coupled. SUIs rely on sensing techniques, including computer vision, resistive-membrane, capacitive and surface acoustic wave detection, to determine user input to the system. They are often used in public places (kiosks, ATMs) or on small personal devices (PDAs, the iPhone) where a separate keyboard and mouse cannot or should not be used.
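To make the vision-based case concrete, here is a minimal sketch of my own (an illustration, not code from any of the systems discussed here) of how a camera-based surface such as an FTIR table might turn an infrared camera frame into touch points: bright pixels are thresholded, grouped into connected blobs, and each blob's centroid is reported as a touch. The function name, the TouchPoint type and the threshold value are assumptions made purely for illustration.

```typescript
// Simplified vision-based touch detection: threshold an infrared frame,
// group bright pixels into blobs, and report each blob centroid as a touch.
type TouchPoint = { x: number; y: number; size: number };

function detectTouches(
  frame: Uint8Array,   // grayscale IR image, row-major
  width: number,
  height: number,
  threshold = 200      // brightness above which a pixel counts as contact
): TouchPoint[] {
  const visited = new Uint8Array(width * height);
  const touches: TouchPoint[] = [];

  for (let start = 0; start < frame.length; start++) {
    if (frame[start] < threshold || visited[start]) continue;

    // Flood-fill the connected bright region starting at this pixel.
    let sumX = 0, sumY = 0, size = 0;
    const stack = [start];
    visited[start] = 1;

    while (stack.length > 0) {
      const idx = stack.pop()!;
      const x = idx % width, y = Math.floor(idx / width);
      sumX += x; sumY += y; size++;

      // Visit 4-connected neighbours, avoiding row wrap-around.
      for (const n of [idx - 1, idx + 1, idx - width, idx + width]) {
        if (n < 0 || n >= frame.length || visited[n]) continue;
        if (Math.abs((n % width) - x) > 1) continue;
        if (frame[n] >= threshold) { visited[n] = 1; stack.push(n); }
      }
    }

    // Ignore single-pixel noise; report the blob centre as a touch point.
    if (size > 4) touches.push({ x: sumX / size, y: sumY / size, size });
  }
  return touches;
}
```

In a real system this step would be followed by tracking blobs across frames and calibrating camera coordinates to display coordinates, but the core idea of turning bright regions into touch events is as above.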

The scale of a SUI can range from small personal devices such as the iPhone or a PDA, through the Tablet PC, up to large public interactive surfaces such as the MERL DiamondTouch or the Microsoft Surface. A SUI can accept a range of input types, including passive stylus, active stylus, fingers or tangible objects, or it may be tied to just one, as is the case with the Tablet PC and its active powered stylus. Where the input to the SUI is simply used as a surrogate for mouse input, many SUI applications function just as classical GUIs do, as sketched below.
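The following sketch (again my own illustration, assuming a web-based SUI application rather than any specific system above) shows the mouse-surrogate case: single-finger input from the standard DOM Touch Events API is mapped onto mouse-style press, drag and release handling. The element id "surface" is a hypothetical placeholder.

```typescript
// Treat a single finger on the surface as a mouse surrogate:
// touchstart ≈ button press, touchmove ≈ drag, touchend ≈ button release.
const surface = document.getElementById("surface");

surface?.addEventListener("touchstart", (e: TouchEvent) => {
  const t = e.touches[0];
  console.log(`press at (${t.clientX}, ${t.clientY})`);
});

surface?.addEventListener("touchmove", (e: TouchEvent) => {
  const t = e.touches[0];
  e.preventDefault(); // stop the browser scrolling while "dragging"
  console.log(`drag to (${t.clientX}, ${t.clientY})`);
}, { passive: false });

surface?.addEventListener("touchend", () => {
  // No touches remain: equivalent to releasing the mouse button.
  console.log("release");
});
```

With this kind of mapping, existing click-and-drag GUI logic works unchanged; the SUI only becomes genuinely different when multiple fingers, multiple users or tangible objects are involved.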

Useful examples include:


  • Digital Desk
  • MERL DiamondTouch
  • Sony SmartSkin
  • TANGerINE
  • Microsoft TouchLight
  • FTIR-based displays
  • Microsoft Surface
  • SMART table using Digital Vision Touch (DViT)
  • LiveBoard
  • HoloWall
  • metaDesk
  • UlteriorScape
  • NUI Snowflake system (software system not hardware per se)
  • Microsoft SecondLight



Issues including fatigue, the lack of haptic (e.g. force) feedback, cost and suitability have limited the adoption of large multi-user SUIs, while small SUIs in iPhones, PDAs, ATMs and kiosks are commonplace. The coupling of personal displays with public shared displays remains an area of active research [2].

[1] Apted, T., Kay, J., and Quigley, A. 2006. Tabletop sharing of digital photographs for the elderly. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Montréal, Québec, Canada, April 22 - 27, 2006). R. Grinter, T. Rodden, P. Aoki, E. Cutrell, R. Jeffries, and G. Olson, Eds. CHI '06. ACM, New York, NY, 781-790.

[2] Izadi, S., Quigley, A., and Subramanian, S., Eds. Special Issue on Interaction with Coupled and Public Displays. Springer Personal and Ubiquitous Computing (PUC).

Note: this text is adapted from some upcoming book chapters of mine, so it remains copyright. [ My Blog ]
