Friday 27 February 2009

Twitter for health care?

According to this article in E-Health Insider, there are now several UK hospitals using Twitter to keep in touch with patients and staff...

Thursday 26 February 2009

Surface User Interfaces

Large horizontal or vertical shared displays are one aspect of HCI and ubiquitous computing (UbiComp), in which computation is everywhere and computer functions are integrated into everything around us. Computationally enhanced everyday objects such as whiteboards, tables, benches and desks allow enhanced forms of face-to-face, computer-supported interaction and collaboration that are not possible with conventional desktop or mobile computing [1,2]. The interaction can be with a stand-alone system [1] or with an ecosystem of surfaces [2].

Figure 1: The SharePic photo sharing system [1]

For example, the projected collaborative tabletop photo sharing system SharePic [1], shown in Figure 1, allows older people to enjoy with digital photos the social sharing and storytelling that come naturally with physical ones. Users of SharePic do not feel they are using a computer to collaborate; instead, the medium supports their actions directly. Ultimately, computer-supported face-to-face collaboration will become so commonplace in bars, restaurants, schools, offices and homes that no one will notice its presence. It will be powerful and will enhance our lives, but it will be commonplace, obvious and boring.

Generally speaking, a Surface User Interface (SUI) is a class of user interface that relies on a self-illuminated (e.g. LCD) or projected horizontal, vertical or spherical interactive surface, coupled with control of the computation through that same physical surface (e.g. a touch screen). As with a tangible user interface, the outputs of and inputs to a SUI are tightly coupled. SUIs rely on sensing techniques, including computer vision, resistive membranes, capacitive sensing and surface acoustic wave detection, to determine user input to the system. They are often used in public places (kiosks, ATMs) or in small personal devices (PDAs, the iPhone) where a separate keyboard and mouse cannot or should not be used.
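
As a concrete illustration of the computer-vision approach, here is a minimal sketch (in Python, with OpenCV) of how an FTIR-style surface might detect fingertip touches as bright blobs in an infrared camera frame. The camera index, brightness threshold and noise cut-off below are all assumptions for illustration, not any shipping system's implementation.

```python
import cv2

# Minimal sketch of vision-based touch sensing (assumptions throughout):
# an infrared camera behind the surface sees touching fingertips as
# bright blobs, as in FTIR-style setups. A real system would also
# calibrate camera coordinates to display coordinates and track blobs
# across frames to distinguish taps from drags.

camera = cv2.VideoCapture(0)        # assumed camera index

for _ in range(300):                # process a bounded number of frames
    ok, frame = camera.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Touch points appear as bright spots; threshold to isolate them.
    # The value 200 is an assumed, uncalibrated brightness threshold.
    _, mask = cv2.threshold(blurred, 200, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) < 30:   # drop sensor noise
            continue
        x, y, w, h = cv2.boundingRect(contour)
        print("touch at", (x + w // 2, y + h // 2))

camera.release()
```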

The scale of a SUI can range from small personal devices such as the iPhone or a PDA, through a Tablet PC, up to large public interactive surfaces such as the MERL DiamondTouch or the Microsoft Surface. A SUI can accept a range of input types, including a passive stylus, an active stylus, fingers or tangible objects, or it may be tied to just one, as with the Tablet PC and its active powered stylus. Where the input to a SUI simply acts as a surrogate for mouse input, many SUI applications function just as classical GUIs do, as the sketch below illustrates.
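
The mouse-surrogate pattern is easy to picture in code. The following sketch uses assumed event names rather than any real toolkit's API; it simply shows single-touch events mapping one-for-one onto the pointer events a classical GUI already understands.

```python
from dataclasses import dataclass

# Illustrative sketch of the "mouse surrogate" pattern: single-touch
# events from a surface map one-for-one onto classical-GUI pointer
# events. The TouchEvent type and event names are assumptions, not
# any real toolkit's API.

@dataclass
class TouchEvent:
    kind: str   # "down", "move" or "up"
    x: float    # surface coordinates, assumed already calibrated
    y: float    # to display coordinates

def to_pointer_event(event: TouchEvent) -> str:
    """Translate a touch event into its classical-GUI pointer equivalent."""
    mapping = {"down": "mouse_press", "move": "mouse_drag", "up": "mouse_release"}
    return mapping[event.kind]

# A finger tap then behaves exactly like a left-button click and release.
for e in [TouchEvent("down", 120, 80), TouchEvent("up", 120, 80)]:
    print(to_pointer_event(e), "at", (e.x, e.y))
```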

Useful examples include:


  • Digital Desk
  • MERL DiamondTouch
  • Sony SmartSkin
  • TANGerINE
  • Microsoft TouchLight
  • FTIR-based displays
  • Microsoft Surface
  • SMART table using Digital Vision Touch (DViT)
  • LiveBoard
  • HoloWall
  • metaDesk
  • UlteriorScape
  • NUI Snowflake system (a software system, not hardware per se)
  • Microsoft SecondLight



Issues including fatigue, the lack of haptic (e.g. force) feedback, cost and suitability have limited the adoption of large multi-user SUIs, while small SUIs in iPhones, PDAs, ATMs and kiosks are commonplace. The coupling of personal displays with public shared displays remains an area of active research [2].

[1] Apted, T., Kay, J., and Quigley, A. 2006. Tabletop sharing of digital photographs for the elderly. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Montréal, Québec, Canada, April 22 - 27, 2006). R. Grinter, T. Rodden, P. Aoki, E. Cutrell, R. Jeffries, and G. Olson, Eds. CHI '06. ACM, New York, NY, 781-790.

[2] Izadi, S., Quigley, A., and Subramanian, S., Eds. Personal and Ubiquitous Computing (PUC), Special Issue on Interaction with Coupled and Public Displays. Springer.

Note: this text is adapted from some upcoming book chapters of mine, so it remains copyright. [ My Blog ]

Microsoft SecondLight

A very interesting new tabletop device that could have relevance in health care:

YouTube video of Microsoft SecondLight


Apparently, work is underway to develop the system for medical applications, e.g. enabling 3D imaging of CT scans.

Tuesday 10 February 2009

UK case study

For those of you who don't know, there is currently a major initiative to introduce a series of new technologies into the NHS in England. The trials and tribulations of this process provide much interesting food for thought about how technologies are introduced into health care and what impact they have.

One system that is being introduced is called Choose and Book - the basic idea being that a GP can use the system to request an appointment for a patient with an appropriate specialist. The system is meant to increase choice for patients because they can see the range of locations where they could go.

I just came across this news article on the system.

Reading it made me think about the challenges of evaluating such a system, but also about the importance of doing so. If different health care providers are implementing and engaging with the system in different ways, there is the potential for huge variation in its impact on the process of care. It also means it's important to understand the relationship between how the system is implemented (including features of the implementation process, such as training) and its subsequent impact on the process of care.

Making use of the blog

Thank you Thomas for posting details of the CiteULike group.

I just wanted to take this opportunity to encourage all of you to make use of the blog as you see fit, whether it's:
  • Posting links that you think will be of interest to the workshop participants, regarding new technologies, relevant research, interesting case studies;
  • Advertising relevant events such as workshops and conferences;
  • Reporting on research you're undertaking;
  • Posting a topic for discussion.
Rebecca