Implicit Human Computer Interaction Through Context - Schmidt 2000

"This paper defines implicit human-computer interaction (HCI). Processing power and advanced sensor technology can move HCI from explicit interaction, such as direct-manipulation GUIs, to implicit interaction based on situational context. The paper offers a question-based technique for identifying applications that can support implicit interaction, and proposes an XML-based language for describing implicit HCI; the language uses contextual variables and actions called by triggers. Four main approaches for designing context-aware applications are explored, and sensor-based perception is illustrated with a wearable context-awareness component and a sensor board. Situational context can also be used to improve input and output on mobile devices. Given the availability of cheap sensors (from simple temperature sensors to cameras), the perceptional abilities they enable, and the fact that the main user population of current computing devices (e.g. mobile phones, PDAs) are non-experts, we may see another revolution in HCI: perceptional devices, even with restricted abilities, will change HCI from explicit to implicit. Future mobile devices will see, hear, and feel, and will respond and react based on their interpretation of the situational context. The paper argues that this vision is not far off, beginning with simple concepts and their use and showing, through examples and demonstrators, how basic perception could enable the change from explicit to implicit HCI."
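The paper's XML-based description language is not reproduced in this summary. As a minimal sketch of the idea it describes (contextual variables grouped into a situation, with actions called by triggers), a rule might look like the following; all element and attribute names here are illustrative assumptions, not Schmidt's actual grammar:

```xml
<!-- Hypothetical sketch of a context-to-action rule.
     Element and attribute names are illustrative, not the paper's syntax. -->
<context name="in_meeting">
  <!-- Contextual variables that together describe the situation -->
  <variable name="location" value="meeting_room" />
  <variable name="noise_level" value="quiet" />
  <!-- An action called by a trigger when the situation is entered -->
  <trigger on="enter">
    <action target="phone" command="set_profile" argument="silent" />
  </trigger>
</context>
```

The point of such a language is that the interaction is implicit: the user never issues the "silence the phone" command; the system infers it from sensed context.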

Schmidt, A. (2000). Implicit human computer interaction through context. Personal Technologies, 4(2), 191-199.

Previous

Emotion recognition in human-computer interaction - Cowie 2001

Next

Human–Computer Interaction and Global Development - Toyama 2010