
Visual-Based HCI: 1 2 Visual-based human-computer interaction is probably the most widespread area of HCI research.

Some of the main research areas are as follows: Facial Expression Analysis, Body Movement Tracking (Large-scale), Gesture Recognition, and Gaze Detection (Eye Movement Tracking). Computer vision has a lot of future potential: one only has to look at Microsoft's interactive tables, Project Natal (Microsoft's answer to controller-free gaming), and the SixthSense project, an impressive wearable gesture-based interface.

Audio-Based HCI: 3 4 Audio-based interaction between a computer and a human is another important area of HCI systems. This area deals with information acquired through different audio signals. While the nature of audio signals may not be as variable as that of visual signals, the information gathered from them can be more reliable and helpful, and in some cases a unique source of information. Research areas can be divided into the following parts: Speech Recognition, Speaker Recognition, Auditory Emotion Analysis, Human-Made Noise/Sign Detection (gasp, sigh, laugh, cry, etc.), and Musical Interaction. If voice recognition works accurately, there are plenty of applications: controlling your TV or computer, instructing your house to turn the lights on, machine automation, and robotics control. These have all been tried before, and it is only over the last decade that accuracy has improved enough to make them viable.

Sensor-Based HCI: 5 6 The commonality of these different types of HMI is that at least one physical sensor is used between user and machine to provide the interaction. These sensors can be very primitive or very sophisticated: Pen-Based Interaction, Mouse & Keyboard, Joysticks, Motion Tracking Sensors and Digitizers, Haptic Sensors, Pressure Sensors, and Taste/Smell Sensors. Some of these sensors have been around for a while, and some are very new technologies. One example is Super Cilia Skin (SCS), which has potential as an artificial skin with sensory feedback. Currently it allows two people to communicate over a distance by manipulating the orientations of what you could describe as oversized follicles, like the hair follicles found on skin. It could therefore provide a whole new level of prosthetic feedback, and if the entertainment industry picks it up, we could see some very interesting applications.
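As a toy illustration of sensor-based interaction, the sketch below shows one common way raw sensor readings are turned into discrete input events. The class name, threshold values, and data stream are all made up for illustration and are not taken from the surveys cited here; the hysteresis idea (separate press and release thresholds) is a standard technique for keeping noisy readings from causing event chatter.

```python
class PressureButton:
    """Toy sensor-based input: converts raw pressure readings into
    press/release events, using hysteresis so that noise near the
    threshold does not cause repeated spurious events."""

    def __init__(self, press_threshold=0.6, release_threshold=0.4):
        # Illustrative thresholds; real values depend on the sensor.
        self.press_threshold = press_threshold
        self.release_threshold = release_threshold
        self.pressed = False

    def feed(self, reading):
        """Process one raw reading; return 'press', 'release', or None."""
        if not self.pressed and reading >= self.press_threshold:
            self.pressed = True
            return "press"
        if self.pressed and reading <= self.release_threshold:
            self.pressed = False
            return "release"
        return None


button = PressureButton()
stream = [0.1, 0.5, 0.7, 0.65, 0.5, 0.3, 0.2]
events = [(i, e) for i, r in enumerate(stream) if (e := button.feed(r))]
print(events)  # press fires at reading 0.7, release at reading 0.3
```

The same pattern, with different thresholds and richer state, underlies many of the sensor types listed above, from pen pressure to haptic input.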
1 Fakhreddine Karray, Milad Alemzadeh, Jamil Abou Saleh and Mo Nours Arab; Human Computer Interaction: Overview on State of the Art; March 2008
2 James Cannan and Huosheng Hu; Human-Machine Interaction (HMI): A Survey; Technical Report CES-508
3 Fakhreddine Karray, Milad Alemzadeh, Jamil Abou Saleh and Mo Nours Arab; Human Computer Interaction: Overview on State of the Art; March 2008
4 James Cannan and Huosheng Hu; Human-Machine Interaction (HMI): A Survey; Technical Report CES-508
5 Fakhreddine Karray, Milad Alemzadeh, Jamil Abou Saleh and Mo Nours Arab; Human Computer Interaction: Overview on State of the Art; March 2008
6 James Cannan and Huosheng Hu; Human-Machine Interaction (HMI): A Survey; Technical Report CES-508

Multimodal HCI Systems: 7 8 The term multimodal refers to the combination of multiple modalities. Examples of applications of multimodal systems include: Smart Video Conferencing, Intelligent Homes/Offices, Driver Monitoring, Intelligent Games, E-Commerce, and Helping People with Disabilities. Most notable is their use as a source of control for a prosthetic limb, such as an arm or a leg. There are also full-body exoskeleton suits that can enhance a user's strength, such as the Hybrid Assistive Limb (HAL).
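As a minimal sketch of what combining modalities can mean in practice, the snippet below implements a simple late-fusion scheme: each modality produces its own confidence scores, and the system merges them with a weighted sum before deciding. The classifier outputs, labels, and weights here are invented for illustration and do not come from the cited surveys.

```python
def late_fusion(modality_scores, weights):
    """Combine per-modality score dictionaries {label: confidence}
    into a single weighted score per label (late fusion)."""
    fused = {}
    for scores, weight in zip(modality_scores, weights):
        for label, confidence in scores.items():
            fused[label] = fused.get(label, 0.0) + weight * confidence
    return fused


# Hypothetical outputs of a visual and an audio emotion classifier.
visual = {"happy": 0.7, "neutral": 0.3}
audio = {"happy": 0.4, "neutral": 0.6}

# Weighting the visual modality slightly higher (arbitrary choice).
fused = late_fusion([visual, audio], weights=[0.6, 0.4])
decision = max(fused, key=fused.get)
print(fused, decision)
```

Late fusion is only one design point: systems can also fuse raw features early, before classification, at the cost of needing a joint model over all modalities.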

7 Fakhreddine Karray, Milad Alemzadeh, Jamil Abou Saleh and Mo Nours Arab; Human Computer Interaction: Overview on State of the Art; March 2008
8 James Cannan and Huosheng Hu; Human-Machine Interaction (HMI): A Survey; Technical Report CES-508
