
Accessible and Assistive ICT

VERITAS
Virtual and Augmented Environments and Realistic User
Interactions To achieve Embedded Accessibility DesignS
247765

Parameterization of models to
the context of use
Deliverable No. D1.7.3

SubProject No.: SP1    SubProject Title: User Modelling

Workpackage No.: W1.7    Workpackage Title: Use Cases and application scenarios

Activity No.: A1.7.2    Activity Title: Parameterisation of models to the context of use

Authors: Eleni Chalkia (CERTH/HIT), Karel Van Isacker (MCA), Elisa Landini (ReLAB), Romina Catani (INDESIT), Irene Duci (PIAGGIO), Nikolaos Partarakis (FORTH), Blanca Jordan Rodriguez (ATOS)

Status F (Final)

Dissemination Level: Pu (Public)

File Name: VERITAS_D1.7.3_final

Project start date and duration: 01 January 2010, 48 months
VERITAS_D1.7.3 PU Grant Agreement # 247765

Version History table


Version no. | Dates and comments

1 | March 2011: 1st draft of the multidimensional table presented at the plenary meeting in Basel and launched in Internal Deliverable 1.7.2 of Month 16.

2 | April 2011: 2nd draft taking into consideration the comments and changes from the plenary meeting; the task analysis was also updated.

3 | June 2011: 2nd draft of the multidimensional table and task analysis presented at the plenary meeting in Santorini. Updates from partners led to the 3rd version of the document.

4 | September 2011: 3rd draft presented and commented on by the partners. Final multidimensional table distributed to the partners to be filled in; task analysis still under finalisation.

5 | December 2011: Final version of the task analysis released. Multidimensional table sent to all partners for updates. 4th version of the Deliverable available.

6 | January 2012: Deliverable finalised and sent for peer review.

7 | February 2012: Final version released after peer review.

February 2012 3 CERTH/HIT



Table of Contents
Version History table ...............................................................................3
Table of Contents ...................................................................................4
List of Figures .........................................................................................5
List of Tables ..........................................................................................5
List of Abbreviations................................................................................6
Executive Summary ................................................................................7
1 Introduction .......................................................................................8
2 Analysing the context of Use ............................................................9
3 Methodological framework .............................................................. 11
4 VERITAS final task analysis ........................................................... 14
5 Context of Use in VERITAS application domains ............................ 17
5.1 VERITAS context of Use - Automotive domain ..................................................... 17
5.1.1 Car interior environment ................................................................................ 17
5.1.2 PTW environment .......................................................................................... 20
5.1.3 ADAS/IVIS environment ................................................................................. 22
5.1.4 ARAS/OBIS environment ............................................................................... 22
5.2 VERITAS context of Use - Smart living spaces domain ........................................ 23
5.2.1 House interior environment ............................................................................ 23
5.2.2 Domotic applications interface environment ................................................... 28
5.3 VERITAS context of Use - Workspaces domain.................................................... 30
5.3.1 Workspace interior environment..................................................................... 30
5.3.2 Collaborative tools applications interface environment ................................... 32
5.4 VERITAS context of Use - Infotainment domain.................................................... 37
5.4.1 Metaverses applications interface environment.............................................. 37
5.4.2 Collaborative games applications interface environment ................................ 43
5.5 VERITAS context of Use - Healthcare domain ...................................................... 45
5.5.1 Remote control healthcare applications interface environment ...................... 45
5.5.2 Mobile applications interface environment ..................................................... 46
5.5.3 Health coach applications interface environment ........................................... 47
5.6 VERITAS Models in the context of Use ................................................................. 50
6 Conclusions and future steps .......................................................... 65
Reference ............................................................................................. 66
ANNEX 1: VERITAS Final task analysis for the Use Cases .................. 68
ANNEX 2 : VERITAS Multidimensional table of Models in the context of
Use ....................................................................................................... 68


List of Figures
Figure 1: From the initial to the final Use Cases and Task analysis. .................................... 12
Figure 2: From the user needs extraction to the Usability evaluation through the Context of
Use. .................................................................................................................................... 13
Figure 3: Methodology for the extraction of the multidimensional table of models in the
context of use...................................................................................................................... 50
Figure 4: Multidimensional table of models in the context of use. ........................................ 51

List of Tables
Table 1: Example of the final task analysis. ......................................................................... 16
Table 2: Car interior context of use. .................................................................................... 20
Table 3: PTW context of use. .............................................................................................. 21
Table 4: ADAS/IVIS context of use...................................................................................... 22
Table 5: ARAS/OBIS context of use. ................................................................................... 22
Table 6: House interior context of use. ................................................................................ 28
Table 7: Domotic applications context of use. ..................................................................... 29
Table 8: Workspace interior context of use. ........................................................................ 32
Table 9: Collaborative tools context of use. ....................................................................... 36
Table 10: Metaverses context of use. .................................................................................. 42
Table 11: Collaborative games context of use. .................................................................... 44
Table 12: Healthcare remote control application context of use. ......................................... 45
Table 13: Mobile applications context of use. ..................................................................... 46
Table 14: Health-coach context of use. ............................................................................... 49


List of Abbreviations
Abbreviation   Explanation
ADAS           Advanced Driver Assistant System
ARAS           Advanced Rider Assistant System
HCI            Human Computer Interaction
ICT            Information Communication Technologies
IVIS           In Vehicle Information System
OBIS           On Board Information System
PTW            Powered Two Wheeler


Executive Summary
The modern way of living has multiplied the ways we interact with various products, which usually have more than one sophisticated function or multiple modes of use. In addition, the extended usage of ICT and non-ICT applications by people of all ages and all kinds of special abilities requires that they be designed in such a way as to satisfy this diverse range of users.

This need for flexibility in system design is reinforced by the multi-domain scope of VERITAS, which aims to apply the models that will be developed in different application environments. This multi-domain applicability of the models adds great importance to the context of use, which needs to be clearly defined and correlated with all the components of the models to be developed.

To this end, the scope of this Deliverable is to provide models adapted to the different contexts of use for each application domain of VERITAS, namely the automotive, smart living spaces, home interior and domotics, workplace, infotainment and healthcare domains.

More specifically, a detailed and updated task analysis that fits exactly the needs of the beneficiaries is defined on the basis of the Use Cases. Additionally, this task analysis complements the Use Cases of D1.7.1.b by associating each Use Case and scenario with specific tasks, thus determining the VERITAS Use Cases with reference to the different applications (ICT, non-ICT), environments (automotive, smart living spaces, home interior and domotics, workplace, infotainment and healthcare) and user groups.

The goal of this Deliverable is therefore two-fold: firstly, to describe the context of use in the different application domains of VERITAS and, secondly, to determine to what extent the different characterising components of the environment are related to specific tasks for specific user groups, providing criteria and thresholds for the tasks depending on the environment and the user.

The outcome of this activity is the object-oriented description of the context of use for the specific applications studied, developed and tested in VERITAS, as well as a multidimensional table that presents the interaction of the models in each context of use.

The work of this activity and the parameterised models in the context of use have been and will be used in all activities of SP1, namely WP1.3, WP1.4 and WP1.5 on user models, to define the concrete values of the models for each task, sub-task and primitive task, either from the literature or from the multisensorial platform measurements. The parameterised models in the context of use will also be used in SP2, specifically WP2.2, WP2.3, WP2.4 and WP2.5, for the development of the VERITAS simulation models. The methodology of the multidimensional tables has been defined in the current activity, and their content was defined preliminarily in WP1.3, WP1.4 and WP1.5, as well as in the aforementioned SP2 activities. The content of the tables has been iteratively updated over the second year of the project through the project's meetings and workshops, so that the multidimensional tables of the models in the context of use presented in this Deliverable are the final ones.


1 Introduction
The increased usage of everyday ICT and non-ICT products, combined with the wide variation of users in terms of age and abilities, dictates the need for a certain flexibility in the design of products. These flexibility issues become even more important when we include disabled and elderly people in our target group: a user group with extremely diverse needs. Additionally, as Norman has pointed out (Norman, 1998), inadequately designed objects may not only induce negative emotions in users but may also put them in dangerous situations. That is especially the case for people with disabilities. Thus, systems and products have to be designed in a more flexible way to better cope with the different contexts of use (Dittmar and Forbrig, 2005).

The need for a better understanding of the role of context has been present in VERITAS from the beginning of the project. Since the models under development are to be used in various and diverse application domains, this also means different environments. Although this is not new, the concept of VERITAS reinforces the insight that interactive systems and models should be designed in a more flexible way, so as to enable their application in a greater variety of situations. Consequently, designers have to place greater emphasis on the context of use.

The term "context", and even more so the term "context of use", has been given various definitions. The history behind the terms, as well as their use in the literature, is presented in Chapter 2, where an analysis of the context of use is illustrated.

In VERITAS the context of use goes one step further, towards an interactional context: a context that evolves in the course of interaction between the user model and the specific environment. The interactional context is task-oriented and defined by the interaction of the user with information and communication devices, as well as with the infrastructure. The contextual task model is an aggregated body of information constructed in a multidimensional table that includes information about a) environmental parameters that can be used to determine the user's current situation, and b) interaction characteristics of the tasks performed by the specific user type. The methodology followed for the extraction of this multidimensional table is described in Chapter 3. In Chapter 4 the final task analysis of VERITAS is presented. Additionally, after the literature survey, the description of the terms and the methodological approach, the actual representation of the context of use in VERITAS is provided in Chapter 5, where the specific environments for each application domain (automotive, smart living spaces including domotics, workplace, infotainment and healthcare) are presented following an object-oriented approach. In Chapter 6 the conclusions and future steps of the current Deliverable are presented.

The overall goal of the context assessment is to develop context-of-use models for users' interaction with applications in all the VERITAS application domains, as well as to provide information for their further usage within the project.


2 Analysing the context of Use


When developing interactive systems within a specific project, the design model gives structure and meaning to the design problem, enabling the designer to negotiate the design task using a process or systematic method. Models (task models, user models, simulation models) help the developer of the system-to-be to visualise the problem and cope with it by breaking it down into discrete, manageable units. In VERITAS we are developing a set of models that aim at representing the user of a system realising specific tasks within a specific environment. Thus, the real value of a model can only be determined within the specific context of use in which it will be implemented (Ryder, 2006; Fauser et al., 2006).

According to the ISO 9241-11:1998 standard, usability is defined as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". In the same standard, the context of use is defined by the users, tasks, equipment (hardware, software and materials), and the physical and social environments in which a product is used. Since usability is evaluated in context, it is important for designers to rely on a flexible task representation, able to capture context variations.

Trying to determine and categorise the term "context of use" raises plenty of difficulties because of the manifold ways this term has been described in the past by different groups according to their field of study. According to Winograd (Winograd, 2002), the concept "context" originates from the words "con" and "text", meaning "with text". A different approach has been given by the Definitions website, which defines context as "a set of facts or circumstances that surround a situation or event". In the field of HCI, context has been defined according to the location and proximity of objects (Schilit and Theimer, 1994), by providing synonyms (Rodden et al., 1998), or by identifying factors that constitute the current context of the user (Dey and Abowd, 1999). Many ethnographic researchers have dedicated their work to understanding context (Dourish, 2001, 2004; O'Hara et al., 2006). These researchers consider the context of use as part of a holistic picture of experience (see e.g. Hassenzahl & Tractinsky, 2006; Roto, 2006), although research into context targets the modelling of features (Cheverst et al., 2000, 2001; Dey, 2001) and usability or user experience. Bradley and Dunlop (2005) try to combine theories from different fields, such as linguistics, computer science and psychology, to present a multidisciplinary model of context. In this model, the context of use is defined by the task, physical, social and temporal components of context.

In this Deliverable, we use the term context as an object-oriented representation of the environment in which the user is interacting. The closest definition to this has been provided by Dey et al. (2001): "any information that can be used to characterize the situation of entities (i.e. whether a person, place or object) that are considered relevant to the interaction between a user and an application, including the user and the application themselves".

According to Dittmar and Forbrig (Dittmar and Forbrig, 2005), humans apply interactive
systems to transfer mental actions to application functions (functional division) or as a
means to co-operate with each other (division of labour). Consequently, a context of use
describing an actual situation under which an interactive system is applied by a user is
determined by the current functional division, the current division of labour, the physical
environment (including devices in use) and the characteristics of the user. According to the
above and since in VERITAS the cooperation between users is not under research, the
context of use is defined by the following:


Interaction tasks.
Physical environment.
Characteristics of the user.

Thus, the aforementioned context depends on three very different, but closely connected, components. We have on the one hand the different environments and on the other the different user models, while the connection between them lies in the interaction. So, in VERITAS the context of use goes one step further, to the interactional context that evolves in the course of interaction between the user and the specific environment. The interactional context is task-oriented and defined by the interaction of the user with information and communication devices, as well as with the infrastructure. The contextual task model is an aggregated body of information constructed in a multidimensional table that includes information about a) environmental parameters that can be used to determine the user's current situation, and b) interaction characteristics of the tasks performed by the specific user type.

The interactional context is associated with actions and events concerning the user, the environment and the interaction. The interactional problem of context arises beyond task modelling when the models are placed in a specific environment: such context is not static information, but is rather constituted through the interaction, defined and sustained by the activity itself (Liu et al., 2003). An ideal interactional context model should provide information on the context aspects relevant for the given application, hide irrelevant, confusing and redundant context details, and offer a high-level interpretation of lower-level context details on request.
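As an illustration only, such an interactional context model can be sketched as a record combining environmental parameters with the interaction characteristics of a task. The data structures and field names below are hypothetical, chosen for the sketch; they are not part of the VERITAS specification.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentalParameters:
    """Parameters used to determine the user's current situation (illustrative)."""
    domain: str         # e.g. "workplace", "automotive"
    location: str       # e.g. "office", "car interior"
    lighting_lux: float # ambient illumination level
    noise_db: float     # ambient noise level

@dataclass
class TaskInteraction:
    """Interaction characteristics of a task performed by a user type (illustrative)."""
    task: str           # e.g. "enter office"
    primitive_task: str # e.g. "grasp"
    body_part: str      # e.g. "hands"
    task_object: str    # e.g. "door handle"

@dataclass
class InteractionalContext:
    """Context that evolves through the user-environment interaction."""
    user_type: str      # e.g. "reduced vision"
    environment: EnvironmentalParameters
    interaction: TaskInteraction

ctx = InteractionalContext(
    user_type="reduced vision",
    environment=EnvironmentalParameters("workplace", "office", 300.0, 45.0),
    interaction=TaskInteraction("enter office", "grasp", "hands", "door handle"),
)
print(ctx.interaction.task_object)  # door handle
```

A model of this shape makes it straightforward to hide irrelevant context details: an application simply reads only the fields it declares relevant.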


3 Methodological framework
This Deliverable is the continuation and, in fact, the finalisation of WP1.7, combining the Use Cases developed in A1.7.1 with the task models developed in A1.7.2 and parameterising the models in the context of their use. Based on the user needs extracted in WP1.1 and illustrated in Deliverable 1.1.1 (Goranova, Van Isacker et al., VERITAS D1.1.1, 2010), the corresponding Use Cases have been built, translating the needs of the end-users into scenarios describing the sequences of events that, taken together, lead to a system doing something useful (Chalkia et al., VERITAS D1.7.1.a, D1.7.1.b, 2009-2010). Developing the Use Cases has been a joint task between users, usability engineers and software engineers, who have been called upon to evaluate and assess the Use Cases of VERITAS numerous times within the first two years of the project, until their finalisation in Month 24. Hand in hand with the Use Cases, the task analysis has also been discussed with stakeholders and iteratively evaluated by them at every event organised for VERITAS, including user workshops and dedicated user focus groups.

Developing Use Cases for VERITAS has been a very complex issue, since Use Cases are by definition addressed to the designer, but in VERITAS all the Use Cases for the specific domains (automotive, smart living spaces, home automation and domotics, workplace, entertainment and personal healthcare) refer to tasks to be realised by the user models, which are actually simulations of the beneficiary. To this end, combining the technically oriented Use Cases, which are meant to be used by the developers and correspond to their needs, with the tasks that the actual beneficiaries need to realise and that are critical to be tested in each application domain, is a task that has been handled in A1.7.3 and is reflected in the current Deliverable. The know-how gained from the developers, combined with the real needs of the beneficiaries in each application domain, has led us to a concrete analysis of tasks, based upon the Use Cases.

Thus, the Use Cases have been developed so as to address the developers and the functional requirements of the system. Nevertheless, the tasks of the task analysis are included in the Use Cases, since they constitute the models that will run during the simulations. At the beginning of the project the Use Cases and the task analysis ran in parallel, with the intention of merging them once they reached a point of maturity. At that stage the Use Cases were a subset of the task analysis, which also included some additional tasks. As the project continued, and after the Use Case prioritisation by the developers and the task analysis from the beneficiaries, the Use Cases and the task analysis were combined to create the final task analysis that is presented in the current document and illustrates the top-priority tasks of the top-priority Use Cases. The evolution of the Use Cases and the task analysis is depicted in the following figure.


[Figure: The first versions of the Task Analysis (ID1.7.2) and of the Use Cases (D1.7.1.a) underwent iterative evaluation with experts internal and external to the Consortium, together with prioritisation and discussion through workshops and user groups, resulting in the final Task Analysis & Use Cases (D1.7.3, D1.7.1.b).]

Figure 1: From the initial to the final Use Cases and Task analysis.

But that is not the only scope of this Deliverable. Additionally, since we are studying the context of use and trying to parameterise the models into the context of use of each different application, we had to go deep into each application domain and identify metrics and thresholds valid for each specific application. To this end we have created a multidimensional matrix that is based on the Use Cases, mapped to the identified final tasks decomposed into sub-tasks and primitive tasks, and combined with specific physical impairments, cognitive attributes and behavioural & psychological states, defining the context of use through specific criteria and thresholds that characterise the specific task for each application of each domain.

Thus, with this Deliverable we provide a multidimensional table that includes a detailed and updated task analysis that fits exactly the needs of the beneficiaries and breaks down all the Use Cases of D1.7.1.b into specific tasks, with reference to the different applications (ICT, non-ICT), environments and contexts (automotive, smart living spaces, home automation and domotics, workplace, entertainment, personal healthcare and the specific sub-domains to be identified), as well as user groups, embedding the detailed decomposition of tasks that will be used for the generation of the virtual task models developed in the other activities of SP1, as well as in SP2.

Therefore, following the work of the previous activities in WP1.7, this activity goes a step further. For each of the specific Use Cases the final tasks are defined and decomposed, so as to form a concrete task analysis. This task analysis table is mapped to specific task-oriented metrics and thresholds, resulting in a concrete parameterisation of models in the context of use. This parameterisation will be detailed for each type of model targeted in VERITAS (physical models, cognitive models, behavioural/psychological models), for each of which different tasks are proposed.

The overall development process of the context of use comprises the following:

Requirements analysis from the developers, in which the scenarios and the Use Cases are built in collaboration with potential users.
Requirements analysis from the beneficiaries, so as to define a concrete task analysis.
Development of a context-of-use model based on the above scenarios, Use Cases and tasks.
Implementation of the context-of-use model in the simulation models of VERITAS.
Evaluation of the simulation models and the final VERITAS application, and thus of the context-of-use model, on specific tasks from the task analysis.


Scenarios were chosen because they provide a method that can be used at different stages of the entire development process: envisioning future technologies, defining user requirements, describing how people will use new systems, analysing user tasks, and prototyping and evaluating systems or prototypes (Go and Carroll, 2004). In addition, a Use Case definition of the scenarios in the design stage (Carroll and Rosson, 2002) ensures that the developers understand the scenarios. Scenario- and Use Case-based system design is suitable for gaining insights into how users will accept and work with future technologies (Carroll, 2000), providing a certain flexibility because it can be used in different phases of the development cycle, as in the case of VERITAS.

The work of this activity and the parameterised models in the context of use have also been used in all activities of SP1, namely WP1.3, WP1.4, WP1.5 and WP1.6 on user models, to define the concrete values of the models for each task, sub-task and primitive task, either from the literature or from the multisensorial platform measurements. The parameterised models in the context of use will also be used in SP2 and WP2.2, WP2.3, WP2.4 and WP2.5 to develop the simulation models on this basis.

The detailed representation of the whole methodological approach is provided in the following figure.

[Figure: The identification of user groups feeds the needs and requirements analysis, carried out with designers and beneficiaries. From these, the system requirements, the scenarios and Use Cases, and the task analysis are derived, leading to the final tasks and Use Cases. Combined with the abstract user models, these define the context of use, which yields the virtual user models in the context of use, the simulation models and the VERITAS tools and models, which in turn feed the usability evaluation.]

Figure 2: From the user needs extraction to the Usability evaluation through the Context of
Use.


4 VERITAS final task analysis


Human actions can be characterised as situated and tool-mediated (Leont'ev, 1978). Task analysis and modelling is an iterative process of identification and description of tasks that differ between human beings with different abilities, as well as across different contexts of use. For task decomposition several criteria could be applied, but their relevance varies according to the task model layer. Tasks can be decomposed according to some of the following criteria (Pribeanu, 2007):
Function: tasks associated with the same business goal. This criterion applies to task decomposition at the functional level, for the mapping of application functions to user tasks.
Semantics: tasks performing an operation on the same domain object. This criterion is applied to separate tasks which refer to the same object, the same operation (add new, delete) or the same interaction method (when several methods are available to accomplish a goal). It is relevant for both the functional and the operational level and helps in the identification of unit tasks.
Task object: tasks performing operations with the same interaction object or external object. The criterion is relevant for the operational level and helps in the identification of basic tasks.
User and work: tasks performed by the same user (playing a given role) and denoting similar work (manual, interactive, communication). The criterion is mainly relevant for cooperative tasks.
Temporal: tasks denoting specific temporal constraints (such as repetitive or optional performance). The criterion is relevant for the representation of temporal constraints among tasks.
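A decomposition guided by such criteria naturally forms a tree from high-level tasks down to primitive tasks. The following is a minimal sketch of that structure; the node names, the example tasks and the `leaves` helper are illustrative assumptions, not taken from the VERITAS task analysis itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TaskNode:
    """One node in a hierarchical task decomposition."""
    name: str
    criterion: str = ""  # which decomposition criterion produced this split
    children: List["TaskNode"] = field(default_factory=list)

    def leaves(self) -> List[str]:
        """Return the primitive tasks (leaf nodes) of this subtree, in order."""
        if not self.children:
            return [self.name]
        return [leaf for child in self.children for leaf in child.leaves()]

# Illustrative decomposition of a high-level task into primitive tasks.
enter_office = TaskNode("enter office", "function", [
    TaskNode("locate door", "task object",
             [TaskNode("locate"), TaskNode("walk")]),
    TaskNode("operate door handle", "task object",
             [TaskNode("grasp"), TaskNode("push"), TaskNode("release")]),
])

print(enter_office.leaves())  # ['locate', 'walk', 'grasp', 'push', 'release']
```

Walking the leaves of such a tree gives exactly the flat sequence of primitive tasks that the task analysis tables record row by row.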

For performing the task analysis in VERITAS, we relied on a contextual task analysis, whereby observation and one-on-one interviewing were applied to understand the (often complex) task procedures that people with and without disabilities have to follow to reach their goals (Goranova, Van Isacker, VERITAS D1.7.2, 2010). Due to the expected extent and complexity of the task analysis, this work started in the first months of the project and lasted for the next two years. The task analysis as applied in WP1.7 analyses how a task is completed, including a detailed description of both manual (physical) and mental (cognitive, psychological and behavioural) activities and tasks (subtasks and primitive tasks), as well as environmental conditions (different application areas).

The aim of the task decomposition is to break the high-level tasks down into their constituent subtasks and primitive tasks. This results in an overall hierarchical structure of the main user tasks. The process of task decomposition is best represented as a Hierarchical Task Analysis. For this reason, tables were defined, sequencing the different tasks, subtasks and primitive tasks vertically, while detailing them horizontally (Goranova, Van Isacker, Lazarov, VERITAS D1.7.2, 2010).

The structure of the task analysis is presented below in a short extract for the Workplace application area (specifically, using a desk), while the summary table with the tasks and the Use Cases is presented in Annex 1.
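The hierarchical decomposition described above can be sketched as a small tree structure. The class and the sample tasks below are an illustrative sketch of the task/subtask/primitive-task hierarchy, not part of the VERITAS specification.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A node in a hierarchical task analysis: a task, a subtask or a primitive task."""
    name: str
    children: list = field(default_factory=list)

    def is_primitive(self) -> bool:
        # Primitive tasks are the leaves of the hierarchy.
        return not self.children

    def flatten(self, depth=0):
        """Yield (depth, name) pairs in top-down order, mirroring the HTA tables."""
        yield depth, self.name
        for child in self.children:
            yield from child.flatten(depth + 1)

# Illustrative fragment of the Workplace task analysis (desk scenario).
enter = Task("Entering the workspace and sitting on desk", [
    Task("Locate and enter office", [Task("Locate"), Task("Grasp"), Task("Push")]),
    Task("Sit on chair", [Task("Sit (full body)")]),
])

for depth, name in enter.flatten():
    print("  " * depth + name)
```

Flattening the tree top-down reproduces the vertical sequencing of the HTA tables, with the depth giving the column (task, subtask, primitive task).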

The columns of the table are explained hereafter:

- Domain: this column defines one of the five application areas addressed by VERITAS, namely automotive, work environment, house environment, infotainment and ehealth.
- Sector: this column lists the Use Cases for each application domain.
- Task: this is the top level of the task analysis itself and describes an activity such as steering a car. This is then split up into:
  - Subtask
    - Disability/AT: this column defines which disability encounters a barrier with a specific subtask, and what assistive technology is used to overcome it.
  - Primitive task, which is realised depending on the disability involved.

Apart from physical primitive tasks, there are also two other aspects we need to consider:

- Cognitive attributes
- Psychological and behavioural state

| Task | Sub task | Disability | Cognitive state affecting the task | P&B state affecting the task | Primitive task | Body part | Primitive task object |
|---|---|---|---|---|---|---|---|
| Entering the workspace and sitting on desk | Locate and enter office | Blind and low vision, reduced vision | Cognitive aging, Alzheimer's | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Locate | Eyes | Door |
| | | Reduced vision | Cognitive aging | Mental fatigue, emotions | Walk | Feet | Door |
| | | Blind and low vision, reduced vision | Cognitive aging, Alzheimer's | Acute stress, very high acute stress, chronic stress, mental fatigue, emotions | Locate | Eyes | Door handle |
| | | Reduced vision | Cognitive aging | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Grasp | Hands | Door handle |
| | | Reduced vision | Cognitive aging | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Push | Hands | Door handle |
| | | Reduced vision | Cognitive aging | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Push | Hands | Door |
| | | Reduced vision | Cognitive aging, Alzheimer's | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Release | Hands | Door handle |
| | Locate and walk to the office chair | Blind and low vision, reduced vision | Cognitive aging, Alzheimer's | Acute stress, very high acute stress, chronic stress, mental fatigue, emotions | Locate | Eyes | Chair |
| | | Reduced vision | Cognitive aging | Mental fatigue, emotions | Walk | Feet | Ground |
| | Sit on chair | Reduced vision | Cognitive aging | Acute stress, very high acute stress, chronic stress, stress, mental fatigue | Push | Hands | Chair |
| | | Reduced vision | Cognitive aging (full body) | - | Sit (full body) | Full body | Chair |

Table 1: Example of the final task analysis.


5 Context of Use in VERITAS application domains


As presented in Chapter 2, in VERITAS with the term context of use we describe an object
oriented representation of the environment where the user is interacting, following the
definition that has been provided by Dey et al. (2001): any information that can be used to
characterize the situation of entities (i.e. whether a person, place or object) that are
considered relevant to the interaction between a user and an application, including the user
and the application themselves.

In this Chapter we present the framework for the context of use for each application domain of VERITAS. The context of use is defined for the specific Use Cases and tasks that have been identified in the task analysis, and is described through the objects related to each task, accompanied by the corresponding object parameters.
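Such an object-oriented context representation can be sketched as below. The class, attribute names and numeric values are illustrative assumptions modelled on the parameter columns of the tables that follow, not an actual VERITAS data model.

```python
from dataclasses import dataclass

@dataclass
class ContextObject:
    """A task-related object in the context of use, with its simulation parameters."""
    name: str
    position: tuple              # (x, y, z) in metres
    dimensions: tuple = None     # e.g. (w, h, d), where applicable
    force_range_n: tuple = None  # (min, max) actuation force in newtons
    labelling: str = None

# Illustrative instance modelled on the "Door handle" rows of the automotive tables;
# all coordinates and forces are hypothetical.
door_handle = ContextObject(
    name="Door handle",
    position=(0.45, 0.95, 0.10),
    dimensions=(0.12, 0.03, 0.04),
    force_range_n=(5.0, 25.0),
)

def within_reach(obj: ContextObject, reach_envelope) -> bool:
    """Check whether the object's position lies inside a user's reach envelope,
    given as ((xmin, xmax), (ymin, ymax), (zmin, zmax))."""
    return all(lo <= c <= hi for c, (lo, hi) in zip(obj.position, reach_envelope))

print(within_reach(door_handle, ((0, 1), (0, 1.5), (0, 0.5))))  # → True
```

A simulation model can then query such objects against a virtual user model, for example to test whether a control is reachable or its actuation force exceeds the user's capability.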

The context of use tables and the object parameters of the different domains have also been used for the extraction of the simulation models in SP2, namely in Deliverables D2.2.1 (Varalda et al, VERITAS D2.2.1, 2011), D2.3.1 (Telkamp et al, VERITAS D2.3.1, 2012), D2.4.1 (Petridis et al, VERITAS D2.4.1, 2011), D2.5.1 (Nunnari et al, VERITAS D2.5.1, 2012) and D2.6.1 (Tamburini et al, VERITAS D2.6.1, 2012).

5.1 VERITAS context of Use - Automotive domain


In this Chapter the environment of the Automotive application domain of VERITAS is presented, identifying the task-related objects that are included in each context and their parameters, adapted to the updated task analysis. In the automotive domain we have two different sectors: the non-ICT sector, which covers the car interior environment (subchapter 5.1.1) and the PTW design (subchapter 5.1.2), and the ICT sector, which covers the ADAS/IVIS (subchapter 5.1.3) and ARAS/OBIS (subchapter 5.1.4) applications.

5.1.1 Car interior environment

| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Getting in a car | Open left front doors | Door handle | Position in x, y, z; door handle dimensions; up-down and left-right forces and handle movement range |
| | | Button that opens doors | Button dimensions; movement range; unlock force law; labelling |
| | | Voice activated opening of doors | Recognition rate; semantics; interaction tree |
| | | Door | Movement range; opening forces; opening angle |
| | Enter in car through left front door | Car seat | Position in x, y, z |
| | | Car interior | Position in x, y, z |
| | | Support bar inside car (on top of car door entrance on the inside) or portable handle | Position in x, y, z; movement range; up-down and left-right forces |
| | Close left front door | Interior door handle | Position in x, y, z; door handle dimensions; up-down and left-right forces and handle movement range |
| | | Voice activated closing of doors | Recognition rate; semantics; interaction tree |
| | | Voice activated door lock | Recognition rate; semantics; interaction tree |
| | | Door | Movement range; opening forces; opening angle |
| | | Lock button | Button dimensions; movement range; lock force law; labelling |
| | | Button that closes doors | Button dimensions; movement range; lock force law; labelling |
| Conducting a car | Steering | Steering wheel | Position in x, y, z; steering wheel dimensions; up-down and left-right forces and steering wheel movement range |
| | | Steering ball or spinner | Position in x, y, z; steering wheel dimensions; up-down and left-right forces and knob movement range |
| | Changing gear | Gear pedal | Position in x, y, z; movement range; actuation/de-actuation force law |
| | | Gear handle | Position in x, y, z; gear handle dimensions; up-down and left-right forces and knob movement range |
| | | Button gear selector | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Controls on steering wheel | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Push button clutches | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Viewing backwards (inside car) | Central rear view mirror | Position in x, y, z; mirror dimensions; surface grip; movement friction |
| | | Voice activated mirror | Recognition rate; semantics; interaction tree |
| | Viewing backwards (outside car, left mirror) | Left lateral mirror | Position in x, y, z; mirror dimensions; surface grip; movement friction |
| | | Left lateral mirror knob | Position in x, y, z; knob dimensions; up-down and left-right forces and knob movement range |
| | | Mirror button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Voice activated mirror | Recognition rate; semantics; interaction tree |
| | Parking brake activation | Handbrake lever with pull unlock handle | Position in x, y, z; lever length; lever range; handle diameter and grip; pull force law |
| | | Handbrake lever unlock button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Parking brake deactivation | Handbrake lever with pull unlock handle | Position in x, y, z; lever length; lever range; handle diameter and grip; resistance force law |
| | | Handbrake lever unlock button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| Using interior equipment | Accessing interior storage | Storage door | Position in x, y, z; label specifications |
| | | Push-to-open storage door | Position in x, y, z; resistance force law |
| | | Storage compartment door handle | Position in x, y, z; handle dimensions; movement range; movement force law |
| | | Storage compartment | Position in x, y, z; dimensions and physical accessibility |

Table 2: Car interior context of use.

5.1.2 PTW environment

| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Riding a Powered Two Wheeler | Steering | Handle bars | Position in x, y, z of the handle bars; left-right forces and movement range |
| Motorcycle handling | Starting the engine | Ignition switch | Position in x, y, z of the ignition switch; ignition law forces |
| | | Kick-start lever | Position in x, y, z of the kick-start lever; resistance law forces |
| | Decelerating | Throttle | Position in x, y, z of the throttle |
| | Accelerating | Throttle | Position in x, y, z of the throttle |
| | Gearing (motorbike) | Clutch | Position in x, y, z of the clutch; resistance law forces |
| | Braking | Dual lever system (front and rear brake) or front brake lever | Position in x, y, z of the lever |
| | Indicate direction | Turn signal | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Locating and adjusting the lateral mirror (left) | Left lateral mirror | Position in x, y, z; mirror dimensions; surface grip; movement friction |
| Riding posture | Usage of scooter on bumpy roads | Handlebars | Vibration comfort index for the hands |
| | | Footrest | Vibration comfort index for the feet |
| | | Seat | Vibration comfort index for the buttock and the head |
| | Get on a motorbike (driver) | Handle bars | Position in x, y, z of the handle bars |
| | | Seat | Position in x, y, z of the seat |
| | | Ground | Position in x, y, z of the ground related to the body posture |
| | Get on a scooter (driver) | Handle bars | Position in x, y, z of the handle bars |
| | | Footrest | Position in x, y, z of the footrest; resistance law forces |
| | | Seat | Position in x, y, z of the seat |
| | | Ground | Position in x, y, z of the ground related to the body posture |
| Parking the motorcycle/scooter | Park a powered two wheeler vehicle onto a side stand | Side stand | Position in x, y, z of the side stand; resistance law forces |
| | | Handle bars | Position in x, y, z of the handle bars |
| | | Seat | Position in x, y, z of the seat |
| | Park a powered two wheeler vehicle onto a centre stand (if possible start with the vehicle on its side stand) | Left handlebar | Position in x, y, z of the handle bars |
| | | Frame member | Position in x, y, z of the frame member |
| | | Centre stand | Position in x, y, z of the centre stand |
| | | Handlebars | Position in x, y, z of the handle bars |
| | | Ground | Position in x, y, z of the ground related to the body posture |
| | Get off a powered two wheeler vehicle (driver) | Handle bars | Position in x, y, z of the handle bars |
| | | Seat | Position in x, y, z of the seat |
| | | Vehicle | Position in x, y, z of the vehicle related to the body posture |

Table 3: PTW context of use.


5.1.3 ADAS/IVIS environment

| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Managing the ADAS system (Gestures/ACC) | Managing function a (ADAS) | Sensor enabled function | Position in x, y, z of the sensor; range area of the sensors |
| | Managing function b (ADAS) | Sensor enabled function | Position in x, y, z of the sensor; range area of the sensors |
| | Managing function c (ADAS) | Sensor enabled function | Position in x, y, z of the sensor; range area of the sensors |
| Managing IVIS (Typewriter) system by using touch (Tablet PC) | Managing function a (IVIS/Typewriter) | Sensor enabled function | Position in x, y, z of the sensor; range of the sensors |

Table 4: ADAS/IVIS context of use.

5.1.4 ARAS/OBIS environment

| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Receiving information from the device while driving | Speed limit exceeded during PTW riding | Listening to the sounds in the helmet | Recognition rate; semantics; interaction tree |
| | To get aware about low fuel level during regular riding | Listening to the sounds in the helmet | Recognition rate; semantics; interaction tree |

Table 5: ARAS/OBIS context of use.


5.2 VERITAS context of Use - Smart living spaces domain


In this Chapter the environment of each application domain of VERITAS is presented, identifying the task-related objects that are included in each context and their parameters, adapted to the updated task analysis.

5.2.1 House interior environment


| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Moving around outside house | Running down a pathway | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | Opening door | Door | Position in x, y, z of the door |
| | | Door handle | Position in x, y, z; door handle dimensions; up-down and left-right forces and handle movement range |
| | | Voice controlled door | Recognition rate; semantics; interaction tree |
| | Running down a pathway | Wheelchair wheel handles | Position in x, y, z; wheelchair wheel handles dimensions; up-down forces |
| | Stop running down the pathway | Wheelchair wheel handles | Position in x, y, z; wheelchair wheel handles dimensions; up-down forces; resistance forces |
| | | Support bar | Position in x, y, z |
| | Running down a pathway | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| Moving around inside house | Open window | Window bar | Position in x, y, z |
| | | Voice controlled window | Recognition rate; semantics; interaction tree |
| | | Switch | Position in x, y, z; switch dimensions and grip; control force law; labelling |
| | Navigating in room (unfamiliar house) | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | | Furniture | Position in x, y, z |
| | | Walking cane | Position in x, y, z |
| | | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | | Wheelchair wheel handles | Position in x, y, z; wheelchair wheel handles dimensions; up-down forces |
| Using toilet | Using toilet | Toilet door | Position in x, y, z |
| | | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | | Toilet seat | Position in x, y, z |
| | Flush toilet | Flushing actuator knob | Position in x, y, z; knob dimensions; up-down forces |
| | | Flushing actuator button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Washing hands | Faucet | Position in x, y, z |
| | | Faucet knob | Position in x, y, z; faucet knob dimensions; up-down and left-right forces and knob movement range |
| | | Sensor enabled faucet control | Position in x, y, z of the sensor; range of the sensors |
| | | Foot pedal faucet control | Position in x, y, z; foot pedal faucet control dimensions; up-down forces and movement range |
| | Drying hands | Towel | Position in x, y, z |
| | | Sensor hand dryer | Position in x, y, z of the sensor; range area of the sensors |
| Switch on the lights | Turning on light switch | Light switch button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| Kitchen handling | Handling kitchen device interface (electric oven) | Device buttons | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Lever control | Position in x, y, z; lever control dimensions; up-down and left-right forces and movement range |
| | | Electric control touch buttons | Position in x, y, z; button dimensions and grip; labelling |
| | Using gas hob (physical interface) | Gas hob | Position in x, y, z |
| | | Pot | Position in x, y, z |
| | | Pot (single) handle | Position in x, y, z; pot handle dimensions |
| | | Hob gas control knob | Position in x, y, z; knob dimensions; rotation forces and knob movement range |
| | | Knob | Position in x, y, z |
| | | Ignition button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Pots with 2 handles | Position in x, y, z; pot handle dimensions |
| | Using induction hob (touch control) | Pot | Position in x, y, z of the pot |
| | | Pot (single) handle | Position in x, y, z; pot handle dimensions |
| | | Touch buttons/interface | Position in x, y, z; button dimensions and grip; labelling |
| | | Visual feedback (red light) | Position in x, y, z; density of the light |
| | | Buttons/interface | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Induction hob | Position in x, y, z |
| | | Pots with 2 handles | Position in x, y, z; pot handle dimensions |
| | | Touch buttons/interface placed in front of stove | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Using washing machine | Washing machine | Position in x, y, z |
| | | Porthole handle | Position in x, y, z; porthole handle dimensions |
| | | Porthole | Position in x, y, z |
| | | Detergent drawer | Position in x, y, z; detergent drawer dimensions; up-down and left-right forces and drawer movement range |
| | | Detergent | Position in x, y, z; mass in kg |
| | | Knob | Position in x, y, z; knob dimensions; in-out forces and knob movement range |
| | | Laundry/clothes | Position in x, y, z; mass in kg |
| | | Drums filling level | Position in x, y, z; labelling |
| | | Detergents filling level | Position in x, y, z; labelling |
| | | Porthole button (on top of washing machine) | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Using dishwasher | Dish washer | Position in x, y, z |
| | | Door handle | Position in x, y, z; door handle dimensions; up-down and left-right forces and handle movement range |
| | | Dishes | Position in x, y, z; mass in kg |
| | | Door | Position in x, y, z |
| | | Detergent drawer | Position in x, y, z; detergent drawer dimensions; in-out forces and drawer movement range |
| | | Detergent drawer lid | Position in x, y, z |
| | | Detergent | Position in x, y, z; mass in kg |
| | | Detergents filling level | Position in x, y, z; labelling |
| | | Knob | Position in x, y, z; knob dimensions; in-out forces and knob movement range |
| | | Button with tactile interface | Position in x, y, z; button dimensions and grip; labelling |
| | | Door handle button (on top of dish washer) | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Detergent drawer (opens by being pushed) | Position in x, y, z; detergent drawer dimensions; resistance forces |
| | | Door handle/button (in front of dish washer) | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Using fridge | Fridge | Position in x, y, z |
| | | Door handle | Position in x, y, z; door handle dimensions; up-down and left-right forces and handle movement range |
| | | Door button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Food per shelve | Position in x, y, z |
| | | Food | Position in x, y, z |
| | Using hood | Hood | Position in x, y, z |
| | | Lever controls | Position in x, y, z; lever control dimensions; up-down and left-right forces and movement range |
| | | Buttons with very distinctive shapes (blade knobs) | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Visual feedback | Position in x, y, z; colour; density of visual feedback |
| | | Audio feedback | Audio volume and frequency |
| | | Buttons with very distinctive shapes | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Opening oven | Oven | Position in x, y, z |
| | | Oven door handle | Position in x, y, z; oven door handle dimensions; up-down and left-right forces and handle movement range |
| | | Pull-out shelf (located e.g. directly under the counter top) | Position in x, y, z |
| | | Oven rack | Position in x, y, z; dimensions |
| | | Lever or push button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Permanent shelf | Position in x, y, z; dimensions |
| | | Oven pan | Position in x, y, z; dimensions |

Table 6: House interior context of use.

5.2.2 Domotic applications interface environment


| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Domotic applications design | Switching the lights using a domotics interface | HMI panel | Position in x, y |
| | | Control buttons for programming, temperature, timer, start-stop | Position in x, y; dimensions in w, h; contrast level, as specified by the Weber contrast sensitivity formula |
| | | Visual feedback on HMI display | Position in x, y; dimensions in w, h; contrast level, as specified by the Weber contrast sensitivity formula |
| | Door opening with sensors | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | | Door | Position in x, y |
| | | Support bar | Position in x, y |
| | | Ground | Position x, y, z of the ground in respect to the user model; texture; inclination |
| | | Wheelchair wheel handles | Position in x, y, z; wheelchair wheel handles dimensions; up-down forces |
| | Kitchen handling using a domotics interface (electric oven) | Oven HMI panel | Position in x, y |
| | | Control buttons for programming, temperature, timer, start-stop | Position in x, y; dimensions in w, h; contrast level, as specified by the Weber contrast sensitivity formula |
| | | Visual feedback on HMI display | Position in x, y; dimensions in w, h; contrast level, as specified by the Weber contrast sensitivity formula |

Table 7: Domotic applications context of use.
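The tables above refer to the Weber contrast sensitivity formula, but the formula itself was lost in the document's formatting. Assuming the standard definition, Weber contrast is C = (L_target − L_background) / L_background, where L denotes luminance; a minimal sketch:

```python
def weber_contrast(target_luminance: float, background_luminance: float) -> float:
    """Weber contrast C = (L_t - L_b) / L_b: positive for targets brighter
    than their background, negative for darker targets."""
    if background_luminance <= 0:
        raise ValueError("background luminance must be positive")
    return (target_luminance - background_luminance) / background_luminance

# A button twice as bright as the panel behind it has Weber contrast 1.0.
print(weber_contrast(200.0, 100.0))  # → 1.0
```

Such a value can be checked against a simulated user's contrast sensitivity threshold to decide whether a control or its feedback is perceivable.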


5.3 VERITAS context of Use - Workspaces domain


In this Chapter the environment of each application domain of VERITAS is presented, identifying the task-related objects that are included in each context and their parameters, adapted to the updated task analysis.

5.3.1 Workspace interior environment


| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| Entering the workspace and sitting on desk | Locate and enter main entrance | Main entrance | Position in x, y, z |
| | | Door handle | Position in x, y, z; rotation in degrees; mass in kg; resistance force in N |
| | | Door | Position in x, y, z; rotation axis; mass in kg; resistance force in N |
| | Locate and approach lift | Lift doors | Position in x, y, z; translation axis; mass in kg |
| | Locate and push open button | Button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Get aware of open doors and enter lift | Lift doors | Position in x, y, z; translation axis; mass in kg |
| | Locate and push button 2nd floor | Button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Get aware of open doors and exit lift | Lift doors | Position in x, y, z; translation axis; mass in kg |
| | | Outside lift doors | Position in x, y, z |
| | Locate and enter office | Door | Position in x, y, z; rotation axis; mass in kg; resistance force in N |
| | | Door handle | Position in x, y, z; rotation in degrees; mass in kg; resistance force in N |
| | Locate and walk to the office chair | Chair | Position in x, y, z; rotation axis; mass in kg; resistance force in N |
| Using office equipment | Locate button and turn on PC | Power button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Opening drawer | Drawer | Position in x, y, z |
| | | Drawer handle | Translation axis; mass in kg; resistance force in N |
| | Activating PC | Power button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | Activating computer screen | Power button | Position in x, y, z; button dimensions and grip; control force law; labelling |
| | | Computer screen | Position in x, y, z |
| | Using devices | Phone (or printer, scanner) | Position in x, y, z |
| | | Phone hook | Position in x, y, z; mass in kg |
| | | Push buttons on devices | Position in x, y, z; button dimensions and grip; control force law; labelling |
| Print a document and classify it in a dossier on the closet | Go to storage area | Chair | Position in x, y, z; rotation axis; mass in kg; resistance force in N |
| | | Storage area | Position in x, y, z |
| | Get paper | Stored paper | Position in x, y, z; mass in kg |
| | Feed paper in printing tray | Printing tray | Position in x, y, z; translation axis; mass in kg; resistance force in N |
| | | Printer area | Position in x, y, z |
| | | Paper | Position in x, y, z; mass in kg |
| | Get printed document from printer | Printer | Position in x, y, z |
| | | Printed document | Position in x, y, z; mass in kg |
| | Get a dossier from shelf | Dossier on shelf | Position in x, y, z; mass in kg |
| | | Shelf | Position in x, y, z |
| | Store document on shelf | Shelf | Position in x, y, z |
| | | Dossier | Position in x, y, z; mass in kg |

Table 8: Workspace interior context of use.

5.3.2 Collaborative tools applications interface environment

| Use Case | Task - Scenario | Related Object | Object parameters |
|---|---|---|---|
| File upload and online editing | Select File upload from menu | File Menu | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Locate file upload dialog box | Dialog box | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Select file to upload | File name | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Locate Upload button | Upload button | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Click Upload button | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Get aware of File uploaded message | Message | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Select an uploaded file | File name | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Select Edit File from menu | Menu item | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Locate new window with file open for editing | Window | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Start inserting text in the file | Text input area on screen | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Keyboard key | Position in x, y; key value |
| Create discussion and post item | Select Create new discussion from menu | Menu item | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Speech input | - |
| | Locate New discussion dialog box | Dialog box | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Insert title | Title text | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Keyboard | Position in x, y |
| | | Keyboard key | Position in x, y; key id |
| | | Speech input | Sound level in dB |
| | Locate Post button | Button | Position in x, y; button id |
| | Click Post button | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Speech input | Sound level in dB |
| | Locate Discussion created message | Message | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Select Post item from menu | Menu item | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Speech input | Sound level in dB |
| | Locate New item dialog box | Dialog box | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Insert text | Text | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Keyboard | Position in x, y |
| | | Keyboard key | Position in x, y; key id |
| | | Speech input | Sound level in dB |
| | Locate Post button | Button | Position in x, y; button id |
| | Click Post button | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Speech input | Sound level in dB |
| | Locate Item posted message | Message | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| Realise a teleconference | Locate Incoming call message box | Message box | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | Locate and click accept button | Button | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | Locate and adjust volume button | Volume button | Position in x, y; dimensions in w, h; contrast level (Weber contrast sensitivity formula) |
| | | Mouse | Position in x, y |
| | | Mouse left button | Position in x, y; button id |
| | | Volume test | Sound level in dB |
| | Hear other participants | Participants speaking | Sound level in dB |
| | Speak to other participants | Speech output | Sound level in dB |

Table 9: Collaborative tools context of use.


5.4 VERITAS context of Use - Infotainment domain


In this Chapter the environment of each application domain of VERITAS is presented, identifying the task-related objects that are included in each context and their parameters, adapted to the updated task analysis.

5.4.1 Metaverses applications interface environment

Use Case: Enter username
  Select username text field:
    - username text field: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Dimensions in w, h
  Type username:
    - keyboard key: Key value

Use Case: Enter password
  Select password text field:
    - password text field: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Dimensions in w, h
  Type password:
    - keyboard key: Key value

Use Case: Enter the metaverse
  Confirm credentials 1:
    - log in button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Confirm credentials 2:
    - keyboard key: Key value

Use Case: Access avatar appearance menu
  Select Appearance UI component:
    - Appearance Menu button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Shapes for Body and Head to customize the avatar
  Select outfit:
    - outfit button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Save the avatar
  Select wear button:
    - wear button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Access Inventory Screen
  Select Inventory UI component:
    - inventory button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Load new 3D object to the inventory
  Select 'Add new item' UI component:
    - add new item component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select 'Upload' UI component:
    - upload component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select 'Image' UI component:
    - image component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select file:
    - file component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select 'Open' button:
    - open button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select 'Upload' button:
    - upload button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select 'OK' button:
    - ok button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Close inventory screen
  Select Inventory UI component:
    - inventory button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Access in-world menu
  Access context menu:
    - 3D world area: Position in x, y of the 3D render area; Dimensions in w, h of the 3D render area
    - mouse: Position in x, y
    - right mouse button: Position in x, y; Button ID
  Select 'build' UI component:
    - build component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Create a new 3D box object
  Select cube icon:
    - cube icon component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Click inside 3D area:
    - build component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Scale box to 200% size
  Select box scaling component:
    - 3D box object: Position in x, y; Dimensions in w, h
    - ctrl key: Key value
    - shift key: Key value
    - scale 3D component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Scale box:
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
    - ctrl key: Key value
    - shift key: Key value

Use Case: Rotate box 90° on x, y, z axes
  Modify box x rotation:
    - 3D box: Position in x, y; Dimensions in w, h
    - ctrl key: Key value
    - rotate x rotation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Modify box y rotation:
    - rotate y rotation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Modify box z rotation:
    - rotate z rotation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
    - ctrl key: Key value

Use Case: Move box 1m right, up and forward
  Modify box x position:
    - 3D box: Position in x, y; Dimensions in w, h
    - rotate x translation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Modify box y position:
    - rotate y translation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Modify box z position:
    - rotate z translation component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Move the avatar to a specific visible location inside the 3D environment
  Face visible location:
    - visible 3D location: Position in x, y screen space; Transformation matrix of the viewing camera
    - key left: Key value
  Move avatar to location:
    - keyboard key: Key value

Use Case: Move the avatar to a specific location where a positional sound comes from
  Face audible location:
    - visible 3D location: Position in x, y screen space; Transformation matrix of the viewing camera
    - key left: Key value
  Move avatar to location:
    - keyboard key: Key value

Use Case: Interact with a dynamic 3D object
  Click on dynamic 3D object:
    - dynamic 3D object: Position in x, y in screen space; Dimensions in w, h in screen space
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Interact with a 3D object with embedded media
  Click on 3D object with embedded media:
    - 3D object with embedded media: Position in x, y in screen space; Dimensions in w, h in screen space
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Use Case: Initiate chat with another user inside the 3D environment
  Select user to chat with:
    - 3D avatar: Position in x, y in screen space; Dimensions in w, h in screen space
    - mouse: Position in x, y
    - 3D avatar info component: Position in x, y; Dimensions in w, h
    - left mouse button: Position in x, y; Button ID
  Initiate IM session:
    - options UI component: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
    - IM UI component: Position in x, y; Dimensions in w, h
  Enter text:
    - chat text entry: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
    - keyboard key: Key value

Use Case: Share content with another user
  Select Inventory UI component:
    - inventory button: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
  Select image:
    - photo album menu: Position in x, y; Dimensions in w, h
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID
    - image component: Position in x, y; Dimensions in w, h
  Share image with other user:
    - 3D avatar: Position in x, y in screen space; Dimensions in w, h in screen space
    - mouse: Position in x, y
    - left mouse button: Position in x, y; Button ID

Table 10: Metaverses context of use.


5.4.2 Collaborative games applications interface environment

Use Case: Collaborative games for elderly
  Reply Question:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
    - Touch Screen Button: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula
    - Keyboard keys: Key value
    - Text Output: Audio level in dB and frequency Hz
    - Say A, B, C or D: Sound level in dB
  Move peon:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
    - Touch Screen Button: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Keyboard keys: Key value
    - Text input: Audio level in dB and frequency
    - Say "Move peon": Sound level in dB
  Roll Dices:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
    - Touch Screen Button (both dices as one touch button): Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Text input: Key value
    - Say "Roll Dices": Sound level in dB
  Access switch-on button:
    - UI Elements: Position in x, y; Dimensions in w, h
    - UI Element on screen: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Touch Screen: Position in x, y
  Touch on 4 corners and center of screen area:
    - The corners and center: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
  Select game on the screen:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
  Choose number of players:
    - System's prompt: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Touch Screen: Position in x, y
  Choose levels:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
    - System's prompt: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Touch the start-the-game button:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
    - System's prompt: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Learn instructions:
    - UI Elements: Position in x, y; Dimensions in w, h
    - System's prompt: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Confirm learn instructions:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
  Recognize the player turn:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
  Choose the movement of the counter:
    - UI Elements: Position in x, y; Dimensions in w, h
    - Touch Screen: Position in x, y
Table 11: Collaborative games context of use.


5.5 VERITAS context of Use - Healthcare domain


In this Chapter the environment of each VERITAS application domain is presented, identifying the task-related objects included in each context and their parameters, adapted to the updated task analysis.

5.5.1 Remote control healthcare applications interface environment

Use Case: Remote patient accessibility
  Remind of events (drugs, appointments, ...):
    - Touch screen: Position in x, y
    - Loudspeaker output: Audio level in dB and frequency Hz
    - Button on the Touch screen: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula
    - Button OK on the Touch screen: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Text Window on the Touch screen: Position in x, y; Dimensions in w, h
  Answer to automatic questions:
    - Loudspeaker output
    - Answer Buttons on the Touch Screen (Well, Not Well): Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Button OK on the Touch screen: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
Table 12: Healthcare remote control application context of use.
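Several object parameters in these tables refer to the Weber contrast sensitivity formula C = [I(character) - I(background)] / I(background). A minimal sketch of the computation (the function name and the luminance values in the example are illustrative, not part of the VERITAS models):

```python
def weber_contrast(i_character: float, i_background: float) -> float:
    """Weber contrast C = (I_character - I_background) / I_background."""
    if i_background == 0:
        raise ValueError("background luminance must be non-zero")
    return (i_character - i_background) / i_background

# Bright text (200 cd/m^2) on a mid-grey background (100 cd/m^2):
print(weber_contrast(200.0, 100.0))  # → 1.0
```

Dark characters on a bright background yield a negative value, which is why the formula keeps the background luminance in the denominator rather than normalising by the brighter of the two.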


5.5.2 Mobile applications interface environment

Use Case: Select items from the main menu of the application
  Select day:
    - Day icon: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula
  Select meal:
    - Meal icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select service:
    - Service icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select recipe:
    - Recipe icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select ingredient:
    - Ingredient icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select procedure:
    - Procedure icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Use Case: Select recipes
  Select service:
    - Service icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select photo:
    - Photo icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Use Case: Select shopping list
  Select item:
    - Item icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Table 13: Mobile applications context of use.


5.5.3 Health coach applications interface environment


Use Case: Configure the application
  Activate/Deactivate sound:
    - Touch screen: Position in x, y
    - Activate/Deactivate sound button: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula
    - Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Activate/Deactivate vibration:
    - Touch screen: Position in x, y
    - Activate/Deactivate vibration button: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Select language:
    - Touch screen: Position in x, y
    - Language button: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Use Case: Questionnaires on health status (different types, e.g. "Have you taken your medication?", "Do you feel well today?")
  Question:
    - Question output: Audio level in dB and frequency Hz
  Answer question:
    - Button (answer): Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Take measurement:
    - Measurement icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Instructions: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Start/Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Get feedback:
    - Instructions: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - OK button: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Use Case: Automatic trigger to take measurements at period X of the day (the system automatically triggers the user to take a measurement at different periods of the day)
  Take measurement:
    - Instructions: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Start/Back icon: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  Get feedback:
    - Instructions: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - OK button: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)

Use Case: Check medication calendar
  Select calendar (day/week):
    - Day/week icon/option: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
    - Touch screen: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
Table 14: Health-coach context of use.


5.6 VERITAS Models in the context of use


The multidimensional table of the models in the context of use presented in this Chapter is the main outcome of this Deliverable and contains all the aforementioned components. Representing the interactional context through the task-oriented models of VERITAS makes it possible to identify the interaction of the user with the environment in which the specific task is executed.

The components and steps followed for the extraction of the user-models-in-context-of-use multidimensional table are illustrated in the figure that follows. The parameterisation of the context of use has been done by defining a specific success criterion and a success threshold for each primitive task, according to the object participating in that task.

[Flowchart: the preliminary task analysis, the Use Cases of the five application domains (automotive, smart living spaces, workplaces, infotainment, healthcare) and the disability groups (with a direct connection to the user models, WP1.3, WP1.4, WP1.5) feed the context of use, where the success criteria and thresholds are defined; combined with the SP1 virtual user models and the SP2 simulation models (WP2.1, WP3.1, WP4.1), these produce the task model tables.]

Figure 3: Methodology for the extraction of the multidimensional table of models in the context of use.

The multidimensional table is an aggregated body of information that includes the following:

- Task: The tasks identified here are matched to the Use Cases of D1.7.1.b.
- Subtask: The subtask is the first decomposition step of the tasks.
- Disability: Defines the type of user model that executes the specific subtask.
- Primitive tasks: The lowest level of decomposition of the task analysis. Tasks differ according to the disability and the related object.
- Body part: The exact body part that acts to realise the primitive task.
- Cognitive state affecting the task: A possible effect on a primitive task due to a specific cognitive state.
- P&B state affecting the task: A possible effect on a primitive task due to a specific P&B state.
- Primitive task object: The environmental object participating in the specific primitive task.
- Object parameters: The parameters that define the object participating in the specific primitive task.
- Success criterion: A measurable criterion that guarantees the success of the primitive task in relation to the specific object.
- Success threshold: A measurable threshold that must be achieved for the primitive task to succeed in relation to the specific object.
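As an illustration only, one row of the multidimensional table could be represented as a record whose fields paraphrase the columns above (the field names and sample values are hypothetical, not an interface defined by VERITAS):

```python
from dataclasses import dataclass

@dataclass
class ModelInContextRow:
    # One row of the multidimensional table; all cells are free text.
    task: str
    subtask: str
    disability: str
    primitive_task: str
    body_part: str
    cognitive_state: str   # cognitive state affecting the task
    pb_state: str          # P&B state affecting the task
    task_object: str       # primitive task object
    object_parameters: str
    success_criterion: str
    success_threshold: str

row = ModelInContextRow(
    task="Getting in a car", subtask="Open left front door",
    disability="None", primitive_task="Locate", body_part="Eyes",
    cognitive_state="Cognitive aging", pb_state="Acute stress",
    task_object="Door handle",
    object_parameters="Position in x, y, z",
    success_criterion="Locate the handle",
    success_threshold="Locate the door handle",
)
print(row.body_part)  # → Eyes
```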

The figure below illustrates the multidimensional table columns and describes which
activities/WPs have participated in its development.

[Diagram: the columns of the multidimensional table (Task, Subtask, Disability, Primitive tasks, Body part, Cognitive state affecting the task, P&B state affecting the task, Primitive task Object, Object parameters, Success criterion, Success threshold), annotated with the activities that contributed to each: the task analysis and Use Cases (WP1.7), the simulation scenarios (SP2), the user models (WP1.3, WP1.4, WP1.5) and the parameterisation of the context of use (WP1.7).]

Figure 4: Multidimensional table of models in the context of use.

The sections below provide a small extract of the multidimensional table of models in the context of use for each application domain. The full multidimensional table is given in Annex 2, which accompanies the current document.


Automotive

Domain: Automotive. Section: Car interior. Task: Getting in a car. Subtask: Open left front door. Disability: None.

Primitive task: Locate (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: Locate the handle
  - Success threshold: Locate the door handle

Primitive task: Reach (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: Position in x, y, z
  - Success threshold: The hand is stretched until it reaches the position in x, y, z of the door handle

Primitive task: Grasp (Body part: Fingers)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: Position in x, y, z
  - Success threshold: At least 2 of the five fingers get inside and touch the door handle

Primitive task: Pull (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: Angle of the door handle
  - Success threshold: x angle of the door handle

Primitive task: Pull (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door
  - Object parameters: Movement range; opening forces; opening angle
  - Success criterion: Angle of the door
  - Success threshold: x angle of the door
Table 15: Models in the context of use for automotive domain.


Smart living spaces

Domain: Smart living spaces. Section: Home interior. Task: Moving around outside house. Subtask: Opening door. Disability: None.

Primitive task: Locate (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door
  - Object parameters: Movement range; opening forces; opening angle
  - Success criterion: Locate door
  - Success threshold: Locate the handle

Primitive task: Grasp (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; handle dimensions; up-down and left-right forces and handle movement range
  - Success criterion: Reach door handle
  - Success threshold: Hand inside geometry, apparently correct physiological access

Primitive task: Pull down (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; handle dimensions; up-down and left-right forces and handle movement range
  - Success criterion: Reach door handle in operated position
  - Success threshold: Hand inside geometry, apparently correct physiological access

Primitive task: Push (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door
  - Object parameters: Movement range; opening forces; opening angle
  - Success criterion: Door can be moved 90°
  - Success threshold: No collision
Table 16: Models in the context of use for smart living spaces domain.


Workplaces

Domain: Workplaces. Section: Workplace interior design. Task: Entering the workspace and sitting on desk. Subtask: Locate and enter main entrance. Disability: Reduced vision.

Primitive task: Locate (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Main entrance
  - Object parameters: Position in x, y
  - Success criterion: The entrance door is found
  - Success threshold: Distance to position

Primitive task: Walk (Body part: Feet)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Mental fatigue, emotions
  - Primitive task object: Main entrance
  - Object parameters: Position in x, y
  - Success criterion: The user is in front of the door
  - Success threshold: Distance to position

Primitive task: Locate (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: The door handle is found
  - Success threshold: Distance to position

Primitive task: Grasp (Body part: Hands)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: The user's hand grasps the handle
  - Success threshold: Distance to position

Primitive task: Push (Body part: Hands)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: The handle is pushed
  - Success threshold: Rotation angle

Primitive task: Push (Body part: Hands)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door
  - Object parameters: Movement range; opening forces; opening angle
  - Success criterion: The door is open
  - Success threshold: Rotation angle

Primitive task: Release (Body part: Hands)
  - Cognitive state affecting the task: Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Door handle
  - Object parameters: Position in x, y, z; movement range; opening forces; opening angle; handle dimensions; up-down and left-right forces and knob movement range
  - Success criterion: The user's hand is free from the handle
  - Success threshold: Distance to position
Table 17: Models in the context of use for workplaces domain.


Infotainment

Domain: Infotainment. Section: Metaverses. Task: Enter username. Subtask: Select username text field. Disability: None / Visually Impaired / Elderly.

Primitive task: Locate UI component (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: username text field
  - Object parameters: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula C = [I(character) - I(background)] / I(background)
  - Success criterion: The username text field is located
  - Success threshold: The user can locate the component's x, y position within an r radius threshold

Primitive task: Mouse move (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: mouse
  - Object parameters: Position in x, y
  - Success criterion: Position in x, y of the mouse pointer
  - Success threshold: The mouse pointer is moved to the x, y position within an r radius threshold

Primitive task: Mouse button press (Body part: Hand)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: left mouse button
  - Object parameters: Position in x, y; Button ID
  - Success criterion: User able to reach the mouse and operate the correct button
  - Success threshold: Correct button pressed

Primitive task: Mouse button release (Body part: Finger)
  - Cognitive state affecting the task: Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: left mouse button
  - Object parameters: Position in x, y; Button ID
  - Success criterion: User able to reach the mouse and operate the correct button
  - Success threshold: Correct button pressed
Table 18: Models in the context of use for infotainment domain.
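Several success thresholds in Table 18 are expressed as reaching a position "within an r radius threshold". A minimal sketch of such a check (the coordinates and radius are illustrative, not values from the VERITAS models):

```python
import math

def within_radius(pointer_xy, target_xy, r):
    """True when the pointer lands within radius r of the target position."""
    return math.dist(pointer_xy, target_xy) <= r

print(within_radius((105, 98), (100, 100), r=10))   # → True
print(within_radius((130, 100), (100, 100), r=10))  # → False
```

The same predicate covers both the "locate the component" and the "move the mouse pointer" thresholds, since each compares a measured x, y position against a target with tolerance r.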


Healthcare

Domain: Healthcare. Section: Health coach application. Task: Configure the application. Subtask: Activate/Deactivate vibration. Disability: None.

Primitive task: Locate (Body part: Eyes)
  - Cognitive state affecting the task: Blind and low vision, Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Touch screen
  - Object parameters: Position in x, y, z of the Touch Screen
  - Success criterion: Position in x, y, z
  - Success threshold: Locate the screen on position x, y, z

Primitive task: Press (Body part: Finger)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Activate/Deactivate vibration button
  - Object parameters: Position in x, y; Dimensions in w, h; Contrast level, as specified by the Weber contrast sensitivity formula C = [I(character) - I(background)] / I(background)
  - Success criterion: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  - Success threshold: Locate the icon from the characteristics

Primitive task: Select (Body part: Cognition)
  - Cognitive state affecting the task: Cognitive aging, Alzheimer's
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Activate/Deactivate vibration button
  - Object parameters: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  - Success criterion: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  - Success threshold: Locate the icon from the characteristics

Primitive task: Press (Body part: Finger)
  - Cognitive state affecting the task: Cognitive aging
  - P&B state affecting the task: Acute stress, very high acute stress, chronic stress, mental fatigue, emotions
  - Primitive task object: Back icon
  - Object parameters: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  - Success criterion: Position in x, y; Dimensions in w, h; Contrast level (Weber contrast sensitivity formula)
  - Success threshold: Locate the icon from the characteristics
Table 19: Models in the context of use for healthcare domain.
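The Weber contrast level that appears among the object parameters above can be computed directly from the character and background luminances. A minimal sketch (the function name and the example luminance values are illustrative, not taken from the deliverable):

```python
def weber_contrast(i_character: float, i_background: float) -> float:
    """Weber contrast: C = [I(character) - I(background)] / I(background).

    Both luminances must be expressed in the same linear units, and the
    background luminance must be positive.
    """
    if i_background <= 0:
        raise ValueError("background luminance must be positive")
    return (i_character - i_background) / i_background

# Dark character (luminance 20) on a light background (luminance 80):
print(weber_contrast(20.0, 80.0))  # -0.75
```

Negative values indicate a character darker than its background; a character twice as bright as the background yields C = 1.0.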

February 2012 64 CERTH/HIT


VERITAS_D1.7.3 PU Grant Agreement # 247765

6 Conclusions and future steps

Studying and designing interactive applications that are to be used and adapted to different contexts of use requires considering the characteristics of the user, the interactive platform, and the constraints and capabilities of each environment of use (Tarpin-Bernard & Samaan, 2008). In VERITAS, the development of the various models is characterised by their use in multiple environments, namely the automotive, smart living spaces, workplace, infotainment and healthcare domains. What connects the various models with their environments is the interaction with the context of use. In the current Deliverable, our scope was therefore to merge the outcomes of the majority of the previous SP1 activities and to connect the users and their tasks with the context of use in all the different domains. Our belief is that what we refer to as an interaction model is the right place to glue together all the task models with usability attributes.

The scope of the current Deliverable was to capture the models in the context of use within a multidimensional table that merges the Use Cases (A1.7.3), the task analysis (A1.7.1), the disabilities (WP1.3), the cognitive attributes (WP1.4) and the psychological and behavioural states (WP1.5) with the primitive tasks and the objects related to each task, specifying the object parameters as well as the success criteria and thresholds that define each task in relation to its object.
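As a minimal illustration of how one row of this multidimensional table ties these elements together, the row could be represented as a simple record. The field names below are hypothetical and only mirror the table columns; the authoritative column definitions are those in Annex 2:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ContextOfUseRow:
    # Hypothetical field names mirroring the multidimensional table columns.
    domain: str                   # Use Case domain, e.g. "Healthcare"
    disabilities: List[str]       # disability models (WP1.3)
    pb_states: List[str]          # psychological/behavioural states (WP1.5)
    cognitive_states: List[str]   # cognitive attributes (WP1.4)
    primitive_task: str           # e.g. "Press"
    body_part: str                # e.g. "Finger"
    task_object: str              # e.g. "Activate/Deactivate button"
    object_parameters: List[str]  # position, dimensions, contrast, ...
    success_criterion: str
    success_threshold: str

# One example row based on the healthcare table of this Deliverable:
row = ContextOfUseRow(
    domain="Healthcare",
    disabilities=["Cognitive aging"],
    pb_states=["acute stress", "chronic stress", "mental fatigue", "emotions"],
    cognitive_states=[],
    primitive_task="Press",
    body_part="Finger",
    task_object="Activate/Deactivate button",
    object_parameters=["position (x, y)", "dimensions (w, h)", "Weber contrast"],
    success_criterion="object parameters within specified ranges",
    success_threshold="locate the icon from its characteristics",
)
print(row.primitive_task, row.task_object)
```

Such a record makes explicit which SP1 outcome each column originates from, which is exactly the mapping the multidimensional table performs.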

The structure of this multidimensional table was defined at the very end of the first year of the project, while during the second year both its structure and its content were updated. Identifying the context of use has also been a non-trivial issue for the simulation models, so some of the columns of this table can also be found in the respective Deliverables of SP2, mapped however to the initial tasks of the 1st year.

After iterative and interactive updates of the multidimensional table of models in the context of use, carried out during every plenary meeting of the second year as well as during the project's workshops and events with external experts, the table has been finalised and is presented in Annex 2 of the current Deliverable.

From now on, this table will be used in the project to identify all the models related to the different Use Cases, scenarios and tasks in the context of use. It will also be used to define parameters for the usability of the final VERITAS tools, and it will support the development of the pilot application scenarios with the developers and the beneficiaries.


References
[1] Johanson, B., Fox, A. and Winograd, T. (2002). The Interactive Workspaces Project: Experiences with Ubiquitous Computing Rooms. IEEE Pervasive Computing, 1(2), April-June 2002, p. 118.
[2] Bradley, N.A. and Dunlop, M.D. (2005). Towards a Multidisciplinary Model of Context to Support Context-Aware Computing. Human-Computer Interaction, 20(4), pp. 403-446. Lawrence Erlbaum Associates.
[3] Carroll, J.M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge/London: MIT Press.
[4] Cheverst, K., Davies, N., Mitchell, K., Friday, A. and Efstratiou, C. (2000). Developing a Context-aware Electronic Tourist Guide: Some Issues and Experiences. Proceedings of ACM Conf. Human Factors in Computing Systems CHI 2000, pp. 17-24. New York, NY: ACM Press.
[5] Cheverst, K., Davies, N., Mitchell, K., Friday, A. and Efstratiou, C. (2000). Using Context as a Crystal Ball: Rewards and Pitfalls.
[6] Dey, A.K., Salber, D. and Abowd, G.D. (2001). A Conceptual Framework and a Toolkit for Supporting the Rapid Prototyping of Context-Aware Applications. Human-Computer Interaction, 16(2-4), pp. 97-166.
[7] Dey, A.K. and Abowd, G.D. (1999). Towards a Better Understanding of Context and Context-Awareness. GVU Technical Report GIT-GVU-99-22.
[8] Dey, A.K. (2001). Understanding and Using Context. Personal and Ubiquitous Computing, 5(1), pp. 4-7.
[9] Dittmar, A. and Forbrig, P. (2005). Models and Patterns for the Specification of Context of Use. Proceedings of HCI International 2005, Las Vegas.
[10] Dourish, P. (2001). Where the Action Is: The Foundations of Embodied Interaction. Cambridge: MIT Press.
[11] Dourish, P. (2004). What We Talk About When We Talk About Context. Personal and Ubiquitous Computing, 8(1), pp. 19-30.
[12] Fauser, M., Henry, K. and Norman, D.K. (2006). Comparison of Alternative Instructional Design Models. University of Central Florida.
[13] Go, K. and Carroll, J.M. (2004). The Blind Men and the Elephant: Views of Scenario-Based System Design. Interactions, 11(6), pp. 44-53.
[14] Hassenzahl, M. and Tractinsky, N. (2006). User Experience: A Research Agenda. Behaviour and Information Technology, 25(2), March-April 2006, pp. 91-97.
[15] ISO 13407:1999. Human-centred Design Processes for Interactive Systems.
[16] ISO 9241-11:1998. Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs) - Part 11: Guidance on Usability.
[17] Leont'ev, A.N. (1978). Activity, Consciousness, and Personality. Englewood Cliffs: Prentice-Hall.
[18] Liu, S., Grinter, R. and Dourish, P. (2003). Informality and IM-formality: Contexts of Computer-Mediated Communication.
[19] Norman, D. (1988). The Design of Everyday Things. New York: Doubleday.
[20] O'Hara, K., Black, A. and Lipson, M. Everyday Practices with Mobile Video Telephony.
[21] Pribeanu, C. (2007). Tool Support for Handling Mapping Rules from Domain to Task Models. In: Task Models and Diagrams for User Interface Design, 4385/2007, pp. 16-23. Springer.
[22] Rodden, T., Cheverst, K., Davies, N. and Dix, A. (1998). Exploiting Context in HCI Design for Mobile Systems. Workshop on Human Computer Interaction with Mobile Devices.
[23] Rosson, M.B. and Carroll, J.M. (2002). Scenario-Based Design. Chapter 53 in: Jacko, J. and Sears, A. (Eds.), The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, pp. 1032-1050. Lawrence Erlbaum Associates.
[24] Roto, V. (2006). Web Browsing on Mobile Phones: Characteristics of User Experience. Dissertation for the degree of Doctor of Philosophy, Helsinki University of Technology. Otamedia Oy.
[25] Ryder, M. (2006). Instructional Design Models.
[26] Seffah, A. and Forbrig, P. (2002). Multiple User Interfaces: Towards a Task-Driven and Patterns-Oriented Design Model. In: Forbrig et al. (Eds.), Proceedings of DSV-IS 2002, pp. 118-132. Springer.
[27] Schilit, B. and Theimer, M. (1994). Disseminating Active Map Information to Mobile Hosts. IEEE Network.
[28] Souchon, N., Limbourg, Q. and Vanderdonckt, J. (2002). Task Modelling in Multiple Contexts of Use. In: Forbrig et al. (Eds.), Proceedings of DSV-IS 2002. Springer.
[29] Tarpin-Bernard, F. and Samaan, K. (2008). Achieving Usability of Adaptable Software: The AMF-based Approach. In: Seffah, A., Vanderdonckt, J. and Desmarais, M. (Eds.), Human-Centered Software Engineering: Software Engineering Models, Patterns and Architectures for HCI, pp. 237-254. Springer HCI Series.
[30] Väänänen-Vainio-Mattila, K. and Ruuska, S. (2000). Designing Mobile Phones and Communicators for Consumers' Needs at Nokia. In: Bergman, E. (Ed.), Information Appliances and Beyond: Interaction Design for Consumer Products, pp. 169-204. Morgan Kaufmann.
[31] VERITAS project (IST-247765). Annex I - Description of Work, 2009.
[32] Winograd, K. (2002). ABCs of the Virtual High School.


ANNEX 1: VERITAS Final task analysis for the Use Cases
ANNEX 2: VERITAS Multidimensional table of Models in the context of Use
