VERITAS
Virtual and Augmented Environments and Realistic User Interactions To
achieve Embedded Accessibility DesignS
247765
Status: F (Final)
Abbreviation List
3D THREE DIMENSIONAL
IT INFORMATION TECHNOLOGIES
IT INTERACTION TOOL
IND INDUSTRY
RES RESEARCH
Table of contents
ABBREVIATION LIST ........................................................................................................................ 2
LIST OF FIGURES ............................................................................................................................... 6
LIST OF TABLES ................................................................................................................................. 9
1. FINAL PUBLISHABLE SUMMARY REPORT .................................................................... 10
1.1. EXECUTIVE SUMMARY ..................................................................................................... 10
1.2. PROJECT SUMMARY .......................................................................................................... 11
1.2.1. Project context and objectives ....................................................................................... 12
1.3. SCIENTIFIC AND TECHNOLOGICAL OBJECTIVES .................................................... 15
1.3.1. Main Scientific and Technological results/foregrounds ................................................ 17
1.3.1.1. The VERITAS User Model Methodology ......................................................................... 22
1.3.1.2. The Abstract User Models ................................................................................................. 22
1.3.1.3. The Multisensorial Platform............................................................................................... 26
1.3.1.4. The Task Models ................................................................................................................ 31
1.3.1.5. The Use Cases .................................................................................................................... 33
1.3.1.6. The Generic Virtual User Model ........................................................................................ 38
1.3.1.7. The VERITAS Model Platform ......................................................................................... 38
1.3.1.8. The VERITAS Core Simulation Platform .......................................................................... 41
1.3.1.9. The VERITAS Exportable Toolbox ................................................................................... 51
1.3.1.10. Simulation Models development and Integration of VERITAS Platform into Application Domains ............ 71
1.3.1.11. Pilot planning, Pilot development and testing .................................................................... 85
1.3.1.12. User Evaluation & Feedback - Framework and Models Validation ................................... 94
1.4. POTENTIAL IMPACT ........................................................................................................ 101
1.4.1. Socio-economic impact ................................................................................................ 101
1.4.2. Ethical Implications ..................................................................................................... 102
1.4.3. Market impact .............................................................................................................. 104
Open Source Solution (OSS) ...................................................................................................... 105
Proprietary Solutions ................................................................................................................. 105
Services to third parties ............................................................................................................. 105
1.5. CONTACT DETAILS .......................................................................................................... 107
1.5.1. Project Website ............................................................................................................ 107
1.5.2. List of Beneficiaries / Consortium Composition .......................................................... 107
2. USE AND DISSEMINATION OF FOREGROUND ............................................................ 111
SECTION A. DISSEMINATION MEASURES ......................................................................... 111
A.01 DISSEMINATION STRATEGY AND CONSORTIUM ACTIVITIES .............................................. 112
A.01.1 Raising awareness to a wider public ...................................................................... 113
A.01.2 Direct Communication and knowledge management .............................................. 117
A.01.3 Standardization and International Impact .............................................................. 120
A.02 COMMUNITY ADDED VALUE AND CONTRIBUTION TO EU POLICIES OBJECTIVE .................. 121
(i) Main legal instruments ............................................................................................................... 121
(ii) European policy developments ................................................................................................... 121
(iii) Upcoming policy developments ............................................................................................ 123
A.03 SOCIETAL IMPLICATIONS OF THE PROJECT......................................................................... 124
SECTION B. EXPLOITATION OF RESULTS ........................................................................ 148
B.01 EXPLOITATION OF RESULTS ............................................................................................... 148
B.01.1 EXPLOITATION FOREGROUND ....................................................................................... 148
B.01.2 BUSINESS AND EXPLOITATION PLANS ........................................................................... 149
B.01.3 EXPLOITABLE PRODUCTS DESCRIPTION ........................................................................ 156
B.01.4 FURTHER RESEARCH BEYOND VERITAS ..................................................................... 161
List of Figures
FIGURE 1: EXTRAPOLATION OF TECHNOLOGY TRENDS ........................................................................... 14
FIGURE 2: TYPICAL DEVELOPMENT CHAIN BEFORE VERITAS ............................................................... 16
FIGURE 3: TYPICAL DEVELOPMENT CHAIN AFTER VERITAS ................................................................. 16
FIGURE 4: THE USER MODELLING METHODOLOGY OF VERITAS .......................................................... 22
FIGURE 5: AUM TABLE INFORMATION ................................................................................................... 26
FIGURE 6: KNEE FLEXION/EXTENSION MEASUREMENTS WITH THE ELECTROGONIOMETER AND MOCAP 29
FIGURE 7: ELBOW FLEXION/EXTENSION AND INERTIAL PLATFORM .................................................. 29
FIGURE 8: WRIST ULNAR-RADIAL BEND AND HUMAN GLOVE .......................................................... 29
FIGURE 9: PULL/PUSH FORCE AND MULTI AXES LOAD CELL.............................................................. 29
FIGURE 10: FORCE PANEL WITH THE VERTICAL LINE SHOWN ON THE LCD FOR TRANSFER-
FUNCTION ESTIMATION PROCEDURE .......................................................................................... 30
FIGURE 11: DATA PROCESSING PROCEDURE ....................................................................................... 30
FIGURE 12: TASK ANALYSIS TABLE INFORMATION ............................................................................. 32
FIGURE 13: MERGING INFORMATION INTO THE GENERIC VIRTUAL USER MODEL (IN THIS CASE
PHYSICAL) .................................................................................................................................... 38
FIGURE 14: THE VERITAS USER MODEL GENERATOR TOOL ................................................................... 39
FIGURE 15: THE VERITAS MODEL PLATFORM TOOL............................................................................... 39
FIGURE 16: WORKFLOW AND COMPONENTS OF THE VERITAS MODEL PLATFORM............................... 40
FIGURE 17: CUSTOMIZATION OF THE INTELLIGENT AVATAR THROUGH THE VERITAS AVATAR EDITOR
..................................................................................................................................................... 40
FIGURE 18: VERITAS CORE SIMULATION PLATFORM ARCHITECTURE ................................................. 42
FIGURE 19: VERITAS CORE SIMULATION MOTOR MODULE ARCHITECTURE ........................................ 43
FIGURE 20: VERITAS CORE SIMULATION VISION SUB-MODULES ARCHITECTURE ............................... 43
FIGURE 21: VERITAS CORE SIMULATION HEARING MODULE ARCHITECTURE ..................................... 43
FIGURE 22: VERITAS CORE SIMULATION COGNITIVE MODULE ARCHITECTURE .................................. 43
FIGURE 23: GRAPHICAL AND PHYSICAL REPRESENTATIONS OF THE INTELLIGENT AVATAR IN THE
VERITAS CORE SIMULATION PLATFORM .................................................................................... 45
FIGURE 24: AVATAR JOINTS AND THEIR DEGREES OF FREEDOM ............................................................. 45
FIGURE 25: AVATAR POINTS OF INTEREST ............................................................................................. 46
FIGURE 26: MOTION PLANNING AND GAIT CYCLE SIMULATION EXAMPLES ............................................ 47
FIGURE 27: GRASP MODULE EXAMPLE ................................................................................................... 47
FIGURE 28: LOOKAT MODULE EXAMPLE ................................................................................................ 48
FIGURE 29: FROM LEFT: NORMAL VISION, PROTANOPIA, DEUTERANOPIA, TRITANOPIA, GLAUCOMA,
MACULAR DEGENERATION ........................................................................................................... 48
FIGURE 30: FROM LEFT: OTITIS, OTOSCLEROSIS, PRESBYCUSIS MILD, PRESBYCUSIS SEVERE ............... 48
FIGURE 31: THE THREE STATES SHOWING THE CAR'S STORAGE COMPARTMENT FUNCTIONALITY. THE
ARROWS REPRESENT THE ROTATIONAL DEGREES OF FREEDOM. THE RED BOX SHOWS THE POI,
USED FOR INTERACTION WITH THE OBJECT. THREE OBJECTS ARE PRESENTED IN THE SCREENSHOTS:
THE HANDLE (MOVEABLE), THE STORAGE COMPARTMENT'S DOOR (MOVEABLE) AND THE CAR'S
DASHBOARD (STATIC). .................................................................................................................. 50
FIGURE 32: BLOCK DIAGRAM OF A MOTOR SIMULATION CYCLE. THE SIMULATION CYCLE IS REPEATED
UNTIL EVERY PRIMITIVE TASK IS COMPLETED. AT EACH STEP THE TASK MANAGER MODULE TESTS
AND REPORTS TO THE USER IF A CONSTRAINT (I.E. ANGLE/TORQUE LIMIT, COLLISION, ETC.) WAS
VIOLATED. .................................................................................................................................... 50
FIGURE 33: THE VERITAS HOLISTIC ARCHITECTURE AND THE EXPORTABLE TOOLBOX .......................... 52
FIGURE 34: IMMERSIVE SIMULATION PLATFORM DATA FLOW ................................................................ 54
FIGURE 35: THE VERSED-3D GRAPHICAL USER INTERFACE; A SCENE WITH AN OFFICE DESK HAS BEEN
LOADED. THE OBJECTS OF THE SCENE ARE ASSIGNED IN THE LEFT PANEL AND THE PROPERTIES,
SUCH AS MASS, POSITION, ORIENTATION, ETC., OF EACH OBJECT MAY BE DEFINED IN THE RIGHT
PANEL. .......................................................................................................................................... 55
FIGURE 36: THE GRAPHICAL USER INTERFACE OF THE VERSIM-3D TOOL .............................................. 56
FIGURE 37: VERSIM-3D'S SIMULATION CASCADE; THE USER LOADS A SCENE FILE, A SCENARIO FILE AND
ADDS THE VIRTUAL USER MODELS THAT WILL BE TESTED SEQUENTIALLY. ................................. 56
FIGURE 38: THE SIMULATION REPORT WINDOW, PART OF VERSIM-3D TOOL, PROVIDING INFORMATION
REGARDING SIMULATED SESSIONS OF DIFFERENT DESIGNS. .......................................................... 57
FIGURE 39: THE IMMERSIVE 3D SIMULATION VIEWER IN ACTION.......................................................... 58
FIGURE 40: THE GRAPHICAL USER INTERFACE OF THE VERSED-GUI TOOL; IN THIS EXAMPLE, A CAPTURE
PROJECT HAS JUST BEEN COMPLETED. THE CAPTURE INFO WINDOW AND THE USER ACTIVITY
CHART ARE DEPICTED. .................................................................................................................. 58
FIGURE 41: THE VERSIM-GUI TOOL INTERFACE. A SCENARIO OF A HEALTHCARE APPLICATION HAS BEEN
LOADED. ....................................................................................................................................... 59
FIGURE 42: RUNNING SIMULATION WITH HEALTHCARE APPLICATION USING A VISUALLY IMPAIRED USER
MODEL .......................................................................................................................................... 60
FIGURE 43: SETTING UP AND FINAL RESULT OF A MULTI-VUM SIMULATION......................................... 60
FIGURE 44: REPORT GENERATION FOR THE FINAL OUTCOME OF A SIMULATION BY VERSIM-GUI ........... 61
FIGURE 45: INTEGRATED INTERACTION TOOL ARCHITECTURAL SCHEME .............................................. 62
FIGURE 46: VISUAL FUNCTIONAL LIMITATION EXAMPLE AND CORRESPONDING VERIM CONTROL PANEL
..................................................................................................................................................... 62
FIGURE 47: KINEMATIC LIMITATION INTERACTION TOOL I EXAMPLE AND CORRESPONDING VERIM
CONTROL PANEL .......................................................................................................................... 63
FIGURE 48: SCHEME OF THE VIBROTACTILE KFL IT II (LEFT); PROTOTYPE OF THE DEVICE INTEGRATED ON
THE ARM OF THE USER (RIGHT). ............................................................. 63
FIGURE 49: EXAMPLE OF THE KINEMATIC LIMITATION INTERACTION TOOL II AND CORRESPONDING
VERIM CONTROL PANEL .............................................................................................................. 64
FIGURE 50: PICTURE OF THE FUNCTIONAL TEST OF THE INTEGRATED DFL IT SYSTEM .......................... 64
FIGURE 51: GRAB HAPTIC USER INTERFACE IS ABLE TO DELIVER A FORCE ALONG ANY WANTED
ORIENTATION IN 3D SPACE ............................................................................................................ 65
FIGURE 52: CONTROL FUNCTIONAL LIMITATION IT APPLICATION EXAMPLE ......................................... 66
FIGURE 53: N-BACKER COGNITIVE INTERACTION TOOL OUTPUT ............................................................ 66
FIGURE 54: BEHAVIOURAL & PSYCHOLOGICAL INTERACTION TOOLS EXAMPLES ................................... 68
FIGURE 55: THE GRAPHICAL USER INTERFACE OF THE MODALITY COMPENSATION AND REPLACEMENT
MODULE, AS PART OF THE MULTIMODAL INTERFACES MANAGER TOOL. ..................................... 69
FIGURE 56: INTEGRATION OF VERITAS TOOLS IN THE AUTOMOTIVE SOLUTIONS DOMAIN ................... 72
FIGURE 57: GENERATION OF OVERALL SIMULATION TASK TABLE .......................................................... 74
FIGURE 58: SUCCESS CRITERIA IN TASK TABLE ...................................................................................... 74
FIGURE 59: GENERATION OF SIMULATION SCENARIO FILES ................................................................... 75
FIGURE 60: SMART LIVING SPACES SIMULATION SYSTEM INTEGRATION METHODOLOGY ..................... 78
FIGURE 61: WORKPLACE DESIGN INTEGRATION METHODOLOGY ........................................................... 80
FIGURE 62: VERITAS GUI-BASED SIMULATION INTEGRATION ................................................................ 82
FIGURE 63: METAVERSE SIMULATION MODEL TABLE EXAMPLE ............................................................ 84
FIGURE 64: HEALTHCARE SIMULATION MODEL TABLE EXAMPLE .......................................................... 85
FIGURE 65: VERITAS OVERALL PILOT PLAN METHODOLOGY ......................................................... 86
FIGURE 66: GENERAL PILOT PLAN WITH DESIGNERS/DEVELOPERS ................................................. 87
FIGURE 67: USE OF THE CAR INTERIOR STORAGE COMPARTMENT .................................................... 90
FIGURE 68: HANDLE A POWERED TWO WHEELER ............................................................................... 90
FIGURE 69: PROGRAM THE ON-BOARD NAVIGATION SYSTEM AND ACTIVATE THE FUNCTIONALITY 91
FIGURE 70: RECEIVE AUDIBLE ALERTS FROM THE DEVICE WHILE RIDING A PTW ........................... 91
FIGURE 71: GET IN AND MOVE AROUND INSIDE A HOUSE (IMMERSIVE SIMULATION)........................ 91
FIGURE 72: USE KITCHEN APPLIANCES (INTERACTIVE CONTROL LIMITATION APPLIED IN
SIMULATION) ............................................................................................................................... 91
FIGURE 73: NAVIGATE AND INTERACT WITH FURNITURE AND DEVICES IN A WORK ENVIRONMENT 92
FIGURE 74: USE ICT COLLABORATIVE TOOLS AT WORK ................................................................... 92
FIGURE 75: USE METAVERSES ............................................................................................................. 92
FIGURE 76: PLAY A COLLABORATIVE GAME ....................................................................................... 92
FIGURE 77: PLAY A MULTI-USER GAME AIMED AT THE ELDERLY ...................................................... 93
FIGURE 78: USE REMOTE PATIENT MONITORING SOLUTION .............................................................. 93
FIGURE 79: USE MOBILE NUTRITIONAL ADVICE APPLICATION.......................................................... 93
FIGURE 80: USE HEALTH COACH APPLICATION ................................................................................. 93
FIGURE 81: USE OF THE CAR INTERIOR STORAGE COMPARTMENT: WITH HANDLE (INITIAL) AND
PUSH SPRING MECHANISM (FINAL) ............................................................................................. 96
FIGURE 82: HANDLE A POWERED TWO WHEELER: NORMAL AND LOWERED LAYOUT OVERLAPPED;
SEAT LINES OF THE TWO LAYOUTS HIGHLIGHTED IN RED ......................................... 97
FIGURE 83: PROGRAM THE ON-BOARD NAVIGATION SYSTEM AND ACTIVATE THE FUNCTIONALITY:
INITIAL AND OPTIMAL INTERACTION AREAS AND FINAL IMPLEMENTED GESTURES ................ 97
FIGURE 84: RECEIVE AUDIBLE ALERTS FROM THE DEVICE WHILE RIDING A PTW (INITIAL AND
FINAL) .......................................................................................................................................... 97
FIGURE 85: GET IN AND MOVE AROUND INSIDE A HOUSE (INITIAL AND FINAL DESIGNS) ................... 97
FIGURE 86: USE KITCHEN APPLIANCES (INITIAL AND FINAL DESIGNS) .............................................. 97
FIGURE 87: NAVIGATE AND INTERACT WITH FURNITURE AND DEVICES IN A WORK ENVIRONMENT
(INITIAL AND FINAL DESIGNS) ..................................................................................................... 98
FIGURE 88: USE ICT COLLABORATIVE TOOLS AT WORK (INITIAL AND FINAL DESIGNS) ................... 98
FIGURE 89: USE METAVERSES (INITIAL AND FINAL DESIGNS) ............................................................ 98
FIGURE 90: PLAY A COLLABORATIVE GAME (INITIAL AND FINAL DESIGNS) ...................................... 98
FIGURE 91: PLAY A MULTI-USER GAME AIMED AT THE ELDERLY (INITIAL AND FINAL DESIGNS) ..... 99
FIGURE 92: USE REMOTE PATIENT MONITORING SOLUTION (INITIAL AND FINAL DESIGNS) ............. 99
FIGURE 93: USE MOBILE NUTRITIONAL ADVICE APPLICATION (INITIAL AND FINAL DESIGNS) ........ 99
FIGURE 94: USE HEALTH COACH APPLICATION (INITIAL AND FINAL DESIGNS) ................................ 99
FIGURE 95: VERITAS AS A SERVICE BUSINESS MODEL ...................................................................... 105
FIGURE 96: SCREENSHOTS OF VERITAS DISSEMINATION MATERIAL (NEWSLETTERS, POSTERS,
LEAFLETS, ETC.) ......................................................................... 113
FIGURE 97: SCREENSHOTS OF THE VERITAS WEBSITE ........................................................ 114
FIGURE 98: VERITAS PARTICIPATION AT MAJOR EXHIBITION EVENTS ................................ 116
FIGURE 99: PERCENTAGE OF PROJECT RESULTS DISSEMINATION VIA DIFFERENT COMMUNICATION
CHANNELS .................................................................. 117
FIGURE 100: STANDARDIZATION EFFORTS OF THE VUMS CLUSTER ...................................... 121
List of Tables
TABLE 1: LIST OF PUBLIC DELIVERABLES AVAILABLE THROUGH PROJECT WEB SITE .............................. 20
TABLE 2: ICD VISUAL DISABILITIES ....................................................................................................... 23
TABLE 3: ICD CLASSIFICATION FOR MOTOR ........................................................................................... 24
TABLE 4: ICD CLASSIFICATION FOR SPEECH .......................................................................................... 24
TABLE 5: HEARING DISABILITIES............................................................................................................ 24
TABLE 6: LIST OF SENSORS USED IN THE MULTISENSORIAL PLATFORM................................................. 27
TABLE 7: PARAMETERS AND INSTRUMENTS ........................................................................................... 28
TABLE 8: PRIMITIVE TASKS EXAMPLE................................................................................................... 32
TABLE 9: TASK MODEL EXAMPLE ......................................................................................................... 33
TABLE 10: USE CASES LIST .................................................................................................................... 34
TABLE 11: USE CASE EXAMPLE: UC 1.2: USER MODEL GENERATOR..................................... 35
TABLE 12: GENERIC VIRTUAL USER MODEL EXAMPLE .......................................................................... 38
TABLE 13: INTERFACE MODALITIES AND INPUT / OUTPUT DATA FORMAT. .............................................. 70
TABLE 14: USE CASE ANALYSIS RESULTS (DESKTOP APPLICATION) .................................................. 73
TABLE 15: USE CASE ANALYSIS RESULTS (IMMERSIVE APPLICATION) .............................................. 73
TABLE 16: WORKPLACE DESIGN SIMULATION TASK TABLE EXAMPLE ............................................. 80
TABLE 17: SMART GOAL SETTINGS FOR VERITAS ......................................................................... 87
TABLE 18: PILOT SITE PLAN PER APPLICATION AREA ....................................................................... 87
TABLE 19: PILOT SITE PLAN PER APPLICATION AREA ....................................................................... 88
TABLE 20: SUMMARY OF BENEFICIARY GROUPS IN EACH APPLICATION DOMAIN .................................... 89
TABLE 21: SUMMARY OF QUANTITATIVE RESULTS COMPARISON IN ALL DOMAINS ................................ 94
TABLE 22: SUMMARY OF CORRELATIONS BETWEEN RESULTS WITH VUMS AND ACTUAL USERS IN TASK
COMPLETION TIMES IN THE WORKPLACE DESIGN SCENARIOS ....................................... 95
TABLE 23: VERITAS PRODUCTS/SERVICES CUSTOMER SEGMENTS ...................................................... 105
TABLE 24: LIST OF BENEFICIARIES ....................................................................................................... 108
TABLE 25: OVERVIEW OF VERITAS DISSEMINATION ACTIVITIES ........................................................ 116
TABLE 26: LIST OF CONFERENCES/USER FORUMS/WORKSHOPS/EVENT PARTICIPATION ..................... 126
TABLE 27: LIST OF OTHER EVENT PARTICIPATION ............................................................................... 128
TABLE 28: LIST OF CONFERENCE PAPERS/POSTERS ............................................................................. 131
TABLE 29: LIST OF JOURNAL/BOOK PUBLICATIONS ............................................................................. 144
TABLE 30: VERITAS MAIN EXPLOITABLE SERVICES ............................................................ 148
TABLE 31: VERITAS EXPLOITABLE FOREGROUND ............................................................................. 151
TABLE 32: RESEARCH PRIORITIES ........................................................................................................ 161
systems should improve the level of independence, promote social relationships
and enhance the psychological and physical well-being of the person.
AmI and AAL spaces could potentially bridge the accessibility gap
Interest in Ambient Intelligence (AmI) and Ambient Assisted Living (AAL) has
increased exponentially with the widespread use of portable IT devices and
networked sensors, and AmI and AAL spaces could provide the necessary means to
achieve accessible products. Technological advances are making the AmI and AAL
visions a reality, but the question remains for designers and developers of how
to effectively develop and deploy smart applications, devices and services that
serve different categories of end users (beneficiaries) while demonstrating
flexibility, adaptability and context awareness. As a result, supporting
accessible solutions and services that can operate seamlessly in various and
changing settings (e.g. home, workplace, etc.) sets new challenges that must be
addressed by hardware manufacturers, software developers and designers alike.
ISTAG (the IST Advisory Group) has proposed Ambient Intelligence Space as a layer
connecting different AmI environments (e.g. home, car, public spaces) in a seamless
and unobtrusive way. Obviously, interoperability and standards are crucial in this
respect. Context awareness is another issue that needs to be looked into more closely,
especially when trying to integrate forms of context that are more sophisticated and
user-oriented than current definitions of context.
AmI and AAL tomorrow
The convergence of pervasive computing, ambient networks and intelligent user
interfaces has enabled the development of ambient intelligence and its
associated services. Human beings and machines will be surrounded by
intelligent interfaces embedded in everyday objects and supported by computing
and networking technology. This will lead to situations in which the
environment is aware of a human or agent presence, and in which agents and
devices are aware of their environment, their location, and the abilities and
disabilities of their human operator. Taking into account personal preferences,
the current activities of the user and the behaviour of machines, services will
be capable of tracking users and responding intelligently to all kinds of
requests. Intelligent user interaction with systems and services is therefore
an essential aspect of emerging applications and imposes specific requirements
to cope with people's abilities.
The technology trends foreseen for the next 20 years, as defined in the 2020
roadmap for the future¹, are outlined in the following Figure, which
anticipates an evolution of current technological advancements in Ambient
Intelligence and Smart Environments that will make key technologies available
for the adoption of accessible designs and services.
It is widely expected that increased interoperability and smart appliances will
become mainstream in the retail industry around 2015. As this scenario evolves,
a vast number of objects will become addressable and could be connected to
IP-based networks, constituting the very first wave of the Internet of Things.
Another very important aspect that needs to be addressed at this early stage
relates to interaction standards, accessibility and personalised objects.
Moreover, interface or design optimization for each individual is also
possible. The adoption of dynamic virtual user models of end users (people with
disabilities and the elderly) could be instrumental in deciding how to use
multimodality and interface adaptation for different users in different
contexts, and in setting the final design goals. Such models are also crucial
for physical interaction with non-ICT objects.
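As a purely illustrative sketch of how a dynamic virtual user model could drive such interface adaptation, the small function below picks output modalities from a capability profile. The profile fields, thresholds and function name are assumptions for illustration, not the VERITAS data model:

```python
# Illustrative sketch only: choosing output modalities from a virtual
# user model profile. Field names and thresholds are assumptions.

def select_modalities(profile):
    """Return output modalities suited to a profile of 0.0-1.0 ability scores."""
    modalities = []
    if profile.get("vision", 1.0) > 0.3:
        modalities.append("visual")
    if profile.get("hearing", 1.0) > 0.3:
        modalities.append("auditory")
    # Fall back to haptic feedback when residual touch ability allows it,
    # or when both primary channels are too limited.
    if not modalities or profile.get("touch", 1.0) > 0.5:
        modalities.append("haptic")
    return modalities

# A low-vision profile is steered toward auditory and haptic output.
print(select_modalities({"vision": 0.1, "hearing": 0.8, "touch": 0.9}))
# -> ['auditory', 'haptic']
```

The same profile can be re-evaluated whenever the context changes, which is what makes the model "dynamic" rather than a one-off design-time setting.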
experience developed during the project. The Interaction Adaptor has the dual
function of (1) adapting and encoding user model parameters into control
parameters used to drive the Interaction Tools for the designer experience, and
(2) serving as the interface layer between the core simulation platform and the
immersive interaction devices.
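A minimal sketch of the adaptor's first role (encoding user-model parameters into control parameters) might look as follows; the parameter names, defaults and scaling are illustrative assumptions, not the actual adaptor interface:

```python
import math

# Illustrative sketch only: encoding user-model parameters into control
# parameters for an interaction tool. Names and scaling are assumptions.

def encode_control_params(user_model):
    """Map measured user limits onto interaction-tool control settings."""
    max_elbow_deg = user_model.get("elbow_flexion_deg", 150.0)
    max_grip_n = user_model.get("grip_force_n", 300.0)
    return {
        # Clamp the simulated joint range to the user's measured limit.
        "elbow_limit_rad": round(math.radians(max_elbow_deg), 3),
        # Scale haptic resistance so the tool reproduces reduced strength.
        "haptic_gain": round(max_grip_n / 300.0, 2),
    }

print(encode_control_params({"elbow_flexion_deg": 90.0, "grip_force_n": 120.0}))
# -> {'elbow_limit_rad': 1.571, 'haptic_gain': 0.4}
```

The point of the mapping is that the interaction tools consume device-level control parameters, not user-model fields, so the adaptor isolates the tools from the model format.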
The Integrated Core Simulation Platform and the Exportable Toolbox constitute
the main technical achievement of the project. The Core Simulation Platform,
combined with the adaptor and interaction tools, can be used by a product
designer or software developer to simulate virtual user models with or without
disabilities, and provides the end user (designer or developer) with various
metrics for the accessibility assessment of a product. The Exportable Toolbox
adapts the information of the virtual user and the simulation models and
exports it to existing developer/design platforms already used for the design
and development of mainstream ICT and non-ICT products.
To evaluate the framework's performance, we applied the use cases of each
covered application domain (automotive, smart home design, workplace design,
infotainment and healthcare) by integrating them into the VERITAS Core
Simulation Platform. This entailed identifying the simulation parameters in
each case and adapting all the parameters affecting the simulation of each use
case scenario. The outcome was the formal definition of simulation models for
each application domain, proposed design and use case scenario, ready to be
tested for accessibility with the Exportable Toolbox against multiple Virtual
User Models covering several physical, cognitive and behavioural impairments,
through both automatic and immersive/interactive simulation of the tested
design.
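The idea of testing one simulation model against several Virtual User Models can be sketched as a toy check of task requirements against user capabilities. The scenario structure, field names and limits below are assumptions for illustration, not the actual VERITAS simulation-model format:

```python
# Illustrative sketch only: checking one task scenario against several
# virtual user models. Scenario fields and limits are assumptions.

scenario = {
    "task": "open car storage compartment",
    "required_elbow_flexion_deg": 110.0,
    "required_pull_force_n": 40.0,
}

virtual_users = [
    {"name": "baseline", "elbow_flexion_deg": 150.0, "pull_force_n": 200.0},
    {"name": "reduced mobility", "elbow_flexion_deg": 95.0, "pull_force_n": 60.0},
]

def assess(scenario, user):
    """Return the task requirements the virtual user cannot meet."""
    violations = []
    if user["elbow_flexion_deg"] < scenario["required_elbow_flexion_deg"]:
        violations.append("elbow flexion limit exceeded")
    if user["pull_force_n"] < scenario["required_pull_force_n"]:
        violations.append("pull force insufficient")
    return violations

# Each virtual user is run against the same design scenario in turn.
for user in virtual_users:
    print(user["name"], assess(scenario, user))
```

A design passes only when no virtual user in the tested population reports a violation, which is the intuition behind running multiple impairment profiles against a single proposed design.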
The evaluation of the framework consisted of an extensive pilot plan extending
over two pilot iterations with designers/developers and two pilot iterations
with beneficiaries for each tested design in each domain, as well as usability
and acceptance testing of the VERITAS Framework and tools themselves. We
researched several evaluation methodologies and ultimately implemented a pilot
methodology that provided both formative evaluation, i.e. evaluation carried
out during the design and implementation phase of VERITAS and aimed at
collecting information for improvement, and summative evaluation, i.e.
evaluation of VERITAS at the end of the development phase, aimed at collecting
information on the outcomes of the implementation.
Our evaluation strategy included both qualitative and quantitative metrics.
Quantitative evaluation focused on measurements providing tangible results and
statistics, using indicators such as the time to perform a task versus the
number of errors made when applying tests to the online interface of an
application; these indicators and measures were collected via log file
analysis. Qualitative evaluation, on the other hand, was directed at the user
and considered users' opinions and values; its indicators and measures were
questionnaires, user experience reports, interviews, etc. Generic
questionnaires, ethically approved by the project's Ethical Advisory Board, the
pilot ethical committees and the national data protection authorities, were
administered during the VERITAS pilots to assess the project's usability and
acceptance level. The questionnaires addressed all types of participating users
and provided useful information regarding the following parameters:
D1.7.1a Final version of VERITAS Use Cases and Application Scenarios CERTH/HIT
D1.7.1b Final version of VERITAS Use Cases and Application Scenarios CERTH/HIT
D1.7.2 Task analysis per application area MCA
D1.7.3 Parameterization of models to the context of use CERTH/HIT
D2.1.1 Core simulation platform CERTH/ITI
D2.2.1 Simulation models for the automotive scenario CRF
D2.3.1 Generation of simulation models Domologic
D2.4.1 Simulation models for the workplace design scenario Hypertech
D2.5.1 Simulation models for the game scenario VRMMP
D2.6.1 Simulation models for the healthcare scenario I+
D2.7.1 VERITAS interaction tools architecture definition PERCRO
D2.8.1 First prototypes of the multimodal interface tool set CERTH/ITI
D2.8.2 Integration of Multimodal Interfaces into the VERITAS Simulation and Testing Framework CERTH/ITI
D2.8.3 Testing and validation Refinement of the interface tool set CERTH/ITI
D3.4.1 Accessible metaverses and collaborative tools integrated in the VERITAS platform CERTH/ITI
D3.8.2 Pilot results consolidation UNEW
D4.1.3 Project Presentation and Project Description Leaflet FhG/IAO
D4.1.4 VERITAS Ethics Manual COAT
D4.3.1a Dissemination plans and materials (appendix of PR reports, Report on raising public participation and awareness) plus leaflets and posters MCA
D4.3.1b Dissemination plans and materials (appendix of PR reports, Report on raising public participation and awareness) plus leaflets and posters MCA
D4.3.1c Dissemination plans and materials (appendix of PR reports, Report on raising public participation and awareness) plus leaflets and posters MCA
D4.3.1d Dissemination plans and materials (appendix of PR reports, Report on raising public participation and awareness) plus leaflets and posters CERTH/ITI
D4.3.2 Proceedings of second VERITAS International Workshop CVUT
D4.3.3a Project video MCA
D4.3.3b Project video FhG/IAO
D4.3.4 Proceedings of the first VERITAS International Conference CERTH/HIT
D4.5.1 Application Guidelines, Research Roadmap, policy and standards recommendations FORTH
D4.5.2 VERITAS roadmap CERTH/HIT
D4.5.3 VERITAS White Paper VERITAS vision CERTH/ITI
Each of the main VERITAS scientific and technological achievements, as well as the
VERITAS system integration in each of the demonstrators is briefly presented in the
following sections.
1.3.1.1. The VERITAS User Model Methodology
The generation of VERITAS Virtual User Models followed these steps:
1. Creation of Abstract User Models (AUM), based on: analysis of existing
models (theoretical, computational, biomechanical) and medical literature,
existing practices and guidelines, user needs and international accessibility
standards.
2. Implementation of appropriate Task Models, common to the three views of the
model (physical, cognitive, and psychological & behavioural), based on the
UIML/USIXML language, representing users performing specific tasks and
interactions.
3. Iterative update and verification of the models with metrics collected with the
help of VERITAS Multisensorial platform.
4. Generation of Generic Virtual User Models (GVUM) per disability/affective
state, by linking the Abstract User Model parameters with the affected tasks.
5. Creation of the Virtual User Models by instantiating the GVUM, each
representing a specific user with one or more disabilities or a given psychological state.
Based on these major components a methodology has been synthesized in the first
months of the project that is described in the following sections.
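Steps 4 and 5 above can be sketched as follows. This is a hypothetical Python illustration only: the `AUM` and `TASK_METRICS` dictionaries and the severity scaling are invented stand-ins for the actual VERITAS ontology and UsiXML schema.

```python
# Sketch of steps 4-5: link Abstract User Model (AUM) parameters with the
# affected tasks to form a Generic Virtual User Model (GVUM), then
# instantiate a concrete Virtual User Model (VUM). All names and numbers
# are illustrative, not the VERITAS schema.

# AUM: disability -> affected parameters (metric name -> population value)
AUM = {
    "parkinsons": {"elbow_flexion_deg": 120.0, "grip_force_N": 180.0},
}

# Task model: primitive task -> metrics it depends on
TASK_METRICS = {
    "reach": ["elbow_flexion_deg"],
    "grasp": ["grip_force_N"],
}

def generate_gvum(disability):
    """Step 4: link AUM parameters with the tasks they affect."""
    params = AUM[disability]
    return {
        "disability": disability,
        "affected_tasks": {
            task: {m: params[m] for m in metrics if m in params}
            for task, metrics in TASK_METRICS.items()
        },
    }

def instantiate_vum(gvum, severity=1.0):
    """Step 5: instantiate a specific user by scaling the generic values."""
    vum = {"disability": gvum["disability"], "tasks": {}}
    for task, metrics in gvum["affected_tasks"].items():
        vum["tasks"][task] = {m: v * severity for m, v in metrics.items()}
    return vum

gvum = generate_gvum("parkinsons")
vum = instantiate_vum(gvum, severity=0.8)
```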
The aim for each disease and impairment was to derive quantitative, objective
and measurable indices. We also reviewed the medical scales found in the
literature; however, these scales were developed for rating the severity of a
disease and are of little use for developing virtual humans with modelled disabilities.
We can roughly divide abstract data and parameters into two groups: the first group
collects parameters that can be mapped on the models in a straightforward fashion,
e.g. the range of motion of joints. The second group collects parameters that need
identification methods to be transformed into models. For example, we devised the
spiral test, which measures the performance of people affected by Parkinson's
Disease (PD), to derive measurable parameters that indicate the severity of PD and
thus provide a scale on which to base a VUM for PD.
The Abstract User Models refer to a high level description of potential user models.
They are developed with respect to several specific disabilities and are broken down
according to the disability category, i.e. cognitive user models, physical user models,
behavioural & psychological user models. An abstract user model that is stored in
ontologies includes several disability related parameters like disability description,
disability metrics, ICF functional abilities, etc.
The creation of the Abstract User Models was initiated by performing literature
research on Physical, Cognitive, Psychological and Behavioural impairments to
derive the most important impairments and the relevant parameters that affect them.
Physical Impairments
There is a wide spectrum of diseases that can cause some kind of physical impairment
in people. These range from diseases of the nervous system to diseases of the
musculoskeletal system and connective tissues. We organized physical impairments
in four broad categories:
motor impairments
visual impairments
speech impairments
hearing impairments
The World Health Organization (WHO) [1] endorsed a classification of various
diseases and other health problems, called International Classification of Disease
(ICD). From an overview of this classification, it is clear that the number of disability
types is very large. In addition, the category of older people also needs to be
considered. This is a transversal class which, besides general aging effects, can
include one or more of the above diseases. We focused on a relatively small group of
important pathologies, which have to be considered with priority for the simulations
and hence for the literature review. The review was carried out to obtain the data
about physical and motor impairments (in this manner it was possible to characterize
the disabled users and the relative pathologies). Based on the review, the following
pathologies were prioritized:
Table 2: ICD visual disabilities
VISUAL DISABILITIES
Pathology                               ICD code
Diabetic retinopathy                    H36.0
Glaucoma                                H40
Senile or age-related degeneration      H35.3
Color vision deficiencies               H53.5
A set of metrics was identified for each of the physical impairments and used in
the Abstract User Model to describe functional limitations. For each type of
physical impairment (motor, visual, speech, hearing) the following metrics were
considered:
Motor Impairments:
1. Kinematics functional limitations. Reduction in mobility of joints, velocity of
joints, (possibly geometry of joints and bones) and ultimately reach and
dexterity abilities.
2. Dynamics functional limitations. Reduction in muscle strength and ultimately
the ability to produce useful forces.
3. Control functional limitations. Neuromuscular deficiencies, which ultimately
result in difficulty controlling movements.
Vision Impairments:
Visual skills assessment involved the assessment of:
Visual Acuity: the ability to perceive details presented with good contrast.
Visual Field: the ability to simultaneously perceive visual information from
various parts of the environment.
Contrast Sensitivity: the ability to perceive patterns of poor contrast. Loss of
this ability can interfere significantly with many daily activities.
Glare Sensitivity: including delayed glare recovery, photophobia and reduced
or delayed light and dark adaptation, other functions that may interfere
with proper contrast perception.
Color vision deficiencies.
Speech Impairments:
PCC (Percentage of Consonants Correct)
ACI (Articulation Competence Index)
Hearing Impairments:
For hearing impairments the audiogram is a good indicator: it reports the hearing
level in Decibels versus the frequency of the signal. The resulting metric is:
hearing level in decibels as a function of frequency.
The Psychological & Behavioural (P&B) models focused on the affective states that
influence elderly and disabled users; qualitative and quantitative metrics and
rules of behaviour were analysed and compiled to create the P&B Abstract User
Model tables.
A review was made of the abstract cognitive, psychological and behavioural user
models in order to create a new table structure relating the cognitive,
psychological and behavioural disabilities to the affected task models. Two
different approaches were followed to define the values of the (quantitative)
parameters not included in the definition of the Physical Abstract User Models.
The first was based on research into existing models of the Adaptive Control of
Thought-Rational (ACT-R) cognitive architecture, existing studies, and their
relation to the affected tasks. The second was based on a virtual simulator with
real users; the scenario selected for it was automotive, with elderly people as the
targeted simulated users.
The parameters described above were combined with anthropometric, age-related and
sociodemographic data gathered from the literature to derive minimum and maximum
ranges, as well as human transfer functions, for each parameter. This resulted in
the definition of the Extended Abstract User Model, which provides a concise
description of each physical disability.
The main information present in the Extended AUM table is:
ICD classification
Short pathology description
Parameters (metrics) based on literature survey for every disability
The figure below shows a resulting AUM table example:
Consequently, one of the main innovations in VERITAS with respect to virtual user
modeling is the introduction and use of a Multisensorial Platform for the training of
parameterized user models based on real user measurements in real testing conditions.
The multisensorial platform is a system of different sensors adapted to the VERITAS
application areas and is able to capture user feedback while executing a number of
tasks that will be mapped in the VERITAS virtual user models.
Special sensors, presented below, were used for data capturing: driver-monitoring
cameras, wearable sensors for body motion analysis, motion trackers and gait
analysis sensors for analysing user kinematic patterns while executing specific
activities and tasks, and environmental sensors for monitoring the interaction of
users with the real environment.
The final list of sensors used in the measurement campaign with the Multisensorial
Platform is given in the following table.
Table 6: List of sensors used in the Multisensorial Platform
Group                  System                      Instrument accuracy
Video sensing          Omni vision camera set-up   better than 5 cm
Wearable systems       Electrogoniometer           standard deviations less than 1.5 degrees
                       Human glove                 sensor accuracy: 0.1 V / 2.5 V; sensor non-linearity: < 2.0%; sensor range: > 110 degrees
                       Inertial Platform           standard deviation 3 degrees
Motion tracking        Kinect                      height ~ 3 cm
                       Bumblebee                   stride length ~ 5 cm; step length ~ 3 cm; step width ~ 3 cm; gait asymm. ~ 5 %; cadence ~ 2 %; double sup. ~ 4 %; body oscill. ~ 4 %; swing ~ 4 %; velocity ~ 5 %
Environmental sensors  Multi-axes load cell        x direction: 2.3 N (95% confidence level); y direction: 2.1 N (95% confidence level)
                       Force panel                 spatial resolution 0.4 mm, accuracy 1.8 mm; force resolution 0.05 N, accuracy 0.1 N
                       MOCAP                       position accuracy: 5 mm; angular accuracy: 1.5 degrees
                       Vicon                       ~ 1 mm
The following table correlates each of the measured parameters with the
corresponding sensor of the Multisensorial Platform.
Figure 10: Force panel with the vertical line shown on the LCD for transfer-function estimation
procedure
During the measurements, 209 persons were subjected to test campaigns in which the
relevant parameters for elderly people and people with disabilities were measured.
These campaigns took place at 5 test sites: Florence, Trento, Newcastle, Thessaloniki
and Plovdiv. The persons involved covered the majority of physical impairments
targeted by the project and the measured data was processed in order to calculate the
mean and standard deviation for each quantitative metric, and to correlate them with
specific disabilities.
The procedure of data processing was the following:
- The recorded data was introduced into the template tables at each test site.
- Each table with the recorded data was collected in a central sheet. This central
sheet contains all the measured parameters of all the subjects and all the
anthropometric data
- Next step was to sort the measured data by disabilities. Since a unique number
was associated to all the subjects correlated with their disabilities, the data from
the central sheet could be reorganized according to disabilities.
- Assuming a Gaussian distribution the collected data was statistically processed to
derive the mean and standard deviation of the specific parameters for all the
considered disabilities
- The mean and standard deviation values of the specific parameters correlated with
disabilities were imported into the physical VUM table in their correct location
for each disability and affected metric.
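The grouping-and-statistics steps above can be sketched in a few lines; the record fields and the figures below are invented examples, not measurement data from the campaign.

```python
import statistics

# Sketch of the processing steps: records from the central sheet are grouped
# by disability and reduced to (mean, standard deviation) per metric, under
# the stated Gaussian assumption. Field names are hypothetical.
records = [
    {"subject": 1, "disability": "stroke", "stride_length_cm": 48.0},
    {"subject": 2, "disability": "stroke", "stride_length_cm": 52.0},
    {"subject": 3, "disability": "stroke", "stride_length_cm": 50.0},
]

def summarize(records, metric):
    """Group a metric by disability and compute mean / sample std deviation."""
    by_disability = {}
    for r in records:
        by_disability.setdefault(r["disability"], []).append(r[metric])
    return {
        d: (statistics.mean(vals), statistics.stdev(vals))
        for d, vals in by_disability.items()
    }

summary = summarize(records, "stride_length_cm")
```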
An example of a task analysis table is presented in the following figure,
illustrating the information such tables contain.
As can be seen in the figure above, the task analysis table defines the link between
application sectors, tasks, subtasks and primitive tasks. A task can be defined as a
group of basic actions taken by the user; entering a car or changing gear are
examples of tasks for a specific sector (automotive in this case).
To define a closer link between tasks and metrics, every task is broken down
into subtasks: a subtask is a list of primary physical actions needed to perform
the task. For example, for the task of changing gear in a car the subtasks would
be: push (left foot), reach (left arm), position (left hand), grasp (right hand),
push (right hand) and lift (left foot). Following the red circles and lines in the
figure, one can see an example of how the task of entering the car is linked to
different subtasks, such as opening the left door, which in turn link to several
primitive tasks that may involve different impairments, for example upper limb
impairments.
The primitive tasks define the primitive human actions and are related to the
disability category (physical, cognitive, behavioural). The number of primitive
tasks should be limited, yet sufficient to efficiently model all systematically
performed actions in the five VERITAS application scenarios; the degree of
primitiveness adopted was therefore carefully chosen within VERITAS.
Concerning implementation, each primitive task contains a name as well as the
category to which it belongs. The list of primitive tasks defined within VERITAS
includes tasks of different categories such as motor, cognitive, perceptual,
visual, hearing and speech. The following table lists some indicative
primitive tasks.
In some cases, the metrics affected by a specific disability do not influence the
performance of a given task. For example, if the virtual dummy performs a gear
change, parameters related to human gait and the lower body do not affect the
result; the important parameters in this case are strength and upper-body metrics.
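This selective use of metrics can be sketched as follows; metric and task names are illustrative only, not the VERITAS vocabularies.

```python
# Sketch: only the subset of disability metrics that a primitive task actually
# depends on is taken into account when simulating it. All names/values are
# invented examples.
DISABILITY_METRICS = {
    "stroke": {"gait_speed": 0.7, "grip_force_N": 120.0, "elbow_rom_deg": 90.0},
}
TASK_RELEVANT = {
    "change_gear": {"grip_force_N", "elbow_rom_deg"},  # lower body irrelevant
    "walk": {"gait_speed"},
}

def relevant_metrics(disability, task):
    """Keep only the metrics that influence the given primitive task."""
    metrics = DISABILITY_METRICS[disability]
    return {m: v for m, v in metrics.items() if m in TASK_RELEVANT[task]}

params = relevant_metrics("stroke", "change_gear")
# gait_speed is dropped; only grip force and elbow range of motion remain
```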
Table 8: Primitive Tasks Example
The Task Models implemented within VERITAS were based on the five application
scenarios. They refer to user actions/interactions with a specific environment
(e.g. car, workplace, user interface) and follow a hierarchical structure ranging
from abstract high-level tasks to primitive tasks. They support the use of
assistive devices to perform a specific task through multiple instances of a
specific action. They were developed based on the existing relevant state of the
art, standards and guidelines, as well as on domain knowledge and attributes
relevant to the VERITAS application scenarios: automotive, smart living spaces
and buildings, domotics, infotainment and health. The following table lists an
indicative instance of a task model.
Table 9: Task Model Example
The information described in the Task Models was merged into the Generic VUMs,
establishing a link between the different tasks, subtasks and primitive tasks and
the information contained in the AUM: disabilities, metrics and parameters.
Combining these two sets of information, one can start, for example, from a task
related to a specific sector, choose a subtask, and for the related primitive
tasks find information on the affected metrics and parameters as a function of
disabilities.
1.3.1.5. The Use Cases
Based on the Task Models that were generated, the next step was the development of
a set of Use Cases representative of each Application domain covered by the project,
in order to derive the key aspects of accessibility evaluation parameters, targeted user
groups and expected outcomes.
A use case defines the interactions between external stakeholders and the system
to be developed that allow them to achieve a specific goal. As presented before, a
stakeholder is any person or organization who will be affected, either positively
or negatively, by the system to be developed. Use cases capture the users' needs
by focusing on a task the user needs to perform. Simply naming the use cases
already creates value, since this list of titles is the list of goals announcing
what the system will do.
Use cases also help discover and gather requirements for all users and
beneficiaries, and work as a hub linking together different sorts of information.
They are at the centre of the requirements process, even for many of the
non-functional requirements. Finally, the use case methodology supports both
verification and validation of the system under development: with verification we
make sure that the system has been well structured and developed, and with
validation we ensure that the system corresponds to the users' needs and
expectations.
Use cases are goals (the two terms are used interchangeably) that are made up of
scenarios. Scenarios consist of a sequence of steps to achieve the goal; each
step in a scenario is a sub-goal of the use case. As such, each sub-goal
represents either another use case (a subordinate use case) or an autonomous
action at the lowest level desired by our use case decomposition. This
hierarchical relationship is needed to properly model the requirements of the
system being developed. In addition, it helps avoid the explosion of scenarios
that would occur if we tried to simply list all possible ways of interacting with
the system.
Table 10: Use Cases list
The stakeholders are the designers and developers in the various application
domains of the project, and the beneficiaries are the elderly and disabled users.
For each use case, a number of scenarios were drafted in order to have a clear
vision of the accessibility problem targeted in each application domain. Using
the use cases, in the form of problem scenarios, at the analysis stage can prevent
costly error correction at later stages of the development cycle. The following
table presents an example of the Category 1 use cases, drafted to evaluate the
VERITAS framework itself.
Table 11: Use Case Example: UC 1.2: User model generator
Figure 13: Merging information into the Generic Virtual User Model (in this case Physical)
The following table lists an indicative instance of a Generic Virtual User Model.
Table 12: Generic Virtual User Model example
The first step is to select the disability to be represented by the exported
virtual user model. For example, the designer may want to calculate the gait
parameter values corresponding to 90% of the stroke patient population. Using the
graphical user interface of the VERITAS User Model Generator, the designer
selects the disability and the preferred population percentage to be covered,
applied through regression for each disability parameter. The available
regression types are: a) parametric regression, b) non-parametric regression and
c) hybrid regression. The final result is a virtual user model generated in
UsiXML format, according to the value of each disability parameter.
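The idea behind the population-coverage step can be sketched as follows. This is a minimal stand-in using a plain normal quantile in place of the parametric regression mentioned in the text; the mean and standard deviation figures are invented.

```python
from statistics import NormalDist

# Sketch: given the Gaussian mean/std derived for a disability parameter,
# compute the parameter value that covers a chosen fraction of the (assumed
# normal) population. A plain quantile stands in for the regression types
# named in the text.
def parameter_for_coverage(mean, std, coverage):
    """Value below which `coverage` of the population falls."""
    return NormalDist(mu=mean, sigma=std).inv_cdf(coverage)

# e.g. a hypothetical stride-length parameter for stroke patients:
# mean 50 cm, std 2 cm, 90% population coverage
value = parameter_for_coverage(50.0, 2.0, 0.90)
```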
The abstract avatar model is then converted into a concrete virtual physical avatar
taking into account the parameters of the Virtual User Model. The conversion process
is performed by the VERITAS Model Platform (VerMP) and consists of a translation
from the geometry used by the user generator to the actual (and richer) format needed
by the simulator.
Figure 17: Customization of the Intelligent Avatar through the VERITAS Avatar Editor
The Core Simulation Platform is built around three main modules:
the Humanoid Module (or human model module), which is responsible for
simulating the avatar's capabilities and performing its actions;
the Scene Module (or scene model module), which is responsible for
managing the scene objects and their functionality;
the Task Manager Module, which manages the tasks to be performed, i.e. the
sequence of the avatar's actions and their results on the scene objects,
stemming from the hierarchical task definitions of the VERITAS Task Models.
The overall architecture of the Core Simulation Platform can be seen in Figure 18.
Figure 23: Graphical and Physical representations of the Intelligent Avatar in the VERITAS
Core Simulation Platform
The humanoid skeleton used in the Core Simulation Platform consists of 51 bones,
while each bone has a rigid body and is defined by the following attributes:
Mass
Volume
Position
Orientation
Linear Velocity
Angular Velocity
Linear Acceleration
Angular Acceleration
Collision body
There are a total of 50 joints supporting the skeleton, each with between 1 and 3
degrees of freedom (DoF). Each joint permits its attached bones to rotate:
a) by setting the bones' orientations directly (in level-1 simulations), or
b) by generating and applying torques to them (in level-2 simulations).
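A hypothetical data-structure sketch of the skeleton described above: bones as rigid bodies with the listed attributes, and joints whose 1-3 DoF set the attached bone's orientation directly (the level-1, kinematic case). The class and field names are illustrative, not the actual VERITAS classes.

```python
from dataclasses import dataclass

@dataclass
class Bone:
    """A rigid body with the attributes listed above (subset shown)."""
    name: str
    mass: float
    volume: float
    position: tuple = (0.0, 0.0, 0.0)
    orientation: tuple = (0.0, 0.0, 0.0)
    linear_velocity: tuple = (0.0, 0.0, 0.0)
    angular_velocity: tuple = (0.0, 0.0, 0.0)

@dataclass
class Joint:
    name: str
    dofs: int          # 1 to 3 degrees of freedom
    child: Bone = None

    def set_orientation(self, angles):
        """Level-1 simulation: set the attached bone's orientation directly."""
        assert len(angles) == self.dofs and self.dofs <= 3
        self.child.orientation = tuple(angles) + (0.0,) * (3 - self.dofs)

forearm = Bone("forearm", mass=1.2, volume=0.0012)
elbow = Joint("elbow", dofs=1, child=forearm)
elbow.set_orientation([1.0])  # rotate about the single free axis
```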
Besides the bone and joint elements, in order to properly define some specific regions
on the skeleton, another set of basic elements had to be introduced, the Points of
Interest or PoI. The PoI are special points on the humanoid's body that are needed so
as to properly define the higher level modules' structures. An example of the
supported avatar PoI can be seen in the following figure:
The Motion Planning algorithms used in the Core Simulation Platform and their
respective classes are part of the Collision Avoidance Manager. The algorithms used
are sampling-based. There are two sampling-based motion planners supported: a
multi-query graph based approach and an exploring-tree based approach.
Furthermore, the Gait Module is responsible for the locomotion of the avatar. Its
role is twofold: first it provides the shortest path between two points, and then
it generates the gait data. Examples of the results of these two components can
be seen in the figure below:
The LookAt Module takes into account various parameters in order to achieve natural
head motions. Several restrictions are applied in the computation of each joint
rotation. The parameters that define this behaviour are:
maximum allowable head and neck joint angular velocity,
maximum allowable eyeball angular velocity,
range of motion of the head and neck
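The constraints listed above can be sketched as a per-frame clamp: the requested head rotation is limited by the maximum joint angular velocity and by the joint's range of motion. The limit values below are invented examples, not VERITAS parameters.

```python
# Illustrative LookAt constraint sketch (single yaw axis). Limits are invented.
MAX_HEAD_VEL_DEG_S = 90.0        # max allowable head/neck angular velocity
HEAD_RANGE_DEG = (-70.0, 70.0)   # range of motion of the head and neck

def step_head_yaw(current, target, dt):
    """Advance the head yaw toward `target`, respecting velocity and range."""
    max_step = MAX_HEAD_VEL_DEG_S * dt
    delta = max(-max_step, min(max_step, target - current))
    new = current + delta
    return max(HEAD_RANGE_DEG[0], min(HEAD_RANGE_DEG[1], new))

# Turning from 0 toward 120 degrees in one 0.1 s frame moves only 9 degrees,
# and the angle can never exceed the 70-degree range limit.
angle = step_head_yaw(0.0, 120.0, 0.1)
```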
Examples of supported impairments and their respective filters can be seen in the
figure below:
Figure 29: From left: Normal vision, Protanopia, Deuteranopia, Tritanopia, Glaucoma, Macular
Degeneration
Hearing Simulation
Hearing simulation is based on the analysis of the virtual user model's audiogram
parameters. An audiogram is a standard way of representing a person's hearing loss.
Initially, the audiogram parameters, stored inside the virtual user model specification,
are passed into the hearing model. Then, the hearing model constructs the audiogram
of the avatar. Using the audiogram, the hearing model generates the audio filter and
applies it on all input sounds. Using the above method, several audio impairment
symptoms can be simulated, such as otitis, otosclerosis, noise induced hearing loss,
presbycusis etc. Audio impairment audiogram examples are presented in the
following figure:
Figure 30: From left: Otitis, Otosclerosis, Presbycusis Mild, Presbycusis severe
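The audiogram-to-filter step described above can be sketched as a per-band attenuation: each frequency's hearing loss in dB is converted to a linear amplitude factor applied to the input sound. The audiogram values below are illustrative, not clinical data.

```python
# Sketch of the hearing simulation: an audiogram (hearing loss in dB at
# standard frequencies) becomes a per-band attenuation filter. Values invented.
audiogram_db = {250: 10, 500: 10, 1000: 20, 2000: 35, 4000: 55, 8000: 70}

def attenuation_factor(loss_db):
    """Convert hearing loss in dB to a linear amplitude factor."""
    return 10 ** (-loss_db / 20.0)

def apply_audiogram(band_amplitudes):
    """band_amplitudes: {frequency_hz: linear amplitude} of the input sound."""
    return {
        f: amp * attenuation_factor(audiogram_db.get(f, 0))
        for f, amp in band_amplitudes.items()
    }

filtered = apply_audiogram({1000: 1.0, 4000: 1.0})
# high frequencies are attenuated far more than low ones
```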
Cognitive-Psychological/Behavioural Simulation
Cognitive simulation is performed by the cognition module. In its current state,
the cognition module cooperates with the Motion Manager and applies delays to the
avatar's motions, based on the avatar's cognitive parameters. Two kinds of time
delay are used to emulate the avatar's thinking process:
Pre-action delay: this delay is applied before the avatar's action and simulates
the thinking process for the task. Complex tasks result in greater durations than
simple ones; the complexity of a task is given by the path of the end effector,
so long curved paths that avoid many obstacles increase the pre-action delay.
In-action delay factor: the in-action delay factor is applied while the avatar is
in motion. The target time is increased by this factor, and the whole process
takes longer. This is achieved by increasing the interpolating duration between
the key-postures.
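The two delay types above can be sketched as follows. The pre-action delay grows with the planned end-effector path length and the number of avoided obstacles; the in-action factor stretches the motion by scaling the interpolation time between key-postures. The coefficients are invented, not VERITAS values.

```python
# Sketch of the cognition module's two delays. Coefficients are illustrative.
def pre_action_delay(path_length_m, obstacles_avoided, base=0.3):
    """Thinking time before the action: grows with path complexity."""
    return base + 0.2 * path_length_m + 0.1 * obstacles_avoided

def in_action_duration(nominal_duration_s, delay_factor):
    """In-action delay: the motion's target time is stretched by the factor."""
    return nominal_duration_s * delay_factor

think = pre_action_delay(path_length_m=1.5, obstacles_avoided=2)  # 0.8 s
move = in_action_duration(2.0, delay_factor=1.25)                 # 2.5 s
```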
Scene Module
The scene module is responsible for creating the scene, managing the objects in
it and defining their special attributes. The scene is modelled by two sets of
objects: static objects and moveable objects. Both types have geometry (volume)
and visual representation (meshes and textures). Static objects have no mass and
cannot be moved by the humanoid. Moveable objects, on the other hand, are
modelled as rigid bodies with properties such as uniformly distributed mass over
their volume (constant density) and linear and angular velocity. Complex objects
can be modelled by combining static and moveable objects and manually defining
the degrees of freedom of the moveable parts.
Special points of interest (PoI) can be declared on the objects to best define the
type of interaction with the humanoid. These points can help the humanoid
interact with the object, but they do not define the interaction; their only
purpose is to decrease the complexity of the task. The moveable objects can be
moved freely or in a constrained way, by altering their degrees of freedom (DoF).
As an example of a scene object, consider the car's storage compartment shown in
Figure 31. Its functionality is described by two DoF: one connecting the handle
to the storage compartment and another connecting the storage compartment to the
dashboard. In this example, a scene rule checks the state of the handle at every
timestep: if its angle relative to its parent (the compartment's door) exceeds a
predefined limit, the storage compartment is opened by a spring force.
Collision between objects, with proper contact reaction, is fully supported by
the scene module. A moveable object can collide either with a moveable/static
object or with the humanoid. Various attributes, such as surface material
properties, need to be defined for a realistic friction model. To decrease the
dynamical complexity of the scene, gravitational forces can be applied only to
the moveable objects.
Figure 31: The three states showing the car's storage compartment functionality. The arrows
represent the rotational degrees of freedom. The red box shows the PoI, used for interaction with
the object. Three objects are presented in the screenshots: the handle (moveable), the storage
compartments' door (moveable) and the car's dashboard (static).
Figure 32: Block diagram of a motor simulation cycle. The simulation cycle is repeated until
every primitive task is completed. At each step the task manager module tests and reports to the
user if a constraint (i.e. angle/torque limit, collision, etc.) was violated.
At every step, the task manager, as supervisor, checks for task completion and reports
to the system if something went wrong. The motor simulation cycle pattern that is
followed at each simulation step is shown in Figure 32. More precisely, at the start of
the cycle, the task manager module generates a series of movements (i.e. movement
path) for the humanoid to follow. Every state of the generated path must contain
information about the target configuration of the bones and joints. This target
configuration will be declared as Ctask. Collision avoidance techniques are applied at
this step so that the humanoid geometry does not penetrate its own elements or any
scene object. If a path cannot be found after a number of iterations in search space
then the task manager reports task failure to the system.
After that, the task manager provides the humanoid model with the generated
target configuration Ctask. The humanoid takes the provided configuration and
applies a set of restrictions to it, such as joint-angle restrictions.
Angle-constrained inverse kinematics methods are used to find a new configuration
Cconstrain that is close to the target one (Ctask); Cconstrain contains a target
angle for each of the joints. If the difference between Ctask and Cconstrain is
above a limit, the Task Manager reports failure. The difference metrics used are:
a) the average distance of the joint positions, b) the average distance of only
the end-effector positions, and c) the average absolute difference of each joint
angle.
Having a target angle for each joint, the system computes via inverse dynamics
the torques that need to be applied at each joint and sets the attached bones in
motion. If the computed torque values exceed the predefined limits, the task
fails. This step applies only when the simulation is running in dynamic mode; in
kinematic mode it is omitted and the target angles are set directly.
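The Ctask/Cconstrain check described above can be sketched as follows: each target joint angle is clamped to its allowed range, and the task fails when the average absolute angle difference (metric (c) in the text) exceeds a limit. Joint names, limits and thresholds are invented examples.

```python
# Sketch of the configuration check. All values are illustrative.
JOINT_LIMITS = {"elbow": (0.0, 150.0), "shoulder": (-90.0, 120.0)}
MAX_AVG_DIFF_DEG = 10.0

def constrain(c_task):
    """Build C_constrain by clamping each target angle to its joint range."""
    c_con = {}
    for joint, angle in c_task.items():
        lo, hi = JOINT_LIMITS[joint]
        c_con[joint] = max(lo, min(hi, angle))
    return c_con

def task_feasible(c_task):
    """Fail when the average |Ctask - Cconstrain| angle exceeds the limit."""
    c_con = constrain(c_task)
    avg_diff = sum(abs(c_task[j] - c_con[j]) for j in c_task) / len(c_task)
    return avg_diff <= MAX_AVG_DIFF_DEG

# Reaching 160 deg elbow flexion is clamped to 150 deg; the 5 deg average
# difference is within the limit, so the task proceeds.
ok = task_feasible({"elbow": 160.0, "shoulder": 30.0})
fail = task_feasible({"elbow": 200.0, "shoulder": 30.0})
```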
To decrease the complexity of each primitive task and its success/failure
condition, the task manager uses a system of task rules. Each primitive task can
have a set of rules that are checked at each timestep. Following the same rule
model as in the scene module, each rule has two main parts: a condition part and
a result part. When the condition is met, the rule's result part is applied.
Conditions can check various simulation elements and states, such as the current
distance of a specific bone from a PoI, the time elapsed since the task started,
or the number of times a specific humanoid action has been performed.
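A minimal sketch of this condition/result rule model: each rule's condition inspects the simulation state at a timestep, and when it holds the result is applied. The state fields and the example rule are invented.

```python
# Sketch of a task rule: condition part + result part, checked per timestep.
class Rule:
    def __init__(self, condition, result):
        self.condition = condition  # callable(state) -> bool
        self.result = result        # callable(state) -> None

    def check(self, state):
        if self.condition(state):
            self.result(state)

# e.g. succeed when the hand has been within 2 cm of the PoI for 1 second
def reached(state):
    return state["hand_to_poi_m"] < 0.02 and state["time_near_s"] >= 1.0

def mark_done(state):
    state["task_status"] = "completed"

rule = Rule(reached, mark_done)
state = {"hand_to_poi_m": 0.015, "time_near_s": 1.2, "task_status": "running"}
rule.check(state)
```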
Figure 33: The Veritas holistic architecture and the Exportable Toolbox
Core Simulation Platform (simulation engine): Simulation Runtime Engine (DLL)
User Model Generator (VerGen): Virtual User and Simulation Model Adaptor/Editor (Executable)
Veritas Model Platform (VerMP): Virtual User and Simulation Model Adaptor/Editor (Executable)
3D Core Simulation Viewer (VerSim-3D): Simulation Runtime Engine Viewer (Executable)
3D Immersive Simulation Viewer (IVerSim-3D): Simulation Runtime Engine Viewer (Executable)
Other than the Core Simulation Platform, the User Model Generator, the Model
Platform and the Avatar Editor, which have already been described, the functionalities
and features of the remaining tools are described briefly below:
Immersive Simulation Platform
The Immersive Simulation Platform provides to the VERITAS framework all the
functionality needed for managing immersive user simulation in VR
environments. The Immersive Simulation Platform is connected to the Core
Simulation Platform; thus, impairment simulation is applied to the 3D avatar of the
immersed user in the same way as in the automatic 3D avatar case. Cooperation with
the VerIM component ensures proper handling of the devices that receive the user
interactions.
[Diagram: Immersed User, VR Interaction Tools (Hardware), Interaction Manager (DLL)]
Assigning masses to the volumes; this process is necessary for the inclusion
of physical quantities, such as velocity, acceleration, force, torque, etc., in the
simulation.
Connecting primitive objects to one another by restricting their motion
degrees of freedom, either rotationally or translationally.
Adding resistance forces and velocities to the rigid scene bodies.
Adding special rules to the objects, such as "when the handle object is rotated by 30
degrees, the door object opens".
Figure 35: The VerSEd-3D graphical user interface; a scene with an office desk has been loaded.
The objects of the scene are assigned in the left panel and the properties, such as mass, position,
orientation, etc., of each object may be defined in the right panel.
3D Core Simulation Viewer (VerSim-3D)
The 3D Core Simulation Viewer (VerSim-3D) is a tool that has been developed as
part of the VERITAS activity A2.1.1. It is integrated with the Core Simulation
Platform DLL and is used for running the 3D simulations based on special scenario
xml files.
Product designers and developers who use the VERITAS platform may use the
VerSim-3D tool to assess a product prototype against virtual user groups
regarding their ability to use the prototype effectively. Existing or future
devices or installations can serve as such prototypes as soon as their physical model
(mesh) and their properties are adapted for simulation in VerSEd-3D.
The tool is used for performing automatic simulations, i.e. all the avatar's actions
are generated automatically by the Core Simulation Platform's engine. VerSim-3D
cannot be used for immersive simulation sessions; for that purpose, IVerSim-3D
is used. The VerSim-3D tool targets the evaluation of the accessibility of
non-ICT products by simulating the functionality of real objects with mass and
volume.
It uses a variety of files in order to perform a simulation session: files regarding the
customization of the avatar model and its physical characteristics (CAL3D and
Virtual User Model); a file for generating the 3D scene and its functionality (adapted
3D scene file); and a file that describes the scenario that contains the actions of the
avatar (adapted simulation scenario from a simulation model file).
The user loads these files via the VerSim-3D GUI and starts the simulation session.
During the simulation, the main window of the tool is updated by rendering the
simulation environment and the avatar model. Information is provided to the user,
such as arrows depicting the generated body torques, indications of whether a body
joint has reached its limits, etc. In case of task success or failure, the VerSim-3D
tool informs the user by displaying the respective information.
Figure 37: VerSim-3D's simulation cascade; the user loads a scene file, a scenario file and adds
the Virtual User Models that will be tested sequentially.
At the end of each simulation session, the Core Simulation Platform engine generates
a simulation report containing various metrics and factors of the avatar's actions.
Besides the time duration of the tasks, information about physical and comfort
factors is included in the simulation report. The report is stored in an xml file
that can be viewed by the VerSim-3D tool and even compared to other simulation
report files.
Figure 38: The Simulation Report window, part of VerSim-3D tool, providing information
regarding simulated sessions of different designs.
Immersive Simulation Viewer (IVerSim-3D)
IVerSim-3D provides user interfacing capabilities to the Immersive Simulation
Platform. The immersed user can use it to load and provide the 3D scene, the VUM
file, scenario files and avatar models to the Immersive Simulation Platform.
IVerSim-3D renders the VR environment using stereoscopic methods (Figure 39) in
order to achieve the user's immersion. IVerSim-3D is an application that
integrates both the Immersive and the Core Simulation Platforms.
Figure 40: The graphical user interface of the VerSEd-GUI tool; in this example, a capture
project has just been completed. The capture info window and the user activity chart are
depicted.
The VerSEd-GUI tool is used for creating the Simulation Scenario files needed by the
VERITAS GUI Core Simulation Viewer (VerSim-GUI) to perform the
accessibility assessment of a GUI design. This tool is used to capture the user's
activity in a desktop application and process the loaded Simulation Model to produce
the Simulation Scenario file independently of hardware and software characteristics.
VerSEd-GUI can capture mouse and keyboard activity in a specified area of the
screen during the recording phase (Recorder Mode). After loading a Simulation
Model file, the prerecorded user-driven events can be directly connected to tasks
after processing in the Editor Mode. The outcome is a scenario file describing the
user's activity on a simple task.
It should be noted that VerSEd-GUI serves a second important purpose in the VERITAS
project: it is the logging tool for all other VERITAS software tools that run as
desktop applications (excluding IVerSim-3D, the fully immersive real-time simulator).
As an advanced logging and user tracking tool, it is used to create the raw data
(based on scenarios) for the usability evaluation.
GUI Core Simulation Viewer (VerSim-GUI)
The GUI Core Simulation Viewer (or VerSim-GUI) is a tool that has been developed
as part of the VERITAS activity A2.1.1. It is integrated with the Core Simulation
Platform DLL and is used for running the 2D/GUI simulations based on special
scenario xml files.
Application designers and developers who use the VERITAS platform may use the
VerSim-GUI tool to assess an application prototype against special user
groups regarding their ability to use the prototype effectively. Existing or
future applications can serve as such prototypes as soon as their interface and their
properties are set in VerSEd-GUI. The GUI of the tool is depicted in the following
figure:
Figure 41: The VerSim-GUI tool interface. A scenario of a healthcare application has been
loaded.
The tool requires two files as input: the scenario file (created using VerSEd-GUI)
and a virtual user model file (created using the Veritas Virtual User Generator
Tool). In addition, the application prototype being tested must be running. The
tool can be used to perform either automatic or interactive simulation.
In the case of automatic simulation, the Core Simulation Engine pre-calculates all
the necessary actions (mouse movements, mouse clicks, keystrokes, etc.) according
to the virtual user model that has been loaded, and plays them on the screen.
In the interactive simulation mode, the user has control of the mouse and keyboard
(and any other input device: haptic, voice, etc.), and the Core Simulation Engine
compensates/corrects his/her actions with respect to the loaded virtual user model
(i.e. decreases/increases mouse speed or accuracy, etc.).
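The compensation step can be illustrated with a minimal sketch that scales and perturbs raw mouse displacements; the parameter names (`pointer_speed_factor`, `pointer_jitter_px`) are assumptions for illustration, not the actual VUM schema:

```python
import random

def compensate_mouse_delta(dx, dy, vum):
    """Scale and perturb a raw mouse displacement according to virtual
    user model parameters (illustrative names)."""
    gain = vum.get("pointer_speed_factor", 1.0)   # <1 slows the pointer down
    jitter = vum.get("pointer_jitter_px", 0.0)    # accuracy degradation in pixels
    nx = dx * gain + random.uniform(-jitter, jitter)
    ny = dy * gain + random.uniform(-jitter, jitter)
    return nx, ny
```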
VerSim-GUI also provides a multiple-VUM offline simulation mode, called Cascade
Simulation, useful when different levels of an impairment need to be tested in groups
and later compared against each other or against the optimal user recording.
Performing a cascade simulation consists of loading a number of VUMs and one or more
simulation scenarios and running the simulation, as seen in the following figures.
When the simulation is finished, the Core Simulation Engine creates a report
containing various metrics and factors of the user actions. These metrics and
factors include time duration, mouse movement and click statistics, as well as
keystroke statistics. These metrics are also compared with values extracted for a
normal (non-disabled) user.
Figure 42: Running simulation with healthcare application using a visually impaired user model
Figure 44: Report generation for the final outcome of a simulation by VerSim-GUI
Interaction Tools and Interaction Manager (VerIM)
The Immersive Simulation Platform is equipped with novel interaction tools
developed as part of WP2.7. These tools are able to simulate the disabilities,
allowing designers/developers to design accessible services and products by
becoming aware of the disability of the Virtual User Model. The Veritas Interaction
Manager (VerIM) supports all the functionality needed for handling the various
interaction tools that simulate the different disabilities. More specifically, the
Interaction Manager is responsible for transmitting to the designer/developer the
sensory feedback according to the selected user's disabilities and the application's
scenarios.
The interaction tools (IT) aim at simulating or intuitively communicating a specified
disability or impairment to the designer through an immersive experience. This
objective is achieved through the full integration of the ITs within the Immersive
Simulation Platform (ISP). The whole set of Interaction Tools includes:
Physical IT:
Visual Impairments:
o Visual Field
Motor Functional Limitations:
o Kinematics Functional Limitations
o Dynamics Functional Limitations
o Control Functional Limitations
Cognitive IT:
o Attention Limitations
o Reaction Time Alteration
Behavioural and Psychological IT:
o Stress Level
o Emotional State Alteration
The architecture of the Interaction Tools and the role of the Interaction Manager is
depicted in the following diagram:
Figure 46: Visual Functional Limitation example and corresponding VerIM Control Panel
Figure 47: Kinematic Limitation Interaction Tool I example and corresponding VerIM Control
Panel
The second implementation of the Kinematic Functional Limitation IT has been
achieved through the hardware prototyping of a novel vibrotactile interface
integrated on a suit that can be worn by the designer. The designer receives
warnings through vibrations generated on his own body. In the context of simulating
Kinematic Functional Limitations, the VERITAS consortium developed a vibrotactile
system for communicating to the designer information regarding the movement
limitations of impaired users. The system is composed of four vibrating motors and
wirelessly controlled driving electronics.
Figure 48: Scheme of the vibrotactile KFL IT II (left) and prototype of the device integrated on
the arm of the user.
The vibration is interpreted as a simulation of the pain or discomfort felt by the
end-user in one or more specific joints. The four vibrating elements are located on
the joints of the arm as in Figure 48 (wrist, elbow, shoulder and, optionally, the
shoulder blade), on the areas that are usually affected by pain when the joint
limits are exceeded.
The definition of the joint limits takes place through the Parameter Encoder
(included in the Interaction Manager), which loads a specific user model and applies
the corresponding control parameters for the simulation of the specified
impairments. The Parameter Encoder has been developed to support the control mask
parameters presented above.
Figure 49: Example of the Kinematic Limitation Interaction Tool II and corresponding VerIM
Control Panel
Dynamic Functional Limitation IT: DFL IT
The Dynamic Functional Limitation Interaction Tool (DFL IT) is a system for
simulating functional limitations related to disabilities that cause loss of muscle
tone and reduced muscular mass, i.e. disabilities typically associated with aging
and with other specific pathologies such as ulnar neuropathy, or occurring as a
consequence of lordosis. The DFL IT supports the VERITAS Immersive Simulation
Platform in an Automotive scenario, and its physical interface consists of a
Force-Feedback Steering Wheel (FFSW).
The FFSW is an example of the so-called "task replica" haptic user interfaces:
haptic interfaces characterised by comprising a reproduction of the real tool
handled by the end-user in the real world (a steering wheel in this case). Within
VERITAS, the device is exploited to let designers experience a perception of force
equivalent to that of the disabled user.
Figure 50: Picture of the functional test of the integrated DFL IT system
The role of VerIM for this tool is to translate, via the Parameter Encoder, the User
Model data into the amount of extra force the specified user would need to operate
the steering wheel. In order to compute the exact force feedback to be applied to
the steering wheel, VerIM also needs some static data (the dynamic model of the
car) and the simulated car speed. VerIM receives from the IT the value of the
steering angle, which is communicated to the main application in order to update
the scenario accordingly.
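The translation from User Model to extra steering force can be sketched as below, assuming a hypothetical arm-strength ratio parameter and a toy speed-dependent steering torque model (both illustrative, not the VERITAS encoder or the car's actual dynamic model):

```python
def base_steering_torque(speed_kmh, k_static=2.0, k_dynamic=0.05):
    """Very rough speed-dependent steering torque in Nm, standing in for
    the static dynamic model of the car (constants are illustrative)."""
    return k_static + k_dynamic * speed_kmh

def steering_assist_torque(base_torque, vum_strength_ratio):
    """Extra resistance the force-feedback wheel must add so the designer
    feels the effort a weaker user would need. vum_strength_ratio is the
    simulated user's arm strength relative to an unimpaired user, in (0, 1]."""
    if not 0.0 < vum_strength_ratio <= 1.0:
        raise ValueError("strength ratio must be in (0, 1]")
    felt_torque = base_torque / vum_strength_ratio
    return felt_torque - base_torque
```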
Control Functional Limitation IT: CFL IT
The Control Functional Limitation Interaction Tool (CFL IT) is employed within the
VERITAS platform for simulating functional limitations associated with motor
control disabilities such as essential (postural) and Parkinsonian (rest) tremor,
as well as other neurodegenerative diseases that may be caused by aging.
In order to simulate such disabilities for the VERITAS end user, a kinesthetic
haptic interface called GRAB (Figure 51) has been purposely equipped with a special
end-effector and a closed-loop controller. Basically, by imposing harmonic
disturbance forces, the CFL IT is capable of inducing on the designer's
fingertip/wrist controllable harmonic oscillatory motions (superimposed on the
desired designer movement trajectories) that closely reproduce the impairments
experienced by tremor-affected people.
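The disturbance signal itself is a plain harmonic force; a sketch such as the following could drive a haptic control loop. The amplitude, frequency and the 1 kHz sampling rate are illustrative assumptions, not the GRAB controller's actual parameters:

```python
import math

def tremor_force(t, amplitude_n, frequency_hz, phase=0.0):
    """Harmonic disturbance force (N) at time t, to be superimposed on the
    designer's motion through the haptic device."""
    return amplitude_n * math.sin(2 * math.pi * frequency_hz * t + phase)

def tremor_profile(duration_s, amplitude_n, frequency_hz, rate_hz=1000):
    """Sampled force profile, e.g. for a 1 kHz haptic control loop."""
    n = int(duration_s * rate_hz)
    return [tremor_force(i / rate_hz, amplitude_n, frequency_hz)
            for i in range(n)]
```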
GRAB is a 6-DOF device that can be attached to the user's finger or wrist and is
able to exert peak forces in the range of 0-20 N. GRAB's functionality is
essentially equivalent to that of the commercial Phantom device from Sensable
Corp., but the workspace of the device is much larger and almost completely covers
the workspace of the human arm.
Figure 51: The GRAB haptic user interface is able to deliver a force along any desired orientation
in 3D space
The purpose of the CFL IT is to make an able-bodied user feel the effects of the
functional limitations experienced by tremor-affected people while moving objects
with their hands. As mentioned above, this objective is accomplished by externally
perturbing the user's wrist with harmonic forces provided via the GRAB device. To
achieve high levels of realism and comprehensiveness, a specific closed-loop
controller has been implemented on the GRAB device, which is able to accurately
regulate the frequency and the amplitude of the artificially induced tremor at
levels ranging from high through moderate to low intensity.
The n-back tool doesn't interact directly with the Immersive Simulation Platform.
The sequence of numbers is translated into speech that the designer listens to via a
headset. The vocal responses of the designer are acquired by a microphone and
analysed by a Speech Recognition module in order to verify the correctness of the
answer. The overall n-back task performance is logged at the end of the task.
The system initialises the n-back tool with a quantity of "cognitive load" (or a
level of difficulty), whose value is determined by the Parameter Encoder according
to the User Model. The cognitive load is mapped to the values of a set of parameters
of the n-back tool, such as the speed at which numbers are presented to the user and
the n value.
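One possible mapping from a normalized cognitive load to n-back parameters is sketched below; the value ranges and the linear mapping are illustrative assumptions, not the actual Parameter Encoder:

```python
def nback_settings(cognitive_load):
    """Map a normalized cognitive load in [0, 1] to n-back parameters:
    the n value and the presentation interval between spoken numbers.
    A higher load means a larger n and a faster pace."""
    if not 0.0 <= cognitive_load <= 1.0:
        raise ValueError("load must be normalized to [0, 1]")
    n = 1 + round(cognitive_load * 2)         # n in {1, 2, 3}
    interval_s = 3.0 - 1.5 * cognitive_load   # seconds between numbers
    return {"n": n, "interval_s": interval_s}
```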
The extra cognitive load is a function not only of the model of the simulated user but
also of the cognitive model of the designer; therefore such a characterization of the
specific designer should be made available before using the platform.
The role of the IM for this tool is to translate, via the Parameter Encoder, the
User Model data into a measure of the additional cognitive load to impose on the
designer for the n-back task. If an error is detected by the IT, the IM receives the
ID of the warning, which is transmitted to the ISP in order to trigger an
appropriate audio/visual signal.
Behavioural & psychological interaction tools
The Behavioural & Psychological IT is devoted to letting the designer experience the
expected psychological reactions of the VERITAS end-user while performing tasks
considered in the VERITAS scenarios via the VERITAS Immersive Simulation Viewer.
Stress, fatigue, motivation and emotions are closely related to cognition and
influence cognitive performance. Therefore, the behavioural and psychological
interaction tools are closely related to the cognitive interaction tools developed.
Prior to performing the VERITAS tasks in the immersive environment, stress
induction (pre-stressing the user) can be realised with a GUI, e.g. using the Montreal
Imaging Stress Task, the Trier Social Stress Test or the Trier Mental Challenge Test.
During the VERITAS tasks in the immersive environment, the presentation of the
arithmetic task can be either:
Auditory using headphones, or
Visual via superimposing the text of the arithmetic task over the graphic
output generated by the immersive platform.
The software begins with a main start menu (Figure 52). The user can access the
other menus using either a mouse or a speech recognition system as input device (by
reading and pronouncing a keyword, for example "Stress", "Neutral", "Fear",
"Sadness", or a number "1", "2", "3"). Such a system, which limits the range of
answers, can reduce errors. Relatively short speech inputs are recommended: they
shorten the time it takes to train the user's voice and allow the user to enter
speech inputs quickly and easily.
platform. The Multimodal Interfaces Manager is also responsible for selecting and
applying the multimodal interfaces models in order to perform the modality
compensation when needed. The Modality Compensation GUI is depicted in Figure
55.
Figure 55: The graphical user interface of the Modality Compensation and Replacement Module,
as part of the Multimodal Interfaces Manager tool.
The Multimodal Interfaces Manager module is a C++ dynamic link library that
provides to the VERITAS framework a two-fold functionality:
First, management of the Multimodal Interfaces models and proper task
analysis of the simulation model based on them, and
Second, management of the external modules/libraries that implement
modality interfaces, e.g. speech synthesizers.
The modality compensation process is the main functionality of the module. More
specifically, the Multimodal Interaction Manager cooperates with the Core
Simulation Platform and it intervenes when needed in order to replace a modality
with another which is more appropriate for the capabilities of the selected Virtual
User Model.
The Multimodal Interfaces Manager library supports the following functionality:
Parsing the Multimodal Interfaces Models files,
Organizing them and selecting the appropriate model for each situation,
Replacing the current task flow with other tasks that are based on modalities
more appropriate to the Virtual User Model capabilities.
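The replacement step can be sketched as a fallback lookup; the task structure, modality names and fallback table below are illustrative assumptions, not the actual Multimodal Interfaces model format:

```python
def compensate_modalities(tasks, vum_capabilities):
    """Replace each task's modality with a supported alternative when the
    virtual user cannot perceive the original one."""
    fallbacks = {"visual": ["auditory", "haptic"],
                 "auditory": ["visual", "haptic"]}
    adapted = []
    for task in tasks:
        modality = task["modality"]
        if modality not in vum_capabilities:
            for alt in fallbacks.get(modality, []):
                if alt in vum_capabilities:
                    # keep the original task data, note the replacement
                    task = {**task, "modality": alt, "replaced_from": modality}
                    break
        adapted.append(task)
    return adapted
```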
A table reporting for each use case the subtask success criteria, which are one
basis for evaluating the effectiveness of the accessible solutions developed
with VERITAS tools
A table describing the transfer of simulation models in the target simulation
environments.
Finally, the integration of each identified VERITAS tool was performed starting with
a review of the requirements for these tools, followed by integration specifications,
and finally with the integration's implementation and documentation.
The review of the requirements indicated the following ergonomic aspects as key to
the accessibility simulation using the VERITAS framework:
Visibility: Can the target user see all necessary visual information? This
includes the recognition of ICTs as well as classical aspects such as sight limits
inside and outside the vehicle.
Clearance and accessibility: Does the vehicle have sufficient room and
solutions so that the target user can carry out all necessary movements freely,
without collision and in comfortable conditions?
Ease of use: Can the target user reach and manipulate the controls easily
within his/her physical capacity (strength, maximum joint range of motion,
keeping equilibrium)?
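A minimal sketch of such an ease-of-use check, reducing the full joint chain and range-of-motion analysis to a reach sphere plus a strength threshold (both simplifications are assumptions for illustration):

```python
import math

def within_reach_and_strength(control_pos, shoulder_pos, arm_length_m,
                              required_force_n, max_force_n):
    """True if the control lies within the arm's reach sphere and demands
    no more force than the simulated user can exert. A real assessment
    would use the full joint chain, ROM limits and collision checks."""
    reachable = math.dist(control_pos, shoulder_pos) <= arm_length_m
    strong_enough = required_force_n <= max_force_n
    return reachable and strong_enough
```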
Another key functional performance aspect in automotive design and ergonomics
analysis was identified as the assessment of vibrational comfort. Since exposure to
vibrations may have adverse effects on human health, the comfort performance of the
final product must be carefully evaluated and optimized in the virtual engineering
process, in order to guarantee the comfort of the product that eventually hits the
road.
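The basic quantity behind such comfort assessments is the root-mean-square of the measured acceleration; standards such as ISO 2631 additionally apply frequency weighting to the signal first, which this sketch omits:

```python
import math

def rms_acceleration(samples_ms2):
    """RMS of an acceleration signal in m/s^2, the basic ingredient of
    whole-body vibration comfort metrics (frequency weighting omitted)."""
    return math.sqrt(sum(a * a for a in samples_ms2) / len(samples_ms2))
```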
Based on the findings of the review, the integration methodology with VERITAS was
formulated as shown in Figure 56.
The process of identifying each use case's requirements was applied to four
different subjects (car interior design, motorcycle riding posture, ADAS-IVIS
systems, ORAS-OBIS systems), and related specific scenarios were identified both
for desktop-based and for immersive accessibility assessment. The selected use case
scenarios can be seen in the following tables:
Table 14: Use case analysis results (desktop application)
Based on the use case analysis, the simulation models were developed based on the
following methodology:
1. Generation of overall task table
All necessary data were collected from the use cases, task models and interaction
modalities. Finally, they were structured in a table as illustrated in Figure 57.
feedback with haptic devices, which has its own limitations and necessitates a
more involved set-up and specialized hardware; b) qualitative feedback, which
indicates the limitations as warnings but does not inhibit the performed
motion.
Third-person perspective: The user observes the impaired person acting on the
design. This mode is comparable with the experience of the VerSim-3D
desktop tool, but with greater spatial navigation flexibility. The core
simulation provides the avatar representation in the virtual environment.
Based on the use case definition for the Domotics use case, common simulation
models for the core simulation of Smart Living Spaces were derived. The tasks were
combined into multi-task scenarios, shown in the substructures below.
Walk from main door to the kitchen
1. Walk path
2. Open all doors on path with handle
3. Switch on all lights in all passing rooms
Operate dishwasher
1. Open dishwasher
2. Reach in dishwasher
3. Open dishwasher drawer
Operate fridge
1. Open fridge door
2. Reach in
Operate toilet
1. Open door
2. Switch on light
3. Open toilet lid
4. Reach the flush
The integration with the VERITAS Exportable Toolbox was implemented following
the methodology depicted in Figure 60:
The SLS Simulation system's integration with other subsystems is further enhanced
by the use of extensions, consisting of three modules:
1. SLS UI Plugin: UI tab for the specifics of the SLS control UI
2. SLS Behavior: Here the dynamic and trigger objects of the SLS modules are
managed
3. Domotic Connector: Network link to the Domologic Server
The SLS UI Plugin extends the web interface with an additional tab where
control events are sent to the IVerSim-3D application. This includes
specifically the control of the connection to the domotic server which can be
started and stopped.
Since the CFL_IT is a haptic interface, this integration was a physical integration
between each gas hob and the CFL_IT, which in turn needed to be connected to a
computer running the VerIM software. This meant that the CFL_IT was moved to the
Indesit lab and a particular experimental set-up was built, so that the designer
could wear the CFL_IT haptic interface while using the gas hob.
Workplace Design & Collaborative Tools
Starting from the results of the task analysis and use cases and the virtual user
model files relevant to the workplace, the requirements of end-users and of
technology providers for ergonomic design with respect to humans with limited
functionalities were analysed first. This analysis was then used to derive specific
tasks to be performed in simulation scenarios that follow the use cases. The tasks
were in turn analysed to describe the interactions, objects, success criteria and
thresholds that fully describe the simulation modelling. Finally, the simulation
models were formulated in UsiXML form, in order to describe them abstractly in a
form that can then be adapted by the Veritas Tools to match specific designs and
users.
We started the work to create the simulation models by first performing a study of the
workplace (workplace design and collaborative tools) needs, requirements, guidelines
and standards which govern usability and accessibility in the workplace domain. The
work carried out during this activity revolved around literature research,
questionnaires and expert interviews, in order to provide a compendium of
requirements with regards to accessibility assessment in the workplace.
Based on the use cases and the multidimensional task analysis, we derived the
specific simulation models in table form, from which the resulting UsiXML files
were generated. In total, 6 simulation models were created, 3 for the Workplace
Design and 3 for the Collaborative Tools sub-domains; in addition, 39 task models
(20 for Workplace and 19 for Collaborative Tools) and 98 Multimodal Interaction
Models (72 for Workplace and 26 for Collaborative Tools) were created and the
corresponding UsiXML files were generated.
Workplace design using the VERITAS Core Simulation Platform (CSP) makes use of
existing CAD environments, so the main effort behind the simulation aspects is
similar to that of the Automotive domain. The integration follows a methodology
similar to that of Ramsis, but in this domain it deals with the inclusion of the
CSP into an existing workplace-design-oriented CAD environment. The external design
environment serves as a carrier of the simulation preparation and simulation-performing
functionality. The chosen CAD environment is VrDeco ver. 2.0, and its graphical
user interface was extended to contain the necessary functionalities of the
Exportable Toolbox.
With designers developing concepts for accessible workplaces, the VERITAS toolbox
is introduced into the design environment to connect the design stored in computer
memory with the simulation editing and performing software. As a result, the toolbox
was equipped with a minimal set of loaders and parameterizations of simulation
scenes, virtual user models, avatars and simulation scenarios. Cycles of testing
using virtual user models and optimization of the workplace design were integrated
into the design process activities. The integration API, the methodology followed,
the architecture of the VERITAS Tools Message Interchange System (V-TOMIS) and the
way it is used via system calls between different tools (internal or external)
completed the modular structure of the system. The use case analysis methodology was
used to derive the simulation models, while their generation followed the
methodology presented in the Automotive domain. An example of a produced simulation
model can be seen in the following table, while the integration methodology is
exemplified in Figure 61.
Table 16: Workplace Design Simulation task table example
output image can be desaturated to simulate color blindness, or the audio channel
can be muted to simulate hearing problems.
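The desaturation post-process can be sketched as a blend of each pixel toward its luma; the `severity` parameter and the Rec. 601 luma weights are illustrative choices, not the platform's actual filter:

```python
def desaturate(rgb_pixels, severity=1.0):
    """Blend each pixel toward its luma to approximate loss of color
    perception; severity=1.0 yields full grayscale, 0.0 leaves the image
    unchanged. Uses Rec. 601 luma weights."""
    out = []
    for r, g, b in rgb_pixels:
        y = 0.299 * r + 0.587 * g + 0.114 * b
        out.append(tuple(round(c + (y - c) * severity) for c in (r, g, b)))
    return out
```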
The implementation of the sensory limitations is linked to the virtual user models
and has to be directly integrated into the game or metaverse application. The
integration is performed by feeding the outputs of the game/metaverse engine to the
Core Simulation Platform. The platform then post-processes the output and embeds it
into the virtual scene, which models the particular workplace configuration
(monitor, keyboard, mouse and tablet). In order to allow the developer to focus
primarily on the accessibility of the game instead of solving workplace issues, the
platform should also support an "ideal workplace" mode in which the limitations
imposed by the particular workplace configuration are minimized. In particular, in
this mode we assume that the user can focus on the whole screen without any
restrictions.
Simulating cognitive and behavioral limitations
In order to simulate cognitive and behavioral limitations, we need to extract
relevant indicators of cognitive load and behavioral requirements from the game
engine. This is in general a very difficult problem, which has not yet been
sufficiently solved and understood for more complex scenarios. In VERITAS we assume
that we can obtain aggregate information indicating the cognitive load of the user,
either by analyzing the game output image using certain image complexity metrics or
through additional explicit hints from the game/metaverse engine. These values are
passed to the cognitive user model and in turn affect the reaction times of the
users in the simulation of other tasks.
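One simple image complexity metric is the Shannon entropy of the grayscale histogram; the linear coupling to reaction time below is an illustrative assumption, not the VERITAS cognitive model:

```python
import math

def image_entropy(gray_pixels, levels=256):
    """Shannon entropy (bits) of a grayscale histogram; a busier frame
    yields higher entropy, used here as a rough complexity indicator."""
    hist = [0] * levels
    for p in gray_pixels:
        hist[p] += 1
    n = len(gray_pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def adjusted_reaction_time(base_rt_s, entropy_bits, sensitivity=0.05):
    """Assumed linear growth of reaction time with the aggregate
    cognitive-load indicator derived from image complexity."""
    return base_rt_s * (1.0 + sensitivity * entropy_bits)
```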
The simulation aspects described above are not exclusive to games and metaverses,
but cover any ICT application with a GUI and typical interaction devices (mouse,
keyboard and joystick). Therefore, this methodology can be used in all other domains
that feature GUI-based designs.
Infotainment applications, like all ICT software applications to some extent,
consist of a user interface providing specific interactions, in either a game or a
virtual metaverse environment. As such, the accessibility assessment through
simulation consists of a set of steps taken by the designer/developer, which are
outlined in Figure 62.
The general approach consists of 5 steps:
Record Interaction Session: The designer records a session in which the minimum
number of events needed to perform a task specified in a simulation model is
logged. The recording contains only raw information on the basic input events
(mouse motion and button clicks, keyboard entry, joystick motion, etc.). The
recorded events do not dictate specific motion paths or keyboard typing rates.
Mouse or joystick motion events are described only by source and destination
positions, while timestamped mouse button presses/releases and keyboard key
presses/releases indicate the minimum time required to perform the events and the
event targets.
Adapt Session for Veritas Simulation: In order to attribute these events to the
generic simulation model specified in a UsiXML file, the designer must match
each event or sequence of events to a specific task. Each UI has different
characteristics, such as screen resolution and window size, UI element types,
sizes, locations and functionalities. For example, a generic task defined in a
UsiXML simulation model may be "Enter a username". This task can be applied to
various applications, but the specific parameters for its execution in each
application are unique. For a given application, this task may consist of moving
the mouse to the username entry field, clicking in the field, entering a variable
set of characters using the keyboard and finally pressing Enter or clicking on a
button to submit the username. When all the events are attributed to tasks, the
final outcome is a simulation scenario, adapted specifically to that particular
application and ready to be simulated with virtual user models of impaired users,
using the Veritas GUI Simulation framework. For each event, the designer can
modify the strictness of the evaluation of the event's outcome when it is
simulated, by indicating whether a failure to perform the event should be
considered a failure of the entire task or whether the event can be retried until
the outcome is the desired one.
Apply Virtual User Model Parameters: The next step of the accessibility
assessment of an ICT application consists of applying the parameters of a VUM
that affect the accuracy, speed and sensory output of an interaction event.
Using the algorithms defined in SP1 and implemented in WP2.1, each event
described in the adapted simulation scenario of the previous step is filtered
using the virtual user model parameters affecting its outcome. For example,
motor impairments of the upper body, and specifically the upper limbs, in a
virtual user model are applied to the mouse motion, mouse button, joystick
motion, joystick button and keyboard entry events, modifying the recorded
optimal paths, times and target positions accordingly. Vision impairment
parameters are used to generate simulated visual output of the UI, affecting the
ability to locate and interact with UI elements, while hearing impairment
parameters are used to provide simulated audio output, affecting the virtual
user's ability to identify, comprehend and locate sound events in a 3D spatial
context.
Simulate Session: When the parameters described in the previous step are
applied to the simulation scenario, the outcome of the simulated interaction
session may or may not be successful, may take an inordinate amount of time
to complete, or may be very difficult to perform altogether. The Veritas GUI
Simulation framework provides two modes of simulation: automatic and
interactive.
o In automatic mode, the core simulation platform takes over the entire
execution of the simulation scenario by injecting simulated input events, in
effect controlling the mouse and joystick motion, performing mouse and joystick
button presses and releases and keyboard entries, all filtered by the parameters
of the VUM. This filtering affects the motion paths, times between events,
accuracy of target location, etc., based on the motor and cognitive parameters
of the VUM. At the same time, the visual and audio outputs of the system are
also filtered, to provide a realistic simulation of the interface for a user
with the impairments defined in the virtual user model. The outcome of the
simulation depends on the strictness defined in the adapted simulation scenario
for each event and on the overall ability of the simulated virtual user to
perform the tasks defined in the simulation model.
o In interactive mode, the core simulation platform again simulates the
effects of the impairments on the motor, visual and hearing modalities, but
the designer has control of the input. In this case, accessibility assessment
is performed directly by the application designers/developers, who place
themselves in the position of an impaired user trying to interact with the
application.
Report Accessibility Assessment: After the simulation session is over, depending
on how the tasks defined in the simulation model were performed, the Veritas GUI
Simulation framework reports on screen the success or failure of the session as
a whole and of each task individually. When a task fails, the reason is reported
(e.g. wrong key pressed, mouse click outside a UI element's area, etc.).
Furthermore, the entire simulated session is recorded in the same manner as the
initial session recording of the first step, so that the simulated session can
be quantitatively evaluated against the optimal one. Through this quantitative
analysis, useful insights can be extracted, such as error rates, path deviations
and speed of execution per task.
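The five-step flow above can be sketched in code. The following minimal sketch is illustrative only (the event fields, function name and impairment parameters are assumptions, not the actual VERITAS API): a recorded session is a list of raw timestamped events, and a motor-impairment filter stretches the optimal inter-event times and perturbs the target positions before simulated playback.

```python
import random
from dataclasses import dataclass


@dataclass
class InputEvent:
    """One raw input event, as logged during the recording step."""
    kind: str            # "mouse_move", "mouse_press", "key_press", ...
    timestamp: float     # seconds since session start
    src: tuple = None    # source position for motion events
    dst: tuple = None    # destination/target position
    strict: bool = True  # strict evaluation: an event failure fails the task


def apply_motor_impairment(events, slowdown=1.6, jitter_px=12):
    """Filter a recorded session with (hypothetical) VUM motor parameters:
    stretch the optimal inter-event times and perturb target positions."""
    out, t, prev_ts = [], 0.0, 0.0
    for ev in events:
        t += (ev.timestamp - prev_ts) * slowdown  # stretched time budget
        prev_ts = ev.timestamp
        dst = ev.dst
        if dst is not None:  # perturb the target of motion events
            dst = (dst[0] + random.uniform(-jitter_px, jitter_px),
                   dst[1] + random.uniform(-jitter_px, jitter_px))
        out.append(InputEvent(ev.kind, t, ev.src, dst, ev.strict))
    return out
```

The filtered event list can then be replayed (automatic mode) and compared against the optimal recording in the reporting step.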
Due to the varying software platforms, UI sub-systems and other factors specific to
each application (e.g. 3D content and UI elements), several approaches were
researched and developed to integrate the Veritas Core Simulation. The three main
integration methods researched and developed were:
Internal Integration using Core Simulation Library: The direct software
integration of the Veritas Simulation Core DLL library into the application's
code, to facilitate interactive and automatic 2D UI accessibility assessment.
Internal Integration using Event Logging/Playback: The direct software
implementation of capturing and playback of closely-coupled application events
and their outcomes within the pilot application, to facilitate interactive
accessibility assessment for 3D UI elements.
External Integration using the Veritas GUI tools: The indirect adaptation of
simulation scenarios for 2D UI-based accessibility assessment, providing generic
event logging, adaptation of logged events to simulation tasks and playback of
these tasks using the Veritas GUI accessibility assessment tools (VerSEd-GUI
and VerSim-GUI). This is the integration method used in the majority of GUI-
based designs throughout the targeted application domains of VERITAS.
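For the external integration route in particular, adapting a logged session amounts to grouping contiguous runs of raw events under the generic task names of the UsiXML model. A minimal sketch, assuming a simple list-based log and task map (not the actual VerSEd-GUI format):

```python
def adapt_session(events, task_map, strict_tasks=()):
    """Group raw logged events into named simulation tasks.

    events:       ordered list of raw event records (any representation)
    task_map:     list of (task_name, start_index, end_index) triples that
                  assign a contiguous slice of the log to a generic task
    strict_tasks: tasks for which any event failure fails the whole task
    """
    scenario = []
    for name, start, end in task_map:
        scenario.append({
            "task": name,
            "events": list(events[start:end]),
            "strict": name in strict_tasks,
        })
    return scenario


# Usage: map the first four logged events to the generic task
# "Enter a username" from the UsiXML model.
log = ["move(10,20)", "click", "type('alice')", "press(Enter)"]
scenario = adapt_session(log, [("Enter a username", 0, 4)],
                         strict_tasks={"Enter a username"})
```

The resulting scenario is what the playback tool would then simulate with the chosen virtual user model.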
An example of the simulation models produced for the Metaverses and Games
domain is presented in the tables below:
physical, cognitive, visual and hearing. For the immersive simulation environment,
only the automotive, smart living and workplace application areas were targeted.
The main outcome was the definition and implementation of a pilot planning scheme
and the appropriate benchmarking platforms for software testing in the VERITAS
environment.
Both formative and summative evaluations were put forward for use in the project.
These two phases can be defined as:
Formative evaluation, i.e. evaluation carried out during the design and
implementation phase of VERITAS and targeted at collecting information for
improvement.
Summative evaluation, i.e. the evaluation process on VERITAS at the end of
the development phase and targeted at collecting information on the outcomes
of the implementation.
The overall pilot plan methodology can be seen in the following figure:
In total, 140 test users tested the VERITAS tools; in each application domain
they were chosen so as to be directly targeted by the relevant tools they were
asked to test. The pilot tests were held over two iterations, in order to provide
feedback for the improvement of the VERITAS platform during the development
phase. The overall plan follows the diagram shown below:
The pilot plan's main aim with regard to stakeholders was to evaluate six main
aspects of the VERITAS Platform:
Usability
Reliability
Accuracy
Response
Suitability
User Acceptance
To that end both qualitative and quantitative metrics were chosen for the evaluation.
The tools used to measure the aspects of the evaluation indicated above were the
following:
Interview
Focus Groups
Questionnaires
Observation
System Log Data
The questionnaires the test users filled in the pilots were the following:
Demographic Questionnaire
Technology Acceptance Model (TAM)
System Usability Scale (SUS)
The quantitative data from the pilots with designers/developers was captured
using interaction logging, using the same VERITAS tools that were developed for
GUI-oriented accessibility assessment.
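Such log-based metrics (per-task durations, error rates, slowdown relative to the optimal session) can be computed directly from paired recordings; the per-task tuple format below is an assumption made for illustration, not the VERITAS logging schema:

```python
def session_metrics(logged, optimal):
    """Compare a pilot session log against the optimal recording.

    Both arguments are lists of (task_name, duration_s, error_count)
    tuples in the same task order. Returns per-task metrics, including
    the relative slowdown versus the optimal session.
    """
    report = []
    for (task, dur, errs), (_, opt_dur, _) in zip(logged, optimal):
        report.append({
            "task": task,
            "duration_s": dur,
            "slowdown": (dur - opt_dur) / opt_dur,  # 0.5 = 50% slower
            "errors": errs,
        })
    return report
```

Aggregating these per-task records across participants yields the kind of improvement figures reported for the second pilot iteration.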
Pilots with Beneficiaries
Pilots with beneficiaries constitute demonstration pilots and were aimed at the
summative evaluation defined above. They provided the opportunity to gather
feedback based on the users' first-hand experiences with the applications of the
VERITAS tools, i.e. the VERITAS simulation models.
Therefore, these pilots were planned and organized so as to assess the tools through
the examination of the accessibility of their applications (products and services). The
pilots were held in two iterations in order to provide valuable insight into the
improvement of the tools and deliver evidence on how well the goal of VERITAS has
been achieved, which is to ensure that future products and services are being
systematically designed for all people, including those with disabilities, functional
limitations and older people. The pilots also identified limitations of, and
needed adjustments to, the existing use cases and the simulation platform for
assessing and verifying the accessibility of an application. Furthermore, they
helped define under what circumstances a real user should be required instead of
a virtual one.
Table 19: Pilot Site Plan per Application Area

Application domain | Pilot site | Country | Title of the pilot
Automotive | CRF | IT | Car Interior
Automotive | Continental | DE | ADAS/IVIS
Automotive | CRF | IT | ADAS/IVIS
Automotive | PIAGGIO / RELAB | IT | Motorbike handling, posture and use of OBIS
Smart Living Space | BAUUNION / FhG/IAO | DE | Interior Design
Smart Living Space | INDESIT / RELAB | IT | Home Appliances
Workplace | BYTE | GR | Collaborative Tools
Workplace | CERTH/ITI | GR | Workplace design
Infotainment | CERTH/ITI | GR | Metaverse
Infotainment | UNEW | UK | Metaverse, Collaborative Games
Infotainment | AIJU | ES | Collaborative Games
Healthcare | CERTH/HIT | GR | Remote Patient Monitoring, Mobile device solution
Healthcare | I+ | IT | Health coach
In total, 380 beneficiaries participated in the pilots. The beneficiaries that
participated in each application domain pilot and the impairment group they
belong to can be seen in the following table:
Table 20: Summary of beneficiary groups in each application domain
[Table content partially lost in extraction; recoverable entries include people
with Parkinson's and older people, with 90 beneficiaries in the Smart Living
Space domain.]
The following Pilot applications were developed to be used in the pilot tests of each
Application domain:
Automotive
o Use of the car interior storage compartment
o Program the on-board navigation system and activate the functionality
o Handle a powered two wheeler (PTW)
o Receive audible alerts from the device while riding a PTW
Smart Living Space
o Get in and move around inside a house
Figure 69: Program the on-board navigation system and activate the functionality
Figure 70: Receive audible alerts from the device while riding a PTW
Figure 71: Get in and move around inside a house (immersive simulation)
Figure 72: Use kitchen appliances (interactive control limitation applied in simulation)
Figure 73: Navigate and interact with furniture and devices in a Work environment
The cases marked with an asterisk indicate the durations of tasks where certain
assumptions made during the simulation were not matched in the tests with actual
users. In these cases, the actual users took much longer to perform the task
than anticipated, which, upon observation, can in most cases be attributed to
the fact that comprehending exactly how to complete the task took more time than
in the ideal case of the virtual user, who is programmed simply to perform the
task with little time spent on cognitively working out how to do it.
This theme was observed across most of the application domains. The actual task
completion time, once the user initiates the action, is highly correlated with
the time expected by the simulation system, but there was no provision for the
time taken by the users to accustom themselves to the situation or to figure out
how to perform the action in the first place.
This indicated problems at both ends, in the simulation methodology and in the
methodology of the pilots with beneficiaries: what is in reality a dry run of an
unknown task by actual users cannot be compared directly to a simulated task
performed by a virtual user that is programmed to know beforehand how to
complete it. This means that training and preparation for using the pilot
applications must be more extensive, but also that the simulation models should
be expanded to include configurable expertise-level parameters describing the
virtual user's familiarity with the tasks.
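One plausible way such an expertise-level parameter could be introduced is as a scaling factor on a comprehension overhead added to the motor execution time; the formula and constants below are purely illustrative and are not part of the VERITAS models:

```python
def expected_task_time(execution_s, expertise=1.0, comprehension_s=8.0):
    """Expected total task time for a virtual user.

    execution_s:     optimal motor execution time after VUM filtering
    expertise:       1.0 = novice; higher values shrink the time spent
                     working out how to perform the task
    comprehension_s: comprehension overhead for a novice (illustrative)
    """
    # Clamp expertise at the novice baseline so the overhead never grows.
    return execution_s + comprehension_s / max(expertise, 1.0)
```

With such a parameter, a dry run by an unfamiliar user and a rehearsed run by an expert would map to different expected durations from the same simulated motor profile.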
Another issue identified was that it was not always possible to include the same
persons in both iterations, so the results could not always be compared directly
on a per-user basis. Furthermore, the levels of impairment and the
characteristics of the users did not in all cases exactly match the virtual
users used in the simulations, leading to minor deviations from the expected
averages.
The 2nd iteration of the pilots was performed on updated versions of the pilot
applications that were redesigned in key areas where the simulations indicated
accessibility issues. In most cases, as described earlier, these areas were the same as
those indicated by actual user tests.
Examples of the updated pilot application designs can be seen in the following
figures, in comparison to the initial ones:
Figure 81: Use of the car interior storage compartment: With handle (initial) and push spring
mechanism (final)
Figure 82: Handle a powered two wheeler: Normal and lowered layouts overlapped;
seat lines of the two layouts highlighted in red
Figure 83: Program the on-board navigation system and activate the functionality: Initial and
Optimal interaction areas and final implemented gestures
Figure 84: Receive audible alerts from the device while riding a PTW (initial and final)
Figure 85: Get in and move around inside a house (initial and final designs)
Figure 87: Navigate and interact with furniture and devices in a Work environment (initial and
final designs)
Figure 88: Use ICT collaborative tools at work (initial and final designs)
Figure 91: Play a multi-user game aimed at the Elderly (initial and final designs)
Figure 92: Use remote patient monitoring solution (initial and final designs)
Figure 93: Use mobile Nutritional Advice application (initial and final designs)
Figure 94: Use Health Coach application (initial and final designs)
The results of the 2nd pilot, in the majority of cases and scenarios, confirmed
the improvement expected from the results of the simulations with the VERITAS
platform. This result also validates to a large extent, barring the few
deviations described earlier, the correlation of the virtual user models and the
simulation engine's capabilities with regard to physical and cognitive
simulation.
These results were confirmed both by the analysis of the qualitative data (user
satisfaction, comfort and performance) and by the quantitative analysis of the
logs recorded during pilot sessions (task durations and error rates). Overall,
the VUMs created using the VERITAS tools were found to be consistent with the
impairments of the beneficiaries and proved effective and useful for redesigning
a fully accessible application. In fact, after the redesign, the beneficiaries
could perform the tasks between 10% and 30% faster on average and made as many
as 45% fewer errors on average.
The outcome of the pilots does not mean that VERITAS is a perfect catch-all
solution. On the contrary, several areas for improvement were identified that
can lead to further research on more robust and realistic simulation-based
accessibility assessment. Among them, the following were found to be the most
important:
Better and more refined Cognitive simulation
Better correlation of cognitive aspects to physical actions simulation
Improved Immersive functionalities support
Expansion of the multisensorial tests with actual users to derive better
statistical models of targeted impairments and expansion of the models to
cover more impairments
Inclusion of a virtual-model expertise-level parameter to cater for actual
users' lack of experience in interacting with the designs
Full dynamic musculoskeletal simulation of motor tasks
More research into the effects of visual and hearing impairments on the
quantitative performance of user tasks is required
defined and followed. During the project, the pilot plans were checked by the
Ethics Advisory Board that had been set up. The board members reported their
comments, which were incorporated together with the partners' feedback, and the
resulting changes were added to the final version of the project pilot plans
Deliverable.
2. That the project developments as a whole have been conducted in a way that
guarantees that future products and services are ethically designed for all
people, including those with disabilities, avoiding the creation of barriers and
ensuring the protection of their personal data (health status, capacities,
abilities, location, routing, etc.).
A Deliverable on the ethical, legal and privacy issues relating to the
technologies has been developed by the VERITAS project, and a dedicated Activity
has been devoted to addressing the ethical and privacy issues raised by the
project and by biometrics in general. An in-depth analysis has been provided
within VERITAS of the ethical issues that may arise throughout the lifetime of
the project and its activities. These were included in the VERITAS Ethics
Manual.
Within this VERITAS Ethics Manual, background information concerning the
recognition of key ethical and legal issues is provided. A relevant project
policy has been developed (ethics code of conduct). It specifies which data are
essential for the project and which are excluded from retention (especially any
information that could be linked with the identity of a participant, as well as
any religious or philosophical beliefs, political opinions or experience). All
relevant national and international European conventions (e.g. the Helsinki
Declaration) are fully integrated. With the aid of the Template on ethical and
legal issues, national standards and the norms of Local Ethics Research
Committees were gathered. Within the Ethics Manual, the ethical frame for the
conduct of the pilots and the various trainings is also justified in depth on a
scientific and legal basis.
All national legal and ethical requirements of the Member States where the
research was performed were fulfilled. Personal data of participants was held
strictly confidential at all times during the research.
This means in detail that:
All the test subjects had the ability to give informed written consent to
participate.
All the test subjects were strictly volunteers and retained the right to
withdraw from the trials at any time, without prior notification.
The personal data gathered during the tests or the iterative development phase
were strictly protected and anonymised in an unlinkable manner. No genetic
information was collected. No user personal data was centrally stored, sent
around the network, or made available to any third party (i.e. for
advertisement, marketing or even research outside VERITAS objectives). Only one
person per site (the relevant Ethics issues responsible) had access to the
mapping between test participants' codes and identities, in order to administer
the tests. One month after the pilots' end, this reference was deleted, thus
safeguarding fully
The VERITAS exploitable results follow one or more of the following three modes
of deployment in business activity:
Open Source Solution (OSS)
The complete source code of the open source applications will be available and freely
redistributable. The community will benefit from these solutions, producing new
knowledge and applications. The partners will use these solutions and knowledge as
stepping stones in novel research efforts in the future. Additionally, specific services
could be offered for the implementation and customization of this OSS for a fee (see
ER02 as an example).
Proprietary Solutions
Some of the developed technologies are partially or totally composed of copyright
code. The source code for the proprietary applications could be delivered through
commercial license. Additionally, specific consultancy services based on these assets
could be offered for a fee (see ER01 as an example).
Services to third parties
The planned business model for VERITAS is either based on OSS, Copyright code or
the so-called dual licensing model. This last approach permits a customer to choose
one of the two licenses: either the GNU General Public License (GPL) or the
commercial licenses. The exploitation model of several ERs also considers the offer
of technical solutions together with implementation guidelines, as well as training and
consultancy services: the VERITAS as a Service model.
There are four groups of potential customers, each with a different impact on
the VERITAS as a Service business model. The way to address each segment will
differ, as will the benefits obtained from them.
Moreover, apart from the developers/designers whom they mainly address, the
VERITAS results have several impacts on the other stakeholders addressed,
including the following:
European industrial players: VERITAS built-in accessibility support will
encourage more systematic usage of VR simulation in the design and development
of commercial products accessible for all, particularly for people with
disabilities.
ICT and non-ICT companies with specialized departments in human factors may be
attracted by the VERITAS concept which proposes virtual reality accessibility testing
as an alternative to testing in the real environment, in order to develop fully
accessible products as close as possible to the particular needs of persons with
disabilities.
Software companies and SMEs may use the VERITAS VR Open Simulation Platform to
achieve optimum results in the design and development of accessible software.
VERITAS tools could also be the starting point for the research and development
of innovative concepts for ambient, multi-device, universally accessible and
usable multimodal interfaces through VR simulation. In conclusion, VERITAS will
support the software industry in its objective of producing better, universally
accessible software products and services at a lower cost.
Indesit Company SpA (IND, Italy): Technical requirements and pilot site of the
Smart Living Spaces application scenarios of VERITAS
DOMOLOGIC (SME, Germany): Technical requirements and development of the Smart Living
Figure 96: Screenshots of VERITAS dissemination material (Newsletters, Posters, Leaflets, etc.)
As seen in Figure 96, a variety of dissemination materials have been produced
during the project, such as the project logo, leaflets, posters and electronic
newsletters, towards diffusing VERITAS progress and achievements to society and
relevant stakeholders.
The second SAB meeting was organised in the forenoon of the 20th September
2011 in Nottingham UK. The SAB members also participated in the User
forum in the afternoon of the same day.
A VUMS Special Thematic Session (STS) was organised at the 13th
International Conference on Computers Helping People with Special Needs,
on July 11-13, 2012 at the University of Linz in Austria.
The 3rd VERITAS User Forum, "Accessibility design at the service of
age-friendly environments", took place on 16 May 2013, hosted at the HUSA
President Park in Brussels, Belgium.
VERITAS participated in the ICT 2013 Exhibition Event, which took place on
6th to 8th of November, 2013 at LitExpo in Vilnius, Lithuania.
CRF organised a 1-day workshop focused on high-accessibility solutions for
automotive interiors on 13th December 2013. The event was held at the CRF
site in Orbassano, Italy, and included 2 presentations on the achievements of
Veritas in the Automotive sector by CRF and Piaggio. The target audience
comprised 30 engineers and designers who work on present and future
solutions for car interior and motorcycle ergonomics design.
On the 28th November 2013, UPM organized a dissemination workshop to
present VERITAS concepts and tools at the Smart House Living Lab in
Madrid. The mission of the Smart House Living Lab is research and development,
in the Ambient Intelligence context, of technology and services to prevent
illness, provide care, and promote the health and welfare of people, supporting
social inclusion and the independent living of fragile and dependent groups,
across all stages of the value chain: training, experimental research,
technological development and technology transfer.
At the 2013 LMS Automotive User Conference, held in Munich, Germany on
October 29-30, besides a joint presentation with Piaggio regarding the use of
VERITAS in CAE for motorcycle rider comfort evaluation, a booth dedicated to
Veritas was set up, where interested attendees had the opportunity to take a
deep dive into the technology and discuss the technical details.
A workshop on cognitive simulation was organized by CERTH/HIT and CERTH/ITI in
cooperation with the project INTERSTRESS. The main topics were the influence of
stress on driving, the simulation of stress in HCI accessibility assessment,
and techniques to overcome stress or mitigate its consequences, also with the
help of ICT. The workshop was held in Greek, as it addressed an exclusively
Greek audience (about 20 participants, mainly doctors and engineers).
Conference Papers/Posters 52
Chapters in books 2
VERITAS Workshops 6
Press releases 9
Summarizing, it must be highlighted that significant efforts have been made by
all participants to disseminate their work towards the finalization of each
individual prototype, as well as of the whole VERITAS system. This, in
combination with the dissemination of the 4th year's main achievements (i.e. the
system's final integration, the realization of the pilots, the finalization of
the individual exploitation plans and the development of the common business
plan), not only significantly increases the potential commercialization success
of the VERITAS prototypes, but affects the users' acceptance of the system as
well.
Figure 99: Percentage of project results dissemination via different communication channels
foreground and knowledge obtained within the project have been diffused to the
respective target groups:
Description | Date, Place | Aim and Outcomes
UniversAAL Open Day | 19 January 2012, Thessaloniki, GR | Dissemination to stakeholders
inCASA Ethical Board | 02 February 2012, Chorleywood, UK | Dissemination to stakeholders
First face-to-face meeting of the Model-Based User Interfaces (MBUI) W3C Working Group | 09-10 February 2012, Kaiserslautern, Germany | Dissemination to Researchers/Developers/Industry/Standardisation organisations
2. The European Year 2012 on Active Ageing (EY 2012) shares the same
objective as the EIP AHA: it will seek to engage a wide range of stakeholders
(public authorities, the business sector, social actors, civil society
organisations) to commit to supporting active ageing. The EY 2012 and the
EIP AHA will play very complementary roles: while the objectives of the EY
2012 are to get all relevant stakeholders to take a political commitment to act,
the EIP AHA should provide the means and resources to translate these
commitments into reality in a coherent and sustainable way. These two EU
processes should be mutually reinforcing to avoid a waste of energies and
resources.
5. The Urban mobility Action Plan can be used to improve accessibility of the
urban built environment and transport.
The work on Mandate 473 has begun. This mandate aims at including
accessibility following "Design for all" (or Universal Design) in relevant
mainstream standards and at developing process standards for manufacturers and
service providers on how to include accessibility in their product development
cycle and service provision. This Mandate addresses accessibility in the sense
of article 4 (f) of the UNCRPD. The work is just at the beginning.
The Commission proposed rules to make public sector websites accessible for
all.
Presentation title | Date, Place | Event and other data | Targeted beneficiaries/end-users/stakeholders | Reach (end-users / researchers / developers / application domains/industry) | Dissemination event: Key (K) or regular (R)
Products: The VUMS Project Cluster and VERITAS
Human Solutions presented progress and relevant tools of VERITAS | 20 September 2011, Kaiserslautern, Germany | RAMSIS (user conference) | Researchers, Developers, Industry | X X | K
VERITAS User forum | 20 September 2011, Nottingham, United Kingdom | VERITAS User forum | Researchers, Developers, Beneficiaries | X X X | R
VERITAS workshop | 21 September 2011, Nottingham, United Kingdom | JVRC 2011 | Researchers, Developers, Industry | X X X | R
VERITAS virtual user model structure was presented | 09-10 February 2012, Kaiserslautern, Germany | First face-to-face meeting of the Model-Based User Interfaces (MBUI) W3C Working Group. Through the activities of the MBUI W3C group, the VERITAS project will try to standardize the structure of the virtual user model; further refinements proposed by the MBUI group will be integrated in the virtual user model. | Researchers, Developers, Industry, Standardisation organisations | X X | K
Seminar on Seniors entrepreneurs in support of youth employment | 30.05.2013, Brussels | VERITAS brochures presented at the seminar
Paper title | Date, Place | Event and other data | Targeted stakeholders | Impact and importance of conferences and workshops (High / Medium / Low) | Contribution to innovation in the area: added value (technological - T, social - S, methodological issues - M)
Immersive And Non-Immersive Accessibility Simulation For Smart Living Spaces
Michele Confalonieri, Giovanni Guandalini, Mauro Da Lio, Mariolino De Cecco, "Force And Touch Make Video Games Serious For Dexterity Rehabilitation" | 26-28 June 2012, Porto, Portugal | pHealth 2012, 9th International Conference on Wearable Micro and Nano Technologies for Personalized Health | Developers, Researchers, Industry | X | T, M
Poláček O., Míkovec Z., Slavík P., "Predictive Scanning Keyboard Operated by Hissing" | Innsbruck, Austria, 2012 | IASTED International Conference Assistive Technologies (AT 2012) | 100+ | X
Poláček O., Sporka A., Míkovec Z., "Measuring Performance of a Predictive Keyboard Operated by Humming" | Linz, Austria, 2012 | Computers Helping People with Special Needs | 100+ | X
Malý I., Bittner J., and Slavík P., "Using Annotated Task Models for Accessibility Evaluation" | Linz, Austria, 2012 | Computers Helping People with Special Needs | 100+ | X
Nikolaos Kaklanis, Konstantinos Moustakas and Dimitrios Tzovaras, "A methodology for generating virtual user models of elderly and disabled for the accessibility assessment of new products" | 11-13 July 2012, Linz, Austria | ICCHP 2012 | Researchers, Developers, Beneficiaries | X | T, M
Eleni Chalkia, Evangelos Bekiaris, "Virtual and augmented environments and realistic user interactions to achieve embedded accessibility designs, the VERITAS project Use Cases methodological framework and outcomes" | 17-20 July 2012, Orlando, Florida, USA | IMETI 2012 | X | M
Hunor Erdelyi, Matteo Kirchner, Simone Manzato, Stijn Donders, "Multibody simulation with a virtual dummy for motorcycle vibration comfort assessment" | 17-19 September 2012, Leuven, Belgium | The biennial ISMA conference on Noise and Vibration Engineering ISMA2012 in conjunction with USD2012 | X | T, M
Sulzmann, Frank; 21-25 July 2012, International Conference on Applied Human Factors 100+ X T
Melcher, Vivien; San Francisco, and Ergonomics, AHFE 2012. CD-ROM
Diederichs, Frederik; California
Sayar, Rafael,
Modular
dashboard for
flexible in car HMI
Paper title Date, Place Event and other data Targeted Impact and importance of Contribution to
stakeholders conferences and workshops innovation in the area
Added value
(technological - T,
social - S,
methodological issues -
M)
Already attended High Medium Low
testing
P. Moschonas, A. Tsakiris, N. Kaklanis, G. Stavropoulos, and D. Tzovaras, "Holistic accessibility evaluation using VR simulation of users with special needs", 20th conference on User Modeling, Adaptation, and Personalization, UMAP, 2012. Targeted stakeholders: 100+. Added value: T, M.

Panagiotis Moschonas, Athanasios Tsakiris, Ioannis Paliokas, and Dimitrios Tzovaras, "User Interfaces Accessibility Assessment Using Virtual User Models", International Workshop on Personalisable Media Systems & Smart Accessibility, October 17th, 2012, Istanbul, Turkey. Targeted stakeholders: 50+. Added value: T, M.

N. Biasi, F. Setti, M. Tavernini, A. Fornaser, M. Lunardelli, M. Da Lio, M. De Cecco, "Low-cost garment-based 3D body scanner", 3rd International Conference and Exhibition on 3D Body Scanning Technologies, 16-17 October 2012, Lugano, Switzerland. Targeted stakeholders: 100+. Added value: T, M.
Carlo Mancuso, Gianluca De Toma and Rita Paradiso, "Wearable Electrogoniometer for Knee Joint Parameters Capture", International Conference on Neurorehabilitation 2012 (http://www.icnr2012.org/), 14-16 November 2012, Toledo, Spain. Targeted stakeholders: 100+. Added value: T, M.

… motion and visually impaired virtual humans in interactive immersive environments
Laura Boffi, Monica Milani and Romina Catani (Indesit), Marco Fontana (Percro), "Supporting the Design of Products Accessible by Parkinson People through Physical Simulation of Tremor: An Experiment with Gas Hob", AD/PD 2013 - The 11th International Conference on Alzheimer's & Parkinson's disease (http://www2.kenes.com/adpd2013/Pages/Home.aspx), 6-10 March 2013, Florence, Italy. Targeted stakeholders: +100, beneficiaries/practitioners. Added value: S.
Manzato, S., Kirchner, M., Erdélyi, H., Baglini, G., Pieve, M., "Numerical and Operational Identification and Assessment of Motorcycle Dynamics and Comfort", ICEDyn 2013, International Conference on Structural Engineering Dynamics, 17-19 June 2013, Sesimbra, Portugal. Targeted stakeholders: +100. Added value: T, M.
A. Tsakiris, P. Moschonas, N. Kaklanis, I. Paliokas, G. Stavropoulos, D. Tzovaras, "Cognitive Impairments Simulation in a Holistic GUI Accessibility Assessment Framework", Association for the Advancement of Assistive Technology in Europe, 12th European AAATE Conference, AAATE 2013, 19-22 September 2013, Vilamoura, Algarve, Portugal. Targeted stakeholders: +100. Added value: T, M.
Kaklanis, N., Votis, K., Tzovaras, D., "Personalised web accessibility assessment using virtual user models", RDWG Symposium on User Modeling for Accessibility, 2013, 15 July 2013, Online Symposium. Targeted stakeholders: +100. Added value: T, M.
Guidotti, Calefato, Landini, Minin, Catani, & Milani, "User requirements for supporting the accessible design process: Survey & user test results in the framework of VERITAS project", Human Factors and Ergonomics Society Europe Chapter 2013, October 16-18, 2013, Torino, Italy. Targeted stakeholders: +50. Added value: M.
Scheduled/to-present papers and publications. For each paper: authors, title, event, date and place, targeted stakeholders, and added value (technological - T, social - S, methodological issues - M).
Dangelmaier, M., Tzovaras, D., Blach, R., C. and Tsakiris, T., "Accessibility Engineering Simulation and User Experience Tools for Designing Products for All", The R&D Management Conference 2014 (accepted), 3-6 June 2014, Stuttgart, Germany. Targeted stakeholders: +50. Added value: T, M.
Panagiotis Moschonas, Ioannis Paliokas, Dimitrios Tzovaras, "Automatic Accessibility Assessment using Virtual User Models for Office Environments", 2nd Patient Rehabilitation Research Technologies Workshop, PervasiveHealth 2014 (submitted), 20-23 May 2014, Oldenburg, Germany. Targeted stakeholders: +100.
Athanasios Tsakiris, Ioannis Paliokas, Dimitrios Tzovaras, "Simulation-Based Accessibility Evaluation of Graphical User Interfaces using Virtual User Models", HCI International 2014 (accepted), 22-27 June 2014, Crete, Greece. Targeted stakeholders: +100. Added value: M, T.
Ioannis Paliokas, Athanasios Tsakiris, Athanasios Vidalis, Dimitrios Tzovaras, "Sense of Presence and Metacognition Enhancement in Virtual Reality Exposure Therapy in the Treatment of Social Phobias and the Fear of Flying", HCI International 2014 (accepted), 22-27 June 2014, Crete, Greece. Targeted stakeholders: +100. Added value: M, T.
Ioannis Paliokas, Panagiotis Moschonas, Athanasios Tsakiris, Dimitrios Tzovaras, "Evaluation of a Virtual User Modeling Framework on Automatic Accessibility Assessment: a Case Study on Workplace Design Using Virtual User Models", HCI International 2014 (accepted), 22-27 June 2014, Crete, Greece. Targeted stakeholders: +100. Added value: M, T.
Ioannis Paliokas, Athanasios Tsakiris, Dimitrios Tzovaras, "Automatic Accessibility Assessment Using Virtual User Models for Infotainment User Interfaces", 22nd Conference on User Modeling, Adaptation and Personalization, UMAP2014 (accepted), 7-11 July 2014, Aalborg, Denmark. Targeted stakeholders: +100. Added value: M, T.
Spyridonis, F., Moschonas, P., Touliou, K., Tsakiris, A., Ghinea, G., 12th International Working Conference on Advanced Visual Interfaces (ACM AVI 14) (accepted), 27-29 May 2014, Como, Italy. Targeted stakeholders: +100. Added value: T, M.
Erdelyi, H., Manzato, S., Donders, S., Van der Auweraer, H., Pieve, M., "An integrated CAE solution for motorcycle rider comfort evaluation considering Virtual User Models", Tenth International Symposium on Tools and Methods of Competitive Engineering (TMCE 2014) (accepted), May 19-23, 2014, Budapest, Hungary. Targeted stakeholders: +100. Added value: T, M.
P. Moschonas, A. Tsakiris, D. Tzovaras, "Product Accessibility Evaluation using Virtual User Models" (research demo, to appear), IEEEVR 2014 (accepted), 29th March 2014, Minneapolis, Minnesota, USA. Targeted stakeholders: +100. Added value: T, M.
Journal publications. For each paper: author(s), title, journal details (Vol. no./Page ref.) and date/status.

Miroslav Macik, Adam J. Sporka, Pavel Slavík, "Do temporary disabilities require specific UI design?", SIGCHI Bulletin, May 2012.
Kaklanis, N., Biswas, P., Mohamad, Y., Gonzalez, M.F., Peissner, M., Langdon, P., Tzovaras, D., Jung, C., "Towards Standardization of User Models for Simulation and Adaptation Purposes", Universal Access in the Information Society, Special Issue: 3rd generation accessibility: Information and Communication Technologies towards universal access, Springer. Under review.
Kaklanis, N., Stavropoulos, G., Tzovaras, D., "Modeling motor disabilities of people through regression analysis for the development of accurate virtual user models", User Modeling and User-Adapted Interaction, The Journal of Personalization Research. Under review.
Segouli, S., Paliokas, I., Tzovaras, D., Tsakiris, A., Tsolaki, M., Karagiannidis, C., "Exploring the Influence of MCI and Related Diseases on Robust Cognitive Virtual User Model Development", American Journal of Alzheimer's Disease & Other Dementias. Under review.
Planned journal and book publications
S. Segouli, I. Paliokas, A. Tsakiris, M. Tsolaki, K. Votis, D. Tzovaras, "Enabling accessibility features in enhanced VR environments for supporting spatial abilities and social interaction in elderly and MCI patients", book chapter in IGI Topic Selections. Accepted, to appear.
Panagiotis Moschonas, Dimitrios Tzovaras, "Virtual Human Factors for Product Ergonomics Evaluation", Transactions on Occupational Ergonomics and Human Factors (Taylor & Francis). Submitted.
Georgios Stavropoulos, Dimitrios Tzovaras, Andrea Cesarini, Mariolino De Cecco, Mauro Da Lio, "A Multi-Sensorial Platform framework for measuring physiological human parameters", Medical Engineering and Physics. Submitted.
A. Tsakiris, P. Moschonas, N. Kaklanis, I. Paliokas, G. Stavropoulos, D. Tzovaras, "Cognitive Impairments Simulation in GUI Accessibility Assessment using Virtual User Models", Technology and Disability. Submitted.
1. Basic Technologies are the hardware/software tools and the pieces of knowledge
useful to measure and study the movements of older people and people with
disabilities and their interaction with the objects and the environment. They were
used to design and develop the VERITAS tools. The Exploitable Results belonging to
this group are 1, 2, 10, 11 and 14.
2. VERITAS tools are the solutions that can be directly employed in the design/production process of accessible products. They can be classified according to the business sector where they could be profitably employed: applications oriented to the automotive industry, the motorcycle industry, the smart living spaces industry and the software industry. They have been tested and evaluated in the pilots with designers (WP3.7), and the applications developed with these tools have been evaluated in the pilots with beneficiaries (WP3.8). The ERs belonging to this group are 4-9 and 14-16.
3. Finally, Applications are the products or final solutions designed within VERITAS, including embedded accessibility for older people and people with disabilities. They have been tested and evaluated in the pilots with beneficiaries (WP3.8). The ERs belonging to this group are 12 and 13.
An overview of the VERITAS exploitable foreground can be seen in the table below:
EXPLOITABLE FOREGROUND
For each exploitable result: type of exploitable foreground; description; confidentiality (YES/NO) and foreseen embargo date; exploitable products or measures; sectors of application; timetable for commercial or other use; patents or other IPR exploitation (licences); owner (lead partner) and other beneficiaries involved.
4. Commercial exploitation of R&D results. Software tool and methodology for the design of accessible and usable products and solutions in the personal healthcare and wellbeing domain. Confidential: NO. Product: software tool for designing HEALTHCARE & WELLBEING APPLICATIONS with embedded accessibility. Sector: other information technology and computer service activities (J62.0.9). Timetable: January 2015. IPR: I+, CERTH and UPM are the owners of the IPRs (background and foreground) and have achieved a license agreement (see Exploitation Agreement); UPM and CERTH will license it out to I+. CERTH and UPM could exploit this technology for research aims and to provide consultancy. The three partners want to collaborate again to continue the development of this technology. I+ is in charge of managing the IPRs and plans to use this methodology in the development of its own e-healthcare and Ambient Assisted Living (AAL) applications. Lead partner: I+; other partners: UPM, CERTH/HIT.
5. Commercial exploitation of R&D results. Software tool for supporting developers and s/w engineers in designing accessible GUIs for software applications. Confidential: NO. Product: software tool for designing COLLABORATIVE TOOLS APPLICATIONS with embedded accessibility. Sector: other information technology and computer service activities (J62.0.9). Timetable: July 2015. IPR: the methodology (know-how) has been developed by BYTE based on the VSF (see ER02); BYTE is implementing this methodology in its development process. The three developed tools are the intellectual property of BYTE and will be available to its customers as software licences. Lead partner: BYTE.
6. General advancement of knowledge. New methodology and workflow to predict and optimize rider comfort based on virtual prototyping. Confidential: YES. Product: simulation methods to assess and optimize the comfort of powered two-wheeler riders. Sectors: other information technology and computer service activities (J62.0.9); manufacture of motorcycles (C30.9.1). Timetable: available. IPR: the technology is based on ER02 plus proprietary code owned by LMS. LMS used standard products to develop the methodology and put in place a workflow for motorcycle comfort assessment. Piaggio has tested the technology within its design and development process. No opportunity for protecting/patenting the results is seen; therefore, LMS and Piaggio have opted for an open publication strategy, where LMS has published the methodology highlights and results in international publications. Lead partner: LMS; other partner: Piaggio.
7. Commercial exploitation of R&D results. Software desktop tool for designing AUTOMOTIVE solutions with embedded accessibility (plug-in in the commercial tool RAMSIS). Confidential: YES. Product: software desktop tool for designing AUTOMOTIVE solutions with embedded accessibility (plug-in in the commercial tool RAMSIS). Sectors: other information technology and computer service activities (J62.0.9); manufacture of motor vehicles (C29.1). Timetable: December 2014. IPR: all intellectual property rights on the technology belong to Human Solutions, the commercial vendor, and will be protected by software licenses. Lead partner: HS.
8. Commercial exploitation of R&D results. Immersive software solution for planning smart living spaces, including architectural and domotic aspects, according to the needs of users with special needs. Confidential: YES. Product: ltDomoDesign / Bauherrenkino++. Sectors: other information technology and computer service activities (J62.0.9); architectural and engineering activities; technical testing and analysis (M71). Timetable: January 2017. IPR: the software is property of FhG/IAO, USUTT and Domologic and includes background software from the developing partners. Bauherrenkino++ is also related to third-party IP in case calculation functions have to be included. ISE/Domotics is a software product by DOMOLOGIC and hence is protected by copyright (software license). The final vendor is Domologic, the partner that will be the licensee; FhG/IAO and USUTT will license their part to Domologic. Lead partner: FhG/IAO; other partners: USUTT, Domologic.
9. Commercial exploitation of R&D results. Software Virtual Reality tool for designing automotive solutions with embedded accessibility (plug-in to the commercial tools Lightning and RAMSIS). Confidential: YES. Product: ltRAMSIS. Sectors: other information technology and computer service activities (J62.0.9); manufacture of motor vehicles (C29.1). Timetable: January 2015. IPR: this solution is based on proprietary code; the software is based on other software belonging to consortium partners HS and FhG/IAO. Intellectual property originated in the project is owned by both partners, and they have achieved a joint ownership agreement to exploit it. Lead partner: FhG/IAO; other partner: HS.
10. Commercial exploitation of R&D results. System able to measure human shape in motion and extract its kinematics, dynamics and anthropometric parameters. Confidential: YES. Product: GaMoCap - Garment-Based human Motion Capture system. Sector: other research and experimental development on natural sciences and engineering (M72.1.9). Timetable: December 2018. IPR: all the background and foreground belongs to UNITN. The background is protected; the foreground can be protected by copyright. The software is written in C++; there are no software licences to consider. Lead partner: UNITN.
11. Commercial exploitation of R&D results. Touch force interface for interaction, able to measure both finger position on the screen and the exerted force. Confidential: YES. Product: Force Panel. Sector: other research and experimental development on natural sciences and engineering (M72.1.9). Timetable: available. IPR: Università degli Studi di Trento (Italy) is the only partner involved in this technology. The foreground can be protected by copyright. The software is written in C++; there are no software licences to consider. Lead partner: UNITN.
12. Commercial exploitation of R&D results. Software instrument that helps elderly people and patients with disabilities such as mild dementia, Parkinson's disease, stroke or physical impairments to structure their day and supports them with remembering. Confidential: YES. Product: Pc@Home (final solution). Sector: other information technology and computer service activities (J62.0.9). Timetable: December 2015. IPR: I+ has the intellectual property of PC@Home, a proprietary code in Java protected by copyright. Lead partner: I+.
13. Commercial exploitation of R&D results. Collaborative game for older people combining a classic board game with new technologies. Confidential: YES. Product: PrismIX (final solution). Sector: other information technology and computer service activities (J62.0.9). Timetable: September 2015. IPR: AIJU, the Toy Research Institute, is the only partner involved in the game development (programming and graphic issues). No patents are expected. The entire developed code and the 3D engine used in the game are the intellectual property of AIJU and are protected by copyright. Lead partner: AIJU; other partners: CERTH, FORTH.
14. General advancement of knowledge. Structures of how a task is completed, including a detailed description of both manual (physical) and mental (cognitive, psychological and behavioural) activities and tasks, as well as environmental conditions (different application areas). Confidential: N/R. Product: in-depth task analysis of various man-performed tasks in various application domains. Sector: other research and experimental development on natural sciences and engineering (M72.1.9). Timetable: available. IPR: CERTH/HIT is the only owner of this knowledge (know-how). Part of this methodology has been disclosed in scientific publications. Lead partner: CERTH/HIT.
15. Exploitation of results through (social) innovation. Design insights related to the accessibility of kitchen gas hobs from the perspective of people with Parkinson's whose hands shake, and of a touch panel for a smart oven from the perspective of people with visual impairments. Confidential: YES. Product: guidelines for the design of accessible kitchen gas hobs and a gallery of new design possibilities. Sector: manufacture of domestic appliances (C27.5). Timetable: June 2014. IPR: Indesit collaborated with PERCRO during the pilots, but the guidelines and the gallery sketches were developed only by the Indesit Design Center, and all the generated foreground belongs to Indesit. Lead partner: INDESIT; other partner: PERCRO.
16. Exploitation of results through (social) innovation. Online tool based on an interactive learning environment 2.0 providing access to a group of materials which allow a dynamic preparation for using each component of the tools developed by the project. Confidential: YES. Product: multimedia course for developers. Sector: educational support activities (P85.6). Timetable: available. IPR: the training environment is an open-source asset and can be used under the GPL license. ATOS has customised the platform and designed the methodology according to VERITAS requirements. The contents of the training materials were designed by the different pilot coordinators and tool developers; consequently, they are responsible for deciding which is the most appropriate license to use. Part of the materials carry a Creative Commons share-alike (SA) license, since it is the usual open-source license used for sharing general advancements of knowledge; in other cases they are under copyright license. Further extensions or improvements of the training services are possible and would belong to the specific partners. Lead partner: ATOS; other partners: the entire consortium.
This system can accurately record data to measure a set of human motion parameters such as gait, joint range of motion and torso flexibility. Currently, it integrates a number of sensors that include 3D cameras, electrogoniometers, force sensors and more.
Characteristics / Advantages:
- API for sensor integration: an API is available so that almost any new sensor can be integrated and used with the system without much effort.
- Modular architecture: the system can be used with any subset of the available/integrated sensors, depending on the current recording needs.
- With the current VERITAS sensor set, the system can measure the range of motion of most human body joints.
- Recorded data is automatically annotated for easier retrieval.
- User-friendly graphical user interface.
- Distributed architecture: the system can be set up using multiple computers communicating over TCP/IP.
The VERITAS Simulation Framework consists of several tools which can be used for the accessibility assessment of both prototype products and user interface designs. For the assessment process, a simulation takes place in a virtual environment, in which the behavioural characteristics of virtual elderly and impaired users are simulated. The simulation is based on Virtual User Models, which contain parameters of the vision, hearing, cognitive and motor characteristics of people with special needs. The Framework comprises tools which: a) create the Virtual User Models, b) define the simulation scenarios and c) perform the simulation and evaluate the virtual products. Extensive tests in the automotive, workplace, smart living, infotainment and healthcare application domains showed that the VERITAS Simulation Framework is a valuable asset for developers, designers and, of course, any individual with special needs.
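The idea behind a Virtual User Model can be pictured with a minimal sketch: a model holds perceptual, motor and cognitive limits, and an assessment flags every design feature that violates them. All parameter names below are hypothetical and the real VERITAS models are far richer; this is only an illustration of the principle.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class VirtualUserModel:
    """Toy virtual user model with a few vision/hearing/motor/cognitive
    limits of the kind the VERITAS models parameterise (illustrative)."""
    name: str
    min_readable_font_pt: float   # vision
    min_audible_db: float         # hearing
    max_reach_cm: float           # motor
    max_menu_depth: int           # cognitive

@dataclass
class InterfaceDesign:
    """Toy description of a product/UI design under assessment."""
    font_pt: float
    alert_volume_db: float
    control_distance_cm: float
    menu_depth: int

def assess(user: VirtualUserModel, design: InterfaceDesign) -> List[str]:
    """Return the accessibility problems the simulated user runs into."""
    problems = []
    if design.font_pt < user.min_readable_font_pt:
        problems.append("text too small")
    if design.alert_volume_db < user.min_audible_db:
        problems.append("alerts too quiet")
    if design.control_distance_cm > user.max_reach_cm:
        problems.append("controls out of reach")
    if design.menu_depth > user.max_menu_depth:
        problems.append("menus too deep")
    return problems

elderly_user = VirtualUserModel("low-vision elderly user",
                                min_readable_font_pt=14.0,
                                min_audible_db=55.0,
                                max_reach_cm=60.0,
                                max_menu_depth=2)
kiosk = InterfaceDesign(font_pt=11.0, alert_volume_db=60.0,
                        control_distance_cm=45.0, menu_depth=3)
print(assess(elderly_user, kiosk))  # ['text too small', 'menus too deep']
```

The framework's actual simulation evaluates such constraints dynamically in a 3D virtual environment rather than as static threshold checks.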
The expected date to launch the product is December 2016. A beta version
will be available by June 2015.
3. ER03: Software tool for designing GAMES & GAMES INTERFACES with embedded
accessibility
This software tool focuses on designing GAMES & GAMES INTERFACES with embedded accessibility. Its two modules, VerSEd-GUI and VerSim-GUI, help the GUI designer develop GUIs faster and more effectively while taking into account the extensive types of constraints that embedded accessibility requires.
This is a pilot application toolkit, functional to a great extent with the basic
functionality implemented and tested. It requires some polishing, debugging and
extension of several features to become a commercial product, but other than that, it
is functional and was tested in the pilots.
The expected date to launch the product is January 2015.
4. ER04: Software tool for designing HEALTHCARE & WELLBEING APPLICATIONS with
embedded accessibility
Thanks to the availability of the VSF and of other software, the partners can offer ready-to-use tools for supporting developers and SW engineers in designing accessible GUIs for software applications. In addition, thanks to the availability of basic technologies to monitor, measure and study human-software interaction, the VERITAS consortium can also face new and complex problems of accessible software design, offering designers a tailor-made and complete consultancy service. The value delivered by such a consultancy service is not simply the solution of accessibility problems for older people and people with disabilities; it is also the opportunity for software builders to enlarge their potential market to customers who are currently not reached by an offer of products suitable for older people and people with disabilities.
Within the project, three collaborative tool applications with embedded accessibility have been developed: a discussion application, an FTP application and a teleconference application. The tools have been developed in Java, which was selected because it is OS-independent thanks to the availability of a runtime environment, and because of the lower development cost and more convenient integration with the VERITAS Core platform. The methodology used to develop these collaborative tools is based on Asset 02, although it also includes proprietary code.
The methodology is already implemented within the Byte Computer
development process. The expected date to launch product developed through
this methodology is July 2015.
6. ER06: Simulation Methods to assess and optimize the comfort of powered two-
wheeler riders
The aim of the collaboration between PIAGGIO and LMS has been to develop a new
methodology and workflow to predict and optimize the rider comfort based on virtual
prototyping. At present, Powered Two-Wheeler (PTW) rider comfort optimization
relies on subjective experimental data, due to a lack of methods and data for vertical
transmissibility analysis in the frequency domain. This is overcome with the new
objective methodology for PTW rider comfort assessment in the time domain,
involving Virtual Prototypes of a PTW and a human rider. Virtual tests can be done
for objective comfort assessment of a PTW rider, enabling comfort assessment for
both able-bodied riders and riders with different disabilities. This allows PTW
manufacturers to achieve a PTW design for optimal rider comfort without the need to
produce physical prototypes.
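The notion of vertical transmissibility used here can be illustrated at signal level: at a given excitation frequency, transmissibility is the ratio of the rider's to the base's vertical acceleration amplitude at that frequency. The toy Python estimate below is illustrative only; the actual LMS/PIAGGIO workflow relies on multibody simulation of virtual prototypes, not on this calculation.

```python
import math

def amplitude_at(signal, freq, fs):
    """Amplitude of one frequency component via correlation with
    sin/cos (a tiny single-bin Fourier estimate)."""
    n = len(signal)
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return 2.0 * math.hypot(s, c) / n

def transmissibility(base_acc, rider_acc, freq, fs):
    """Ratio of rider to base vertical acceleration amplitude at one
    excitation frequency (signal-level illustration only)."""
    return amplitude_at(rider_acc, freq, fs) / amplitude_at(base_acc, freq, fs)

fs = 200.0   # sampling rate [Hz], hypothetical
n = 400      # 2 s of data
base = [math.sin(2 * math.pi * 5.0 * i / fs) for i in range(n)]          # 5 Hz excitation at the chassis
rider = [1.8 * math.sin(2 * math.pi * 5.0 * i / fs) for i in range(n)]   # amplified rider response
print(round(transmissibility(base, rider, 5.0, fs), 3))  # 1.8
```

A transmissibility above 1 at a frequency means the rider feels more vibration than the vehicle transmits there, which is exactly what the design optimization tries to reduce.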
The targeted valorisation of this asset is through consultancy projects for automotive/motorcycle manufacturers faced with comfort design challenges in their product design. LMS is already reaching out to possible customers through publications and through its marketing organisation.
7. ER07: Software desktop tool for designing AUTOMOTIVE solutions with embedded
accessibility
The commercial design tool and digital human model RAMSIS provides technologies and solutions to analyse and ensure the ergonomic accessibility of products and environments with respect to average, healthy customers. Its human model simulation technologies and data resources have been extended to non-standard customer groups such as disabled and elderly people. Hence these groups can be addressed in digital mainstream vehicle development in the same way as average, healthy people. As a result, vehicle designs can be efficiently tested and optimized with respect to the ergonomics of a wide range of potential customers.
The technology will be ready for the market after another year of refinement
at the end of 2014.
9. ER09: ltRAMSIS
Software Virtual Reality tool for designing AUTOMOTIVE solutions with embedded accessibility (plug-in to the commercial tools Lightning & RAMSIS). The software uses VERITAS user model parameters in the automotive VR design process through the tools Lightning/RAMSIS. This technology integrates elderly and disabled customers into digital mainstream vehicle development.
This system is able to measure human shape in motion and therefore extract its
kinematics, dynamics and anthropometric parameters based on multi-camera passive
vision and a special wearable garment.
The expected date to launch the product is December 2018.
The Force Panel, a touch force interface for interaction, exer-games and diagnostics, is an instrument based on a touch screen and a force sensor, able to measure both the finger position on the screen and the exerted force. It makes it possible to create a virtual environment in which to control devices (e.g. in a domotics context), exercise dexterity by means of serious games for health, and concurrently diagnose the user's health state.
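A sketch of how an application could consume the Force Panel's combined position-and-force readings, for instance to trigger events in a serious game: the data layout and threshold logic below are hypothetical, since the device's real API (written in C++) is not described here.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchSample:
    """One combined reading: finger position plus exerted force."""
    x: float        # position on screen [px]
    y: float
    force_n: float  # exerted force [N]

def detect_presses(samples: List[TouchSample], threshold_n: float) -> List[TouchSample]:
    """Return the samples where the exerted force first crosses the
    threshold, i.e. the 'firm press' events a serious game could react to."""
    presses, pressed = [], False
    for s in samples:
        if s.force_n >= threshold_n and not pressed:
            presses.append(s)
            pressed = True
        elif s.force_n < threshold_n:
            pressed = False
    return presses

stream = [
    TouchSample(100, 200, 0.2),   # light contact
    TouchSample(101, 201, 1.5),   # firm press -> event
    TouchSample(102, 201, 1.6),   # still pressed, no new event
    TouchSample(150, 240, 0.1),   # released
    TouchSample(151, 241, 2.0),   # second firm press -> event
]
print(len(detect_presses(stream, threshold_n=1.0)))  # 2
```

Because force accompanies every position sample, the same stream can feed a dexterity game and a diagnostic measure (e.g. how steadily a target force is held) at once.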
PC@Home is an instrument that helps elderly people and patients with disabilities (PwD) such as mild dementia, Parkinson's disease, stroke or physical impairments to structure their day and supports them with remembering.
The functionalities of the application are:
- Agenda and reminders
- Questionnaire
All the services are easily accessible and each one is designed to require minimal interaction from the end user. The accessibility and usability of this application arise from the integration of the different user profiles to whom the product refers. Particular attention was dedicated to the UI design, which is the key element of this application. The goals of acceptability, accessibility and usability were achieved thanks to the use of the VERITAS tools and the related methodology.
The benefits that this technology can bring are support for a health coach and the maintenance of memory and a sense of reality for elderly people. PC@Home has been developed using ER04.
The expected date to launch the product is December 2015.
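The agenda-and-reminders functionality described above can be pictured with a minimal sketch. All names are hypothetical, and PC@Home itself is written in Java; this Python fragment only illustrates the idea of surfacing one simple, due message at a time to keep interaction minimal.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Reminder:
    when: datetime
    message: str

class Agenda:
    """Minimal agenda: stores reminders and returns those already due,
    so the UI only ever has to show short, simple messages."""

    def __init__(self):
        self.reminders: List[Reminder] = []

    def add(self, when: datetime, message: str) -> None:
        self.reminders.append(Reminder(when, message))

    def due(self, now: datetime) -> List[str]:
        return [r.message for r in self.reminders if r.when <= now]

agenda = Agenda()
start = datetime(2014, 6, 1, 9, 0)
agenda.add(start + timedelta(hours=3), "Take the midday medication")
agenda.add(start + timedelta(hours=8), "Call your daughter")
print(agenda.due(start + timedelta(hours=4)))  # ['Take the midday medication']
```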
The guidelines present design insights related to the accessibility of kitchen gas hobs from the perspective of people with Parkinson's whose hands shake, and of a touch panel for a smart oven from the perspective of people with visual impairments. The associated gallery of new design possibilities consists of a series of sketches, derived from the insights, which explore new solutions for gas hobs and kitchen environments that can also be used by people with Parkinson's or visual impairments. Both the guidelines and the gallery represent an inspiration for designers to come up with new possible solutions for gas hobs and the kitchen environment in line with the principles of Design for All.
The expected date to launch the Guidelines is June 2014.
B Ethics
1. Did your project undergo an Ethics Review (and/or Screening)? No
If Yes: have you described the progress of compliance with the relevant Ethics Review/Screening Requirements in the frame of the periodic/final project reports?
3. Workforce statistics for the project: Please indicate in the table below the number of
people who worked on the project (on a headcount basis).
D Gender Aspects
5. Did you carry out specific Gender Equality Actions under the project? Yes
6. Which of the following actions did you carry out and how effective were they?
(rated from "Not at all effective" to "Very effective")
x Design and implement an equal opportunity policy
x Set targets to achieve a gender balance in the workforce
Organise conferences and workshops on gender
Actions to improve work-life balance
Other:
7. Was there a gender dimension associated with the research content i.e. wherever people were
the focus of the research as, for example, consumers, users, patients or in trials, was the issue of gender
considered and addressed?
Yes. Specific recruitment criteria were set for both the tests with designers/developers and those with beneficiaries. One of the criteria was gender equality.
E Synergies with Science Education
8. Did your project involve working with students and/or school pupils (e.g. open days,
participation in science festivals and events, prizes/competitions or joint projects)?
Yes. A series of demonstration activities took place involving universities.
9. Did the project generate any science education material (e.g. kits, websites, explanatory
booklets, DVDs)?
Yes. We have produced a web site, leaflets and other types of printed material, as well as an e-learning platform that can be accessed from the web site of the project, which contains training material in the form of user manuals and tutorial videos.
F Interdisciplinarity
10. Which disciplines (see list below) are involved in your project?
Main discipline3: Electrical engineering, electronics [electrical engineering, electronics, communication
engineering and systems, computer engineering (hardware only) and other allied subjects]
Associated discipline: Civil engineering (architecture engineering, building science and
engineering, construction engineering, municipal and structural engineering and other allied
subjects)
Associated discipline: Mathematics and computer sciences [mathematics and other allied
fields: computer sciences and other allied subjects (software development only; hardware
development should be classified in the engineering fields)]
3 Insert number from list below (Frascati Manual).
No
Yes - in framing the research agenda
Yes - in implementing the research agenda
x Yes - in communicating / disseminating / using the results of the project
13a Will the project generate outputs (expertise or scientific advice) which could be used by
policy makers?
x Yes as a primary objective (please indicate areas below- multiple answers possible)
Yes as a secondary objective (please indicate areas below - multiple answers possible)
No
13b If Yes, in which fields?
Agriculture Energy Human rights
Audiovisual and Media Enlargement Information Society
Budget Enterprise Institutional affairs
Competition Environment Internal Market
Consumers External Relations Justice, freedom and security
Culture External Trade Public Health
Customs Fisheries and Maritime Affairs Regional Policy
Development Food Safety Research and Innovation
Economic and Monetary Affairs Foreign and Security Policy Space
Education, Training, Youth Fraud Taxation
Employment and Social Affairs Humanitarian aid Transport
15. How many new patent applications (priority filings) have been made? N/R
("Technologically unique": multiple applications for the same invention in different
jurisdictions should be counted as just one application or grant.)
18. Please indicate whether your project has a potential impact on employment, in comparison
with the situation before your project:
x Increase in employment, or x In small & medium-sized enterprises
x Safeguard employment, or x In large companies
Decrease in employment, None of the above / not relevant to the project
Difficult to estimate / not possible to quantify
19. For your project partnership please estimate the employment effect resulting directly
from your participation, in Full Time Equivalent (FTE = one person working full-time for a
year) jobs: Cannot be estimated.
4 Open Access is defined as free-of-charge access for anyone via the Internet.
5 For instance: classification for a security project.
23. In which languages are the information products for the general public produced?
Language of the coordinator English
Other language(s)
Question F-10: Classification of Scientific Disciplines according to the Frascati Manual 2002
(Proposed Standard Practice for Surveys on Research and Experimental Development, OECD 2002):
1. NATURAL SCIENCES
1.1 Mathematics and computer sciences [mathematics and other allied fields: computer sciences
and other allied subjects (software development only; hardware development should be
classified in the engineering fields)]
1.2 Physical sciences (astronomy and space sciences, physics and other allied subjects)
1.3 Chemical sciences (chemistry, other allied subjects)
1.4 Earth and related environmental sciences (geology, geophysics, mineralogy, physical
geography and other geosciences, meteorology and other atmospheric sciences including
climatic research, oceanography, vulcanology, palaeoecology, other allied sciences)
1.5 Biological sciences (biology, botany, bacteriology, microbiology, zoology, entomology,
genetics, biochemistry, biophysics, other allied sciences, excluding clinical and veterinary
sciences)
2. ENGINEERING AND TECHNOLOGY
2.1 Civil engineering (architecture engineering, building science and engineering,
construction engineering, municipal and structural engineering and other allied subjects)
2.2 Electrical engineering, electronics [electrical engineering, electronics, communication
engineering and systems, computer engineering (hardware only) and other allied subjects]
2.3 Other engineering sciences (such as chemical, aeronautical and space, mechanical,
metallurgical and materials engineering, and their specialised subdivisions; forest products;
applied sciences such as geodesy, industrial chemistry, etc.; the science and technology of
food production; specialised technologies of interdisciplinary fields, e.g. systems analysis,
metallurgy, mining, textile technology and other applied subjects)
3. MEDICAL SCIENCES
4. AGRICULTURAL SCIENCES
4.1 Agriculture, forestry, fisheries and allied sciences (agronomy, animal husbandry, fisheries,
forestry, horticulture, other allied subjects)
4.2 Veterinary medicine
5. SOCIAL SCIENCES
5.1 Psychology
5.2 Economics
5.3 Educational sciences (education and training and other allied subjects)
5.4 Other social sciences [anthropology (social and cultural) and ethnology, demography,
geography (human, economic and social), town and country planning, management, law,
linguistics, political sciences, sociology, organisation and methods, miscellaneous social
sciences and interdisciplinary, methodological and historical S&T activities relating to
subjects in this group. Physical anthropology, physical geography and psychophysiology
should normally be classified with the natural sciences].
6. HUMANITIES
6.1 History (history, prehistory and history, together with auxiliary historical disciplines such as
archaeology, numismatics, palaeography, genealogy, etc.)
6.2 Languages and literature (ancient and modern)
6.3 Other humanities [philosophy (including the history of science and technology), arts,
history of art, art criticism, painting, sculpture, musicology, dramatic art excluding artistic
"research" of any kind, religion, theology, other fields and subjects pertaining to the
humanities, methodological, historical and other S&T activities relating to the subjects in
this group]
References
1. World Health Organization (WHO). [Online]. Available: http://www.who.int