This deliverable is an intermediate progress report for EXPERIMEDIA's embedded experiment focusing on shared, real-time, immersive and interactive cultural and educational experiences, executed by the Foundation of the Hellenic World at its premises at Hellenic Cosmos in Athens. Starting from the more abstract scenario description provided in the earlier D2.1.2, exploiting the architectural blueprint described in D2.1.3, taking into consideration the methodological guidelines described in D2.1.1 as well as the ethical oversight principles described in D5.1.1, and of course building on the preliminary work described in D4.3.1, the document provides an overview of the work that has been done to date, the data gathered, the conclusions drawn and the plans for the future of the experiment.
www.experimedia.eu
EXPERIMEDIA
Dissemination level: PU
Project acronym: EXPERIMEDIA
Full title: Experiments in live social and networked media experiences
Grant agreement number: 287966
Funding scheme: Large-scale Integrating Project (IP)
Work programme topic: Objective ICT-2011.1.6 Future Internet Research and Experimentation (FIRE)
Project start date: 2011-10-01
Project duration: 36 months
Activity: 4 Experimentation
Workpackage: 4.3 EX3: shared, real-time, immersive and interactive cultural and educational experiences
Deliverable lead organisation: FHW
Authors: Manolis Wallace (FHW), Pavlos Mavridis (FHW), Anthousis Andreadis (FHW)
Reviewers: Stephen C. Phillips (ITInnov)
Version: 1.0
Status: Final
Dissemination level: PU: Public
Due date: PM14 (2012-11-30)
Delivery date: 2012-10-31
Table of Contents
1. Executive Summary .......... 4
2. Introduction .......... 6
3. Background .......... 7
   3.1. Goals and success evaluation criteria .......... 7
   3.2. Experiences offered before EXPERIMEDIA .......... 8
   3.3. The experience examined in the embedded experiment .......... 9
      3.3.1. First stage of the experiment .......... 9
      3.3.2. Second stage of the experiment .......... 10
   3.4. Constraints .......... 10
4. Experimental facility architecture .......... 11
   4.1. Definition of agents .......... 11
   4.2. Live streaming experiment during the show .......... 11
   4.3. Augmented reality experiment after the show .......... 13
5.
   5.1. Streaming experimental facility .......... 16
   5.2. Augmented reality experimental facility .......... 16
   5.3. Social networks experimental facility .......... 23
   5.4. Monitoring experimental facility .......... 25
6. Experiment execution .......... 26
   6.1. Before the beginning of the experiment .......... 26
      6.1.1. Deploy the facility .......... 26
      6.1.2. Familiarize ourselves .......... 29
      6.1.3. Inform staff and schedule accordingly .......... 29
      6.1.4. Prepare the informed consent forms .......... 29
      6.1.5. Prepare the questionnaire .......... 30
   6.2. Experiment procedure .......... 30
      6.2.1. Preparatory phase .......... 30
      6.2.2. Execution phase .......... 30
      6.2.3. Data acquisition phase .......... 30
7. Data analysis .......... 32
   7.1. Statistical views of the data .......... 32
   7.2. Interesting observations .......... 33
   7.3. Evaluation .......... 34
      7.3.1. Internal evaluation .......... 34
      7.3.2. External evaluation .......... 35
8. Current status and future plans .......... 36
   8.1. Experimental facility .......... 36
   8.2. Experimental methodology .......... 36
9. Ethics, privacy, PIA .......... 38
   9.1. Minimum ethical principles .......... 38
      9.1.1. Doing good .......... 38
      9.1.2. Doing no harm .......... 39
      9.1.3. Risk management .......... 39
      9.1.4. Consent .......... 39
      9.1.5. Confidentiality .......... 40
      9.1.6. Data protection .......... 40
   9.2. Ethical oversight principles .......... 40
      9.2.1. Informed consent .......... 41
      9.2.2. Deception .......... 41
      9.2.3. Data collection .......... 41
      9.2.4. Withdrawal from the investigation .......... 41
      9.2.5. Observational research .......... 42
      9.2.6. Data protection regulation .......... 42
      9.2.7. Consortium partner responsibility .......... 42
   9.3. PIA .......... 42
10. Risks .......... 44
   10.1. Evolution and handling of risks .......... 44
   10.2. Current risk registers .......... 45
      10.2.1. Risks for the participants .......... 45
      10.2.2. Risks for the experiment .......... 46
11. Conclusion .......... 48
Appendix A. Informed consent .......... 50
Appendix B. Device lease .......... 51
Appendix C. Structured questionnaire .......... 52
Appendix D. Gathered data .......... 54
Appendix E. Independent experiment evaluation .......... 55
1. Executive Summary
This document presents the working report of the 3rd EXPERIMEDIA embedded experiment. The document covers all aspects of the experiment, ranging from the purely technical to the purely theoretical. More specifically:

Section 2 provides a brief introduction to the document, including a summary of the reasoning that led us to split the experiment into two stages.

Section 3 highlights relevant elements of the experiment's background in order to improve the readability and completeness of this document; the reader is advised to refer to previous deliverables of the project for further information.

Section 4 makes the architecture descriptions of D4.3.1 more detailed and adapts them to the specifics of the first stage of the experiment. The architecture is presented for the two distinct parts of the experimental facility and the information flow between the different components is also shown.

Section 5 continues on the same path and describes how these architectures were actually implemented in the experimental facility. It is worth noting that, due to the early timing of our experiment, additional work had to be carried out on our part in order to address issues not yet addressed by our technical partners; occasionally not even successfully.

Section 6 focuses on the actual execution of the experiment and the gathering of the data. Using the implemented experimental infrastructure and following a reduced version of the scenario already outlined in D4.3.1, we explain how we invited a number of participants to experience the EXPERIMEDIA extensions to the venue and provide us with their feedback on them. Unfortunately we were only able to do this for the augmented reality component at this stage, but we are confident we will also cover the other components in the second stage of the experiment.
Section 7 presents the gathered data and, based on a brief statistical processing, reaches some initial conclusions. Despite the small sample they are based on, these conclusions will be very useful in guiding the preparations for the second stage of the experiment. Evaluations of our work, both internal and independent, are also presented.

Having presented the work performed to date, in Section 8 we outline the work planned for the second stage of the experiment, so that all of its goals may be reached.

Section 9 discusses ethics and privacy. The precautions taken with respect to ethics and privacy were already analysed quite rigorously in D4.3.1. We revisit the subject here for completeness, but the reader should be aware that much of the text of this section is in fact also found in section 6 of D4.3.1; here it is adapted to the specifics of the first stage of the experiment, as they finally took shape.
Risks have been monitored and handled throughout the design, implementation and execution of the experiment, following the guidelines specified in D1.1.2 and using the registers defined in D4.3.1. A complete review of the evolution of the risk registers is provided in Section 10.

The appendices include the form used to obtain informed consent, the form used to track devices, the questionnaire used during the experiment and the complete listing of the data gathered. We also include a copy of the external evaluator's report.
2. Introduction
This deliverable is an intermediate progress report for EXPERIMEDIA's embedded experiment focusing on shared, real-time, immersive and interactive cultural and educational experiences, executed by the Foundation of the Hellenic World at its premises at Hellenic Cosmos in Athens. Starting from the more abstract scenario description provided in the earlier D2.1.2, exploiting the architectural blueprint described in D2.1.3, taking into consideration the methodological guidelines described in D2.1.1 as well as the ethical oversight principles described in D5.1.1, and of course building on the preliminary work described in D4.3.1, the document provides an overview of the work that has been done to date, the data gathered, the conclusions drawn and the plans for the future of the experiment.

Very early in our work towards the execution of this experiment we came to realize that certain difficulties with the scheduling of different tasks and events in the timeline of the project could prove problematic. Most notable among them were the fact that the first working versions of the technical partners' contributions were expected late in the project, in fact after the finalization of the plans for the experiment, and the fact that there would be no chance for us to receive feedback from the project reviewers at a mature stage of the work, in order to be sure that we were headed in the right direction. Both constitute very serious dangers for the experiment and therefore appropriate action was required. In order to navigate away from these dangers, we decided to split the experiment into two stages with very distinct characters and goals.
The first stage is intended to run quickly, provide an early confirmation of the project's methodology and technical approach, put technical components to a practical test and generate feedback for the technical partners who are working on them, gather know-how that will help the new partners who are just joining the project to pursue even higher goals, allow the consortium to have a demonstrable output early in the project, and give us an opportunity to present our approach to experimentation to the project reviewers during the first year review.

The second stage is intended to build on the experience and know-how of the first stage in order to perform the complete experimental work that was envisaged for the 3rd embedded experiment in the most suitable manner possible.

This document reports on the first stage of the experiment, which has just been completed. The document covers all aspects of the experiment, ranging from the purely technical to the purely theoretical, and also hints at the work planned towards the successful implementation and execution of the second stage of the experiment.
3. Background
The experiment's background has already been discussed in D4.3.1 as well as in D3.1.1, D3.1.2 and D4.3.3. We briefly review here some main elements of it in order to enhance the readability and completeness of the current text, and advise the reader to refer to the above-mentioned documents for further details.
3.1. Goals and success evaluation criteria
The embedded experiment involves the development of an experimental facility and its testing with the participation of real users in real settings. Clearly something will be implemented, some data will be gathered and some analysis will take place. Still, a core question remains unanswered and critically subjective: when has the experimenter done enough? In order to remove the subjectivity and provide a clear measure of success, we identified in D4.3.1 a set of goals for the experiment and defined the corresponding objective criteria. Given that four different levels of success are defined, the objective is to achieve a considerable level of success in the first stage of the experiment reported herein and then build on that in the second and final stage of the experiment in pursuit of even higher success. We review the experiment's goals here briefly, together with their corresponding success criteria, as defined in D4.3.1:

Goal 1: Be an EXPERIMEDIA test bed (Baseline success)
- The experiment can be executed. This entails having implemented the experiment architecture, having made all of the included components operational and having been successful in their integration.
- Know-how has been gathered. This refers to the gathering of know-how related to the further implementation of the embedded experiment.

Goal 2: Explore suitability of FIRE technologies for the field under examination (Moderate success)
- Identify differentiation between using and not using the FIRE technologies. In other words, we need to establish that there is a substantial difference for the visitors between the conventional experience currently offered and the one that will be offered in the scope of the experiment.
- Classify the impact of each component as positive, negative or neutral. This is a more specific version of the previous criterion, as here it is not enough to establish that there is a difference. What is also required is a clear indication regarding whether this difference has an impact on the QoE of the visitors and, if so, whether this impact is positive or negative.
- Quantify and measure QoE. In other words, we need to have designed a measure that quantifies QoE and we also need to have applied it to data gathered from the experiment.
- Correlate measured QoE to utilized FIRE technologies. Moving a step further, to meet this criterion we should be able to identify the contribution of each component to the QoE, so that strategic decisions can be made regarding the directions that warrant further examination.

Goal 4: Identify parameters that affect impact (Exceptional success)
- Measure QoE for different parameters. This criterion is met if data are gathered when running different instances of the experiment and different QoE values are computed. The compared instances need to be such that a direct comparison relates the differences in QoE to differences in QoS of some kind (e.g. bitrate), differences in the design and execution of the experiment (e.g. duration of the show), differences in demographics, etc.
- Insight has been gathered for the design of future experiments. This refers to the gathering of insight related to the implementation of future EXPERIMEDIA installations at Hellenic Cosmos, for example in order to run future experiments.
3.2. Experiences offered before EXPERIMEDIA
The embedded experiment is built around the VR immersion experience offered by the Tholos. When the EXPERIMEDIA extensions are not considered, this is offered mainly as a standalone experience that is not combined with any of the other exhibits or services of Hellenic Cosmos. The typical operation of the Tholos and of the service it offers to its visitors may be graphically modelled as in Figure 1.

It is easy to see that this is mainly a one-way communication system, as the museum educator controls the system, thus specifying what the Tholos system will render and project to the visitors, while at the same time commenting on it. As a sole exception to this, visitors are able to participate in electronic polls which determine the path that the educator will follow, altering in this way the flow of the presentation in real time. The main reason for this extremely structured and predefined approach is that the museum educator works with predefined scenarios, i.e. descriptive texts prepared by the FHW experts. These texts provide information on the 3D worlds in a specific order, and therefore the tour in the 3D world has to follow the same order; otherwise the museum educator would be unable to provide synchronized information.

Before the start of the show there is a brief pre-show which informs the visitors about the characteristics of the VR immersion system. This contains information such as whether they may feel dizzy, what to do in that case and so on. The pre-show is not related to the content of the show and does not add to the value that is offered to the visitors. After the end of the show the visitors exit the Tholos and this completes the offered experience.
3.3. The experience examined in the embedded experiment
As we have already mentioned in our introductory comments, we had compelling motivation to run the experiment on an accelerated schedule, in time for it to be examined during the project's first yearly review. On the other hand, the project's very tight time schedule made it almost impossible for the complete experimental facility to be implemented and used, and for all the resulting data to be analysed, in time. In particular, the timing of the delivery of the first working Act2 software, as well as of the Act3 integration work, made it next to impossible to run the full experiment within the first year. In order to meet our goal of running and reporting the experiment for the first review, we were forced to create a scaled-down version of it, one containing only the components that were available ahead of their planned time, so that the reviewers' feedback on our direction could be sought. As a result, our experiment is now organized in two stages, as explained in the following sections.
3.4. Constraints
The main attendees of FHW shows are children and adolescents. In the EXPERIMEDIA experiments only adults will be considered, which raises a question regarding the validity and generality of the results. For the same reasons, and given the fact that navigation in a virtual world is a group experience, it is not possible to apply some monitoring techniques (for example video recording) unless the experiment is executed only when pure groups of participants are present, i.e. when no one is present who is either not eligible or has not agreed to participate in the experiment. Regarding the experiment timeline, since some of the real exhibits connected to tags in the virtual content are in open areas of the Hellenic Cosmos venue, the weather may have an impact on the execution of the experiment. The execution of the experiment is also constrained by the timeline of the development of EXPERIMEDIA components by the technological partners of the project. This has already become more than evident during the definition and implementation of the first stage of the experiment.
4.1. Definition of agents
The agents participating in this experiment are:
- Visitors: the audience that participates in the interactive shows of the "Tholos" dome theatre. We will refer to them as "Visitors".
- Museum educator: the person who interacts with the audience and controls the navigation through the virtual 3D environment. We will refer to this person simply as the "educator".
- Experts: one or more scientists, located remotely, who provide additional commentary on the content that is shown during the show and answer questions from the visitors. We will refer to these scientists as "Experts".
4.2. Live streaming experiment during the show
Since the Experts are at a remote location, the main motivation behind this experiment is to allow real-time interaction between the Experts and the Visitors. To this end, the experts must see and hear the same part of the interactive show that the audience sees in the dome. Additionally, the experts must be able to hear any questions from the audience, and the audience must hear the experts. Figure 2 shows a high-level overview of the locations and the desired flow of information between the agents that participate in this experiment.
Figure 2. High-level overview of the locations and the desired information flow between the agents of the experiment. In this particular example two experts participate in the experiment from remote locations.
The actual connectivity and communication between the components is shown in Figure 3. In this figure we can see that the museum educator holds the navigation control, which specifies the content that should be displayed to the visitors. Based on this input, the cluster in the Tholos dome processes the loaded 3D world in order to render the corresponding location and viewpoint and display it to the visitors in the dome. This is the part that was already supported before EXPERIMEDIA, and it is in fact the typical scenario for the utilization of the Tholos.

With the EXPERIMEDIA extensions, the Tholos system, in addition to the local projections, also forwards the rendered stream (actually a downsized, 2D version of it) to the video stream server, which in turn makes it available to the experts' application. In this way the experts are aware of the presented content in real time. The video stream from the Tholos is captured by another PC through a video capture card (AVerMedia Game Broadcaster HD). On this PC the video is transcoded along with the audio feed from the educator's microphone, and both are transmitted using Adobe's Flash Live Encoding to the ATOS server. The Experts use the ATOS Flash player to visualize the video stream.

Additionally, the museum educator is able to see video feeds from the experts, as shown in Figure 11. This video and audio feed from the experts also passes through the ATOS server, where it is transcoded, and is made accessible to the educator through a regular website. In order to see this website, the Adobe Flash player is required, a plugin that is available for most modern operating systems. A minor limitation of the system is that, since Adobe Flash is required, the web-based application cannot run on mobile or embedded devices, which usually lack support for this plugin. Nevertheless, the usage of web technologies like Flash in our system makes the video and audio streams easily accessible from multiple computers.
4.3. Augmented reality experiment after the show
The second part of the experiment takes place after the interactive show in the Tholos Theatre. During this experiment, the visitors can learn additional information about the content of the show by visiting specific locations on the premises of the Hellenic World facilities. In these locations we have installed markers which, using augmented reality techniques, can be recognized by an application that runs on commodity mobile devices. The application can then superimpose virtual objects on top of the real ones, by tracking the position and orientation of the markers, or can simply open a website with additional information, depending on the location of the marker. It is worth noting that this experiment does not require the participation of the experts; the visitors get the additional information directly from their mobile device. This is a significant advantage, because it minimizes the operating expenses. The architecture and information flow of this part of the experiment is shown in Figure 4. With our application, the input video feed from the real world is augmented with virtual 3D objects. Alternatively, when a specific marker is detected in the input video feed, the user is redirected automatically to a website that is related to their particular location. The association of the markers with the virtual objects or the websites is performed using a configuration file.
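The exact format of this configuration file is not reproduced in this report; a marker-to-action mapping of this kind could look like the following hypothetical sketch, where each marker ID is associated either with a 3D model to superimpose or with a website to open. All identifiers, file paths and URLs below are invented for illustration.

```json
{
  "markers": [
    { "id": 1, "action": "model",   "resource": "models/temple_reconstruction.obj" },
    { "id": 2, "action": "website", "resource": "http://www.example.org/exhibit-info" }
  ]
}
```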
Figure 4. Flow of information and component diagram for the experiment. Our application configuration can support up to 50 markers and associate them with different 3D objects or links to websites.
5.1. Streaming experimental facility
The implementation of the first part of the experiment requires streaming the content shown at the dome to multiple remote locations. The interactive content for the dome is produced by a cluster of PCs, as shown in Figure 2. Each PC of this cluster renders only one part of the dome. We have augmented this cluster with one additional PC that renders one extra view of the dome; this required a change in the configuration files of our cluster. The video output of this PC is directed to another PC that encodes the content in MPEG-4 H.264 format and streams it to the ATOS central server. The H.264 format was chosen among many others because it is one of the most advanced video codecs and it is known to provide state-of-the-art encoding quality.

The quality of the streaming experience is highly sensitive to the available bandwidth. Therefore we experimented with various encoding settings for the video and audio streams, in order to maximize the image quality and the responsiveness of the streaming content. One technical limitation imposed by the ATOS media server was that the stream had to be at 60fps. A 30fps stream would be perfectly adequate for our purposes and would require less bandwidth, but at the time of this experiment we could not use this option. After some experiments with our internet line, we settled on a 1440x1080 video resolution for the stream. This choice depends heavily on the bandwidth of the internet connection at the FHW facilities.

It should be noted that the ATOS media server is located in Spain, so some latency is to be expected in the communication. Aside from the network latency, which is to be expected, some amount of latency was also introduced by the H.264 encoder in our streaming PC. Nevertheless, this latency was only a few seconds and did not pose a serious problem in our experiment. In the future it might be worth investigating methods to reduce this latency even further.
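To give a feel for the bandwidth figures involved, the sketch below estimates a target H.264 bitrate from resolution, frame rate and a bits-per-pixel factor. The 0.1 bits-per-pixel value is a common rule of thumb, not a value taken from the actual ATOS configuration; the sketch merely illustrates why a 30fps stream would roughly halve the bandwidth required by the 60fps stream the server imposed.

```java
// Rough rule-of-thumb bitrate estimate for a compressed video stream.
// The bits-per-pixel factor is an assumption for illustration only.
public class BitrateEstimate {

    // Estimated bitrate in Mbit/s for the given resolution and frame rate.
    static double megabitsPerSecond(int width, int height, int fps, double bitsPerPixel) {
        return width * height * (double) fps * bitsPerPixel / 1_000_000.0;
    }

    public static void main(String[] args) {
        double at60 = megabitsPerSecond(1440, 1080, 60, 0.1); // ~9.3 Mbit/s
        double at30 = megabitsPerSecond(1440, 1080, 30, 0.1); // half of that
        System.out.printf("60 fps: %.1f Mbit/s, 30 fps: %.1f Mbit/s%n", at60, at30);
    }
}
```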
5.2. Augmented reality experimental facility
The second part of our experiment, as noted before, involves a mobile application that uses augmented reality technologies. The augmented reality framework that was provided to us for testing was the Metaio Mobile SDK. Using this SDK, we have developed our own mobile application. During the implementation of this application we had to make a several design decisions and tests. Bellow we outline these implementation efforts. First we had to decide which mobile platform to support. The provided SDK supports both the Android OS from Google and the Apple iOS. After consideration, we chose to implement our application in the Android OS, because we have found that the development process is more open. Developing and testing on Apple devices required a registration with Apple, something that could take several weeks for corporate entities and could potentially delay our efforts, thus compromise the project. Additionally, the social networking API that is required for another part
Copyright FHW and other members of the EXPERIMEDIA consortium 2012
of the experiment supports only the Android OS. The application was therefore made for Android, using the Java language, the Eclipse SDK and the Android Development Toolkit. Figure 5 shows a screenshot of the development environment. The exact versions of the software used are Eclipse SDK 3.7.2 and ADK 18.0.0.v201203301601-306762. The target Android OS version was 2.3.7 or higher.
After the decision about the operating system, we had to choose the particular device that was going to be used in our experiment. We performed our first experiments on a "Sony Ericsson Live" mobile phone, shown in Figure 6.
Figure 6. Initial tests were performed on this mobile phone (Sony Ericsson Live). The small screen made it difficult to use our application.
Our application ran successfully on this phone, but we observed that its 3.2-inch screen was rather small. The small screen made it hard for visitors to see the provided information in the form of 3D objects or websites: the details of the 3D objects were not apparent, and the websites were difficult to navigate and read. We therefore switched to a device with a larger screen, the Sony XPERIA P, shown in Figure 7. This phone has a 4.0-inch screen, which makes using the application more comfortable.
Figure 7. The actual device that was used in the experiment (Sony XPERIA P). The 4-inch screen is a good trade-off between usability and portability.
Ideally, we could also use tablets running the Android OS, but their larger size makes them less portable. We believe a screen of around 4-5 inches provides the right balance between usability and portability. Portability is required for our experiment, because the
augmented reality markers are scattered at various locations in the FHW facilities, and visitors are expected to carry the device with them while walking around.
Figure 8. Our application running on the Android Emulator on a PC. Although the application can run on an emulated device, the actual experiments and testing cannot be performed in the emulator, because augmented reality applications inherently require the video feed of a mobile device. Thus, all of our testing and development for the augmented reality experiment was performed on actual mobile devices.
Next, we had to adapt our virtual reality content to the needs of a mobile device. Our 3D models are mostly designed to be displayed by powerful workstations. These workstations are several times faster than a mobile device, so our content had to be simplified in order to be usable on a mobile phone. After considerable experimentation, we found that our particular mobile device can display 3D models of up to 20,000 triangles. Another limitation was that each model should have only one texture, instead of the multiple layers of textures that we use on our workstations. To overcome this limitation, our artists merged the information from all the layers into a single texture, which also carries the lighting information; in technical terms, this process is called "baking". We therefore concluded that after reducing the polygon count to around 20,000 triangles and baking all the information into a single texture, the 3D models were usable on our mobile devices.

Next, we performed various experiments with the type of markers that we were going to use in our experiments. The Metaio SDK supports both "marker-less tracking", where the
application can track the location and the orientation of an arbitrary image, and tracking with markers, where the application tracks specific markers, such as the ones shown in Figure 9.

In both cases, the 3D model that is superimposed on the video feed should follow the position and orientation of the image or the marker respectively. Nevertheless, we found tracking with markers to be more robust than tracking with images. In particular, with markers the tracking is more stable: the 3D object closely follows the marker in the video feed and does not appear to move independently, which was the case with markerless tracking. The performance of markerless tracking depends on the content of the actual image used for the tracking, something we found unacceptable for our application. Furthermore, tracking with markers was less sensitive to lighting conditions and worked reliably even in environments with relatively low illumination, so we opted for tracking with markers. The markers are integrated into information sheets placed at various locations in the facilities of Hellenic Cosmos. Figure 10 shows an example of such a sheet, along with the application running on the mobile device.
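Stepping back to the content-adaptation step described earlier, the acceptance criteria we converged on (at most around 20,000 triangles and a single baked texture) can be sketched as a simple check. The class and method names below are illustrative; they are not part of our actual asset pipeline:

```java
// Sketch of the mobile-readiness check implied by our experiments:
// a model is considered usable on the phone once it fits the measured
// triangle budget and all texture layers (including lighting) have
// been baked into a single texture.
public class MobileModelCheck {

    /** Triangle budget we measured for our particular device. */
    static final int TRIANGLE_BUDGET = 20_000;

    public static boolean isMobileReady(int triangleCount, int textureLayers) {
        return triangleCount <= TRIANGLE_BUDGET && textureLayers == 1;
    }

    public static void main(String[] args) {
        System.out.println(isMobileReady(18_500, 1));  // simplified, baked model
        System.out.println(isMobileReady(120_000, 4)); // original workstation asset
    }
}
```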
Figure 10. Left: the augmented reality information sheet. Right: the visitor places the phone over the sheet and observes the 3D object.
Figure 11. The PC screen of the museum educator, communicating with the remotely located Expert.
Figure 12. Two museum educators guide the audience through a virtual representation of the ancient City of Miletus inside the Dome Installation of FHW. Low light conditions were required for the projectors. One educator controls the navigation with the joystick, while the other one narrates some historical information. On the right side we can see the PC with the video feed from the experts.
5.3.
Our intention was to use social networks to allow direct communication of the Visitors with the Experts. To this end, we used the Social Networking API provided by our partners to build a mobile application that connects users to the Twitter and Facebook social networks. During our testing, we found that the Social Networking API is not yet robust enough for deployment, and further development is needed to fix various issues and bugs. Below is a comprehensive description of our tests, the errors that occurred and how to reproduce them:

1. The source provided at the EXPERIMEDIA SVN has a small error and fails to compile. In particular, the link path contains two versions of the twitter4j library. To make it compile and perform the tests that follow, we had to remove the "twitter4j-core-2.1.1SNAPSHOT.jar" library and keep only "twitter4j-core-android-2.2.5.jar".

2. After the above correction, we launched the example app and tried to log in to Twitter. When pressing the button, we insert our Twitter account credentials and authorize the app. After this step, which is performed in a browser that pops up, control is redirected back to the Android app, but it crashes immediately and shows the typical crash message. See the screenshot in Figure 14.
Figure 14. An error during the testing of the Social Networking API in the Android Emulator. The same error also occurs on the devices.
3. After this step, if the app is relaunched, the Twitter account appears to be logged in; perhaps the credentials were kept from the previous step, before the crash. But if we send a message, the app crashes again.

4. We also tried to log in using only Facebook. In this case the login again happens in a browser window that pops up, but there is no crash as with Twitter. After the login, when trying to send a test message, the app crashes again. Interestingly, judging from the logs, the app crashes in the Twitter components (TwitterBaseImpl.java). We would expect that if the user provided only a Facebook login, and not a Twitter one, the API would not try to use Twitter (and crash).

5. The last test was to try to send a message using both Twitter and Facebook (it could be an API limitation that it needs both). This use case also leads to a crash, and the error again seems related to the Twitter integration/components of the API.

This has been tested on a Sony XPERIA P with Android 2.3.7, on a virtual device with Android 2.3.3 and on a virtual device with Android 4.0.3. All of these configurations exhibited the same behaviour. All the errors appear deterministic and reproducible in our setup, i.e. if we follow the exact same steps, we always get the errors described above.
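The behaviour described in point 4 suggests a missing guard: a hypothetical sketch of the check we would expect inside the Social Networking API follows, where a message is only dispatched to networks for which the user actually holds a session, so a Facebook-only login never touches the Twitter components. The interfaces and helper below are illustrative and do not reflect the real API:

```java
// Illustrative guard: deliver only to channels with an active session.
// The Channel interface and channel() factory are hypothetical stand-ins,
// not types from the actual Social Networking API.
public class SocialPoster {

    interface Channel {
        String name();
        boolean loggedIn();
    }

    /** Delivers only to logged-in channels; returns their names, comma-separated. */
    public static String post(String message, Channel... channels) {
        StringBuilder delivered = new StringBuilder();
        for (Channel c : channels) {
            if (c.loggedIn()) {                      // guard: skip channels without a session
                if (delivered.length() > 0) delivered.append(",");
                delivered.append(c.name());          // stand-in for the real send call
            }
        }
        return delivered.toString();
    }

    /** Small factory for illustration and testing. */
    public static Channel channel(String name, boolean loggedIn) {
        return new Channel() {
            public String name() { return name; }
            public boolean loggedIn() { return loggedIn; }
        };
    }

    public static void main(String[] args) {
        // A Facebook-only login must never reach the Twitter component.
        System.out.println(post("test", channel("facebook", true),
                                        channel("twitter", false))); // prints "facebook"
    }
}
```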
More recently we received an updated version of the Social Networking API from our partners, and we are in the process of integrating it into our experimental facility so that we can include it in our tests, as originally planned.
5.4.
To date we have not worked with any other EXPERIMEDIA components. Clearly, this will change in the second stage of the experiment. If nothing else, we shall at least have to integrate the experiment monitoring component that is being prepared by our partners, since the project's methodology requires the experiments to be built and executed around it.
6. Experiment execution
In this section we present the actions taken and procedures followed in order to conduct the experiment.
6.1.
A great deal of effort was required before even approaching the first candidate participant. Specifically, we needed to:
- deploy the experimental facility,
- familiarize ourselves with the experimental facility, so that we could provide assistance as required,
- inform staff about the upcoming experiment and schedule accordingly,
- prepare the informed consent forms and
- prepare the questionnaire.
Point 2 is a book on Priene offered in the Hellenic Cosmos store; it is coupled with a 3D reconstruction of a building that could be found in the city.
Point 3 is a presentation of Thales in the Mathematics exhibition. It is coupled with a link to the online Encyclopedia of the Hellenic World (www.ehw.gr), and specifically to the entry on Thales. As we found out during testing, the wireless signal at that location is not stable enough, and at times it is not possible to access the page. In fact, the wireless connection failed us during all the tests, and we therefore did not manage to get feedback on this point of interest.
Point 4 is a physical reconstruction of an ancient ship. It is coupled with a digital reconstruction of a different ancient ship, and the descriptive text outlines the differences. Difficulties were observed with the clarity of the screen, depending on the time of day and, most importantly, on whether the weather was cloudy.
Point 5 is a sample excavation site that we use to show pupils how an archaeologist actually works. It is coupled with the reconstruction of an amphora.
One core concern is the relative location of these points. As can be seen in Figure 20, the points are quite distant from each other; the walking distance between points 3 and 4, for instance, is approximately 900 meters. Considering that a participant has to be approached and informed about the project and the experiment at the main entrance of the venue (close to points 1 and 2), but must walk to points 4 and 5 and then back again in order to experience all 5 elements that we have developed, it is clear that the experiment takes a long time to run. In fact, we have had to accept that only one round of experiments will run per day, and that the participant selection criteria would also include an evaluation of the physical state of the individual, so that they would not have a problem covering this distance.
Figure 20. Map of the venue showing the locations of the points of interest (1. Bed, 2. Building, 4. Ship, 5. Amphorae).
6.2. Experiment procedure
6.2.3.2. Focus groups

In the first stage of the experiment it proved impossible to hold focus groups. The reason is purely technical: focus groups work well when a large number of participants has first been polled through structured questionnaires; based on the analysis of these questionnaires, the researchers then determine some points on which they would like additional information and organize focus groups. Each focus group has a smaller but still considerable number of members, and through discussion the point of focus is examined further to get a more concrete idea of the participants' views.

In the first stage of the experiment it was, first of all, impossible to examine the data before holding the focus groups, simply because the data was gathered over different days; by the time it was analysed, the experiment execution was already over and no participants were available for a focus group. But even if we had manually selected some focus questions and decided to hold focus groups, we still could not have done so, because:
- The maximum of three participants at any given moment is still a very small number.
- The participants are scattered over a very large area and return to the reception at random times, hand in the equipment, fill in the questionnaires and leave. There is no way to synchronize the departure of the three participants, and it is therefore not possible to have all of them in one place together at the end of the experiment, as a focus group requires.
For the second stage of the experiment we plan to hold focus groups. The execution of the part of the experiment that refers to Tholos will facilitate this, as considerably larger numbers of individuals can take part at the same time.
7. Data analysis
Given the limited data that has been gathered, the data analysis is much less rigorous than one might expect from such a complex and ambitious experiment. We do envisage more powerful data analysis being needed for the second stage of the experiment. The small size of the gathered data set allows us to include the complete list in this document; it can be found in Appendix D. Despite the small number of samples, the number of questions allows a wide range of statistical values, correlations and graphical charts to be produced. We present below some of the statistics that we find most informative and interesting. Questions are translated roughly and abbreviated. A standard 1-5 Likert scale is used for most questions; the exception is Yes/No questions, which are treated as 1/0 values.
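The coding scheme just described can be sketched as follows. The sample answers below are made up for illustration; they are not taken from the gathered questionnaires:

```java
// Minimal sketch of the answer coding described above: 1-5 Likert
// answers are averaged directly, while Yes/No answers are first
// mapped to 1/0 values.
public class LikertStats {

    /** Mean of a series of coded answers. */
    public static double mean(int[] answers) {
        double sum = 0;
        for (int a : answers) sum += a;
        return sum / answers.length;
    }

    /** Yes/No answers are treated as 1/0 values. */
    public static int code(boolean yes) {
        return yes ? 1 : 0;
    }

    public static void main(String[] args) {
        int[] fun = {4, 5, 3, 5}; // 1-5 Likert answers for one question
        int[] pay = {code(true), code(false), code(true), code(true)};
        System.out.println(mean(fun)); // prints 4.25
        System.out.println(mean(pay)); // prints 0.75, i.e. 75% answered Yes
    }
}
```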
7.1.
Table 1. Clarity of picture — for the bed / for the building / for the ship.

Table 2. Relevance — for the bed / for the building / for the ship.

Table 3. Interest — for the bed / for the building / for the ship.
Table 4. Educational value — for the bed / for the building / for the ship.

Table 5. Fun — for the bed / for the building / for the ship.

Table 6. Content and points of interest similar — bed: 8%, building: 8%, ship: 92%.
7.2. Interesting observations
Since the focus of EXPERIMEDIA is on users, we pay particular attention to Table 5 and Table 6, which probably provide the best measure of QoE. The fun that participants have is a direct measure of the value they assign to their experience, whilst the question about whether they would be willing to pay for similar services is an indirect measure of the same thing. The first important observation is that people would be willing to pay money for such an application. This is probably the safest basis for concluding that the augmented reality component did enhance their experience considerably. Quite interestingly, there are large differences between the average values for the different points: it is clear that points such as the ship and the amphorae have made a bigger impression than, for example, the bed. Although one can make guesses, it is the goal of EXPERIMEDIA to focus on users. Therefore, instead of attempting to guess which elements of these points make them more
fun and valuable for the users, we note the observation and hope to have the opportunity to explore the reasons behind it in a focus group during the second stage of the experiment. EXPERIMEDIA aims to correlate QoE measures with QoS measures in order to assess which technical parameters influence the value that the user receives, and in which way. Not having yet integrated the experiment monitoring component into our facility, the closest thing we have to a QoS measure is the participants' feedback regarding the clarity of picture. Table 7 allows us to review the correlations more easily.
Table 7. Summative data.

                        Building   Ship
Clarity of picture      4.58       1.33
Relevance               2.33       3.83
Interest                2.25       3.40
Educational value       1.41       3.25
Fun                     2.20       4.42
Similar content (Yes)   8%         92%
It is quite interesting, and perhaps surprising, that QoS and QoE appear to be negatively correlated. A more plausible conclusion is that they simply do not have a strong correlation in this experiment. Even so, it is quite interesting to see that the lower quality of picture available at the outdoor location is not enough to make the experience less enjoyable or valuable. Clearly there are other parameters related to these points that have a stronger correlation with QoE, and it is our goal to explore what these parameters are during the second stage of the experiment.
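The QoS/QoE comparison discussed above amounts to computing a Pearson correlation between two rating series. The sketch below shows the standard computation on illustrative values, not the actual Table 7 data:

```java
// Pearson correlation coefficient between two equally sized series.
// The sample series in main() are illustrative QoS/QoE proxies.
public class Correlation {

    public static double pearson(double[] x, double[] y) {
        int n = x.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        double cov = 0, vx = 0, vy = 0;
        for (int i = 0; i < n; i++) {
            cov += (x[i] - mx) * (y[i] - my); // co-variation of the two series
            vx  += (x[i] - mx) * (x[i] - mx);
            vy  += (y[i] - my) * (y[i] - my);
        }
        return cov / Math.sqrt(vx * vy);
    }

    public static void main(String[] args) {
        double[] clarity = {4.6, 1.3}; // higher = clearer picture (QoS proxy)
        double[] fun     = {2.2, 4.4}; // higher = more fun (QoE proxy)
        System.out.println(pearson(clarity, fun) < 0); // prints true: negative correlation
    }
}
```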
7.3. Evaluation
Overall, our internal evaluation is that this has been a very successful first stage, and that the accumulated experience will further facilitate the implementation of the rest of the work.
8.1. Experimental facility
We are currently happy with the technical work done on the augmented reality part of the experimental facility, so we do not expect to be doing any core development work in this direction. We will of course have to examine if and how the augmented reality software that we are using can be linked to the EXPERIMEDIA experiment monitoring component. As far as the data itself is concerned, we do hope to make additions. Specifically, our limited initial testing has indicated that the augmented reality points are too sparse and that only some of them are truly interesting and relevant to the items they have been coupled with. We have already asked our content experts (historians and archaeologists) to examine whether more points can be created, given the existing physical exhibitions available at Hellenic Cosmos.

We are also quite happy with the technical work done on the video streaming end, and grateful to ATOS for their huge support. We also hope to be able to join this with the experiment monitoring component. We did not manage to integrate the social component with our facility the first time round, but we are now convinced that the updated version that ICCS has circulated will operate as planned. We therefore plan to work on integrating this component with our venue and with the experiment monitoring component. As has already become apparent, the integration of the experiment monitoring component with our venue and experiment is a core goal for the upcoming period.

As far as the hardware involved is concerned, we hope to have more smart devices for the second stage of the experiment, so that we can run the augmented reality part of the experiment with more participants.
8.2. Experimental methodology
Although we have only managed to run a few experiments with the augmented reality component and none with the video streaming component, we have already had the chance to identify some weaknesses in our methodology and the examined scenario, which may call for corresponding adjustments. Specifically:
- It is hard to perform extensive tests with the augmented reality component because, having only 3 devices, we can only have 3 participants at a time. Considering the size of the venue and the dispersion of the augmented reality points, this practically means that only 3 participants may be considered per day, which does not allow for much flexibility.
- Moreover, a single damaged or stolen device can have a major impact on our ability to run experiments.
- We have already painfully realized that the planned scenario asks for some of our most valuable (and therefore busy) staff members to stop their work for the whole duration of the experiment's shows (about 45 minutes each time) in order to be readily available to help. This is clearly not realistic for a large-scale implementation. We may need to consider an alternative scenario where an expert is not standing by doing nothing, but either has a more meaningful role or is only alerted when there is actual need.
We have not designed a new version of the experimental scenario at this time, nor are we sure that we will, but this is a direction that we will have to examine seriously over the next period if we are to maximize the benefits from the execution of the experiment.
9.1.
In D5.1.1 a set of ethical principles was identified for the embedded experiments. They have all been considered in the design of this embedded experiment, as explained in the following.
EXPERIMEDIA technology. It therefore does good both for those participating in the experiment and in general. As far as the participants of the experiment are concerned, they will have the opportunity to benefit from enhanced services that were previously unavailable. In the more general sense, this experiment will be a first step towards making these technologies part of the normal operation of the Tholos (remote experts participating) and of the venue (augmented reality points scattered around it), so that more people can benefit from them in the future.
9.1.4. Consent
The preparatory phase of the experiment involves the explicit communication of all relevant information to the eligible participants (i.e. what the experiment is about, what it entails, what their role is, etc.). Only those eligible participants who have agreed and signed a note of informed consent are considered in the experiment. This consent has a predetermined duration
of two months. Contrary to what was stated in D4.3.1, this consent is not revocable after the submission of the information. The reason is that all information is fully anonymised from its very creation, and it is therefore technically impossible to locate and remove the information provided by a specific person.
9.1.5. Confidentiality
During the experiment only the required data is gathered; this data is only made available to the individuals who need to process it, and no part of it will be disclosed to any third parties. Gathered data is fully anonymised. All data will be purged after the analysis has been completed, and at the latest two months after it was gathered.
Figure 21. One of the completed questionnaires. No personal data is listed, and all options are multiple choice so that no handwriting is required either.
9.2.
D5.1.1 has also produced a more detailed set of ethical principles, customized to the specifics of EXPERIMEDIA and the embedded experiments. These have also been considered and adopted in the design of the experiment, as seen in the following.
9.2.2. Deception
We will never intentionally deceive, mislead or withhold information from participants over the purpose and general nature of the investigation.
will NOT have the right to retrospectively withdraw the consent they have given and to require that their own data be destroyed, simply because with fully anonymised data it is not possible to do so. Our EAB has assured us that this approach is acceptable.
9.3. PIA
As was shown in D4.3.1, no further PIA is required. Nevertheless, written consent with a duration of two months is acquired. Dr. Manolis Wallace will act as the data controller for the
experiment. Since no other personal data is recorded, the data controller's duty is limited to keeping the informed consent forms (which list the participants' names) safe and deleting them at the end of the predefined period.
10. Risks
At the beginning of the work on the experiment, two risk registers were formulated: risks for the participants and risks for the experiment itself. These were initially reported in D4.3.1, but have been constantly monitored as a live document and updated as the project and its environment evolved. In the following, we start by presenting how the evolution of the risk registers has affected the work on the experiment to this day, followed by the current instance of the registers with respect to the upcoming work.
The AR and the streaming components were successfully integrated into the facility, and so E6 and E7 changed to low probability. This also reduced the impact of E5 to medium, as enough EXPERIMEDIA components were already integrated into the facility for the experiment to be meaningful. Given that the first stage of the experiment has been completed successfully, the impact of E7 has now been reduced to low, as the corresponding component has already been tested and evaluated to some extent; being forced to exclude it from the second stage of the experiment would therefore be undesirable but not disastrous. Among other things, the execution of the first stage of the experiment validated our approach to the recruitment of participants. Thus the probability of E1 was reduced to low for the second stage of the experiment, as following the same approach is expected to yield similar results. We do keep in mind, though, that larger numbers of participants will need to be achieved for the second stage of the experiment. Overall, the experience of the first phase of the experiment confirmed that the approach to risk management defined in D1.1.2 is applicable and successful, and also that the benefits gained from its implementation are worth the small overhead associated with it. We therefore have every intention of continuing to monitor and handle risks in the same manner for the second stage of the experiment.
Table 8. Risk register for the participants.

ID | Description | Probability | Impact | Proximity | Response | Comment
P1 | Mobile device damaged during the installation of the mobile application | low | - | - | avoid | Instead of using the participants' own devices, the project will supply the mobile devices that will be used during the experiment.
P2 | Mobile device not compatible with the EXPERIMEDIA software | high | high | Execution phase | avoid | See point P1.
P3 | Malicious software installed on the mobile device | low | high | - | avoid | See point P1.
P4 | Participants feel pressured to participate in the experiment | low | high | - | avoid | This is true particularly for members of groups who visit the FHW facilities as part of an agreement. We will make it clear to all during the preparatory phase that participation is not a requirement and has no effect.
10.2.2.
For the reasons listed above, the current instance of the risk register for the experiment is as follows:
Table 9. Risk register for the experiment.

ID | Description | Probability | Impact | Proximity | Response | Comment
E1 | Required experiments cannot be run because 90% of the visitors are children and adolescents and the agreed ethical oversight measures state we will not be dealing with children | low | high | - | accept | -
E2 | Required EXPERIMEDIA components not available on time or not compatible with the FHW facilities | low | high | - | - | -
E3 | Damaged mobile devices | low | - | - | reduce | We will run the experiment with as many working devices as are available. The mobile devices only affect one part of the experiment, and therefore the feedback of participants who are not given a device is still relevant.
E4 | - | low | low | Execution phase | reduce | We are planning to execute the experiments at a time when the weather is typically suitable.
E5 | Social component not available on time or not compatible with the experimental facility | medium | medium | Second phase of the experiment | reduce | Via close coordination with ICCS, and also by moving this component to the second stage of the experiment.
E6 | Streaming component not available on time or not compatible with the experimental facility | low | medium | Execution phase | accept | Already integrated successfully into the experimental facility.
E7 | AR component not available on time or not compatible with the experimental facility | low | low | Execution phase | accept | -
E8 | Experiment monitoring component not available on time or not compatible with the experimental facility | low | high | Execution phase | reduce | Working closely with the technical partners and planning for an early deployment so that there is time to deal with any unforeseen difficulties.
E9 | Delays in the implementation and integration of the facility make it impossible for the work to be completed on time | low | high | Execution phase | reduce | We have left "dangerous" components for the second stage of the execution.
11. Conclusion
This document presents the working report of the 3rd EXPERIMEDIA embedded experiment. The document covers all aspects of the experiment, ranging from the purely technical to the purely theoretical. More specifically:
- Relevant elements of the experiment's background have been highlighted in order to improve the readability and completeness of this document, whilst the reader is advised to refer to previous deliverables of the project for further information.
- Making the descriptions of D4.3.1 more detailed and adapting them to the specifics of the first stage of the experiment, we have presented the architecture of the two parts of the experimental facility and the information flow between the components involved.
- Continuing, we have presented the various stages of the implementation work that has been carried out. Due to the early timing of our experiment, additional work had to be carried out on our part in order to address issues not yet addressed by our technical partners, occasionally not even successfully. Still, in addition to having a working instance of the experimental facility at a very early point in the project, this has also given us the opportunity to provide more concrete feedback to our technical partners regarding their own work and its usability in real-life situations.
- Using the implemented experimental infrastructure and following a reduced version of the scenario already outlined in D4.3.1, we invited a number of participants to experience the EXPERIMEDIA extensions to the venue and provide us with their feedback on them. Unfortunately we were only able to do this for the augmented reality component at this stage, but we are confident we will also cover the other components in the second stage of the experiment.
- The limited data that has been gathered was analysed and useful conclusions were drawn. Despite the small sample they are based on, these conclusions will be very useful in guiding the preparations for the second stage of the experiment. We reached our own conclusions, but we also asked an external individual to weigh in with a second view in order to maintain objectivity.
- Ethics and privacy have been considered; constraints have been somewhat relaxed due to the fact that no personal data is kept and all gathered data is anonymised from the beginning.
- Risks have been monitored and handled throughout the design, implementation and execution of the experiment, following the guidelines specified in D1.1.2 and using the registers defined in D4.3.1. This process has already proven its worth, as various critical risks were identified and handled successfully, protecting the overall experiment from possibly crucial failures. The monitoring and handling of risks will therefore be continued into the second stage of the experiment, and the know-how gained from it will be shared with consortium partners, particularly those now embarking on the detailed design and implementation of the second year's experiments.
Overall, we are thrilled to have completed this portion of the experiment so soon. We hope this document will be considered during the 1st year review, so that together with the experience and feedback we acquired from the work performed and the questionnaires analysed, we can also rely on the project reviewers' feedback on how best to proceed with the second stage of the experiment. The experience and know-how gathered while working on this experiment will also permit us to provide more concrete, useful and reliable advice to the new partners who are joining the project now and are planning to execute similar experiments at the Hellenic Cosmos venue.
Appendix C. Structured questionnaire
Appendix E.