Summary

This protocol describes the technical setup of a mixed reality application developed for immersive analytics. Based on this setup, the measures are presented that were used in a study to gain insights into usability aspects of the developed technical solution.

Abstract

In medicine and industry, the analysis of high-dimensional data sets is increasingly required. However, available technical solutions are often complex to use. Therefore, new approaches like immersive analytics are welcome. Immersive analytics promises a convenient way for various user groups to experience high-dimensional data sets. Technically, virtual-reality devices are used to enable immersive analytics. In Industry 4.0, for example, the identification of outliers or anomalies in high-dimensional data sets is a pursued goal of immersive analytics. In this context, two important questions should be addressed for any developed technical solution on immersive analytics: First, is the technical solution helpful or not? Second, is the bodily experience of the technical solution positive or negative? The first question aims at the general feasibility of a technical solution, while the second one aims at the wearing comfort. Extant studies and protocols that systematically address these questions are still rare. In this work, a study protocol is presented that mainly investigates the usability of immersive analytics in Industry 4.0 scenarios. Specifically, the protocol is based on four pillars. First, it categorizes users based on previous experiences. Second, tasks are presented that can be used to evaluate the feasibility of the technical solution. Third, measures are presented that quantify the learning effect of a user. Fourth, a questionnaire evaluates the stress level when performing tasks. Based on these pillars, a technical setting was implemented that uses mixed reality smartglasses to apply the study protocol. The results of the conducted study show the applicability of the protocol on the one hand and the feasibility of immersive analytics in Industry 4.0 scenarios on the other. The presented protocol includes a discussion of discovered limitations.

Introduction

Virtual-reality (VR) solutions are increasingly important in different fields. Often, VR solutions (including Virtual Reality, Mixed Reality, and Augmented Reality) are intended to ease the accomplishment of many daily tasks and procedures. For example, in the automotive domain, the configuration procedure of a car can be supported by the use of Virtual Reality1 (VR). Researchers and practitioners have investigated and developed a multitude of approaches and solutions in this context. However, studies that investigate usability aspects are still rare. In general, such aspects should be considered in the light of two major questions. First, it must be evaluated whether a VR solution actually outperforms an approach that does not make use of VR techniques. Second, as VR solutions mainly rely on heavy and complex hardware devices, parameters like wearing comfort and mental effort should be investigated more in-depth. In addition, the mentioned aspects should always be investigated with respect to the application field in question. Although many extant approaches see the need to investigate these questions2, fewer studies exist that have presented results.

A currently important research topic in the field of VR is immersive analytics. It is derived from the research field of visual analytics, which tries to include human perception in analytics tasks. This process is also well-known as visual data mining4. Immersive analytics includes topics from the fields of data visualization, visual analytics, virtual reality, computer graphics, and human-computer interaction5. Recent advances in head-mounted displays (HMDs) have led to improved possibilities for exploring data in an immersive way. Along these trends, new challenges and research questions emerge, like the development of new interaction systems, the need to investigate user fatigue, or the development of sophisticated 3D visualizations6. In a previous publication6, important principles of immersive analytics are discussed. In the light of big data, methods like immersive analytics are increasingly needed to enable a better analysis of complex data pools. Only a few studies exist that investigate usability aspects of immersive analytics solutions. Furthermore, the domain or field in question should also be considered in such studies. In this work, an immersive analytics prototype was developed and, based on that, a protocol that investigates the developed solution for Industry 4.0 scenarios. The protocol thereby exploits the experience method2, which is based on subjective, performance, and physiological aspects. In the protocol at hand, the subjective aspects are measured through the perceived stress of the study users. Performance, in turn, is measured through the time required and the errors made in accomplishing analysis tasks. Finally, a skin conductance sensor measured physiological parameters. The first two measures will be presented in this work, while the measured skin conductance requires further efforts to be evaluated.

The presented study involves several research fields, particularly including neuroscience aspects and information systems. Interestingly, considerations on neuroscience aspects of information systems have recently garnered the attention of several research groups7,8, showing the demand to explore the use of IT systems also from a cognitive viewpoint. Another field relevant to this work is the investigation of human factors of information systems9,10,11. In the field of human-computer interaction, instruments exist to investigate the usability of a solution. Note that the System Usability Scale is mainly used in this context12. Thinking Aloud Protocols13 are another widely used study technique to learn more about the use of information systems. Although many approaches exist to measure usability aspects of information systems, and some of them were presented long ago14, questions still emerge that require new measures or study methods to be investigated. Therefore, research in this field is very active12,15,16.

In the following, the reasons are discussed why two prevalently used methods were not considered in the current work. First, the System Usability Scale was not used. The scale is based on ten questions17, and its use can be found in several other VR studies18 as well. As this study mainly aims at the measurement of stress19, a stress-related questionnaire was more appropriate. Second, no Thinking Aloud Protocol20 was used. Although this protocol type has shown its usefulness in general13, it was not used here, as the stress level of study users might increase merely because the think-aloud session must be accomplished in parallel to the use of a heavy and complex VR device. Although these two techniques were not used, results of other recent studies have been incorporated into the study at hand. For example, in previous works21,22, the authors distinguish between novices and experts in their studies. Based on the successful outcome of these studies, the protocol at hand utilizes this separation of study users. The stress measurement, in turn, is based on ideas of the following works15,19,21,22.

First, for conducting the study, a suitable Industry 4.0 scenario had to be found for accomplishing analytical tasks. Inspired by another work of the authors23, two scenarios (i.e., the analysis tasks) were identified: (1) Detection of Outliers and (2) Recognition of Clusters. Both scenarios are challenging and highly relevant in the context of the maintenance of high-throughput production machines. Based on this decision, six major considerations drove the study protocol presented in this work:

  1. The solution developed for the study will be technically based on mixed reality smartglasses (see Table of Materials) and will be developed as a mixed reality application.
  2. A suitable test must be developed, which is able to distinguish novices from advanced users.
  3. Performance measures should consider time and errors.
  4. A desktop application must be developed, which can be compared to the immersive analytics solution.
  5. A measure must be applied to evaluate the perceived stress level.
  6. In addition to the latter point, features shall be developed to mitigate the stress level while a user accomplishes the procedure of the two mentioned analysis tasks (i.e., (1) Detection of Outliers, and (2) Recognition of Clusters).

Based on the six mentioned points, the study protocol incorporates the following procedure. Outlier detection and cluster recognition analysis tasks have to be accomplished in an immersive way using mixed reality smartglasses (see Table of Materials). For this purpose, a new application was developed. Spatial sounds shall ease the performance of analysis tasks without increasing the mental effort. A voice feature shall ease the navigation in the developed application for the mixed reality smartglasses (see Table of Materials). A mental rotation test is the basis for distinguishing novices from advanced users. The stress level is measured based on a questionnaire. Performance, in turn, is evaluated based on (1) the time a user requires for the analysis tasks and (2) the errors a user makes in the analysis tasks. The performance with the mixed reality smartglasses is compared with the accomplishment of the same tasks in a newly developed and comparable 2D desktop application. In addition, a skin conductance device is used to measure the skin conductance level as a possible indicator of stress. Results of this measurement are subject to further analysis and will not be discussed in this work. The authors revealed in another study with the same device that additional considerations are required24.

Based on this protocol, the following five research questions (RQs) are addressed:

RQ1: Do spatial imagination abilities of the participants affect the performance of tasks significantly?
RQ2: Is there a significant change of task performance over time?
RQ3: Is there a significant change of task performance when using spatial sounds in the immersive analytics solution?
RQ4: Is the developed immersive analytics solution perceived as stressful by the users?
RQ5: Do users perform better when using an immersive analytics solution compared to a 2D approach?
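In the original study, these research questions were analyzed with the statistical tools listed in the Table of Materials (e.g., SPSS, Matlab). As a minimal, hypothetical sketch of how the paired comparison behind RQ5 could be checked, the following Python snippet applies an exact two-sided sign test to invented completion times; the choice of test and all numbers are illustrative assumptions, not the study's actual analysis:

```python
from math import comb

def sign_test(times_a, times_b):
    """Exact two-sided paired sign test on per-participant time pairs.

    Returns the two-sided p-value under the null hypothesis that
    positive and negative differences are equally likely, i.e., that
    the count of positive differences follows Binomial(n, 0.5).
    """
    diffs = [a - b for a, b in zip(times_a, times_b) if a != b]
    n = len(diffs)
    if n == 0:
        return 1.0  # no non-zero differences: nothing to test
    k = sum(1 for d in diffs if d > 0)  # participants slower in condition A
    p_le = sum(comb(n, i) for i in range(0, k + 1)) / 2 ** n
    p_ge = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * min(p_le, p_ge))

# Hypothetical completion times (seconds) for six paired participants:
mr_times = [41, 38, 52, 47, 35, 44]       # immersive (smartglasses) condition
desktop_times = [55, 49, 61, 50, 43, 58]  # 2D desktop condition
print(sign_test(mr_times, desktop_times))  # 0.03125
```

With every invented participant faster in the immersive condition, the exact p-value is 2/64 = 0.03125; a Wilcoxon signed-rank test or paired t-test would be the usual, more powerful alternatives when their assumptions hold.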

Figure 1 summarizes the presented protocol with respect to two scales. It shows the developed and used measures and their novelty with respect to the level of interaction. As the interaction level constitutes an important aspect when developing features for a VR setting, Figure 1 shall better illustrate the novelty of the entire protocol developed in this work. Although the evaluation of the aspects within the two used scales is subjective, their overall evaluation is based on the current related work and the following major considerations: One important principle is the use of abstractions of an environment for a natural interaction to which the user has become attuned. With respect to the protocol at hand, the visualization of point clouds seems intuitive for users, and the recognition of patterns in such clouds has been recognized as a generally manageable task. Another important principle is to overlay affordances. The use of spatial sounds in the protocol at hand is an example, as they correlate with the proximity of a searched object. Other authors recommend tuning the representations so that most information is located in the intermediate zone, which is most important for human perception. This principle was not included here in order to encourage users to find the best spot by themselves, as well as to orient themselves in a data visualization space that is too large to be shown at once. In the presented approach, no further considerations of the characteristics of the 3D data to be shown were made. For example, if a dimension is assumed to be temporal, scatterplots could have been shown. The authors consider this kind of visualization generally interesting in the context of Industry 4.0. However, the focus had to be set on a reasonably small set of visualizations. Moreover, a previous publication already focused on the collaborative analysis of data.
In this work, that research question was excluded due to the complexity of the other issues addressed in this study. In the setup presented here, the user is able to explore the immersive space by walking around; other approaches offer controllers to explore the virtual space. One such study set its focus on usability using the System Usability Scale (SUS). Another previous publication conducted a study with economic experts, but with VR headsets. Most importantly, that study complains about the limited field of view of other devices, like the mixed reality smartglasses used in this work (see Table of Materials). Its findings show that beginners in the field of VR were able to use the analytic tool efficiently. This matches the experiences of this study, although in this work, beginners were not classified by VR or gaming experience. In contrast to most VR solutions, mixed reality is not fixed to a position, as it allows tracking of the real environment. Some VR approaches mention the use of special chairs for a 360° experience to free the user from the desktop. Other authors indicate that perception issues influence the performance of immersive analytics, for example, through the use of shadows. For the study at hand, this is not feasible, as the used mixed reality smartglasses (see Table of Materials) are not able to display shadows. A workaround could be a virtual floor, but such a setup was out of the scope of this study. A survey study in the field of immersive analytics identified 3D scatterplots as one of the most common representations of multi-dimensional data. Altogether, the aspects shown in Figure 1 cannot currently be found compiled into a protocol that investigates usability aspects of immersive analytics for Industry 4.0 scenarios.

Protocol

All materials and methods were approved by the Ethics Committee of Ulm University, and were carried out in accordance with the approved guidelines. All participants gave their written informed consent.

1. Establish Appropriate Study Environment

NOTE: The study was conducted in a controlled environment to cope with the complex hardware setting. The used mixed reality smartglasses (see Table of Materials) and the laptop for the 2D application were explained to the study participants.

  1. Check the technical solution before each participant and reset it to its default mode. Prepare the questionnaires and place them next to the participant.
  2. Let participants solve tasks from the two use cases, outlier detection and cluster recognition, in one session (average session time: 43 min).
  3. Start the study by welcoming the participants and introducing the goal of the study, as well as the overall procedure.
  4. Participants using the skin conductance measurement device (see Table of Materials) must adhere to a short resting phase, to receive a baseline measurement. Only half of the participants used this device.
  5. All participants have to fill out the State-Trait Anxiety Inventory (STAI) questionnaire31 prior to the start of the experiment.
    1. Next, participants have to perform the mental rotation test (see Figure 4; this test evaluates spatial imagination abilities), which is the basis for distinguishing high from low performers (high performers are advanced users, while low performers are novices), followed by the spatial sound test to measure the spatial hearing abilities of a participant.
      NOTE: A median split of the test scores in the mental rotation test32 was used to distinguish low from high performers.
  6. Randomly separate participants into two groups: one starts with the task on outlier detection, the other with cluster recognition, each continuing with the other use case afterwards. For the cluster recognition task, half of the participants first used the mixed reality smartglasses (see Table of Materials) and then the 2D application, while the other half proceeded in the reverse order. For the outlier detection task, randomly select one group that receives sound support, while the other group receives no sound support.
  7. To conclude the session, participants answer the State-Trait Anxiety Inventory (STAI) questionnaire31 again, as well as the self-developed questionnaire and a demographic questionnaire.
  8. Store the generated data, which is automatically recorded by each developed application, in the laptop's storage after the session is accomplished.
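The median split mentioned in the NOTE above can be sketched as follows. Participant IDs and scores are invented, and treating scores exactly at the median as "advanced" is an assumption, since the protocol only states that a median split was applied:

```python
from statistics import median

def split_by_median(scores):
    """Classify participants via a median split of mental rotation scores:
    at or above the median -> 'advanced' (high performer),
    below the median       -> 'novice'  (low performer)."""
    m = median(scores.values())
    return {pid: ("advanced" if s >= m else "novice")
            for pid, s in scores.items()}

# Hypothetical scores (correct items out of the 7 possible in 2 minutes):
scores = {"P01": 3, "P02": 6, "P03": 2, "P04": 7, "P05": 5}
print(split_by_median(scores))
```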

2. Study Protocol for Participants

  1. Prepare the experiment (see Figure 2 for the room of the experiment) for each participant. Present the desktop PC, the used mixed reality smartglasses, and hand out the questionnaires.
  2. Inform the participants that the experiment will take 40 to 50 minutes and that, after the pretests (see Points 3-6 of the Study Protocol), half of them start with the outlier detection test (see Point 7 of the Study Protocol), followed by the cluster recognition test (see Point 8 of the Study Protocol), while the others accomplish these two tests vice versa (i.e., Point 8 of the Study Protocol before Point 7).
  3. Decide randomly whether a skin conductance measurement is done. If so, prepare the skin conductance measurement device33 and ask the participant to put on the device. Request a short resting phase from the participant to obtain a baseline measurement of the stress level.
  4. Request participants to fill out the State-Trait Anxiety Inventory (STAI) questionnaire31 and inform them that it measures the current perceived stress before the experiment.
  5. Conduct a mental rotation test.
    1. Inform participants that their mental rotation capabilities will be evaluated and usher them in front of a desktop computer. Inform participants about the test procedure: they must identify similar objects shown at different orientations in a simulated 3D space.
    2. Inform participants that only two of the five shown objects are similar and that they have 2 minutes for the entire test. Inform participants that seven tasks can be accomplished within the given 2 minutes and tell them that performance measures are recorded for each accomplished task.
  6. Evaluate spatial sound abilities.
    1. Inform participants that their spatial sound abilities will be evaluated and usher them in front of a desktop computer. Inform participants about the test procedure. Explain to participants that six sound samples must be identified, each played for 13 seconds.
    2. Inform participants that they have to detect the direction (analogously to the four compass directions) of which the sound is coming from.
  7. Evaluate outlier detection skills.
    1. Request participants to put on the mixed reality smartglasses. Explain to them that outliers must be found within the world created for the mixed reality smartglasses.
    2. Further inform them that an outlier is a red-marked point, while all other points are marked white. Then explain to them that they must direct their gaze at the red-colored point to detect it.
    3. Further inform the participants that not only visual help is provided; environmental sounds additionally support them in finding outliers. Tell the participants that they have to accomplish 8 outlier tasks, meaning that the red-colored point has to be found 8 times within the virtual world. For each participant, 4 tasks are sound-supported, while 4 tasks are not. For each participant, it is randomly selected whether the first task is sound-supported; sound support then alternates from task to task.
    4. Tell participants which information will be recorded: the time required for each task, the length of the walking path, and the final position relative to the starting position. Finally, tell participants that the red-marked point changes to green when it is detected (see Figure 3).
  8. Evaluate cluster recognition skills.
    1. Randomly decide for each participant whether to first use the mixed reality smartglasses or to usher the participant to a desktop computer. In the following, only the procedure for the mixed reality setting is described. If a participant starts with the desktop computer, the procedure is the same in reversed order, except that the voice commands are only provided when using the mixed reality solution.
    2. For participants using mixed reality: Request participants to put on the mixed reality smartglasses. Inform participants how to find clusters within the world created with the used mixed reality smartglasses. Emphasize to the participants that they must distinguish between overlapping clusters by moving around them.
    3. For participants using mixed reality: Explain to participants that they can navigate in the virtual world and around the clusters using voice commands. Finally, tell participants that they have to detect six clusters.
    4. For participants using mixed reality: Request participants to remove the mixed reality smartglasses. Usher participants to a desktop computer and tell them to use the software shown on the screen. Inform them that the same type of clusters as shown in the mixed reality smartglasses has to be detected using the software on the desktop computer (see Figure 7 and Figure 8).
  9. Request participants to fill out three questionnaires, namely the State-Trait Anxiety Inventory (STAI) questionnaire31, a self-developed questionnaire to gather subjective feedback, and a demographic questionnaire to gather information about them.
  10. Request participants to remove the skin conductance measurement device33 if they were asked to put it on at the beginning.
  11. Dismiss the participants and thank them for their participation.
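The randomization and alternation of sound support described in step 7.3 can be sketched as follows; the function name, the per-participant seeding, and the return format are illustrative assumptions:

```python
import random

def make_outlier_schedule(participant_id, n_tasks=8, seed=None):
    """Build the sound-support schedule for the 8 outlier tasks:
    whether the first task is sound-supported is chosen at random,
    and support then alternates, yielding 4 supported and 4
    unsupported tasks per participant."""
    rng = random.Random(seed if seed is not None else participant_id)
    first_with_sound = rng.random() < 0.5
    return [(task + 1, first_with_sound if task % 2 == 0 else not first_with_sound)
            for task in range(n_tasks)]

for task, with_sound in make_outlier_schedule("P01", seed=42):
    print(f"task {task}: {'sound-supported' if with_sound else 'no sound'}")
```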

Results

Setting up Measures for the Experiment
For the outlier detection task, the following performance measures were defined: time, path, and angle. See Figure 6 for the measurements.

Time was recorded until a red-marked point (i.e., the outlier) was found. This performance measure indicates how long a participant needed to find the red-marked point. Time is denoted as the variable "time" (in milliseconds) in the results.
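The remaining two measures, path and angle, can be derived from the logged participant positions. The sketch below assumes positions are logged as (x, z) floor-plane coordinates in meters and that the angle is a bearing of the final position relative to the starting position; both assumptions go beyond what the text specifies:

```python
from math import hypot, atan2, degrees

def walking_path_length(positions):
    """Length of the walking path: sum of Euclidean distances between
    consecutive logged (x, z) floor positions."""
    return sum(hypot(x2 - x1, z2 - z1)
               for (x1, z1), (x2, z2) in zip(positions, positions[1:]))

def final_angle(start, end):
    """Bearing (degrees) of the final position relative to the start."""
    return degrees(atan2(end[1] - start[1], end[0] - start[0]))

# Hypothetical position log while a participant searches for an outlier:
log = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
print(walking_path_length(log))      # 2.0
print(final_angle(log[0], log[-1]))  # approximately 45 degrees
```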

Discussion

Regarding the developed mixed reality smartglasses application (see Table of Materials), two aspects were particularly beneficial. On the one hand, the use of spatial sounds for the outlier detection task was perceived positively (see the results of RQ3). On the other hand, the use of voice commands was also perceived positively (see Figure 10).

Regarding the study participants, although the number of recruited participants was rather small for an e...

Disclosures

The authors have nothing to disclose.

Acknowledgements

The authors have nothing to acknowledge.

Materials

Name            Company                                           Catalog Number    Comments
edaMove         movisens
HoloLens        Microsoft
Matlab R2017a   MathWorks
RPY2            GNU General Public License v2 or later (GPLv2+)                     https://pypi.org/project/rpy2/
SPSS 25.0       IBM

References

  1. Korinth, M., Sommer-Dittrich, T., Reichert, M., Pryss, R. Design and Evaluation of a Virtual Reality-Based Car Configuration Concept. Science and Information Conference. , 169-189 (2019).
  2. Whalen, T. E., Noël, S., Stewart, J. Measuring the human side of virtual reality. IEEE International Symposium on Virtual Environments, Human-Computer Interfaces and Measurement Systems, 2003. , 8-12 (2003).
  3. Martens, M. A., et al. It feels real: physiological responses to a stressful virtual reality environment and its impact on working memory. Journal of Psychopharmacology. 33 (10), 1264-1273 (2019).
  4. Keim, D. A. Information visualization and visual data mining. IEEE Transactions on Visualization and Computer Graphics. 8 (1), 1-8 (2002).
  5. Dwyer, T., et al. Immersive analytics: An introduction. Immersive analytics. , 1-23 (2018).
  6. Moloney, J., Spehar, B., Globa, A., Wang, R. The affordance of virtual reality to enable the sensory representation of multi-dimensional data for immersive analytics: from experience to insight. Journal of Big Data. 5 (1), 53 (2018).
  7. Davis, F. D., Riedl, R., Vom Brocke, J., Léger, P. M., Randolph, A. B. Information Systems and Neuroscience. (2018).
  8. Huckins, J. F., et al. Fusing mobile phone sensing and brain imaging to assess depression in college students. Frontiers in Neuroscience. 13, 248 (2019).
  9. Preece, J., et al. Human-computer interaction. (1994).
  10. Card, S. K. The psychology of human-computer interaction. (2018).
  11. Pelayo, S., Senathirajah, Y. Human factors and sociotechnical issues. Yearbook of Medical Informatics. 28 (01), 078-080 (2019).
  12. Bangor, A., Kortum, P., Miller, J. Determining what individual SUS scores mean: adding an adjective rating scale. Journal of Usability Studies. 4 (3), 114-123 (2009).
  13. Krahmer, E., Ummelen, N. Thinking about thinking aloud: A comparison of two verbal protocols for usability testing. IEEE Transactions on Professional Communication. 47 (2), 105-117 (2004).
  14. Hornbæk, K. Current practice in measuring usability: Challenges to usability studies and research. International Journal of Human-Computer Studies. 64 (2), 79-102 (2006).
  15. Peppa, V., Lysikatos, S., Metaxas, G. Human-Computer interaction and usability testing: Application adoption on B2C websites. Global Journal of Engineering Education. 14 (1), 112-118 (2012).
  16. Alwashmi, M. F., Hawboldt, J., Davis, E., Fetters, M. D. The iterative convergent design for mobile health usability testing: mixed-methods approach. JMIR mHealth and uHealth. 7 (4), 11656 (2019).
  17. System Usability Scale (SUS). Assistant Secretary for Public Affairs Available from: https://www.hhs.gov/about/agencies/aspa/how-to-and-tools/methods/system-usability-scale.html (2013)
  18. Fang, Y. M., Lin, C. The Usability Testing of VR Interface for Tourism Apps. Applied Sciences. 9 (16), 3215 (2019).
  19. Pryss, R., et al. Exploring the Time Trend of Stress Levels While Using the Crowdsensing Mobile Health Platform, TrackYourStress, and the Influence of Perceived Stress Reactivity: Ecological Momentary Assessment Pilot Study. JMIR mHealth and uHealth. 7 (10), 13978 (2019).
  20. Zugal, S., et al. Investigating expressiveness and understandability of hierarchy in declarative business process models. Software & Systems Modeling. 14 (3), 1081-1103 (2015).
  21. Schobel, J., et al. Learnability of a configurator empowering end users to create mobile data collection instruments: usability study. JMIR mHealth and uHealth. 6 (6), 148 (2018).
  22. Schobel, J., Probst, T., Reichert, M., Schickler, M., Pryss, R. Enabling Sophisticated Lifecycle Support for Mobile Healthcare Data Collection Applications. IEEE Access. 7, 61204-61217 (2019).
  23. Hoppenstedt, B., et al. Dimensionality Reduction and Subspace Clustering in Mixed Reality for Condition Monitoring of High-Dimensional Production Data. Sensors. 19 (18), 3903 (2019).
  24. Winter, M., Pryss, R., Probst, T., Reichert, M. Towards the Applicability of Measuring the Electrodermal Activity in the Context of Process Model Comprehension: Feasibility Study. Sensors. 20, 4561 (2020).
  25. Butscher, S., Hubenschmid, S., Müller, J., Fuchs, J., Reiterer, H. Clusters, trends, and outliers: How immersive technologies can facilitate the collaborative analysis of multidimensional data. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. , 1-12 (2018).
  26. Wagner Filho, J. A., Rey, M. F., Freitas, C. M. D. S., Nedel, L. Immersive analytics of dimensionally-reduced data scatterplots. 2nd Workshop on Immersive Analytics. , (2017).
  27. Batch, A., et al. There is no spoon: Evaluating performance, space use, and presence with expert domain users in immersive analytics. IEEE Transactions on Visualization and Computer Graphics. 26 (1), 536-546 (2019).
  28. Towards hmd-based immersive analytics. HAL Available from: https://hal.archives-ouvertes.fr/hal-01631306 (2017)
  29. Luboschik, M., Berger, P., Staadt, O. On spatial perception issues in augmented reality based immersive analytics. Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces. , 47-53 (2016).
  30. Fonnet, A., Prié, Y. Survey of Immersive Analytics. IEEE Transactions on Visualization and Computer Graphics. (2019).
  31. Spielberger, C. D., Gorsuch, R. L., Lushene, R. E. STAI Manual for the Stait-Trait Anxiety Inventory (self-evaluation questionnaire). Consulting Psychologist. 22, 1-24 (1970).
  32. Vandenberg, S. G., Kuse, A. R. Mental rotations, a group test of three-dimensional spatial visualization. Perceptual Motor Skills. 47 (2), 599-604 (1978).
  33. Härtel, S., Gnam, J. P., Löffler, S., Bös, K. Estimation of energy expenditure using accelerometers and activity-based energy models-Validation of a new device. European Review of Aging and Physical Activity. 8 (2), 109-114 (2011).
  34. RPY2: A Simple and Efficient Access to R from Python. Available from: https://sourceforge.net/projects/rpy/ (2020).
  35. Hoppenstedt, B., et al. Applicability of immersive analytics in mixed reality: Usability study. IEEE Access. 7, 71921-71932 (2019).
  36. Hoppenstedt, B. Applicability of Immersive Analytics in Mixed Reality: Usability Study. IEEE Dataport. , (2019).
