Combining Eye-tracking Data with an Analysis of Video Content from Free-viewing a Video of a Walk in an Urban Park Environment

Published: May 7th, 2019



1Visiting Professor, Dipartimento di Scienze Agro-Ambientali e Territoriali, Università degli Studi di Bari, 2Centre for Urban Research, Royal Melbourne Institute of Technology (RMIT University), 3School of Software and Electrical Engineering, Swinburne University of Technology, 4Faculty of Civil Engineering, Babol Noshirvani University of Technology, 5School of Science, Australian Catholic University

The objective of the protocol is to detail how to collect video data for use in the laboratory, how to record eye-tracking data of participants looking at that video, and how to efficiently analyze the content of the videos that they were looking at using a machine learning technique.

As individuals increasingly live in cities, methods to study their everyday movements, and the data that can be collected, become important and valuable. Eye-tracking data are known to connect to a range of feelings, health conditions, mental states, and actions. But because vision is the result of constant eye movements, teasing out what is important from what is noise is complex and data intensive. Furthermore, a significant challenge is controlling for what people look at compared with what is presented to them.
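Separating signal from noise usually begins by reducing raw gaze samples to fixations. A minimal sketch of one common approach, a dispersion-threshold (I-DT) filter, is shown below; this is an illustration only, not necessarily the filter used in this protocol, and the threshold values are assumptions:

```python
import numpy as np

def idt_fixations(x, y, t, max_dispersion=30.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    x, y: numpy arrays of gaze coordinates in pixels; t: timestamps in seconds.
    max_dispersion (pixels) and min_duration (seconds) are illustrative values.
    Returns a list of (start_idx, end_idx, center_x, center_y) fixations.
    """
    fixations = []
    i, n = 0, len(t)
    while i < n:
        # Grow a window until it spans the minimum fixation duration.
        j = i
        while j < n and t[j] - t[i] < min_duration:
            j += 1
        if j >= n:
            break
        wx, wy = x[i:j + 1], y[i:j + 1]
        dispersion = (wx.max() - wx.min()) + (wy.max() - wy.min())
        if dispersion <= max_dispersion:
            # Extend the window while dispersion stays under the threshold.
            while j + 1 < n:
                wx, wy = x[i:j + 2], y[i:j + 2]
                if (wx.max() - wx.min()) + (wy.max() - wy.min()) > max_dispersion:
                    break
                j += 1
            fixations.append((i, j, x[i:j + 1].mean(), y[i:j + 1].mean()))
            i = j + 1
        else:
            i += 1
    return fixations
```

Samples that never settle within the dispersion threshold (saccades, blinks, tracking noise) are simply skipped, which is what makes the remaining fixations interpretable.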

The following presents a methodology for combining and analyzing eye-tracking on a video of a natural and complex scene with a machine learning technique for analyzing the content of the video. In the protocol, we focus on how to analyze data from filmed videos, how a video can best be used to record participants' eye-tracking data, and, importantly, how the content of the video can be analyzed and combined with the eye-tracking data. We present a brief summary of the results and a discussion of the method's potential for further studies in complex environments.
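The core of the combination step is linking each gaze sample to whatever semantic content sits under it in the corresponding video frame. A sketch of that lookup is below; the mask format and class names are illustrative assumptions, and in practice the per-frame label masks would come from a machine-learning segmenter:

```python
import numpy as np
from collections import Counter

def gaze_class_counts(label_masks, gaze_points, class_names):
    """Tally which semantic class lies under the gaze point in each frame.

    label_masks: iterable of (H, W) integer arrays, one per video frame,
        where each pixel holds a class id (e.g. output of a segmenter).
    gaze_points: iterable of (x, y) pixel coordinates, one per frame,
        or None for frames with no valid gaze sample.
    class_names: mapping from class id to a human-readable label.
    """
    counts = Counter()
    for mask, gaze in zip(label_masks, gaze_points):
        if gaze is None:
            continue  # blink or lost tracking on this frame
        gx, gy = gaze
        h, w = mask.shape
        if 0 <= gx < w and 0 <= gy < h:
            counts[class_names[mask[int(gy), int(gx)]]] += 1
    return counts
```

Summing these per-frame hits over a whole video gives the dwell time per content class, which is what the protocol ultimately compares across participants.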

Our daily lived experience of urban environments greatly impacts our health and well-being. Our well-being can depend on the amount of green space that we view and experience1,2,3, and these views can be quantified using eye-tracking equipment to guide decision-making about park design. However, an issue arises with the volume of eye-tracking data that is generated and with making sense of these data. As the equipment for recording gaze data in a laboratory or natural setting becomes easier to use and more powerful, researchers need to consider how we can collect and analyze data vali…


Ethical approval for this project was given by the Australian Catholic University ethics committee (approval number 201500036E). This ensured that informed consent was obtained from all participants, that all participants took part voluntarily, and that participants' data remained anonymous and confidential. In addition, approval was given because the method and equipment met Australian safety standards and regulations.

1. Filming Urban Scenes that Can Be Used in an Eye-Tracking Study


Representative Results

Figure 1 and Figure 2 show the result of taking all of the eye-tracking data for the whole video across all participants and producing a heat map; this is the standard approach available in eye-tracking software packages. Comparing Figure 1 and Figure 2 shows that, on average, participants scanned left and right on the x coordinate of the video in …
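A heat map of this kind can also be reproduced outside the eye-tracking package by binning every participant's gaze coordinates into a 2-D histogram. A numpy-only sketch follows; the frame size and bin width are illustrative assumptions, and a real figure would typically add smoothing and overlay the result on a video frame:

```python
import numpy as np

def gaze_heatmap(gx, gy, frame_w, frame_h, bin_px=40):
    """Aggregate gaze samples from all participants into a 2-D histogram.

    gx, gy: arrays of gaze x/y coordinates (pixels) pooled across participants.
    Returns an array with rows = y bins and columns = x bins, ready to plot.
    """
    xbins = np.arange(0, frame_w + bin_px, bin_px)
    ybins = np.arange(0, frame_h + bin_px, bin_px)
    heat, _, _ = np.histogram2d(gx, gy, bins=[xbins, ybins])
    return heat.T  # transpose so the array is indexed [row, column]
```

Summing such a heat map along its columns gives the horizontal distribution of attention, which is one simple way to quantify the left-right scanning described above.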


Discussion

Generally, standard software packages for analyzing eye-tracking data use a vector AOI. Even for a single still image, the size of the vector cannot be easily measured. Furthermore, including all AOIs in an image and calculating the relative amounts of each AOI is laborious. It is almost impossible to do this manually on a video without a machine learning technique such as the one described. This was a relatively simple statement that implies a free-viewing situation. A much more precise scenario can be used and diffe…
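One way to make the relative amounts of each AOI tractable on video is to let the segmenter's per-frame label mask supply the class areas, then normalize gaze counts by those areas, so that a class drawing more attention than its screen coverage predicts stands out. A minimal sketch, with illustrative class ids and counts:

```python
import numpy as np

def class_area_fractions(mask, num_classes):
    """Fraction of frame pixels covered by each semantic class id."""
    counts = np.bincount(mask.ravel(), minlength=num_classes)
    return counts / mask.size

def gaze_vs_area(gaze_hits, area_fracs):
    """Ratio of gaze share to area share per class.

    A ratio above 1 means a class attracted more gaze than its screen
    coverage alone would predict; below 1 means less. Classes absent
    from the frame (zero area) are reported as 0.
    """
    gaze_fracs = gaze_hits / gaze_hits.sum()
    return np.divide(gaze_fracs, area_fracs,
                     out=np.zeros_like(gaze_fracs, dtype=float),
                     where=area_fracs > 0)
```

Averaging these ratios over all frames of a video gives an area-corrected attention profile per content class without any manual AOI drawing.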


Acknowledgments

This work was financially supported by the City of Melbourne and partially by ARC DP 150103135. We would like to thank Eamonn Fennessy for his advice and collaborative approach. Special thanks to research assistants Isabelle Janecki and Ethan Chen, who also helped collect and analyze these data. All errors remain the authors'.



Materials

Name                             Company     Comments
12 mm lens                       Olympus     Lens
Panasonic GH4                    Panasonic   Video camera
Tobii Studio (version 2.1.14)    Tobii       Software
Tobii X120 desktop eye-tracker   Tobii       Eye-tracker

References

  1. Grahn, P., Stigsdotter, U. K. The relation between perceived sensory dimensions of urban green space and stress restoration. Landscape and Urban Planning. 94 (3-4), 264-275 (2010).
  2. Grinde, B., Patil, G. G. Biophilia: does visual contact with nature impact on health and well-being? International Journal of Environmental Research and Public Health. 6 (9), 2332-2343 (2009).
  3. Velarde, M. D., Fry, G., Tveit, M. Health effects of viewing landscapes - Landscape types in environmental psychology. Urban Forestry & Urban Greening. 6 (4), 199-212 (2007).
  4. Polat, A. T., Akkaya, A. Relationships between the visual preferences of urban recreation area users and various landscape design elements. Urban Forestry & Urban Greening. 14 (3), 573-582 (2015).
  5. Kiefer, P., Giannopoulos, I., Raubal, M. Where am I? Investigating map matching during self-localization with mobile eye tracking in an urban environment. Transactions in GIS. 18 (5), 660-686 (2014).
  6. Berto, R., Massaccesi, S., Pasini, M. Do eye movements measured across high and low fascination photographs differ? Addressing Kaplan's fascination hypothesis. Journal of Environmental Psychology. 28 (2), 185-191 (2008).
  7. Kaplan, S. The restorative benefits of nature: Towards an integrative framework. Journal of Environmental Psychology. 15, 169-182 (1995).
  8. Duchowski, A. T. Eye Tracking Methodology: Theory and Practice. (2017).
  9. Amati, M., Ghanbari Parmehr, E., McCarthy, C., Sita, J. How eye-catching are natural features when walking through a park? Eye-tracking responses to videos of walks. Urban Forestry & Urban Greening. 31, 67-78 (2018).
  10. Gould, S. DARWIN: A framework for machine learning and computer vision research and development. Journal of Machine Learning Research. (Dec), 3533-3537 (2012).
  11. Richardson, D., Matlock, T. The integration of figurative language and static depictions: an eye movement study of fictive motion. Cognition. 102 (1), 129-138 (2007).
  12. Bojko, A. Eye Tracking the User Experience: A Practical Guide to Research. (2013).
