A team of researchers at RMIT University, led by Dr Jair Garcia and Assoc Prof Adrian Dyer, tested the capacity of the Tobii Pro Glasses 2 to record in the highly complex lighting conditions at the famous ‘White Night’ event in Melbourne on 18 February 2017.
Assoc Prof Adrian Dyer said, “We were pleasantly surprised at the capacity of the glasses to track subject eye movements despite low light intensity levels and very dynamic changes in illumination conditions. The team member shown in the black jacket is filming the amazing art displays near RMIT, and the tablet connected wirelessly to the Tobii Pro Glasses 2 allowed the team to simultaneously see what she was viewing on her camera screen as she composed the image, whilst also looking around at all the incredible art displays. The data allow the team to better understand individual experiences when visiting such complex art installations.”
From 28 November to 1 December 2016, Objective Eye Tracking attended the 4th SEANES International Conference on Human Factors and Ergonomics in South-East Asia in Bandung, Indonesia.
Ying Ki, our very own eye-tracking research consultant, took centre stage, describing how eye-tracking research can help us uncover in-depth insights that are not easily accessible via regular methods in Human Factors and Ergonomics research.
Eye-tracking is particularly important in Human Factors and Ergonomics research, for example in driving research, where safety and accident prevention are of utmost significance.
In terms of attention, vision is the dominant human sense; more often than not, our sense of sight contributes the majority of our conscious awareness.
What might be surprising, however, is that even our attention and decision-making processes are influenced by unconscious visual inputs.
Visual and interaction designers exploit this neuro-cognitive mechanism to improve conversion. Our consulting arm, Objective Experience, takes this mechanism into consideration when conducting user testing and has long been educating the user experience industry in Australia and Singapore about it.
The Tobii Pro Glasses 2 was also showcased during the presentation, and it generated a huge amount of interest among the audience. With a wearable eye tracker like the Tobii Pro Glasses 2, human factors and ergonomics research can be conveniently conducted in naturalistic environments. This is essential to understand how our visual inputs affect our behaviour and decision making in real world conditions, and not just in lab settings.
Do you want to find out how eye tracking can help your research? Drop us an email at firstname.lastname@example.org to arrange a demonstration session.
Objective Eye Tracking is hosting a refresher workshop in Sydney on the 3rd of November at the University of New South Wales Business School, Room 130, Level 1.
Eye tracking guru, Dan Sorvik, explains how to get the most out of your Tobii eye tracker. He will cover:
- Why bother with eye tracking
- Considerations and pitfalls of eye tracking projects
- How to make sure everything works on testing day – Set up and risk mitigation
- Eye behaviours associated with UX issues
- Case studies from successful projects
- Analysis and report writing tips
- Key points for selling eye tracking consulting as part of your UX business
- Where is eye tracking headed: Autonomous eye tracking
You will also be given the chance to share experiences and ask those burning questions that will set you up for future success.
Check it out below!
This is the first part of four in our mini-series on advanced topics in eye tracking. We begin with a topic that is fundamental to the technique: classification of eye movements.
Why do we need to classify eye movements in the first place?
Most modern eye trackers are video-based. Images of the eye, captured at regular intervals (the sampling rate), are processed to calculate the instantaneous gaze position. This discrete data stream must then be converted back into the informative, continuous eye movements used for analysis. This conversion, loosely analogous to digital-to-analog conversion, is accomplished by passing the raw gaze samples through an event detection algorithm called a fixation filter.
Fixation filters often come with adjustable parameters that let you tailor their characteristics to specific circumstances. Choosing appropriate parameter values is of fundamental importance for properly classifying eye movements and calculating valid metrics from the resulting fixations and saccades.
How do I apply it?
There is a variety of fixation filters, and researchers may choose based on those commonly used in their field. If you use Tobii Pro Studio for analysis, you can choose from several filters with varying levels of complexity and adjustability.
The Tobii Pro Studio default is the Tobii I-VT fixation filter. As a classification filter that operates on the velocity of eye movement, it is effective and commonly used in human behavior research.
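To illustrate the idea, here is a minimal sketch of velocity-threshold (I-VT) classification in Python. This is not Tobii's implementation: the one-dimensional gaze data, the 60 Hz timestamps and the 30 deg/s threshold are illustrative assumptions, and a production filter additionally handles gaps, merges adjacent fixations and discards very short ones.

```python
# Minimal I-VT (velocity-threshold) sketch: label each gaze sample as part
# of a fixation or a saccade by comparing its sample-to-sample angular
# velocity against a single threshold.

def ivt_classify(positions, timestamps, velocity_threshold=30.0):
    """positions: gaze angles in degrees (1-D for simplicity);
    timestamps: seconds; velocity_threshold: deg/s (30 deg/s is a
    commonly cited default for velocity-threshold filters)."""
    labels = ["fixation"]  # first sample has no velocity; assume fixation
    for i in range(1, len(positions)):
        dt = timestamps[i] - timestamps[i - 1]
        velocity = abs(positions[i] - positions[i - 1]) / dt
        labels.append("saccade" if velocity >= velocity_threshold else "fixation")
    return labels

# 60 Hz samples: steady gaze, one rapid jump (a saccade), then steady again.
ts = [i / 60 for i in range(6)]
pos = [0.0, 0.05, 0.1, 5.0, 5.05, 5.1]
print(ivt_classify(pos, ts))
# The 4.9-degree jump between samples 3 and 4 (~294 deg/s) is the only
# sample labelled "saccade".
```

Consecutive samples labelled "fixation" would then be grouped into fixation events, whose durations and positions feed the metrics discussed above.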
You can find the algorithm description here: Download White Paper: Tobii I-VT Fixation Filter
Tip: As reviewers get more demanding and want to understand better how you processed your data, we encourage you to cite this White Paper and the parameters chosen in your methodology section if you use this filter.
Our White Paper on the default values of the Tobii I-VT Filter describes how we determined its optimal parameter values.
As this is a generic eye movement filter, it is reasonable to review and validate its parameters empirically on your own data. You can find a great hands-on guide for this in Section 5.3, p. 153, of “Eye Tracking – A Comprehensive Guide to Methods and Measures” by Holmqvist, Nyström et al. (Oxford University Press, 2011). In Tobii Pro Studio, the Velocity Chart can aid you in this process (see the Tobii Pro Studio manual, Appendix 14.2).
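One simple empirical check along these lines is to look at the distribution of sample-to-sample velocities in your own recording and see where your chosen threshold falls. The sketch below is illustrative only: the synthetic one-dimensional data and the 30 deg/s threshold are assumptions, not recommendations for any particular setup.

```python
# Illustrative empirical check of a velocity threshold: compute all
# inter-sample angular velocities and count how many exceed the threshold.

def sample_velocities(positions, timestamps):
    """Angular velocity (deg/s) between consecutive gaze samples."""
    return [abs(positions[i] - positions[i - 1]) / (timestamps[i] - timestamps[i - 1])
            for i in range(1, len(positions))]

# Synthetic 60 Hz recording: two steady fixations separated by one jump.
ts = [i / 60 for i in range(8)]
pos = [0.0, 0.02, 0.05, 0.08, 4.0, 4.02, 4.05, 4.07]

vels = sample_velocities(pos, ts)
threshold = 30.0  # deg/s, a commonly used I-VT default
above = sum(v >= threshold for v in vels)
print(f"{above} of {len(vels)} inter-sample velocities exceed {threshold} deg/s")
```

If a large fraction of velocities during what should be steady fixation sit near the threshold, that is a sign the threshold (or your data quality) needs attention, which is exactly what the Velocity Chart helps you judge visually.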
To summarize, how you classify eye movements in your data is an extremely important step in your research and can have a massive influence on the calculated measures, so choose carefully.
Chapter 5, Estimating Oculomotor Events from Raw Data Samples, Holmqvist & Nyström et al., 2011.
If you’d like to learn the basics of eye movements and events (e.g. what fixations, saccades, smooth pursuit, vergence and VOR are), we touched on them in this article here.
Next up…we will be taking a deeper look at the three different spaces used in eye tracking and how they relate to drawing areas-of-interest. Watch out for the article on eyetracking.com.sg!
On 27 September 2016, we attended Innovation Labs World in Singapore as an exhibitor, displaying Tobii eye tracking equipment and our research consultancy services and engaging with the public sector. Indeed, there was copious interest in eye tracking from the various government agencies – thanks to GovTech’s Government Digital Services Hive UX Lab, where a Tobii Pro X2-30 screen-based eye tracker is used for UX research!
But this event was about more than showcasing equipment. It gave us a glimpse into the future of Singapore’s design, technology and IT sector for public service innovation.
Political momentum has triggered the creation of new innovation labs, smart city units and digital services. Representatives of initiatives ranging from Singapore’s Smart Nation strategy to India’s drive for 100 Smart Cities, and from Australia’s Digital Transformation Office to Makassar’s “War Room”, were present at Innovation Labs World. GovTech, the newly formed agency aiming to build deep tech capabilities in the Singapore Government and drive its digitalisation efforts, was also present at this event.
Case studies presented by experts from around the world covered using technology and data to engage citizens, build better environments, and develop health and innovation policies.
We’re really excited to have been part of this, and for what lies ahead.