The Emergence of a New Medium – VR and its UX considerations

A very Happy New Year from everyone here at Objective Experience! We hope you had a wonderful 2015, and continue to stay awesome. Let’s share the joy and love with everyone by making the world a better place every day, and aim for an even better 2016!

In 2015 we saw some interesting new trends in the technology and UX scene. Notably, we saw the emergence of a medium we are so used to seeing in science fiction movies – Virtual Reality (VR). Although VR has been around for quite some time now, it has been a niche technology mostly used as a research tool, as it was far too expensive and bulky to enter the mainstream market.

In the consumer market arena, Samsung has already announced its new Samsung Gear VR and a corresponding lineup of games and apps; however, many people are still unaware that consumer VR even exists, simply because they have not yet been introduced to it. VR is predicted to be pushed into widespread adoption via the gaming industry, with Sony spearheading the charge. Facebook’s acquisition of Oculus in 2014 could also mark the start of an era in which VR replaces many of our real-world interactions. VR could very well be the next computing platform, much as smartphones have begun to replace our desktop and laptop computers and have changed how we live our lives over the last decade.

With the possibility that VR becomes the next computing platform, increasingly prevalent and integrated into our lives, the UX community needs to get acquainted with this new medium – it is our job and passion to make interactive experiences pleasurable. 2016 is poised to be an exciting year for VR, and UX designers can expect to work on projects in the VR medium.


Source: Back to the Future Part II (1989)

As the medium is still relatively new, it can still be quite difficult to find good resources on the internet. Fret not: GitHub user omgmog (Max Glenister) has compiled a really comprehensive list of resources (to date) on UI/UX design considerations for VR. Below are the three fundamental UX considerations that are important for a pleasurable VR experience:

1. Immersion / Presence – Perhaps the most important concept associated with the UX of VR is “immersion” (or “presence”), so much so that design for the VR platform has been coined “immersive design”. Basically, “immersion” is the extent to which the virtual environment faithfully reproduces experiences, such that users believe the virtual environment is physically real. Many factors can “break” immersion: for example, if interacting with a virtual object has no effect, it violates our mental model of object interaction and hence breaks immersion. Unrealistic positional sound effects and model details also make object interactions seem less realistic.

 

2. Spatial Disorientation / Virtual Reality Sickness – Research has shown that virtual reality sickness is a major barrier to using VR. Its cause is still not fully understood, but sensory conflict during movement seems to be the primary culprit. In natural navigation, we use several of our senses in tandem to make sense of the environment, especially the eyes and the vestibular system of the inner ear. In VR, however, this job becomes primarily subserved by the eyes. The mismatch between the information coming in through your eyes and that from your other senses creates discomfort and symptoms similar to motion sickness. However, the solution to this seemingly inherent problem of the VR platform can be as simple as adding virtual noise or tweaking virtual motion parameters.

 

3. Comfort – Although comfort mainly depends on the hardware design, the design of the software applications contributes to comfort as well. For example, the physical movements an application demands should be consistent with human ergonomics. If a particular action forces an unnatural twist of the body (e.g., craning your head right around while sitting still), it is uncomfortable and can be potentially dangerous. Illegible text (which is pretty common in VR) and overly bright scenes also impose additional stress on the eyes, causing eye fatigue.

Other than being a subject of UX assessment itself, VR can also be a useful tool for general UX research. As mentioned above, VR technology started out mainly as a research tool, thanks to the fact that it can support research requiring ecological validity within a controlled environment. Before VR existed, much research was conducted in lab-based settings whose findings could not readily be generalized to the “real world”. With VR, you can attain both by constructing an artificial environment resembling the real world inside a controlled setting. With this in mind, VR can undoubtedly also be useful for UX and market research, specifically for assessing user experience in an unbiased, controlled setting.

VR can also be combined with eye tracking technology to provide more ecologically valid insights for UX research. For example, Tobii Pro offers VR integration with the Tobii Pro Glasses 2, providing an easy way to combine VR and eye tracking into a powerful research tool.

Click here for more details!

 

Seeing the world through Noongar eyes

How can Aboriginal ways of seeing the world transform education? Research into Aboriginal Australian knowledge, specifically how it can inform contemporary educational practices, is needed to help students, schools and communities sustain Aboriginal ways of knowing, recognise their value to society and bridge learning across and between cultures. For example, using cultural knowledge to better understand weather and its relationships to breeding seasons and the availability of food sources can make synergies between scientific and indigenous Aboriginal knowledge apparent.

Education researcher Dr. Khady Ibrahim-Didi and Jason Barrow from Edith Cowan University (Western Australia) are using Tobii Pro Glasses 2 in a small research study, “Seeing the world through Noongar eyes”. The study investigates how Tobii Pro Glasses 2 can be used by Aboriginal educators to help others ‘see’ changes in their environment.


Aboriginal educator in parkland using Tobii Glasses 2 to point out flora and fauna in early spring (photo credit: Khady Ibrahim-Didi, Edith Cowan University, Western Australia)

Wearing the Tobii Pro Glasses 2, an Aboriginal educator walked around a parkland location in the northern suburbs of Perth, Western Australia. It was early spring, and as he focused on the signs that indicated the turn of the weather, the Tobii Pro Glasses 2 captured the focus of his attention and the order in which he sought those signs. The team then invited an Aboriginal youth to walk through the same area, wearing the same pair of Tobii Pro Glasses 2. Dr Khady spoke about the value and significance of the Tobii Pro Glasses 2 in documenting such knowledge: “It was fantastic! We could see the parallels between the two people, one much more informed than the other, and yet you could see the beginnings of a culturally informed perspective emerging.” Khady and Jason both see the potential for using this tool to show how an Aboriginal or non-Aboriginal child might learn to “see the world through Nyoongar eyes” and to recognise those aspects that might show similarities between Aboriginal and non-Aboriginal views.


Aboriginal youth focusing on fallen seeds and animal tracks through a pair of Tobii Glasses 2 as she is inducted into the culturally relevant signs of the types of birds and animals that inhabit the area (photo credit: Khady Ibrahim-Didi, Edith Cowan University, Western Australia)

In another instance, ECU Aboriginal elder Oriel Joy Green (Bartlett) also used the Tobii Pro Glasses 2 when she took a number of women from multiple cultures back to country on a trip to support reconciliation. The trip, named ‘Koorliny Koort Boodja’ (Going to Heart Land), was supported by the City of Stirling and focused on the journey of Oriel and her family. As she explained the significance of the places, the Tobii Pro Glasses 2 showed those with her, including Khady, the areas and specific spots that carried deep emotional and cultural significance for Oriel and her family. The trip was a celebrated success, and the group continues to meet, sharing a common vision – a better, more united tomorrow for Australians of all cultures.


Aboriginal Elder Oriel Green pointing out some of the culturally significant areas in her country (photo credits: Khady Ibrahim-Didi, Edith Cowan University, Western Australia)

 

For further updates please contact Dr. Khady Ibrahim-Didi.

Emotional UX – Techniques for measuring users’ emotions

Emotion: the very spark of feeling that makes our hearts flutter, our eyes tear up and our hands clench in fear. No doubt, we are all controlled by emotions. They are the primary instinct that drives us to feel and act. In UX, people are paying more attention to the skills of empathy and emotion. But, as far as we can see, no one has defined a standard for how emotion in UX/usability should be measured. Mostly, a designer’s gut feel, previous mistakes and experience do the job. Trial and error within an agile process is fine, but can emotion be measured?

Since emotions are by nature intangible, there isn’t a definitive method for measuring them yet. We have written this summary to work out what would be best for us to do as a consultancy right now.

Neurometrics, Biometrics and Eye Tracking

Andrew Schall (principal researcher and senior director at Key Lime Interactive) has written a comprehensive article suggesting various new methods for measuring emotions more accurately and objectively, along with their pros and cons. We briefly review some of the methods mentioned in his article below, then focus on some techniques you can adopt right now in your UX practice.

BIOMETRICS

Facial response analysis 

Traditional facial response analysis involves a few researchers observing participants and coming to an agreement on what emotions the participants are expressing. In recent years, software and algorithms have been developed to recognize facial expressions of different emotions with just a simple webcam set-up. However, the current state of this technology only recognizes a limited set of emotions (e.g., anger, fear, joy), and is only accurate when the emotions are overtly expressed. An example of such software is AFFDEX by Affectiva; you can also check out this TED talk by Affectiva’s Chief Strategy and Science Officer, Rana el Kaliouby. Other similar software includes Noldus’ FaceReader and ZFace. Despite the limitations, deeper and more precise algorithms are rapidly being developed to raise the accuracy of the analysis.

Electromyography (EMG)

EMG can accurately measure more subtly expressed emotions by recording signals from specific muscles known to react to specific emotions (check out this Scholarpedia article for a simple introduction). However, EMG is obtrusive and only works if you know beforehand which facial muscles to measure. Covering a participant’s entire face with electrodes is impractical and, in any case, far too intrusive for everyday usability testing.

Another limitation of facial response analysis and EMG is that they can only measure overt emotions, which are often under conscious control. As such, these expressions can be highly influenced by social settings; for example, humans tend to show stronger facial expressions if they believe they are being observed.


One of our UX consultants trying out the Empatica E3 wristband

GSR (Galvanic Skin Response)

GSR technology has traditionally been used to measure physiological arousal. It can accurately measure intensity (e.g., arousal, stress), but not emotional valence (positive or negative). Although some computational algorithms can be applied to GSR data to estimate valence (Monajati, Abbasi, Shabaninia, & Shamekhi, 2012), the technique is still far from being able to measure specific emotions.

Other limitations include a response delay of 1-3 seconds (maybe more, depending on the equipment used) and susceptibility to external conditions (e.g., temperature, humidity) as well as internal bodily conditions (e.g., medications). We have a GSR unit and have experimented with it, but we found it rather difficult to correlate spikes in GSR with UI interactions; the temporal resolution of GSR is too crude to measure emotional responses to individual events.
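To make that matching problem concrete, here is a minimal sketch of the kind of alignment we attempt, assuming you have logged UI event timestamps alongside detected GSR peaks; the event names, timestamps and lag window below are purely illustrative.

# Illustrative sketch only: match GSR peaks to logged UI events, given the 1-3 s
# response lag mentioned above. All names and timestamps are hypothetical.
GSR_LAG_WINDOW = (1.0, 3.0)     # seconds between a stimulus and a skin-conductance peak

ui_events = [                   # (timestamp in seconds, event label)
    (12.4, "error dialog shown"),
    (30.1, "checkout button clicked"),
    (31.5, "page finished loading"),
]
gsr_peaks = [14.2, 32.8, 58.9]  # peak times (seconds) detected in the GSR trace

def candidate_events(peak_time, events, lag=GSR_LAG_WINDOW):
    """Return every UI event that could plausibly explain a given GSR peak."""
    lo, hi = lag
    return [label for t, label in events if lo <= peak_time - t <= hi]

for peak in gsr_peaks:
    matches = candidate_events(peak, ui_events) or ["no event in window"]
    print(f"GSR peak at {peak:5.1f}s -> {matches}")

Even in this toy example a peak can match several candidate events, or none at all, which is exactly the ambiguity we kept running into.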

NEUROMETRICS

Electroencephalography (EEG)

EEG is a neuroimaging method used to measure real-time changes in voltage caused by brain activity. Because it measures brain activity, it offers a much larger arsenal of measures for emotional responses than biometrics do. Its excellent temporal resolution also means it has the potential to capture real-time changes in emotional responses, which would be very useful for UX research. However, just like physiological response patterns, brain activity patterns are affected by many external and internal factors; well-designed computational methods and trained algorithms are needed to extract information from the “noisy” EEG data. For example, movement can cause bursts of artifacts that have nothing to do with experienced emotions. Research into EEG as a measure of emotion is still in its early stages, but it has shown more promising results than GSR in measuring emotional states.
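To give a feel for the kind of cleaning involved, here is a minimal, illustrative sketch (not a validated pipeline): band-pass filter a single channel, then drop one-second epochs whose amplitude suggests a movement artifact. The sampling rate, filter band and rejection threshold are assumed values for the example.

# Minimal illustration of EEG cleaning: band-pass filter, then reject epochs whose
# peak amplitude suggests a movement artifact. All thresholds are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                            # assumed sampling rate in Hz
raw = np.random.randn(FS * 60)      # stand-in for one minute of single-channel EEG (microvolts)

# Keep roughly 1-40 Hz, a common pre-processing band.
b, a = butter(4, [1, 40], btype="band", fs=FS)
filtered = filtfilt(b, a, raw)

# Split into 1-second epochs and drop any epoch with an implausibly large amplitude.
epochs = filtered[: len(filtered) // FS * FS].reshape(-1, FS)
ARTIFACT_THRESHOLD = 100.0          # microvolts; arbitrary cut-off for this sketch
clean = epochs[np.abs(epochs).max(axis=1) < ARTIFACT_THRESHOLD]

print(f"kept {len(clean)} of {len(epochs)} one-second epochs after artifact rejection")

Real pipelines go much further (channel interpolation, blink removal, trained classifiers), which is precisely why we say the science needs to be understood before the numbers are trusted.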

EEG technology is now becoming increasingly accessible (check out this list on Wikipedia), and companies like Emotiv are already producing lightweight, wireless EEG equipment for a simpler and less obtrusive set-up. Such headsets, however, have fewer electrodes, which makes it harder to transform the data precisely and reliably into meaningful insights. It is a trade-off between obtrusiveness and data sensitivity.

EYE-TRACKING

Eye tracking is unobtrusive and can measure arousal from blink activity, pupil size and dwell times; however, pupillometry suffers from the same problem of being affected by many external and internal factors (ambient light being the most obvious). The environment must therefore be well controlled to avoid disturbances that could contaminate participants’ pupillometry data.
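As a toy illustration of how such data might be summarised (a sketch over made-up numbers, not Tobii’s API), pupil diameter is usually compared against a baseline so that slow drifts and individual differences matter less:

# Toy illustration: baseline-correct mean pupil diameter per task.
# All task names and samples are made up.
from statistics import mean

tasks = {                                   # task label -> pupil diameter samples (mm)
    "baseline (blank screen)":   [3.1, 3.0, 3.2, 3.1],
    "find the checkout button":  [3.6, 3.8, 3.7, 3.9],
    "read confirmation page":    [3.2, 3.3, 3.1, 3.2],
}

baseline = mean(tasks["baseline (blank screen)"])
for task, samples in tasks.items():
    if task.startswith("baseline"):
        continue
    dilation = mean(samples) - baseline     # larger values suggest higher arousal or effort
    print(f"{task}: {dilation:+.2f} mm relative to baseline")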

With eye tracking we can measure people’s unconscious gaze responses to an interface they are using. Specific emotions, however, cannot be measured using eye tracking alone; they are discovered only in the Retrospective Think Aloud (RTA) interview afterwards, which is susceptible to suggestibility effects.

Despite eye tracking’s inability to measure emotional states meaningfully on its own, its main advantage lies in its flexibility to combine with other research methods and measurements to gather powerful insights. Eye tracking helps us determine the user’s attention, focus and other mental states. Combined with other devices, we can potentially pinpoint the specific events or touchpoints that cause a change in emotional state during testing sessions. Lightweight eye tracking equipment, such as our Tobii Glasses 2, also gives researchers the flexibility to test users in their own environment when more ecological validity is required.

How do we use all this?

One important piece of advice from Andrew Schall’s article is that EEG and GSR are not for everyone, as there is potential for misinterpretation and misuse of the data. We believe you need to understand the science behind these complex technologies before using them, in order to avoid misusing them. This also applies to eye tracking, even if you are only using it as a complementary method to pinpoint specific events, as mentioned above.

Andrew also warned that it is often insufficient to measure emotions with a single technique, as the neurometric and biometric measures described above are not fully mature yet. Using a variety of methods that complement each other yields better accuracy in identifying users’ specific emotional experiences. There are, however, still significant challenges to implementing a standard for measuring emotions with these technologies, especially in terms of cost and practicality. Given that neurometric and biometric measurements still have some way to go, is there any other way to measure emotions more economically and practically?

What else can we do to measure emotion?

We believe the answer to this question could be good old self-report questionnaires.

Questionnaires, unlike user interviews, are more objective and standardized, so results can be compared across different contexts and projects. Our clients always want to compare scores like NPS or SUS against other projects across their organization. Although questionnaires still rely on users’ recall (which can be mitigated by the eye tracking + RTA methodology), they are simple to implement, and you do not need to be a neuroscientist to analyze the results. There are countless questionnaires available online, but fret not: we have done a little research to identify the following instruments, which are designed (and in most cases empirically tested) to measure aspects related to emotion:

1. Geneva Emotion Wheel

This is an empirically tested instrument for measuring emotional reactions to objects, events and situations, based on Scherer’s Component Process Model. It assesses 20 emotions and can be used in 3 different ways, depending on your objective. You can download a standard template from their website, provided it is for non-commercial research purposes.

2. Plutchik’s Wheel of Emotions


Source: Author/Copyright holder: Machine Elf 1735. Copyright terms and licence: Public Domain.

Plutchik’s wheel of emotions is an early emotion-wheel model constructed around 8 “basic” emotions and their “opposite” emotions. It was later expanded to include more complex emotions composed of 2 basic ones. Even though this model lacks empirical testing, some UX designers and researchers use it from time to time to map out user journeys, because it provides an organisational structure (e.g., intensity, complexity) for describing emotions.

3. Self Assessment Manikin (SAM)

This questionnaire uses pictorial scales to measure 3 dimensions of experienced emotion: pleasure, arousal and dominance. It has often been used in the evaluation of advertisements and, increasingly, in product evaluation. Because it is pictorial, it works with a wider range of populations (children, or participants from different language and cultural backgrounds).

4. PrEmo

This questionnaire also uses pictorial scales, but it is designed to measure more specific emotions for product evaluation purposes. It uses a set of 7 positive and 7 negative emotions to measure a product’s emotional impact on users. Like eye tracking, PrEmo can be used either as a quantitative tool on its own or as a qualitative tool to complement user interviews. PrEmo is free of charge for academic (non-commercial) use, but there is a fee for commercial use.

5. AttrakDiff

AttrakDiff does not measure specific emotions, but it does include an assessment of emotional impact in product evaluation. It measures the attractiveness of a product using two sets of scales:

  • Pragmatic scale – essentially usability, e.g., the usefulness of a product
  • Hedonic scale – this measures emotional reactions; not the distinct emotions themselves, but the user’s needs and behaviours arising from those emotions, e.g., curiosity, identification, joy, enthusiasm

Their website offers a pretty comprehensive overview of what it is about, and you can have a go at the demo there too.

 

6. youxemotions


Source: http://emotraktool.com/en/why

youxemotions offers a simple and easy-to-use solution for measuring emotions. Users choose what they felt from 9 emotions and 5 levels of intensity. Turning the results into charts for presentation is extremely easy as well. It is currently in beta and is free to use until the end of the beta period.

Even though there are various ways of measuring emotions in UX, it is important to understand the benefits and limitations of each method. After all, a research method is only useful if it helps you answer your research question or meet your design objective.

If specific emotions are too complicated for your needs, perhaps an analysis of how users move their mouse would be a good enough tool for inferring negative emotions while they browse websites.

-Ying Ki, Shermaine & James

References

Monajati, M., Abbasi, S. H., Shabaninia, F., & Shamekhi, S. (2012). Emotions States Recognition Based on Physiological Parameters by Employing of Fuzzy-Adaptive Resonance Theory. International Journal of Intelligence Science, 2, 166-175.

We Can Help you Empathise With Your Customers

With the recent interest in empathy in design, many designers and researchers in the UX community are writing about this construct and imploring other designers to apply more empathy in their design process. In fact, a quick Google search for “Empathy & UX” will net you more articles than you can read in a week!

Unfortunately, interesting as these articles might be, one soon realises that each and every author has their own interpretation of the term “empathy” – from “understanding the feelings and thoughts of others” to “sharing the same feelings as others” or “a feeling of affinity with others”. Empathy is being defined in a variety of ways that do not necessarily coincide with one another. To quote Product Designer Emily Campbell,

“Like most buzzwords that become jargon, the value of the word empathy is being lost in the noise.”

At Objective Experience, we believe that empathy is, after self-insight, one of the most important assets for a UXer. This blog post aims to decode this elusive construct and convey to the community exactly how to embrace empathy effectively in the design thinking process, especially with the help of cutting-edge research technologies such as eye tracking.

What is empathy?

Source: http://www.stuartduncan.name/general/autism-and-empathy-heres-another-way-to-look-at-it/

Empathy comes from both the Greek root “pathos”, meaning emotion, feeling, suffering or pity, and the German word “Einfühlung”, which was used to refer to the human capacity to experience a sense of unity with the artist behind a piece of art. Interestingly, the concept of “Einfühlung” helps us understand that the act of empathizing need not be limited to human beings; it can be extended to objects, and even to ideas and symbols, among other things we can empathize with.

Empathy is a multidimensional construct. In psychology research, empathy has been dissected into two distinct types:

  1. Emotional empathy refers to the ability to understand and identify with someone else’s emotional states without being overtaken by them. This kind of empathy often happens automatically and unconsciously.
  2. Cognitive empathy refers to the objective understanding of others’ perspectives, context, goals and motivations, and is largely consciously driven.

Source: http://www.medicaldaily.com/your-brain-structure-may-decide-how-you-empathize-emotional-brains-are-physically-339996

As with many elusive constructs, there is a fair amount of debate about whether empathy needs to be divided into the two above-mentioned types. Some researchers have argued that emotional empathy is a more primitive form of empathy that precedes cognitive empathy and is therefore necessarily more important (McLaren, 2013). However, recent neurocognitive research has found that the two are sufficiently different processes in terms of brain-region activation, suggesting that cognitive empathy is no less important than emotional empathy (Nummenmaa, Hirvonen, Parkkola, & Hietanen, 2008). This finding partially echoes earlier propositions that “true empathy” integrates both types (Staub, 1987).

Role of empathy in UX Design & Research

Empathy underpins human-centered design and the UX community more than anything else. Developers and designers who do not understand the importance of empathy will find it difficult to appreciate how differently users think and work, and often assume that every user will approach and solve problems the same way they do. Peter Smart, a designer who went on a journey to solve 50 design problems in 50 days, put it this way:

“Empathic research helps us understand our users’ needs beyond the functional, enabling us to develop more appropriate design outcomes. It is one of a raft of valuable processes and tools, on its own seemingly no more important than any other. However, while good designers understand the tools, great designers understand people.”

To put it simply, “great designers” are those who have a thorough and holistic understanding of their users – not just their overt needs, but also their underlying implicit and latent needs.

Source: https://www.interaction-design.org/literature/article/empathic-design-is-empathy-the-ux-holy-grail

In the UX community, empathy often takes on a strong emotional tone, because an “experience” is often associated with emotions, and companies that capitalize on users’ emotions in their marketing efforts often stand out. This is predominantly because emotions are more primal states of mind than rational thinking, and because the emotional brain processes sensory information much more quickly than the cognitive brain.

However, as mentioned above, “great” UX design and empathic research require a thorough and holistic understanding of users. This calls for both emotional and cognitive empathy, not just one of them. Cognitive empathy is just as important, as understanding users’ multi-faceted cognitions (e.g. contexts, goals, motivations and problem-solving approaches) is required to make sense of the emotions and feelings that are elicited.

Developing empathy

Some people are naturally more empathic than others. Highly empathic people usually have a curious and sensitive personality, and are genuinely curious about others, be it a friend, a stranger or even an enemy. They also strive to challenge their own preconceptions and prejudices by searching for commonalities, rather than taking human differences at face value.

Source: https://www.scarymommy.com/teach-my-child-empathy/

That does not, however, mean that empathy is an innate trait and cannot be cultivated.

Broadening your horizons is one good way to develop empathy. The more you immerse yourself in different experiences and expand your social circle, the more likely you are to share common experiences with others, and hence form real connections. Getting out of your comfort zone and travelling to different cultures can be a good way to gain experiences outside your own community. You will not only appreciate human differences better, but also experience many of the real problems people face that you might not have encountered otherwise. If your travel budget is low, try helping people in your own community, whether through counselling or volunteer work. Experiencing the real problems faced by real people is the best way to understand them.

Source: http://theoldrectorydonard.com/mindfulness-retreat-20-september-2015/

Interestingly, empathy also starts with understanding yourself. Practice mindfulness, whether through yoga, meditation, sport or other pursuits that raise your awareness of your inner voice. Only through a greater awareness of our own attitudes and thought processes can we understand that our perceptions of others are often skewed by the self-reference bias that is inherent in all of us. True empathy requires an objective empathic process, and being mindful (intentional, accepting and non-judgmental) definitely helps. If mindfulness is too elusive a construct for you to grasp, perhaps learning how to “listen better” would be easier to digest.

Deepen empathy with Eye Tracking

The human mind is complex. It is next to impossible to read someone’s mind even if you are highly empathic, but there are ways to get closer. As far as UX research is concerned, we believe that eye tracking is one of the best ways to help us empathise with our users. Eye tracking allows us to directly see unconscious reactions during a usability test, revealing what attracts consumers’ attention without requiring us to interrupt participants or expect them to remember exactly what they did. Most human visual behaviour operates below the level of conscious awareness: people simply are not aware of where their eyes are going, much less can they recall it.

In our experience, eye tracking helps us enquire more effectively about a consumer’s experience. When a task is complete, by showing participants where they looked and conducting an empathic user interview, we can gain deep insights into their experience.

One of our UX consultants conducting a user testing session to uncover insights into the thought processes and feelings going through a user’s mind when interacting with a mobile app.

Beyond allowing us to understand users deeply, eye tracking numbers also offer a simple way to communicate usability findings and speak a language that executive stakeholders understand. By speaking a common language, we are essentially sharing a common experience, allowing us to be more empathic with one another.

-Ying Ki and James

References

McLaren, K. (2013). The Art of Empathy: A Complete Guide to Life’s Most Essential Skill. Louisville, Colorado: Sounds True.

Nummenmaa, L., Hirvonen, J., Parkkola, R., & Hietanen, J. K. (2008). Is emotional contagion special? An fMRI study on neural systems for affective and cognitive empathy. NeuroImage, 43(3), 571-580.

Staub, E. (1987). Commentary on Part 1. In N. Eisenberg & J. Strayer (Eds.), Empathy and Its Development (pp. 103-115). Cambridge: Cambridge University Press.

How to Measure Customer Experience?

Very often in Customer Experience (CX) consulting we gather qualitative findings, and we also use the popular Net Promoter Score (NPS) as a quantifiable rating of customer sentiment. However, it is difficult to determine which qualitative finding actually has the greatest impact on the NPS rating.

Introducing CX/UX metrics that can be collected across methodologies can help in correlating the reasons behind a particular NPS rating. The metrics should be chosen carefully, in accordance with the study’s objective and the stage of the product’s development cycle.
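For readers who have not calculated the score themselves, the NPS arithmetic is straightforward: respondents rate their likelihood to recommend from 0-10, promoters (9-10) and detractors (0-6) are counted, and the score is the percentage-point difference. A minimal sketch with made-up responses:

# Minimal sketch of the standard NPS calculation; the survey responses are made up.
responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 5]   # 0-10 "likelihood to recommend" ratings

promoters  = sum(1 for r in responses if r >= 9)   # ratings of 9-10
detractors = sum(1 for r in responses if r <= 6)   # ratings of 0-6 (7-8 are passives)
nps = 100 * (promoters - detractors) / len(responses)

print(f"NPS = {nps:+.0f}")   # ranges from -100 to +100; here +20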

For example, suppose Product A has just been launched and the company wants to test its performance for the first time, using the results of this first round as a benchmark for future testing. A first-round summative test is chosen, and the performance metrics are typically task completion rate, time per task, errors per task, clicks/button presses per task, the Single Usability Metric (SUM) and the System Usability Scale (SUS).
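As an illustration of how one of these benchmark numbers is derived, here is the standard SUS scoring rule for a single respondent (the ratings themselves are made up): odd-numbered items contribute their rating minus 1, even-numbered items contribute 5 minus their rating, and the sum is multiplied by 2.5 to give a 0-100 score.

# Standard SUS scoring for one respondent; the ratings are made up.
ratings = [4, 2, 5, 1, 4, 2, 5, 2, 4, 1]   # answers to the 10 SUS statements, each 1-5

raw = 0
for i, r in enumerate(ratings, start=1):
    # Odd-numbered (positively worded) items contribute r - 1;
    # even-numbered (negatively worded) items contribute 5 - r.
    raw += (r - 1) if i % 2 == 1 else (5 - r)

sus = raw * 2.5          # scales the 0-40 raw score to 0-100; here 85.0
print(f"SUS = {sus:.1f}")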

If we want customers to attribute their NPS responses to the product or service’s value proposition, it is important that the tasks they perform during the test sessions are selected from the factors customers are known to need. These known factors can usually be derived from prior or early-stage research.

There are also other types of customer experience metrics. Jeff Sauro has compiled a list of ways to measure the customer experience across 6 categories: (i) Attitudes and Affect, (ii) Customer Attributes, (iii) Product and Service Features, (iv) Design Elements, (v) Experience and Usability, and (vi) Effectiveness.

Attitudes and Affect metrics typically measure customer satisfaction and loyalty, whereas Customer Attributes dig deeper into customer expectations and who exactly your customers are (segmentation analysis). Product and Service Features metrics measure the pricing, value and acceptance of features. Design Elements metrics delve into how customers notice certain design elements of the product and whether those elements are remembered. Experience and Usability metrics are similar to those listed earlier: task completion, navigational difficulty and ease of use. Finally, Effectiveness metrics look at improving conversion rates via A/B testing and even prioritizing which usability problems to fix using Failure Modes and Effects Analysis.

All of these CX/UX metrics can be selected and used at different stages of testing, according to your company’s needs. Drop us a line if you’d like Objective Experience to advise you on the best quantitative testing approach for your product or service.

What is Agile User Research?

User research on a tight timeline and budget is not impossible. In fact, it is already happening now. All you need are the quality voices of a handful of customers to test and validate your work, using an agile user research method.

So what is the core difference between agile and full user research? Fewer participants are tested in the agile approach than in the full method. But does that mean lower-quality data? No.

Research by Jakob Nielsen, one of the early usability gurus, suggests that with only 5 users, around 85% of usability problems can be found; a fuller study with 12 users can find almost 99% of them. For those who think user research is too costly and elaborate, a small, agile user research method with frequent rounds of testing (as many as the budget allows) is the better fit.
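Those percentages come from Nielsen and Landauer’s problem-discovery model, in which the proportion of problems found is 1 − (1 − L)^n, with L (the chance that one user exposes a given problem) averaging about 31%. A quick check of the figures quoted above:

# Nielsen & Landauer's problem-discovery model: proportion found = 1 - (1 - L)**n,
# where L is the probability that a single user exposes a given problem (~0.31 on average).
L = 0.31

for n in (5, 12):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} users -> about {found:.0%} of usability problems found")
# prints roughly 84% for 5 users and 99% for 12 users

The value of L varies from product to product and task to task, so treat these percentages as a rule of thumb rather than a guarantee.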

The other difference between agile and full user research is that fewer tasks are covered during testing. To overcome this, test and iterate the product’s features and functions in smaller chunks until the bigger goal is achieved – very much in the spirit of the agile manifesto.

Planning and communication are the keys to conducting great agile user research. Strategizing early, during the previous development cycle, helps. All of the information and ideas from this early planning phase should be communicated frequently to the user research team so that any issues can be ironed out quickly and resources can be managed efficiently.

Here at Objective Experience, the entire testing-to-reporting phase takes only 2 days. The planning beforehand, starting from the kick-off workshop, takes another 2 days. Ideally, everything happens within 4 days, as illustrated below.

Agile user testing in Singapore

For agile user research, there is no need to test a large number of users – that would defeat the purpose of the word ‘agile’, which means quick. Testing 5 users, carefully selected and thoroughly screened to ensure they truly represent the targeted user segment, is enough. Each testing session covers around 2-3 main tasks or user flows within 45 minutes.

It is compulsory for product design and development team members to sit in and observe the testing sessions as they happen. Why? So they can immediately get a sense of what users actually need and iterate on the spot or the next day.

In our agile user research sessions, we also use eye tracking as a way to gain direct insight into how the product is used and what users struggle with. Eye tracking allows observers of the testing to see users’ unconscious behavior in real time, and enables stakeholders to make instant decisions about solutions to interface problems.

At Objective Experience, we have the facilities for team members and other stakeholders to observe the live sessions in person at our viewing room or remotely via a web link. The remote viewing link is great if you have overseas members interested in observing what goes on during the user research. We’ve got a really comfortable space complete with refreshments too!


Take a peek at our viewing room!

After all the testing sessions are done, the research moderator holds a brief workshop with the observing team members to discuss the key findings, brainstorm solutions together and turn the results into actions for the next development cycle. The next day, a report capturing the top 10 most impactful findings, along with actionable design recommendations, is produced.

Let us help you make incremental improvements to the user experience of your products, thus driving business growth. Drop us a line at infosg@objectiveexperience.com or call +65 67374511 to discuss your needs now.

See Tobii Glasses 2 in action!

Have you wondered what the Tobii Glasses 2 can do for you? Look no further: we have compiled a list of videos depicting the use of the Tobii Glasses 2 in many varied areas of research.

With this new wearable eye tracker, a glimpse into the unconscious mind of people as they go about the real-world environment need no longer be a faraway idea – it is a real possibility now!

Contact James Breeze at jbreeze@objectiveeyetracking.com for further information about renting or buying the Tobii Glasses 2 for your needs across Australia, New Zealand, Singapore and the rest of South-East Asia. We also provide consulting and research services.

INTRODUCTION TO THE TOBII GLASSES 2

SPORTS SCIENCE RESEARCH

Two tennis players wearing the Tobii Glasses 2 during a match.

ESPN2’s Sport Science team tested former Auburn University wide receiver Sammie Coates’ reaction times with the versatile real-world eye tracking glasses, Tobii Glasses 2.

DRIVING RESEARCH

A driver tested out the Tobii Glasses 2 on a race track (Willow Springs International Raceway) in a high-sun desert environment.

SHOPPER RESEARCH

This short video shows a shopper’s search and purchase behavior for Panadol in a local pharmacy store.

MOBILE DEVICE USABILITY TESTING

In this usability test, the participant was instructed to use Google Maps on a smartphone to navigate from his office to the main station. In addition, he was asked to check different visual cues, such as the blue (highlighted) path. The accuracy of the eye tracking data under these real working conditions is estimated at 0.4°.

This short video demonstrates where a person looks whilst playing Candy Crush on a smartphone.

HUMAN NAVIGATION AND WAY-FINDING RESEARCH

A user puts on the Tobii Glasses 2 and makes his way to the MRT station.

CHILD DEVELOPMENT AND PLAY RESEARCH

A short video showing where a child looks whilst trying to follow instructions on how to put together a Star Wars Lego creation.

COGNITIVE AND MOTOR RESEARCH

Research suggests that novice and expert jugglers use different visual and motor strategies. While novices tend to look at the balls around their zeniths, experts mostly fixate their gaze at a central location – the so-called ‘gaze-through’ strategy. The latter turns out to be the simpler strategy, allowing superior tossing accuracy and error correction.

UNOBTRUSIVE ETHNOGRAPHY

In this video, the user cooks breakfast in his kitchen. The Glasses 2 provide an in-depth yet unobtrusive way of showing what users see and do in their natural settings, which is helpful for understanding consumer behaviour outside of a lab setting.


Do you want to know how to apply the eye tracking methodology using the Glasses 2 in your research?

Tobii Pro Insight Research Services is hosting a FREE webinar, “Wearable eye tracking and ethnographic field methods – an autonomous approach”, on 2nd July 2015 at 16:00 Central European Time (GMT +1).

You will get a walkthrough of this innovative ethnographic pilot study of young people’s reading behavior and media usage. You will also see how our easy-to-use Glasses 2 can be used in your research, and have your questions answered by one of our very own Tobii Pros.

Pre-registration is mandatory for this webinar. Click here for more information and to register.