
Decoded: Where do employees gaze during video calls

BDC News

New York, March 30: As more and more people use video conferencing tools to stay connected in social distancing times, neuroscientists from Florida Atlantic University have found that a person’s gaze is altered during telecommunication if they think that the person on the other end of the conversation can see them.

“Gaze cueing,” a powerful signal for orienting attention, is a mechanism that likely plays a role in the developmentally and socially important phenomenon of “shared” or “joint” attention, in which several people attend to the same object or location.

“Because gaze direction conveys so much socially relevant information, one’s own gaze behaviour is likely to be affected by whether one’s eyes are visible to a speaker,” said Elan Barenholtz, associate professor of psychology.

For example, people may intend to signal that they are paying attention to a speaker by fixating on the speaker’s face or eyes during a conversation.

“Conversely, extended eye contact also can be perceived as aggressive, and therefore knowing one’s eyes are visible could lead to reduced direct fixation of another’s face or eyes. Indeed, people engage in avoidant eye movements by periodically breaking and reforming eye contact during conversations,” explained Barenholtz.

People are very sensitive to the gaze direction of others and even two-day-old infants prefer faces where the eyes are looking directly back at them.

Social distancing across the globe due to coronavirus (COVID-19) has created the need to conduct business “virtually” using Skype, web conferencing, FaceTime and any other means available.

For the study, published in the journal Attention, Perception & Psychophysics, the team compared fixation behaviour in 173 participants under two conditions: one in which the participants believed they were engaging in a real-time interaction and one in which they knew they were watching a pre-recorded video.


The researchers wanted to know whether face fixation would increase in the real-time condition, based on the social expectation of facing one’s speaker to signal attention, or whether it would lead to greater face avoidance, based on social norms as well as the cognitive demands of encoding the conversation.

Results showed that participants fixated on the whole face significantly more in the real-time condition than in the pre-recorded condition. In the pre-recorded condition, time spent fixating on the mouth was significantly greater than in the real-time condition.

There were no significant differences in time spent fixating on the eyes between the real-time and the pre-recorded conditions.

To simulate a live interaction, the researchers convinced participants that they were engaging in a real-time, two-way video interaction (it was actually pre-recorded).

When the face was fixated, attention was directed toward the mouth for a greater percentage of time in the pre-recorded condition than in the real-time condition.

“Given that encoding and memory have been found to be optimized by fixating the mouth, which was reduced overall in the real-time condition, this suggests that people do not fully optimize for speech encoding in live interaction,” the authors wrote.


--IANS
(This story has not been edited by BDC staff and is auto-generated from a syndicated feed from IANS.)