TheMindReport

Why Eye Contact Feels Different—and What Fixations Reveal

Eye contact is a social shortcut. With a glance, we gauge interest, trust, and intent. But for many autistic people, those few seconds can feel complicated: less like a shortcut and more like a traffic jam. A new research paper, “Pattern of fixation explains atypical eye processing during observation of faces with direct or averted gaze in autism (results of the INFoR Cohort),” offers a clear, data-driven explanation of why. Rather than framing this difference as “dislike” or “avoidance,” the study points to a simpler and more powerful idea: where the eyes land first, and how long they stay there, shapes how quickly and accurately people read faces.

Using eye-tracking with 88 autistic participants and 56 non-autistic participants, the INFoR Cohort team asked people to judge whether photographed faces were looking at them or away. The twist was not the task but the lens: the researchers watched fixation patterns, especially how often participants looked at the eyes. They introduced a measure called the Eye Fixations Index (EFI), derived from the number of fixations on the eye region. They also measured response times (RTs)—how quickly people decided where the face was looking.
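For readers who want a feel for the mechanics, here is a minimal sketch of how a fixation-based index like the EFI could be computed from eye-tracking output. It is illustrative only: the fixation fields, the eye-region box, and the proportion formula below are assumptions for the example, not the paper’s actual pipeline.

# Minimal sketch (not the paper's pipeline): one way to compute an eye-fixations
# index from eye-tracking output, assuming the index is the share of fixations
# that land inside an eye-region area of interest (AOI). Field names, the AOI
# box, and the proportion formula are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # horizontal gaze position, in pixels
    y: float            # vertical gaze position, in pixels
    duration_ms: float  # how long the eyes stayed put

# Hypothetical eye-region AOI for a centered face image: (left, top, right, bottom)
EYE_AOI = (300, 200, 700, 320)

def in_aoi(fix: Fixation, aoi: tuple[float, float, float, float]) -> bool:
    left, top, right, bottom = aoi
    return left <= fix.x <= right and top <= fix.y <= bottom

def eye_fixations_index(fixations: list[Fixation]) -> float:
    """Proportion of fixations landing on the eye region (0.0 to 1.0)."""
    if not fixations:
        return 0.0
    on_eyes = sum(1 for f in fixations if in_aoi(f, EYE_AOI))
    return on_eyes / len(fixations)

# Example trial: three fixations, only one on the eyes, so the index is 0.33
trial = [
    Fixation(x=500, y=250, duration_ms=180),  # eyes
    Fixation(x=520, y=480, duration_ms=220),  # mouth
    Fixation(x=150, y=600, duration_ms=150),  # background
]
print(f"EFI for this trial: {eye_fixations_index(trial):.2f}")

A lower value simply means a smaller share of looks went to the eyes; on its own, it says nothing about why.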

The results matter for everyday life. Whether you’re interpreting a coworker’s glance in a meeting or a partner’s expression at dinner, split-second decisions about gaze can influence how conversations unfold. This study shows those decisions are strongly tied to where we look. It reframes a common narrative about autism—from “won’t make eye contact” to “uses a different visual strategy”—and reveals that this strategy is linked to broader attention and executive control, not simply social discomfort.

What the Eye-Tracking Data Revealed About Split-Second Choices

The study’s core finding is straightforward: faces with direct gaze pulled everyone’s attention more than faces with averted gaze. Across the board, eyes that looked straight at the viewer attracted more fixations. But there was a key difference between groups. Autistic participants showed a reduced Eye Fixations Index (EFI), meaning they made fewer fixations on the eye region, and they took longer to decide whether the gaze was direct or averted. These two measures were tightly linked: the fewer the eye fixations, the slower the response.

In numbers, EFI and RTs were strongly, negatively correlated: lower EFI predicted longer decision times. A mediation analysis showed that the slower responses seen in the autistic group were largely explained by their fixation pattern, specifically the reduced eye fixations. In other words, the visual strategy drove the performance difference.
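To make the mediation idea concrete, here is a small sketch of a Baron-Kenny-style check run on synthetic numbers. The data, effect sizes, and model below are invented for illustration and may differ from the study’s actual statistical approach; the question is the same, though: once fixation pattern is accounted for, how much of the group difference in response time remains?

# Illustrative mediation sketch on synthetic data (NOT the study's data or model).
# Question: does fixation strategy (an EFI-like score) explain the group gap in RTs?
import numpy as np

rng = np.random.default_rng(0)
group = np.r_[np.ones(88), np.zeros(56)]           # 1 = autistic, 0 = non-autistic (group sizes from the article)
efi = 0.6 - 0.2 * group + rng.normal(0, 0.1, 144)  # assumed: lower EFI in group 1
rt = 800 - 400 * efi + rng.normal(0, 50, 144)      # assumed: lower EFI -> slower responses (ms)

print("correlation(EFI, RT):", round(np.corrcoef(efi, rt)[0, 1], 2))  # strongly negative

def ols(y, predictors):
    """OLS coefficients for y ~ intercept + predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

c_total = ols(rt, [group])[1]                    # total group effect on RT
a_path = ols(efi, [group])[1]                    # group effect on EFI
b_path, c_direct = ols(rt, [efi, group])[1:3]    # EFI and group entered together

print(f"total group effect:  {c_total:6.1f} ms")
print(f"direct group effect: {c_direct:6.1f} ms (controlling for EFI)")
print(f"indirect via EFI:    {a_path * b_path:6.1f} ms")

In this toy setup the “direct” piece shrinks toward zero once eye fixations are taken into account, which is what “largely explained by fixation pattern” means in practice.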

The study also linked EFI to anticipatory saccades in basic eye-movement tasks. Anticipatory saccades are quick, predictive eye jumps—like your eyes moving to where a ball is headed before it gets there. Participants with more anticipatory saccades tended to show a particular pattern of fixations during face viewing, suggesting that fundamental oculomotor control plays a role in social viewing strategies.

Finally, slower response times (RTs) were associated with higher scores on attention and executive function measures (ADHD-RS and BRIEF) and with greater autism-related social differences (SRS-2). Notably, RTs were not linked to social anxiety. So in realistic moments—like deciding if a manager’s look invites a comment or signals “hold that thought”—what seems like a social hesitation may instead reflect broader attention and control systems interacting with visual strategy.

From Eye Contact to Search Strategy: What Drives the Difference?

These findings challenge a common interpretation: that reduced eye contact in autism is mainly about discomfort or avoidance. The data point to attentional control and oculomotor strategy. The reduced Eye Fixations Index (EFI) was not random; it closely tracked with anticipatory eye movements and predicted how quickly people made judgments. That suggests that what’s happening is less about “won’t look” and more about “looks differently.”

Past research has shown that autistic individuals often spend less time looking at eyes and more time sampling other informative regions (like the mouth or objects in the scene). Some theories emphasize reduced “social reward” for eye contact, while others highlight differences in sensory sensitivity or prediction. This study adds an important layer: eye behavior during face viewing is connected to the basic mechanics of how eyes move and how attention is managed. If your system is geared to anticipate and sample efficiently across a scene, you might allocate fewer fixations to the eyes—especially when the face’s gaze direction can be inferred from other cues.

Consider a concrete example: in a group discussion, a team member glances toward you. A non-autistic person may reflexively lock on the eyes and quickly decide, “They’re addressing me.” An autistic person may sample the eyebrows, head tilt, or edges of the mouth before returning to the eyes. That broader scan can take a beat longer, producing a slower response without implying disinterest. The mediation result is crucial here: group differences in speed were largely explained by fixation strategy. That finding is consistent with attention-control models in psychology, in which where you look and how you regulate your gaze guide what you process next.

Importantly, the study reports that slower RTs related to ADHD and executive function measures, and to autism-related social behaviors—but not to social anxiety. This helps separate the idea of “eye avoidance due to anxiety” from “a different, stable visual-attention strategy.” While social anxiety can influence eye behavior in many contexts, it didn’t explain performance here. That narrows the target for support: improving gaze-related processing speed may be more about tuning attention and eye-movement habits than treating anxiety.

Turning Insight into Action: Training Attention, Not Forcing Stares

What can we do with this insight? First, in clinical and educational settings, shift from “make eye contact” to “optimize information gathering.” Coaches, therapists, and teachers can encourage brief, purposeful glances at the eyes when needed—then allow comfortable scanning. Micro-goals like “two quick looks at the eyes before answering” respect autonomy while improving efficiency for tasks that depend on gaze.

Second, consider attention and oculomotor training. Simple exercises that practice stabilizing gaze, timing saccades, or shifting attention can strengthen the systems tied to the Eye Fixations Index (EFI). Gamified tools that reward quick, accurate decisions about gaze direction—as opposed to prolonged staring—are likely a better fit. For example, a tablet game could flash faces briefly and train players to sample the eyes swiftly before scanning other features.

Third, design environments that reduce time pressure during face-to-face exchanges. In classrooms and workplaces, allow alternative signals—like verbal check-ins or brief hand cues—so that understanding doesn’t hinge on split-second gaze judgments. If you’re a manager leading a hybrid meeting, say the person’s name before directing a question. This removes the guesswork from interpreting eye direction on a video grid.

Fourth, apply the findings to assessment. Because the research ties RTs to executive function (BRIEF), attention (ADHD-RS), and social responsiveness (SRS-2), clinicians can interpret slower performance not as a single “social” issue but as a multi-factor pattern. Eye-tracking measures like EFI could complement existing tools, guiding personalized interventions that address attention control along with social-cognitive goals.

Finally, for families and partners, reframe expectations. If a loved one looks away briefly before answering, they may be running a different, effective visual strategy. A gentle pause and clear verbal cues can make conversations smoother without pushing uncomfortable levels of eye contact.

A Small Shift in Gaze, A Big Shift in Understanding

The INFoR Cohort’s eye-tracking work shows that differences in social viewing in autism are rooted in patterns of fixation rather than simple avoidance, and that these patterns shape how quickly people read gaze. The link between the Eye Fixations Index (EFI), response times (RTs), and measures of attention and executive function reframes support targets: help people tune strategies, don’t force stares. As the research paper “Pattern of fixation explains atypical eye processing during observation of faces with direct or averted gaze in autism (results of the INFoR Cohort)” suggests, small, well-timed looks can do more than prolonged eye contact. The question going forward is practical: how can schools, clinics, and workplaces build cues and tools that respect different visual strategies while keeping social decisions quick and clear?

Data in this article is provided by PLOS.
