Eyefluence, a startup company based in Milpitas, Calif., is stacked with leadership experience in consumer electronics, engineering and sales. Add to that about 10 years’ worth of eye studies, algorithm development, optical and illumination system design, and field applications, and you have what Jim Marggraff believes is the future of head-mounted displays (HMDs): eye-controlled interactions.

The notion of “transforming intent into action” is something of a tagline for Eyefluence, repeated throughout the company literature as well as by Marggraff himself during his VM Summit presentation. The phrase describes the potential of Eyefluence’s software to allow fast, seamless control of a digital device through movement of only the eyes.

“Eyefluence transforms intent into action through your eyes,” said Marggraff. “The bar that I set was that anything you do with your phone using your finger, you should be able to do using your eyes—and faster. It was a high bar.”

Eyefluence partners with HMD manufacturers and market leaders to engineer and integrate eye-interaction technology into augmented reality (AR), virtual reality (VR) and mixed reality (MR) scenarios. In layman’s terms, the company wants to be the first to bring to the consumer level technology that allows your eyes to do what your hands do with digital devices.

Jim Marggraff believes eye-controlled interactions will be important to the development of HMDs.
According to Marggraff, wearables currently on the market, which allow the use of hands and heads to move and interact within an environment, “are fundamentally incomplete. There are 100 companies developing HMDs right now dealing with one challenge: control. We have hands, head and voice—but what about the eyes?”

A worthy question to pose to an audience of leaders, and perhaps future innovators, of the eyecare industry. “We want to transform intent into action,” Marggraff repeated. Showing VM attendees a real-time video demonstration of his own interaction with Eyefluence’s software, Marggraff made the case that this is no trick of the eye, but a viable use of eye-tracking technology.

“The key thing to take away is: I was using a new eye interaction model. No waiting, no dwell, just looking.”

This technology, Marggraff said, could “improve thinking, communication, socialization…and advance you as a human being. That’s what we’re up to.”

catwolinskiwrites@gmail.com