It’s a first step, though. Imagine a time when you can look at a point on the screen and trigger a mouse event. Maybe there’s a separate button to click, but the point is that your fat, grubby finger isn’t blocking your view, and you’re not constantly playing hand-eye coordination games with a mouse.
I’ve wanted good eye-tracking technology for nearly a decade, because it’d be so vastly superior to current input methods. It has to be pixel-accurate, though, and that’s always held it back. Sounds like this might be making some progress.
A feature no one needs. (Except those with disabilities, and I think they already have tools for this)
Well, uh, thanks for sharing, I guess.
My name is Kettle, what’s yours?