It’s been an incredible (and tough, I’d say) start of the week: I had a lot of work at home to make it a little more secure against intrusions (and I’m not talking about hackers).
But I was also lucky enough to have a long chat with Frank Rose, contributing editor at Wired Magazine, about the eye-tracking research projects I’m currently working on, focusing on the ones about remote control of television/video content: both are based on remote use of an eye tracker to interact with what is shown on screen. In one of them, the user can unconsciously rotate the real 3D scene by selecting the character or object she prefers.
My opinion is that existing TV remote controls are unusable when the user needs a deeper interaction with the filmed scenes, since she has to look down at the remote to find the buttons that need to be pressed, while — using eye-tracking technology — she could simply keep watching the screen and select what she likes or needs without looking away.
It was a deep 40-minute chat that I won’t be able to summarize here; I’ll just wait for Frank’s piece to be published.