While perusing the code for Google Glass's companion Android app, Reddit user Fodawim chanced across several lines that could open up some interesting navigation options for the headset. Labeled 'eye gestures,' the strings suggest the wearable's built-in sensors can detect eye activity and fold it into device input. Two lines mention enabling and disabling eye gestures, implying it'll be an optional feature, while others hint that it would have to be calibrated to your wink before use. Get your well-timed slow wink at the ready, though: the final line spotted suggests a wink gesture can command the 5-megapixel camera to capture whatever you're looking at. Google has already been granted a patent for unlocking a screen using eye-tracking data, although wink-based commands sound a shade easier to deal with -- as long as it doesn't mistake our blinks for winks.
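The decompiled strings only hint at the shape of the feature: register a listener for a wink, then fire the camera. None of Glass's eye-gesture classes are public, so the Java sketch below is purely illustrative -- EyeGestureListener, onWink() and WinkCaptureActivity are hypothetical names, and the stock Android image-capture intent stands in for whatever camera hook Google actually uses.

    import android.app.Activity;
    import android.content.Intent;
    import android.os.Bundle;
    import android.provider.MediaStore;

    // Hypothetical sketch only: these eye-gesture types are NOT a confirmed Glass API.
    public class WinkCaptureActivity extends Activity {

        // Stand-in for whatever callback the decompiled 'eye gestures' code exposes.
        interface EyeGestureListener {
            void onWink();
        }

        private final EyeGestureListener winkListener = new EyeGestureListener() {
            @Override
            public void onWink() {
                // On a detected wink, fall back to the standard capture intent.
                Intent capture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
                if (capture.resolveActivity(getPackageManager()) != null) {
                    startActivityForResult(capture, 1);
                }
            }
        };

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Registering winkListener with a real sensor or gesture service would
            // happen here; no such public registration call exists today.
        }
    }

On real hardware the trigger would presumably come from the headset's eye-facing sensor rather than anything an ordinary Android app can subscribe to, but a listener-and-callback arrangement like the one above is the simplest way to picture wink-to-shoot.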
Filed under: Wearables, Google
Via: Glass-apps
Source: Reddit