Facebook unveils the future of augmented reality
Mark Zuckerberg recently spoke about the use of VR and AR solutions in professional settings. A very detailed post published by Facebook Reality Labs reveals how the Menlo Park company imagines the future of human-computer interaction: an artificial intelligence-based interface for augmented reality glasses.
AR interaction: problems and solutions
Facebook had already confirmed the arrival of its first Ray-Ban smart glasses, but more technologically advanced devices capable of completely replacing computers and smartphones are expected in the future. The glasses imagined by the Californian company will be an extension of the body: comfortable to wear, easy to use, discreet, safe and private.

The AR interface will be proactive, not reactive. Thanks to artificial intelligence, the glasses will translate the user's intentions into actions, working in a way similar to the human brain. Achieving this goal, however, requires completely rethinking the interaction between humans and computers.
Facebook imagines AR glasses working together with a wristband that detects hand and finger gestures and also provides haptic feedback. Current input technologies (head and eye tracking and the like) are not suitable on their own, so engineers are studying neural input, such as electromyography, to convert the electrical signals travelling through the body into digital commands.
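To give a rough idea of what "converting electrical signals into digital commands" means in practice, here is a minimal, purely illustrative sketch in Python. It is not Facebook's method: the gesture names, templates and the nearest-centroid decoding are assumptions made up for this example, and real wrist-worn EMG decoding is far more sophisticated.

```python
import numpy as np

# Hypothetical gesture "templates": average rectified signal energy per channel.
# These values and gesture names are invented for illustration only.
GESTURE_TEMPLATES = {
    "pinch": np.array([0.8, 0.2, 0.1]),
    "tap":   np.array([0.2, 0.9, 0.3]),
    "rest":  np.array([0.05, 0.05, 0.05]),
}

def features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value per channel over a short window of samples."""
    return np.mean(np.abs(window), axis=0)

def decode_gesture(window: np.ndarray) -> str:
    """Map a window of multi-channel EMG-like samples to the closest template."""
    f = features(window)
    return min(GESTURE_TEMPLATES, key=lambda g: np.linalg.norm(f - GESTURE_TEMPLATES[g]))

# Simulated 200-sample, 3-channel window dominated by activity on channel 0.
rng = np.random.default_rng(0)
window = rng.normal(0.0, [0.8, 0.2, 0.1], size=(200, 3))
print(decode_gesture(window))  # most likely "pinch"
```

The point of the sketch is only the pipeline: raw electrical activity is windowed, reduced to features, and mapped to a discrete command the glasses can act on.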
It is also necessary to develop an interface based on contextual artificial intelligence. The user wearing the glasses should not have to make explicit choices to trigger actions; instead, the relevant information should be shown automatically based on context, taking the surrounding environment into account. An AR interface of this kind, however, will require years of research.
Facebook will preview some contextual AI features in the coming days, such as proactively suggesting a playlist that the user might want to listen to while doing sport.
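The playlist example can be captured in a very simple sketch of context-driven, proactive behaviour. Again, this is only a toy illustration under assumed names (Context, proactive_suggestion, the activity labels); a real contextual AI would fuse many more signals such as vision, location and history.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical snapshot of the user's context; field names are illustrative.
@dataclass
class Context:
    activity: str      # e.g. "running", "commuting", "at_desk"
    time_of_day: str   # e.g. "morning", "evening"

def proactive_suggestion(ctx: Context) -> Optional[str]:
    """Return a suggestion without any explicit user request, or None."""
    if ctx.activity == "running":
        return "Start your workout playlist?"
    if ctx.activity == "commuting" and ctx.time_of_day == "morning":
        return "Play today's news briefing?"
    return None  # stay silent when the context gives no clear signal

print(proactive_suggestion(Context(activity="running", time_of_day="morning")))
```

The key design idea is that the system decides *whether* to surface anything at all from context, rather than waiting for the user to ask.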
Source: Facebook