- Meta is developing its Aria Gen 2 smart glasses, which are packed with sensors and AI features
- The smart glasses can track your gaze, your movement, and even your heart rate to assess what is happening around you and how you feel about it
- The smart glasses are currently used to help researchers train robots and build better AI systems that could be incorporated into consumer smart glasses
The Ray-Ban Meta smart glasses are still relatively new, but Meta is already stepping up work on its new Aria Gen 2 smart glasses. Unlike the Ray-Bans, these smart glasses are for research purposes only, but they are packed with enough sensors, cameras, and processing power that what Meta learns from them will likely be incorporated into future wearables.
Project Aria's research tools, including the new smart glasses, are used by people working on computer vision, robotics, or any relevant hybrid of contextual AI and neuroscience that catches Meta's attention. The idea is for developers to use these glasses to design more effective methods for teaching machines to navigate, contextualize, and interact with the world.
The first Aria smart glasses were released in 2020. Aria Gen 2 is much more advanced in both hardware and software. The glasses are lighter, more precise, pack more power, and look much more like the glasses people wear in everyday life, though you wouldn't mistake them for a standard pair.
The four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so they can tell both how far your coffee cup is from your keyboard and where a drone's landing gear could touch down. That's just the start of the sensory equipment in the glasses, which includes an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector built into the nose pad that can estimate your heart rate.
Future face wearables
There is also plenty of eye-tracking technology, capable of telling where you are looking, when you blink, how your pupils change, and what you are focusing on. The glasses can even track your hands, measuring joint movement in a way that could help train robots or teach gestures. Combined, the glasses can understand what you are looking at, how you are holding an object, and whether what you see is raising your heart rate because of an emotional reaction. If you are holding an egg and spot your sworn enemy, the AI might be able to work out that you want to throw the egg at them and help you aim with precision.
As noted, these are research tools. They are not for sale to consumers, and Meta has not said whether they ever will be. Researchers must apply for access, and the company is expected to start taking applications later this year.
But the implications are much bigger. Meta's plans for smart glasses go far beyond checking messages. The company wants to link human interactions with the real world to machines, teaching them to do the same. Theoretically, these robots could watch, listen to, and interpret the world around them the way humans do.
That won't happen tomorrow, but the Aria Gen 2 smart glasses prove it is much closer than you might think. And it is probably only a matter of time before a version of Aria Gen 2 ends up on sale to the average person. You will have this powerful brain sitting on your face, remembering where you left your keys and sending a robot to fetch them for you.