The race to put intelligent augmented reality glasses on your face is heating up. Snap's Spectacles are being rebranded as "Specs" and will launch as lighter, more powerful AR glasses in 2026.
CEO Evan Spiegel announced the new Specs at the AWE XR event, promising smart glasses that are smaller, considerably lighter, and with "a ton more capability."
The company has not given a specific date or price, but the 2026 launch window puts Meta on notice, as it is busy preparing its much-anticipated Orion glasses for 2027. Snap Specs also look set to go head-to-head with the Samsung/Google Android XR glasses, which are likewise expected in 2026.
As for what consumers can expect from Specs, Snap is building them on the same Snap OS used in its fifth-generation Spectacles (and they will likely still use a pair of Qualcomm Snapdragon XR chips). That means the interface and interaction metaphors, such as gesture-based controls, will carry over. But a large number of new features and integrations, including AI, will begin appearing this year, well before Specs arrive.
Platform upgrade
Spiegel framed the updates by first revealing that Snap began working on glasses "before Snapchat was even a thing," and that the company's essential goal is "to make computers more human." He added that "with advances in AI, computers think and act more like humans than ever."
Snap's plan with these Snap OS updates is to bring AI platforms into the real world. It is bringing Gemini and OpenAI models to Snap OS, which means certain multimodal capabilities will soon be part of the fifth-generation Spectacles and, ultimately, Specs. These tools could be used for on-the-fly text translation and currency conversion.
The updated platform also adds tools for Snap Lens developers that integrate with the waveguide-based display capabilities of Spectacles and Specs.
A new Snap3D API, for example, will let developers use generative AI to create 3D objects inside Lenses.
The updates will also include an AI Depth Module, which can interpret 2D information to build 3D maps that help anchor virtual objects in the three-dimensional world.
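Snap has not published the Depth Module's API, so the sketch below only illustrates the underlying idea: once software has a per-pixel depth estimate, it can un-project a 2D pixel into a 3D point using standard pinhole-camera intrinsics, and an anchor placed at that point stays glued to the real surface. The `Intrinsics` type and `unproject` function are hypothetical names for illustration, not Snap OS APIs.

```typescript
// Pinhole-camera intrinsics: focal lengths and principal point, in pixels.
interface Intrinsics {
  fx: number;
  fy: number;
  cx: number;
  cy: number;
}

// Convert a pixel (u, v) with a measured depth (in meters) into a 3D
// point in camera coordinates. This is the core operation behind
// turning a 2D depth map into a 3D anchor position.
function unproject(
  u: number,
  v: number,
  depth: number,
  k: Intrinsics
): [number, number, number] {
  const x = ((u - k.cx) / k.fx) * depth;
  const y = ((v - k.cy) / k.fy) * depth;
  return [x, y, depth];
}
```

For example, with a 640×480 image and the principal point at its center, the center pixel at 2 m of depth un-projects to the point two meters straight ahead of the camera.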
Companies deploying Spectacles (and eventually Specs) can look forward to a new Fleet Management app, which will let developers manage and monitor multiple devices at once, plus the ability to deploy guided navigation on Specs, in a museum, for example.
Later, Snap OS will add WebXR support for building AR and VR experiences inside web browsers.
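WebXR is an existing W3C standard, so a browser on Specs would expose the familiar WebXR Device API. Snap has not said which session modes or modules its browser will support; the helper below (`pickSessionMode` is my own illustrative function, not a Snap or WebXR API) just shows the standard feature-detection pattern a web app would use, preferring see-through AR and falling back gracefully.

```typescript
// Minimal slice of the WebXR Device API that feature detection needs.
interface XRLike {
  isSessionSupported(mode: string): Promise<boolean>;
}

// Pick the richest session mode the runtime reports: "immersive-ar"
// for see-through AR glasses, "immersive-vr" as a fallback, and
// "inline" (always available per the WebXR spec) as a last resort.
async function pickSessionMode(xr: XRLike): Promise<string> {
  for (const mode of ["immersive-ar", "immersive-vr"]) {
    if (await xr.isSessionSupported(mode)) {
      return mode;
    }
  }
  return "inline";
}
```

In a real page this would feed directly into the standard API, e.g. `navigator.xr.requestSession(await pickSessionMode(navigator.xr))`.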
Let’s make it interesting
Spiegel said that, through Snapchat Lenses, Snap has the largest AR platform in the world: "People use our AR Lenses in our camera 8 billion times a day."
That's a lot, but it's almost entirely on smartphones. For now, only developers are using the bulky Spectacles and their Lens capabilities.
A consumer release of Specs could change that. When I tried the Spectacles last year, I was impressed by the experience and found them, though not as good as Meta's Orion prototype glasses (the lack of eye tracking stood out to me), full of potential.
A lighter form factor that approaches or exceeds what I found with Orion, and what I saw in some Samsung Android XR glasses, could vault Snap Specs into the lead in AR glasses. Provided, that is, that they don't cost $2,000.