
Snap Plans to Sell Lightweight, Consumer AR Glasses in 2026
Snap is making a bold move back into the consumer augmented reality (AR) market with plans to launch a new pair of lightweight AR smart glasses, dubbed Specs, in 2026. CEO Evan Spiegel announced the plans at the Augmented World Expo in Long Beach, California, signaling a renewed commitment to bringing AR technology to everyday users. A Snap spokesperson confirmed the 2026 launch date to TechCrunch.
These upcoming Specs aim to blend the advanced AR and artificial intelligence (AI) features found in Snap’s developer-focused Spectacles 5 with a more consumer-friendly design. The key focus is on creating a smaller and lighter form factor, addressing a major hurdle that plagued previous iterations. Snap hopes the sleeker design will make the glasses more appealing and practical for public use, unlike their bulkier predecessors.
The new Specs will feature see-through lenses that overlay interactive digital graphics onto the user’s view of the real world. An integrated AI assistant, powered by Snap’s own technology, will process both audio and video inputs to provide users with contextual information and assistance.
Snap’s initial foray into consumer smart glasses with the first Spectacles in 2016 faced significant challenges, ultimately resulting in poor sales. Now, nearly a decade later, the landscape has changed dramatically. Snap faces stiff competition from tech giants like Meta and Google, both of whom are actively developing their own AR glasses and platforms.
Meta is reportedly gearing up to launch glasses with a built-in screen, codenamed “Hypernova,” in 2025. Google is also making strides in the AR space, recently announcing partnerships with Warby Parker, Samsung, and other companies to develop Android XR smart glasses. These competitive pressures underscore the need for Snap to deliver a compelling and differentiated product.
Snap is banking on its SnapOS developer ecosystem to gain a competitive edge. The company has invested heavily in building this platform, which already boasts millions of AR experiences, known as Lenses, created for Snapchat and Spectacles. Snap says many of these existing Lenses will be compatible with the new Specs, giving users a rich library of content from day one.
During his presentation, Spiegel showcased some of these Lenses, including “Super Travel,” which translates signs and menus in foreign languages, and “Cookmate,” which suggests recipes based on available ingredients and provides step-by-step cooking guidance. These examples highlight the potential of AR to provide practical and engaging experiences.
While companies have been demonstrating similar AR use cases for years, the challenge remains in creating smart glasses that are capable, affordable, and comfortable enough for everyday use. Snap believes it has overcome these hurdles with Specs, though key details remain under wraps: the company has yet to disclose the price, sales strategy, or exact design of the new glasses.
In addition to the new glasses, Snap announced several updates to its SnapOS platform, allowing developers to build apps powered by multimodal AI models from OpenAI and Google DeepMind. The company also introduced a “Depth Module API,” which enables AR graphics from large language models to be anchored in 3D space, enhancing realism and interactivity. Snap is also partnering with Niantic Spatial, the company behind Pokémon Go, to develop AI-powered maps of the world, further enriching the AR experience.
The success of Snap’s Specs will depend on whether these efforts can translate into a compelling product that consumers genuinely want to buy. While Meta has seen early traction with its Ray-Ban Meta smart glasses, Snap’s Specs are likely to be significantly more expensive. To win over consumers, Snap needs to demonstrate the practical value of AR glasses, transforming them from a novelty into an indispensable tool.