Snap OS 2.0 Brings The AR Glasses Closer To Consumer-Ready

Snap OS 2.0 is out now, adding and improving first-party apps like Browser, Gallery, and Spotlight to bring the AR platform closer to being ready for consumers.

If you’re unfamiliar, the current Snap Spectacles are $99/month AR glasses for developers ($50/month for students), meant to let them build apps ahead of the consumer Specs product that the company behind Snapchat plans to ship in 2026.

Spectacles have a 46° diagonal field of view, angular resolution comparable to Apple Vision Pro, relatively limited computing power, and a built-in battery life of just 45 minutes. They’re also the bulkiest AR device in “glasses” form factor we’ve seen yet, weighing 226 grams. That’s almost 5 times as heavy as Ray-Ban Meta glasses, for an admittedly entirely unfair comparison.

But Snap CEO Evan Spiegel claims that the consumer Specs will have “a much smaller form factor, at a fraction of the weight, with a ton more capability”, while running all the same apps developed so far. As such, what’s arguably more important to keep track of here is Snap OS, not the developer kit hardware.

Snap OS

Snap OS takes an unusual approach. While it’s Android-based under the hood, you can’t install APKs on it, so developers can’t run native code or use third-party engines like Unity. Instead, they build sandboxed “Lenses”, the company’s name for apps, using the Lens Studio software for Windows and macOS.

In Lens Studio, developers use JavaScript or TypeScript to interact with high-level APIs, while the operating system itself handles low-level core tech like rendering and core interactions. This has many of the same advantages as the Shared Space of Apple’s visionOS: near-instant app launches, consistent interactions, and shared multi-user experiences without extra implementation friction. It even allows the Spectacles mobile app to be used as a spectator view for any Lens.
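For a rough sense of what that looks like in practice, here’s a minimal sketch of a Lens script in Lens Studio’s TypeScript component style. The names used here (@component, BaseScriptComponent, UpdateEvent, getTime) reflect recent Lens Studio versions; treat it as illustrative rather than a copy-paste reference.

```typescript
// Minimal sketch of a Lens script component, assuming Lens Studio's TypeScript API.
@component
export class BobbingObject extends BaseScriptComponent {
  // Scene object to animate, wired up in the Lens Studio Inspector.
  @input
  target: SceneObject;

  private basePos: vec3;

  onAwake() {
    this.basePos = this.target.getTransform().getWorldPosition();

    // UpdateEvent fires every frame; getTime() returns elapsed seconds.
    this.createEvent("UpdateEvent").bind(() => {
      // Bob the object up and down by roughly 2 cm (Lens Studio units are centimeters).
      const offset = new vec3(0, Math.sin(getTime()) * 2, 0);
      this.target.getTransform().setWorldPosition(this.basePos.add(offset));
    });
  }
}
```

Because the operating system owns rendering and core interactions, a short script like this can be a complete Lens behavior.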

Snap OS doesn’t support multitasking, but this is more likely a limitation of the current hardware than the operating system itself.

Since releasing Snap OS in the latest Spectacles last year, Snap has focused on adding more capabilities for developers building Lenses.

For example, in March the company added the ability to use GPS and compass heading to build outdoor navigation experiences, to detect when the user is holding a phone, and to spawn a system-level floating keyboard for text entry.
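As a back-of-the-envelope illustration of what GPS plus compass heading enables (this is generic math, not Snap’s API), here’s the kind of bearing calculation an outdoor navigation Lens would run to point the wearer toward a waypoint:

```typescript
// Hypothetical helpers -- not Snap's API. Generic bearing math an outdoor
// navigation Lens could use once it has a GPS fix and a compass heading.

interface GeoPoint {
  latitude: number;  // degrees
  longitude: number; // degrees
}

// Initial great-circle bearing from `from` to `to`, in degrees clockwise from north.
function bearingTo(from: GeoPoint, to: GeoPoint): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLon = toRad(to.longitude - from.longitude);
  const lat1 = toRad(from.latitude);
  const lat2 = toRad(to.latitude);
  const y = Math.sin(dLon) * Math.cos(lat2);
  const x =
    Math.cos(lat1) * Math.sin(lat2) -
    Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;
}

// Signed turn (-180..180 degrees) from the current compass heading to the waypoint.
function turnToward(headingDeg: number, from: GeoPoint, to: GeoPoint): number {
  const delta = bearingTo(from, to) - headingDeg;
  return ((delta + 540) % 360) - 180;
}
```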

Footage recorded by UploadVR at Snap London showing Gemini multimodal AI integration with depth caching. NOTE: Spectacles records a much wider field of view than it can display. In reality, you can essentially only see virtual content when looking directly at it.

In June, Snap added a suite of AI features, including AI speech-to-text for more than 40 languages, the ability to generate 3D models on the fly, and advanced integrations with the visual multimodal capabilities of Google’s Gemini and OpenAI’s ChatGPT.

This included depth caching for image requests, letting developers anchor information from the responses of those AI models in real-world space.
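Here’s a hypothetical sketch of the idea (the types and helpers are mine, not Snap’s API): cache the depth frame and camera pose at the moment the image is sent, then use them to turn a pixel the model refers to into a world-space position you can anchor content to.

```typescript
// Hypothetical sketch -- not Snap's API. Illustrates depth caching: remember depth
// and camera pose from the capture moment, then unproject the model's 2D answer
// (e.g. a pixel it pointed at) into 3D even after the wearer has moved.

interface CachedCapture {
  depthAt(px: number, py: number): number;                              // meters along the view ray (assumed helper)
  unprojectRay(px: number, py: number): [number, number, number];       // unit ray in camera space at capture time
  cameraToWorld(p: [number, number, number]): [number, number, number]; // capture-time camera pose
}

// Turn a pixel the AI referenced into a world-space anchor position.
function anchorFromPixel(cache: CachedCapture, px: number, py: number): [number, number, number] {
  const depth = cache.depthAt(px, py); // cached at capture time, not "now"
  const ray = cache.unprojectRay(px, py);
  const pointInCamera: [number, number, number] = [ray[0] * depth, ray[1] * depth, ray[2] * depth];
  return cache.cameraToWorld(pointInCamera);
}
```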

Snap OS 2.0

Now, with Snap OS 2.0, rather than just focusing on the developer experience, the company is rounding out its first-party software offering as a step towards next year’s consumer Specs launch.

Snap OS 2.0’s upgraded browser, with Travel Mode enabled. NOTE: Spectacles records a much wider field of view than it can display. In reality, you can essentially only see virtual content when looking directly at it.

Snap OS now has a Travel Mode which, when enabled, makes positional tracking work properly on moving vehicles. Apple was the first to ship this feature, offering it on Vision Pro at launch, and Meta and Pico have since followed. Snap says its version should work on planes, trains, and even cars.

The Snap OS web browser has been overhauled to make it “faster, more powerful, and easier to use”. It now has widgets, bookmarks, and the ability to resize the window to any aspect ratio you want.

Snap’s browser also now supports WebXR, bringing a host of cross-platform web-based experiences to the glasses. Keep in mind that the limited onboard compute means this WebXR support is intended for relatively simple AR experiences, not expansive immersive games.
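In practice, that means standard WebXR code can target the glasses. Here’s a minimal, generic session check and request (plain WebXR, nothing Snap-specific); which optional features are actually granted depends on the device and browser.

```typescript
// Minimal WebXR feature check and session request. This is the standard WebXR
// Device API, not a Snap-specific interface.

async function startAR(): Promise<void> {
  const xr = (navigator as any).xr; // typed properly if @types/webxr is installed
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported in this browser");
    return;
  }

  const session = await xr.requestSession("immersive-ar", {
    optionalFeatures: ["hit-test", "hand-tracking"], // granted only if the device supports them
  });

  // WebGL layer, reference space, and the frame loop would be set up here;
  // lightweight scenes suit the glasses' limited onboard compute.
  session.addEventListener("end", () => console.log("AR session ended"));
}
```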

The Snap OS Gallery app. NOTE: Spectacles records a much wider field of view than it can display. In reality, you can essentially only see virtual content when looking directly at it.

The new Gallery lens lets you view your Spectacles captures and share them with others on Snapchat. However, the glasses still only record 30 seconds of footage from inside a Lens, and still don’t support general image and video capture.

The Gallery lens also doesn’t show the Snapchat images and videos captured on your phone, as you might expect it to. I asked Snap about this, and they said it was an interesting idea.

The Snap OS Spotlight app. NOTE: Spectacles records a much wider field of view than it can display. In reality, you can essentially only see virtual content when looking directly at it.

The other new first-party app in Snap OS 2.0 is Spotlight, the Snapchat phone app’s equivalent of TikTok and Instagram Reels. The current Spectacles hardware has a field of view that’s taller than it is wide, making it well suited to watching vertical video like this.

Synth Riders

Alongside the announcement of the new operating system, Snap also revealed that the XR rhythm game Synth Riders is coming to its platform.

I was able to try Synth Riders on Spectacles at Snap London. While it isn’t the full Synth Riders experience you get on Quest, PC VR, and Apple Vision Pro, it’s a fun adaptation that shows how developers are learning to make use of Snap’s tools.

Footage recorded by UploadVR at Snap London. Unfortunately, Spectacles can only record for 30 seconds. NOTE: Spectacles records a much wider field of view than it can display. In reality, you can essentially only see virtual content when looking directly at it.

As an aside, note how the virtual object casts light onto the furniture below it. That’s because Snap OS provides a continuous scene mesh to Lenses, letting them fit naturally into real-world geometry.
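Here’s a hypothetical sketch, not a specific Snap API, of the kind of query that scene mesh enables: cast a ray from the wearer’s view against the reconstructed geometry and place (or light) virtual content flush with the real surface it hits.

```typescript
// Hypothetical sketch -- raycastSceneMesh is an assumed platform helper, not a
// documented Snap API. It illustrates querying a continuous scene mesh so virtual
// content can sit on, and light, real surfaces.

interface MeshHit {
  position: [number, number, number]; // world-space point on the scene mesh
  normal: [number, number, number];   // surface normal at that point
}

// Assumed to be provided by the platform's scene-mesh / world-query layer.
declare function raycastSceneMesh(
  origin: [number, number, number],
  direction: [number, number, number]
): MeshHit | null;

// Place content where the wearer is looking, aligned to the surface it lands on.
function placeOnSurface(
  origin: [number, number, number],
  direction: [number, number, number]
): MeshHit | null {
  const hit = raycastSceneMesh(origin, direction);
  if (hit) {
    // A Lens would snap an object to hit.position, orient it along hit.normal,
    // and let its lighting affect the surrounding mesh so it looks grounded.
  }
  return hit;
}
```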

