As I was e-biking to the Ray-Ban Store in SoHo, I realized I wouldn’t have to keep checking my phone for directions if I wore the Meta Ray-Ban Display glasses. When I got there, they were already sold out. So I made my way to Sunglass Hut on 5th Avenue and acquired one of the last sets available in the city on launch day.
The sun was setting behind the trees as I unboxed the band and glasses in the basket of a Citi Bike at 79th Street on the east side of Central Park. I paid $10 to my cellular provider for 1 gigabyte of extra data to download updates for the Meta AI app on iPhone, as well as the latest firmware for the wearables. The box housing the Meta Neural Band fell out first, with text in the packaging telling me to start with the glasses instead. While the hardware updated, I walked the e-bike into the park.
I put on the glasses and band, skipped my way through the tutorial, and began testing with a photo.

Below is my first video shot on the glasses, from the first time I tried the navigation features.
Meta has a long way to go toward supporting the wide range of ways one can get around a city like New York, one of the 28 cities where its pedestrian navigation currently works. The Neural Band’s twist gesture (pinch index finger to thumb, then turn your hand) zooms in and out of maps really well, and I can find my way around pretty effectively using that feature alone.
It was just a few minutes into my active testing when I started to notice distracting reflections in the corner of my vision. While David Heaney is right that I wouldn’t watch a movie with the monocular display, I grew comfortable very quickly with using it for a couple of seconds at a time for all the tasks I outline in this article. What bothered me more were the reflections. By my rough estimate, about 20 percent of the right side of the right lens catches reflections of my surroundings in extremely distracting ways. It’s as if there’s a mirror, even with the display off, reflecting the surroundings into the periphery of one eye. The reflection is fairly clear: I could make out lights or trees flying by. It is quite startling to catch glimpses of something in the corner rushing past in a direction that doesn’t make logical sense, all while trying to keep an eye on the road ahead or on the monocular display. While riding the bike, I was once so startled by something in my periphery that I physically jerked.
Anyone who wears ordinary glasses, and anyone familiar with the see-through optics of earlier smart glasses, may recognize this reflectivity issue. How much it bothers you is likely to vary from person to person. David, for instance, didn’t see the reflections in his demo, which took place in an environment without strong lighting. At the store, I confirmed I see the same reflections on two other sets of the Meta Ray-Ban Display glasses, and I’ve never noticed them on four pairs of display-free Meta glasses.
Moving past this distraction, I made my way slowly toward the middle of the park, sometimes walking the bike, when I heard a man playing saxophone.
After capturing that short video from a distance, I asked him if I could film him with my glasses for $20. I handed him the cash and he asked me what I wanted him to play. I asked for his favorite song.
I recorded two videos from that angle, one minute each, and went along my way again.
I parked the bike on the west side of Central Park and set out on foot toward the Museum of Natural History. My initial interactions with the glasses centered on the physical button on top for activating the camera. But now, back on foot, it was time to learn the Neural Band.
My family and I spent about an hour and a half at the museum and in that time I became proficient with the Meta Neural Band’s easy-to-learn gestures.

I learned how to navigate the menu system by swiping or pressing my thumb against my index finger. A single middle-finger-to-thumb pinch goes back in the menu, and a double pinch turns the display on or off.
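To gather the control scheme in one place, here’s a minimal sketch in Python of the gesture vocabulary as I experienced it across the day. Every identifier below is my own invention for illustration; Meta hasn’t published any of this as an API, so treat it as a summary table, not real code you could run against the band.

```python
# Hypothetical summary of the Neural Band gesture scheme described in this
# article. All names are invented for illustration; none come from Meta.

NEURAL_BAND_GESTURES = {
    "thumb_swipe_on_index": "scroll through the current menu",
    "index_to_thumb_pinch": "select the highlighted item (e.g. 'press' record)",
    "middle_to_thumb_pinch": "go back one menu level",
    "double_middle_to_thumb_pinch": "turn the display on or off",
    "pinch_and_twist": "contextual dial: map zoom, volume, or brightness",
}

def describe(gesture: str) -> str:
    """Return the action this article associates with a recognized gesture."""
    return NEURAL_BAND_GESTURES.get(gesture, "unrecognized gesture: ignored")

if __name__ == "__main__":
    for g in ("index_to_thumb_pinch", "double_middle_to_thumb_pinch", "pinch_and_twist"):
        print(f"{g} -> {describe(g)}")
```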
“Hey Meta, how old is this baby?” I asked.
I was holding my child close to me as we waited for the elevator. He was facing away and Meta’s answer was fairly imprecise. I held my kiddo out a bit further, still facing away, and asked again. Meta answered with both greater precision and accuracy.
Exiting the museum, I started exploring advanced interactions and settings with the Neural Band. I could alter the brightness of the display with that twist gesture. I also used the Neural Band while walking to navigate to the camera and start a video recording. This is when I fell in love with the Neural Band.
Capturing video with these wearables feels more natural than pressing the button on the glasses or asking Meta AI. The display enables framing for photography and videography for the first time in Meta glasses, and the Neural Band enables zoom with that twist gesture too. Glancing at the display provides immediate, ongoing guidance about the precise edges of the frame.
Meta Ray-Ban Display glasses might as well be a LIV camera for the real world. The Neural Band with a display is the difference between wearing a GoPro and making a movie. It helps that starting a video recording with this feedback works the same in both Apple Vision Pro and these Meta glasses: a single index-to-thumb pinch “pressing” a record button viewable only in a face computer.
I connected the glasses to Apple Music on my way into the museum, and on the way out I explored the calling features. Meta requests access to contacts over Bluetooth and in the Meta AI app. The dialog for these permissions didn’t seem to indicate I could be more selective than turning over my entire address book, so I deleted my family members’ contacts from the phone I dedicate to work and agreed to the permissions. After agreeing, the user interface on iPhone made clear I could grant Meta access to exactly one contact instead of all of them. I gave Meta the name and phone number of UploadVR reader and fellow New Yorker James O’Loughlin, and then I called him from the glasses as we crossed in front of the Central Park Great Lawn. A few hours earlier, on the other side of that same grassy expanse, I filmed a saxophone player during golden hour. Now, in the dark, I’m telling James over my glasses and through my iPhone that he might have trouble buying his own pair after he demos them at the Ray-Ban store. Staff there told me they were sold out and had turned away 70 people seeking demos on launch day.
The glasses were down to about 15 percent battery and I asked James if he wanted to try WhatsApp for view sharing. He asked me to text him my WhatsApp number. I pulled out my iPhone and sent it. A few seconds later, I had a text in WhatsApp from James on my glasses display.
We started a call on WhatsApp as I was exiting the east side of Central Park and then I turned on view sharing. As the battery ticked down, James said hi to my spouse walking in front of us. I reached out and pinch-twisted to change call volume, showing James I wielded total control over the voices in my head.
A few blocks from home I said goodbye to James and, satisfied with this first real-world test of the glasses and Neural Band, began playing with the display using the last bit of power. I kept double-pinching my middle finger to my thumb, turning the display on and off again and again, just to confirm it works reliably. It worked every time.
I finally took off the glasses and unflattened their case from my pocket. Placing the glasses inside, I found myself pinching again just to see if I could make the Neural Band vibrate with the glasses not in use. Nope, the band didn’t do anything for me anymore, and that made me sad within seconds of the glasses leaving my head. I was reluctant even to pull the band off to charge it overnight, though that makes no logical sense.
On my second day with the glasses, I downloaded the English language pack to test live captions while talking with James after his own demo at the store. I listened to music on the train home, turning the volume up and down with a twist in the air. I also captured videos while riding an e-bike as the glasses tried to give me pedestrian routing at the same time.
The thick and heavy display glasses my colleague tried under more controlled conditions appear even more flawed to me, due to those distracting peripheral reflections.
And yet, that display breathes new life into the prospect of photography with glasses now that Meta has added a camera viewfinder and zoom controls. I strongly believe headsets like Apple Vision Pro and glasses like Meta’s with a display can profoundly change photography, videography, and media production.
Imagine a future iteration of the Meta Neural Band with 1 terabyte of storage. Imagine the glasses offloading their captures to the band instead of a phone. Imagine a VR headset pulling those videos and photos out of the band for you to edit in VR or hang on your walls. Imagine getting objects or worlds from the band too. Imagine pulling any file you want out from your wrist, like an ace up the sleeve, to hand a copy to a friend.
The Meta Neural Band is the most important technology ever developed by Mark Zuckerberg’s team. It’s his key to unlocking his vision of the metaverse.
The term “golden handcuffs” refers to financial incentives designed to encourage employees to stay with a particular company. A company like Meta or Apple might grant workers restricted stock, for example: ownership in pieces of the company that can only be sold when conditions are met, like performance milestones or a minimum number of years of work.
Meta, like Facebook before it, has a habit of making it effortless to put your personal information into places it can control and analyze, while making it exceedingly hard to get that information out again. With the Neural Band, Meta has made a way to put golden handcuffs on everyone. All it has to do is make the band effortless to use and nearly impossible to give up. I felt that future after one night with a Meta Neural Band on my wrist.