Revolutionizing the Future: Real Augmented Reality Glasses

November 27, 2024 | by ranazsohail@gmail.com

This might be the clearest look into the future of tech I’ve seen. I’m one of the very few people who’ve had the chance to try out two of the newest and rarest augmented reality glasses of 2024: the Meta Orion Smartglasses and the Snapchat AR Spectacles. These aren’t available to the public yet, and you’ll see why in a minute. But what’s impressive is how different each one is, while still being pretty incredible in its own way.

A few months ago, I made a video explaining this idea, but just to recap: VR headsets are over here, smartglasses are over there, and both are moving toward the same middle ground—augmented reality glasses. VR headsets are packed with amazing tech, incredible immersion, and a wide field of view. But they’re also huge, and let’s be honest, you’re not exactly going to walk around in public with one on. Smartglasses, on the other hand, are much sleeker and could easily pass as regular glasses, but they can’t hold much more than a camera, some batteries, speakers, and a tiny computer inside.

So, VR headsets are trying to shrink down until they can fit into something that looks like regular glasses, while smartglasses are working to add more tech without losing their everyday appeal. Somewhere in the middle, there’s the dream of augmented reality glasses.

Meta Orion Project

What if we could get a glimpse of what the future of augmented reality might look like with today’s technology? That’s what we’re looking at here. Meta and Snapchat have each taken very different paths to bring these ideas to life, and while neither of these projects is ready for public release (and probably shouldn’t be just yet), both are really exciting. After trying them both out, I find myself comparing the two.

Let’s start with Meta’s Orion project. They introduced it at their Connect event a few weeks ago, and since then, a few people have had the chance to try it out. The Orion system is made up of three parts: glasses you wear, a wireless computer puck that needs to stay within 15 feet of the glasses, and a wrist strap that measures electrical impulses in your arm, which acts as the input device. Yes, you read that right.

Together, these three components create an augmented reality experience that feels like something out of science fiction. While there have been other AR projects like Magic Leap and HoloLens, the feeling of wearing a pair of simple, transparent glasses that overlay digital objects onto the real world in front of me is something entirely different.

One of the challenges with making a video about these devices is that you can’t record the experience directly. It’s not like a normal screen recording because the experience is all about me looking through the glasses and seeing the digital content mixed with the real world. The best I could do was film a first-person video and overlay the graphics from the glasses on top, giving you a rough idea of what it looks like, but it’s still hard to do it justice.

Basic Usage
With the Meta Orion glasses, I got to try out three different demos. The first was just basic usage. Imagine sitting in a coffee shop or on a park bench, scrolling through Instagram, but instead of looking at your phone, the feed appears as a floating window in mid-air—only visible to you. I also played around with multitasking, having a video call in one window while other floating windows showed messaging and Instagram. It’s a simple concept, but it feels pretty amazing.

The glasses feel really light on my face, weighing only about 100 grams. As I’m watching Instagram Reels, the audio plays through built-in speakers just above my ears, and I scroll through the content with a simple gesture—just using my thumb and swiping on my hand.

You might think that the glasses’ cameras and sensors are tracking my hand and syncing the scroll to that gesture. But that’s not actually how it works. The hand tracking isn’t responsible for the scroll. This gesture could be detected even if my hand was in my sweatshirt pocket or behind my back, thanks to the wristband I’m wearing. And honestly, this wristband might be the coolest tech I’ve tried in a long time. It’s an EMG wristband they’ve developed.

EMG stands for electromyography. The wristband is about the size of a WHOOP, with electronics woven into the fabric and an onboard machine-learning computer that connects to the glasses over Bluetooth. It measures the electrical signals your brain sends to your fingers. Here’s why that works: the muscles that move your hand are driven by nerves running down your arm, and when you make a gesture, the pattern of electrical impulses traveling from your brain to your hand is unique to that movement. A pinch produces a different pattern than a swipe, and so on.

The wristband detects those impulses and uses them to control the glasses. Even in the prototype version I’m using, it’s impressively accurate—about 80%—and it gives haptic feedback to let you know when it’s working correctly. The folks at Meta, including CTO Boz, are really excited about this new input method. They think it has huge potential and could evolve in big ways, possibly even letting you “write” in midair with an imaginary pen, turning your electrical impulses into text. It’s an amazing idea with a lot of promise.
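
To make the signal-to-gesture idea a bit more concrete, here’s a rough sketch of how a classifier along these lines could work in principle. It’s purely illustrative and is not Meta’s pipeline: the electrode count, window size, gesture names, and the simple nearest-centroid classifier are all assumptions made for the sake of the example.

```python
# A purely illustrative sketch of the idea behind EMG gesture detection, not
# Meta's actual pipeline. The electrode count, window size, gesture names, and
# the nearest-centroid classifier are all assumptions made for this example.
import numpy as np

CHANNELS = 8    # hypothetical number of electrodes around the wrist
WINDOW = 200    # samples per classification window

def features(window):
    """Per-channel RMS amplitude: a crude summary of muscle activation."""
    return np.sqrt((window ** 2).mean(axis=0))

def train_centroids(examples):
    """Average the feature vectors of labeled example windows for each gesture."""
    return {g: np.mean([features(w) for w in ws], axis=0) for g, ws in examples.items()}

def classify(window, centroids):
    """Pick the gesture whose centroid is closest to this window's features."""
    f = features(window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))

# Fake training data: random signals whose intensity differs per gesture.
rng = np.random.default_rng(0)
gestures = ["rest", "pinch", "thumb_swipe"]
examples = {g: [rng.normal(0, 0.1 + 0.2 * i, (WINDOW, CHANNELS)) for _ in range(20)]
            for i, g in enumerate(gestures)}
centroids = train_centroids(examples)

live_window = rng.normal(0, 0.5, (WINDOW, CHANNELS))  # pretend this came off the band
print(classify(live_window, centroids))                # most likely "thumb_swipe"
```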

So, either way, the glasses work really well for scrolling through Instagram. I can see the app through the lenses, and because of the eye tracking, whatever window I’m looking at is the one I’m controlling. The gesture scrolling works perfectly. It’s a cool feature on its own, but that’s just the beginning.

The second demo was even more impressive. I walked up to a table full of ingredients, looked at them, and made a gesture. Then I asked the AI built into the glasses what kind of smoothie I could make with those ingredients. The cameras on the front of the glasses picked up everything—Meta’s AI recognized the labeled ingredients and the obvious fruits on the table and quickly suggested, “You could make a pineapple smoothie with matcha!” It’s a good suggestion, sure, but what really blew me away was the AR touch. Little blue dots appeared on the ingredients, labeling them, and those labels stayed attached to the objects as I moved around and looked at them. It’s a small detail, but it made a huge difference. It’s hard to capture in a video, but just imagine seeing those labels hover over things in real life through your glasses. It felt like something straight out of a sci-fi movie.
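
If you’ve never seen world-locked AR before, the underlying trick is worth spelling out: each label is pinned to a fixed 3D point, and the glasses re-project that point into your view every frame as your head moves. Here’s a deliberately simplified sketch of that re-projection step; the pinhole-camera numbers and the “pineapple” anchor position are invented for illustration and say nothing about Meta’s actual rendering stack.

```python
# A toy illustration of why AR labels "stay attached" to objects: each label is
# stored as a fixed point in world space, and every frame it is re-projected
# through the current head pose. This is a generic pinhole-camera sketch, not
# Meta's renderer; the pose and camera numbers below are invented.
import numpy as np

def project(world_point, head_pose, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Transform a world-space point into the headset's view and project it to pixels."""
    R, t = head_pose                  # world-to-camera rotation (3x3) and translation (3,)
    p_cam = R @ world_point + t
    if p_cam[2] <= 0:                 # behind the wearer: label not visible this frame
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v)

# One anchored label, e.g. "pineapple", pinned to a point on the table (meters).
label_world = np.array([0.2, -0.1, 1.5])

# Frame 1: looking straight at the table.
pose_a = (np.eye(3), np.zeros(3))
# Frame 2: the wearer has stepped 30 cm to the left, so the world shifts right in view.
pose_b = (np.eye(3), np.array([0.3, 0.0, 0.0]))

print(project(label_world, pose_a))   # the label is drawn near the center
print(project(label_world, pose_b))   # same label, new on-screen position
```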

Then came the third demo, which was probably the most fun—especially because it was a shared experience. Two of us, both wearing the glasses, walked up to a QR code in the center of the room. After staring at it for a few seconds, that QR code became the anchor for a shared 3D experience—in this case, a game of 3D Pong. With sensors on the front of the glasses, they tracked my hand movements and mapped them to a paddle in the game, letting me hit the ball back and forth. Honestly, I got pretty good at it—sorry, Ellis, but competition is competition!

It was just Pong, but in real life, and only the people wearing the glasses could see it. It was hilarious because we were having a blast, but to anyone else, it probably looked pretty strange. There’s a ton of complex tech behind these glasses—from the sensors that track your eyes and the world around you, to the micro-LED projectors inside, the waveguides, and even the special silicon carbide material that lets the light bend at extreme angles without distorting. It’s a lot of advanced engineering, but it all comes together to make it feel like something out of the future.
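
That QR code is doing the job of a shared spatial anchor, and the idea behind it is simple enough to sketch. Assuming nothing about how Meta actually implements it: each headset figures out where the code sits in its own coordinate system, then positions the game objects relative to the code, which makes them line up in the real room for both players. The 2D poses below are invented for illustration.

```python
# A bare-bones sketch of why a QR code works as a shared anchor: each headset
# estimates where the code sits in its own coordinate frame, and the game then
# places objects relative to the code. Both devices agree on where the ball is
# in the physical room even though their internal coordinates differ. The 2D
# poses here are made up purely for illustration; this is not Meta's actual API.
import numpy as np

def make_pose(angle_deg, tx, ty):
    """A 2D rigid transform (rotation + translation) as a 3x3 homogeneous matrix."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), tx],
                     [np.sin(a),  np.cos(a), ty],
                     [0.0,        0.0,       1.0]])

# Each headset sees the QR code somewhere different in its own frame.
qr_in_headset_a = make_pose(10, 1.0, 2.0)
qr_in_headset_b = make_pose(-35, -0.5, 3.0)

# The game puts the ball 40 cm in front of the code, expressed in the code's frame.
ball_in_qr = np.array([0.4, 0.0, 1.0])

# Each device maps that shared position into its own coordinates for rendering.
print(qr_in_headset_a @ ball_in_qr)   # where headset A draws the ball
print(qr_in_headset_b @ ball_in_qr)   # where headset B draws it: same physical spot
```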

I had an in-depth conversation with Meta’s CTO, Boz, and we’re putting that whole segment on the Waveform podcast. It should be out by the time you’re watching this, but I’ll drop a link below so you can subscribe and dive into the details with us.

Here’s the main thing you need to know: these glasses are packed with some seriously advanced tech. They’ve got seven small sensors and cameras, all custom-designed for tracking your eyes and the environment around you. Inside, there’s custom silicon pulling all that data together, and the batteries are split up to distribute the weight evenly across your face. The frames themselves are made of magnesium, which keeps the lenses aligned and doubles as a heat sink to help manage the internal computer’s temperature.

To really understand how cramped everything is inside, Meta made a working see-through version. But here’s the catch—the transparent plastic isn’t as good at dissipating heat, so it overheats faster. This shows how thermally constrained the glasses are using today’s technology. Battery life is about two to three hours, and that’s not even counting the separate compute puck with a co-processor that handles all the app logic. It’s a technical marvel, really.

Even with all that, the glasses are still pretty lightweight and comfortable to wear for a couple of hours, though they do get warm and a bit heavy around the ears. The augmented reality graphics themselves are tracked really well—not the highest resolution, but definitely solid. With a 70-degree field of view, you can look around, and the graphics generally stay in your line of sight, though they do start to get cut off at the edges. Honestly, in real life, they felt like wearing just a slightly thicker, slightly heavier pair of glasses with a touch of tint and some added flair. But still, it was the most convincing demo of a post-smartphone AR future I’ve ever experienced.

Honestly, none of this matters right now because Meta’s not planning to release it. It’s a pretty strange move for a tech company to show off a new product and demo it, only to announce that they don’t actually plan on selling it. But I guess it’s more about PR. After watching my chat with Boz, the main thing I took away is that Meta’s focus is on continuing to work behind the scenes on improving the product, without the distraction of marketing or packaging the first version. They want to refine it and make a second or third version that’s actually ready to go to market. The idea is that the next iteration could be brighter, last longer on a charge, and possibly have better resolution, so it’s something people would actually want to buy. Of course, there’s also the issue of cost—this prototype is expensive, especially with the specialized materials like waveguides and silicon carbide. But overall, that’s the plan, and I actually think it’s a good one.

Snapchat AR

These are the Snapchat AR Spectacles, and right off the bat, they look a lot more like a piece of tech on your face. But functionally, they’re doing the same thing as Meta’s glasses. I can look through them at the real world, and in front of me, there’s a menu (you can’t see it, but I can) with digital overlays, and it’s pretty impressive that it all works. But there are a couple of key differences between these and Meta’s glasses that stand out.

First off, there’s no separate computer puck. Everything’s built right into the glasses. That’s probably why these feel bulkier than Meta’s version. Meta’s glasses looked more like regular glasses, but they had a separate computer, the size of a smartphone, constantly tethered to them. These Snapchat Spectacles, on the other hand, can connect to your phone but don’t need any extra hardware to work. Everything’s in the glasses, so you’re essentially wearing all the tech right on your face.

The second difference is in the materials and build. Most of the frame is made of black plastic, which might sound cheap, but it’s actually a smart move. It helps keep the glasses lightweight. They’re still sturdy and well-built, but the plastic keeps them from feeling heavy. The metal band on the sides is there to help dissipate heat from the hottest parts of the device, making sure the tech inside stays cool while you’re wearing them.

These glasses weigh 228 grams, and once you put them on, it’s hard to ignore just how big they are. They’re massive. The arm extends way past my ear, and there’s a lot of bulk on the front of my face. They’re designed to be more balanced, which is great, and while it’s nice that there’s no separate computer puck, these are still today’s tech, so there’s quite a bit going on.

Now, let’s talk about what happens when you actually turn them on. There are two major differences with the display: the resolution and the field of view. It’s tough to show exactly what I’m seeing on video since there’s no eye tracking—everything’s gesture-based. But with the Snapchat glasses, the resolution of the menus is noticeably sharper. It’s much clearer than the pixelated Meta glasses, almost like what you’d see with the Vision Pro. But on the flip side, the field of view is much smaller. The number I’ve seen is 46 degrees, compared to Meta’s 70 degrees, and honestly, you can really feel that difference.
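
To put those two numbers side by side: the usable width of the view grows with the tangent of half the field of view, so the gap between 70 and 46 degrees is bigger than it sounds. Here’s a quick back-of-the-envelope calculation, treating both specs as horizontal FOV, which is a simplification on my part since these figures are often quoted diagonally.

```python
# Rough arithmetic on what those field-of-view numbers mean in practice: the
# width of the virtual "window" at a given distance is 2 * d * tan(fov / 2).
# Treating both specs as horizontal FOV is a simplification, since these
# figures are often quoted diagonally.
import math

def visible_width(fov_deg, distance_m):
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

for name, fov in [("Meta Orion (70 deg)", 70), ("Snap Spectacles (46 deg)", 46)]:
    print(f"{name}: about {visible_width(fov, 1.0):.2f} m wide at 1 m")
# Meta Orion (70 deg): about 1.40 m wide at 1 m
# Snap Spectacles (46 deg): about 0.85 m wide at 1 m
```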

The smaller field of view isn’t a dealbreaker when you’re looking straight ahead, but it’s definitely something you notice. This is one of the biggest differences when you compare these glasses to Meta’s. I’m not sure “immersion” is quite the right word, but here’s what I mean: when you’re using the glasses—whether you’re interacting with an app or looking at something in the overlay—you’re not really thinking about the field of view. You’re just focused on what you’re doing, looking straight ahead.

But as soon as you start moving your head around or trying to see more of what’s in front of you, you start noticing the limitations. You can see parts of the UI getting cut off at the edges of some apps. I noticed this with Meta’s glasses, which have the larger 70-degree field of view, and here it’s even more obvious. These glasses are really only going to show you things directly in front of you. If you turn your head even a little, things get clipped. There’s a golf game among the available apps, and when you play, you use your phone as the golf club. You look down and see the ball, but nothing else. In golf, your peripheral vision is really important. You need to look up to see the hole, then look down at the club, constantly shifting between the two. It gets awkward when you can’t see everything you need.

You could technically get your hands on these today and start building apps for them. Of course, you’d have to pay $100 a month and commit to at least a year in the developer program. But once you’re in, you’d have access to these devices and be able to start creating lenses for Snap OS. There are already quite a few apps available, including that golf app I mentioned, a browser, a music creation app, and even a “Beat Saber” alternative called “Beatboxer.” There are also tons of shared-space experiences you can try out, and they’re pretty impressive.

Unlike Meta’s glasses, you don’t need to scan a QR code with these. They’ll actually map out your surroundings as you look around, and if someone else nearby is wearing a pair, the glasses sync both your views so you can see and interact with the same 3D objects in real time. It’s honestly pretty cool.

So, if the main question with AR glasses is “What can you even do with them?” I really like Snap’s strategy of getting them into people’s hands early and letting them figure that out for themselves.

Dream Use Cases

Personally, I have two dream scenarios for AR glasses. First, I’d love to learn any instrument using a “Guitar Hero”-style overlay. They already have one for piano, where the notes fall down and you just match them as you play. Imagine that for any instrument—that would be amazing.

The second idea is something like this: When you’re on an airplane and you look out the window to see a landmark or something familiar, how cool would it be if your AR glasses could highlight and label those landmarks for you? It could outline states or cities and show you monuments or historical sites as you look at them. That would be awesome.

Of course, there’s still a lot of work to be done on these glasses. The Meta glasses require a separate computer in your pocket, last only about two hours on a charge, and cost around $25,000 because of the materials used. Snap’s glasses only last about 45 minutes and, as I mentioned, are still pretty big and bulky on your face.

So yeah, the technology isn’t quite there yet for everyday consumers, but it’s hard not to imagine a future where we rely on AR glasses instead of our phones. It could be amazing—maybe someday.
