Ray-Ban Meta glasses strengthen my belief in future Apple Glasses


I’m on record as saying that I think Vision Pro is mostly a stepping-stone toward an eventual Apple Glasses product, and that I think those are still years away.

I feel like current glasses tech is giving us partial peeks into that future. Viture One XR glasses provide one possible take, and the Ray-Ban Meta glasses another – perhaps suggesting the future may be closer than I thought …

Vision Pro is simply a stepping stone toward Apple Glasses

We can debate the value of Vision Pro as a product in its own right, but to me that’s not the crucial thing here, as I’ve said before.

This is simply a long-term bet by Apple. It scarcely matters how many people want to buy a 2023 device costing several thousand dollars, nor whether a more affordable version launches next year or the year after […]

A Vision Air isn’t even the end-game here. For me, this is really about the eventual development of an Apple Glasses product.

Vision Pro is simply Apple placing a stake in the ground, and expressing its confidence that head-worn devices have a future.

I feel like existing glasses tech is gradually previewing the kinds of experiences we might one day expect from Apple Glasses – and so far, I’m impressed.

Viture Glasses sold me on video consumption

It’s notable how many Vision Pro users are saying that, for them, watching TV shows and movies is the thing they do most often. Joanna Stern said that, once the novelty wore off, this was almost the only thing she still used it for, and I know other owners who’ve found the same.

That doesn’t surprise me. Living in the UK, I haven’t yet had the chance to try Apple’s spatial computer, but I’m 100% sold on face computers (or face monitors) for watching video content.

I’m certainly intrigued by the idea of using glasses as virtual Mac monitors for work, but for me the tech isn’t there yet.

My colleague Filipe Espósito said that his experience of an earlier iteration of Ray-Ban Meta glasses convinced him to believe in smart glasses, and a key reason for this was the convenience of capturing photos and video:

There are some moments that happen so quickly that you don’t even have time to take your phone out of your pocket. And that’s where the Ray-Ban Meta comes in, because they’re already there on your face, ready to be used. With just the click of a button on the right stem, you can capture a photo of what you’re looking at. Press the button for a few seconds and it captures a video.

I absolutely echo that. Provided it’s bright enough for me to be wearing sunglasses in the first place, it’s just fantastically convenient to be able to either reach up to press or hold a button, or to say “Hey Meta, take a photo” or “Hey Meta, record a video.”

For video in particular, it’s far less intrusive to automatically capture point-of-view footage without interrupting what I’m doing than it is to hold up my phone. Additionally – and importantly – I don’t then end up viewing the scene through a phone screen instead of my own eyes.

For example, this is me looking around precisely as I would have done had I not been recording:

That’s a very different experience to taking my phone out of my pocket and viewing the scene through that. Likewise enjoying listening to this singer, where the microphones cope surprisingly well.

However, it does trigger a Vertical Video Syndrome alert! Both still photos and video have a fixed 3:4 vertical format.

Additionally, it should be said that the camera quality for both still photos and video doesn’t come anywhere close to that of my iPhone 15 Pro Max; it’s definitely going back a few years. Here are a couple of examples – nothing bad, but also not in current iPhone territory (click/tap to view full-size):

To me, it’s like going back in time to when I used a standalone camera for anything serious, and my iPhone for capturing memories. With these, I feel like the glasses would be my default for memories, while I’d pull out my camera when I care more about the quality.

We also get no visual indication of our framing. The ultra-wide lens means we usually won’t miss anything this way, but it is easy to get a wonky horizon, or less than perfect framing. Additionally, I seldom shoot ultra-wide with my phone, preferring to control what’s in view and what’s not with a tighter field of view.

Finally, videos are limited to 60 seconds. That’s fine for quick clips while exploring a city, for example, but I’d absolutely love to be able to use these for things like roller-coaster rides. Increasing the limit to, say, three minutes would make a huge difference.

Put all these factors together – vertical aspect ratio, middling quality, no visual framing, one-minute video limit – and it’s definitely not something I’d use for more serious work. But for me the killer app is the ease and convenience of capturing POV video for memories. I will absolutely use them for that.

Photos are initially stored in the on-board 32GB flash storage of the glasses themselves. When you want to transfer them to your phone, you do this using the Meta View app, and the glasses create a wifi hotspot for the purpose. It’s very fast and painless.

Audio use: listening and voice calls

The arms of the glasses contain tiny speakers. I have to say that I don’t feel like any above-the-ear headphones are good enough for listening to music – for that, I’ll stick to my trusty MW09s – but they definitely work well for podcasts and voice calls. If you do find them good enough for music, though, you can link them to Apple Music or Spotify, and the tap to start/resume is handy. The right arm is also a touchpad, allowing you to swipe forward and back for volume up/down – though I found this very insensitive.

But for voice calls, they worked really well, with the microphones also allowing the other party to hear me clearly. In summer in particular, I’m rather glad not to have something in my ear.

I was already sold on Shokz bone-conduction headphones for the comfort factor during calls, and these are even more convenient, as everything is in a single device I’d be carrying with me anyway. The Meta glasses aren’t bone-conduction, just tiny downward-facing speakers, but they work well, and are similarly hard for anyone next to you to hear.

But the really big news since Filipe’s review is that – thanks to a software update – Ray-Ban Meta now offer AI recognition of scenes and objects. This is voice activated, by simply saying “Hey Meta, tell me what you see.” (This feature is currently limited to the US, but that’s easily solved: force-quit the Meta View app, activate a VPN connection to the US, re-open the app, wait about 30 seconds until prompted to try the AI feature, and accept. At that point, you can end the VPN connection and keep the capability.)

The phone then reads you the result, and – helpfully – also adds both the photo itself and a text transcript of the response to a log in the app.

Processing isn’t done on-device. Instead, the photo is sent to a Meta server, and the results are sent back. But I must say I was incredibly impressed by the speed! It typically took only a couple of seconds on a mobile connection (generally 5G in London, but sometimes with fallback to 4G).

I did reasonably extensive testing of this feature over a couple of days, and there’s a mix of good and bad news. Let’s start with the bad.

The promise with this kind of tech is that we could do things like look at a flower, and have the glasses not only identify it, but also provide information on how best to care for it. Virtually every time I tried this with a flower or plant, Meta’s horticultural expertise was identical to mine. Here are a few example results:

So, uh, yeah – thanks for that.

It did somewhat better with some iconic London buildings. Meta successfully identified the Tate Modern and the Globe Theatre.

But it failed with Tower Bridge, Millennium Bridge, and others.

Bizarrely, the glasses don’t appear to access the phone’s GPS, which you would have thought would be the logical starting point for identifying a place. For example, this is not only a rather distinctive bridge visually, but I am literally standing on it as I ask Meta to identify it:

It was sometimes remarkably successful. It didn’t quite figure out that the taxi and the food stand were actually the same thing, but this was still an excellent effort:

Here, it successfully identified the car badge, and correctly transcribed the number plate:

It was hit-and-miss with signs, sometimes reading me the wording, other times not.

In all cases, the very wide-angle camera meant I had to position myself very close to signs. Combine this with the lack of visual framing, and sometimes I got too close.

It also did well with ducks and swans.

Click/tap on the frame-grabs in the gallery below to see more examples of where the tech is at currently. Amusingly, the glasses couldn’t identify either themselves, or their charging case!

Why does this excite me?

A non-techy friend asked the rather reasonable question: What’s the point of the AI stuff, other than for visually-impaired people, or people who can’t read?

But this is an early beta. What most excites me is not the fairly generic descriptions it mostly gives today, but the potential for the future.

For example, translating signs in foreign languages while travelling, without having to take my phone from my pocket.

Imagine an integration with Citymapper, where it not only tells me I need the number 77 bus, but also tells me I need to walk to the next bus stop along, and tells me when it spots the bus.

Look at the outside of a restaurant, and it tells me the average TripAdvisor rating, as well as the recommended dishes.

Or, the dream scenario for someone who suffers from mild facial aphasia (seeing little differentiation between faces, making it hard to recognise people): telling me the name of the person walking toward me, and a sentence or two about where and when we last met!

Ray-Ban Meta glasses are a genuinely useful piece of tech at an impressive price. Standard Ray-Ban Wayfarer sunglasses typically cost around $150-190, and these start at $299. So if you’re a Ray-Ban fan already, you’re paying something like $110 to $150 for the tech. That’s honestly a steal.
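
For anyone who wants to sanity-check that estimate, here’s a minimal back-of-envelope sketch in Python, using only the prices quoted above (illustrative only; actual Wayfarer and Meta prices vary by frame, lens, and region):

```python
# Rough "tech premium" implied by the prices in this article:
# standard Wayfarers ~$150-190, Ray-Ban Meta from $299.
smart_glasses_price = 299
wayfarer_low, wayfarer_high = 150, 190

# Subtracting the sunglasses price from the smart-glasses price
# gives the approximate amount you're paying for the electronics.
premium_low = smart_glasses_price - wayfarer_high   # 299 - 190 = 109
premium_high = smart_glasses_price - wayfarer_low   # 299 - 150 = 149

print(f"Implied tech premium: roughly ${premium_low}-{premium_high}")
```

Which lands at roughly $109-149 – in line with the “$110 to $150” figure above.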

Even if you wouldn’t usually buy Ray-Ban, I still think $300 for the convenience of instant POV video capture, without having to take your phone from your pocket, is good value. Add in the fact that you get some headphones thrown in (albeit ones I’d personally only use for voice), and it’s a very convenient package at a very decent price.

The AI features are mostly a gimmick at present. There were just too few examples of them giving me genuinely useful information. But it’s a beta, and I do, as I say, feel excited about the future of this.

The main drawback I found is the battery life. This is claimed to be four hours. I got not much more than half that in one of my tests, when I was really hammering it – using it for AI queries All The Time – so I suspect the claim is about right in real-life use. I’ll update on this later.

It’s decent, but you can’t just leave them switched on all day, and the charging case is too bulky to carry in your pocket. As soon as you have to think about switching them on only when you need the tech, you immediately lose that instant photo/video convenience. This is, though, a trade-off. A bigger battery would make the glasses heavier, and possibly bulkier, and the ordinary sunglasses form-factor is absolutely key to their appeal.

I typically wander round my own city without a bag, but I usually have a shoulder bag or backpack when exploring a city while travelling, so for travel use I’d happily throw in the charging case and recharge the glasses while eating or drinking.

Looking ahead to Apple Glasses

Apple Glasses certainly won’t cost $300!

A key difference, of course, is that Ray-Ban Meta have no display – just voice input and speaker output. Apple Glasses will surely have displays for AR functionality, like showing notifications and overlaying directions for navigation. I’d say it’s a safe bet that these will also be suitable for watching video, which is one of the key things people enjoy about Vision Pro.

Effectively, what I’m expecting is something which combines the camera and voice features of the Meta glasses with the video-watching capability of the Viture One glasses. Once we can combine all of that into a sunglasses format, we truly will have a must-have new product category.

How long that will take is the big question. In particular, creating transparent displays which are unobtrusive when walking around, and yet also effectively block outside light when watching video, is the biggest challenge. Right now, I wouldn’t want to walk around while wearing Viture glasses, even if it can technically be done. I certainly wouldn’t want any sense of my vision being obscured when exploring a city.

But Viture glasses work so well that they have become my primary way to watch video when alone, and so far I’m not seeing any reason to grab my usual sunglasses as I leave home when I can instead have all the capabilities of the Meta glasses in pretty much the same form-factor. A few years ago, I wouldn’t have expected either to have reached this level of practical appeal and affordability by now, so who knows – maybe we’ll get Apple Glasses sooner than I’ve so far expected! I’m certainly excited at the prospect.
