
Why I Bought the Rabbit R1 (Even Though It’s Not Accessible)

When the Rabbit R1 was first announced, it generated a lot of buzz—this tiny, stylish device promised to reinvent personal computing with the help of artificial intelligence. But as a blind accessibility specialist, I didn’t rush to pre-order one. In fact, it was my friend Michael who picked one up when it launched—and after a few underwhelming attempts to use it, the device sat untouched for months.

That changed one day when we were bored. Out of curiosity (and maybe stubbornness), we decided to give the Rabbit R1 another spin.

My Background: Accessibility Matters, But I Love Good AI

Before diving in, it’s worth mentioning where I come from. I’m an accessibility specialist by trade—ensuring digital experiences work for blind and low vision users. But I’m also someone who loves good AI. When I see potential in a tool, I don’t write it off just because it’s not perfect. I see it as a challenge and an opportunity for improvement.

The First Time Around: Meh

When Michael first unboxed the Rabbit R1, nothing about it screamed “usable” for blind users. No screen reader. No haptic cues. No audio guidance. It felt like another AI device that forgot we exist. So we set it aside.

What Is the Rabbit R1?

The Rabbit R1 is a handheld AI-powered device built around a system the company calls a Large Action Model (LAM). Unlike traditional voice assistants like Siri or Alexa, Rabbit is built to do things—log into websites, automate tasks, and control other apps or systems based on your requests.

It includes:
– A push-to-talk button
– A scroll wheel
– A rotating camera (Rabbit Eye)
– A touchscreen
– A USB-C port

But where it really shines is online, through a tool called the Rabbit Hole.

Into the Rabbit Hole

The Rabbit Hole is Rabbit’s web interface—this is where the magic really starts for those of us who rely on screen readers.

Once logged in, I explored several modes, including:

Playground

This is where you can type out any task in natural language. I told it: “Update my server.” It asked for my login credentials, connected, and walked me through the entire update process. Within 10–15 minutes, it had performed the task. This kind of real-world automation—without needing a traditional terminal—was a huge win.

Cookie Jar

This is where Rabbit stores login credentials for services. The catch? It uses a virtual browser window that’s not accessible. I had to rely on NVDA OCR to locate fields and type in my credentials. Frustrating, but I made it work.

Real Tasks I Completed

Once I got the hang of things, I started pushing Rabbit’s limits:
– Described collectibles on Michael’s bookshelf
– Researched business strategies
– Debugged Python code
– Found cheap 3D printer filament
– Ran server commands
– Opened multiple windows for parallel tasks

Intern Mode: Rabbit’s Own AI Agent

Rabbit recently introduced Intern, a new mode that acts as your AI assistant. Some tasks it can perform include:
– Creating online courses
– Writing Python apps
– Summarizing news in Word documents

However, it has limitations:
– Audio editing had strange sounds
– Video uploads failed
– Audio-to-text didn’t work
– Editing Squarespace sites was unsuccessful

One win: generating alt text for images worked.

Today’s Test: Navigating the R1’s On-Device Menu

I wanted to figure out how to manage the R1’s menu. First, I tried using my Meta Ray-Ban smart glasses, but they weren’t helpful. They read some text but gave inaccurate or bad information.

Then, I used Seeing AI. I pressed the side button and used the scroll wheel while in short text mode. Seeing AI read out items like Settings and Updates, but it didn’t indicate what was selected. I had to rely on my remaining vision to identify the red selection highlight.

I counted five items down to reach Updates and used the side button to select it. It wasn’t perfect, but it was usable with some effort.

Why I Bought One Anyway

After testing Michael’s device, I saw real potential and ordered my own Rabbit R1 from Best Buy. It will arrive Thursday. Michael will help me set it up, and I’m fine with that. This device, despite its flaws, shows what’s possible when AI meets utility.

Looking Ahead: Opening a Dialogue

I don’t expect the Rabbit R1 to be perfect yet. But I believe in progress. I plan to start a dialogue with Rabbit’s team about how to make the device more accessible to blind and low vision users. Accessibility isn’t an afterthought—it’s a foundation for innovation, and I’m excited to help drive that conversation forward.

Check out my work at

https://taylorarndt.substack.com

Can You Get Ultra Battery Life From The Apple Watch Series 10?

Wrist wearing Apple Watch Series 10 with modular watch face.

I have been a fan of the Apple Watch since it was released, but I never found one with great battery life until the Ultra came out. I immediately purchased one and have enjoyed the battery life ever since, but I never really enjoyed the size of the watch.

When Apple released the Apple Watch Series 10, I was excited by the new features and the size of the watch, but I worried about the battery life, since Apple claims only 18 hours per charge.

Last Sunday, I decided to just go and pick up a Series 10, and give it a try. I got the watch home, and just fell in love with the size. The device is thin and light, and the titanium versions are polished like the stainless steel watches. So I quickly found that this watch was what I was looking for in design, but I still worried about the battery life.

It is nearly a week later, and I can typically use the device all day. My use typically includes a few workouts and checking notifications while on the go. What really impresses me is that the watch is typically still at 40% battery when I charge it while getting ready for my day.

While the Series 10 may not quite match the 48 hours I would get from my Ultra when it was new, it still has the battery life to last all day and then some. I personally think that anyone buying this watch will be very impressed, and battery life will not cause the issues I’ve noticed on other watches.

Broadcasts

Broadcasts icon

App Name

Broadcasts

App Version

3.4.6

Platform

iOS/iPadOS

Category

Music

Description

Listen to streaming internet radio in an ultra-simple, single-purpose app.

Broadcasts makes building a library of stations from around the world easier than ever — no subscription required. Use completely free with a limited-size library of stations, or upgrade to the full version to unlock the library limit and add as many of your own stations as you like from the Station Browser, or manually via URL. Search for stations from around the world as easily as from your own library. Use the Mini Player to showcase gorgeous station and track artwork (for supported stations).

Sync your library across iPhone, iPad, Mac, Apple Watch, and Apple TV with iCloud & Universal Purchase. Enjoy your library on the road using CarPlay and your iPhone.

Browse stations in grid or list view. Listen in Light or Dark mode. Edit stations to provide your own artwork, or use the Artwork Chooser to generate something fun and colorful. Use Handoff to transfer playback between devices, or AirPlay directly to a home speaker.

Supports Shortcuts & AppleScript for playback automation.

Free or Paid

Paid

Price

$5.49

Devices you’ve tested on

iPhone and Apple Watch

Accessibility Rating

5 – Fully Accessible

Accessibility Comments

This app is perfectly accessible and easy to use

Screen Reader Performance

It is fully accessible with VoiceOver

Button Labeling

All buttons are clearly labeled. There are some issues in the playback view, but they are easy to work around

Usability

It is easy to learn and use

Other Comments

The app has extensive shortcuts support

App Store Links

https://apps.apple.com/nl/app/broadcasts/id1469995354

Developer Website

https://www.highcaffeinecontent.com/

iOS 18: The Ultimate Upgrade

iOS 18 Icon

Apple has unveiled iOS 18, a major update packed with new features, enhancements, and improvements that redefine the iPhone experience. This release introduces Apple Intelligence, a suite of personalized features that make your iPhone more intuitive and helpful. Here’s an in-depth look at everything iOS 18 has to offer.

Apple Intelligence: Your Personalized Assistant

Apple Intelligence is the highlight of iOS 18, offering a new level of customization and smart capabilities across the entire operating system. Designed to understand your personal context, Apple Intelligence suggests actions, assists with writing, and offers tailored recommendations. This feature brings a smarter, more context-aware Siri, new writing tools, and more precise notifications, making your iPhone experience more seamless and intuitive. I currently use the iPhone 14 and look forward to upgrading to take full advantage of these features—this is the highlight for me.

Enhanced Siri Experience

Siri has received a significant upgrade in iOS 18, featuring improved language understanding and contextual awareness. Siri now anticipates your needs, offers real-time suggestions, and understands commands based on your current activity, integrating even more closely into your daily routine. A potential application I am exploring is whether Siri’s screen-aware feature can read unlabeled buttons on the screen, which would greatly enhance usability.

Customization at Your Fingertips

iOS 18 allows you to personalize your iPhone like never before:

– Rearrange Apps and Widgets: Easily customize your Home Screen layout by rearranging apps and widgets.

– New App Icon Look: Choose a Dark mode look, tint app icons with any color, or let iOS suggest a color based on your wallpaper.

– Locked and Hidden Apps: Secure sensitive apps with Face ID, keeping your data safe when sharing your device.

Redesigned Control Center

The Control Center receives a complete overhaul with new groups of controls that are accessible with a single swipe. You can customize controls, resize them, and group them as you like. The new Controls Gallery allows you to add your favorite controls from various apps, enhancing personalization.

Photos App: A New Way to Relive Memories

The Photos app has been completely redesigned, making it easier to organize and access your library:

– Browse by Topic: Collections organize your photos by recent days, people, pets, and trips, providing a more intuitive browsing experience.

– Customize Collections: Pin your favorite collections, ensuring your most cherished photos are always easy to find.

Messages: More Fun and Functional

iOS 18 brings exciting new ways to communicate in Messages:

– Text Effects: Apply animated effects to text, words, or emojis, with suggestions appearing as you type.

– Tapback with Any Emoji or Sticker: Express yourself with a wider variety of emojis and stickers in your responses.

– Messages via Satellite: Stay connected without Wi-Fi or cellular, using satellite technology on supported iPhones.

– Schedule Messages: Use the Send Later feature to schedule messages for a specific time, ensuring you never forget to send an important text. This feature is a welcome addition as it allows scheduling communications at the most appropriate times.

Mail: Coming Soon with New Features

Later this year, iOS 18 will introduce Mail improvements with automatic categorization and a focus on important messages. The new Primary category will help users manage their inbox more effectively, prioritizing time-sensitive and significant emails. With the volume of emails I receive, this enhancement will be transformative in streamlining my communication management.

Safari: Smarter Browsing

Safari in iOS 18 introduces Highlights, automatically detecting relevant information on a page and making it easily accessible. A redesigned Reader mode now includes a table of contents and high-level summaries, allowing users to get a quick overview of articles before diving in.

Maps: Explore Like Never Before

iOS 18 brings new topographic maps and trail networks, making it easy to plan hikes and outdoor activities. Users can create custom routes, download maps for offline use, and access detailed hiking information, including trail length and elevation. I am particularly interested in exploring whether the custom routes function can work like waypoints, enhancing navigation similar to GoodMaps.

Game Mode: Elevate Your Gaming Experience

Game Mode minimizes background activity to maintain high frame rates and reduce audio latency, especially when using AirPods and wireless game controllers. This ensures smooth gameplay and an immersive gaming experience.

New Wallet Features

The Wallet app now supports Tap to Cash, allowing iPhone users to complete transactions by simply bringing their devices together. This new capability will make Apple Cash transactions even more convenient. Additionally, users can now pay with rewards and set up installment payments for Apple Pay, offering greater flexibility in managing payments.

Enhanced Accessibility Features

iOS 18 introduces revolutionary accessibility updates:

– Eye Tracking: Control your iPhone using just your eyes.

– Music Haptics: Sync the iPhone Taptic Engine with the rhythm of songs, enhancing the music experience for users who are deaf or hard of hearing.

– Vocal Shortcuts: Record specific sounds to trigger actions on iPhone, assisting those with atypical speech in communicating more effectively.

Privacy and Security Enhancements

Privacy remains a priority with redesigned Privacy and Security settings, offering easier ways to manage what information you share with apps. New contact-sharing controls and improved Bluetooth privacy provide users with more control over their data.

Additional Updates

iOS 18 brings a host of other features, including:

– Live Call Transcription: Record and transcribe phone calls directly from the Phone app. This feature is invaluable for capturing discussions and sharing notes within my team.

– New Calculator Features: Access the Math Notes calculator and explore unit conversion and history features in a new portrait mode, potentially revolutionizing accessibility in math.

– Freeform Updates: New diagramming modes, alignment tools, and improved sharing options make Freeform boards even more versatile.

iOS 18 Release Date and Compatibility

iOS 18 is set to be released on September 16th and will be compatible with a wide range of iPhone models, from the iPhone 11 up to the latest iPhone 16 series. With so many new features, iOS 18 promises to be the most powerful and personalized iPhone experience yet.

iMessage Audio Message Playback Speeds Not Accessible to VoiceOver Users

I read an article recently outlining how to speed up audio messages. I didn’t even know that was possible. However, for those like myself who rely entirely on VoiceOver, this feature is nonexistent, as VoiceOver doesn’t even see the control. You’re supposed to tap and hold the “play” button to bring up these options.

What can we do to urge Apple to fix this, as it’s still broken under iOS 18.1?
Another thing the article below mentions is the ability to rewind and fast forward through audio messages.

Any ideas on if or how this can be done with VoiceOver? The article I referenced can be found at: https://9to5mac.com/2024/06/19/adjust-playback-speed-imessage-audio/

Comment below, if you have any feedback, or if there is a workaround for this issue.

Apple Event Recap: A New Era of Innovation, Intelligence, and Accessibility

Apple’s recent event at Apple Park was not just about new products; it was a showcase of how technology can empower, connect, and enhance the lives of all users, including those with disabilities. The event highlighted Apple’s ongoing commitment to accessibility, ensuring that its innovations are designed to be inclusive and usable by everyone. With major announcements around Apple Watch, AirPods, and iPhone, Apple continues to lead the way in integrating advanced technologies that redefine our interactions with the world.

Apple Watch Series 10: The Thinnest, Most Advanced Apple Watch Ever

Apple Watch Series 10 made its debut with a 30% larger, more advanced display, designed for enhanced readability and a sleek look in polished finishes like Jet Black and Rose Gold. Featuring a Wide-angle OLED Display, improved brightness, and power efficiency, the Series 10 redefines interaction by making it easier to view the watch from any angle.

The Series 10 is Apple’s thinnest design yet, measuring just 9.7 mm, and incorporates advanced technologies such as the S10 SiP and watchOS 11. These features enable intelligent capabilities like sleep apnea detection, advanced workout metrics, and new water-based activity tracking, positioning it as the perfect companion for any lifestyle. I am definitely considering purchasing this in the future. However, I want the phone first.

Apple Watch Ultra 2: The Ultimate Sports Watch

Apple Watch Ultra 2 was introduced as Apple’s most rugged and capable smartwatch to date. With new Black Titanium finishes, the Ultra 2 offers advanced GPS, extended battery life, and enhanced sensors for underwater activities, making it ideal for athletes and outdoor enthusiasts.

AirPods 4: Redefining Personal Audio

Apple unveiled the next generation of AirPods, focusing on comfort, audio quality, and intelligent features. Powered by the H2 chip, AirPods 4 deliver superior sound with richer bass, personalized spatial audio, and new machine learning features like voice isolation and intuitive Siri interactions. For the first time, AirPods 4 come with Active Noise Cancellation and Transparency mode, adapting automatically to different environments. USB-C and wireless charging options further improve convenience. However, I’m hesitant to purchase as they may not fit well given my smaller ear size. I was impressed by the lower prices though.

AirPods Pro 2: Health-Focused Audio Innovations

Apple introduced revolutionary health features in AirPods Pro 2, including hearing protection, a clinically validated Hearing Test, and an over-the-counter Hearing Aid feature. These additions make AirPods Pro 2 a transformative tool for those with hearing challenges, providing accessible hearing support without compromising audio quality. I believe this is truly a game-changer.

iPhone 16 and iPhone 16 Pro: The Next Level of Apple Intelligence

The iPhone 16 lineup marks the beginning of a new era, integrating Apple Intelligence at its core. With the A18 chip and a 16-core Neural Engine, the new iPhones deliver enhanced on-device intelligence. Features such as the customizable Action button and advanced camera systems make the iPhone 16 the most capable and personal iPhone yet. My favorite feature is Visual Intelligence.

Visual Intelligence was a major highlight of the iPhone 16, transforming the device into a powerful tool for everyday interactions. The new Camera Control on iPhone 16 allows users to instantly learn about their surroundings by simply pointing the camera. This feature leverages on-device intelligence and Apple services to provide real-time information without storing images, ensuring privacy. For example, users can identify a restaurant, view ratings, check hours, and even add events from a flyer directly to their calendar with a simple click. It also integrates with third-party tools, allowing users to search for products online or get academic help with a single tap.

The Pro models elevate the experience further with new Titanium finishes, larger displays, and superior gaming capabilities, all driven by the A18 Pro chip. Apple Intelligence integrates deeply into the system, enhancing communication, reliving memories, and even personalizing Siri to better assist with day-to-day tasks. The iPhone 16 Pro’s Camera Control not only enhances photography but also provides users with access to powerful AI-driven insights, making it an invaluable tool for visually impaired users and beyond.

Commitment to Accessibility

From the outset, the Apple Event placed a strong emphasis on accessibility, with several speakers acknowledging the profound impact Apple products have had on people with disabilities. Apple’s dedication to accessibility was evident across all announcements, as the company showcased features like on-device intelligence that respects user privacy, the health-focused innovations in AirPods, and the Visual Intelligence capabilities in iPhone 16 that make information more accessible. The entire event served as a testament to Apple’s vision of technology that is not just cutting-edge but also inclusive, ensuring that everyone, regardless of their abilities, can benefit from the latest advancements.

The Apple Event highlighted the company’s relentless drive for innovation, underpinned by a strong commitment to accessibility and user empowerment. From the thinnest Apple Watch ever to health-focused audio solutions and iPhones that redefine personal intelligence and accessibility, Apple continues to set the standard for how technology should integrate into and enhance our daily lives. Comment below and tell us your thoughts on the event.

My Experience Using Meta Ray-Bans for shopping


I’ve been curious about whether my Meta Ray-Ban smart glasses could help me shop independently, so I decided to put them to the test at CVS and Natural Grocers in Austin. My experience was filled with both challenges and moments of shock, especially when the people around me saw how I navigated the store using these glasses.

Testing the Meta Ray-Ban Glasses at CVS
I had some errands to run at CVS, including picking up items at the pharmacy, so I thought this would be the perfect opportunity to see how the glasses could assist me. To get there, I booked an Uber, using my Meta Ray-Bans to help me identify the car when it arrived. The first Uber wasn’t the right one. I asked the glasses, “Look and tell me what car this is,” and they responded, “This is a white sedan.” I knew immediately that this wasn’t my ride.

When the correct Uber finally pulled up, I used the glasses to confirm by asking, “Look and tell me what color this car is.” The response was, “white.” I then asked, “What car is this?” and it correctly identified it as a white Honda. I double-checked by asking for the license plate number, and the glasses gave me the right details. Feeling confident, I got in and made my way to CVS.

Once at the store, I quickly grabbed my pharmacy items but wanted to explore further to see how much the glasses could assist me in navigating the aisles. I asked, “Look and tell me a detailed description of this aisle,” and it responded with descriptions of greeting cards for graduations and birthdays. Moving down another aisle, it identified the snacks section, describing candies and other treats.

When I picked up items I was interested in, I asked, “Look and tell me what I am holding.” Unfortunately, one of the specific items I was searching for was out of stock, but this process made finding and verifying products a lot easier. Before leaving, I asked the glasses, “Look and tell me if you see the exit sign,” and they guided me accurately to it. I repeated the same steps to find my Uber, verifying the car details to ensure I got into the right one.

Walking to Natural Grocers
The next day, I decided to visit Natural Grocers, which is close enough for me to walk. As I approached, I used my Meta Ray-Ban glasses to ensure I was at the correct location by asking, “Look and tell me if this is Natural Grocers.” The glasses confirmed it was indeed the right business, so I walked in confidently.

Once inside, I began exploring the store using the glasses to assist me. I moved from aisle to aisle, asking for descriptions. When I stood in front of a freezer, I asked, “Look and tell me a detailed description of the contents in this freezer.” The glasses provided descriptions like “pre-made meals,” and when I asked for specifics, it detailed items such as “chicken pot pie” and “chicken tenders.”

I picked up a chicken pot pie and asked, “Look and tell me what I am holding,” which confirmed it was the right item. I followed up with, “Look and tell me the directions,” to verify the cooking instructions, and finally, “Look and tell me the price.” With all the information I needed, I headed to the checkout.

A Moment of Astonishment at the Checkout
As I approached the checkout, the cashier was visibly surprised. The glasses had guided me independently through the store, and it was clear that everyone around was amazed. The cashier couldn’t believe that a blind person could shop without assistance, using only smart glasses to navigate and identify items. The entire store seemed to be in shock and awe, with people watching as I smoothly completed my shopping without needing help from anyone.

After checking out, I wanted to share this experience with my friend Michael. I used the glasses to initiate a WhatsApp call, showing off my purchase. The process was a bit tricky at first, as the call initially connected to my phone instead of my glasses, making it hard to hear him. I manually switched the call back to the glasses and pressed the capture button twice to activate the glasses’ camera. Once I got it right, the video call worked smoothly, and I was able to share my shopping adventure with Michael.

Final Thoughts
Using Meta Ray-Ban glasses for shopping was an empowering experience, allowing me to navigate stores like CVS and Natural Grocers independently. Despite a few initial challenges, such as finding the correct Uber and setting up video calls, the glasses proved invaluable. They helped me verify car details, identify products, read instructions, and guide me to exits.

The reaction from the people at Natural Grocers was particularly rewarding—they were astonished to see how technology enabled me to shop confidently on my own. These smart glasses are more than just a cool gadget; they’re transforming everyday experiences, making them more accessible and enjoyable.

iPhone Mirroring Accessibility Status

macOS Sequoia beta has a new feature called iPhone Mirroring, which allows the user to control their iPhone from their Mac. This means you can use your iPhone from your Mac while your phone is in your pocket or otherwise out of reach. The big question, though, is whether this feature works with VoiceOver.

At the time of this writing, in early September 2024, the iPhone mirroring feature is not accessible. While I have been able to open the application and navigate the window using VoiceOver with macOS, VoiceOver does not read the contents of the phone screen. Enabling VoiceOver on the phone does provide some assistance by allowing the arrow keys of the Mac to navigate the status bar, but this does not enable the user to complete tasks as the phone must remain locked.

In my opinion, the iPhone Mirroring feature possesses significant potential. While it does effectively display iPhone notifications on macOS, it is unfortunate that it does not appear to function with VoiceOver. Furthermore, it would be great if the feature could also be utilized on the iPad. I am hoping that these issues will be addressed in a future update to the operating system.

Have you been able to get this feature to work? Let us know in the comments, or in the iAccessibility Forums.

The Truth About Web Browsers on iOS

Browsing the web on an iOS device has typically been accomplished with Safari. We now have the option, though, to use third-party web browsers like Firefox or Google Chrome to browse the web as well. So Safari isn’t the only option for browsing the web on iOS anymore, or is it?

Web Browsers on iOS

There are several web browsers now on iOS, but are they actually different from Safari? The truth is that all web browsers on iOS are based on Apple’s WebKit technology, which also powers Safari. This is because third-party apps cannot use Just-In-Time (JIT) compilation, which is required to run JavaScript at full speed. Without JIT, apps like Chrome or Firefox could not load web pages as quickly as Safari, so Apple requires that all iOS web browsers use the same engine that powers Safari.
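One quick way to see this in practice: every browser shipped for iOS reports WebKit in its user-agent string, even when it brands itself as Chrome (CriOS) or Firefox (FxiOS). The sketch below, in plain JavaScript, just checks for that engine token; the function name and the abbreviated user-agent string are illustrative examples, not any official API.

```javascript
// Hypothetical check: on iOS, even "Chrome" (CriOS) and "Firefox" (FxiOS)
// carry the AppleWebKit engine token, because each embeds Apple's WebKit.
function isWebKitOnIOS(userAgent) {
  const onIOS = /iPhone|iPad|iPod/.test(userAgent);
  const usesWebKit = /AppleWebKit/.test(userAgent);
  return onIOS && usesWebKit;
}

// An abbreviated Chrome-on-iPhone user-agent string for illustration:
const chromeOnIPhone =
  "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) " +
  "AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/120.0.0.0 " +
  "Mobile/15E148 Safari/604.1";

console.log(isWebKitOnIOS(chromeOnIPhone)); // true
```

In other words, the browser chrome differs, but the rendering engine underneath is the same one Safari uses.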

So Are Third Party Browsers Just Safari?

Third-party browsers employ the same technology as Safari, yet they offer distinct advantages. Notably, Google Chrome and Firefox enable users to synchronize bookmarks across devices. While this functionality is also available in Safari, its implementation is limited to the Apple ecosystem, and many users prefer not to exclusively utilize a Mac or Safari. Consequently, third-party browsers provide users with the flexibility to access tools and features that are not exclusive to web browsing.

Conclusion

Third-party browsers offer alternatives to Safari on the iPhone and iPad. Many users believe they are faster and more reliable than Safari without knowing that these browsers work the same way while browsing the web. The only real reason to consider a third-party browser is if you like the other features provided by the browser you already use on your desktop computer.

Ray-Ban Meta Smart Glasses for Blind Users: Complete Guide

Ray-Ban Meta Smart Glasses combine style with advanced technology, offering unique benefits for blind and visually impaired users. These glasses are equipped with AI-driven features that provide hands-free accessibility, making daily tasks easier. However, it’s important to understand their limitations and how they fit into your overall accessibility toolkit. In this guide, we’ll explore the styles, key features, available commands, and important considerations when using these glasses.

Where to Buy and Available Styles

Ray-Ban Meta Smart Glasses are available at the Meta Store, Ray-Ban’s website, and major retailers like Amazon and Best Buy. Prices start at $299, with additional costs for custom lenses such as prescription or polarized options. The glasses come in various styles, including the feminine Skylar, the classic Wayfarer, and the retro Headliners, each offering different color and lens configurations.

Key Features and Accessibility Commands

The glasses are equipped with a 12MP ultra-wide camera, open-ear speakers, and advanced AI. Here are some essential commands:

– “Hey Meta, look and tell me what you see”: Identifies objects or people in view.

– “Hey Meta, look and give me a detailed description”: Provides a detailed analysis of what the camera sees.

– “Hey Meta, look and tell me everything you see”: Offers a comprehensive overview of all visible elements.

– “Hey Meta, look and read this”: Reads text aloud, ideal for reading signs, menus, or documents.

– “Hey Meta, translate this”: Translates foreign text into your language.

Limitations and Features Under Development

While Ray-Ban Meta Smart Glasses offer many useful features, they currently do not support popular accessibility services like Be My Eyes or Aira. Meta is working on expanding the AI capabilities, but these features are not yet available. You may come across commands online that claim to offer extended functionality; however, results can vary. This is because Meta often rolls out new features gradually or tests them with select users, meaning not all commands will work consistently.

What Not to Use These Glasses For

These glasses are not designed to replace critical tools like canes or guide dogs. They are not suitable for recognizing medications, people, or performing tasks related to health and safety. The glasses’ AI is not intended for precise navigation, identifying health hazards, or making decisions about personal safety. They should be viewed as a supplementary aid rather than a primary accessibility solution.

Important Tips for Using Ray-Ban Meta Smart Glasses

1. Listen for the Beep: After a command response, a beep indicates that Meta is ready for your next command.

2. Experiment with Commands: The AI’s performance can vary, so it’s important to try different commands and learn which work best for your needs.

3. Be Aware of Limitations: Always use these glasses as an additional tool, not as a substitute for traditional mobility aids.

4. Avoid Using for Health and Safety Tasks: The AI is not equipped to handle critical safety-related identifications or medical advice.

Ray-Ban Meta Smart Glasses provide a stylish and accessible solution that enhances everyday experiences for blind and visually impaired users. With continuous updates and evolving features, these glasses are poised to become even more functional. However, it’s crucial to recognize their current limitations and use them in conjunction with traditional accessibility aids.

Comparing Meta AI, Be My AI, and Access AI

AI-powered accessibility tools like Meta AI, Be My AI, and Access AI from Aira are significantly enhancing how visually impaired users interact with the world. Each of these tools has distinct approaches, features, and benefits. Below, we compare these solutions in detail, including Aira’s new AI initiatives that are shaping the future of accessible technology.

Meta AI

Meta AI is a broad, general-purpose AI assistant integrated into Meta’s platforms, such as Facebook, Instagram, and WhatsApp. It leverages advanced language models like Llama to offer generative AI capabilities, including text, image recognition, and chat-based assistance. Meta AI’s strength lies in its powerful generative features and widespread integration, which makes it suitable for a wide range of everyday tasks beyond just accessibility.

However, Meta AI is not specifically tailored to the needs of visually impaired users. It focuses on general interaction improvements, and while it offers high-level image descriptions, it lacks the accessibility-specific refinements that specialized tools provide. Meta is currently expanding its AI reach but faces regulatory delays in Europe due to privacy and data use concerns. As part of its commitment to responsible AI development, Meta AI allows users to control data usage and offers transparency about its data handling practices.

Be My AI

Be My AI is a feature within the Be My Eyes app that uses AI, powered by OpenAI’s GPT-4 Vision model, to provide detailed descriptions of images. This tool complements the live assistance offered by sighted volunteers, allowing users to access quick and descriptive feedback on visual content. Be My AI’s strength is in its conversational style, where users can ask follow-up questions to gain deeper context about what is being seen.

The focus of Be My AI is on providing accurate and responsive descriptions specifically for visually impaired users. It excels in making AI interactions feel personal and relevant, offering a straightforward, user-friendly experience tailored to individual needs. However, unlike Aira’s Access AI, Be My AI does not offer human verification, which can be a critical feature for ensuring high trust in certain situations.

Access AI from Aira

Access AI is part of Aira’s broader vision of integrating AI into its existing visual interpreting services. It allows users, known as Explorers, to capture or upload images and receive instant AI-generated descriptions. What sets Access AI apart is the optional human verification through Aira Verify, where a professional visual interpreter can review and confirm the AI’s responses. This combination of AI and human input ensures that the service remains highly accurate, secure, and reliable.

Access AI also includes features like multi-photo upload, verbosity controls, and chat history, which enhance user interaction and personalization. Additionally, Aira’s commitment to privacy means that no Access AI sessions are shared with third parties, safeguarding user data. Aira’s new Build AI initiative further advances its AI capabilities by allowing users to contribute to AI development in a secure and controlled manner. This program, available primarily in the US, collects real-world data to improve future AI features, enhancing Aira’s service without compromising user privacy. For now, Access AI is free to use.

Each of these AI tools offers unique benefits, catering to different needs and preferences. Whether you’re looking for a general-purpose AI assistant like Meta AI, a visually impaired-focused tool like Be My AI, or a hybrid solution with human verification like Access AI, there’s a tool that can help enhance accessibility in your daily life.
