In a blog post, Facebook parent company Meta on Tuesday announced a slew of new software updates for its Ray-Ban Stories smart sunglasses.
The post was penned by Meta’s Reality Labs team.
“We continue improving [Ray-Ban Stories] with regular software updates and new features. Today, we’re excited to share the latest on how we’re making the smart glasses experience better, more powerful, and more intuitive over time,” the Reality Labs group said in the post.
The meat of the announcement highlights four areas: smoother integration with phones, faster voice interactions, smarter voice interactions, and new Meta account integration. The first three are most pertinent to accessibility, particularly around voice interaction. Meta says these represent some of the most "highly requested features" from users, allowing them to do more with their voice and less with their smartphone. For instance, it's now possible to start and reply to messages in WhatsApp, Messenger, and more simply by saying "Hey Facebook, reply"; the software will confirm your dictated message before sending it to its recipient.

In a similar vein, it's now possible to interact with the Stories using more natural, conversational language. In a clever use of artificial intelligence, the glasses will even "know whether they should punctuate your outgoing message with a period or question mark based on context cues," according to Meta.

Finally, starting late next month, users will need a Meta account to use their sunglasses, which Meta says is separate from a Facebook or Instagram account. The account is also required for the Facebook View companion app, which is used to pair and customize the glasses from an iOS or Android device.
While it may seem obvious to focus energy on developing hands-free controls for wearable tech like sunglasses, the truth is these features have significant implications for accessibility as well. For a person who has typical speech (meaning they don't stutter) but has certain fine-motor delays, the hands-free operation of the glasses may well be a boon, reducing the friction of excess tapping and swiping. Likewise, more voice control can also mean less eye strain and fatigue when using the companion app. These may seem like little things at first blush, but it's truly the small details that end up having the biggest impact on shaping a positive user experience for a disabled person. More pointedly, it goes to show that going hands-free is about more than sheer convenience; hands-free can mean the difference between accessibility and inaccessibility for many folks.
For more on this, read my story from last October on the Ray-Ban Stories, as well as the similarly positioned Echo Frames from Amazon.