INVERSE EXCLUSIVE: HERE’S A VIDEO OF HUMANE’S WEARABLE AI PROJECTOR IN ACTION

By Raymond Wong

HUMANE, the top-secret tech startup founded by ex-Apple vets Imran Chaudhri and Bethany Bongiorno, just showed off the first demo for its projector-based wearable at a TED talk. Axios’ Ina Fried broke the news, and Inverse has seen a recording of the full TED talk given by Chaudhri.

Journalist Zarif Ali, who had tweeted out an image of Humane’s wearable projecting a phone call function onto Chaudhri’s palm, says the full TED talk video is slated to become available April 22.

During the TED talk, after a quick summary on the fast rise and immense potential of AI and chatbots like ChatGPT, and even shouting out Bill Gates’s prediction that AI will be as profound as the graphical user interface that ushered in personal computing, Chaudhri shares his vision for the wearable.

“What do we do with all these incredible [AI] developments? And how do we actually harness these to genuinely make our life better?” he asks. “If we get this right, AI will unlock a world of possibility. Today, I want to share with you what we think is a solution to that end. And it's the first time we're doing so openly. It's a new kind of wearable device and platform that's built entirely from the ground up for artificial intelligence. And it's completely standalone. You don't need a smartphone or any other device to pair with it.”

“It interacts with the world the way you interact with the world, hearing what you hear, seeing what you see, while being privacy first, and safe, and completely fading into the background of your life,” Chaudhri says.

THINGS HUMANE’S WEARABLE CAN DO

How does the phone function work? Designer Michael Mofina, who says he caught the TED talk live before the link was removed, told Inverse: “In terms of the call, as soon as [Chaudhri] raised his hand the device displayed the appropriate incoming call interface, no menu to navigate through.”

In a reply to a retweet of his image, Ali said that Chaudhri demoed “a translation feature that translates to another language using your own voice model for natural conversation.” Axios reported that Chaudhri was translating his voice from English to French using the wearable. Per Mofina, “The translation came out in French but it was using an AI-generated version of his voice to speak it. No projected interface for the translation.”

Ali also described another feature: “a ‘catch me up’ feature that scrapes your meetings, etc., and gives you a quick list of important things you may have missed.” Mofina added this: “The device gave him a recap of crucial info without disturbing him with notifications. ‘You got an email, and Bethany sent you some photos.’”

In one demo, Chaudhri taps the wearable and asks: “Where can I find a gift for my wife before I have to leave tomorrow?” The AI response: “Vancouver's Granville Island is a lively shopping district.”

And take a look at this video Mofina tweeted out. “Let’s say you’re health conscious or you have certain types of food considerations,” says Chaudhri. He takes out a candy bar, holds it in front of the device, taps on the Humane device and asks “Can I eat this?” The device responds with “A milky bar contains cocoa butter. Given your intolerance, you may want to avoid it.”

“What’s cool is my AI knows what’s best for me, but I’m in total control,” says Chaudhri. He taps on the wearable again. “I’m gonna eat it anyway.” The AI replies back with some humor: “Enjoy it.”

SCREENLESS, SEAMLESS, SENSING

It’s been widely speculated that Humane’s “iPhone killer” would be a projector of sorts, and that appears to be the case. “AI will be the driving force behind the next leap in device design” reads a slide in Chaudhri’s presentation.

“We like to say that the experience is screenless, seamless, and sensing, allowing you to access the power of compute while remaining present in your surroundings, fixing a balance that's felt out of place for some time now,” says Chaudhri.

In a screenshot, you can see an image of the wearable device attached to Chaudhri’s jacket.

There appears to be a camera and a pair (or more) of sensors.

One thing Mofina says Chaudhri shared at SXSW this year was that Humane’s wearable won’t have a “wake word” like Siri or Alexa. “He was shown interacting with it by voice, tapping it to start speaking to it. It also has LED lights that indicate when it’s listening, and when a call is coming in.”

In his talk, Chaudhri expresses the need to move beyond screens — the screens on computers of all shapes (desktops, smartphones, tablets, smartwatches) that he helped popularize while at Apple. “For the human-technology relationship to actually evolve beyond screens, we need something radically different.”

Image credit: Screenshots via Zarif Ali

Several Twitter users have raised some important questions about how Humane’s wearable works, details of which were not shared in-depth at the TED talk.

“I wonder how much the Humane projector weighs? Will it weigh down a light shirt? Is it attached with a pin or a magnet?” tweeted MacRumors contributing writer Steve Moser. “What’s it like to accidentally shine it right into someone’s eyeball? Is there a recording light when the camera is on? How wide of an angle does it project?” We’re all wondering the same, Moser. We’re all wondering the same.
