5 pieces of robot and AI news, and an important announcement
Some breakthroughs that help robots see as we do. And one that helps us see as robots do.
This week’s 5 Bits of News
Helping AI and robots to see things as we do.
Meta has released a “multimodal” data set of people doing things, with every action seen both by an observer and from the POV of the human who is doing it. The idea is to mimic how you, a human, learn an action — watching it, then mapping what you’ve seen onto your own movements. Another multimodal approach is to strap a headcam on a human toddler and then let an AI learn from seeing and hearing the world from the kid’s perspective. That’s what a team from NYU did.
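If you’re curious what “learning from paired views” can look like under the hood, here’s a minimal sketch of one common technique: CLIP-style contrastive training, which pulls the embeddings of two views of the same moment together and pushes mismatched pairs apart. To be clear, this is my illustration of the general idea, not Meta’s or NYU’s actual code; the encoder outputs and dimensions below are made up.

```python
# Minimal sketch of contrastive alignment between two views of the same
# action (e.g., observer video vs. first-person video). Hypothetical
# illustration only -- not Meta's or NYU's actual training code.
import torch
import torch.nn.functional as F

def contrastive_loss(ego_emb, exo_emb, temperature=0.07):
    """Pull matching ego/exo pairs together, push mismatched pairs apart."""
    ego = F.normalize(ego_emb, dim=-1)    # (batch, dim)
    exo = F.normalize(exo_emb, dim=-1)    # (batch, dim)
    logits = ego @ exo.t() / temperature  # pairwise similarities
    targets = torch.arange(len(ego))      # i-th ego matches i-th exo
    # Symmetric InfoNCE: find the correct partner in both directions.
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# Toy usage: in practice these embeddings would come from video encoders.
ego = torch.randn(8, 256)
exo = torch.randn(8, 256)
print(contrastive_loss(ego, exo))
```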
Automating that special gesture that says “I care”
Want to send a handwritten note, for that personal, human touch? There’s a robot for that.
AI perched on your nose for $350
Coming in April: a pair of Augmented Reality glasses from Brilliant Labs that cost about the same as a not-too-high-end pair of ordinary glasses, and weigh the same or less.
Look at text in a foreign language, they'll show you the translation. Look at a berry, they'll break down its nutritional info. Look at a shoe, they'll get you the price. They're integrated with GPT-4 (words), Stability (images), Whisper AI (speech transcription) and Perplexity.ai (search), among other things.
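Under the hood, a single “look and ask” interaction on a device like this presumably chains a camera frame and a question through a vision-language model. Brilliant Labs hasn’t published its pipeline, so here’s a rough, hypothetical sketch of one such cycle, using OpenAI’s public API as a stand-in; the model choice and the helper function are my assumptions.

```python
# Rough sketch of one "look at a thing, ask about it" cycle, using
# OpenAI's public API as a stand-in. Brilliant Labs hasn't published
# its pipeline; the model name and helper here are assumptions.
import base64
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask_about_frame(image_path: str, question: str) -> str:
    """Send a camera frame plus a question to a vision-language model."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()
    response = client.chat.completions.create(
        model="gpt-4-vision-preview",  # assumed model choice
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
        max_tokens=200,
    )
    return response.choices[0].message.content

# e.g., the berry scenario from above:
# print(ask_about_frame("frame.jpg", "What berry is this, and is it edible?"))
```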
So, hmmm. Could a cannibal get the glasses to tell him the nutritional breakdown of a neighbor? Could a panpsychist ask the device to tell her how a tree is feeling? How about a mod for straight men that will tell them if a woman is interested or is just being polite? Could a future pair of these use AI to evaluate someone's speech, expressions and body language to tell the wearer that he's lying? The mind boggles.
Fun to speculate where this could lead. But right here, right now, as I envision people using these to solve daily problems, this thing feels big. A device that looks “normal,” weighs no more than regular glasses, costs about the same, and doesn’t shout out to the world that you’re using it — that’s an AI/AR product that, for once, I can imagine a lot of people wanting. Have I caught a hype virus here, or do you feel it too?
Another indicator that no nation can robot its way out of labor shortages
Spend a few minutes reading about robotics in Japan and you'll encounter the idea that Japanese society is open to robots because it doesn't want a lot of immigrants. This desire to keep the door shut may well be part of the motivation for the Japanese government's Society 5.0 plan for a nation chock-full of AI and robot involvement in daily life.
In the real world of 2024, though, no society can robot its way out of our century's global demographic problem: way more old people, way fewer people of working age. On Friday, the Japanese government loosened its rules on immigrant workers, creating a path for more of them to stay in Japan long-term.
Reminder: Not everyone likes robots.
A Waymo self-driving taxi got vandalized and set on fire in San Francisco.
Literary Note of the Week:
“The machines reproduce, simplify and multiply vital processes. They seduce and terrify us because they give us at one and the same time the sensation of intelligence and unconsciousness: all that they do, they do well, but they don't know what they are doing.” Octavio Paz
Coming Soon: The Podcast!
And, finally, some news about this blog. I want to post more, and to bring voices into this space other than my own. So, I’m launching an ongoing series of interviews, aka a podcast. Each episode will be a conversation with a person with insight into robots, AI and/or the way they affect us. I hope you’ll give it a listen. Episode 1 will be out in a couple of days.