Japan will send a transforming robot ball to the moon to test lunar rover tech | Space

“The transformable lunar robot will be an ultra-compact and ultra-lightweight robot that can traverse in the harsh lunar environment,” JAXA stated. The robot’s diminutive size and small mass of 250 grams, JAXA added, “contributes to a reduction in volume during transportation to the moon. Therefore, it is expected to play active roles in future lunar exploration missions as well.”

Source: Japan will send a transforming robot ball to the moon to test lunar rover tech | Space

Navy’s Autonomous Swarm Boats Move Closer to the Battlefield | WIRED

“Future versions of these systems will be armed with non-lethal weapons that could shut down the engines of the targeted boat, and even lethal weapons that could be remotely operated by humans from afar,” says military analyst Peter Singer. “Israel, for instance, has a version that’s armed with a machine gun.”

Source: Navy’s Autonomous Swarm Boats Move Closer to the Battlefield | WIRED

IBM’s Watson Cognitive AI Platform Evolves, Senses Feelings And Dances Gangnam Style

Watson is more capable and human-like than ever before, especially when injected into a robot body. We got to see this first-hand at NVIDIA’s GPU Technology Conference (GTC) when Rob High, an IBM fellow, vice president, and chief technology officer for Watson, introduced attendees to a robot powered by Watson. During the demonstration, we saw Watson in robot form respond to queries just like a human would, using not only speech but movement as well. When Watson’s dancing skills were called into question, the robot responded by showing off its Gangnam Style moves.

This is the next level of cognitive computing that’s beginning to take shape now, both in terms of what Watson can do when given the proper form, and what it can sense. Just like a real person, the underlying AI can get a read on people through movement and cognitive analysis of their speech. It can determine mood, tone, inflection, and so forth.

Source: IBM’s Watson Cognitive AI Platform Evolves, Senses Feelings And Dances Gangnam Style
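
For a sense of how an application might tap that kind of tone analysis, here is a rough Python sketch. IBM did offer a Watson Tone Analyzer service around this time, but the endpoint URL, version string, authentication scheme, and response fields below are assumptions for illustration only, not documented values.

```python
# Hypothetical sketch of calling a Watson-style tone-analysis service over REST.
# The URL, version parameter, auth scheme, and response shape are assumptions,
# not IBM's documented API.
import requests

def analyze_tone(text: str, api_key: str) -> dict:
    url = "https://example.watsonplatform.net/tone-analyzer/api/v3/tone"  # placeholder host
    resp = requests.post(
        url,
        params={"version": "2016-05-19"},   # assumed date-based versioning
        json={"text": text},
        auth=("apikey", api_key),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. scores for joy, anger, sadness, analytical, ...

if __name__ == "__main__":
    print(analyze_tone("I can't believe the robot just danced Gangnam Style!", "YOUR_KEY"))
```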

The funny things happening on the way to singularity | TechCrunch

It seems more obvious every day that man and machine are rapidly converging. The transparency inherent in technology will eventually destroy privacy, and automation will eventually eliminate the need for human labor. There’s only a short window of time between now and then, and we need a master plan for managing the disruption that comes with it.

Source: The funny things happening on the way to singularity | TechCrunch

Robotics makes baby steps toward solving Japan’s child care shortage | The Japan Times

“Unlike human day care staff, the Or-B don’t suffer from mental or physical fatigue. They’ll never tire of repeating the same stories and performing the same daily tasks,” Hara said.

“Furthermore, as they can access a vast library of ‘Anpanman’ and ‘Teletubbies’ episodes, they can quickly defuse any temper tantrum and crying jag that might occur.”

In terms of teaching and nurturing, Or-B units have certain advantages.

“Or-B’s voice can be female, male or gender neutral,” said Yoshikazu Musaki, a specialist in early childhood education. Furthermore, its learning capabilities, coupled with the latest in artificial intelligence, will allow it to customize its care to each child, Musaki added.

Source: Robotics makes baby steps toward solving Japan’s child care shortage | The Japan Times

We all think robots are going to steal other people’s jobs

A majority of Americans – 65 percent – now believe that robots will “definitely” or “probably” take over much of the work we humans do within 50 years – but less than 20 percent see this transformation happening to their current job.

Even though this is already happening, many more people worry about being pushed out by someone undercutting them than about their role simply being automated out of existence, according to a Pew survey of 2,000 people.

Source: We all think robots are going to steal other people’s jobs

Why You Want Your Drone to Have Emotions – IEEE Spectrum

Researchers from Stanford University, led by Dr. Jessica Cauchard, have established an “emotional model space” for drones: a set of eight emotional states (personalities), each with defining characteristics that human users can easily recognize and that the drone can convey through simple actions. These personalities are brave, dopey, sleepy, grumpy, happy, sad, scared, and shy. A drone with a brave personality, for example, moves quickly and smoothly, and if you ask it to go backwards, it will instead turn around and go forwards. A dopey drone flies a little wobbly, a grumpy drone may require you to repeat commands, and a sad drone flies low to the ground.

Source: Why You Want Your Drone to Have Emotions – IEEE Spectrum
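
To make the idea concrete, here is a minimal Python sketch of what such a personality-to-behavior mapping could look like. The parameter names and numbers are invented for illustration (only four of the eight personalities are shown); this is not the Stanford team’s actual model.

```python
# Hypothetical sketch: each personality reduces to a few motion parameters that
# shape how a command is executed. All names and values are illustrative.
from dataclasses import dataclass

@dataclass
class Personality:
    speed: float           # relative cruise speed (1.0 = nominal)
    wobble: float          # lateral jitter amplitude in meters
    obey_reverse: bool     # whether "backward" is executed as commanded
    repeat_threshold: int  # how many times a command must be given before acting
    altitude_bias: float   # offset from the commanded altitude in meters

PERSONALITIES = {
    "brave":  Personality(1.4, 0.0, False, 1, 0.0),   # fast, smooth, refuses to retreat
    "dopey":  Personality(0.9, 0.3, True,  1, 0.0),   # a little wobbly
    "grumpy": Personality(0.8, 0.0, True,  2, 0.0),   # makes you repeat yourself
    "sad":    Personality(0.7, 0.0, True,  1, -1.0),  # flies low to the ground
}

def plan_motion(personality_name: str, command: str, times_given: int) -> str:
    """Translate a high-level command into a motion description shaped by personality."""
    p = PERSONALITIES[personality_name]
    if times_given < p.repeat_threshold:
        return "ignore command (waiting for it to be repeated)"
    if command == "backward" and not p.obey_reverse:
        return f"turn around and fly forward at {p.speed:.1f}x speed"
    return (f"fly {command} at {p.speed:.1f}x speed, "
            f"wobble ±{p.wobble} m, altitude offset {p.altitude_bias} m")

print(plan_motion("brave", "backward", times_given=1))
print(plan_motion("grumpy", "forward", times_given=1))
```

The point is simply that each personality collapses into a handful of motion parameters a human observer can read at a glance.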

Deep learning helps robots perfect skills | KurzweilAI

Deep learning enables the robot to perceive its immediate environment, including the location and movement of its limbs. Reinforcement learning means improving at a task through trial and error. A robot with both skills can refine its performance based on real-time feedback.

Applications for such a skilled robot might range from helping humans with tedious housekeeping chores to assisting in highly detailed surgery. In fact, Abbeel says, “Robots might even be able to teach other robots.” Or humans?

Source: Deep learning helps robots perfect skills | KurzweilAI
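
The pairing the article describes, a learned observation-to-action mapping refined by trial and error against a reward signal, can be sketched in a few lines of Python. This toy REINFORCE-style loop is not the Berkeley system: the one-layer “policy,” the reward function, and all constants are stand-ins chosen for illustration.

```python
# Toy sketch of "perceive, act, get rewarded, improve": a linear policy maps an
# observation to a motor command, and a REINFORCE-style update nudges the policy
# toward actions that scored well. Everything here is a stand-in for a real
# robot's sensors, joints, and task score.
import numpy as np

rng = np.random.default_rng(0)
obs_dim, act_dim = 4, 2
A_true = rng.normal(size=(act_dim, obs_dim))   # unknown "correct" behaviour to discover
W = np.zeros((act_dim, obs_dim))               # policy weights, learned from reward alone
noise_std, lr, baseline = 0.5, 1e-3, 0.0

def reward(obs: np.ndarray, action: np.ndarray) -> float:
    # Task score: how close the chosen motion is to the correct motion for this observation.
    return -float(np.sum((action - A_true @ obs) ** 2))

rewards = []
for step in range(5000):
    obs = rng.normal(size=obs_dim)                                # stand-in sensor reading
    mean_action = W @ obs                                         # policy's intended motion
    action = mean_action + noise_std * rng.normal(size=act_dim)   # exploration noise
    r = reward(obs, action)
    advantage = r - baseline                                      # compare against recent average
    baseline += 0.05 * (r - baseline)
    # REINFORCE update: make well-scored actions more likely in the future.
    W += lr * advantage * np.outer((action - mean_action) / noise_std**2, obs)
    rewards.append(r)

print("mean reward, first 500 steps:", np.mean(rewards[:500]))
print("mean reward, last 500 steps: ", np.mean(rewards[-500:]))
```

A real robot would replace the random vectors with camera and joint readings and the quadratic reward with a task score, but the improve-by-feedback loop is the same shape.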

Rats vs. computers vs. rat cyborgs in maze navigation | KurzweilAI

What would happen if we combined synthetic and biological systems, creating an intelligent cyborg rat? How would it perform?

Researchers in China decided to find out by comparing the problem-solving abilities of rats, computers, and rat-computer “cyborgs,” as they reported in an open-access PLOS ONE paper.

Source: Rats vs. computers vs. rat cyborgs in maze navigation | KurzweilAI
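
As a point of reference for the “computer” contestant, a purely synthetic maze solver can be as simple as a breadth-first search over the maze grid. The excerpt does not describe the algorithms used in the PLOS ONE study, so the following Python is a generic illustration, not the authors’ method.

```python
# Generic breadth-first-search maze solver: explores cells outward from the start
# and reconstructs the shortest path once the goal is reached.
from collections import deque

def solve_maze(grid, start, goal):
    """grid: list of strings, '#' = wall, '.' = open. Returns a start-to-goal path or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == "." and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

maze = ["....#",
        ".##.#",
        ".#...",
        ".#.#.",
        "...#."]
print(solve_maze(maze, (0, 0), (4, 4)))
```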