IBM’s Watson Cognitive AI Platform Evolves, Senses Feelings And Dances Gangnam Style

Watson is more capable and human-like than ever before, especially when injected into a robot body. We got to see this first-hand at NVIDIA’s GPU Technology Conference (GTC) when Rob High, an IBM fellow, vice president, and chief technology officer for Watson, introduced attendees to a robot powered by Watson. During the demonstration, we saw Watson in robot form respond to queries just like a human would, using not only speech but movement as well. When Watson’s dancing skills were called into question, the robot responded by showing off its Gangnam Style moves.

This is the next level of cognitive computing that’s beginning to take shape now, both in terms of what Watson can do when given the proper form, and what it can sense. Just like a real person, the underlying AI can get a read on people through movement and cognitive analysis of their speech. It can determine mood, tone, inflection, and so forth.
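For a sense of what that tone-and-mood analysis looks like from a developer's side, here is a minimal, illustrative sketch of posting a speech transcript to a tone-analysis REST service (IBM offered this kind of capability as the Watson Tone Analyzer). The endpoint URL, credential, API version, and response fields in the sketch are placeholders, not the real API contract.

```python
# Illustrative sketch: sending transcribed speech to a tone-analysis service.
# The endpoint, credential, version string, and response schema below are
# placeholders -- consult the actual service docs for the real contract.
import requests

TONE_ENDPOINT = "https://example.com/tone-analyzer/api/v3/tone"  # placeholder URL
API_KEY = "YOUR_API_KEY"                                         # placeholder credential

def analyze_tone(transcript: str) -> dict:
    """Send a speech transcript and return the service's tone scores."""
    response = requests.post(
        TONE_ENDPOINT,
        params={"version": "2016-05-19"},              # hypothetical API version
        json={"text": transcript},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    # A tone service typically returns per-emotion confidence scores,
    # e.g. {"joy": 0.81, "anger": 0.02, ...}; the exact schema varies.
    return response.json()

if __name__ == "__main__":
    print(analyze_tone("I can't believe the demo worked on the first try!"))
```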

Source: IBM’s Watson Cognitive AI Platform Evolves, Senses Feelings And Dances Gangnam Style

IBM Looks to Make Watson More Humanlike

Watson and other systems, as they become more intelligent, “will have to communicate with us on our terms,” High said. “They will have to adapt to our needs, rather than us needing to interpret and adapt to them.”

They will have to not only understand the questions humans ask and the statements they make, but also pick up on all the visual and other non-verbal cues—such as facial expressions, the emphasis placed on words in a sentence, and the tone of voice—just as people do in the normal course of interacting with each other. High wants to “change the role between humans and computers.”

Source: IBM Looks to Make Watson More Humanlike

Managed services killed DevOps | TechCrunch

Today, developers are increasingly turning to managed services for toolsets and infrastructure requirements — tasks traditionally managed by DevOps teams. Amazon Web Services and other managed service providers have allowed for a dramatically simplified way of working, reducing complexity on the developer end and, thus, allowing them to focus on software development instead of installing databases and ensuring processes like backup, redundancy and uptime. In other words, managed services removed a lot of headaches with which DevOps teams were forced to deal.

While it might be hard for some people to accept, the only conclusion can be that DevOps teams are creating the same problem they were initially built to solve. DevOps was established to speed things up, but because of the nature of managed services today, you no longer need a whole team to facilitate them — why not simply teach all developers how to utilize the infrastructure tools in the cloud? The truth is, like QA before it, DevOps has itself become an unnecessary step in the continuous deployment process. As such, it is obsolete.
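To make that concrete, here is a minimal sketch of what "using the infrastructure tools in the cloud" can look like for a developer: provisioning a managed PostgreSQL instance on AWS RDS with boto3 and letting the provider handle backups and redundancy. The instance identifier, sizing, and credentials are arbitrary examples, not recommendations.

```python
# Minimal sketch: a developer provisioning a managed database directly,
# leaving backups, redundancy, and patching to the cloud provider.
# Identifier, size, and credentials below are arbitrary examples.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

rds.create_db_instance(
    DBInstanceIdentifier="app-db",       # example identifier
    Engine="postgres",
    DBInstanceClass="db.t2.micro",       # example instance size
    AllocatedStorage=20,                 # GiB
    MasterUsername="appuser",
    MasterUserPassword="change-me",      # use a secrets manager in practice
    BackupRetentionPeriod=7,             # provider-managed daily backups
    MultiAZ=True,                        # provider-managed redundancy/failover
)

# From here, the provider handles backup windows, failover, and patching;
# the developer only consumes the connection endpoint it exposes.
```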

Source: Managed services killed DevOps | TechCrunch

The funny things happening on the way to singularity | TechCrunch

It seems more obvious every day that man and machine are quickly assimilating. The transparency inherent in technology will eventually destroy privacy. Automation will eventually eliminate the need for human labor. There’s a short window of time between now and then. We need a master plan for managing the disruption that will come with it.

Source: The funny things happening on the way to singularity | TechCrunch

Alexa, Cortana, and Siri aren’t novelties anymore. They’re our terrifyingly convenient future.

“Alexa—and Siri and Cortana and all of the other virtual assistants that now populate our computers, phones, and living rooms—are just beginning to insinuate themselves, sometimes stealthily, sometimes overtly, and sometimes a tad creepily, into the rhythms of our daily lives. As they grow smarter and more capable, they will routinely surprise us by making our lives easier, and we’ll steadily become more reliant on them.”

Source: Alexa, Cortana, and Siri aren’t novelties anymore. They’re our terrifyingly convenient future.

New microscope controls brain activity of live animals — ScienceDaily

For the first time, researchers have developed a microscope capable of observing — and manipulating — neural activity in the brains of live animals at the scale of a single cell with millisecond precision. The device, which uses lasers to create holographic images within the brain, is envisioned as a “Rosetta Stone” to crack the code on how brains work.

Source: New microscope controls brain activity of live animals — ScienceDaily

Mapping the Brain to Build Better Machines | Quanta Magazine

An ambitious new program, funded by the federal government’s intelligence arm, aims to bring artificial intelligence more in line with our own mental powers. Three teams composed of neuroscientists and computer scientists will attempt to figure out how the brain performs such feats of visual identification, then make machines that do the same. “Today’s machine learning fails where humans excel,” said Jacob Vogelstein, who heads the program at the Intelligence Advanced Research Projects Activity (IARPA).

Source: Mapping the Brain to Build Better Machines | Quanta Magazine

Stop ‘innovating’: Aim higher | VentureBeat | Entrepreneur | by Iliya Rybchin, Highnote Foundry

When NASA achieved JFK’s goal of putting a man on the moon, it somehow managed to do so without bestselling innovation books, 10-step innovation processes, or innovation consultants. By any definition of the word, putting a man on the moon was innovative. It changed the basis of competition in the space race, and it had meaningful societal consequences for decades to come. Everyone working on the moon landing knew they were innovating — the word meant something.

Source: Stop ‘innovating’: Aim higher | VentureBeat | Entrepreneur | by Iliya Rybchin, Highnote Foundry

Robotics makes baby steps toward solving Japan’s child care shortage | The Japan Times

“Unlike human day care staff, the Or-B don’t suffer from mental or physical fatigue. They’ll never tire of repeating the same stories and performing the same daily tasks,” Hara said.

“Furthermore, as they can access a vast library of ‘Anpanman’ and ‘Teletubbies’ episodes, they can quickly defuse any temper tantrum and crying jag that might occur.”

In terms of teaching and nurturing, Or-B units have certain advantages.

“Or-B’s voice can be female, male or gender neutral,” said Yoshikazu Musaki, a specialist in early childhood education. Furthermore, its learning capabilities, coupled with the latest in artificial intelligence, will allow it to customize its care to each child, Musaki added.

Source: Robotics makes baby steps toward solving Japan’s child care shortage | The Japan Times

Why Learning To Code Won’t Save Your Job | Fast Company | Business + Innovation

Although I certainly believe that any member of our highly digital society should be familiar with how these platforms work, universal code literacy won’t solve our employment crisis any more than the universal ability to read and write would result in a full-employment economy of book publishing.

It’s actually worse. A single computer program written by perhaps a dozen developers can wipe out hundreds of jobs. As the author and entrepreneur Andrew Keen has pointed out, digital companies employ 10 times fewer people …

Source: Why Learning To Code Won’t Save Your Job | Fast Company | Business + Innovation