IBM Introduces Cognitive Computing to Developers with Project Intu
To help coders get a handle on the emerging practice of cognitive computing, IBM has launched an experimental service called Project Intu, which provides a way to quickly integrate IBM Bluemix cognitive services across a variety of computing platforms.
IBM debuted Project Intu at the company’s Watson Developer Conference in San Francisco last month, allowing developers to test out services such as speech-to-text. Attending developers were given free access to Watson’s Bluemix services and to Intu, and each presentation was followed by a “Meet the Experts” working session where devs were encouraged to replicate the results for themselves.
A large work area was devoted to Project Intu, which is designed to work out-of-the-box on the device of your choice and includes a set of sample cognitive abilities, so you can immediately start having a conversation with Watson. The pre-configured product includes bindings to other services, giving developers a quick way to get up and running with Watson.
It was a huge hit among conference attendees. Over 350 developers took advantage of the mentoring provided on site at the work area, and hundreds more have accessed Project Intu since the convention, said Rob High, IBM chief technology officer and vice president for Watson, in a phone interview. Developers can write agents to extend the behavior that Intu enables, and High said he is excited about the impact this can have on creating cognitive experiences.
Bluemix services are grouped into zones, such as data systems, IoT, network, security, analytics, mobile, etc. “We don’t deliver you solutions,” said High during his presentation at the conference, “we provide the building blocks that allow you to create your own great solutions.”
The online workshops show how to extend the delivered functionality. For example, they show how to create your own conversations with your own intents and dialogues. More interestingly, they also demonstrate how to connect your device’s other capabilities (e.g., GPS, gyroscope, camera) and incorporate that data into the conversation’s behavior.
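As a rough illustration of that idea, the sketch below folds device sensor readings into the context that accompanies a user's utterance, so a dialog layer could condition its reply on both. All names here (`SensorReading`, `build_context`) are invented for illustration and are not Intu's actual API.

```python
# Illustrative only: merging sensor data into a conversation context.
# These types and functions are hypothetical, not part of Project Intu.
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str   # e.g. "gps", "gyroscope", "camera"
    value: object

def build_context(utterance: str, readings: list) -> dict:
    """Combine the user's words with current device readings so the
    dialog layer can use both when choosing a response."""
    return {
        "text": utterance,
        "sensors": {r.source: r.value for r in readings},
    }

ctx = build_context(
    "What's near me?",
    [SensorReading("gps", (37.77, -122.42))],
)
```

A behavior extension could then consult `ctx["sensors"]` before answering, for example using the GPS fix to ground “near me” in an actual location.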
“Emotion is how the person is feeling, tone is the feeling they are trying to create in you” — Rob High, IBM
Amplifying human cognition ought to inspire people in some way, he explained. “The sum is greater than its parts,” High said. “It leaves you more informed than if you just got a search result.”
Through vocalization, actions, or other cognitive behaviors, the machine can give users a more intuitive interaction with the cognitive system. This intuitive experience provides a slightly different perspective, High said. And, he hopes, users will leave at least a tiny bit inspired.
How Do You Do That?
Having the right information at the right time is part of a successful cognitive application, explained High, but it’s not just about the words: cadence and intonation, the way the words are spoken, help act out the meaning you want to send. At the end of the day, the point of cognitive capabilities is to affect human experiences.
Bluemix currently offers 30 cognitive-related services, each designed to help you address some aspect of that overall problem, explained High. Watson solutions are categorized into three main service areas, roughly corresponding to Nobel Memorial Prize in Economics laureate Daniel Kahneman’s work on the fast and slow mind. The three areas of service are:
- Higher Reasoning Skills
Almost all human interaction stems from the ability to maintain a conversation. Even reading a book or watching TV has a conversation overlay, said High. It’s the most fundamental building block when it comes to reasoning. Each conversation is broken down into three basic parts that the developer needs to address:
- Identify what was said, and discern the intention behind that.
- Map out the entities that may have been expressed within it.
- Identify what to do next with that information.
Watson’s set of tools allows developers to build this interaction conversationally, and some of the basic tools are included in Project Intu.
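The three steps above can be sketched in miniature. The toy code below uses simple keyword matching as a stand-in for the statistical models a real Watson service would use; the intents, entities, and action names are all invented for illustration.

```python
# Toy sketch of the three conversational steps: classify the intent,
# extract entities, then decide what to do next. Keyword matching here
# stands in for a real cognitive service's trained models.

INTENTS = {
    "greeting": ["hello", "hi"],
    "order_status": ["order", "shipped", "delivery"],
}
ENTITIES = {"product": ["laptop", "phone"]}

def classify_intent(text: str) -> str:
    """Step 1: identify what was said and the intention behind it."""
    words = text.lower().split()
    for intent, keywords in INTENTS.items():
        if any(k in words for k in keywords):
            return intent
    return "unknown"

def extract_entities(text: str) -> dict:
    """Step 2: map out the entities expressed in the utterance."""
    words = text.lower().split()
    return {etype: [v for v in values if v in words]
            for etype, values in ENTITIES.items()}

def next_action(intent: str, entities: dict) -> str:
    """Step 3: decide what to do with that information."""
    if intent == "order_status" and entities.get("product"):
        return f"look_up_order:{entities['product'][0]}"
    if intent == "greeting":
        return "reply_greeting"
    return "ask_clarification"

utterance = "Has my laptop order shipped?"
action = next_action(classify_intent(utterance), extract_entities(utterance))
```

In a production system each of these functions would be backed by a trained service rather than keyword lists, but the developer's job of defining intents, entities, and dialog branches has the same shape.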
- Foundational Cognitive Skills
After you have the conversation, you want to understand the person behind it — their mood, their personality. Watson provides tools regarding speech, vision, empathy, and understanding the tone of the conversation. Being able to understand the tone, if it’s happy or angry or disgusted, allows Watson to compose better answers. Think of a customer service bot, which would answer the same question with different answers depending on if the customer was angry or sad.
You also need to understand emotions, said High. “Emotion is how the person is feeling, tone is the feeling they are trying to create in you.” These elements can lead you to understanding personality. Watson’s speech-processing tools include controls to discern the difference between a sorrowful statement and an angry statement, to better continue the conversation.
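The customer-service idea above can be sketched as follows: the same factual answer gets different wording depending on the dominant detected tone. The tone scores here are hand-made stand-ins for what a tone-analysis service would return, and the function name is hypothetical.

```python
# Illustrative only: choose reply wording based on detected tone.
# The tone scores are stand-ins for a real tone-analysis service's output.

def pick_reply(answer: str, tones: dict) -> str:
    """Prefix the factual answer with wording matched to the dominant tone."""
    dominant = max(tones, key=tones.get)
    prefixes = {
        "anger": "I'm sorry for the trouble. ",
        "sadness": "I understand this is frustrating. ",
        "joy": "Glad to help! ",
    }
    return prefixes.get(dominant, "") + answer

reply = pick_reply("Your refund arrives in 3 days.",
                   {"anger": 0.8, "joy": 0.1})
```

An angry customer and a cheerful one would each receive the same underlying fact, composed differently.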
- Knowledge Organizational Skills
These Watson services address intellectual cognition: the intelligence behind the cognitive interaction, including how to intelligently approach somebody, present the information they need to know, and intonate and vocalize things in a way that is expressive for them.
Gestures are equally important in conversation and need to be included in bot conversations. Think of a concierge bot, said High, which needs to gesture toward the elevator while saying “it’s over there.” These gestures are part of the cognitive experience.
What Does This Look Like in the Real World?
High gave an example of how these components might be put together in a commercial service. If a person comes to a Watson-enabled website to ask about buying a mortgage, the cognitive assistant can infer from their other data that they don’t just want to know mortgage rates; they might also have questions about housing prices, or what the neighborhood and school district are like for a particular house. Don’t have kids? Then you won’t see any school information. Frequent coffee shops? You’ll see a list of nearby coffee shops returned with your housing search, and perhaps some information about how to cope with the stress of moving.
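The personalization High describes can be sketched as a simple filter over search results. The profile fields and result keys below are invented for illustration; a real assistant would infer the profile from the user's data rather than receive it directly.

```python
# Toy sketch: tailor housing-search results to an inferred user profile.
# Profile fields and result keys are hypothetical.

def personalize(results: list, profile: dict) -> list:
    """Drop or add result fields based on what we infer about the user."""
    out = []
    for r in results:
        r = dict(r)  # copy so the original results are untouched
        if not profile.get("has_kids"):
            r.pop("school_district", None)      # no kids: hide school info
        if not profile.get("likes_coffee"):
            r.pop("nearby_coffee_shops", None)  # not a coffee drinker
        out.append(r)
    return out

listings = [{"address": "12 Oak St",
             "school_district": "Lincoln",
             "nearby_coffee_shops": ["Bean There"]}]
tailored = personalize(listings, {"has_kids": False, "likes_coffee": True})
```

The same listing thus renders differently for different users, which is the substance of High's example.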
There are two key messages for big data, said High. The first is the need to understand what mix of quantitative and qualitative analytics is needed to respond to different business and social problems. Analytics is only the first part of the solution; what is needed is “something that speaks the customer’s particular problem in their particular set of circumstances.”
The second thing to understand, said High, is that “everything we know about computers in the past, and all of that is going to be surpassed by the amount of time spent in cognitive spaces.” Cognitive computing is expanding into everyday activities, into new spaces where it can learn all the things we have never had time to learn. The systems “will understand us, our motivations, our personalities, responding to us, learning what our needs are and responding emotionally.”
On a final note: developers are key in this new world, and there are not enough people who understand machine learning. But projects like Intu will help more coders get on board with the impending cognitive computing revolution.
IBM is a sponsor of The New Stack
Photos by T.C. Currie.