IBM Can See What Is Next in Search: Voice to Text and APIs. Yes, APIs. APIs Do You Hear?
July 13, 2015
I just don’t believe it. You may. Navigate to “IBM Sees the Next Phase of Search.” The write up trots out the cognitive computing thing. That’s jargon buzz for smart software. (Google, as you may know, is making its band play this artificial intelligence tune as well.)
Here’s the paragraph next to which I put a multi-stroke red exclamation point:
IBM has released several applications that will help move search services into the next phase as voice search replaces the act of typing keywords into a search box, and the need for something in a specific moment in time replaces intent. Humans will express a need and devices from cars to inanimate objects like refrigerators will respond in a more natural way, according to IBM VP of Watson Core Technology Jerome Pesenti. Many of these platforms will allow consumers to interact with Internet-connected devices.
Now Pesenti is one of the founders of Vivisimo, an outfit with nifty on-the-fly clustering and deduplicating technology. Scaling was not a core competency. Vivisimo also magically and instantly morphed into a Big Data company as soon as IBM purchased it for about one year’s projected revenues (estimated at $20 million).
I am reasonably confident that Pesenti, who was hooked up with Carnegie Mellon, is aware of voice-to-text, gestures, and software which “watches” a user to try to figure out what the wacky human wants: to call a significant other, order a pizza, or look for clarification on the mathematical procedures for calculating an Einstein manifold.
The write up explains that IBM is offering application programming interfaces to Watson. Be still my heart. My goodness, APIs.
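For the curious, here is a minimal sketch of what exercising one of those APIs might look like. Watson Developer Cloud services of this era were provisioned through Bluemix and used HTTP Basic auth over REST; the credentials below are placeholders, and the exact host and path for the Speech to Text service are my recollection of the documentation, not something the write up confirms, so treat them as assumptions. The sketch builds the request without sending it.

```python
import base64
import urllib.request

# Placeholder credentials, not real ones; Watson services of this
# period issued per-service username/password pairs via Bluemix.
USERNAME = "example-user"
PASSWORD = "example-password"

# Endpoint as I recall it from the 2015-era Speech to Text docs;
# the host and path are assumptions, not taken from the write up.
STT_URL = "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize"

def build_recognize_request(audio_bytes: bytes,
                            content_type: str = "audio/wav"):
    """Build (but do not send) a POST to the recognize endpoint."""
    token = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
    return urllib.request.Request(
        STT_URL,
        data=audio_bytes,
        headers={
            "Content-Type": content_type,
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )

# A dummy payload stands in for real audio data.
req = build_recognize_request(b"\x00\x01")
print(req.full_url)
print(req.get_method())
```

Nothing exotic: one POST with an audio payload, which is roughly what “APIs to Watson” cashes out to in practice.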
I find it interesting that IBM’s expensive, 24×7 Watson marketing campaign is reaching “real” journalists who are really excited about APIs. APIs!
Vivisimo used to make available a number of wacky demonstrations of its technology. Perhaps Pesenti and Watson will make available a public demo on the corpus of Wikipedia or the Hacker News content.
I don’t need wonky jargon; I need to bang on a system to see how others react. I want metrics for content processing. I want latency data. I want system resource consumption data. I want something other than hints and API talk.
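The kind of banging I have in mind is not complicated. Here is a toy sketch of the sort of latency measurement anyone with a demo endpoint could run; since no public Watson demo is cited, a stand-in in-memory matcher over a few Wikipedia-style titles takes the place of the real engine, and all names here are my own invention.

```python
import statistics
import time

def measure_latency(search_fn, queries, warmup=2):
    """Time each query through search_fn; report simple percentiles.

    search_fn is any callable taking a query string. Wall-clock
    milliseconds per call is exactly the sort of number the post asks
    vendors to publish.
    """
    for q in queries[:warmup]:          # warm caches before measuring
        search_fn(q)
    samples = []
    for q in queries:
        start = time.perf_counter()
        search_fn(q)
        samples.append((time.perf_counter() - start) * 1000.0)  # ms
    samples.sort()
    return {
        "n": len(samples),
        "median_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * (len(samples) - 1))],
        "max_ms": samples[-1],
    }

# Stand-in corpus and matcher; a real test would hit a live system.
corpus = ["Einstein manifold", "Clustering", "Deduplication", "Big Data"]
toy_search = lambda q: [d for d in corpus if q.lower() in d.lower()]

report = measure_latency(toy_search, ["einstein", "data", "cluster"] * 10)
print(report)
```

Swap the toy matcher for a real endpoint and the same harness yields the latency data vendors rarely volunteer.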
Stephen E Arnold, July 13, 2015