Could Xyggy Be the Future of Search?
May 19, 2010
The internet is so amazing… we can find information and answers to almost any query we can think of. Sure, it may take time, some hits and misses, and enough frustration to raise blood pressure, but we can do it. That’s why we know more about everything today than anyone before us ever has. It doesn’t necessarily mean we use that information better… we just know it, or know where to find it. That’s because search does its ‘gee-whiz thing.’ And we are hooked.
However, whether we know it or not, we are lacking. “Search is nowhere near a solved problem,” says Amit Singhal, Google Research Fellow. “Although I’ve been at this for almost two decades now, I’d still guess that search isn’t quite out of its infancy yet. The science is probably just about at the point where we’re crawling. Soon we’ll walk. I hope that in my lifetime, I’ll see search enter its adolescence.”
That’s kind of a surprise because for us users, Google, Bing, Yahoo! and the like bring us hundreds of thousands of answers to many of our queries. Who could want for more?
Ah, there’s the key. We couldn’t want for more… but we could be more satisfied with what we get. Who can use 60.3 million results in 0.18 seconds (the actual Google return for the term ‘search engine’)? How about the top 20 results that are really what we want? How about a search tool that better fits smart phone capabilities? (See ATT offers a clue as to where search is going.) For 40 years now, technology limitations have confined us to text-driven search. It has been great… but it won’t cut it for tomorrow.
So who has a better idea? Enter Xyggy (say Ziggy), a comparatively new player in the search field with something very different. Founded in June 2008 by Dinesh Vadhia, its originator and CEO, Xyggy’s tagline is: find anything… and, most simply stated, that’s exactly what it’s all about.
Vadhia explains: “In our everyday lives we search for and find things all the time where the things–or if you prefer, call them objects or items–can literally be anything. Xyggy brings item-search into our digital lives where the items can be documents, web pages, images, social profiles, ads, audio, articles, video, investments, patents, resumes, medical records… in fact, anything ranging from text to non-text.
“Unlike text-search that finds text in documents, item-search finds relevant items just like we do in everyday life. An item is a thing and given a query of items, Xyggy finds other relevant items. This is a game changer. The discrepancy between finding things in our everyday lives and the dominant but narrow text-search mode in our digital lives is obvious. We can now bridge that chasm.”
According to Vadhia, who has been walking me through the jargon and technical stuff, Xyggy is based on sound statistical theory, Bayesian machine learning methods and psychological research.
What is Bayesian Sets, the algorithm used in Xyggy? “It is a new framework for information retrieval based on how humans learn new concepts and generalize,” explains Vadhia. “The inventors of the Bayesian Sets algorithm (patent pending) are Professor Zoubin Ghahramani and Dr. Katherine Heller, both at Cambridge University. In this framework, a query consists of a set of items that are examples of some concept that defines your search… e.g., the movies Titanic and The Terminator. That query suggests the concept of interest is movies directed by James Cameron, and therefore Bayesian Sets is likely to return other movies by Cameron.”
How does it work? “Human generalization has been intensely studied in cognitive science, and various models have been proposed based on some measure of similarity and feature relevance,” says Vadhia. “Recently, Bayesian methods have emerged as models of both human cognition and as the basis of machine learning systems. In other words, as in everyday life, we are searching for and finding things.”
“Given a query consisting of a small set of items (e.g. a few images of buildings) the task is to retrieve other items (e.g. other images) that belong to the concept exemplified by the query. To achieve this, we calculate a measure, or score, of how well an available item fits in with the query items using a Bayesian statistical model.”
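For the technically curious, the published Bayesian Sets paper by Ghahramani and Heller gives that score a closed form when items are represented as binary feature vectors. The short Python sketch below follows that published formulation; it is an illustration of the idea only, not Xyggy’s implementation, and the prior_scale value is an assumption made for the sketch.

```python
import numpy as np

def bayesian_sets_scores(X, query_idx, prior_scale=2.0):
    """Score every item against a query set of items, following the
    closed-form Bayesian Sets model for binary features
    (Ghahramani & Heller). A higher score means a better fit to the
    concept exemplified by the query items."""
    X = np.asarray(X, dtype=float)        # (n_items, n_features), entries 0/1
    N = len(query_idx)

    # Beta(alpha, beta) prior per feature, set from overall feature frequency
    # (the scaling constant is a modelling choice, assumed here to be 2.0).
    mean = X.mean(axis=0).clip(1e-6, 1 - 1e-6)
    alpha = prior_scale * mean
    beta = prior_scale * (1.0 - mean)

    # Posterior counts after "observing" the query items.
    on = X[query_idx].sum(axis=0)         # how often each feature is on in the query
    alpha_t = alpha + on
    beta_t = beta + N - on

    # log score(x) = c + q . x  -- linear in the candidate's feature vector.
    c = np.sum(np.log(alpha + beta) - np.log(alpha + beta + N)
               + np.log(beta_t) - np.log(beta))
    q = np.log(alpha_t) - np.log(alpha) - np.log(beta_t) + np.log(beta)
    return c + X @ q
```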
The Xyggy query box supports drag and drop as well as text input, so you can do most, or all, of the work with your cursor (or finger). You can begin your search with an image or with text. Best of all, put two or more items in the box to immediately give the algorithm more to draw from. If you have only one example, look at the initial results, pick a second example from them, drag it into the box, and watch the results zero in on what you actually want. If the items are documents, the on-screen items could be the document titles or an appropriate icon with a text label; for music search, a relevant image, and so on.
In fact, as you add more items to the search box, the algorithm automatically learns from the common features of the query items to return better results. For example, a query of pictures of a red car and a blue car will find other pictures with car features in common, whereas a picture of a red car and a tomato will find pictures that have redness in common. Context such as geo (location), temporal (time) and sensor information can also be added to the representation of items.
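To make that concrete, here is a toy run of the sketch above on a handful of invented items and hand-picked binary features. The items and features are made up purely for illustration and have nothing to do with Xyggy’s data; the point is only that the query set, not a keyword, defines what “relevant” means.

```python
# Hypothetical items described by five binary features.
#               wheels engine red round edible
items = {
    "red car":    [1, 1, 1, 0, 0],
    "blue car":   [1, 1, 0, 0, 0],
    "fire truck": [1, 1, 1, 0, 0],
    "bicycle":    [1, 0, 0, 0, 0],
    "tomato":     [0, 0, 1, 1, 1],
    "apple":      [0, 0, 1, 1, 1],
}
names = list(items)
X = [items[n] for n in names]

def top_matches(query_names):
    """Rank all items by their Bayesian Sets score against the query items."""
    idx = [names.index(n) for n in query_names]
    scores = bayesian_sets_scores(X, idx)
    return sorted(zip(names, scores), key=lambda pair: -pair[1])

# Query: red car + blue car -> the shared "car" features dominate, so the
# fire truck (and then the bicycle) rank highest among the non-query items.
print(top_matches(["red car", "blue car"]))

# Query: red car + tomato -> red-ness is the shared feature, so the apple
# and the fire truck bubble up instead.
print(top_matches(["red car", "tomato"]))
```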
There is a dynamic interplay between the search box and the results that attempts to make searching and finding more intuitive and fun but, more importantly, to surface the things you actually want. Using personal taste information, recommendation services for movies, music, books and more become possible. Maybe, just maybe, the metaphorical needle in the haystack can at last be found.
Xyggy Information Box
Item-based search where items can be any data type, ranging from text to non-text
Individual item represented by a vector of features of that item (one possible encoding is sketched after this box)
Drag-and-drop query box with dynamic interaction to improve relevance of results
Based on the Bayesian Sets algorithm
Scalable to user needs
Delivered on a cloud platform for developers to build and deploy item-search enabled Apps and services (first developer release slated for 2011)
Apps and services can include ads from 3rd-party networks
Supports all browsers
Search demo available at http://www.xyggy.com/patent.php
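The box says each item is represented by a vector of its features, but it leaves open what those features are. As one hedged illustration, a text item could be turned into a binary bag-of-words vector, as in the tiny sketch below; images, audio or patents would need their own feature extractors, and nothing here reflects Xyggy’s actual representation.

```python
def binary_bag_of_words(text, vocabulary):
    """Encode a document as a binary feature vector:
    1 if the vocabulary word appears in the text, 0 otherwise."""
    words = set(text.lower().split())
    return [1 if w in words else 0 for w in vocabulary]

vocab = ["search", "bayesian", "patent", "music", "image"]
doc = "A Bayesian approach to patent search"
print(binary_bag_of_words(doc, vocab))   # -> [1, 1, 1, 0, 0]
```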
Yes, Xyggy is different. It has to be. According to Nobel Prize-winning physicist and current U.S. Energy Secretary Steven Chu, “If you use an old tool to tackle a problem, you’ve got to be really smarter than the rest of the folks because everybody has this tool… It does little good to start with the problem. Without a new tool you only have conventional and well-traveled paths to address it. It may not be obvious what problem exists and can be solved, but starting with a new tool gives you a chance to create something truly disruptive.”
Xyggy is definitely NOT an old tool. It doesn’t build on or modify what we have been using for all of our computer-driven lives. It is not limited to text search, and it is actually most intuitive when ‘things’ themselves can be used as the query. Humans have a remarkable capacity to learn and generalize, and Bayesian Sets, the algorithm behind Xyggy, is modeled on that capacity. With any image or text usable as an example, we may well have found item-search for the digital world… or at least be off to a good running start.
“It is easy to see how Xyggy item-search with its interactive search box fits in almost naturally with touch devices such as the iPod and iPad and some of today’s smart phones. Imagine anything and everything that can be dragged in and out of the search box to find other relevant items. How simple and easy can that be?” asks Vadhia.
In the evolution of all things, we, the consumers, will ultimately decide what’s next. It could be Xyggy. It is new and sexy. It has a quirky, likeable name and seems to have the right idea. But perhaps more importantly, it is first with a dramatic new idea. Will Google, Microsoft and Yahoo! respond? Bet your life. It’s a game of one-upmanship with survival hanging in the balance. Kind of exciting, actually.
Read more about Xyggy. You may contact Xyggy at dinesh@xyggy.com.
Jerry Constantino, May 19, 2010
Unsponsored post
Comments
One Response to “Could Xyggy Be the Future of Search?”
As you mentioned in the article, Xyggy really requires a different mindset since it is a different type of search system. Most of us technically savvy people have used and become accustomed to text/keyword searching, and those of us who do searching for a living (I’m a Patent Analyst) have built up techniques to get the most out of that system. Xyggy, somewhat like Google Sets before it, challenges us to change our mindset (perhaps BACK to the way it was, some may say), and in doing so we might find even more useful results.
The patent searching portion available is intriguing, and while it has a long way to go as a patent-search application, it is worth checking out for yourself. I wrote up two blog posts on the subject, and Dinesh was kind enough to drop by and have a back-and-forth discussion that you might find interesting.
Part 1: http://intellogist.wordpress.com/2010/03/04/xyggy-and-the-golden-egg-2/
Part 2: http://intellogist.wordpress.com/2010/04/06/revisiting-xyggy/
Happy Searching,
Chris Jagalla