Microsoft Makes Bing Faster

March 16, 2015

Bing is a generic search engine living in the shadows of Google and DuckDuckGo. In an attempt to make Bing a more viable product, ExtremeTech tells us that “Microsoft To Accelerate Bing Search With Neural Network.” When Bing scours the Internet, it pulls results from a Web index roughly half the size of Google’s. Microsoft wants to increase Bing’s efficiency and speed, so it is turning to field-programmable gate array (FPGA) accelerators.

Microsoft breaks Bing’s search ranking into three parts: feature extraction, free-form expressions, and machine-learned scoring. Bing still runs its document selection service on Xeon processors; it is the ranking stages that Microsoft wants to shift onto the new FPGA hardware to increase search speed. Microsoft named the FPGA effort Project Catapult. Project Catapult builds on technology designed in 2011, but it relies on half as many servers as the earlier design.

Microsoft is relying on convolutional neural network (CNN) accelerators for the project:

“Convolutional neural networks (CNNs) are composed of small assemblies of artificial neurons, where each focuses on just a small part of an image — their receptive field. CNNs have already bested humans in classifying objects in challenges like the ImageNet 1000. Classifying documents for ranking is a similar problem, which is now one among many Microsoft hopes to address with CNNs.”
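To make the receptive field idea concrete, here is a minimal sketch of a tiny CNN image classifier in PyTorch. It is illustrative only: the layer sizes and the 1000-class output are assumptions chosen to echo the ImageNet 1000 example, not Microsoft’s or Bing’s design.

```python
# Minimal CNN sketch, NOT Microsoft's implementation.
# Each convolutional output neuron looks only at a small "receptive field"
# of the input, which is the property the quote above describes.
import torch
import torch.nn as nn

# Toy 1000-class classifier; all layer sizes are illustrative assumptions.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3x3 receptive fields over an RGB image
    nn.ReLU(),
    nn.MaxPool2d(2),                             # downsample, widening each neuron's effective view
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),                     # collapse the spatial dimensions
    nn.Flatten(),
    nn.Linear(32, 1000),                         # scores for 1000 classes
)

image = torch.randn(1, 3, 224, 224)              # a fake 224x224 RGB image
scores = model(image)
print(scores.shape)                              # torch.Size([1, 1000])
```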

Armed with the new FPGAs, Microsoft hopes to improve Bing’s search and ranking performance and compete more closely with Google. While that may increase Bing’s chances of returning better results, remember that Microsoft still ships operating systems that stumble on their initial public releases.

Whitney Grace, March 16, 2015
Sponsored by ArnoldIT.com, developer of Augmentext
