AI Decides to Do the Audio Index Dance

June 14, 2017

Did you ever wonder how search engines track down the most minuscule piece of information? Their power resides in indices that catalog Web sites, images, and books. Audio content is harder to index because most indices rely on static words and images. However, Audioburst plans to change that, says VentureBeat in the article, “How Audioburst Is Using AI To Index Audio Broadcasts And Make Them Easy To Find.”

Who exactly is Audioburst?

Founded in 2015, Audioburst touts itself as a “curation and search site for radio,” delivering the smarts to process talk radio in real time, index it, and make it easily accessible through search engines. It does this by “understanding” the meaning behind audio content and transcribing it using natural language processing (NLP). It can then automatically attach metadata so that search terms entered manually by users will surface relevant audio clips, which it calls “bursts.”
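The article does not spell out Audioburst’s internals, but the workflow it describes (transcribe, tag with metadata, search) can be sketched in a few lines. The sketch below assumes transcripts are already available as plain text and uses a naive keyword extractor in place of real NLP; the names (Burst, BurstIndex, extract_keywords) are illustrative and are not Audioburst’s actual code.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class Burst:
    """A short clip of talk radio with its transcript and metadata."""
    burst_id: str
    transcript: str
    keywords: set = field(default_factory=set)

STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it", "on"}

def extract_keywords(transcript: str) -> set:
    # Stand-in for real NLP: keep non-stopword tokens as metadata terms.
    tokens = transcript.lower().split()
    return {t.strip(".,!?") for t in tokens if t not in STOPWORDS}

class BurstIndex:
    """Inverted index mapping metadata terms to burst IDs."""
    def __init__(self):
        self.index = defaultdict(set)
        self.bursts = {}

    def add(self, burst: Burst):
        burst.keywords = extract_keywords(burst.transcript)
        self.bursts[burst.burst_id] = burst
        for term in burst.keywords:
            self.index[term].add(burst.burst_id)

    def search(self, query: str):
        # Return bursts whose metadata contains every query term.
        terms = query.lower().split()
        if not terms:
            return []
        hits = set.intersection(*(self.index.get(t, set()) for t in terms))
        return [self.bursts[b] for b in hits]

if __name__ == "__main__":
    idx = BurstIndex()
    idx.add(Burst("b1", "Discussion of electric car batteries and charging."))
    idx.add(Burst("b2", "A segment on local weather and traffic."))
    for hit in idx.search("electric car"):
        print(hit.burst_id, sorted(hit.keywords))
```

The real system would replace the keyword stub with speech-to-text plus NLP-derived metadata, but the indexing and lookup pattern is the same.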

Audioburst recently secured $6.7 million in funding and announced its new API. The API allows third-party developers to tap Audioburst’s content library to feature audio-based feeds in their own applications, in-car entertainment systems, and other connected devices. There is a growing demand for audio content as more people digest online information via sound bites, use voice search, and make use of digital assistants.
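VentureBeat does not document the API’s endpoints, but third-party access to a content library of this kind typically takes the form of an authenticated HTTP request that returns a feed of clips. The snippet below is purely hypothetical: the URL, parameters, and response fields are placeholders, not Audioburst’s published interface.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint and credentials; Audioburst's real API is not described in the article.
API_URL = "https://api.example.com/v1/bursts"
API_KEY = "YOUR_API_KEY"

def fetch_bursts(topic: str, limit: int = 5):
    """Request a feed of audio clips ('bursts') on a topic from a hypothetical API."""
    req = Request(
        f"{API_URL}?topic={topic}&limit={limit}",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    with urlopen(req) as resp:
        return json.loads(resp.read())

# Example: an in-car app might pull a short news feed like this.
# for burst in fetch_bursts("technology"):
#     print(burst["title"], burst["audio_url"])
```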

It is easy to find “printed” information on the Internet, but finding a specific audio file is not. Audioburst hopes to revolutionize how people find and use sound. The company should consider a partnership with Bitext, because indexing audio could benefit from advanced linguistics. Bitext’s technology would make this application more accurate.

Whitney Grace, June 14, 2017
