January 5, 2016
I read “GuangDa Li, Co-Founder and CTO ViSenze on Enabling Search without Key Words.” The article, I wish to point out, is written in words. To locate the article, one will have to use words to search for information about Dr. Li. Dragging his image to Google Images will not do the trick. The idea for search without words continues to attract attention. Ecommerce and law enforcement are keen to find alternatives to word-centric queries. Searching for a text message with a particular emoji is not easy using words and phrases.
According to the write up:
In February 2013, GuangDa Li along with Oliver Tan, an industry veteran started ViSenze, a spin-off company from NExT, a research centre jointly established between National University of Singapore (NUS) and Tsinghua University of China. ViSenze has developed a technology that enables search without keywords. Users simply need to click a photo and ViSenze brings you the relevant search results based on that image.
The write up contains several points which I found interesting.
First, Mr. Li said:
Because of my background in internet media processing, I anticipated the change in the industry about 4 years ago – there was a sharp rise in the amount of multimedia content on the internet. The management, search and discovery of media content has become more and more demanding.
Image search is a challenge. Once-promising video query systems like Exalead’s have dropped from public view. Video search on most services is frustrating.
Second, the business model for ViSenze is API focused. Mr. Li said:
ViSearch Search API is our flagship product and it also serves as the fundamentals for our other vertical applications. The key advantage of ViSearch API is that it is a perfect combination of latency, scalability and accuracy.
The third passage of interest to me was:
We used to be in stealth mode for a while. Only after our API was launched on the Rakuten Taiwan Ichiba website, did we start to talk with investors. It just happened.
I interpreted this to suggest that Rakuten recognizes that traditional eCommerce search systems like Amazon are vulnerable to a different information access approach.
Should Amazon worry about Rakuten or regulators? Amazon does not worry about much, it seems. Its core search and cloud-based search systems are, in my view, old school and frustrating for some users. Maybe ViSenze will offer a way to deliver a more effective solution for Rakuten. Competition might motivate Amazon to do a better job with its own search and retrieval systems.
Stephen E Arnold, January 5, 2016
December 18, 2015
I read “Podcasting’s Search Problem Could be Solved by This Spanish Startup.” According to the write up:
Smab’s web app will automatically transcribe podcasts, giving listeners a way to scan and search their content.
What’s the method? I learned from the article:
The company takes audio files and generates text files. If those text files are hosted on Smab’s site, a person can click on a word in the transcript and it will take them directly to that part of the recording, because the transcript and the text are synced. In fact, a second program assesses the audio to determine where sentences begin, making it easier to find chunks of audio. Both functions are uneven, but it’s worth noting here that the company is in a very early stage.
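The syncing described above amounts to keeping a timestamp for each transcribed word, so a click on a word in the transcript can seek the player to that point in the recording. A minimal sketch of the idea follows; the data and function names are invented for illustration and are not Smab’s actual API.

```python
# Illustrative only: each transcribed word carries the offset (in
# seconds) at which it occurs in the audio. Clicking a word looks up
# that offset; searching the transcript finds every occurrence.

# (start_time_seconds, word) pairs produced by a speech-to-text pass
transcript = [
    (0.0, "welcome"),
    (0.6, "to"),
    (0.9, "the"),
    (1.1, "podcast"),
    (2.4, "today"),
    (3.0, "we"),
    (3.2, "discuss"),
    (3.8, "search"),
]

def seek_offset(transcript, word_index):
    """Return the audio offset (in seconds) for the clicked word."""
    return transcript[word_index][0]

def find_word(transcript, query):
    """Full-text lookup: return indexes of every occurrence of a word."""
    return [i for i, (_, w) in enumerate(transcript) if w == query]

# Clicking the word "search" jumps the player to 3.8 seconds.
hits = find_word(transcript, "search")
offsets = [seek_offset(transcript, i) for i in hits]
```

The same table supports both functions the article mentions: word-level search and click-to-seek playback.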
There are three challenges for automatic voice to text to indexing from audio and video sources:
First, there is a great deal of content. The computational cost to convert a large chunk of audio data to a searchable form and then offer a reasonably robust search engine is significant.
Second, selectivity requires an editorial policy. Business and government are likely paying customers, but the topics these folks chase change frequently. The risk is that a paying customer will be disappointed and drop the service. Thus, sustainable revenue may be an issue.
Third, indexing podcasts and YouTube is work that Apple handles rather offhandedly and YouTube performs as part of its massive investment in the Google search system. The fact that neither of these firms has pushed forward with more sophisticated search systems suggests that market demand may not be significant.
I hope the Smab service becomes available. Worth watching.
Stephen E Arnold, December 21, 2015
November 17, 2015
Traditional TV is in a slow decline toward obsolescence. With streaming services offering more enticing viewing options, lower out-of-pocket expenses, and no contracts, why would a person sign on for cable or dish packages with notoriously bad customer service, commercials, and insane prices? Digital Trends has the most recent information from Nielsen about TV viewing habits, “New Nielsen Study On Streaming Points To More Bad News For Traditional TV.”
Pay-for-TV services have been on the decline for years, but the numbers are huge for the latest Nielsen Total Audience report:
“According to the data, broadband-only homes are up by 52 percent to 3.3 million from 2.2 million year over year. Meanwhile, pay-TV subscriptions are down 1.2 percent to 100.4 million, from 101.6 million at this time last year. And while 1.2 percent may not seem like much, that million plus decline has caused all sorts of havoc on the stock market, with big media companies like Viacom, Nickelodeon, Disney, and many others seeing tumbling stock prices in recent weeks.”
While one might suggest that pay-for-TV services should start the bankruptcy paperwork, there has been a 45% rise in video-on-demand services. Nielsen does not tabulate streaming services or viewership on mobile devices, nor does it address whether people are watching more TV because of all the options.
While Nielsen is a trusted organization for TV data, its information is still collected via paper submission forms. Nielsen is like traditional TV: it needs to update its offerings to maintain relevancy.
Whitney Grace, November 17, 2015
October 15, 2015
I like the idea of blaming what some MBA whiz called exogenous events. The idea is that hapless yet otherwise capable senior managers are unable to deal with the ups and downs of running a business. In short, an exogenous event is a variation on “it’s not my fault,” “there’s little I can do,” and “let’s just muddle forward.” The problem is that hunting for scapegoats is not a way to generate revenue. Wait. One can raise subscription fees.
I read “Netflix Is Blaming Slow US Growth on the Switch to Chip-Based Credit Cards.” The write up references a letter, allegedly written by the Netflix top dog. I noted this passage:
In his letter to investors, Netflix CEO Reed Hastings partially blamed America’s recent switch to chip-enabled credit cards. As credit card companies send new cards to their customers, some have been issuing new numbers, as well. And if people forget to update their credit card number with Netflix, they can’t pay their bill and become what Hastings called “involuntary churn.”
I like that involuntary churn. I remember working on a project for a telecommunications company in which churn was a burr under the saddle of some executives. Those pesky customers. Darn it.
The write up ignores the responsibility of management to deal with exogenous events. When a search system fails, is it the responsibility of customers to fix the system? Nah, users just go to a service that seems to work.
I noted this alleged explanation and the article’s willingness to allow Netflix’s management to say, in effect, “Hey, this is not something I can do anything about.” If not the top dog, who takes responsibility? Perhaps the reason is not chip-enabled credit cards? Perhaps users are sending Netflix a signal about sometimes unfindable content, clunky search, and a lack of features. Not everyone is a binge watcher. Some folks look for Jack Benny films or other entertainment delights. When these are not available, perhaps some look elsewhere. Seek and you shall find often delivers the goods.
Stephen E Arnold, October 15, 2015
October 5, 2015
Short honk: I read “Facebook v. Google in Digital Video Battle: YouTube Is 11X Bigger.” The big factoid is in the headline: Facebook’s video traffic is a fraction of YouTube’s. A couple of thoughts: How rapidly is Facebook growing in video content compared to YouTube? What is Facebook’s monetization opportunity compared to the Alphabet Google’s opportunity? The chit chat I have heard is that Facebook’s video revenue growth in the last 18 months is more rapid than the Alphabet Google’s. I also have a suspicion that socially anchored monetization may generate a more stable stream of revenue for Facebook. My question, however, is, “When will Facebook surpass YouTube in revenue from video?”
Stephen E Arnold, October 5, 2015
September 24, 2015
I read “Google Charges Advertisers for Fake YouTube Video Views, Say Researchers.” My goodness, will criticism of Alphabet Google continue to escalate?
The trigger for the newspaper article’s story with the somewhat negative headline was an academic paper called “Understanding the Detection of Fake View Fraud in Video Content Portals.” The data presented in the journal by seven European wizards suggests that an Alphabet Google type company knows when a video is viewed by a software robot, not a credit-card-toting human.
“Fake view fraud” is a snappy phrase.
According to the Guardian newspaper write up about the technical paper:
The researchers’ paper says that while substantial effort has been devoted to understanding fraudulent activity in traditional online advertising such as search and banner ads, more recent forms such as video ads have received little attention. It adds that while YouTube’s system for detecting fake views significantly outperforms others, it may still be susceptible to simple attacks.
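To make the “simple attacks” point concrete, here is a toy sketch of the kind of per-source view cap a portal might apply. This is purely illustrative; the field names and threshold are invented, and real detection systems use far richer signals than this.

```python
from collections import Counter

# Illustrative only: flag any IP that submits more views than a cap
# allows. The paper's point is that such simple defenses are easy to
# defeat (e.g., by rotating IPs), which is why detection is hard.

views = [
    {"ip": "10.0.0.1", "watch_seconds": 31},
    {"ip": "10.0.0.1", "watch_seconds": 30},
    {"ip": "10.0.0.1", "watch_seconds": 30},
    {"ip": "10.0.0.1", "watch_seconds": 31},
    {"ip": "203.0.113.5", "watch_seconds": 144},
]

def suspicious_ips(views, max_views_per_ip=3):
    """Return the set of IPs exceeding the per-source view cap."""
    counts = Counter(v["ip"] for v in views)
    return {ip for ip, n in counts.items() if n > max_views_per_ip}

flagged = suspicious_ips(views)
```

A bot farm that spreads its traffic across many addresses sails right past a filter like this, which is roughly what the researchers demonstrated against the portals they tested.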
Is this a Volkswagen-type spoof? Instead of fiddling with fuel efficiency, certain online video portals are playing fast and loose with charging for video ads not displayed to a human with a PayPal account?
Years ago an outfit approached me with a proposition for a seminar about online advertising fraud. I declined. I am confident that the giant companies and their wizards in the ad biz possess business ethics which put the investment bankers to shame. I recall discussing systems and methods with a couple of with-it New Yorkers. The lunch topic was dynamically relaxing the threshold for displaying content in response to certain queries.
My comment pointed to ways to determine if an ad “relevant” was relevant to a higher percentage of user queries. I called this “query and ad matching relaxation.”
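The relaxation idea can be sketched in a few lines: score each ad against a query, then lower the display threshold so a larger share of queries trigger an ad. Everything below (the ad names, the scores, the thresholds) is invented for illustration; no vendor’s actual matching logic is implied.

```python
# Hypothetical sketch of "query and ad matching relaxation": each ad
# carries a relevance score against the user's query; lowering the
# display threshold makes more ads "relevant" and thus displayable.

ads = {
    "running shoes sale": 0.92,   # relevance score vs. the query
    "discount sneakers": 0.55,
    "garden furniture": 0.12,
}

def eligible_ads(scored_ads, threshold):
    """Return (sorted) ads whose relevance score clears the threshold."""
    return sorted(a for a, s in scored_ads.items() if s >= threshold)

strict = eligible_ads(ads, threshold=0.8)   # strict matching: one ad
relaxed = eligible_ads(ads, threshold=0.5)  # relaxed: more ads shown
```

The design choice is the whole game: the threshold knob moves revenue for the ad distributor, not necessarily relevance for the user.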
I did not include a discussion of “relaxation” in my 2003-2004 study Google Version 2.0, which is now out of print. The systems and methods disclosed in technical papers by researchers who ended up working for large online advertising firms were just more plumbing for smart software.
When an ad does not match a query, that’s the challenge of figuring out what’s relevant and what’s irrelevant.
My thought in 2003 when I started writing the book was that most content was essentially spoofed and sponsored. I wanted to focus on more interesting innovations like the use of game theory in online advertising interfaces and the clever notion of “janitors” which were software routines able to clean up “bad” or “incomplete” data.
As I recall, that New York City guy was definitely interested in the notion of tuning ad results to generate money for the ad distribution and not so much for the advertiser. For me, no interest in lecturing a group of ad execs about their business. These folks can figure out the ins and outs of their business without inputs from an old person in Kentucky.
Mobile and video access to digital content do pose some interesting challenges in the online advertising world. My hunch is that the Alphabet Google type outfits and the intrepid researchers will find common ground. If the meeting progresses smoothly, perhaps a T shirt or mouse pad will be offered to some of the participants?
I remain confident that allegations about slippery behavior in online advertising are baseless. Online advertising is making life better and better for users every day.
The experience of online advertising is thrilling. I am not sure the experience of receiving unwanted advertisements can be improved. Why read a Web page when one can view an overlay which obscures the desired content? Why work in a quiet office? Answer: It is simply easier to hear the auto-play videos on many Web pages. Why puzzle over a search results page which blurs sponsored hits from relevant content? By definition, displayed information is relevant information, gentle reader. Do you have a problem with that?
Google, according to the article, will chat up the seven experts who reported on the alleged fraud. I am confident that the confusion in the perceptions of the researchers will be replaced with crystal clear thinking.
Online ad fraud? What a silly notion.
Stephen E Arnold, September 24, 2015
August 26, 2015
Poor old Google. Imagine. Hassles with Google Now. Grousing from the no fun crowd in the European Commission. A new contact lens business. Exciting stuff.
Then the Googlers read “Facebook’s New Moments App Now Automatically Creates Music Videos From Your Photos.” The idea is that one or two of the half billion Facebookers who check their status multiple times a day can make a music video automatically.
But instead of doing the professional video production thing, the video is created from one’s shared photos.
I wonder how many of the young at heart will whip up and suck down videos of [a] children, [b] pets, [c] vacations, [d] tattoos (well, maybe not too many tattoos).
The idea is:
With the update, Facebook Moments will automatically create a music video for any grouping of six or more photos. You can then tap this video in the app to customize it further by changing the included photos and selecting from about a dozen different background music options. When you’re finished making your optional edits to this video, one more tap will share the video directly to Facebook and tag the friend or friends with whom you’re already sharing those photos. The option to automatically create a video from your shared photos also makes Facebook Moments competitive with similar services like Flipagram, or those automatically created animations that Google Photos provides through its “Assistant” feature, which also helpfully builds out stories and collages.
Google may apply its Thought Vector research to the problem. The question is whether Alphabet will be able to spell success with its social services. Why would a grandmother care about a music video of a grandchild when there were Thought Vectors, Loon balloons, and eternal life to ponder?
Stephen E Arnold, August 26, 2015
August 12, 2015
I read “Firefox Sticks It to Microsoft, Redirects Cortana Searches in Windows 10.” Ah, ha. Windows 10. Some day, maybe. I assume the endless reboots upon updating are merely a rumor. The real story is that a Firefox user can use Cortana to search something other than Bing.
The write up says:
After blasting Microsoft’s attempts to set Edge as the default browser in Windows 10, Mozilla is enjoying some sweet revenge by steering Firefox users away from Bing. With the newly-released Firefox 40, users no longer have to use Bing for web searches from Cortana on the Windows 10 taskbar. Instead, Firefox will show results from whatever search engine the user has chosen as the default. Using Firefox isn’t the only way to replace Cortana’s Bing searches with Google or another search engine. But Firefox is currently the only browser that does so without the need for third-party extensions. (It wouldn’t be surprising, however, if Google follows suit.)
Poor Microsoft. The company has been trying to build bridges. The result is a dust-up.
I long for the good old days. Microsoft would have been careful to avoid getting stuck in a squabble about its Siri and Google Voice killer Cortana.
What impact will this have? My hunch is that as Windows 10 flows into the hands of those who fondly recall Bob, then the issue will become more serious.
For now, this is amusing to me. Recall I am a person who abandoned my Lumia Windows phone because the silly Cortana feature was in a location which made accidental activation impossible to avoid. I dumped the phone. End of story.
Stephen E Arnold, August 12, 2015
August 10, 2015
I read “Why Facebook’s Video Theft Problem Can’t Last.” My initial reaction was, “Sure it can.” The main point of the write up struck me as:
But then popular YouTuber Hank Green leveled a number of allegations at Facebook’s video team, including a charge of rampant copyright infringement from Facebook users who are uploading videos from YouTube and other platforms without creators’ consent. Facebook has responded that it has measures in place to address copyright infringement, including allowing users to report stolen content and suspending accounts guilty of repeated violations.
I noted this statement:
For Facebook, video represents an irresistible new business opportunity. Early experiments with running [video] natively inside the News Feed showed that it kept users on the site longer — and kept them from clicking external links that took them to YouTube and elsewhere.
Money and irresistible are words which flow.
The gem appears deep in the write up:
Facebook hasn’t made it easy for creators like Green to find instances of copyright infringement — there’s no way to filter Facebook searches for videos. And even if the stolen videos can be found, creators must fill out multiple forms, meaning it could be several days (and countless views) before a stolen video is taken down.
I find it interesting that search and retrieval may not do the trick. Then the bureaucratic process adds a deft touch.
I will file this item in my follow up folder. I know I can search my system for text files. Search which does not allow one to find information may be a tactic which serves other purposes. Is flawed search a business advantage? If one cannot find something, does that mean the “something” is not there?
Stephen E Arnold, August 10, 2015
August 7, 2015
Short honk: Forget words in English. “There’s Finally a Good Way to Text in Sign Language” explains that a new mobile keyboard app allows American Sign Language speakers to send text messages to a hearing-impaired individual. The write up states:
Signily also includes animated signs for many popular ASL phrases that don’t have exact English translations. This makes texting a more natural experience for signers.
How does one parse and search these messages? Think look-up table, maybe? Will semantic vendors be able to make sense of animated signs?
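The look-up table idea might look something like the sketch below: map each animated sign’s identifier to an English gloss so a signed message can be tokenized and indexed like ordinary text. The sign IDs and glosses here are invented; Signily’s actual internals are not public in the write up.

```python
# Hypothetical sketch: a gloss table turns a sequence of sign
# identifiers into searchable English text. Signs with no exact
# English translation would be the hard cases for semantic vendors.

SIGN_GLOSSES = {
    "asl:hello": "hello",
    "asl:thank-you": "thank you",
    "asl:how-are-you": "how are you",
}

def gloss_message(sign_ids):
    """Translate a sequence of sign IDs into searchable English text;
    unknown signs are kept as-is so nothing is silently dropped."""
    return " ".join(SIGN_GLOSSES.get(s, s) for s in sign_ids)

indexed = gloss_message(["asl:hello", "asl:how-are-you"])
```

The catch, of course, is exactly the one the write up raises: ASL phrases without exact English translations have no clean row in the table.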
Sure, semantic search is just super. And the meeting to discuss this?
Stephen E Arnold, August 7, 2015