Skepticism for Google Micro-Moment Marketing Push

October 13, 2017

An article at Street Fight, “The Fallacy of Google’s ‘Micro-Moment’ Positioning,” calls out Google’s “micro-moments” as the gimmick it is. Here’s the company’s definition of the term they just made up: “an intent-rich moment when a person turns to a device to act on a need—to know, go, do, or buy.” In other words, any time a potential customer has a need and picks up their smartphone looking for a solution. For Street Fight’s David Mihm and Mike Blumenthal, this emphasis seems like a distraction from the failure of Google’s analytics to provide a well-rounded view of the online consumer. In fact, such oversimplification could hurt businesses that buy into the hype. In their dialogue format, they write:

David:[The term “micro-moments”] reduces all consumer buying decisions to thoughtless reflexes, which is just not reality, and drives all creative to a conversion-focused experience, which is only appropriate for specific kinds of keywords or mobile scenarios.  It’s totally IN-appropriate for display or top-of-funnel advertising. I also think it’s intended to create a bizarre sense of panic among marketers — “OMG, we have to be present at every possible instant someone might be looking at their phone!” — which doesn’t help them think strategically or make the best use of their marketing or ad spend.

Mike: I agree. If you don’t have a sound, broad strategy no micro management of micro moments will help. To some extent I wonder if Google’s use of the term reflects the limits of their analytics to yet be able to provide a more complete picture to the business?

David: Sure, Google is at least as well-positioned as Amazon or Facebook to provide closed-loop tracking of purchase behavior. But I think it reflects a longstanding cultural worldview within the company that reduces human behavior to an algorithm. “Get Notification. Buy Thing.” or “See Ad. Buy Thing.”  That may work for the “head” of transactional behavior but the long tail is far messier and harder to predict. Much as Larry Page would like us to be, humans are never going to be robots.

Companies that recognize the difference between consumers and robots have a clear edge in this area, no matter how Google tries to frame the issue. The authors compare Google’s blind spot to Amazon’s ease-of-use emphasis, noting the latter seems to better understand where customers are coming from. They also view the recent alliance between Google and Walmart to provide “voice-activated shopping” with a bit of skepticism. See the article for more of their reasoning.

Cynthia Murrell, October 13, 2017

Twitch Incorporates ClipMine Discovery Tools

September 18, 2017

Gameplay-streaming site Twitch has adapted the platform of their acquisition ClipMine, originally developed for adding annotations to online videos, into a metadata-generator for its users. (Twitch is owned by Amazon.) TechCrunch reports the development in, “Twitch Acquired Video Indexing Platform ClipMine to Power New Discovery Features.” Writer Sarah Perez tells us:

The startup’s technology is now being put to use to translate visual information in videos – like objects, text, logos and scenes – into metadata that can help people more easily find the streams they want to watch. Launched back in 2015, ClipMine had originally introduced a platform designed for crowdsourced tagging and annotations. The idea then was to offer a technology that could sit over top videos on the web – like those on YouTube, Vimeo or DailyMotion – that allowed users to add their own annotations. This, in turn, would help other viewers find the part of the video they wanted to watch, while also helping video publishers learn more about which sections were getting clicked on the most.

Based in Palo Alto, ClipMine went on to make indexing tools for the e-sports field and to incorporate computer vision and machine learning into their work. Their platform’s ability to identify content within videos caught Twitch’s eye; Perez explains:

Traditionally, online video content is indexed much like the web – using metadata like titles, tags, descriptions, and captions. But Twitch’s streams are live, and don’t have as much metadata to index. That’s where a technology like ClipMine can help. Streamers don’t have to do anything differently than usual to have their videos indexed, instead, ClipMine will analyze and categorize the content in real-time.
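The distinction Perez draws can be illustrated with a toy inverted index: rather than relying on pre-written titles and tags, labels detected in each live frame are indexed as they arrive, making streams searchable by what is on screen. This is only a sketch of the concept, not Twitch’s or ClipMine’s actual system; the class and method names are invented, and in practice the labels would come from computer-vision models analyzing the video.

```python
from collections import defaultdict

class LiveStreamIndex:
    """Toy inverted index: maps detected labels to the streams they appear in."""

    def __init__(self):
        self._index = defaultdict(set)

    def ingest_frame(self, stream_id, detected_labels):
        # In a real system the labels would come from a computer-vision
        # model run on each frame; here they are passed in directly.
        for label in detected_labels:
            self._index[label.lower()].add(stream_id)

    def search(self, label):
        # Case-insensitive lookup of streams currently showing the label.
        return sorted(self._index.get(label.lower(), set()))

index = LiveStreamIndex()
index.ingest_frame("stream_42", ["Hearthstone", "card", "arena"])
index.ingest_frame("stream_7", ["Overwatch", "payload"])
print(index.search("hearthstone"))  # ['stream_42']
```

The point of the sketch is that no streamer-supplied metadata is needed: the index is built continuously from whatever the recognition step emits.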

ClipMine’s technology has already been incorporated into stream-discovery tools for two games from Blizzard Entertainment, “Overwatch” and “Hearthstone”; see the article for more specifics on how and why. Through its blog, Twitch indicates that more innovations are on the way.

Cynthia Murrell, September 18, 2017

AI to Tackle Image Reading

September 11, 2017

The new frontier in analytics might just be pictures. The ability to break an image into recognizable parts and use them to derive meaning has long baffled even the most advanced AI systems, and it has been a quest for many researchers for some time. It appears that Disney Research, in cahoots with UC Davis, believes it is near a breakthrough.

Phys.org quotes Markus Gross, vice president at Disney Research, as saying,

We’ve seen tremendous progress in the ability of computers to detect and categorize objects, to understand scenes and even to write basic captions, but these capabilities have been developed largely by training computer programs with huge numbers of images that have been carefully and laboriously labeled as to their content. As computer vision applications tackle increasingly complex problems, creating these large training data sets has become a serious bottleneck.

A perfect example of the application of this is MIT’s attempt to use AI to share recipes and nutritional information just by viewing a picture of food. The sky is the limit when it comes to possibilities if Disney and MIT can help AI over its current limitations.

Catherine Lamsfuss, September 11, 2017

Yet Another Digital Divide

September 8, 2017

Recommind sums up what happened at a recent technology convention in the article, “Why Discovery & ECM Haven’t, Must Come Together (CIGO Summit 2017 Recap).” Author Hal Marcus recounts that he had long been a staunch challenger of anyone who claimed to provide a complete information governance solution. He recently spoke at CIGO Summit 2017 about how to make information governance a feasible goal for organizations.

The problem with information governance is that there is no single, simple solution; projects tend to be self-contained, each with only one goal: data collection, data reduction, and so on. In his talk, he laid out five main reasons no comprehensive solution exists: defining a project’s parameters takes considerable time, data can come from multiple streams, mass-scale indexing is challenging, analytics only help if humans are available to interpret the results, and risk and cost both put a damper on projects.

Yet we are closer to a solution:

  • Corporations seem to be dedicating more resources to data reduction and remediation projects, triggered largely by high-profile data security breaches.

  • Multinationals are increasingly scrutinizing their data sharing and retention practices, spurred by the impending May 2018 GDPR deadline.

  • ECA for data culling is becoming more flexible and mature, supported by the growing availability and scalability of computing resources.

  • Discovery analytics are being offered at lower, all-you-can-eat rates, facilitating a range of corporate use cases like investigations, due diligence, and contract analysis.

  • Tighter, more seamless, and secure integration of ECM and discovery technology is advancing and seeing adoption in corporations, to great effect.

And yet it always seems farther away.

Whitney Grace, September 8, 2017

Natural Language Queries Added to Google Analytics

August 31, 2017

Data analysts are valuable members of any company and do a lot of good, but in many instances, average employees – not versed in analyst-ese – need to find valuable data. So that workers need not bother the analysts with mundane questions, Google has upgraded their analytics to include natural language queries, much like their search function.

Reporting on this upcoming change, ZDNet explains what it will mean for businesses:

Once the feature is available, users will have the ability to type or speak out a query and immediately receive a breakout of analyzed data that ranges from basic numbers and percentages to more detailed visualizations in charts and graphs. Google says it’s aiming to make data analysis more accessible to workers across a business, while in turn freeing up analysts to focus on more complex research and discovery.

While in theory this seems like a great idea, it may still cause issues for users who pose questions without understanding the underlying data, the analytic method, or the appropriate context. Unfortunately, data analysts are still the best resource when trying to glean information from analytics reports.

Catherine Lamsfuss, August 31, 2017

Accenture Makes Two Key Acquisitions

August 29, 2017

Whither search innovation? It seems the future of search is now about making what’s available work as best it can. We observe yet another effort to purchase existing search technology and plug it into an existing framework; DMN reports, “Accenture Acquires Brand Learning and Search Technologies.” Brand Learning is a marketing and sales consultancy, and Search Technologies is a technology services firm. Will Accenture, a professional-services firm, work to improve the search and analysis functionalities within their newly acquired tools? DMN’s Managing Editor Elyse Dupre reports:

A press release states that Brand Learning’s advisory team will join the management consulting and industry specialists within Accenture’s Customer and Channels practice. The partnership, according to the press release, will enhance Accenture’s offerings in terms of marketing and sales strategy, organizational design, industry-specific consulting, and HR and leadership.

It is unclear whether the “advisory team” includes any of the talent behind Brand Learning’s software. As for the Search Technologies folks, the article gives us more reason to hope for further innovation. Citing another press release, Dupre notes that company’s API-level data connectors will greatly boost Accenture’s ability to access unstructured data, and continues:

Search Technologies will join the data scientists and engineers within Accenture Analytics. According to the press release, this team will focus on creating solutions that make unstructured content (e.g. social media, video, voice, and audio) easily searchable, which will support data discovery, analytics, and reporting. Accenture’s Global Delivery Network will also add a delivery center in Costa Rica, the release states, which will serve as the home-base for the more than 70 Search Technologies big data engineers who reside there. This team focuses on customer and content analytics, the release explains, and will work with Accenture Interactive’s digital content production and marketing services professionals.

 

Furthermore, Kamran Khan, president and CEO of Search Technologies, will now lead a new content analytics team that will reside within Accenture Analytics.

Let us hope those 70 engineers are given the freedom and incentive to get creative. Stay tuned.

Cynthia Murrell, August 29, 2017

An Automatic Observer for Neural Nets

August 25, 2017

We are making progress in training AI systems through the neural net approach, but exactly how those systems make their decisions remains difficult to discern. Now, TechCrunch reveals, “MIT CSAIL Research Offers a Fully Automated Way to Peer Inside Neural Nets.” Writer Darrell Etherington recalls that, a couple of years ago, the same team of researchers described a way to understand these decisions using human reviewers. A fully automated process will be much more efficient and lead to greater understanding of what works and what doesn’t. Etherington explains:

Current deep learning techniques leave a lot of questions around how systems actually arrive at their results – the networks employ successive layers of signal processing to classify objects, translate text, or perform other functions, but we have very little means of gaining insight into how each layer of the network is doing its actual decision-making. The MIT CSAIL team’s system uses doctored neural nets that report back the strength with which every individual node responds to a given input image, and those images that generate the strongest response are then analyzed. This analysis was originally performed by Mechanical Turk workers, who would catalogue each based on specific visual concepts found in the images, but now that work has been automated, so that the classification is machine-generated. Already, the research is providing interesting insight into how neural nets operate, for example showing that a network trained to add color to black and white images ends up concentrating a significant portion of its nodes to identifying textures in the pictures.
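The core loop Etherington describes – record each node’s response strength to every input image, then inspect the images that excite it most – can be sketched in a few lines. This is a simplified illustration with made-up activation values, not the CSAIL code; in the real system the responses come from instrumented layers of an actual network.

```python
def top_activating_images(activations, k=2):
    """activations: {node_id: {image_id: response_strength}}.
    Returns, per node, the k image ids with the strongest responses,
    i.e. the images a human or classifier would then inspect."""
    return {
        node: [img for img, _ in sorted(scores.items(),
                                        key=lambda kv: kv[1],
                                        reverse=True)[:k]]
        for node, scores in activations.items()
    }

# Hypothetical per-node responses to four images.
acts = {
    "node_0": {"img_a": 0.9, "img_b": 0.1, "img_c": 0.7, "img_d": 0.2},
    "node_1": {"img_a": 0.2, "img_b": 0.8, "img_c": 0.3, "img_d": 0.6},
}
print(top_activating_images(acts))
# {'node_0': ['img_a', 'img_c'], 'node_1': ['img_b', 'img_d']}
```

In the automated version of the pipeline, the step that follows – labeling what the top images for each node have in common – is done by machine rather than by Mechanical Turk workers.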

The write-up points us to MIT’s own article on the subject for more information. We’re reminded that, because the human thought process is still largely a mystery to us, AI neural nets are based on hypothetical models that attempt to mimic ourselves. Perhaps, the piece suggests, a better understanding of such systems could inform the field of neuroscience. Sounds fair.

Cynthia Murrell, August 25, 2017

Analytics for the Non-Tech Savvy

August 18, 2017

I regularly encounter people who say they are too dumb to understand technology. When people tell themselves this, they hinder their ability to learn and cannot adapt to a society that is growing more dependent on mobile devices, the Internet, and instantaneous information.  This is especially harmful for business entrepreneurs.  The Next Web explains, “How Business Intelligence Can Help Non-Techies Use Data Analytics.”

The article starts with the statement that business intelligence is changing in a manner equivalent to how Windows 95 made computers more accessible to ordinary people.  The technology gatekeeper is being removed.  Proprietary software and licenses are expensive, but cloud computing and other endeavors are driving the costs down.

Voice interaction is another way BI is coming to the masses:

Semantic intelligence-powered voice recognition is simply the next logical step in how we interact with technology. Already, interfaces like Apple’s Siri, Amazon Alexa and Google Assistant are letting us query and interact with vast amounts of information simply by talking. Although these consumer-level tools aren’t designed for BI, there are plenty of new voice interfaces on the way that are radically simplifying how we query, analyze, process, and understand complex data.

 

One important component here is the idea of the “chatbot,” a software agent that acts as an automated guide and interface between your voice and your data. Chatbots are being engineered to help users identify data and guide them into getting the analysis and insight they need.

I see this as smart people making their technology available to the rest of us, and it could augment or even improve businesses.  We are on the threshold of this technology becoming commonplace, but does it have practicality attached to it?  Many products and services are commonplace, but if they only have flashing lights and whistles, what good are they?

Whitney Grace, August 18, 2017

Social Intelligence a Nice Addition to Analytics, but Not Necessary

August 9, 2017

Social media is an ever-evolving, tricky beast to tame when it comes to analytics, which is why most companies do the best they can with the resources assigned to the job. Social intelligence gurus, however, are constantly pushing more ways to make sense of the mounting social data.

A recent CIO article exploring the growing field of social intelligence highlighted the role of Sally-Anne Kaminski, Global Social Media Strategy Manager at Zebra Technologies. Her job is described as follows:

When the sales enablement team approaches her about prospective clients, Kaminski taps Oracle’s Social Cloud, a social relationship management tool, to build a comprehensive dashboard to help the sales representative nail the sale. Kaminski loads Social Cloud’s Boolean search with keywords, phrases and topics to discover in conversations across Facebook, Twitter and LinkedIn, as well as message boards and blogs.

Is it effective though? Even Kaminski admits there is no data showing her role analyzing social media data (beyond what analytics alone can do) is benefiting anyone. At the end of the day, social intelligence is reliant on the human touch (think more money) and we must question the operational value it provides.

Catherine Lamsfuss, August 9, 2017

Banks Learn Sentiment Analysis Equals Money

July 26, 2017

The International Business Times covered the Unicorn conference “AI, Machine Learning and Sentiment Analysis Applied To Finance,” which discussed how sentiment analysis and other data are changing the finance industry, in the article “AI And Machine Learning On Social Media Data Is Giving Hedge Funds A Competitive Edge.”  The article discusses the new approach to understanding social media and other Internet data.

The old and popular method of extracting data relies on a “bag of words” approach.  Basically, this means that an algorithm matches up a word with its intended meaning in a lexicon.  However, machine learning and artificial intelligence are adding more brains to the data extraction.  AI and machine learning algorithms are actually able to understand the context of the data.

An example of this in action could be the sentence: “IBM surpasses Microsoft”. A simple bag of words approach would give IBM and Microsoft the same sentiment score. DePalma’s news analytics engine recognises “IBM” is the subject, “Microsoft” is the object and “surpasses” as the verb and the positive/negative relationships between subject and the object, which the sentiment scores reflect: IBM positive, Microsoft, negative.

This technology is used for sentiment analytics to understand how consumers feel about brands.  In turn, that data can determine a brand’s worth and even the volatility of its stock.  In short, sentiment analytics will shape financial decisions in the future, and it is an industry worth investing in.
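The contrast the article draws can be sketched in a few lines of code. The verb lexicon and the naive three-word subject-verb-object split below are invented simplifications for illustration; an engine like DePalma’s relies on proper linguistic parsing rather than word position.

```python
POSITIVE_VERBS = {"surpasses", "beats", "outperforms"}  # toy lexicon

def bag_of_words_scores(sentence):
    # Bag-of-words: every entity in a sentence containing a positive
    # word gets the same score -- no notion of who did what to whom.
    words = sentence.lower().rstrip(".").split()
    polarity = 1 if POSITIVE_VERBS & set(words) else 0
    entities = [w for w in sentence.rstrip(".").split() if w[0].isupper()]
    return {e: polarity for e in entities}

def relational_scores(sentence):
    # Crude subject-verb-object reading: the subject of a positive verb
    # scores +1, the object -1. Real engines use dependency parsing
    # instead of assuming a three-word sentence.
    subj, verb, obj = sentence.rstrip(".").split()
    if verb.lower() in POSITIVE_VERBS:
        return {subj: 1, obj: -1}
    return {subj: 0, obj: 0}

print(bag_of_words_scores("IBM surpasses Microsoft"))  # {'IBM': 1, 'Microsoft': 1}
print(relational_scores("IBM surpasses Microsoft"))    # {'IBM': 1, 'Microsoft': -1}
```

The bag-of-words scorer cannot distinguish the winner from the loser in the sentence; the relational scorer can, which is exactly the edge the article ascribes to the newer approach.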

Whitney Grace, July 26, 2017
