Upgraded Social Media Monitoring

February 20, 2017

Analytics are catching up to content. A recent ZDNet article, "Digimind Partners with Ditto to Add Image Recognition to Social Media Monitoring," reminds us that images reign supreme on social media. Between Pinterest, Snapchat, and Instagram, messages are often conveyed through images rather than text. Capitalizing on this, intelligence software company Digimind has announced a partnership with Ditto Labs to introduce image-recognition technology into its social media monitoring software, Digimind Social. We learned,

The Ditto integration lets brands identify the use of their logos across Twitter no matter the item or context. The detected images are then collected and processed on Digimind Social in the same way textual references, articles, or social media postings are analysed. Logos that are small, obscured, upside down, or in cluttered image montages are recognised. Object and scene recognition means that brands can position their products exactly where their customers are using them. Sentiment is measured by counting the number of people in the image and how many of them are smiling. It even identifies objects such as bags, cars, car logos, or shoes.
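The quoted passage's "count the smiling faces" approach to sentiment is simpler than it sounds. Below is a toy sketch of that general idea using OpenCV's stock Haar cascades; it is not Ditto's or Digimind's pipeline, and the image path and detector thresholds are illustrative assumptions.

```python
# Toy illustration of "smiles as sentiment," NOT Ditto's or Digimind's method.
# Assumes the opencv-python package is installed; image paths are hypothetical.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def crude_image_sentiment(image_path):
    """Return (faces_found, smiling_faces, smiling_ratio) for one image."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    smiles = 0
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]  # look for a smile inside each face box
        if len(smile_cascade.detectMultiScale(roi, scaleFactor=1.7,
                                              minNeighbors=20)) > 0:
            smiles += 1
    ratio = smiles / len(faces) if len(faces) else 0.0
    return len(faces), smiles, ratio
```

A monitoring tool would run something like this over every image mentioning a brand and aggregate the ratios; the interesting engineering is in doing that at Twitter scale, not in the per-image check.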

It was only a matter of time before these types of features emerged in social media monitoring. For years now, images have been shown to increase engagement even on platforms that began with a focus on text. Will we see more watermarked logos on images? More creative ways to visually identify brands? Both are likely, and we will be watching to see what transpires.

Megan Feil, February 20, 2017


The Current State of Enterprise Search, by the Numbers

February 17, 2017

The article and delightful infographic on BA Insight titled "Stats Show Enterprise Search Is Still a Challenge" build an interesting picture of the present challenges and opportunities surrounding enterprise search, or at least allude to them with the numbers offered. The article states,

As referenced by AIIM in an Industry Watch whitepaper on search and discovery, three out of four people agree that information is easier to find outside of their organizations than within. That is startling! With a more effective enterprise search implementation, these users feel that better decision-making and faster customer service are some of the top benefits that could be immediately realized.

What follows is a collection of random statistics about enterprise search. We would like to highlight one stat in particular: 58% of those investing in enterprise search get no payback after one year. In spite of the clear need for improvements, it is difficult to argue for a technology whose ROI is so long-term and so shaky where it has been implemented. However, there is a massive impact on efficiency when employees waste time looking for the information they need to do their jobs. In sum: you can't live with it, and you can't live (productively) without it.
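To put the efficiency point in perspective, here is a back-of-the-envelope calculation. Every input (headcount, loaded hourly cost, minutes lost, working days) is a hypothetical assumption, not a figure from the BA Insight piece:

```python
# Hypothetical cost of time lost to poor enterprise search.
# All inputs are illustrative assumptions, not figures from the article.
employees = 1_000
loaded_hourly_cost = 50.0        # dollars per hour, assumed
minutes_lost_per_day = 30        # time spent hunting for information, assumed
work_days_per_year = 230

hours_lost = employees * (minutes_lost_per_day / 60) * work_days_per_year
annual_cost = hours_lost * loaded_hourly_cost
print(f"Hours lost per year: {hours_lost:,.0f}")      # 115,000 under these assumptions
print(f"Annual cost of searching: ${annual_cost:,.0f}")  # about $5.75M
```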

Chelsea Kerwin, February 17, 2017

Enterprise Heads in the Sand on Data Loss Prevention

February 16, 2017

Enterprises could be doing so much more to protect themselves from cyber attacks, asserts Auriga Technical Manager James Parry in his piece, "The Dark Side: Mining the Dark Web for Cyber Intelligence," at Information Security Buzz. Parry informs us that most businesses fail to do even the bare minimum they should to protect against hackers. This minimum, as he sees it, includes monitoring social media and underground chat forums for chatter about their company. After all, hackers are not known for their modesty, and many do boast about their exploits in the relative open. Most companies just aren't bothering to look in that direction. Such an effort can also reveal those impersonating a business by co-opting its slogans and trademarks.

Companies who wish to go beyond the bare minimum will need to expand their monitoring to the dark web (and expand their data-processing capacity). From “shady” social media to black markets to hacker libraries, the dark web can reveal much about compromised data to those who know how to look. Parry writes:

Yet extrapolating this information into a meaningful form that can be used for threat intelligence is no mean feat. The complexity of accessing the dark web combined with the sheer amount of data involved, correlation of events, and interpretation of patterns is an enormous undertaking, particularly when you then consider that time is the determining factor here. Processing needs to be done fast and in real-time. Algorithms also need to be used which are able to identify and flag threats and vulnerabilities. Therefore, automated event collection and interrogation is required and for that you need the services of a Security Operations Centre (SOC).

The next generation SOC is able to perform this type of processing and detect patterns, from disparate data sources, real-time, historical data etc. These events can then be threat assessed and interpreted by security analysts to determine the level of risk posed to the enterprise. Forewarned, the enterprise can then align resources to reduce the impact of the attack. For instance, in the event of an emerging DoS attack, protection mechanisms can be switched from monitoring to mitigation mode and network capacity adjusted to weather the attack.
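As a rough illustration of the automated event collection and flagging Parry describes, consider the minimal sketch below. The event feed, watchlist terms, and severity scores are hypothetical placeholders; no vendor's SOC works from a snippet this simple.

```python
# Minimal sketch of automated event collection and flagging, in the spirit of
# the SOC workflow described above. Watchlist and events are hypothetical.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str
    text: str
    matched_terms: list
    severity: int

WATCHLIST = {            # brand names, leak markers, attack chatter (assumed)
    "acme corp": 2,
    "acmecorp.com": 3,
    "ddos": 4,
    "dump": 3,
}

def flag_event(source, text):
    """Return an Alert if the event mentions anything on the watchlist."""
    lowered = text.lower()
    hits = [term for term in WATCHLIST if term in lowered]
    if not hits:
        return None
    return Alert(source, text, hits, max(WATCHLIST[t] for t in hits))

# Example: scan a batch of collected events (in practice, a streaming feed).
events = [
    ("pastebin", "fresh dump of acmecorp.com credentials"),
    ("forum", "planning a ddos against acme corp this weekend"),
    ("twitter", "great coffee this morning"),
]
for src, txt in events:
    alert = flag_event(src, txt)
    if alert:
        print(f"[severity {alert.severity}] {src}: {alert.matched_terms}")
```

The hard parts Parry points to, collection at scale, correlation across sources, and real-time interpretation, are exactly what a snippet like this leaves out.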

Note that Parry’s company, Auriga, supplies a variety of software and R&D services, including a Security Operations Center platform, so he might be a tad biased. Still, he has some good points. The article notes SOC insights can also be used to predict future attacks and to prioritize security spending. Typically, SOC users have been big businesses, but, Parry advocates, scalable and entry-level packages are making such tools available to smaller companies.

From monitoring mainstream social media to setting up an SOC to comb through dark web data, tools exist to combat hackers. The question, Parry observes, is whether companies will rise to the growing need and embrace those methods.

Cynthia Murrell, February 16, 2017

Why Do We Care More About Smaller Concerns? How Quantitative Numbing Impacts Emotional Response

February 14, 2017

The affecting article on Visual Business Intelligence titled "When More Is Less: Quantitative Numbing" explains a phenomenon that many of us have probably witnessed on the news, observed in friends and family, and even experienced ourselves. A local news story about the death of an individual might provoke a stronger emotional response than news of a mass tragedy involving hundreds or thousands of deaths. Scott Slovic and Paul Slovic explore this in their book Numbers and Nerves. According to the article, this response is "built into our brains." Another example the article offers concerns the Donald Trump effect,

Because he exhibits so many examples of bad behavior, those behaviors are having relatively little impact on us. The sheer number of incidents creates a numbing effect. Any one of Trump’s greedy, racist, sexist, vulgar, discriminatory, anti-intellectual, and dishonest acts, if considered alone, would concern us more than the huge number of examples that now confront us. The larger the number, the lesser the impact…This tendency… is automatic, immediate, and unconscious.

The article suggests that the only way to overcome this tendency is to engage with large quantities in a slower, more thoughtful way. An Abel Herzberg quote helps convey this approach when considering the large-scale tragedy of the Holocaust: "There were not six million Jews murdered: there was one murder, six million times." The difference between that consideration of individual murders and the total number is stark, and it needs to enter into the way we process daily events happening all over the world if we want to hold on to any semblance of compassion and humanity.

Chelsea Kerwin, February 14, 2017

Data Mining Firm Cambridge Analytica Set to Capture Trump White House Communications Contract and Trump Organization Sales Contract

February 13, 2017

The article titled "Data Firm in Talks for Role in White House Messaging — And Trump Business" on The Guardian discusses the future role of Cambridge Analytica in both White House communications and the Trump Organization. Cambridge Analytica is a data company based in London that boasts crucial marketing and psychological data on roughly 230 million Americans. The article points out,

Cambridge’s data could be helpful in both “driving sales and driving policy goals”, said the digital source, adding: “Cambridge is positioned to be the preferred vendor for all of that.”… The potential windfall for the company comes after the Mercers and Cambridge played key roles in Trump’s victory. Cambridge Analytica was tapped as a leading campaign data vendor as the Mercers… The Mercers reportedly pushed for the addition of a few top campaign aides, including Bannon and Kellyanne Conway, who became campaign manager.

Robert Mercer is a major investor in Cambridge Analytica as well as Breitbart News, Steve Bannon's alt-right news organization. Steve Bannon also sits on the board of Cambridge Analytica. The entanglements mount. Prior to potentially snagging these two wildly conflicting contracts, Cambridge Analytica helped Trump win the presidency with data modeling and psychological profiling focused on building intimate relationships between brands and consumers to drive action.

Chelsea Kerwin, February 13, 2017

The Game-Changing Power of Visualization

February 8, 2017

Data visualization may be hitting its stride at just the right time. Data Floq shared an article highlighting the latest thinking, "Data Visualisation Can Change How We Think About The World." As the article mentions, we are primed for it biologically: the human eye and brain can comfortably process 10 to 12 separate images per second. On the output side, visualization provides the ability to rapidly incorporate new data sets, remove metadata, and increase performance. Data visualization is not without its challenges, though. The article explains,

Perhaps the biggest challenge for data visualisation is understanding how to abstract and represent abstraction without compromising one of the two in the process. This challenge is deep rooted in the inherent simplicity of descriptive visual tools, which significantly clashes with the inherent complexity that defines predictive analytics. For the moment, this is a major issue in communicating data; The Chartered Management Institute found that 86% of 2,000 financiers surveyed late 2013, were still struggling to turn volumes of data into valuable insights. There is a need, for people to understand what led to the visualisation, each stage of the process that led to its design. But, as we increasingly adopt more and more data this is becoming increasingly difficult.

Is data visualization changing how we think about the world, or is the existence of big data the culprit? We would argue data visualization is simply a tool to present data; it is a product rather than an impetus for a paradigm shift. This piece is right, however, in bringing attention to the conflict between detail and accessibility of information. We cannot help but think the answer likely lies in balancing both.

Megan Feil, February 8, 2017

How to Quantify Culture? Counting the Bookstores and Libraries Is a Start

February 7, 2017

The article titled "The Best Cities in the World for Book Lovers" on Quartz presents data collected by the World Cities Culture Forum. That organization works to facilitate research and promote cultural endeavors around the world. And what could be a better measure of a city's culture than its books? The article explains how the data collection works,

Led by the London mayor’s office and organized by UK consulting company Bop, the forum asks its partner cities to self-report on cultural institutions and consumption, including where people can get books. Over the past two years, 18 cities have reported how many bookstores they have, and 20 have reported on their public libraries. Hong Kong leads the pack with 21 bookshops per 100,000 people, though last time Buenos Aires sent in its count, in 2013, it was the leader, with 25.

New York sits comfortably in sixth place, but London, surprisingly, is near the bottom of the ranking with roughly 360 bookstores; against a population of roughly 8.8 million, that works out to only about four shops per 100,000 residents. Another measure the WCCF uses is libraries per capita. Edinburgh, of all places, surges to the top without any competition at 60.5 libraries per 100,000 people. New York is the only US city to even make the cut, with an embarrassing 2.5 libraries per 100,000 people. What this analysis misses is the size and beauty of some of the bookstores and libraries of global cities. To bask in those images, visit Bookshelf Porn or the Mental Floss ranking of the top seven gorgeous bookstores.

Chelsea Kerwin, February 7, 2017

Oracle Pays Big Premium for NetSuite and Larry Ellison Benefits

February 6, 2017

The article on Reuters titled "Oracle-NetSuite Deal May Be Sweetest for Ellison" emphasizes the perks of being an executive chairman like Oracle's Larry Ellison. Ellison ranks as the third richest person in America and fifth in the world. The article suggests that his fortune of over $50 billion mingles with Oracle's $160 billion in a way that makes, if no one else, at least Reuters very uncomfortable. The article does offer some context for the most recent acquisition of NetSuite, for which Oracle paid a 44% premium; Ellison owns a 45% stake in the company.

NetSuite was founded by an ex-Oracle employee, bankrolled by Ellison. While Oracle concentrated on selling enterprise software to giant corporations, the upstart focused on servicing small and medium-sized companies using the cloud. The two companies’ businesses have increasingly overlapped as larger customers have become comfortable using web-based software.

As a result, it makes strategic sense to combine the two firms. And the process seems to have been handled right, with a committee of independent Oracle directors calling the shots.

The article also points out that such high premiums are not all that unusual. Salesforce.com recently paid a 56% premium for Demandware. But in this case, things are complicated by Ellison's potential conflict of interest. If Oracle had done more to invest in its cloud business or in NetSuite earlier, say four or five years ago, it would not find itself forking over just under $10B now.

Chelsea Kerwin, February 6, 2017

Synthetic Datasets: Reality Bytes

February 5, 2017

Years ago I did a project for an outfit specializing in an esoteric math space based on mereology. No, I won't define it. You can check out the explanation in the Stanford Encyclopedia of Philosophy. The idea is that sparse information can yield useful insights. Even better, if mathematical methods were used to populate missing cells in a data system, one could analyze the data as if it held more than probability-generated items. Then, when real-time data arrived to populate the sparse cells, the probability component would generate revised estimates for the cells still without data. Nifty idea, just tough to explain to outfits struggling to move freight or sell off-lease autos.
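For readers who want the gist of that approach, here is a minimal sketch of the general pattern: estimate the missing cells from what has been observed, treat the table as complete, and re-estimate as real observations arrive. It illustrates the idea only; it is not NuTech's actual method, and the sample "freight" numbers are made up.

```python
# Minimal sketch: fill sparse cells with probability-based estimates, then
# replace estimates as real observations arrive. Illustration only.
import numpy as np

rng = np.random.default_rng(0)

def impute(matrix):
    """Fill NaN cells with column means plus noise reflecting observed spread."""
    filled = matrix.copy()
    for j in range(matrix.shape[1]):
        col = matrix[:, j]
        observed = col[~np.isnan(col)]
        missing = np.isnan(col)
        estimates = rng.normal(observed.mean(), observed.std() or 1.0,
                               missing.sum())
        filled[missing, j] = estimates
    return filled

# Sparse, made-up "freight volume" table: rows are routes, columns are weeks.
data = np.array([[10.0, np.nan, 12.0],
                 [np.nan, 9.0, np.nan],
                 [11.0, 10.5, np.nan]])

estimated = impute(data)   # analyze as if the table were complete
data[1, 0] = 9.5           # a real observation arrives for a sparse cell
estimated = impute(data)   # re-estimate only the cells still missing
```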

I thought of this company's software system when I read "Synthetic Datasets Are a Game Changer." Once again, youthful wizards happily invent the future even though some of the systems and methods have been around for decades. For more information about the approach, the journal articles and books of Dr. Zbigniew Michalewicz may be helpful.

The "Synthetic Datasets…" write up triggered some yellow highlighter activity. I found this statement interesting:

Google researchers went as far as to say that even mediocre algorithms received state-of-the-art results given enough data.

The idea is that algorithms can output "good enough" results when large volumes of data are available to the number-munching routines.

I also noted:

there are recent successes using a new technique called ‘synthetic datasets’ that could see us overcome those limitations. This new type of dataset consists of images and videos that are solely rendered by computers based on various parameters or scenarios. The process through which those datasets are created fall into 2 categories: Photo realistic rendering and Scenario rendering for lack of better description.
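As a crude illustration of the "scenario rendering" category, the sketch below composites an object of interest onto randomized backgrounds with random scale, rotation, and placement, yielding labelled samples. It is a toy, not Cvedia's or any vendor's pipeline, and "logo.png" is a hypothetical input file.

```python
# Toy "scenario rendering": generate labelled synthetic images by pasting an
# object onto varied backgrounds with random parameters. Illustration only.
import random
from PIL import Image

def render_sample(logo_path, size=(256, 256)):
    """Return (image, bounding_box) for one synthetic training sample."""
    background = Image.new("RGB", size,
                           tuple(random.randint(0, 255) for _ in range(3)))
    obj = Image.open(logo_path).convert("RGBA")

    scale = random.uniform(0.2, 0.6)                      # random object size
    obj = obj.resize((int(size[0] * scale), int(size[1] * scale)))
    obj = obj.rotate(random.uniform(-45, 45), expand=True)

    x = random.randint(0, size[0] - obj.width)            # random placement
    y = random.randint(0, size[1] - obj.height)
    background.paste(obj, (x, y), obj)                    # alpha-composited
    return background, (x, y, x + obj.width, y + obj.height)

# dataset = [render_sample("logo.png") for _ in range(10_000)]
```

Because every sample is generated from known parameters, the labels (here, the bounding box) come for free, which is the whole appeal of synthetic datasets for image work.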

The focus here is not on figuring out how to move nuclear fuel rods around a reactor core or on adjusting coal-fired power plant outputs to minimize air pollution. The synthetic datasets have an application in image-related disciplines.

The idea of using rendering engines to create images for facial recognition or for video games is interesting. The write up mentions a number of companies pushing forward in this field; for example, Cvedia.

However, NuTech's methods were used to populate databases of facts. I think the use of synthetic methods has a bright future. Oh, NuTech was acquired by Netezza. Guess what company owns the prescient NuTech Solutions technology? Give up? IBM, a company which has potent capabilities but does the most unusual things with those important systems and methods.

I suppose that is one reason why old wine looks like new IBM Holiday Spirit rum.

Stephen E Arnold, February 5, 2017

Give a Problem, Take a Problem

February 3, 2017

An article at the Telegraph, “Employees Are Faster and More Creative When Solving Other People’s Problems,” suggests innovative ways to coax creative solutions from workers. Writer Daniel H. Pink describes three experiments, performed by New York University’s Evan Polman and Cornell’s Kyle Emich. The researchers found that, when posed with hypothetical scenarios, participants devised more creative solutions when problems were framed as being someone else’s. But why? Pink writes:

Polman and Emich build upon existing psychological research showing that when we think of situations or individuals that are distant – in space, time, or social connection – we think of them in the abstract. But when those things are close – near us physically, about to happen, or standing beside us – we think about them concretely. Over the years, social scientists have found that abstract thinking leads to greater creativity. That means that if we care about innovation we need to be more abstract and therefore more distant. But in our businesses and our lives, we often do the opposite. We intensify our focus rather than widen our view. We draw closer rather than step back. That’s a mistake, Polman and Emich suggest. ‘That decisions for others are more creative than decisions for the self… should prove of considerable interest to negotiators, managers, product designers, marketers and advertisers, among many others,’ they write.

The article goes on to supply five practical suggestions this research has for business. For one, organizations can recruit independent directors to bring in more objective points of view. Pink also suggests keeping firms loosely structured, and bringing together peers from different fields to exchange ideas. On the individual level, he advises finding a “problem-swapping partner” with whom you can trade perspectives. Finally, workers can create psychological distance between themselves and their projects by imagining they’re helping out someone else.

Pink acknowledges a couple of caveats to this approach. For one, many tasks actually do require concrete thinking and laser focus; it is important to recognize them. Also, the business world is not currently structured to take advantage of this quirk of the human psyche. The article points to the growth of crowd-sourcing techniques as evidence that this may change. Perhaps… but groupthink brings its own issues, like the potential for discounting experience and specialized skill sets. To whom shall we turn for a fresh perspective on that problem?

Cynthia Murrell, February 3, 2017
