Technology Does Not Level the Playing Field
July 12, 2016
Among the many articles about how too much automation of the labor force will devastate humanity, I found another piece describing how technology, as a set of tools, is a false equalizer. The Atlantic published the piece titled “Technology, the Faux Equalizer.” What we tend to forget is that technology consists of tools made by humans, and these tools have grown steadily more complicated as society has advanced. The article illustrates this by reminding us that one hundred years ago electricity was a luxurious novelty. Only the wealthy and those with grid access used electricity; now it is as common as daylight.
This example shows how brand-new technology is available to only a limited percentage of people. Technological progress and social progress do not necessarily go hand in hand. Another example notes that it was not Gutenberg’s printing press that revolutionized printing for society, but rather the discovery of cheaper materials for making books. Until a technology is available to everyone, its benefits are limited:
“Just compare the steady flow of venture capital into Silicon Valley with the dearth of funding for other technological projects, like critical infrastructure improvements to water safety, public transit, disintegrating bridges, and so on. ‘With this dynamic in mind, I would suggest that there is greater truth to the opposite of Pichai’s statement,’ said Andrew Russell, a professor at Stevens Institute of Technology. ‘Every jump in technology draws attention and capital away from existing technologies used by the 99 percent, which therefore undermines equality, and reduces the ability for people to get onto the ‘playing field’ in the first place.’”
In science-fiction films depicting the future, we imagine technology narrowing the gap between people around the world, but we need to be reminded that the future is now. Only a few people have access to that future; compare the average lifestyle in Europe and the United States with that in many African and Middle Eastern nations. History suggests this is the trend we will continue to follow.
Oh, oh. We thought technology would fix any problem. Perhaps technology exacerbates old sores and creates new wounds? Just an idle question.
Whitney Grace, July 12, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
VirtualWorks Purchases Natural Language Processing Firm
July 8, 2016
Another day, another merger. PR Newswire released a story, VirtualWorks and Language Tools Announce Merger, which covers VirtualWorks’ purchase of Language Tools. With Language Tools, VirtualWorks inherits computational linguistics and natural language processing technologies. VirtualWorks is an enterprise search firm. Erik Baklid, Chief Executive Officer of VirtualWorks, is quoted in the article,
“We are incredibly excited about what this combined merger means to the future of our business. The potential to analyze and make sense of the vast unstructured data that exists for enterprises, both internally and externally, cannot be understated. Our underlying technology offers a sophisticated solution to extract meaning from text in a systematic way without the shortcomings of machine learning. We are well positioned to bring to market applications that provide insight, never before possible, into the vast majority of data that is out there.”
This is another case of a company positioning itself as a leader in enterprise search. Is it anything special? Well, the news release mentions several core technologies that will be bolstered by the merger: text analytics, data management, and discovery techniques. We will have to wait and see what the future holds for the company in the enterprise search and business intelligence sector in which it seeks to lead.
Megan Feil, July 8, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Who Will Connect the Internet of Things to Business
June 23, 2016
Remember when Nest Labs had all the hype a few years ago? An article from BGR reminds us how the tides have turned: Even Google views its Nest acquisition as a disappointment. In 2014, Google purchased Nest Labs for $3.2 billion. Its newly launched products, a Wi-Fi smoke alarm and a thermostat, seemed at the time to position the company for ever greater success. The article offers a look at the current state:
“Two and a half years later and Nest is reportedly in shambles. Recently, there have been no shortage of reports suggesting that Nest CEO Tony Fadell is something of a tyrannical boss cut from the same cloth as Steve Jobs (at his worst). Additionally, the higher-ups at Google are reportedly disappointed that Nest hasn’t been able to churn out more hardware. Piling it on, Re/Code recently published a report indicating that Nest generated $340 million in revenue last year, a figure that Google found disappointing given how much it spent to acquire the company. And looking ahead, particulars from Google’s initial buyout deal with Nest suggest that the pressure for Nest to ramp up sales will only increase.”
Undoubtedly there are challenges when it comes to expectations about acquired companies’ performance. But when it comes to the nitty-gritty details of the work happening inside those acquisitions, aren’t managers supposed to solve problems, not simply agree that the problems exist? How the success of “internet of things” companies pans out seems predicated on their inherent interconnectedness, and that seems to apply at the level of the product as well as the business.
Megan Feil, June 23, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Data Wrangling Market Is Self-Aware and Growing, Study Finds
June 20, 2016
The article titled Self-Service Data Prep is the Next Big Thing for BI on Datanami digs into the quickly growing data preparation industry by reviewing a Dresner Advisory Services study. The article lists the major insights from the study and paints a vivid picture of the current circumstances: most companies perform end-user data preparation, but only a small percentage (12%) consider themselves proficient in the area. The article states,
“Data preparation is often challenging, with many organizations lacking the technical resources to devote to comprehensive data preparation. Choosing the right self-service data preparation software is an important step…Usability features, such as the ability to build/execute data transformation scripts without requiring technical expertise or programming skills, were considered “critical” or “very important” features by over 60% of respondents. As big data becomes decentralized and integrated into multiple facets of an organization, users of all abilities need to be able to wrangle data themselves.”
Ninety percent of respondents agreed on the importance of two key features: the capacity to aggregate and group data, and a straightforward interface for imposing structure on raw data. Trifacta earned the top ranking among just under 30 vendors for the second year in a row. The article concludes that many users already recognize that data preparation is not a standalone activity; data prep software must be integrated with other resources to succeed.
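As a rough illustration of the “aggregate and group” capability respondents valued, here is a minimal sketch in plain Python; the records and field names are hypothetical, not taken from any vendor’s tool:

```python
from collections import defaultdict

# Hypothetical raw records, as they might arrive before any preparation.
records = [
    {"region": "east", "sales": 100},
    {"region": "west", "sales": 250},
    {"region": "east", "sales": 175},
]

def group_and_sum(rows, key, value):
    """Group rows by a key field and sum a numeric field per group."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

print(group_and_sum(records, "region", "sales"))
# {'east': 275, 'west': 250}
```

Real data prep tools wrap exactly this kind of operation in a point-and-click interface so that, as the study notes, no programming skill is required.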
Chelsea Kerwin, June 20, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Amazon’s Alexa Popularizes Digital Assistants
June 17, 2016
Digital assistants are smarter than ever. I remember when PDAs were the wave of the future and meant to revolutionize lives, but they still relied on human input and did not have much in the way of artificial intelligence. Now Cortana, Siri, and Alexa respond to vocal commands like something out of Star Trek. Digital assistants are still limited in many ways, but according to Venture Beat, Alexa might be changing how we interact with technology: “How Amazon’s Alexa Is Bringing Intelligent Assistance Into The Mainstream.”
Natural language processing teamed with artificial intelligence has made digital assistants easier to use and more widely accepted. Predictive analytics specialist MindMeld commissioned a “user adoption survey” of voice-based intelligent assistants, and the results show widespread adoption.
Amazon’s Echo, paired with the speech-enabled Alexa service, is not necessarily dominating the market, but Amazon is showing the potential of an intelligent system augmented with services such as Uber, music streaming, financial partners, and many more.
“Such routine and comfort will be here soon, as IA acceptance and use continue to accelerate. What started as a novelty and source of marketing differentiation from a smartphone manufacturer has become the most convenient user interface for the Internet of Things, as well as a plain-spoken yet empathetic controller of our digital existence.”
Amazon is on the right path, as are other companies experimenting with digital assistants. My biggest quibble is that all of these digital assistants are limited and carry price tags larger than some people’s meal budgets. It is not worth investing in an intelligent assistant unless you actually need one. I say wait for the better and cheaper technology that will be here soon.
Whitney Grace, June 17, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Next-Generation Business Intelligence Already Used by Risk Analysis Teams
June 1, 2016
Ideas about business intelligence have certainly evolved with emerging technologies. An article from CIO, Why machine learning is the new BI, speaks to this transformation of the concept. The author describes how reactive analytics based on historical data do not optimally assist business decisions; questions about customer satisfaction are best oriented toward proactive future-proofing, according to the article. The author writes,
“Advanced, predictive analytics are about calculating trends and future possibilities, predicting potential outcomes and making recommendations. That goes beyond the queries and reports in familiar BI tools like SQL Server Reporting Services, Business Objects and Tableau, to more sophisticated methods like statistics, descriptive and predictive data mining, machine learning, simulation and optimization that look for trends and patterns in the data, which is often a mix of structured and unstructured. They’re the kind of tools that are currently used by marketing or risk analysis teams for understanding churn, customer lifetimes, cross-selling opportunities, likelihood of buying, credit scoring and fraud detection.”
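To make the contrast between backward-looking reports and forward-looking prediction concrete, here is a toy sketch of one predictive technique the passage names, a single-feature logistic regression fit by gradient descent; the data and the churn framing are hypothetical, not drawn from any tool mentioned in the article:

```python
import math

# Hypothetical historical data: months of inactivity -> churned (1) or retained (0).
history = [(0, 0), (1, 0), (2, 0), (3, 1), (4, 1), (5, 1)]

def train_logistic(data, lr=0.1, epochs=2000):
    """Fit a one-feature logistic model sigmoid(w*x + b) by stochastic gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted churn probability
            w -= lr * (p - y) * x                      # gradient step on the weight
            b -= lr * (p - y)                          # gradient step on the bias
    return w, b

def churn_probability(w, b, months_inactive):
    """Score a customer: probability of churn given months of inactivity."""
    return 1.0 / (1.0 + math.exp(-(w * months_inactive + b)))

w, b = train_logistic(history)
# Expect a high score for a long-inactive customer and a low score for an active one.
print(churn_probability(w, b, 5) > 0.5, churn_probability(w, b, 0) < 0.5)
```

A traditional BI report would only tabulate which customers already left; the fitted model instead assigns a forward-looking risk score to customers who have not left yet, which is the shift the author describes.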
Does this mean that traditional business intelligence, after much hype and millions in funding, is a flop? Or will predictive analytics be a case of polishing up existing technology and presenting it in new packaging? After time, and for some after much money has been spent, we should have a better idea of the true value.
Megan Feil, June 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
An Open Source Search Engine to Experiment With
May 1, 2016
Apache Lucene receives the most headlines in discussions of open source search software. My RSS feed pulled up another open source search engine that shows promise: Open Semantic Search is free software that can be used for text mining, analytics, search, data exploration, and other research tasks. It is built on Elasticsearch and Apache Solr open source enterprise search and was designed around open standards and robust semantic search.
As with any open source search system, it can be configured with numerous features to suit the user’s preferences. These include tagging, annotation, support for varied file formats and multiple data sources, data visualization, newsfeeds, automatic text recognition, faceted search, interactive filters, and more. It can also be set up for mobile platforms, metadata management, and file system monitoring.
Open Semantic Search is described as
“Research tools for easier searching, analytics, data enrichment & text mining of heterogeneous and large document sets with free software on your own computer or server.”
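Since Open Semantic Search builds on Solr, a basic full-text query is typically an HTTP request to a core’s /select handler. The sketch below only constructs such a request URL; the host, core name, and field name are assumptions for illustration, not taken from the project’s documentation:

```python
from urllib.parse import urlencode

def solr_select_url(base, core, query, rows=10):
    """Build a Solr /select request URL for a full-text query returning JSON."""
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return f"{base}/solr/{core}/select?{params}"

# Hypothetical local instance, core, and field for demonstration.
url = solr_select_url("http://localhost:8983", "documents", "text:semantic")
print(url)
# http://localhost:8983/solr/documents/select?q=text%3Asemantic&rows=10&wt=json
```

Fetching that URL with any HTTP client would return the matching documents as JSON, which is the raw material the project’s faceted search and filter tools present to the user.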
While its base code is derived from Apache Lucene, it takes the original product and builds something better. Proprietary software is an expense dubbed a necessary evil if you work in a large company. If, however, you are a programmer with the time to develop your own search engine and analytics software, do it. It could even turn out better than the proprietary stuff.
Whitney Grace, May 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Nasdaq Joins the Party for Investing in Intelligence
April 6, 2016
The financial sector is hungry for intelligence to help curb abuses in capital markets, judging by recent actions of Goldman Sachs and Credit Suisse. The article Nasdaq invests in ‘cognitive’ technology, from BA wire, announces the exchange’s investment in Digital Reasoning. Nasdaq plans to connect Digital Reasoning’s algorithms with its own technology for surveilling trade data. The article explains the benefits of joining the two products,
“The two companies want to pair Digital Reasoning software of unstructured data such as voicemail, email, chats and social media, with Nasdaq’s Smarts business, which is one of the foremost software for monitoring trading on global markets. It is used by more than 40 markets and 12 regulators. Combining the two products is designed to assess the context, content and relationships behind trading and spot signals that could indicate insider trading, market manipulation or even expenses rules violations.”
We have followed Digital Reasoning, and other intel vendors like them, for quite some time as they target sectors ranging from healthcare to law to military. This is just a case of another software intelligence vendor making the shift to the financial sector. Following the money appears to be the name of the game.
Megan Feil, April 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Yellowfin: Emulating i2 and Palantir?
March 22, 2016
I read “New BI Platform Focuses on Collaboration, Analytics.” What struck me about this explanation of a new version of Yellowfin is that the company is adding the type of features long considered standard in law enforcement and intelligence. The idea is that visualizations and collaboration are components of a commercial business intelligence solution.
I noted this paragraph:
Other BI vendors have tried to push data preparation and analysis responsibilities onto business users “because it’s easier to adapt what they have to fulfill that goal.” But Yellowfin “isn’t a BI tool attempting to make the business user a techie. It is about presenting data to users in an attractive visual representation, backed-up with some of the most sophisticated collaboration tools embedded into a BI platform on the market.”
Analyst involvement in the loading of data is a way to address the issues of content ownership, indexing, and knowledge of what is in the system’s repository. I am not confident that any system which lets users whack away at whatever data the system has processed is ready for prime time. Sure, Google can win at Go, but the self-driving auto ran into a bus.
The write up, which strikes me as New Age public relations, seems to want me to remember what’s new with Yellowfin via the mnemonic CURATED. Baffled? Here’s what curated means:
- Consistent: governed, centralized, and managed
- Usable: by any business to consume analytics
- Relevant: connected to all the data users need to do their jobs well
- Accurate: data quality is paramount
- Timely: provides real-time data and agile content development
- Engaging: offers a social or collaborative component
- Deployed: widely across the organization
Business intelligence is the new “enterprise search.” I am not sure the use of notions like curated, plus the addition of useful functions, delivers the impact some marketers promise. Remember that self-driving car. Pesky humans.
Stephen E Arnold, March 23, 2016
Tech Unicorns May Soon Disappear as Fast as They Appeared
March 15, 2016
Silicon Valley “unicorns,” private companies valued at $1 billion or more, may not see the magic last. The article Palantir co-founder Lonsdale calls LinkedIn plunge a bad sign for unicorns, from Airline Industry Today, questions the future of companies like LinkedIn, whose lofty valuations have yet to translate into ever-increasing profits. After LinkedIn disappointed Wall Street with lower earnings and revenue, investors devalued the company by about $10 billion. Joe Lonsdale, the Formation 8 venture investor who co-founded Palantir Technologies, is quoted as stating,
“A lot of LinkedIn’s value, according to how many of us think about it, is tied to what it will achieve in the next five to 10 years,” Lonsdale said in an appearance on CNBC’s “Squawk Alley” on Friday. “It is very similar to a unicorn in that way. Yes, it is making a few billion in revenue and it’s a public company but it has these really big long-term plans as well and is very similar to how you see these other companies.” He added a lot of people who have been willing to suspend disbelief aren’t doing that anymore. “At this point, people are asking, ‘Are you actually going to be able to keep growing?’ And they’re punishing the unicorns and punishing the public companies the same way.”
Given these circumstances, Lonsdale understands why many private companies postpone an IPO for as long as possible. Regardless of the pros and cons of when a company should go public, the LinkedIn devaluation seems likely to send a message. Whether that message frightens similar companies into staying private longer or changes profitability norms for younger tech companies remains to be seen.
Megan Feil, March 15, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph