Interview with an Ethical Hacker

July 20, 2016

We’ve checked out a Business Insider write-up on one of the white hats working for IBM: “Here’s What It’s Really Like to Be a Hacker at One of the World’s Biggest Tech Companies.” We wonder, does this wizard use Watson? The article profiles Charles Henderson. After summarizing the “ethical hacker’s” background, the article describes some of his process:

“The first thing I do every morning is catch up on what happened when I was sleeping. The cool thing is, since I run a global team, when I’m sleeping there are teams conducting research and working engagements with customers. So in the morning I start by asking, ‘Did we find any critical flaws?’ ‘Do I need to tell a client we found a vulnerability and begin working to fix it?’ From there, I am working with my team to plan penetration tests and make sure we have the resources we need to address the issues we have found. There isn’t an hour that goes by that I don’t find a cool, new way of doing something, which means my days are both unpredictable and exciting.

“I also do a lot of research myself. I like to look at consumer electronic devices, anything from planes to trains to automobiles to mobile devices. I try to find ways to break into or break apart these devices, to find new flaws and vulnerabilities.”

Henderson also mentions meeting with clients around the world to consult on security issues and lists some projects his team has tackled; for example, a “physical penetration test” that involved stealing a corporate vehicle, and “tiger teams” sent to break into client buildings. His favorite moments, though, are those when he is able to fix a vulnerability before it is exploited. Henderson closes with this bit of advice for aspiring hackers: “Always be curious. Never take anything at face value.”

 

 

Cynthia Murrell, July 20, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.

Hewlett Packard Makes Haven Commercially Available

July 19, 2016

The InformationWeek article titled “HPE’s Machine Learning APIs, MIT’s Sports Analytics Trends: Big Data Roundup” analyzes Haven OnDemand, a large part of Hewlett Packard Enterprise’s big data strategy. For a look at the smart software coming out of HP Enterprise, check out this video. The article states,

“HPE’s announcement this week brings HPE Haven OnDemand as a service on the Microsoft Azure platform and provides more than 60 APIs and services that deliver deep learning analytics on a wide range of data, including text, audio, image, social, Web, and video. Customers can start with a freemium service that enables development and testing for free, and grow into a usage and SLA-based commercial model for enterprises.”

You may notice from the video that the visualizations look a great deal like Autonomy IDOL’s visualizations from the early 2000s; that is, dated, especially when compared with visualizations from other firms. But IDOL may have a new name: Haven. According to the article, that name is actually a relaxed acronym for Hadoop, Autonomy IDOL, HP Vertica, Enterprise Security Products, and “n,” or infinite, applications. HPE promises that this cloud platform with machine learning APIs will assist companies in growing mobile and enterprise applications. The question is, “Can 1990s technology provide what 2016 managers expect?”
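For readers who want a sense of what “more than 60 APIs” looks like in practice, the services are exposed as plain REST endpoints. Below is a minimal sketch of a sentiment-analysis call in Python; the endpoint path and parameter names are assumptions for illustration and should be checked against HPE’s Haven OnDemand documentation before relying on them.

```python
import requests

# Hypothetical Haven OnDemand-style call; the exact route and field names
# are assumptions and should be verified against HPE's API documentation.
API_KEY = "your-api-key"
ENDPOINT = "https://api.havenondemand.com/1/api/sync/analyzesentiment/v1"

response = requests.post(
    ENDPOINT,
    data={
        "apikey": API_KEY,
        "text": "The new dashboard is slow, but support was helpful.",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())  # sentiment scores for the submitted text
```

The freemium tier mentioned above presumably lets developers experiment with calls like this before committing to the usage- and SLA-based commercial model.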

 

Chelsea Kerwin, July 19, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.

Attivio Targets Profitability by the End of 2016 Through $31M Financing Round

July 18, 2016

The VentureBeat article titled Attivio Raises $31 Million to Help Companies Make Sense of Big Data discusses the promises of profitability Attivio has made since its inception in 2007. According to Crunchbase, the search vendor has raised over $100 million from four investors. In March 2016, the company closed a $31 million financing round with the expectation of becoming profitable within 2016. The article explains,

“Our increased investment underscores our belief that Attivio has game-changing capabilities for enterprises that have yet to unlock the full value of Big Data,” said Oak Investment Partners’ managing partner, Edward F. Glassmeyer. Attivio also highlighted such recent business victories as landing lab equipment maker Thermo Fisher Scientific as a client and partnering with medical informatics shop PerkinElmer. Oak Investment Partners, General Electric Pension Trust, and Tenth Avenue Holdings participated in the investment, which pushed Attivio’s funding to at least $102 million.”

In the VentureBeat profile about the deal, Stephen Baker, CEO of Attivio, makes it clear that 2015 was a turning point for the company, or in his words, “a watershed year.” Attivio prides itself on both speeding up the data preparation process and empowering its customers to “achieve true Data Dexterity.” And, hopefully, it will also be profitable soon.

 

Chelsea Kerwin, July 18, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.


Technology Does Not Level the Playing Field

July 12, 2016

Among the many articles about how too much automation of the labor force will devastate humanity, I found another piece that describes how technology, as a set of tools, is a false equalizer.  The Atlantic published the piece titled “Technology, the Faux Equalizer.”  What we tend to forget is that technology consists of tools made by humans.  These tools have consistently become more complicated as society has advanced.  The article acknowledges this by asking us to remember one hundred years ago, when electricity was a luxurious novelty.  Only the wealthy and those with grid access used electricity, but now it is as common as daylight.

This example points to how brand-new technology is only available to a limited percentage of people.  Technological progress and social progress are not mutually inclusive.  Another example notes that it was not Gutenberg’s printing press that revolutionized printing for society, but rather the discovery of cheaper materials for making books.  Until technology is available to everyone, it is not truly beneficial:

“Just compare the steady flow of venture capital into Silicon Valley with the dearth of funding for other technological projects, like critical infrastructure improvements to water safety, public transit, disintegrating bridges, and so on. ‘With this dynamic in mind, I would suggest that there is greater truth to the opposite of Pichai’s statement,’ said Andrew Russell, a professor at Stevens Institute of Technology. ‘Every jump in technology draws attention and capital away from existing technologies used by the 99 percent, which therefore undermines equality, and reduces the ability for people to get onto the ‘playing field’ in the first place.’”

In science-fiction films depicting the future, we imagine that technology lessens the gap between people around the world, but we need to be reminded that the future is now.  Only a few people have access to the future; compare the average lifestyle in Europe and the United States with that in many African and Middle Eastern nations.  History tells us that this is the trend we will always follow.

Oh, oh. We thought technology would fix any problem. Perhaps technology exacerbates old sores and creates new wounds? Just an idle question.

 

Whitney Grace,  July 12, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

VirtualWorks Purchases Natural Language Processing Firm

July 8, 2016

Another day, another merger. PR Newswire released a story, VirtualWorks and Language Tools Announce Merger, which covers VirtualWorks’ purchase of Language Tools. With Language Tools, VirtualWorks inherits computational linguistics and natural language processing technologies. VirtualWorks is an enterprise search firm. Erik Baklid, Chief Executive Officer of VirtualWorks, is quoted in the article:

“We are incredibly excited about what this combined merger means to the future of our business. The potential to analyze and make sense of the vast unstructured data that exists for enterprises, both internally and externally, cannot be understated. Our underlying technology offers a sophisticated solution to extract meaning from text in a systematic way without the shortcomings of machine learning. We are well positioned to bring to market applications that provide insight, never before possible, into the vast majority of data that is out there.”

This is another case of a company positioning itself as a leader in enterprise search. Is it anything special? Well, the news release mentions several core technologies that will be bolstered by the merger: text analytics, data management, and discovery techniques. We will have to wait and see what the future holds for the company in the enterprise search and business intelligence sector it seeks to lead.

Megan Feil, July 8, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Who Will Connect the Internet of Things to Business

June 23, 2016

Remember when Nest Labs had all the hype a few years ago? An article from BGR reminds us how the tides have turned: Even Google views its Nest acquisition as a disappointment. Google purchased Nest Labs for $3.2 billion in 2014. The company’s newly launched products, a Wi-Fi smoke alarm and thermostat, seemed at the time to position it for greater and greater success. This article offers a look at the current state:

“Two and a half years later and Nest is reportedly in shambles. Recently, there have been no shortage of reports suggesting that Nest CEO Tony Fadell is something of a tyrannical boss cut from the same cloth as Steve Jobs (at his worst). Additionally, the higher-ups at Google are reportedly disappointed that Nest hasn’t been able to churn out more hardware. Piling it on, Re/Code recently published a report indicating that Nest generated $340 million in revenue last year, a figure that Google found disappointing given how much it spent to acquire the company. And looking ahead, particulars from Google’s initial buyout deal with Nest suggest that the pressure for Nest to ramp up sales will only increase.”

Undoubtedly there are challenges when it comes to expectations about acquired companies’ performance. But when it comes to the nitty-gritty details of the work happening inside those acquisitions, aren’t managers supposed to solve problems, not simply agree that the problems exist? How the success of “Internet of Things” companies will pan out seems to be predicated on their inherent interconnectedness, and that seems to apply at the level of both product and business.

 

Megan Feil, June 23, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Data Wrangling Market Is Self-Aware and Growing, Study Finds

June 20, 2016

The article titled Self-Service Data Prep is the Next Big Thing for BI on Datanami digs into the quickly growing data preparation industry by reviewing a Dresner Advisory Services study. The article lists the major insights from the study and paints a vivid picture of the current circumstances. Most companies perform end-user data preparation, but only a small percentage (12%) consider themselves proficient in the area. The article states,

“Data preparation is often challenging, with many organizations lacking the technical resources to devote to comprehensive data preparation. Choosing the right self-service data preparation software is an important step…Usability features, such as the ability to build/execute data transformation scripts without requiring technical expertise or programming skills, were considered “critical” or “very important” features by over 60% of respondents. As big data becomes decentralized and integrated into multiple facets of an organization, users of all abilities need to be able to wrangle data themselves.”

Ninety percent of respondents agreed on the importance of two key features: the capacity to aggregate and group data, and a straightforward interface for imposing structure on raw data. Trifacta earned the top ranking among just under 30 vendors for the second year in a row. The article concludes by suggesting that many users are already aware that data preparation is not a standalone activity, and that data prep software must be integrated with other resources to succeed.
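To make those two features concrete, here is a minimal sketch in Python with pandas of the grunt work the study describes: first imposing structure on a raw export, then aggregating and grouping it. The file and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical raw export; the file and column names are invented for illustration.
raw = pd.read_csv("sales_export.csv")

# Impose structure on the raw data: parse dates, coerce numerics, drop unusable rows.
raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
clean = raw.dropna(subset=["order_date", "amount"])

# Aggregate and group: monthly totals and order counts per region.
summary = (
    clean.groupby(["region", clean["order_date"].dt.to_period("M")])
    .agg(total_amount=("amount", "sum"), orders=("amount", "count"))
    .reset_index()
)
print(summary.head())
```

Self-service tools wrap this kind of work in a point-and-click interface, which is presumably what the more than 60 percent of respondents who flagged “no programming skills required” as critical are after.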

 

Chelsea Kerwin, June 20, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Amazon’s Alexa Popularizes Digital Assistants

June 17, 2016

Digital assistants are smarter than ever.  I remember when PDAs were the wave of the future and meant to revolutionize our lives, but they still relied on human input and did not have much in the way of artificial intelligence.  Now Cortana, Siri, and Alexa respond to vocal commands like something out of an episode of Star Trek.  Digital assistants are still limited in many ways, but according to VentureBeat, Alexa might be changing how we interact with technology: “How Amazon’s Alexa Is Bringing Intelligent Assistance Into the Mainstream.”

Natural language processing teamed with artificial intelligence has made using digital assistants easier and more accepted.  Predictive analytics specialist MindMeld commissioned a “user adoption survey” of voice-based intelligent assistants and the results show widespread adoption.

Amazon’s Echo, paired with the speech-enabled Alexa service, is not necessarily dominating the market, but Amazon is showing the potential of an intelligent system with added services such as Uber, music streaming, financial partners, and many more.

“Such routine and comfort will be here soon, as IA acceptance and use continue to accelerate. What started as a novelty and source of marketing differentiation from a smartphone manufacturer has become the most convenient user interface for the Internet of Things, as well as a plain-spoken yet empathetic controller of our digital existence.”

Amazon is on the right path, as are other companies experimenting with digital assistants.  My biggest quibble is that all of these digital assistants are limited and carry price tags larger than some people’s meal budgets.  It is not worth investing in an intelligent assistant unless you actually need one; I say wait for the better, cheaper technology that will be here soon.

 

Whitney Grace, June 17, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Next-Generation Business Intelligence Already Used by Risk Analysis Teams

June 1, 2016

Ideas about business intelligence have certainly evolved with emerging technologies. An article from CIO, Why Machine Learning Is the New BI, speaks to this transformation of the concept. The author describes how reactive analytics based on historical data do not optimally assist business decisions; questions about customer satisfaction are best oriented toward proactive future-proofing, according to the article. The author writes,

“Advanced, predictive analytics are about calculating trends and future possibilities, predicting potential outcomes and making recommendations. That goes beyond the queries and reports in familiar BI tools like SQL Server Reporting Services, Business Objects and Tableau, to more sophisticated methods like statistics, descriptive and predictive data mining, machine learning, simulation and optimization that look for trends and patterns in the data, which is often a mix of structured and unstructured. They’re the kind of tools that are currently used by marketing or risk analysis teams for understanding churn, customer lifetimes, cross-selling opportunities, likelihood of buying, credit scoring and fraud detection.”
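To make the contrast concrete, here is a minimal sketch of the predictive side of that list, a churn-scoring model built with scikit-learn; the customer table and its column names are hypothetical, invented purely for illustration.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical customer table; the column names are invented for illustration.
df = pd.read_csv("customers.csv")
features = ["tenure_months", "monthly_spend", "support_tickets", "logins_last_30d"]
X, y = df[features], df["churned"]  # churned: 1 if the customer left, 0 otherwise

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Learn patterns from historical behavior, then score held-out customers.
model = GradientBoostingClassifier().fit(X_train, y_train)
churn_risk = model.predict_proba(X_test)[:, 1]  # probability of churn per customer
print("AUC:", roc_auc_score(y_test, churn_risk))
```

A traditional BI report tells you how many customers left last quarter; a model like this ranks who is likely to leave next, which is the proactive posture the article is arguing for.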

Does this mean that traditional business intelligence, after much hype and millions in funding, is a flop? Or will predictive analytics be a case of polishing up existing technology and presenting it in new packaging? After time, and for some after much money has been spent, we should have a better idea of the true value.

 

Megan Feil, June 1, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

An Open Source Search Engine to Experiment With

May 1, 2016

Apache Lucene receives the most headlines when it comes to discussions of open source search software.  My RSS feed pulled up another open source search engine that shows promise in being a decent piece of software.  Open Semantic Search is free software that can be used for text mining, analytics, a search engine, a data explorer, and other research tools.  It is based on Elasticsearch/Apache Solr’s open source enterprise search.  It was designed with open standards and with robust semantic search.

As with any open source search engine, it can be configured with numerous features based on the user’s preference.  These include tagging, annotation, support for various file formats, support for multiple data sources, data visualization, newsfeeds, automatic text recognition, faceted search, interactive filters, and more.  It has the added benefit that it can be adapted for mobile platforms, metadata management, and file system monitoring.

Open Semantic Search is described as

“Research tools for easier searching, analytics, data enrichment & text mining of heterogeneous and large document sets with free software on your own computer or server.”
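Because the stack sits on Apache Solr, a faceted query against the underlying index gives a feel for how the interactive filters mentioned above work. The sketch below assumes a local Solr instance; the core name and field names are illustrative, not Open Semantic Search’s actual schema.

```python
import requests

# Assumes a local Solr instance; the core and field names are illustrative,
# not Open Semantic Search's actual schema.
SOLR_SELECT = "http://localhost:8983/solr/documents/select"

params = {
    "q": "content_txt:contract",                 # full-text query
    "facet": "true",
    "facet.field": ["author_s", "filetype_s"],   # counts that drive interactive filters
    "rows": 10,
    "wt": "json",
}

resp = requests.get(SOLR_SELECT, params=params, timeout=30)
resp.raise_for_status()
result = resp.json()
print(result["response"]["numFound"], "matching documents")
print(result["facet_counts"]["facet_fields"])
```

The facet counts returned alongside the hits are what a front end turns into clickable filters for drilling into a large document set.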

While its base code is derived from Apache Lucene, it takes the original product and builds something better.  Proprietary software is an expense dubbed a necessary evil if you work in a large company.  If, however, you are a programmer and have the time to develop your own search engine and analytics software, do it.  It could even turn out better than the proprietary stuff.

 

Whitney Grace, May 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
