
Opening Watson to the Masses

March 4, 2015

IBM is struggling financially, and one way it hopes to pull itself out of the swamp is to find new applications for its supercomputers and software. One way it is trying to cash in on Watson is to create cognitive computing apps. EWeek alerts open source developers, coders, and friendly hackers that IBM has released a batch of beta services: “13 IBM Services That Simplify The Building Of Cognitive Watson Apps.”

IBM now gives all software geeks the chance to add their own input to cognitive computing. How?

“Since its creation in October 2013, the Watson Developer Cloud (WDC) has evolved into a community of over 5,000 partners who have unlocked the power of cognitive computing to build more than 6,000 apps to date. With a total of 13 beta services now available, the IBM Watson Group is quickly expanding its developer ecosystem with innovative and easy-to-use services to power entirely new classes of cognitive computing apps—apps that can learn from experience, understand natural language, identify hidden patterns and trends, and transform entire industries and professions.”

The thirteen new IBM services involve language, text processing, analytical tools, and data visualization. These services can be applied to a wide range of industries and fields, improving the way people work and interact with their data. While it is easy to imagine practical applications, how the services will actually be used remains to be seen.
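For developers, the services in question are exposed as REST endpoints on IBM’s cloud. As a rough illustration only, here is a minimal Python sketch of the calling pattern such an app might use; the endpoint URL, credentials, and payload shape are hypothetical placeholders, not IBM’s actual Watson Developer Cloud API.

# Minimal sketch: calling a REST-style cognitive service from Python.
# The endpoint, credentials, and JSON shape below are illustrative
# placeholders, not IBM's actual Watson Developer Cloud API.
import requests

SERVICE_URL = "https://example-cognitive-service.test/api/v1/analyze"  # hypothetical endpoint
USERNAME = "service-username"   # placeholder credential
PASSWORD = "service-password"   # placeholder credential

def analyze_text(text):
    """Send a block of text to the (hypothetical) service and return its JSON reply."""
    response = requests.post(
        SERVICE_URL,
        auth=(USERNAME, PASSWORD),
        json={"text": text},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = analyze_text("What patterns appear in last quarter's support tickets?")
    print(result)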

Whitney Grace, March 04, 2015
Sponsored by ArnoldIT.com, developer of Augmentext

IBM: Think Big, Harder, Slower

February 28, 2015

I read “IBM Says Cloud, Mobile, and Data Businesses Will Reach $40 Billion by 2018.” The write up reports that IBM has some “strategic imperatives.” I assumed that sustainable revenue growth and healthy profits were important. Well, maybe.

Computing and IT services giant IBM will spend $4 billion on its cloud services, data analytics and mobile businesses in a bid to turn it into what CEO Ginni Rometty said will be a $40-billion-a-year-in-revenue business by 2018. On a conference call ahead of its annual investors presentation in New York, Rometty said the three businesses, which she referred to as IBM’s “strategic imperatives,” have grown in overall importance as it has divested itself of its older traditional business units. Five years ago the divisions amounted to 13 percent of IBM’s sales, Rometty said. By the start of 2015 they accounted for 27 percent.

Not long ago, Watson was going to be a $10 billion business. IBM should be proud of these projections.

The hitch in the cloud, analytics, and mobile git along is that there are a few other outfits with the same idea. A couple of these companies seem to have some traction in the cloud and mobile markets.

With regard to analytics, IBM has some useful technologies. The problem is that the company does not know how to deliver solutions that generate sustainable revenue. As a result, a number of smaller firms are jockeying for lucrative US government contracts and for deals with customers eager to obtain comparable services at more attractive prices.

The write up points out:

Rometty said IBM will also do more partnerships with other companies similar to deals announced last year with Apple to jointly sell and develop mobile software, and a deal announced earlier this month with SoftBank to bring the Watson cognitive computing system to Japan. The result, she said, will be “IBM reinvented again.”

Sounds great. Like HP, IBM is doing MBAish activities. Stakeholders will be looking for answers about job security, stock and dividends, and sustainable growth. So far I see marketing, stock buybacks, and fast dancing.

I don’t want to dance with Watson, the system that generates recipes requiring tamarind. Judging from the comments on the Alliance@IBM Web site, there are some internal issues that IBM must manage as well.

Stephen E Arnold, February 28, 2015

Tibco and Predictive Analytics

February 25, 2015

Business intelligence and infrastructure firm Tibco has been busy making deals lately. A press release at Digital Journal tells us that “Tibco and Lavastorm Analytics Announce Predictive Analytics Environment that Enhances IT and Business Collaboration.” Shortly thereafter, Virtual-Strategy Magazine reveals in its post, “Pilgrim Launches BI Solution for Quality Performance Insights,” that Tibco’s tech will underpin Pilgrim’s new platform.

The Digital Journal article discusses the embedding of Tibco’s TERR engine into Lavastorm’s Analytics Engine:

“The predictive analytics capability of TERR enhances the Lavastorm Analytics Engine’s drag-and-drop data assembly and analytical capabilities providing a high-performance, highly-scalable implementation of the popular R statistical computing language. Data scientists can now leverage R to apply predictive analytical techniques and package them into reusable analytic building blocks that enable rapid self-service data analysis by business users seeking insights and increased business efficiency.”

Meanwhile, the Virtual-Strategy post describes Pilgrim’s SmartSolve BI suite:

“SmartSolve BI is powered by TIBCO Spotfire technology. Its analytic and visualization engine is coupled with the proven capabilities of SmartSolve, Pilgrim’s quality management solution. Its numerous quality management metrics and dashboards enhance clients’ access to, and visibility of, their quality and compliance results and trends. Transforming this data with SmartSolve BI drives a multitude of analytical advantages including improved decision making with built-in quality KPIs and prebuilt dynamic dashboards that display a variety of sophisticated charts, graphs, plots and tables.”

Launched in 1997 and headquartered in Palo Alto, California, Tibco provides infrastructure and business intelligence solutions to businesses in several industries around the world. TERR, by the way, stands for the Tibco Enterprise Runtime for R; it is one of many Tibco products.

Lavastorm Analytics emphasizes data aggregation and user-friendly reports. Besides analytics and BI, the company offers tools for fraud management, data discovery, and revenue assurance. Lavastorm was founded in 1999, and is headquartered in Boston, Massachusetts.

Operating out of Tampa, Florida, Pilgrim focuses on risk, compliance, and quality management software for highly regulated industries around the world. They also happen to be hiring for several positions as of this writing.

Cynthia Murrell, February 25, 2015

Sponsored by ArnoldIT.com, developer of Augmentext

OpenText Acquires Actuate

February 24, 2015

The OpenText-Actuate deal has gone through, we learn from OpenText’s press release, “OpenText Buys Actuate Corporation.” It seems they were not much hindered by that legal snag the arrangement encountered at the end of last year. The press release reports:

“Complementing OpenText’s existing information management and B2B integration offerings, Actuate offers increased business process efficiencies, greater brand experience and personalized insight for better and faster decisions via analytics and visualization. OpenText customers will now benefit from added analytic capabilities to their existing deployments and a new breed of analytics that provide insight across entire business flows.”

“Actuate will continue to serve the embedded analytics market, the developer, and will be deeply integrated into OpenText Products and OpenText, enabling OpenText to deliver analytics for the entire EIM suite based on a common platform…. Designed to be embeddable, developers can use the platform to enrich nearly any application, whether it is deployed on premises or in the cloud.”

Founded in 1991 and based in Waterloo, Ontario, OpenText supplies its clients with enterprise content management, business process management, and customer experience management tools. Actuate is headquartered in San Mateo, California, and was launched in 1993. The company founded and co-leads the Eclipse BIRT (Business Intelligence and Reporting Tools) open source project.

Cynthia Murrell, February 24, 2015

Sponsored by ArnoldIT.com, developer of Augmentext

Bayes Explained with Lego Blocks

February 20, 2015

At yesterday’s successful CyberOSINT conference, several presenters referenced Bayesian, Laplacian, and Markovian methods. I came across a visual explanation of the good Reverend’s theorem. Navigate to “Bayes’ Theorem with Lego.” Worth a gander if the theorem does not coo to you.
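For readers who prefer code to bricks, here is a minimal Python sketch of the same counting argument the Lego article makes: Bayes’ theorem is just a ratio of block counts. The stud counts below are my own invented numbers, not taken from the article.

# Minimal sketch of Bayes' theorem as counting, in the spirit of the Lego
# explanation. The stud counts are made up for illustration.
total_studs = 60          # the whole Lego plate
blue_studs = 40           # event A: a stud sits on a blue brick
red_on_blue = 4           # red brick pegs covering blue studs
red_on_yellow = 6         # red brick pegs covering yellow studs

p_blue = blue_studs / total_studs                     # P(A)
p_red = (red_on_blue + red_on_yellow) / total_studs   # P(B)
p_red_given_blue = red_on_blue / blue_studs           # P(B|A)

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_blue_given_red = p_red_given_blue * p_blue / p_red
print(p_blue_given_red)   # 0.4, and indeed 4 of the 10 red-covered studs are blue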

Stephen E Arnold, February 20, 2015

Statistics, Statistics. Disappointing Indeed

February 16, 2015

At dinner on Saturday evening, a medical research professional mentioned that reproducing results from tests conducted in the researcher’s lab was tough. I think the buzzword for this is “non-reproducibility.” Someone asked, “Perhaps the research is essentially random?” There were some furrowed brows. My reaction was, “How does one know what’s what with experiments, data, or reproducibility tests?” The table talk shifted to a discussion of Saturday Night Live’s 40th anniversary. Safer ground.

Navigate to “Science’s Significant Stat Problem.” The article makes clear that 2013 thinking may have some relevance today. Here’s a passage I highlighted in pale blue:

Scientists use elaborate statistical significance tests to distinguish a fluke from real evidence. But the sad truth is that the standard methods for significance testing are often inadequate to the task.

There you go. And the supporting information for this statement?

One recent paper found an appallingly low chance that certain neuroscience studies could correctly identify an effect from statistical data. Reviews of genetics research show that the statistics linking diseases to genes are wrong far more often than they’re right. Pharmaceutical companies find that test results favoring new drugs typically disappear when the tests are repeated.

For the math inclined the write up offers:

It’s like flipping coins. Sometimes you’ll flip a penny and get several heads in a row, but that doesn’t mean the penny is rigged. Suppose, for instance, that you toss a penny 10 times. A perfectly fair coin (heads or tails equally likely) will often produce more or fewer than five heads. In fact, you’ll get exactly five heads only about a fourth of the time. Sometimes you’ll get six heads, or four. Or seven, or eight. In fact, even with a fair coin, you might get 10 heads out of 10 flips (but only about once for every thousand 10-flip trials). So how many heads should make you suspicious? Suppose you get eight heads out of 10 tosses. For a fair coin, the chances of eight or more heads are only about 5.5 percent. That’s a P value of 0.055, close to the standard statistical significance threshold. Perhaps suspicion is warranted.
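The arithmetic in that passage is easy to verify. Here is a short Python check of the quoted figures using the binomial distribution; nothing here goes beyond confirming the numbers in the excerpt.

# Check the coin-flip figures quoted above:
# P(k heads in 10 fair flips) = C(10, k) / 2**10.
from math import comb

n = 10
total = 2 ** n

p_exactly_5 = comb(n, 5) / total                           # ~0.246, "about a fourth of the time"
p_all_heads = comb(n, 10) / total                          # ~0.001, "about once per thousand 10-flip trials"
p_8_or_more = sum(comb(n, k) for k in (8, 9, 10)) / total  # ~0.055, the quoted P value

print(f"exactly 5 heads: {p_exactly_5:.3f}")
print(f"10 heads:        {p_all_heads:.4f}")
print(f"8 or more heads: {p_8_or_more:.3f}")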

Now the kicker:

And there’s one other type of paper that attracts journalists while illustrating the wider point: research about smart animals. One such study involved a fish—an Atlantic salmon—placed in a brain scanner and shown various pictures of human activity. One particular spot in the fish’s brain showed a statistically significant increase in activity when the pictures depicted emotional scenes, like the exasperation on the face of a waiter who had just dropped his dishes. The scientists didn’t rush to publish their finding about how empathetic salmon are, though. They were just doing the test to reveal the quirks of statistical significance. The fish in the scanner was dead.

How are those Big Data analyses working out, folks?

Stephen E Arnold, February 16, 2015

Sci Tech Ripple: Lousy Data, Alleged Cover Ups

February 14, 2015

Short honk: I don’t want to get stuck in this tar pit. Read “Are Your Medications Safe?” A professor and some students dug up information that is somewhat interesting. If you happen to be taking medicine, you may not have the full dosage of facts. What’s up? Would the word “malfeasance” be suitable? It is Friday the 13th too.

Stephen E Arnold, February 14, 2015

Lexalytics Now Offers Intention Analysis

February 12, 2015

Lexalytics is going beyond gauging consumers’ feelings, or sentiment analysis, to anticipating their actions with what it calls “intention analysis.” Information Week takes a look at the feature, soon to be a premium service for the company’s Semantria platform, in “Big Data Tool Analyzes Intentions: Cool or Creepy?” Writer Jeff Bertolucci consulted Lexalytics founder and CEO Jeff Catlin, and writes:

Catlin explained via email how intention analysis software would deconstruct the following tweet: “I’ve been saving like crazy for Black Friday. iPhone 6 here I come!”

“There are no words like ‘buy’ or ‘purchase’ in this tweet, even though their intention is to purchase an iPhone,” wrote Catlin. Here’s how an intention analysis tool would tag the tweet:

– intention = “buy”

– intended object = “iPhone”

– intendee = “I”

Grammar-parsing technology is the engine that makes intention analysis work.

“Intention is kind of the sexy feature, but the grammar parser is the key that makes it go, the ability to understand what people are talking about, regardless of content type,” said Catlin. “We’ve built a grammar-parser for Twitter, which deals with the fact that there’s bad punctuation, weird capitalization, and things like that.”
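Lexalytics has not published its parser, so the following Python sketch is only a toy illustration of the idea: a few hand-written cue phrases that map a purchase-flavored tweet to the intention / intended object / intendee triple shown above. It is not the Semantria implementation, and the cue and product lists are invented for the example; real systems rely on grammar parsing rather than keyword matching.

# Toy illustration of intention tagging: NOT Lexalytics' method, just a
# sketch of the concept using hand-written cue phrases.
import re

BUY_CUES = ["saving for", "saving like crazy for", "here i come", "can't wait to get"]  # invented cues
PRODUCTS = ["iphone 6", "iphone", "galaxy s6", "xbox one"]                              # invented product lexicon

def tag_intention(tweet):
    """Return a dict like {'intention': 'buy', 'intended_object': ..., 'intendee': 'I'} or None."""
    text = tweet.lower()
    if not any(cue in text for cue in BUY_CUES):
        return None
    product = next((p for p in PRODUCTS if p in text), None)
    if product is None:
        return None
    # First-person pronouns mark the speaker as the intendee.
    intendee = "I" if re.search(r"\b(i|i've|i'm|my)\b", text) else "unknown"
    return {"intention": "buy", "intended_object": product, "intendee": intendee}

print(tag_intention("I've been saving like crazy for Black Friday. iPhone 6 here I come!"))
# {'intention': 'buy', 'intended_object': 'iphone 6', 'intendee': 'I'}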

Companies can use the technology to determine buying patterns, of course, and may use it to ramp up personalized advertising. Another potential market is that of law enforcement, where agents can use the tool to monitor social media for potential threats.

Lexalytics has been a leader in the sentiment analysis field for years and counts big tech names like Oracle and Microsoft among its clients. Designed to integrate with third-party applications, its analysis software chugs along in the background at many data-related organizations. Founded in 2003, Lexalytics is headquartered in Amherst, Massachusetts.

Cynthia Murrell, February 12, 2015

Sponsored by ArnoldIT.com, developer of Augmentext

Cyber Threats Boost Demand for Next Generation System

February 10, 2015

President Obama’s announcement of a new entity to combat cyber attacks adds an important resource for countering the deepening threat.

The decision reflects the need for additional counterterrorism resources in the wake of the Sony and Anthem security breaches. The new initiative addresses both Federal and commercial sectors’ concerns about escalating cyber threats.

The Department of Homeland Security said in a public release: “National Cybersecurity and Communications Integration Center mission is to reduce the likelihood and severity of incidents that may significantly compromise the security and resilience of the Nation’s critical information technology and communications networks.”

For the first time, a clear explanation of the software and systems that perform automated collection and analysis of digital information is available. Stephen E. Arnold’s new book, “CyberOSINT: Next Generation Information Access,” was written to provide information about advanced information access technology. The new study was published by Beyond Search on January 21, 2015.

The author is Stephen E Arnold, a former executive at Halliburton Nuclear Services and Booz, Allen & Hamilton. He said: “The increase in cyber threats means that next generation systems will play a rapidly increasing part in law enforcement and intelligence activities.”

The monograph explains why next generation information access systems are the logical step beyond keyword search and provides the first overview of the architecture of cyber OSINT systems. It profiles more than 20 systems now available to government entities and commercial organizations, summarizes the year of research behind the study, and includes a glossary of the terms used in cyber OSINT.

Proliferating digital attacks make next generation information access systems a necessity. According to Chuck Cohen, lieutenant with a major Midwestern law enforcement agency and adjunct instructor at Indiana University, “This book is an important introduction to cyber tools for open source information. Investigators and practitioners needing an overview of the companies defining this new enterprise software sector will want this monograph.”

In February 2015, Arnold will keynote a conference on CyberOSINT held in the Washington, DC area. Attendance at the conference is by invitation only. Those interested in the day-long discussion of cyber OSINT can write benkent2020 at yahoo dot com to express their interest in the limited access program.

Arnold added: “Using highly-automated systems, governmental entities and corporations can detect and divert cyber attacks and take steps to prevent assaults and apprehend the people that are planning them. Manual methods such as key word searches are inadequate due to the volume of information to be analyzed and the rapid speed with which threats arise.”

Robert David Steele, a former CIA professional and the co-creator of the Marine Corps intelligence activity, said about the new study: “NGIA systems are integrated solutions that blend software and hardware to address very specific needs. Our intelligence, law enforcement, and security professionals need more than brute force keyword search. This report will help clients save hundreds of thousands of dollars.”

Information about the new monograph is available at www.xenky.com/cyberosint.

Ken Toth, February 10, 2015

Advanced Analytics Are More Important Than We Think

February 3, 2015

Alexander Linden, one of Gartner’s research directors, made some astute observations about advanced analytics and data science technologies. Linden shared his insights with First Post in the article, “Why Should CIOs Consider Advanced Analytics?”

Chief information officers are handling more data and relying on advanced analytics to manage it. The data is critical to gaining market insights, generating more sales, and retaining customers. The old business software cannot handle the overload anymore.

What is astounding is that many companies believe they are already using advanced analytics when in fact they can improve upon their current methods. Advanced analytics are not an upgraded version of normal, descriptive analytics; they employ additional problem-solving tools such as predictive and prescriptive analytics.
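To make the distinction concrete, here is a minimal Python sketch of my own (not Gartner’s or Linden’s, and the sales figures are invented): descriptive analytics summarizes what already happened, while predictive analytics fits a simple model to estimate what comes next.

# Illustration only: descriptive vs. predictive analytics on made-up monthly sales.
from statistics import mean

monthly_sales = [120, 132, 141, 155, 163, 178]  # invented figures

# Descriptive: what happened?
print(f"average monthly sales: {mean(monthly_sales):.1f}")
print(f"total sales to date:   {sum(monthly_sales)}")

# Predictive: what is likely next? (ordinary least-squares trend line by hand)
n = len(monthly_sales)
xs = list(range(n))
x_bar, y_bar = mean(xs), mean(monthly_sales)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_sales)) / \
        sum((x - x_bar) ** 2 for x in xs)
intercept = y_bar - slope * x_bar
print(f"forecast for month {n + 1}: {intercept + slope * n:.1f}")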

Gartner also flings out some really big numbers:

“One of Gartner’s new predictions says that through 2017, the number of citizen data scientists will grow five times faster than the number of highly skilled data scientists.”

This is akin to there being more people able to code and create applications than skilled engineers with college degrees. A do-it-yourself mentality will take hold in the data analytics community, but Gartner stresses that backyard advanced analytics will not cut it. Companies need to continue to rely on skilled data scientists to interpret the data and network it across the business units.

Whitney Grace, February 03, 2015
Sponsored by ArnoldIT.com, developer of Augmentext
