Sparse Data Gathering Abundant Attention

September 30, 2012

Sparse data has been gaining some attention lately, according to the article titled “It’s Called ‘Sparse Data,’ and It Could Be a Big Deal” on Government Computer News. Sparse data is information that comes from sensors and other non-IT devices, such as temperature readings or counts of how often something is used. Although this may not seem very important in the grand scheme of things, experts are taking another look at the usefulness and potential impact of sparse data. It could make an organization more efficient, according to Jerry Gentry, the vice president of IT program management at Nemertes Research.

We learn about his vision for the future of sparse data in the article:

“Government agencies, like other operations, could use this kind of data to more efficiently manage buildings, but compiling sparse data is already being used in other ways, such as monitoring traffic on bridges and roadways, or in a variety of weather monitors or tsunami prediction systems. Sensors are increasingly being deployed by agencies, which means sparse data likely will become a term you’ll hear more often.”

Although the term isn’t as big yet as, well, Big Data, it still warrants some attention. The potential to turn these streams into manageable, useful information is there, and experts need to plan how to harness sparse data before it becomes unmanageable.
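As a toy illustration of what that could look like, here is a minimal Python sketch, using invented sample readings, that rolls sparse, irregular sensor data up into simple per-sensor summaries:

```python
# Minimal sketch with invented sample data: roll up sparse, irregular sensor
# readings (sensor id, timestamp, value) into simple per-sensor summaries.
from collections import defaultdict
from statistics import mean

readings = [
    ("hvac-3", "2012-09-30T02:00", 19.5),   # temperature reading
    ("hvac-3", "2012-09-30T14:00", 23.1),
    ("door-1", "2012-09-30T08:15", 42.0),   # a usage count rather than a temperature
    ("hvac-7", "2012-09-30T14:05", 27.8),
]

by_sensor = defaultdict(list)
for sensor_id, _timestamp, value in readings:
    by_sensor[sensor_id].append(value)

for sensor_id, values in sorted(by_sensor.items()):
    print(f"{sensor_id}: {len(values)} readings, avg {mean(values):.1f}, max {max(values):.1f}")
```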

Andrea Hayden, September 30, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

The Metaprocess Puzzle

September 29, 2012

BeyeNetwork suggests one reason metadata is not implemented comprehensively or well: “Lack of Metaprocess Information Impedes Ability to Collect Metadata.” Writer and database management expert Bill Inmon pins the lack of enterprise-wide metadata primarily on a lack of metaprocess information. Metaprocess covers high-level descriptive details about a process, like its name, the technology that houses it, its input and output, and algorithmic variables. It is pointless, Inmon insists, to attempt to understand a large organization’s information flow without it.

Why is metaprocess information so hard to come by? The article explains:

“It resides in the old legacy code. In COBOL. In assembler. In AS/400 modules. In PL/1. In technology that has not seen the light of day in decades. Once there were technicians that could be hired to read and go through the old code. Today those technicians have retired or have been promoted to management positions. In another generation, it won’t even be possible to find anyone who understands these older technologies. And by that time, SQL and C++ will be the old legacy technologies of the day.”

How does one solve a metaprocess problem? What is the meta-metaprocess? Inmon doesn’t really have an answer to that. He does suggest that, since legacy code is a form of text, someone may someday find a way to coax this information out of the code itself. Anyone up for the challenge?
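Should anyone take up that challenge, the starting point could be as humble as pattern matching over old source files. Here is a minimal Python sketch, using a toy COBOL fragment and illustrative regular expressions of our own (not Inmon’s method), that pulls a process name and its inputs and outputs out of legacy code treated as plain text:

```python
# Illustrative only: scan a toy COBOL fragment for a few metaprocess clues --
# the program name and the files it opens for input and output.
# The sample source and regexes are assumptions, not a production parser.
import re

sample_cobol = """
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL01.
       PROCEDURE DIVISION.
           OPEN INPUT EMPLOYEE-FILE.
           OPEN OUTPUT PAYCHECK-FILE.
"""

program = re.search(r"PROGRAM-ID\.\s+(\S+)\.", sample_cobol)
inputs = re.findall(r"OPEN\s+INPUT\s+(\S+)\.", sample_cobol)
outputs = re.findall(r"OPEN\s+OUTPUT\s+(\S+)\.", sample_cobol)

print("process name:", program.group(1) if program else "unknown")
print("inputs read:", inputs)
print("outputs written:", outputs)
```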

Cynthia Murrell, September 29, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Communicating the Value of Big Data

September 28, 2012

The IT world has successfully communicated the message that Big Data is valuable. It has been less successful in explaining how, exactly. Following its recent BI & Analytics Perspectives Conference, Computerworld realizes that “Finding the Business Value in Big Data is a Big Problem.” The IT pros gathered at the conference seem to agree that the current problem with the data analytics phenomenon is figuring out just what to do with all that information. The article explains:

“Technology vendors and industry analysts tout the enormous business benefits that enterprises can gain from mashing up traditional structured data with unstructured data from the cloud, mobile devices, social media channels and other sources. But business executives have little idea of how to take advantage of Big Data or how to articulate their requirements to IT, according to several executives at the show.

“Business leaders often ‘don’t know what they don’t know,’ said one frustrated IT manager, and therefore they are incapable of explaining to IT shops what to do with all this data that’s being accumulated.”

A similar problem crops up with most new technologies. People have to break out of decades-old thought patterns to even see the possibilities—an overwhelming task for most humans. Some companies are creating “innovation labs” with the sole purpose of getting the best ROI from their Big Data investments.

That’s probably a good thing, but I think vendors must take responsibility for explaining the value in their products. Collecting and peering at data is only valuable as a means to some profitable end. Those who supply Big Data solutions must find ways to illustrate the worth of their products with clear examples, suggestions, and starting points for potential clients. If they cannot, their businesses may not outlast what could become known as the Big Data Fad of the early 21st Century.

Cynthia Murrell, September 28, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Data Analytics in Genetic Research

September 28, 2012

We’re pleased to see this excellent example of the use of analytics. ScienceDaily reveals, “Information Theory Helps Unravel DNA’s Genetic Code.” Specifically, scientists at the Indian Institute of Technology in Delhi were working on one of today’s biggest biology challenges—predicting the distribution of coding and noncoding regions (exons and introns, respectively) in a previously unannotated genome. The researchers were able to speed the process using information theory techniques. The brief write up explains:

“The researchers were able to achieve this breakthrough in speed by looking at how electrical charges are distributed in the DNA nucleotide bases. This distribution, known as the dipole moment, affects the stability, solubility, melting point, and other physio-chemical properties of DNA that have been used in the past to distinguish exons and introns.

“The research team computed the ‘superinformation,’ or a measure of the randomness of the randomness, for the angles of the dipole moments in a sequence of nucleotides. For both double- and single-strand forms of DNA, the superinformation of the introns was significantly higher than for the exons.”
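For readers curious what a measure of the “randomness of the randomness” can look like, here is a minimal Python sketch of a simplified scheme of our own devising (not the Delhi team’s published algorithm): compute the Shannon entropy of each block of a sequence, then compute the entropy of the distribution of those block entropies.

```python
# Illustrative sketch of a "superinformation"-style measure: the Shannon
# entropy of the distribution of per-block entropies of a sequence.
# Block size, the rounding used to discretize entropies, and the sample
# sequences are assumptions for demonstration, not the published method.
import math
from collections import Counter

def shannon_entropy(symbols):
    counts = Counter(symbols)
    total = len(symbols)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

def superinformation(sequence, block_size=20):
    blocks = [sequence[i:i + block_size]
              for i in range(0, len(sequence) - block_size + 1, block_size)]
    block_entropies = [round(shannon_entropy(b), 2) for b in blocks]  # discretize
    return shannon_entropy(block_entropies)

print(superinformation("ACGT" * 30))                               # uniform blocks: 0.0
print(superinformation("A" * 40 + "ACGT" * 10 + "AACCGGTT" * 5))   # mixed regions: nonzero
```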

Studying these DNA regions helps scientists better understand diseases and develop more effective treatments. It is just one of the many ways data analytics can be used for something other than boosting a corporation’s bottom line.

Cynthia Murrell, September 28, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Upcoming Discussion on Big Data and Health Care

September 26, 2012

GigaOM recently reported on an upcoming round table discussion between several experts in the field of big data analytics in the article, “Using Big Data to Reinvent Health Care.”

According to the article, the panel of experts, which includes GigaOM Analyst Jody Ranck and LexisNexis HPCC Systems Data Scientist Joe Prichard, will use real-world examples to address how big data services are being used in the early detection of diseases, as well as how big data software can help detect fraud.

When providing some background on the topic, the article states:

“Today’s health care system is challenged in managing data and costs. From the rise of an aging population to the increase of fraud, waste and abuse, health care organizations are challenged to efficiently maximize their resources for legitimate patients who need care. New health care legislation brings challenges and opportunities for health care organizations and their patients. With health care costs projected to spiral into the trillions of dollars in the next few years, many organizations are applying the advances in IT, mobile and other forms of technology to the problem of containing costs and finding new and innovative ways to care for patients.”

Big data reinventing health care is a remarkable claim. This should be an interesting discussion to keep tabs on.

Jasmine Ashton, September 26, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Small Businesses Falling Short in Monitoring Online Conversations

September 25, 2012

The data that marketing professionals must analyze each day now comes from an overwhelming number of different channels, which makes it very difficult to track all of the disparate information manually. Therefore, no one should be surprised that a recent survey conducted by PR Newswire found that “Only 37% of Small Business Communicators Monitor Conversations on a Daily Basis.”

According to the article, the majority of marketing and communications professionals agree that, in order to maintain an online following and keep online conversations relevant, it is important to listen to a variety of social media channels.

However, the survey found:

“Fewer than 40% of small business communicators monitor conversations daily, despite the speed with which conversations and rumors can take hold online. The good news is that only 3% of communicators reported that they don’t do any monitoring. Another 18% indicated they monitor conversations weekly.

One reason why the majority of communicators aren’t listening on a daily basis likely stems from the simple fact that many people find themselves relying upon multiple channels in order to keep tabs of key social networks and online groups.”

As big data analytics technology becomes more readily available and affordable, small businesses are going to have to invest in these products in order to stay abreast of the needs of their clients and constituencies.

Jasmine Ashton, September 25, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Centrifuge Releases Latest Version of Visual Networks Analytics Platform

September 24, 2012

Centrifuge recently published “Centrifuge Delivers Scalable Big Data Analytics With Visual Networks Analytics Version 2.7” which discusses a new solution that minimizes the need for data scientists while accelerating discovery across disparate data points. Sounds pretty cool…and complicated.

According to the news release, Centrifuge, a provider of Big Data analytics and visualization solutions for fraud, security, and risk, announced the availability of the latest version of its Visual Networks Analytics platform. The platform aims to deliver contextual intelligence and pattern discovery in big data through technology that helps users quickly filter, sift, and understand large amounts of information.

Renee Lorton, Centrifuge CEO, explains:

“Corporate Information security is a big data analytics challenge that cannot be addressed with traditional data mining, BI, or legacy analytics approaches. The sheer volume and complexity requires a powerful investigative discovery approach that is easy enough for a non-data scientist to use. Machine data, for example, is one of the fastest growing segments of big data, generated by websites, applications, servers, networks, mobile devices and other sources. Now, discovering patterns in Big Data is both easy and cost effective with Centrifuge’s powerful interactive data visualization.”

With an increasing number of organizations being hacked, information security is becoming a higher priority. A variety of industries would benefit from this technology.

Jasmine Ashton, September 24, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Funnelback Releases Version 12

September 22, 2012

Funnelback recently unveiled Version 12 of its software at its Funnelback User Conference in Brisbane, Australia, we learn from Image and Data Manager’s “Funnelback 12 Tackles Big Data.” The company’s R&D manager, Matthew Sheppard, declared that this version significantly boosts both speed and scale. The article tells us:

“Funnelback 12 adds faster and more powerful data searching capabilities. These include better performance of file share and HP TRIM gathering, and ‘Search as You Gather’, the ability to search immediately as content is gathered.

“The latest version of Funnelback also returns more informative search results with date-based facets, easily categorising results by dates, and TextMiner, a new feature that helps define terms and acronyms for users, providing direct access to more information and context on a query.”

That TextMiner feature sounds like an inspired addition—a real time- and face-saver. Funnelback also added a new web-based administration interface that, the company says, simplifies maintenance and customization. The write up further boasts of improved integration APIs and more tunable ranking algorithms. The software is available for Windows, for Linux, and as a cloud service.

Based in Australia, Funnelback was established in 2005. The company grew from technology developed by premier Australian scientific research agency CSIRO, and was bought by UK content management company Squiz in 2009. They offer Enterprise and Website Search, both of which include customizable features.

Cynthia Murrell, September 22, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

VPI Taps Autonomy IDOL for Analytics

September 18, 2012

Autonomy vaunts its latest victory in “VPI Selects Autonomy, an HP Company, to Deliver Advanced Speech and Multichannel Analytics.” A global provider of workforce optimization solutions, VPI will soon embed Autonomy’s IDOL Server into its analytics tools VPI Empower and VPI Empower 911. The press release explains:

“Combining Autonomy IDOL with VPI’s context-directed interaction analytics gives enterprises and public safety organizations a more accurate and comprehensive understanding of all their communications activity. All calls to a help desk or emergency center can be automatically classified using VPI’s desktop analytics, which tags valuable data and events from CRM, ERP, CAD, helpdesk and other applications to recorded communications, to provide precise context of the conversations. This allows organizations to provide better and faster customer service and patient care. Managers can also apply speech analytics to any category of interest, such as repeat calls, high value sales, account cancellations or security breaches, increasing speed and accuracy of their search and analysis without having to listen to all calls.”

Headquartered in Camarillo, CA, VPI was founded in 1994. They supply customer experience and workforce optimization solutions for enterprises, trading floors, government agencies, and, perhaps most importantly, first responders. Let’s hear it for anything that makes their jobs faster and easier!

Autonomy, founded in 1996 and now owned by HP, offers solutions that use IDOL to tame mind-boggling amounts of unstructured data. The technology grew from research originally performed at Cambridge University, and now serves prominent public and private organizations around the world.

Cynthia Murrell, September 18, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

FlowForce Beta 3 Automates Data Transformations

September 17, 2012

The Altova Blog recently reported on the release of the latest edition of Altova’s FlowForce Server, a new product that automates the execution of MapForce data transformations, in the article “FlowForce Server Beta 3 Now Available.”

According to the article, this new tool is designed to provide comprehensive management and control over data transformations performed by dedicated high-speed servers. The beta test period for FlowForce Server Beta 3 has been extended to March 31, 2013, and the software is available in both 32-bit and 64-bit versions.

The article states:

“FlowForce Server Beta 3 adds support for remote job requests via an HTTP client and job parameters that can be passed to any step in a job. When used together with the request interface, job parameters empower the HTTP client to specify input values in the job request.

FlowForce Server Beta 3 also permits any job to be called as a step within another job, implements individual job queues that make it possible to control server resources used by jobs, and adds many more refinements and enhancements.”
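To make the idea of a remote job request concrete, here is a minimal Python sketch. The host, port, job path, and parameter names are invented for illustration and are not Altova’s documented FlowForce interface; they simply show the pattern of an HTTP client triggering a server-side job and passing input values as parameters:

```python
# Hypothetical illustration only: the URL and parameter names below are
# assumptions, not Altova's documented FlowForce request interface. The point
# is the pattern: an HTTP client triggers a job and supplies input values.
import urllib.parse
import urllib.request

job_url = "http://flowforce.example.com:4646/service/nightly-transform"  # assumed endpoint
params = {"source_file": "orders.xml", "output_format": "csv"}           # assumed job parameters

data = urllib.parse.urlencode(params).encode("utf-8")
request = urllib.request.Request(job_url, data=data)  # POST because a body is supplied

with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```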

For more information on the free beta version of this solution, check out the FlowForce Server Beta 3 download page.

Jasmine Ashton, September 17, 2012

Sponsored by ArnoldIT.com, developer of Augmentext
