April 11, 2013
Sinequa is one of the leaders in enterprise search and unified information access, including the emerging big data market. The firm, based in Paris, provides large enterprises and administrations with the means to tame the complexity of their structured and unstructured data and to extract value from large volumes of very heterogeneous data.
Eurocopter was looking for a solution that could meet all its different professional needs in the context of accessing relevant information, rather than creating a specific solution for each profession.
After a proof of concept, Sinequa won the contract, competing against a number of big players in the search market. The Sinequa system replaced Eurocopter’s existing solution, provided by a vendor recently acquired by a large conglomerate.
Eurocopter embraced Sinequa’s “grid architecture” because the approach provides effective scaling. Eurocopter has implemented a five-node Sinequa Grid distributed across the sites at Marignane (France), Donauwörth (Germany), and La Courneuve (France). This architecture can easily be extended to subsidiaries in America and Asia.
At this time, two business solutions are in operation. The first is access to technical data for a group of about 800 technical experts. The second provides access to the information on the Eurocopter Intranet. The system supports approximately 15,000 employees of the group working from locations throughout the world.
The Eurocopter professionals working in technical support require relevant information drawn from technical data and documentation held in systems such as Filenet, as well as from operating systems’ file systems and emails. The unified information access offered by the Sinequa platform lets these Eurocopter professionals assemble the relevant information pertaining to a client case in one structured space. The content in the Sinequa “space” is easily navigated and accessed. In addition, the system provides access to an image bank of helicopters that covers four languages: French, German, Spanish, and English. The unified access to data on the Intranet is simpler and offers a new navigation based on search.
Sinequa’s linguistic capabilities help analyze users’ requests as well as the contents of documents. Sinequa’s linguistic methods optimize the relevance of information delivered and, thus, reduce search time to a minimum. Filters and a taxonomy specific to Eurocopter’s activity are used to facilitate the extraction of technical terms from content processed by the Sinequa system.
Thanks to a high-performance generalized search, each and every employee can now find, in real time, the specific information they need for their work (images, rules and regulations, agreements, procedures, reports, and forms).
In coming months, Eurocopter plans to extend the usage of Sinequa’s unified information access to other business applications, including the indexing of further applications such as product lifecycle management and customer relationship management.
Stephen E Arnold, April 11, 2013
Sponsored by HighGain
April 9, 2013
The Text Radar big data analytics and content intelligence blog continually provides readers with informative resources on how big data is impacting modern workplaces. This week, I will highlight several articles that were particularly informative.
We all know the impact that big data has on marketers. But what about other industries? “Big Data Analytics Reveals Vision Giving Major Disaster Responders Advance Notice” provides an example of how big data is helping monitor the condition of American bridges.
The article lays out a frightening scenario:
“The American Society of Civil Engineers says that one quarter of all American bridges is ‘deficient’. 17,000 bridges didn’t meet inspection criteria, including 3% of all freeway bridges.
Want a scary statistic? The average age of America’s bridges is 43 years. The average lifespan of America’s bridges: 50 years. This means, unless something changes, we should all avoid pretty much all river crossings after the year 2020.”
Another story, “Growing Big Data and Information Access Bring New IT Challenges” explains how big data is transforming the new world of computing.
When explaining some new challenges, the article states:
“The big change now is not that everyone is an I.T. manager – there are still plenty of ways companies will control devices, access to computers, and data – but that everyone is a consumer of a lot of data. Making that easy on them will most likely be a winning strategy.
‘There has been a revolution in design theory,’ says Phil Libin, chief executive of Evernote, a storage site for consumers and businesses. ‘We’ve all had to learn how to have taste.’ He credits the change toward a design focus, in both consumer electronics and enterprise software, to Apple.”
Another innovative way that big data is being utilized is in major league baseball. According to “MLB Uses Big Data for Uncovering Player Insight”, this data allows the performance of players to be predicted.
The article explains:
“‘We’re trying to predict the future performance of human beings, oftentimes in situations that those people themselves haven’t even encountered,’ he said. ‘One of the things we really need to do is [separate] the skill from the luck.’

DePodesta cited ‘The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing,’ a book by Michael Mauboussin of Credit Suisse, in support of the idea that ‘skill is more repeatable than the luck.’”
This is just a small sampling of the creative ways that big data can be utilized across industries. Smartlogic offers a suite of solutions that can help any organization transition into analytics.
Jasmine Ashton, April 09, 2013
April 3, 2013
I have been less and less enchanted with the Web as a mechanism for years. Good luck with some of the cloud computing services. Good luck with some of the hosted big data processing services. Good luck with hosted search.
Latency is often my companion.
I read “Our Regressive Web” and learned:
An entrepreneur friend of mine remarked to me recently that if someone invented the nightly news today—or a show like Brian Williams’ “Rock Center”—we’d all think it was a great idea. Think about it: Instead of having to follow all these different news sources, you could just tune in, get a digest of all the important stuff that happened, and you could trust that it had been verified, that it was balanced and high-quality, and would all be well-produced. It struck me, as I tried to wrap my head around the demise of Google Reader and Google’s inexplicable de-emphasis of Google Alerts, how both those ideas—Reader and Alerts—fit the same criteria.
The end of a push service is no big deal. I survived the death of PointCast and Backweb. I even worked through the shift from Desktop Data’s push to an alert-type service which was free and less hostile to my inbox.
But finding information is getting harder in my opinion. This passage resonated with me:
Yet here we sit, both of those awesome services essentially shuttered in the last year, primed for the scrap heap of Internet history. Then there’s Delicious—a similar idea that allows you to organize your links by categories and see what other people thought were valuable—which has not been shut down, per se, just slowly maimed beyond recognition, its loyal users driven away. And for what reason? Nothing better has risen up to replace them. The underlying needs of a fairly large user base (that these services meet) still exist. We’re just regressing. It’s the one thing I find most disheartening and perhaps most frustrating about this trend. It’s something that needs to be heard, particularly by the people who wrote off these services as Web 1.0 or Web 2.0 relics—the type who said, “Well, nobody used RSS, so good riddance.” The collapse of these services, to me, represents an alarming reduction of key services designed to improve online information from the user’s perspective.
Regressing? I don’t think the word is strong enough. The Web for many applications is almost unusable. Disagree? Use the comments section of the blog and feel free to insert links to unusable search and content processing services. Azure chip consultants need not participate.
Stephen E Arnold, April 3, 2013
Sponsored by Augmentext
April 2, 2013
This week, the Text Radar big data and advanced intelligence blog covered a variety of stories that were pertinent to the realm of big data and advanced intelligence systems.
One of the advantages of big data analytics technology is that it allows marketers to take a more targeted advertising approach to their customers. “Advertising Gets More Personalized and Customized with Big Data” explains how technology and analytics are providing more personalized and customized ads.
The article states:
“Checking out one’s Facebook page provides lots of information about a person in such ways as their likes and where they travel, etc. And, by customers registering with a company site, codes can be placed in a customer’s computer to follow other sites that person visits, and when. In addition, companies are targeting prospective customers with ads that are meaningful and more targeted and will pay-off in the end. The internet and metrics on search engines have changed the way ad agencies are doing business. Companies can now learn from ‘clicks’ how to advertise and valuable details that lead to more targeted successful ads.”
Microtargeting can have a similar impact, according to “Microtargeting the Way of the Future of Business.” The article explains the impact of the technical and political masterminds behind the 2012 Obama/Biden presidential campaign.
Text Radar writer Alice Wilson comments:
“Team Obama changed the way political campaigns will compete in the future. And, you can be sure microtargeting tools with accompanying skills will be in the mix. This same method will be incorporated in all levels of business plans as well.”
The final article that I would like to highlight explains the impact that big data is having on health care. “Crunching Medical Big Data Helps to Find Correct Therapy” provides a story about a baby that was diagnosed with type 1 diabetes but did not respond well to the typical regimen of treatment.
The takeaway is this:
“We’ll discover a lot about ourselves and our diseases from big data — assessing the outcomes of different therapies and finding out in retrospect what works best for who. We will then match that against our gene sequences, which may be stored confidentially at birth. If Cameron Lundfelt had been born a few years later, his parents and doctors would perhaps have known before his symptoms had even appeared that he had monogenic diabetes type KCNJ11. And they would have known immediately what to do.”
It does not matter what industry your company falls into. Big data analytics solutions are going to benefit you no matter what. Smartlogic’s Semaphore Content Intelligence Platform has been recognized as an industry leader, and it is useful for helping companies make smarter business decisions.
Jasmine Ashton, April 2, 2013
March 26, 2013
This week, the Text Radar news service covered a variety of topics that are pertinent to big data’s takeover of our advanced intelligence systems.
One article, “Company Challenges with Maximizing Big Data Usage,” explores some common reasons why companies are not making the most of their big data analytics tools.
The article states:
“When asked to report the percentage of projects in which their companies use marketing analytics that are available and/or requested, CMOs report a dismal 30% usage rate. This number has decreased from 37% a year ago. So while companies are spending more on Big Data, less of it is being used.”
Another interesting story, “True Potential Reached With Data Analytics and Help from Industry Experts,” explains how a large number of companies are gaining a competitive edge through data analytics. A survey of 2,500 business executives found that data is believed to be a fundamental asset to their marketing efforts.
The article states:
“This is a significant finding, in that power shifts can be disruptive. They often call into question experience and intuition that managers and employees have built up over years. Now, those who know how to marshal the data and put analytics behind their decision making are in a position of advantage.”
The third story that I would like to highlight is “Big Bucks Involved with Big Data Lobbying Efforts in Washington.” It discusses the amount of money that goes into social media lobbying. The article states:
“According to Ad Age’s analysis of U.S. lobbying disclosure reports, Facebook, whose efforts are heavily focused on data privacy and security, multiplied its spending 2.5 times in 2012 on outside lobbying firms and on in-house efforts. The company dropped nearly $4.6 million on lobbying last year — $4 million of which went toward its in-house staff’s lobbying — up from $1.8 million in 2011, the reports show. In 2012, the company tacked on an additional three outside firms to its data-related lobbying roster, using a total of seven in 2012 that dealt with data issues.”
All of these articles examine the impact that big data, and the technology used to harness it, has on our society. While there are many solutions to choose from, businesses should engage with tools that are built by industry experts like Smartlogic.
Jasmine Ashton, March 26, 2013
March 21, 2013
Language and analytics are starting a new trend by coming together. According to the DestinationCRM.com article “New SDL Machine Translation Tool Integrates with Text Analytics,” SDL has announced that its machine translation tool can now be integrated to work with text analytics solutions. SDL BeGlobal can translate both structured and unstructured information across more than 80 different language combinations. The information is then analyzed using text analytics solutions. This gives users the ability to access global customer insights as well as important business trends. Jean-Francois Damais, Deputy Managing Director of loyalty global client solutions at Ipsos, had the following to say regarding SDL BeGlobal:
“With the growth in global business and the accessibility of online information, we now have a much greater need to access and analyze data from multiple languages. As a company focused on innovation and dedicated to our clients’ successes, we deployed SDL BeGlobal machine translation to further improve our research insights and bring new value to our customers.”
SDL BeGlobal has already caught on with several companies in the text analytics industry, and several well-known companies have jumped on the bandwagon. Raytheon BBN Technologies currently uses the technology for broadcast and Web content monitoring, and Expert Systems uses it for semantic intelligence. Language and analytics are two things that are not normally thought of together, but it seems SDL BeGlobal has a good thing going. Only time will tell if the new friendship between language and analytics will stand the test of time.
April Holmes, March 21, 2013
March 19, 2013
This week, the Text Radar advanced intelligence blog covered some interesting articles on the subject of big data and content intelligence related problems and solutions.
“Cyber Attacks on the Rise Related to Big Data Activities Prompting Intelligent Security Strategy” explains the threat that cyber attacks pose to industries that use big data.
The article states:
“‘We need to begin preparing for the likelihood that with the move to IPv6 that will enable billions of devices to be connected, we will see more automated attacks that are destructive.’ Coviello said.
For this reason, he said, businesses need to move to intelligence-based security systems that will detect and respond to emerging attacks more quickly.
‘There is no shame in being breached. The shame is in not evolving security infrastructures to detect and respond to new types of attack.’”
Another article, “Manufacturers Must Use Data Analysis for Smart Predictions,” explains new opportunities that are emerging in this industry to create a more nuanced approach to low-cost labor challenges.
The post says:
“It is all about Big Data and manufacturers must use data analysis to determine crucial factors so that smart predictions can be made to offset the unknown as with struggling and/or growing economies and how it affects price fluctuations, availability of materials, etc. Having this insight will enable decisive actionable responses to variables so that concise on-the-money decisions can be made.”
The final article that I would like to highlight is “New York City Uses Big Data to Improve Building Inspections.” It explains how cities and governments are increasingly using big data to streamline their processes.
The author explained how the inspection team dealt with the challenge before them:
“Flowers and his team embraced the messiness and developed a system that identified buildings by using a small area in the front of the property based on Cartesian coordinates and then drew in geo-loco data from the other agencies’ databases. While the system was inherently inexact, the vast amounts of data available compensated for slight imperfections. The team continued to improve upon the system to revolutionize the city’s building inspections and added much efficiency to the process.”
Data analytics and content management software like the Semaphore Content Intelligence Platform from Smartlogic can take the headache out of trying to organize data that has been fragmented throughout different company departments.
Jasmine Ashton, March 19, 2013
March 19, 2013
We are constantly on the lookout for movers and shakers in the area of text analysis and sentiment analysis. So, I was intrigued when I recently came across the Web site of Semantria, a company claiming its software makes text and sentiment analysis fast and easy. With claims of simplified costs and high-value insight capture, I had to research further.
The company was founded in 2011 as a software-as-a-service and services company specializing in cloud-based text and sentiment analysis. The team boasts a foundation from text analytics provider Lexalytics, software development firm Postindustria, and demand generation consultancy DemandGen.
The company page shares about how its software can give insight into unstructured content:
“Semantria’s API helps organizations to extract meaning from large amounts of unstructured text. The value of the content can only be accessed if you see the trends and sentiments that are hidden within. Add sophisticated text analytics and sentiment analysis to your application: turn your unstructured content into actionable data.”
The Semantria API is powered by the Lexalytics Salience 5 analytics engine and is fully REST compliant. A processing demo is available at https://semantria.com/demo. We think it is well worth a look.
Andrea Hayden, March 19, 2013
March 12, 2013
This week, the Text Radar data analytics blog shared some innovative ways that companies are using big data to solve some of industries’ most difficult challenges.
The first article that I will highlight, “Using Analytics to Improve Cities and Governments,” shares how the Smarter Cities Technology Centre in Dublin is working to improve over 2,000 cities around the world.
The article states:
“Each project develops practical solutions to specific urban management areas, such as traffic management. One that Dublin hosts is the development of the IBM Intelligent Operations Centre for Cities. No one envisages a total solution but the vision is that incrementally and over time, the development of smart systems for key areas will enable a city to integrate more and more of its operations into an overall ‘smart city’.”
Because so many companies across different industries are gaining valuable insights from big data, many major corporations are beginning to invest in big data solutions. According to “IBM Makes More Investments in Big Data and Mobile,” big data, business analytics, social business, and mobile represent a new era and much-sought-after ecosystems that IBM is pursuing aggressively.
The article summarizes:
“IBM continues to transform itself by going after higher value opportunities and Big Blue is moving into these new spaces with its ecosystem of business partners in tow.
At the IBM PartnerWorld Leadership Conference 2013 in Las Vegas, Bruno Di Leo, senior vice president of sales and distribution at IBM, said the company is looking at three primary imperatives: to lead in the new era of computing, to reach new kinds of clients and to demonstrate new types of expertise.”
Another article, “Netflix Uses Big Data to Deliver an Original Series Success,” explains the movie giant’s tactic of releasing all of the episodes of its original series, House of Cards, at once.
The article explains Netflix’s strategy:
“Big bets are now being informed by Big Data, and no one knows more about audiences than Netflix. A third of the downloads on the Internet during peak periods on any given day are devoted to streamed movies from the service, according to Sandvine, a networking provider. And last year, by some estimates, more people watched movies streamed online than on physical DVDs.”
As you can see, big data has a variety of interesting and innovative uses. Those companies looking to spearhead their own big data initiative should seek out a third party solution like Smartlogic’s Semaphore Content Intelligence Platform.
Jasmine Ashton, March 12, 2013
March 7, 2013
Oracle’s support of locally partitioned indexes has created a need for users to be able to split those indexes and rebuild them in a timely manner. How do you rebuild an index without making your application unavailable for the entire time?
Prsync’s look into the maintenance disadvantages and subsequent problem solving by Oracle in “Partition Maintenance and Oracle Text Indexes” gives us a look at something new: the “Without Validation” and “Split Partition” features. These options offer a way to rebuild indexes without first checking each row line by line.
“That solves the problem, but it’s rather heavy-handed. So instead we need to institute some kind of “change management”. There are doubtless several ways to achieve this, but I’ve done it by creating triggers which monitor any updates or inserts on the base table, and copy them to a temporary “staging” table. These transactions can then be copied back to the main table after the partition split or merge is complete, and the index sync’d in the normal way.”
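The trigger-and-staging approach the quote describes can be sketched in Oracle SQL. This is a minimal illustration, not code from the original post; the table, trigger, and index names (docs, docs_capture, docs_text_idx) are hypothetical, and a real implementation would also handle deletes and primary-key conflicts when replaying updates:

```sql
-- Hypothetical base table DOCS, range-partitioned by CREATED,
-- with an Oracle Text index DOCS_TEXT_IDX on the TEXT column.

-- 1. Empty staging table with the same shape as the base table.
CREATE TABLE docs_staging AS SELECT * FROM docs WHERE 1 = 0;

-- 2. Trigger that captures inserts and updates made while the
--    partition maintenance is running.
CREATE OR REPLACE TRIGGER docs_capture
  AFTER INSERT OR UPDATE ON docs
  FOR EACH ROW
BEGIN
  INSERT INTO docs_staging (id, created, text)
  VALUES (:NEW.id, :NEW.created, :NEW.text);
END;
/

-- 3. Perform the partition maintenance, e.g. splitting one
--    partition into two at a date boundary.
ALTER TABLE docs SPLIT PARTITION p_2012 AT (DATE '2012-07-01')
  INTO (PARTITION p_2012_h1, PARTITION p_2012_h2);

-- WITHOUT VALIDATION applies to partition exchange: it skips the
-- per-row check that every row belongs in the target partition.
-- ALTER TABLE docs EXCHANGE PARTITION p_2012_h2 WITH TABLE docs_new
--   WITHOUT VALIDATION;

-- 4. Replay the staged changes and sync the Text index.
INSERT INTO docs SELECT * FROM docs_staging;
EXEC CTX_DDL.SYNC_INDEX('docs_text_idx');
```

The key trade-off is in step 3: skipping validation is what keeps the maintenance window short, which is exactly why the extra care discussed below is needed.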
So now there is a solution. But by avoiding the need for the system to check every partition key value to make sure each row is going to the correct partition, extra care is required when using the Without Validation feature.
It’s a long-needed saving grace that will save time, and ultimately money, by getting apps back up and running more efficiently, but there is no substitute for attention to detail. For a more in-depth look at the process, we suggest heading over to Prsync.
Leslie Radcliff, March 07, 2013