Big Law Gets Small: Trimming at LexisNexis?

July 31, 2013

A reader alerted me to “LexisNexis Laying Off 500 Employees.” The key point of the story is that 500 employees are now free to become search engine optimization experts, azure chip consultants, or life coaches. Here’s the passage I noted:

“LexisNexis continuously reviews its needs, operations and other factors to identify what resources and services are necessary to optimally support our customers and improve business operations. As a result of this ongoing process, we regularly build teams in certain areas of the business and reduce in others to be able to deliver next-generation solutions to customers. On balance, the total number of employees across the LexisNexis business remains consistent with prior years.”

On the bright side, LexisNexis had 122 job openings in June 2013. Presumably these new hires will work on Flavio Villanustre’s Big Data and HPCC Systems efforts. I know about this initiative because a LexisNexis PR agency, obviously not terminated, wrote me, asserting:

Fraud is rampant in the issuance of government subsidies and Big Data is shedding light on the losses and inefficiencies. Whether it is outrage that the OMB pays millions, almost $25M, in agriculture subsidies to farmers who have been dead for years or that the state of Florida’s DCF system distributed more than $27B in public assistance to Floridians with estimates that 3-5% of those dollars are lost to fraud.

Yep, fraud.

Rampant. And lawyers, the people who must purchase for-fee online research, are indeed available to work on these projects. Lots and lots of lawyers, including those dissatisfied with the promises of some law school recruitment professionals.

Free-spending lawyers seem to be almost as rare as literate high school graduates. For-fee legal services may face more revenue pressure in the future.

However, I like the fraud angle. LexisNexis’ competitors are doing some fancy dancing too. Thomson Reuters sold some units recently. Ebsco is pushing money from one pocket to another with reorganization and bookkeeping. At least LexisNexis is trying to go in a new direction. That said, the Big Data and fraud path is well traveled. Outfits like Altegrity and IBM are also offering fraud services.

Will the traditional professional publishing companies make their various strategies work? Employees certainly hope so. Who wants to be one of the “500” at the digital battle of Thermopylae facing lawyers who will not pay a premium for online legal information?

Stephen E Arnold, July 31, 2013

Sponsored by Xenky

PeopleSoft Offers Advice on Deploying Secure Enterprise Search

July 31, 2013

Oracle’s human resource management division, PeopleSoft, has wrapped the corporation’s Secure Enterprise Search into its PeopleTools platform. Now the PeopleSoft Technology Blog offers “A Few Tips on Deploying Secure Enterprise Search with PeopleSoft.” The helpful write-up tells us:

“Oracle’s Secure Enterprise Search is part of PeopleSoft now. It is provided as part of the PeopleTools platform as an appliance, and is used with applications starting with release 9.2. Secure Enterprise Search is a rich and powerful search product that can enhance search and navigation in PeopleSoft applications. It also provides useful features like facets and filtering that are common in consumer search engines.

“Several questions have arisen about the deployment of SES and how to administer it and insure optimum performance. People have also asked about what versions are supported on various platforms. To address the most common of these questions, we are posting this list of tips.”

In what promises to be the first in a series of informative posts, writer Matthew Haavisto offers tips on platform support and architecture. The article says a comprehensive red paper on PeopleSoft/SES administration is on its way. In the meantime, check back with the blog for more tips as they emerge.

Launched in 1987, PeopleSoft offered human resource, financial, supply chain, and customer relationship management solutions and other software. The firm counted large corporations, governments, and other organizations among its clients when Oracle snapped it up in 2005.

Cynthia Murrell, July 31, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

RaptorXML

July 31, 2013

For all you XML lovers out there, particularly those with dual-core machines, RaptorXML is here. Market Wired hosts, “Altova Announces General Availability of RaptorXML.” The product is part of Altova’s suite of server products. The press release informs us:

“Altova RaptorXML is a high-performance XML and XBRL server optimized for today’s multi-CPU, multi-core computers and servers. Developers creating solutions using Altova MissionKit XML development and XBRL development tools will be able to power server applications with RaptorXML for hyper-performance, increased throughput, and efficient memory utilization to validate and process large amounts of XML or XBRL data cost-effectively. . . .

“RaptorXML conforms to the latest versions of all relevant XML and XBRL standards and has been submitted to rigorous regression and conformance testing. The server is available in three versions.”

These versions include RaptorXML Server, RaptorXML+XBRL Server, and RaptorXML Development Edition. The last of these facilitates application testing by developers working in Altova’s XMLSpy, MapForce, and StyleVision. The products are available for Windows, 32-bit or 64-bit, and for 64-bit Mac OS. Pricing is on an annual licensing basis, determined by the number of CPU cores in a prospective customer’s server. Features include a low memory footprint, cross-platform capability, and beefed-up error reporting. See the article above (and/or this one) for more details.
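
The press release does not show RaptorXML’s invocation syntax, so here is a minimal sketch of the core task the product performs at scale, validating an XML instance against an XSD schema, using Python’s lxml rather than RaptorXML itself (the file names are hypothetical):

    from lxml import etree

    # Load the XSD schema (file names here are hypothetical).
    schema = etree.XMLSchema(etree.parse("invoice.xsd"))

    # Parse the instance document and validate it against the schema.
    doc = etree.parse("invoice.xml")
    if schema.validate(doc):
        print("valid")
    else:
        # error_log lists each violation with its line number.
        for err in schema.error_log:
            print(err.line, err.message)

A server product like RaptorXML does the same job, but parallelized across CPU cores for large document volumes, which is presumably the point of the per-core pricing.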

The developer-centered Altova focuses on data management, software development, and data integration. The company boasts that 91% of Fortune 500 companies use its products, but emphasizes that small and medium businesses are also valuable clients. Altova splits its headquarters between Beverly, Massachusetts, and Vienna, Austria.

Cynthia Murrell, July 31, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

SharePoint 2010 Disappoints on ECM

July 31, 2013

Microsoft’s SharePoint can be many things to an enterprise. It helps us manage intranets, portals, forms processing, BI, business process management, collaboration. . . . However, one of its most basic functions, content management, has underwhelmed many companies, we learn from “Does SharePoint Measure Up for Enterprise Content Management?” at Australia’s IDM. In fact, many organizations supplement their SharePoint investment with a traditional enterprise content management (ECM) platform.

The article cites a recent whitepaper from AIIM that addresses SharePoint’s shortcomings:

AIIM recently released a whitepaper that explores the topic of SharePoint adoption, titled “The SharePoint Puzzle.” In the whitepaper, AIIM discusses why organizations selected SharePoint in the first place and how it performed against expectations. AIIM describes the drivers within the report:

“The collaborative aspects of SharePoint were the strongest original driver for exactly half of our respondents, rising to 57% for the largest organizations, with 38% for the smallest. Web portal/intranet (26%) and project management (13%) were also strong drivers but of more interest is the fact that SharePoint was more often selected to be a file-share replacement than a live document/content management system.”

Some key findings include:

– 28% of respondents have SharePoint in use across their whole workforce. 70% have at least half of their staff using it once a week or more.

– Over half feel they would be 50% more productive with enhanced workflow, search, information reporting and automated document creation tools.

– Over half (54%) are using or planning to use third-party add-on products to enhance functionality. Only a third think they will stick with the vanilla product.

– Difficulty of content migration and information governance capabilities are given as the biggest shortfalls in expectations.

The article discusses anecdotal examples from a couple of organizations. The Aussie offices of law firm Herbert Smith Freehills use SharePoint 2010 for their intranet but rely on Autonomy’s Interwoven FileSite for ECM. In the public sphere, the city of Bunbury, Western Australia, was happy to replace its old data repository with SharePoint 2010. However, reports the city’s IT manager, they are disappointed by the platform’s limited search capacity.

Note that in both these examples, the SharePoint version used is 2010. Does SharePoint 2013 step up its search-functionality game?

Cynthia Murrell, July 31, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

DataStax Enterprise 3.1 Released

July 31, 2013

Open source enterprise software continues to gain steam and make headlines. The latest involves DataStax, the commercial side of the Apache Cassandra NoSQL database project. Read more about the DataStax Enterprise 3.1 release in the ZDNet article, “DataStax Enterprise 3.1: NoSQL; Yes, CQL.”

The article begins:

“DataStax, the major commercial entity behind the Apache Cassandra wide column store NoSQL database, is today announcing version 3.1 of its DataStax Enterprise distribution.  This release brings the Cassandra Query Language (“CQL”) — the SQL-like query language for Cassandra — to DataStax Enterprise.  DataStax will also supply Java and .NET drivers for the CQL interface. Other features include support for a 10-fold increase in data per node, and integration with Apache Solr 4.3, bringing 60 new search-related features.  Support for virtual nodes (“vnodes”) and new tracing features have been added as well.”
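
The article includes no sample queries, but CQL’s appeal is precisely that it reads like SQL. Here is a minimal sketch using the DataStax Python driver; note that the article mentions only Java and .NET drivers, so the Python driver, the contact point, the keyspace, and the table schema are all assumptions for illustration:

    from cassandra.cluster import Cluster

    # Connect to a Cassandra node (the contact point is hypothetical).
    cluster = Cluster(["127.0.0.1"])
    session = cluster.connect("demo")  # assumes a "demo" keyspace exists

    # CQL reads like SQL even though Cassandra is a wide column store.
    session.execute("""
        CREATE TABLE users (
            user_id uuid PRIMARY KEY,
            name text,
            email text
        )
    """)

    for row in session.execute("SELECT name, email FROM users"):
        print(row.name, row.email)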

The article mentions the integration with Apache Solr 4.3, easily the most powerful and effective open source search platform available. LucidWorks builds its open source enterprise solutions on the power of Apache Lucene/Solr, and many organizations look to LucidWorks for flexible and affordable search capability.

Emily Rae Aldridge, July 31, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Rich Media: Too Expensive to Store?

July 30, 2013

I saw an interesting post called “Cost of Storing All Human Audiovisual Experiences.” I am no logician, but if one stores “all”, then isn’t the cost infinite? The person writing the post presents data that pegs the cost for seven billion people at about $1 trillion a year.
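
That figure is easier to judge per person; here is a quick back-of-envelope check using only the post’s two numbers:

    # Back-of-envelope check on the post's estimate.
    total_cost_per_year = 1e12  # about $1 trillion per year, per the post
    population = 7e9            # seven billion people, per the post

    per_person = total_cost_per_year / population
    print(f"${per_person:,.0f} per person per year")  # roughly $143

Roughly $143 per person per year, a less outlandish number than the trillion-dollar total suggests.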

Several observations:

  1. With the emergence of smart nanodevices with audio and video capabilities, perhaps the estimate is off the mark?
  2. Once the data are captured, who manages the content? Likely candidates include nation states, companies that operate as nation states, and venture-funded start-ups.
  3. How does one find a particular time segment germane to, say, a patent claim?

Interesting questions, if one sets aside the “all”. The next time I look for a video on YouTube or Vimeo, I will ask myself, “What type of search system is needed to deal with even larger volumes of rich media?”

Is a new Dark Age of information access fast approaching? Yikes! Has the era already arrived?

Stephen E Arnold, July 30, 2013

Sponsored by Xenky

Natural Language Interface Will Boost Mobile Enterprise Search

July 30, 2013

Enterprise organizations are increasingly loosening the leash on mobile devices, and this is putting new emphasis on cross-device search. CMS Wire ran an article on the subject called “Cross-Device Search: The Next Step in Mobile Search Delivery.” The author discusses the known issues with mobile search in the consumer sector and points to natural search interfaces as one of the building blocks of mobile search.

Both collaborative search and cross-device search stand out as technologies that many companies will need to rely on more and more frequently.

The article does a good job summing up where mobile search delivery will begin:

In 2011 Greg Nudelman wrote Designing Search — UX Strategies for eCommerce Success which had a strong mobile focus and there is an excellent chapter on mobile search in the recent book on Designing the Search Experience by Tony Russell-Rose and Tyler Tate. There is a consensus that natural search interfaces will be an important feature of mobile search design.

Natural search interfaces, also called natural language interfaces, are a vital piece of technology delivered by innovative companies like Expert System. One of its solutions, Cogito Answers, uses a natural language interface to understand users’ intent and deliver information and answers quickly and accurately with a single click.
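
For a concrete, deliberately tiny illustration of the idea, here is a sketch of keyword-based intent mapping in Python. This is not Expert System’s Cogito technology, which applies far deeper linguistic analysis; the intents, keywords, and answers are invented:

    import re

    # Toy intent table: keywords that signal an intent, plus a canned answer.
    INTENTS = {
        "hours": (("open", "hours", "close"), "We are open 9am-5pm, Monday-Friday."),
        "returns": (("return", "refund", "exchange"), "Returns are accepted within 30 days."),
    }

    def answer(question):
        # Tokenize the question, then pick the intent with the most keyword hits.
        words = set(re.findall(r"[a-z]+", question.lower()))
        best_reply, best_hits = None, 0
        for intent, (keywords, reply) in INTENTS.items():
            hits = sum(1 for k in keywords if k in words)
            if hits > best_hits:
                best_reply, best_hits = reply, hits
        return best_reply or "Sorry, I did not understand the question."

    print(answer("What time do you open?"))  # matches the "hours" intent

A real natural language interface replaces the keyword table with genuine semantic analysis; the flow from question to intent to single-click answer is the part this sketch shows.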

Megan Feil, July 30, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Dangerous Glitches Still to be Worked Out in Electronic Medical Records

July 30, 2013

Electronic medical records are part of the constantly evolving big-data landscape, and there have been some issues, according to Bloomberg‘s “Digital Health Records’ Risks Emerge as Deaths Blamed on Systems.” Not surprisingly, the period just after a new EMR system is implemented has been found to be the most dangerous time. See the article for examples of harm caused by errors. Journalist Jordan Robertson writes:

“Electronic health records are supposed to improve medical care by providing physicians quick and easy access to a patient’s history, prescriptions, lab results and other vital data. While the new computerized systems have decreased some kinds of errors, such as those caused by doctors’ illegible prescriptions, the shift away from paper has also created new problems, with sometimes dire consequences.”

Perhaps it would help if docs could access necessary data. Yet, even now, medical practices have trouble prying clinical data from their EMR systems. Apparently, a lack of data integration is the culprit, according to “Why do Docs Struggle with Population Health Data?” at Government HealthIT. That article summarizes:

“Today, with modern EHR systems, clinicians may have an easier time getting clinical data — but not all of it, which is a problem for providers pursuing population health goals. It’s also a problem as federal health officials and patient-safety organizations like the National Quality Forum try to transition from process-based quality measurements. . . to outcomes-based metrics.”

Are such digital-data challenges confined to health records alone? Unlikely, though in few (if any) other fields is big data so consistently a life-and-death issue. We must remember that digital health records have also been shown to improve outcomes, but are we netting more good than bad? I suspect so, but we will probably never know for certain. One thing is sure: there’s no turning back now. Surely, mistakes will decline as systems are refined and staff become acclimated. I know that is cold comfort to anyone who has suffered such a preventable loss.

Cynthia Murrell, July 30, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

Healthcare Analytics Always Changing

July 30, 2013

The medical field is always evolving with new advances. The same can be said of medical technology, especially mobile data analytics. Today’s hot trend becomes a relic faster than in almost any other field, so we try hard to keep tabs; witness an illuminating article in CMS Wire, “Temis Acquires i3 Analytics to Boost Text + Data Mining.”

According to the story:

“While we don’t know how much Temis paid out in this deal, we know doctor’s love iPads. This tells us pretty much all we need to know about this deal. i3 Analytics specializes in what it calls biopharma, what most of us know as pharmaceutical research or biotechnology.”

Advances in biotech and biopharma mean more data for doctors and drug companies to rummage through, something a company like i3 Analytics is more than happy to help them with.

This is an interesting story about healthcare analytics. Frankly, nothing surprises us anymore. Heck, we recently heard that Kansas City is the new boomtown for healthcare analytics. If things like this are possible, there’s no way this dynamic industry will stop changing anytime soon.

Patrick Roland, July 30, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Nanotechnology and Search Are the Ignored Parts of the Computing Future

July 30, 2013

The future of computing is here! Unfortunately, it looks a lot like the past twenty years. Everywhere we see people talking about innovation, they seem to be missing some key instruments that will likely shape our next computing decade. Such is the case with a recent Fred Wu article, “The Future of Computer Programmers – An Interview with Yukihiro ‘Matz’ Matsumoto.”

According to Matsumoto:

“I believe in the foreseeable future the computing industry is still going to advance based on Moore’s law. Although, it is possible that in the next year or two quantum computers become a practical reality, in that case it will change everything! *chuckles* On a serious note, according to Moore’s law, the cost of computing will decrease and the performance and capacity of computing will increase – this basic principle is unlikely to change.“
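
For a rough sense of what that “basic principle” implies, assume the classic two-year doubling of transistor counts (the interview itself quotes no doubling period):

    # Rough Moore's law arithmetic; the two-year doubling period is the
    # classic assumption, not a figure from the interview.
    years = 10
    factor = 2 ** (years / 2)
    print(f"Over {years} years: {factor:.0f}x the transistors at similar cost")  # 32x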

Sorry, but cheaper computing isn’t a revelation. Nobody ever seems to focus on how nanotechnology and search will undoubtedly reshuffle the deck. Probably because A) it’s hard to determine just how radical a shift we will see, and B) both teeter on privacy issues that have been so thorny. We can only hope journalists stop burying their heads in the sand about the real future.

Patrick Roland, July 30, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search
