November 25, 2013
Who cares about news releases? Apparently quite a few folks do. I read “Swatting at a Swarm of Public Relations Spam.” I thought the write up was interesting, but it seemed short on facts. Here’s the key passage in my opinion:
I liked this part. Also interesting was this passage:
But this one step seemed insufficient. P.R. spam is fed by companies that hire P.R. companies that pay database companies like Vocus, or their handful of competitors. So if you want to focus on root causes, you must ask: Why would any company spend money to blanket reporters with email they didn’t ask for and almost surely don’t want?
We have tested one of the Vocus systems and discovered some interesting factoids. Keep in mind that your mileage may vary:
ITEM. I did a story for Citizentekk.com based on my research for an uptown investment bank. We submitted a short news release to PR Web, a Vocus “property.” The publicity professional I use reported that PR Web told her that I was not a recognized authority to PR Web. Furthermore, the information about Google’s investments in synthetic biology was not known, so the news release would not be distributed. I found this interesting because the investment bank that commissioned the initial research published a report, and the Citizentekk story generated some buzz and follow-on commentary.
Is PR spam a food? Image source: http://goo.gl/kSKJEZ
ITEM. One of the editors for the Search Wizards Speak series of interviews tracked down the co-founder of Silobreaker. This is an intelligence-oriented online system that has a very strong following among the police and intelligence services in the European Community. We were told that my publicity person had to verify who she was and then provide two phone numbers for me and a valid email address. This was after PR Web had my Visa card and the short news release highlighting two key points in the interview.
ITEM. Vocus pays its president $5 million per year. (Source: Hoover’s Company Records.) At the same time, the October 23, 2013 quarterly financial results reported declining revenue ($45.217 million against $46.615 million a year earlier). The net loss was $3.85 million against a net loss of $3.851 million a year earlier. (You will need a subscription to Reportlinker to view other details, or you can dig out the numbers at http://goo.gl/VeAH6g.)
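For the curious, the year-over-year changes implied by these figures can be checked with a quick sketch. The inputs are the numbers reported above; the percentage is my own arithmetic, not a figure from the filing:

```python
# Year-over-year comparison of the Vocus figures cited above
# (revenue and net loss in millions of USD, from the quarterly results).
revenue_2013, revenue_2012 = 45.217, 46.615
net_loss_2013, net_loss_2012 = 3.850, 3.851

revenue_decline_pct = (revenue_2012 - revenue_2013) / revenue_2012 * 100
print(f"Revenue decline: {revenue_decline_pct:.1f}%")  # about 3.0%

# The net loss is essentially flat year over year.
print(f"Net loss change: {net_loss_2013 - net_loss_2012:+.3f} million")
```

In other words, a roughly three percent revenue slide with no improvement in the bottom line.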
ITEM. Vocus is involved in a legal matter with an outfit called BWP Media USA, doing business as Pacific Coast News. I am no attorney, so I cannot say whether the matter has merit. The dispute seems to involve copyright violations. (Source: US District Court, Maryland, Case 8:13-cv-03322-RWT.) I would reproduce the image attached to the legal document I saw, but I found it unsettling.
November 23, 2013
I read “Tension and Flaws Before Health Website Crash.” The good news is that the story focuses on what is now old news: Management challenges at the agency responsible for Healthcare.gov. The bad news—at least for champions of XML repositories, XML normalization, and XML as the “answer” to a wide range of information management woes—is that XML (extensible markup language) is not the slam dunk, whiz bang solution some true believers hope.
Here’s the passage that caught my attention:
Another sore point was the Medicare agency’s decision to use database software, from a company called MarkLogic, that managed the data differently from systems by companies like IBM, Microsoft and Oracle. CGI officials argued that it would slow work because it was too unfamiliar. Government officials disagreed, and its configuration remains a serious problem.
MarkLogic has not been identified as a vendor creating some headaches until now. MarkLogic has a system that can store information and data in an XML data management system. The trick is that content not in XML must be normalized; that is, converted to XML. MarkLogic has developed some proprietary methods to perform its data management operations. A person familiar with XML may not be conversant with the MarkLogic conventions. The upside of this approach is that MarkLogic has experts who are able to address most customer requests. The downside is that a person familiar with XML but not MarkLogic can introduce some problems into an otherwise spiffy system.
In the last few years, MarkLogic has had a number of senior management changes. I track the company via my Overflight system and have noted that the firm has gone from a company that does a good job of publicizing itself to an outfit that has trimmed back on its public presence. You can check out the MarkLogic Overflight on the ArnoldIT.com Web site. The minimal news flow, the absence of tweets, and the termination of public blog content can be verified by visiting the page every few days.
One interesting aspect of MarkLogic is that the company has positioned itself as a publishing platform. Once content is in the repository, it is possible to slice and dice information and data. Publishers can use this feature to whip out books with little or no involvement of human editors. But the company has, like Verity, grafted on other features and services. These range from enterprise search to text mining to electronic mail management.
I heard a few years ago that the company was to become a $200 million to $300 million a year operation. The firm may be the best kept secret in terms of its revenues and profits. If so, kudos. But if the company has not been able to demonstrate strong growth and healthy net profits, the firm may need to ramp up its publicity and marketing activities.
The New York Times’s comment may be hogwash. Even if the characterization is a stretch, a paragraph that strikes me as less than favorable raises some questions; for example:
- Are proprietary extensions a good idea for an XML system that must be used by folks who are not into XML?
- Will the transformations between and among content from disparate systems remain bottlenecks during periods of high content flow and usage?
- Will Oracle seize on the MarkLogic system and revive its flow of information about the weaknesses of XML as compared with content stored in an Oracle data management system?
MarkLogic has rolled through three or four presidents in the last few years. Dave Kellogg departed, and I mostly lost track of who followed him. At the time of his departure, MarkLogic was at an estimated $60 million in revenues. Will the management turmoil kick in again? Will the company continue to expand its features and functions as Verity did prior to its initial public offering? Are there parallels between the trajectories of Convera, Delphes, Entopia, and Verity and that of MarkLogic? For some case analyses, check out www.xenky.com/vendor-profiles.
Stephen E Arnold, November 23, 2013
November 18, 2013
I read “Bloomberg Newsroom to Lay Off 50” in the Wall Street Journal on November 18, 2013. (Note that this link may go dead or require that you pay to access the story.) The story appeared on page B-4 of the old-fashioned copy that finds its way to me, usually.
The story contained three fascinating factoids. The first was the reduction in force (RIF) at Bloomberg. Then there was information about the Thomson Reuters RIF of 3,000 jobs, and finally the “minimal decline” at News Corp., the outfit associated with the alleged phone-hacking approach to news gathering.
In short, if the assertion that the economy is rebounding is accurate, why are these outfits admitting that employees have to go?
The question I have is, “Is staff reduction the easiest way to cut costs?” My assumption was that new products and new services would attract paying customers. With the warm snuggles of innovation, sophisticated, able companies would prosper.
Firing those who work at information-centric companies provides a partial answer to the question, “So you want to be a publisher?” The individuals who have to find a new job may start creating content.
That’s a plus for those who have a thirst for information. Manufacturing more competitors for organizations like Bloomberg, Thomson Reuters, and News Corp. may have unintended consequences.
I hope the new publishers are able to get their content indexed by an outfit like Blekko/Yandex.
Stephen E Arnold, November 18, 2013
November 15, 2013
“News Use across Social Media Platforms” confirms what I have suspected for a while. The mobile generation has some interesting behavior patterns with regard to news. Among the factoids that the Pew outfit has boiled down to numbers are:
ITEM: YouTube viewers are not using the service to get news. Maybe that explains why experiments with Thomson Reuters proved to be somewhat disappointing.
ITEM: Google Plus is less popular than Reddit, Twitter, and Facebook as a source of news. The push back about Google Plus as a prerequisite for YouTube comments may have more importance than some realize.
ITEM: Facebook is an important source of news. As the demographics of Facebook shift, the importance of news may suggest that Facebook has morphed into a more mature service.
The most interesting “fact” in the report is the apparent importance of Reddit, a service that points to public posts on a range of issues. The Reddit service offers a search function, but I find it consistently disappointing. In fact, most of the unusual collections of links and comments are essentially unfindable.
Another interesting facet of the report is the inclusion of some trendy graphics. The diagram below is my favorite.
In my opinion, the good news in the report appears in this passage:
Social media news consumers still get news from a variety of other sources and, in some cases, even more so than the general public does. YouTube, LinkedIn and Google Plus news consumers are more likely than Facebook and Twitter news consumers to watch cable news. Twitter news consumers are among the least likely to turn to local and cable TV. And nearly four-in-ten LinkedIn news consumers listen to news on the radio, compared to about a quarter of the general population.
For now, the digital services cannot celebrate total victory. Publishers of traditional news media live to fight another day. For now.
Stephen E Arnold, November 15, 2013
October 30, 2013
I spoke with a former publishing executive last week about what he called “easy cost cutting.” Publishers like Thomson Reuters, Wolters Kluwer, and Pearson have been tightening their WalMart belts for some time. Chasing down expensive off-site meetings and taking close looks at senior executives’ authorized expenditures is good business management.
Publishing companies have been struggling to get back to the good old days of William Randolph Hearst. But many factors are making life tough: the cost of paper, pesky worker demands, the skyrocketing cost of buying advanced systems and then paying to get those systems to streamline old-fashioned work processes, upstarts like former middle school teachers who start a blog and give away the content for free, Amazon and its silly “anybody can publish” approach to content, environmental costs associated with ink and the disposal of unsold printed matter, and the lousy outlook for law, accounting, library, and other high-value materials.
Well, the story “Thomson Reuters to Cut 3,000 Jobs in Second Layoff Round This Year” suggests life is getting pretty tough. Now that the easy cuts are gone, heads have to go away. Thomson Reuters is interesting because it has had some senior management turnover in the past two years. The article pointed out:
Third-quarter net income attributable to common shareholders fell 39 per cent to $271 million, or 33 cents a share, from $441 million, or 53 cents, a year earlier. Operating profit, which excludes one-time items and businesses that have closed, declined 15 per cent to $316 million in the period.
Thomson Reuters is revenue and profit oriented. So, the decline is worrisome.
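As a sanity check, the percentages quoted from the article line up with the raw figures. Here is a minimal sketch; the implied year-earlier operating profit is my own back-calculation, not a figure from the article:

```python
# Thomson Reuters third-quarter figures quoted above (millions of USD).
net_income_2013, net_income_2012 = 271, 441
op_profit_2013 = 316

# Net income decline, matching the article's "39 per cent".
net_income_decline = (net_income_2012 - net_income_2013) / net_income_2012 * 100
print(f"Net income decline: {net_income_decline:.0f}%")  # 39%

# Operating profit fell 15 per cent; back out the implied year-earlier figure.
op_profit_2012 = op_profit_2013 / (1 - 0.15)
print(f"Implied year-earlier operating profit: ${op_profit_2012:.0f} million")
```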
However, set aside the serious problems at Thomson Reuters. Let’s ask a larger question, “What about Thomson Reuters as a flagship outfit?”
My view is that Thomson Reuters, particularly when headed by Michael Brown, was a pretty well managed outfit. Now the company seems to be signaling that the ship is listing. Thomson Reuters is not lying on the bottom of the Mediterranean Sea, but it has to make changes that return it to its former posture. Growth requires more than acquisitions in Argentina. Leadership requires more than a new crew in the pilot house.
Many publishing companies are in a similarly precarious position. The private companies do not have to report their financial woes, but the woes are evident if one pokes into specific markets; for example, the library sector. Libraries are not rolling in cash. The companies dependent on libraries for revenue are going to have to shop at the WalMart belt display too. Newspaper publishers are interesting. Perhaps Jeff Bezos knows how to make the Washington Post the Miley Cyrus of the dailies? Book publishers are trying to figure out what to do with the 300,000 to 600,000 self-published books likely to be generated this year. Most are no good, but the sheer volume underscores the challenge the folks in London and New York face from an unemployed Webmaster with an Amazon account or Apple’s publishing software.
Will more layoffs occur? I hope not. Thomson Reuters once was the leader of the publishing pack. Is it now the leader in the headcount reduction derby? Worth monitoring.
Stephen E Arnold, October 30, 2013
October 30, 2013
The science journal Nature examines the changing state of academic journals in, “Open Access: The True Cost of Science Publishing.” Writer Richard Van Noorden goes in-depth on the costs behind publishing research articles, the factors behind those costs, and how open access publishing may turn the whole field on its ear (and whether this is a good thing). Let’s start with some crazy-sounding numbers; the article tells us:
“Data from the consulting firm Outsell in Burlingame, California, suggest that the science-publishing industry generated $9.4 billion in revenue in 2011 and published around 1.8 million English-language articles — an average revenue per article of roughly $5,000. Analysts estimate profit margins at 20–30% for the industry, so the average cost to the publisher of producing an article is likely to be around $3,500–4,000.”
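The back-of-the-envelope numbers in the Outsell passage are easy to reproduce. This sketch uses only the figures quoted above; the rounded $5,000 figure is the article's own simplification:

```python
# Reproducing the per-article arithmetic from the Outsell data quoted above.
industry_revenue = 9.4e9   # 2011 science-publishing revenue, USD
articles = 1.8e6           # English-language articles published in 2011

revenue_per_article = industry_revenue / articles
print(f"Revenue per article: ${revenue_per_article:,.0f}")  # roughly $5,200

# At 20-30% profit margins, the implied cost per article
# (using the article's rounded $5,000 revenue figure):
for margin in (0.20, 0.30):
    print(f"Cost at {margin:.0%} margin: ${5000 * (1 - margin):,.0f}")
```

The 20% and 30% margin cases yield roughly $4,000 and $3,500 per article, which is exactly the cost range the article reports.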
That sure seems like a lot. Traditional publishers say that fans of open access understate the value publishers add to each article while overstating how much they make on them. It is difficult, though, to examine these claims, since these journals play their financial cards close to the vest. Such secrecy may eventually give way before the wealth of information available about open-access options, which Van Noorden covers thoroughly. More and more researchers will hesitate to take the big names at their word on costs.
It is worth noting one downside to the current proliferation of open access journals: quality control. The traditional journals maintain that their high publishing fees are partially justified by the effort they put into sorting and disqualifying submissions. Both those publishers and open access journals ensure quality through the peer-review process, but recent findings bring doubt to the reliability of this measure at certain, newer publishers. Though the transition to open access journals may appear inevitable to some, the old-school players seem determined to defend their model. Will they succeed?
Cynthia Murrell, October 30, 2013
October 26, 2013
I read “Senator Intensifies Probe of Data Brokers.” It seems that the leaders in Washington, DC have discovered data aggregation. Let me think. Right. Data aggregation has been around for more than a half century. I remember Ian Sharp (anyone remember him?) telling me about his discovery of data aggregation when he was a lad and before he created his business in the 1970s.
The point of the write up I noted was:
“However, if these recent news accounts are accurate, they raise serious questions about whether Experian as a company has appropriate practices in place for vetting its customers and sharing sensitive consumer data with them, regardless of the particular line of business.” Mr. Rockefeller’s letter is part of a larger effort by the Commerce Committee to understand how companies collect, share and sell intimate details about the shopping habits, health concerns, family circumstances and financial status of consumers at a time when Americans are increasingly sharing personal information online.
I have no comment about Experian or any similar firm.
My reaction is that if the leaders in DC are willing to name a particular company, that’s interesting. More intriguing is the question, “Will the various committees start taking a closer look at outfits like Thomson Reuters, McGraw Hill, and (hold your breath) the New York Times?”
There are many ways to deliver a solution to the problem of certain organizations disseminating information.
Stephen E Arnold, October 26, 2013
October 19, 2013
Are tablets the salvation of the newspaper industry? Google’s chief economist thinks they may be. In a speech he recently gave in Milan, Hal Varian points to the ways consumers’ usage of tablets differs from that of other devices. Writer Will Conley summarizes:
“Varian said tablets are the most newspaper-like electronic medium due to their status as ‘leisure time’ reading devices. Citing a Pew Foundation study, Varian pointed out that tablets are the preferred electronic news reading medium for mornings and evenings—during which readers spend the most time absorbing the news—beating out both desktop and smartphones for those periods. Ad revenue depends on the amount of time spent reading the news, he said, and therefore the proliferation of tablets will help the online newspaper industry to gain a new foothold for the first time in 40 years.”
Varian believes tablets might even prompt users to devote more time to reading news, restoring the “analytic depth” that has been eroding along with our attention spans. It’s a nice vision. Unfortunately, an article at Gigaom that came out on the same day as Conley’s piece takes a contradictory stance. Gigaom contributor Jon Lund explains “Why Tablet Magazines are a Failure.” (I think we can extrapolate his points to periodicals in general.)
Lund points out that, as of yet, magazine apps haven’t been selling as hoped. Website traffic still far outpaces app usage for the same publications. Lund believes there are several reasons for this, including the ever growing sea of apps in which magazines get swallowed. Then there is the closed nature of these apps. Their content can’t be easily “shared” with a wider social network, and readers have grown accustomed to sharing information with the click of a link. Curation apps like Flipboard and Zite are likewise blocked from reaching in and grabbing content from magazine apps. Finally, he asserts, reading a magazine on a tablet just doesn’t feel right. He laments:
“When I nevertheless manage to find the time to open up an iPad magazine, I feel as if I’m holding an outdated media product in my hands. That’s ironic because these apps tend to be visually appealing, with interactive graphics, embedded videos and well-crafted navigation tools. But the gorgeous layout that works so well in print gets monolithic, almost scary, in its perfectionism on the iPad, and I find myself longing for the web. It’s messy but far more open, more accessible and more adaptable to me, my devices and needs.”
Almost scary? I’m not sure Lund’s discomfort with periodicals in tablet form is widespread and, even if it is, that will probably recede as we move farther from print media. I don’t think his other points are insurmountable, but they are something to consider for Varian and others wishing to pursue a news-coverage renaissance through tablets.
Cynthia Murrell, October 19, 2013
October 18, 2013
Science magazine has published an important article about today’s open-access academic journals, “Who’s Afraid of Peer Review?” I highly recommend reading the entire piece, but I’ll share some highlights here. Journalist John Bohannon begins:
“On 4 July, good news arrived in the inbox of Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara. It was the official letter of acceptance for a paper he had submitted 2 months earlier to the Journal of Natural Pharmaceuticals, describing the anticancer properties of a chemical that Cobange had extracted from a lichen.
“In fact, it should have been promptly rejected. Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s short-comings immediately. Its experiments are so hopelessly flawed that the results are meaningless. I know because I wrote the paper.”
You see, Science performed an elaborate sting operation across the rapidly growing field of open-access journal publication. Most of these journals make money by charging authors upon acceptance of their articles; Bohannon began to suspect a number of these publications were motivated to accept papers that would not stand up to rigorous peer review, despite assertions on their websites to the contrary. What he found is truly disheartening.
See the article for the methodology behind the fake paper and Bohannon’s submissions procedure, both of which are informative in themselves. When the article went to press, far more journals (157) had accepted the bogus paper than rejected it (98). Even respected publishers like Elsevier and Sage were found to host at least one of these questionable journals. Most of the publishers that performed any review at all focused on mechanical issues like formatting, not substance. What is going on here? Bohannon offers:
“A striking picture emerges from the global distribution of open-access publishers, editors, and bank accounts. Most of the publishing operations cloak their true geographic location. They create journals with names like the American Journal of Medical and Dental Sciences or the European Journal of Chemistry to imitate—and in some cases, literally clone—those of Western academic publishers. But the locations revealed by IP addresses and bank invoices are continents away: Those two journals are published from Pakistan and Turkey, respectively, and both accepted the paper…
“About one-third of the journals targeted in this sting are based in India—overtly or as revealed by the location of editors and bank accounts—making it the world’s largest base for open-access publishing; and among the India-based journals in my sample, 64 accepted the fatally flawed papers and only 15 rejected it.”
So, opportunists in the developing world have seized upon faux-reviewed academic publishing as the way to turn a PC and an Internet connection into profits. Good for them, bad for science. How does one know when Bing or Google links to fake info? Does it matter anymore? I have to think it does. I hope that people in the field, like Bohannon, who care about open access to legitimate research will find a way to counter this flood of bad information. In the meantime, well… don’t believe every link you read.
Cynthia Murrell, October 18, 2013
October 16, 2013
I heard an AAAS podcast about fake academic papers in open access publications. I did not catch much information from the 20-second sound bite. I navigated to Google and keyed this query:
aaas open access journals
The hit I sought was number eight on the search results page. What is interesting is that the current “hot” item ranked below older information. In one case, a hit (www.sciencemag.org) was irrelevant to my intent, perhaps a side effect of Google’s behind-the-scenes personalization methods. Another hit pointed to a couple of outdated studies, dating in one case from 2005.
And Bing? Same query. No relevant hit on the first page of the Bing results list. What about that Bing off stuff? Maybe baloney?
And Yandex? Same query. No relevant hits on the first page of results.
And DuckDuckGo, the metasearch engine causing some to swoon? No relevant hit. Three observations:
- Timeliness is not a priority in the free Web indexing systems.
- Rich media containing information relevant to a user’s query is not indexed. For all practical purposes, the podcasts are invisible without prior knowledge.
- Junk results are not filtered by any of the systems.
No big deal for me. Just another example of how the simplest query can return some darned interesting results.
By the way, the Google results page included two ads, both from “traditional publishers.” One of the advertisers publishes commercial databases. My recollection is that some of the content in these information services could be viewed as incorrect. In fact, one of the Google advertisers accepted the bogus paper.
What’s my point?
The task of finding relevant, on point information is getting more difficult, not easier. Furthermore, as folks shift to “hectic” modes of work, the idea that most people will double check information before accepting it as gospel may be outmoded.
Stephen E Arnold, October 16, 2013