IBM: There Are Doubters

December 31, 2015

Watson has its work cut out for it in 2016. I read “IBM Set to Drop 13% in 2015.” When one is tossing around a $100 billion outfit, the thought of a share drop is disconcerting. Will Alibaba or Jeff Bezos step in? Fixing up the Washington Post may be trivial compared with an IBM-scale challenge.

According to the write up:

Much of the disappointment in the tech company is because it has been unable to replace its hardware and software legacy products with new cloud-based and AI products — at least not at a rate that would pull IBM’s revenue up. Its major branded product in new age technology is Watson. While Watson has been the source of press releases and small customer alliances, outsiders have trouble seeing what it does to sharply increase IBM’s sales. Granted, Watson may be one of the most impressive product advances among large companies in the sector recently, but what it does for IBM may be very modest.

Somewhat of a downer, I perceive.

The smart software thing is not new. In the last 18 months, awareness of the use of various numerical recipes has increased. Faster chips, memories, and interconnections have worked their magic.

The challenge for IBM is to make money, not just marketing hyperbole. The crunch is that expectations for certain technologies often outrun what the market will actually pay for.

Watson is, when one keeps one’s eye on the ball, a search and content processing system. The wrappers make it possible to call assorted functions. Unlike Palantir, which has its own revenue fish to catch, IBM is a publicly traded company. Palantir does its magic as a privately held company, ingesting money at rates which would make a beluga whale’s diet look modest.

But IBM has exposed itself. The Watson marketing push is dragged into the reality of IBM’s overall company performance. In 2016, IBM Watson will have to deliver the bacon, or some of the millennialesque PR and marketing folks will have an opportunity to work elsewhere. Talking about smart software is not the same as generating sustainable revenue from smart software.

Stephen E Arnold, December 31, 2015

Data Managers as Data Librarians

December 31, 2015

The tools of a librarian may be the key to better data governance, according to an article at InFocus titled, “What Librarians Can Teach Us About Managing Big Data.” Writer Joseph Dossantos begins by outlining the plight data managers often find themselves in: executives can talk a big game about big data, but want to foist all the responsibility onto their overworked and outdated IT departments. The article asserts, though, that today’s emphasis on data analysis will force a shift in perspective and approach—data organization will come to resemble the Dewey Decimal System. Dossantos writes:

“Traditional Data Warehouses do not work unless there a common vocabulary and understanding of a problem, but consider how things work in academia.  Every day, tenured professors  and students pore over raw material looking for new insights into the past and new ways to explain culture, politics, and philosophy.  Their sources of choice:  archived photographs, primary documents found in a city hall, monastery or excavation site, scrolls from a long-abandoned cave, or voice recordings from the Oval office – in short, anything in any kind of format.  And who can help them find what they are looking for?  A skilled librarian who knows how to effectively search for not only books, but primary source material across the world, who can understand, create, and navigate a catalog to accelerate a researcher’s efforts.”

The article goes on to discuss the influence of the “Wikipedia mindset”; data accuracy and whether it matters; and devising structures to address different researchers’ needs. See the article for details on each of these (especially on meeting different needs). The write-up concludes with a call for data-governance professionals to think of themselves as “data librarians.” Is this approach the key to more effective data search and analysis?

Cynthia Murrell, December 31, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Scientific Research Has Turned into a Safe Space

December 31, 2015

The Internet is a cold, cruel place, especially if you hang out in the comments section on YouTube, eBay forums, social media, and 4chan.  If you practice restraint and limit your social media circles to trusted individuals, you can surf the Internet without encountering trolls and haters.  Some people do not practice common sense, so they encounter many hateful situations on the Internet and as a result they demand “safe spaces.”  Safe spaces are where people do not encounter anything negative.

Safe spaces are stupid.  Period.  What is disappointing is that the “safe space” and “only positive things” mentality has made its way into the scientific community, according to Nature in the article, “‘Novel, Amazing, Innovative’: Positive Words On The Rise In Science Papers.”

The University Medical Center in the Netherlands studied the use of positive and negative words in the titles and abstracts of scientific papers published on the medical database PubMed from 1974 to 2014.  The researchers discovered that the use of positive words in titles grew from 2% in 1974 to 17.5% in 2014.  Negative word usage increased from 1.3% to 2.4%, while neutral words did not see any change.  The trend only applies to research papers; the same test run against published books showed little change.
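For readers curious about what this kind of tally actually involves, here is a minimal sketch of a positive/negative word count over paper titles. The word lists, the CSV file name, and the record format are illustrative assumptions, not the Dutch team’s actual protocol.

```python
# Minimal sketch: count the share of titles per year containing a "positive"
# or "negative" word. Word lists and input file are hypothetical.
import csv
from collections import defaultdict

POSITIVE = {"novel", "amazing", "innovative", "unprecedented", "robust"}
NEGATIVE = {"disappointing", "insufficient", "weak"}

def tally_by_year(rows):
    """rows: iterable of (year, title) pairs; returns {year: (pos_pct, neg_pct)}."""
    counts = defaultdict(lambda: [0, 0, 0])  # year -> [positive hits, negative hits, total titles]
    for year, title in rows:
        words = set(title.lower().split())
        counts[year][0] += bool(words & POSITIVE)
        counts[year][1] += bool(words & NEGATIVE)
        counts[year][2] += 1
    return {y: (100.0 * p / n, 100.0 * q / n) for y, (p, q, n) in counts.items()}

if __name__ == "__main__":
    # Hypothetical export with two columns per row: year, title.
    with open("pubmed_titles.csv", newline="") as handle:
        reader = csv.reader(handle)
        for year, pcts in sorted(tally_by_year((int(row[0]), row[1]) for row in reader).items()):
            print(year, pcts)
```

Run against a yearly export, the percentages give the same kind of trend line the researchers describe, without saying anything about why the words were chosen.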

“The most obvious interpretation of the results is that they reflect an increase in hype and exaggeration, rather than a real improvement in the incidence or quality of discoveries… The findings “fit our own observations that in order to get published, you need to emphasize what is special and unique about your study,” he says. Researchers may be tempted to make their findings stand out from thousands of others — a tendency that might also explain the more modest rise in usage of negative words.”

There is some doubt associated with the findings, because the analysis was applied only to PubMed.  The original research team, however, thinks it points to a much larger problem, because not all research can be “innovative” or “novel.”  The overuse of positive words is polluting the social, psychological, and biomedical sciences.

Beneath the surface, this really points to how scientists and researchers are fighting for tenure.  What would this mean for search engine optimization if every search and description had to have a smile?  Will someone even invent a safe space filter?

Whitney Grace, December 31, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Newsreaders through Time

December 30, 2015

Those chart mavens at CBInsights have produced another timeline for wild and crazy Internet services. “The Rise and Fall of Venture Backed News Readers” makes clear the long odds traditional news producers face when trying to find a business model. The chart is a shopping list of case studies for MBA programs. The idea of providing “news” to hungry minds with mobile devices and sci-fi laptops seems to be a bit of a challenge. For investors, these services trigger opportunities to explain why their investments did not perform particularly well. The chart, intentionally or unintentionally, causes Flipboard to stand out from the crowd. It may be the red logo and boldfaced type. Alternatively, it may be that Flipboard has managed to attract money over the last five years. The chart makes clear why an average millennial may want to take a vacation instead of investing in a newsreader start-up.

Stephen E Arnold, December 30, 2015

SEO Tips Based on Recent Google Search Quality Guidelines

December 30, 2015

Google has recently given search-engine optimization pros a lot to consider, we learn from “Top 5 Takeaways from Google’s Search Quality Guidelines and What They Mean for SEO” at Merkle’s RKG Blog. Writer Melody Pettula presents five recommendations based on Google’s guidelines. She writes:

“A few weeks ago, Google released their newest Search Quality Evaluator Guidelines, which teach Google’s search quality raters how to determine whether or not a search result is high quality.  This is the first time Google has released the guidelines in their entirety, though versions of the guidelines have been leaked in the past and an abridged version was released by Google in 2013. Why is this necessary? ‘Quality’ is no longer simply a function of text on a page; it differs by device, location, search query, and everything we know about the user. By understanding how Google sees quality we can improve websites and organic performance. Here’s a countdown of our top 5 takeaways from Google’s newest guidelines and how they can improve your SEO strategy.”

We recommend any readers interested in SEO check out the whole article, but here are the five considerations Pettula lists, from least to most important: consider user intent; supply supplementary content; guard your reputation well; consider how location affects user searches; and, finally, “mobile is the future.” On that final point, the article notes that Google is now almost entirely focused on making things work for mobile devices. SEO pros would do well to keep that new reality in mind.

Cynthia Murrell, December 30, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Overhyped Science Stuff

December 30, 2015

After Christmas comes New Year’s Eve, and news outlets take the time to reflect on the changes of the past year.  Usually they focus on celebrities who died, headlining news stories, technology advancements, and new scientific discoveries.  One of the geeky news outlets on the Internet is Gizmodo, and it took its shot at highlighting things that happened in 2015, but rather than focusing on new advances it checks off “The Most Overhyped Scientific Discoveries In 2015.”

There was so much hype about an alien megastructure in outer space that Neil deGrasse Tyson had to step in and tell folks they were overreacting.  Bacon and other processed meats were labeled as carcinogens!  The media, of course, took the bacon link and ran with it, causing extreme panic, but in the long run everything causes cancer, from cellphones to sugar.

Global warming is a hot topic that always draws arguments, and it appears to be getting worse the more carbon dioxide humans release into the atmosphere.  Humans are always ready for a quick solution, and a little ice age brought on by diminishing solar activity was supposed to rescue Earth; it turns out, though, that carbon dioxide pollution does more damage than reduced solar activity can offset.  Another story involved the nearly indestructible tardigrades and the possibility of horizontal gene transfer, but a dispute between two rival labs over the tardigrade research stalled further work toward understanding the unique creature.

The most overblown scientific discovery, in our opinion, is NASA’s warp drive.  Humans are desperate for breakthroughs in space travel, so we can blast off to Titan’s beaches for a day and then come home within our normal Earth time.  NASA experimented with an EM Drive:

“Apparently, the engineers working on the EM Drive decided to address some of the skeptic’s concerns head-on this year, by re-running their experiments in a closed vacuum to ensure the thrust they were measuring wasn’t caused by environmental noise. And it so happens, new EM Drive tests in noise-free conditions failed to falsify the original results. That is, the researchers had apparently produced a minuscule amount of thrust without any propellant.

Once again, media reports made it sound like NASA was on the brink of unveiling an intergalactic transport system.”

NASA might be working on a warp drive prototype, but the science is based on short-term experiments, none of it has been peer reviewed, and NASA has not claimed that the engine even works.

The media takes the idea snippets and transforms them into overblown news pieces that are based more on junk science than real scientific investigation.

Whitney Grace, December 30, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

New and Improved Hacker Methods in China

December 30, 2015

We learn from an article at Yahoo News that, “On China’s Fringes, Cyber Spies Raise Their Game.” Reporters Clare Baldwin, James Pomfret, and Jeremy Wagstaff report that hackers backed by China are using some unique methods, according to Western security experts. Search is but a tiny part of this approach but, perhaps not surprisingly, cloud storage is a factor. The article relates:

“Hackers have expanded their attacks to parking malware on popular file-sharing services including Dropbox and Google Drive to trap victims into downloading infected files and compromising sensitive information. They also use more sophisticated tactics, honing in on specific targets through so-called ‘white lists’ that only infect certain visitors to compromised websites. Security experts say such techniques are only used by sophisticated hackers from China and Russia, usually for surveillance and information extraction. The level of hacking is a sign, they say, of how important China views Hong Kong, where 79 days of protests late last year brought parts of the territory, a major regional financial hub, to a standstill. The scale of the protests raised concerns in Beijing about political unrest on China’s periphery. ‘We’re the most co-ordinated opposition group on Chinese soil, (and) have a reasonable assumption that Beijing is behind the hacking,’ said Lam Cheuk-ting, chief executive of Hong Kong’s Democratic Party, which says it has been a victim of cyber attacks on its website and some members’ email accounts.”

Officially, China’s Defense Ministry denies any connection to the attacks, but that is nothing new. The adaptation of new hacking techniques is part of a continuing cycle; as journalists, scholars, and activists improve their security, hackers adapt. See the article for specifics on some attacks attributed to China-backed hackers, as well as some ways activists are trying to stay ahead.

Cynthia Murrell, December 30, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

DuckDuckGo Grows in 2015

December 30, 2015

Do you not love it when the little guy is able to compete with corporate giants?  When it comes to search engines, DuckDuckGo is the little guy, because unlike big search engines like Google and Yahoo it refuses to track its users’ browsing histories or serve targeted ads.  According to Quartz, “DuckDuckGo, The Search Engine That Doesn’t Track Its Users, Grew More Than 70% This Year.”  Through December 15, 2015, DuckDuckGo had received 3.25 billion queries, up more than 70 percent over the same period in 2014.  DuckDuckGo, however, still has trouble cracking the mainstream.

Google is still the biggest search engine in the United States, handling more than one hundred billion searches a month, while DuckDuckGo reached only 325 million monthly searches in November 2015.  The private search engine also counts three million direct queries via desktop computers, but it did not share how many people use DuckDuckGo on mobile devices, in order to protect its users’ privacy.  Google, on the other hand, is happy to share its statistics: more than half of its searches come from mobile devices.

“What’s driving growth? DuckDuckGo CEO Gabriel Weinberg, reached via email, credits partnerships launched in 2014 with Apple and Mozilla, and word of mouth.  He also passes along a Pew study from earlier this year, where 40% of American respondents said they thought search engines ‘shouldn’t retain information about their activity.’… ‘Our biggest challenge is that most people have not heard of us,’ Weinberg says. ‘We very much want to break out into the mainstream.’”

DuckDuckGo offers an unparalleled private search service.  Weinberg states the problem correctly: DuckDuckGo needs to break into the mainstream.  Its current user base consists of technology geeks and those in “the know,” whom some might call hipsters.  If DuckDuckGo can afford it, how about an advertising campaign launched on Google Ads?

Whitney Grace, December 30, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

IBM Supercomputer: Slick and Speedy

December 29, 2015

I read an unusual chunk of content marketing for IBM’s supercomputer. As you may know, IBM captured a US government project for supercomputers. I am not sure if IBM is in the quantum computing hunt, but I assume the IBM marketing folks will make this clear as the PR machine grinds forward in 2016.

The article on my radar is the link-baity “Scientists Discover Oldest Words in the English Language, Predict Which Ones Are Likely to Disappear.”

First, the supercomputer rah rah from a university in the UK:

The IBM supercomputer at the University of Reading, known as ThamesBlue, is now one year old. Before it arrived, it took an average of six weeks to perform a computational task such as comparing two sets of words in different languages, now these same tasks can be executed in a few hours. Professor Vassil Alexandrov, the University’s leading expert on computational science and director of the University’s ACET Centre¹ said: “The new IBM supercomputer has allowed the University of Reading to push to the forefront of the research community. It underpins other important research at the university, including the development of accurate predictive models for environmental use. Based on weather patterns and the amounts of pollutant in the atmosphere, our scientists have been able to pinpoint likely country-by-country environmental impacts, such as the affect airborne chemicals will have on future crop yields and cross-border pollution”.

There you go. Testimony. Look at the wonderful use case for the IBM supercomputer: Environmental impact analyses.

Now back to the language research. It seems to me that the academic researchers are comparing word lists. The concept seems very Watson-like, even though I did not spot a reference to IBM’s much-hyped smart system.

The less frequently a word is used, the greater the likelihood that word will be forgotten, disused, or tossed in the dictionary writer’s dust bin. Examples of words in trouble are:

  • dirty
  • guts
  • squeeze
  • stick
  • throw

I would suggest that IBM’s marketing corpus from the foundation of the company as a vendor of tabulating equipment right up to the PurePower name be analyzed. Well, I am no academic, and I am not sure that the University of Reading would win a popularity contest at IBM after predicting which of its product names will fall into disuse in the future. (I sure would like to see the analysis for Watson, however.)
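A back-of-the-envelope version of that product-name frequency count might look like the sketch below. The corpus file name and the list of IBM product names are assumptions for illustration, not a description of the University of Reading’s method.

```python
# Rough sketch: whole-word frequency counts for product names in a marketing corpus.
import re
from collections import Counter

# Hypothetical list of IBM product names to track; adjust as needed.
PRODUCT_NAMES = ["Watson", "PurePower", "System/360", "ThinkPad"]

def name_frequencies(path, names):
    """Count case-insensitive whole-word occurrences of each name in a text file."""
    with open(path, encoding="utf-8") as handle:
        text = handle.read()
    return Counter({
        name: len(re.findall(r"\b" + re.escape(name) + r"\b", text, flags=re.IGNORECASE))
        for name in names
    })

if __name__ == "__main__":
    # "ibm_marketing_corpus.txt" is a stand-in for whatever corpus one assembles.
    for name, count in name_frequencies("ibm_marketing_corpus.txt", PRODUCT_NAMES).most_common():
        print(f"{name}: {count}")
```

Sorting the counts from most to least frequent would give a crude proxy for which names IBM itself keeps alive and which are sliding toward the dust bin.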

My thought is that frequency of use analyses are useful. A fast computer is helpful. I am not sure about the embedded IBM commercial in the write up.

Stephen E Arnold, December 29, 2015

Yahoo: The Value of Being a Parasite

December 29, 2015

I read a number of articles about Yahoo each day. Most of these are rehashes: Xoogler flops, Yahoo tanks. A fresh angle is rare.

“Why Yahoo Needs a Monopoly to Survive” is different. The approach takes a tough stance:

Yahoo is in trouble. Despite nearly $5 billion in annual revenue, investors value Yahoo’s business at next to nothing. Most of its value comes from its investment in Alibaba–to the point where Yahoo has largely become a tracking stock for Alibaba shares.

Direct and to the point.

The write up continues:

Google has the content platform in search. Facebook has the social networking platform. Amazon has the product marketplace (in the U.S.). Similarly, in China, Alibaba has the top product marketplace, Tencent has the top messaging platform and Baidu has the leading search platform. All leading platforms have a core monopoly that is the lifeblood of their business. Why? Once a platform has a monopoly, it can use its core network to expand into other markets. Every subsequent platform can leverage the platform monopoly’s network to its advantage.

There you go. A monopoly is just darned good. Quite a generalization, but I like the frankness of the insight.

How does this relate to Yahoo?

Yahoo is not a monopoly. Yahoo must be a monopoly. The logic of the article is that Yahoo is a goner unless, like a pilot fish, it attaches itself to the shark Alibaba.

What will the Xoogler do? Make the parasite move, or stick with a symbiotic relationship? Yahoooo!

Stephen E Arnold, December 29, 2015
