A Modest Proposal: Universal Internet Access and a Chief Digital Officer in Every Organization

September 28, 2015

Facebook supports universal Internet access. Support from one or two outfits is not enough. Facebook wants the United Nations to make universal Internet access a priority.

Navigate to “Mark Zuckerberg Addresses the UN, Declaring Universal Internet Access a Global Priority.” I wonder if peacekeeping, food, education, and other priorities of the United Nations will be down-prioritized or de-prioritized. If I were hoping for UN food assistance, I would definitely want to be able to check my Facebook. Maslow’s hierarchy of needs is obviously wrong. At the top, Facebook.

I also noted this article, “IT Still Doesn’t Understand Its Role in Society.” The author is a self-described “IT leader.” The point is, I think:

It struck me, when I opened this September’s edition, just how much things have moved on. This month gives much more space to the changing role of IT and its part in business leadership. That of course lies at the heart of the debate about CIOs and CDOs – the former seeming inextricably constrained by operational IT matters, whether insourced or outsourced, and the latter filling the vacuum created by misalignment between IT activity and business priority. Everyone seems agreed that the role of a CDO is not about the technology. It is about people and process. But it cannot operate without a fundamental understanding about the opportunity that technology offers, and therefore CDOs must work closely with IT professionals.

The word choice is well matched with the imperative to make technology the source for wild and crazy assertions. I like the use of the acronyms CIO and CDO. I am not sure what a CDO is, but that is not important. Note the precision of “insourced” and “outsourced,” where the outsourcing thing fills “the vacuum created by misalignment between IT activity and business priority.”

Okay, the folks running the business are not exactly sure what’s up with IT. If a senior manager tuned in to Facebook’s remarks about universal Internet access, there might be some furrowed brows.

What’s the fix?

The answer is a new job position at companies. The CDO. (My hunch is that this acronym means “chief digital officer.”) When revenues are stressed, most senior managers will gladly add to the organization’s headcount to get a CDO on the team.

I highlighted this passage:

So we need clever technical specialists, but we also need IT professionals who can bridge the gap between technology opportunity and the benefits that technology can bring society. That is why the goal of BCS – the Chartered Institute for IT – is “to make IT good for society”. That should be the role of IT professionals. This means that IT professionals need to understand the impact of technology, positive and negative, in the way systems and IT tools are designed. It means IT has a significant part to play in the debate about privacy and trust emerging from IT changes. It means we have a part to play in the way systems are designed to benefit society, not just to make profit. And it means IT is a creative, human discipline, not just a scientific and engineering profession.

Okay. But what about the Facebook suggestion to make Internet access universal? Will checking the Facebook obviate hunger and disease? Will information technology move beyond profit to benefiting society?

What’s at stake here? My hunch is that the focus of the Facebook thing and the self-appointed expert’s CDO recommendation has more to do with money and boosting the notion of the importance of technology in the modern world.

Was Maslow incorrect? Is Facebook connectivity more important than food? Are companies in need of more headcount in order to make headway in the datasphere?

Nope. Why not sit back and let the Alphabet Google thing do the job? Some big thinkers want governments to be more like Google. No Facebook. No information technology intermediaries. Why search for information when a commercial enterprise and self-appointed experts know best what folks like me want?

Stephen E Arnold, September 28, 2015

Accidental and On-Purpose Insider Threats in Federal Agencies Still Raging

September 28, 2015

The article on eWeek titled “Insider Threats a Major Security Issue for Federal Agencies” looks at the recent results of a MeriTalk survey investigating the federal response to insider threats through interviews with federal IT managers. The results are shocking, with almost 30% of agencies acknowledging data lost to an insider threat in the last year, and half of respondents claiming that unauthorized personnel commonly fail to observe protocols. Even worse, most agencies have no tracking in place to recognize what a staffer may have seen or shared, making them virtually incapable of following up on risky behavior by their employees. The article says,

“The most startling finding from the survey is the fact that 45 percent of agencies say they’ve been a target of an attack – malicious or unintentional – yet 50 percent still say employees do not follow all the protocols in place,” Steve O’Keeffe, founder of MeriTalk…”There is also a lack of agreement on the best solution.  Frequent, hands-on employee training is the key to preventing these incidents, as well as accountability. However, we are all human and people make mistakes.”

O’Keeffe recommends the immediate and comprehensive adoption of better encryption and two-factor authentication to address the issue. But perhaps equally important is continuously updated, ongoing training to avoid the common accidental insider threats.
Chelsea Kerwin, September 28, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Help Wanted: Chief Marketing Technology Officer

September 28, 2015

An indispensable position for companies is the chief technology officer or the chief information officer. Their primary responsibilities are to manage the IT department, implement new ways to manage information, and/or develop software as needed. There is a new position that companies will be creating in the future, and the title is chief marketing technology officer, says Live Mint in “Make Way CIOs, CMOs: Here Comes the CMTO.”

Formerly, the marketing and IT departments never mixed, except for the occasional social media collaboration. Marketers are increasing their reliance on technology to understand their customers, and that reliance goes far beyond social media. Marketers need to be aware of the growing trends in mobile shopping and search, digital analytics, gamification, online communities, and the power of user-generated content.

“The CMO’s role will graduate to CMTO, a marketer with considerable knowledge of technology. The CMTO, according to Nasscom, will not only conceptualize but also build solutions and lay down the technical and commercial specifications while working alongside the IT team on vendor selection.”

It is not enough to know how to market a product or promote an organization. Marketers need to be able to engage with technology and understand how to implement it to attract modern customers and increase sales. In other words, the current marketing position is evolving with a new buzzword.

Whitney Grace, September 28, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Watch Anti-Money Laundering Compliances Sink

September 25, 2015

With a title like “AML-A Challenge Of Titanic Proportions” posted on Attivio, metaphoric comparisons between the “ship of dreams” and icebergs are inevitable. Anti-money laundering compliance costs have seen unprecedented growth of 53% between 2011 and 2014, says KPMG’s Global Anti-Money Laundering (AML) Survey. The costs are predicted to increase by more than 25% in the next three years. The biggest areas requiring more money include transaction monitoring systems, Know Your Customer systems, and recruitment/retention of AML staff.

The Titanic metaphor plays in because White Star Line director Bruce Ismay, builder Thomas Andrews, and nearly all of the passengers believed the ship was unsinkable and the pinnacle of modern technology. The belief that humanity’s efforts would conquer Mother Nature was its downfall. The White Star Line did not prepare the Titanic for disaster, but AML companies are trying to prevent their ships from sinking. Except they cannot account for all the ways thieves can work around their systems, just as the Titanic could not avoid the iceberg.

“Systems need to be smarter – even capable of learning patterns of transaction and ownership.  Staff needs more productive ways of investigating and positively concluding their caseload.  Alerting methods need to generate fewer ‘false positives’ – reducing the need for costly human investigation. New sources of information that can provide evidence need to come online faster and quickly correlate with existing data sources.”
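
The quoted wish list boils down to an anomaly scoring problem: learn each customer’s normal transaction pattern and raise an alert only on genuine outliers. Below is a minimal, hypothetical sketch of that idea (it is not Attivio’s or KPMG’s method; the fields, thresholds, and sample figures are illustrative).

    # Hypothetical sketch: flag a transaction only when it deviates sharply from
    # the customer's own history, rather than relying on a flat dollar threshold.
    # This is one sense in which "smarter" alerting can cut false positives.
    from statistics import mean, stdev

    def is_anomalous(amount, history, z_threshold=4.0, min_history=10):
        """Return True if the amount looks unusual against the customer's history."""
        if len(history) < min_history:
            return amount > 10000             # thin history: fall back to a flat rule
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return amount != mu
        return (amount - mu) / sigma > z_threshold

    # Example: a customer who normally moves about 200 dollars per transaction.
    history = [180, 220, 210, 190, 205, 215, 198, 202, 207, 193, 188]
    print(is_anomalous(230, history))     # False: within the normal pattern
    print(is_anomalous(9500, history))    # True: a large deviation worth a human look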

The Titanic crew accidentally left the binoculars for the crow’s nest in England, which did not help the lookouts. The current AML solutions are like the forgotten binoculars, and comprehensive action needs to be taken to avoid the AML iceberg.

Whitney Grace, September 25, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph


Rundown on Legal Knowledge Management

September 24, 2015

One of the new legal buzzwords is knowledge management, and not just old-fashioned knowledge management, but knowledge management that is quick, efficient, and effective. Time is an expensive commodity for legal professionals, especially with the amount of data they have to sift through for cases. Mondaq explains the importance of knowledge management for law professionals in the article, “United States: A Brief Overview Of Legal Knowledge Management.”

Knowledge management first started with creating an effective process for managing, locating, and searching relevant files, but it quickly evolved into implementing a document management system. While knowledge management companies offered law practices decent document management software to tackle the data hill, an even bigger problem arose. The law practices needed a dedicated person to be the software expert:

“Consequently, KM emphasis had to shift from finding documents to finding experts. The expert could both identify useful documents and explain their context and use. Early expertise location efforts relied primarily on self-rating. These attempts almost always failed because lawyers would not participate and, if they did, they typically under- or over-rated themselves.”

The biggest problem law professionals face is that they might invest a small fortune in a document management license, but they do not know how to use the software or do not have the time to learn. It is a reminder that someone might have all the knowledge and the best tools at their fingertips, but unless people know how to use and access them, that knowledge is useless.

Whitney Grace, September 24, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

SharePoint Revealed

September 23, 2015

Microsoft SharePoint. It brings smiles to the faces of the consultants and Certified Experts who can make the collection of disparate parts work like a refurbished John Harrison clock.

I read “Microsoft SharePoint ECM Suite for Content Management.” The write up explains that SharePoint became available in 2001. The write up does not reference the NCompass Labs acquisition or other bits and pieces of the SharePoint story. That’s okay. It is 2015. Ancient history in terms of Internet time. Also, what is content management? Does it include audio, video, and digital images? What about binaries? What about data happily stored on the state of Michigan’s mainframes?

Jack Benny’s Maxwell reminds me of Fast Search’s 1998 approach to information access. With Fast Search inside, SharePoint delivers performance that is up to the Maxwell’s standards for speed, reliability, and engineering excellence.

The write up reveals that SharePoint evolved “gradually.” The most recent incarnation of the system includes a number of functions; those specifically mentioned in the article are:

  • A cloud based service
  • A foundation for collaboration and document sharing
  • A server. I thought there were multiple servers. Guess not.
  • A designer component for creating nifty looking user experiences. Isn’t Visual Studio or another programming tool required as well?
  • Cloud storage. Isn’t this redundant?
  • Search

I prefer a more modern approach to information access. The search systems I use are like a Kia Soul. The code often includes hamsters too.

Here’s what the write up says about search:

Microsoft FAST Search, which provides indexing and efficient search of content of all types.

I like the indexing and “efficient” description. The content of “all types” is interesting as well.

How well does Fast Search in its present incarnation handle audio and video? What about real-time streams of social media like the Twitter fire hose? You get the idea. “All” is shorthand for “some” content.

I am not captivated by the whizzy features in SharePoint and its content management capabilities. I am not thrilled with building profiles of employees within an organization. I am pretty relaxed when it comes to collaboration. Phones work pretty well. Email is okay too. I work on documents alone and provide a version for authorized individuals to review. No big gun system is needed. Just a modern one.

What about Fast Search?

Let me highlight a few salient points:

  • The product originated in Norway. You know where Trondheim is, right? Oslo? Of course. Great in the winter too. The idea burst from academia prior to 1998, when the company was officially set up. That makes the architecture an agile, youthful 17 years old.
  • In 2008, Microsoft paid $1.2 billion for a company which was found wanting in its accountancy skills. After investigations and a legal proceeding, the company seems to have had revenues well below its reported $170 million in 2007. Until the HP Autonomy deal, this was the transaction that helped fuel the “search is a big payday” belief. At an estimated $60 million instead of $170 million, Microsoft paid about 20 times Fast Search’s actual 2007 revenue. After the legal eagles landed, the founder of Fast Search found himself on the wrong end of a court decision. Think lock-up time.
  • Fast Search is famous for me because its founder told me that he was abandoning Web search for the enterprise search market. Autonomy’s revenue seemed to be a number the founder thought was reachable. As time unspooled, the big pay day arrived for Google. Enterprise search did not work out in terms of Google scale numbers. Fast Search backed out of an ad model to pursue an academic vision of information access as the primary enterprise driver.
  • The Fast Search solution that is part of SharePoint has breathed life into dozens of SharePoint search add-ins. These range from taxonomy systems to clustering components to complete snap-in replacements for the Fast Search components. Hundreds upon hundreds of consultants make their living implementing, improving, and customizing search and retrieval for SharePoint.

Net net: SharePoint has more than 150 million licensees. SharePoint is the new DOS for the enterprise. SharePoint is a consultant’s dream come true.

I prefer simpler and more recent technology. That 17-year-old approach seems more like Jack Benny’s Maxwell than a modern search Kia Soul.

Stephen E Arnold, September 23, 2015

Cloud Excitement: What Is Up?

September 21, 2015

I noted two items about cloud services. The first is summarized in “Skype Is Down Worldwide for Many Users.” I used Skype one time last week. I noted that the system would not allow my Skype conversationalist to hear me. We gave up fooling with the system, and the person who wanted to speak with me called me up. I wonder how much that 75-minute international call cost. Exciting.

I also noted that Amazon went offline for some of its customers on September 21, 2015. The information was in “Amazon Web Services Experiences Outages Sunday Morning, Causing Disruptions On Netflix, Tinder, Airbnb And More.”

Several observations are warranted:

  • What happened to automatic failover, redundancy, and distributed computing? I assumed that Google’s loss of data in its Belgium data center was a reminder that marketing chatter is different from actual data center reality. Guess not? (A minimal sketch of what client-side failover looks like appears after this list.)
  • Whom or what will be blamed? Amazon will have a run at the Ashburn, Virginia nexus. Microsoft will probably blame a firmware or software update. The cause may be a diffusion of boots-on-the-ground technical knowledge. Let’s face it. These cloud services are complicated puppies. As staff seek their future elsewhere and training is sidestepped, the potential for failure exists. The fix-it-and-move-on approach to engineering adds to the excitement. Failure, in a sense, is engineered into many of these systems.
  • What about the promise of having one’s data in the cloud so nothing is lost, no downtime haunts the mobile device user, and no break in a seamless user experience occurs? More baloney? Yep, probably.
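
For readers who wonder what automatic failover looks like from the client side, here is a minimal, hypothetical sketch: try a primary endpoint and fall back to a replica when the primary does not answer. The endpoint URLs and the timeout are illustrative, and real cloud failover also involves health checks, DNS changes, and data replication.

    # Hypothetical client-side failover sketch: return the response from the first
    # endpoint that answers, trying replicas in order. URLs and timeout are made up.
    import urllib.error
    import urllib.request

    ENDPOINTS = [
        "https://primary.example.com/status",
        "https://replica.example.com/status",
    ]

    def fetch_with_failover(endpoints, timeout=5):
        last_error = None
        for url in endpoints:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as response:
                    return response.read()        # first healthy endpoint wins
            except (urllib.error.URLError, OSError) as error:
                last_error = error                # remember the failure, try the next one
        raise RuntimeError("all endpoints failed: {}".format(last_error))

    if __name__ == "__main__":
        print(fetch_with_failover(ENDPOINTS))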

Net net: I rely on old-fashioned computing and software methods. I think I lost data about 25 years ago, and I have never gone offline. Redundancy, reliability, and failover take work, gentle reader, not marketing and advertising.

How old school. My international call took place because I have different mobile telephony accounts plus an old Bell head landline. Expensive? Sure, but none of this required me to issue a news release, publicize how wonderful my cloud system was, or face the egg-on-the-face reality of failure.

Stephen E Arnold, September 21, 2015

Google: Single Point of Failure Engineering

September 18, 2015

Do you recall the lightning strike at the Alphabet Google’s data center in Belgium? Sure you do. Four lightning strikes caused the data center to lose data. See “Lightning in Belgium Disrupts Google Cloud Services.” I asked myself, “How could a redundant system, tweaked by AltaVista wizards decades ago, lose data?”

When I was assembling information for the first study in my three-part Google series, I waded through many technical papers and patent documents from the GOOG (now Alphabet). These made clear to me that the GOOG was into redundancy. There were nifty methods with clever names. Chubby, anyone?

Now the Belgium “act of God” must have been an anomaly. Since 2003, the GOOG should have been improving its systems and their robustness. Well, maybe Belgium is lower on the hardened engineering list?

I found this article quite interesting: “Google Is 2 Billion Lines of Code. And It Is All in One Place.” Presumably the knowledge embodied in ones and zeros is not in one place. Nope. The code is in 10 data centers, kept in check with Piper, a home-brew code management system.

But, I noted:

There are limitations to this system. Potvin [Google wizard] says certain highly sensitive code—stuff akin to the Google’s PageRank search algorithm—resides in separate repositories only available to specific employees. And because they don’t run on the ‘net and are very different things, Google stores code for its two device operating systems—Android and Chrome—on separate version control systems. But for the most part, Google code is a monolith that allows for the free flow of software building blocks, ideas, and solutions.

No lightning strikes are expected. What are the odds of simultaneous lightning strikes at multiple data centers? Not worth worrying about this unlikely disaster scenario. Earthquake? Nah. Sabotage? Nah.

No single point of failure for the Alphabet Google thingy. Cloud services just do not lose data most of the time. The key word is “most.”

Stephen E Arnold, September 18, 2015

US Government Outdoes Squarespace and Weebly

September 18, 2015

The ability of the US government to innovate is remarkable. I learned this in “18F’s Federalist Helps Agencies Build Websites Faster.” You, gentle reader, probably know that 18F refers to the street on which the ever-efficient General Services Administration, part of the White House’s Executive Branch, works its wonders. In addition to a big courtyard, the 18 F Street facility also has an auditorium which sometimes floods, thus becoming a convenient swimming pool for the tireless innovators laboring in the structure a short walk from the president’s Oval Office.

The write up explained to me:

Currently in its first phase of software testing, the Federalist [the US government’s Web site builder] “automates common tasks for integrating GitHub, a content editor and Amazon Web Services,” so that web developers can manage and create new static websites on one consolidated platform, 18F said in a post on GSA.gov. The toolkit is equipped with a collection of static-site templates and a web-based content editor, allowing agencies to easily add and create section 508-compliant content while cutting the cost of designing an entirely new site or standing up a content management system.

When I read this, I thought about Squarespace, Weebly, and other services which have been providing similar functions, often for free, for many years.

The write up pointed out:

The platform is intended to be a faster, less expensive and more efficient option for developers building static sites and agencies without the website expertise.  According to 18F, Federalist uses the same scalable content delivery strategy developed for healthcare.gov and the recently launched College Scorecard.

Obviously using one of the existing free or low-cost commercial services was inappropriate. The next project will be reinventing the wheel and using vulcanized rubber, not polymers. The road map also calls for a multi-year study of fire.

Stephen E Arnold, September 18, 2015

Svelte Python Web Crawler

September 17, 2015

Short honk: Looking for a compact, lean Web crawler? Navigate to “A Web Crawler With Asyncio Coroutines.” One of the code wizards is Guido van Rossum. You, gentle reader, are probably aware that Mr. Van Rossum was the author of Python. He is a former Xoogler. The presentation of the crawler is a bit like a box of Legos. You will be rewarded.
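
For a taste of the coroutine approach, here is a minimal sketch of concurrent fetching with asyncio and the aiohttp package. It is my own toy, not the code from the article, and the seed URL and concurrency limit are illustrative.

    # Toy asyncio fetcher in the spirit of the article (not the authors' code).
    # Assumes Python 3.5+ and the third-party aiohttp package.
    import asyncio
    import aiohttp

    async def fetch(session, url, semaphore):
        # The semaphore caps concurrent requests so the crawler stays polite.
        async with semaphore:
            async with session.get(url) as response:
                return await response.text()

    async def crawl(seed_urls, max_concurrency=10):
        semaphore = asyncio.Semaphore(max_concurrency)
        async with aiohttp.ClientSession() as session:
            tasks = [fetch(session, url, semaphore) for url in seed_urls]
            pages = await asyncio.gather(*tasks, return_exceptions=True)
            return dict(zip(seed_urls, pages))

    if __name__ == "__main__":
        loop = asyncio.get_event_loop()
        results = loop.run_until_complete(crawl(["https://example.com"]))
        for url, page in results.items():
            print(url, len(page) if isinstance(page, str) else page)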

Stephen E Arnold, September 17, 2015
