Information: Dark Sides and Bright Sides
April 9, 2013
I find the information revolution semi-bright or semi-dark. I read “Are We Paying Enough Attention to Information Technology’s Dark Side?” My first reaction was, “Nah.” Most outfits are worrying about revenues. Google has to deal with the shift from the Gold Rush money of the desktop era to the lower revenue per click of the mobile world. Microsoft has to worry about the economic impact of its initiatives to nowhere. Smaller outfits in search have been crushed like Convera, squished like Dieselpoint, mired in controversy like Autonomy and Fast Search, or simply unable to make ends meet, deliver a product which works, or get their act together long enough to close a deal.
Paradise Lost may help illuminate the dark sides and bright sides of information. A happy quack to Lapidary Apothegms for reminding me of this phrase.
The concern of the “Dark Side” write up is broader. The big issue is Big Ideas. The piece references high-profile information luminaries like James Clapper, the director of national intelligence, and touches on governmental issues. Here’s the quote I find interesting:
While the idea of lumbering bureaucracies adapting quickly may seem unlikely; it’s entirely possible they’ll adapt just fast enough to remain in place for awhile yet. And instead of quick change, the classic definition of the state will twist and wither. Whether its successor proves good or ill remains to be seen—but if history (and Marc Goodman) is any guide, it’ll be some of each.
The future, then, is a semi-bright and semi-dark situation.
Flows of information, data, and knowledge can erode certain structures. In an organization, as information moves more freely, the old chokepoints are bypassed. The notion which has gripped managers and bureaucrats is that flowing information gives off light, while cutting off that flow casts shadows.
In my experience, information is not neutral. Digitization has its own motive power. In one talk I gave years ago, I pointed out that information breeds more of itself. The image I used in my lecture was a sci-fi decision maker surrounded by Tribbles. Tribbles just kept on making more Tribbles. Tribbles were bad news in the confines of a starship.
Even though I have worked in information-centric businesses and government agencies for decades, I am not sure I understand information. I do not have a clear grasp of its behaviors. Over the years, I have formulated some “laws,” which I describe in some of my writings and talks. A recent example is Arnold’s Law of Vulnerability. In a nutshell, the “law” sums up what our research data suggest: “As the volume of information increases, attack surfaces expand.”
The implication of this “law” is that digital information disconnects from the factual and becomes the propaganda described by Jacques Ellul. A software program which crashes a system or, more important, modifies it in a manner unknown to the system’s developers is a growing problem.
Conflating political movements, digital data, and next-generation systems increases complexity. In short, as “informationizing” proceeds, clear thinking becomes more and more difficult. Thus, we now have to navigate a datascape in which:
- Facts are not facts; even the results of a scientific experiment can be falsified or, more troubling, placed in an “objective” journal as an advertorial
- Systems have minimal ability to detect falsified data from sensors, SMS messages, or data streams which contain signals to which the smart software responds in a Pavlovian way (a sketch of this weakness appears after this list)
- Humans accept the outputs of systems as though those outputs were a reality which corresponds to the actuality of a single individual.
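To make the second bullet concrete, here is a minimal, purely illustrative Python sketch. Every name, threshold, and reading in it is an invention for this example, not anything drawn from the cited write up. It shows why a rule that keys only on a signal’s value responds to spoofed inputs exactly as it responds to genuine ones, and how even a crude provenance check changes the outcome.

```python
# Illustrative only: a naive alerting rule that responds to a signal value
# without asking where the data came from. The threshold and the readings
# below are invented for this sketch.

from dataclasses import dataclass

ALERT_THRESHOLD = 100.0  # assumed trigger level for the "smart" rule


@dataclass
class Reading:
    source: str   # e.g. "sensor-7" or "unknown-sms-gateway" (hypothetical names)
    value: float
    signed: bool  # whether the reading carried a verifiable signature


def naive_rule(reading: Reading) -> str:
    # Pavlovian response: the rule keys on the value alone.
    return "ALERT" if reading.value > ALERT_THRESHOLD else "ok"


def provenance_aware_rule(reading: Reading) -> str:
    # A slightly more skeptical rule: unsigned data is flagged for review
    # instead of triggering an automatic response.
    if not reading.signed:
        return "review: unverified source"
    return naive_rule(reading)


if __name__ == "__main__":
    genuine = Reading(source="sensor-7", value=137.2, signed=True)
    spoofed = Reading(source="unknown-sms-gateway", value=137.2, signed=False)

    # The naive rule cannot tell the two apart.
    print(naive_rule(genuine), naive_rule(spoofed))  # ALERT ALERT
    print(provenance_aware_rule(genuine))            # ALERT
    print(provenance_aware_rule(spoofed))            # review: unverified source
```

The point is not the code itself; it is that a system which never asks where a signal came from will respond to whoever learns to feed it the signal it wants.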
Work needs to be done in the space between the bright and dark of information. Much remains to be done, and not by failed webmasters, azure chip consultants, search engine optimization experts, or unemployed journalists. Perhaps Google’s smart software can just take on the job.