Europe: Privacy Footnote

August 11, 2021

If you are not familiar with Chatcontrol, there’s a mostly useful list of resources on the Digital Human Rights blog. The article “Messaging and Chat Control” offers some context as well as a foreshadowing of the possible trajectory of this EU initiative.

The Chatcontrol legislation meshes with Apple’s recent statement that it would be more proactive and transparent about its monitoring activity. You can get a sense of this action in “Expanded Protections for Children.”

A schism exists between those who want content, whatever it may be, to move freely. On the other side of the gap are those who want to put controls on the flow of digital content.

Observations I noted on a flight home from Washington, DC, on Monday, August 10, 2021, included:

  • Digital content flows accelerate and facilitate some unpleasant facets of human behavior. Vendors have done little since the dawn of “online” to manage corrosive bits. Is it any surprise that, after 50 years, elected officials are trying to take action?
  • The failure to regulate has been a result of a general misunderstanding of the nature of unfettered digital information flows. As I have pointed out, digital content works exactly like glass beads propelled at a rusted fender. Once the rust is gone, keeping the nozzle aimed at the fender blasts the fender away as well. Hence, we have the social fabric in its present and rapidly deteriorating condition.
  • One property of digital information is that those with expertise in digital information can innovate. Thus, there will be workarounds. Some of these will be deployed more rapidly than the filtering and control mechanisms can be updated. I point this out because once a control system is imposed, it becomes increasingly difficult and expensive to keep in tip top shape.

Net net: China has been the pace-setter in this approach to digital information. How easy is it to sketch the trajectory of these long-overdue actions? That’s an interesting question to ponder after a half century spent stumbling into the school room with a mobile phone and a perception that the online-equipped person is a wizard.

Stephen E Arnold, August 11, 2021

A Tiny Idea: Is a New Governmental Thought Shaper Emerging?

August 11, 2021

I read “China’s Top Propaganda Agencies Want to Limit the Role of Algorithms in Distributing Online Content.” What an interesting idea. De-algorithm certain Fancy Dan smart software. Make a human or humanoids responsible for what gets distributed online. Laws apparently are not getting through to the smart software used for certain technology publishing functions. The fix, according to the article, is:

China’s top state propaganda organs, which decide what people can read and watch in the country, have jointly urged better “culture and art reviews” in China partly by limiting the role of algorithms in content distribution, a policy move that could translate into higher compliance costs for online content providers such as ByteDance and Tencent Holdings. The policy guidelines from the Central Propaganda Department of the Communist Party, the Ministry of Culture and Tourism and the State Administration of Radio and Television as well as the China Federation of Literary and Art Circles and Chinese Writers Association, the two state-backed bodies for state-approved artists and authors, mark the latest effort by Beijing to align online content with the state’s agenda and to rein in the role of capital and technology in shaping the country’s minds and mainstream views.

The value of putting a human or humanoids in the target zone is an explicit acknowledgement that “gee, I’m sorry” and “our algorithms are just so advanced my team does not know what those numerical recipes are doing” will not fly or get to the airport.

I am not too interested in the impact of these rules in the Middle Kingdom. What I want to track is how these rules diffuse to nation states which are counting on a big time rail link or money to fund Chinese partners’ projects.

Net net: Chinese government agencies, where monitoring and internal checks and balances are an art form, possibly will make use of interesting algorithms. Commercial enterprises and organizations grousing about China’s rules and regulations will have fewer degrees of freedom. Maybe no freedom at all. Ideas may not be moving from the US East Coast and the West Coast. Big ideas like clipping algorithmic wings are building in China and heading out. Will the idea catch on?

Stephen E Arnold, August 11, 2021

Online Anonymity: Maybe a Less Than Stellar Idea

July 20, 2021

On one hand, there is a veritable industrial revolution in identifying, tracking, and pinpointing online users. On the other hand, there is the confection of online anonymity. The idea is that by obfuscation, using a fake name, or hijacking an account set up for one’s 75-year-old spinster aunt, a person can be anonymous. And what fun some can have when their online actions are obfuscated, whether by cleverness, Tor cartwheels, or more sophisticated methods using free email and “trial” cloud accounts. I am not a big fan of online anonymity for three reasons:

  1. Online makes it easy for a person to listen to one’s internal demons’ chatter and do incredibly inappropriate things. Anonymity and online, in my opinion, are a bit like reverting to 11-year-old thinking, often with an adult’s suppressed perceptions and assumptions about what’s okay and what’s not okay.
  2. Having a verified identity linked to an online action imposes social constraints. The method may not be the same as a small town watching the actions of frisky teens and intervening or telling a parent at the grocery that their progeny was making life tough for the small kid with glasses who was studying Lepidoptera.
  3. Individuals doing inappropriate things are often exposed, discovered, or revealed by friends, spouses angry about a failure to take out the garbage, or a small investigative team trying to figure out who spray painted the doors of a religious institution.

When I read “Abolishing Online Anonymity Won’t Tackle the Underlying Problems of Racist Abuse,” I agreed. The write up states:

There is an argument that by forcing people to reveal themselves publicly, or giving the platforms access to their identities, they will be “held accountable” for what they write and say on the internet. Though the intentions behind this are understandable, I believe that ID verification proposals are shortsighted. They will give more power to tech companies who already don’t do enough to enforce their existing community guidelines to protect vulnerable users, and, crucially, do little to address the underlying issues that render racial harassment and abuse so ubiquitous.

The observation is on the money.

I would push back a little. Limiting online use to those who verify their identity may curtail some of the crazier behaviors online. At this time, fractious behavior is the norm. The continuous erosion of cultural norms, common courtesies, and routine interactions is destructive.

My thought is that changing the anonymity to real identity might curtail some of the behavior online systems enable.

Stephen E Arnold, July 20, 2021

A Good Question and an Obvious Answer: Maybe Traffic and Money?

July 19, 2021

I read “Euro 2020: Why Is It So Difficult to Track Down Racist Trolls and Remove Hateful Messages on Social Media?” The write up expresses understandable concern about the use of social media to criticize athletes. Some athletes have magnetism, and sponsors want to use that “pull” to sell products and services. I remember a technology conference which featured a former football quarterback who explained how to succeed. He did not reference the athletic expertise of a former high school science club member and officer. As I recall, the pitch was working hard, fighting (!), and overcoming a coach calling a certain athlete (me, for example) a “fat slug.” Relevant to innovating in online databases? Yes, truly inspirational and an anecdote from the mists of time.

The write up frames its concern about derogatory social media “posts” this way:

Over a quarter of the comments were sent from anonymous private accounts with no posts of their own. But identifying perpetrators of online hate is just one part of the problem.

And the real “problem”? The article states:

It’s impossible to discover through open-source techniques that an account is being operated from a particular country.

Maybe.

Referencing Instagram (a Facebook property), the Sky story notes:

Other users may anonymise their existing accounts so that the comments they post are not traceable to them in the offline world.

Okay, automated systems with smart software don’t do the job. Will another government bill in the UK help?

The write up does everything but comment about the obvious; for example, my view is that online accounts must be linked to a human and verified before posts are permitted.

The smart software thing, the government law thing, and the humans-making-decisions thing are not particularly efficacious. Why? The online systems permit — if not encourage — anonymity because of money, maybe? That’s a question for the Sky Data and Forensics team. It is:

a multi-skilled unit dedicated to providing transparent journalism from Sky News. We gather, analyse and visualise data to tell data-driven stories. We combine traditional reporting skills with advanced analysis of satellite images, social media and other open source information. Through multimedia storytelling we aim to better explain the world while also showing how our journalism is done.

Okay.

Stephen E Arnold, July 19, 2021

Experts in Information Experience Real Life Entropy: Not Much Fun, Right?

July 8, 2021

“The Internet Is Rotting” is 6,000 words which suggest that the end of “knowledge” is nigh. I am not sure “rotting” is the word I would have used. The subtitle for the write up is quite dramatic:

Too much has been lost already. The glue that holds humanity’s knowledge together is coming undone.

Online has been blasting bits since the late 1960s. A half century later “rot” is evident to the experts who recognize a problem and can provide mostly interesting examples. Here’s one:

This absence of central control, or even easy central monitoring, has long been celebrated as an instrument of grassroots democracy and freedom. It’s not trivial to censor a network as organic and decentralized as the internet. But more recently, these features have been understood to facilitate vectors for individual harassment and societal destabilization, with no easy gating points through which to remove or label malicious work not under the umbrellas of the major social-media platforms, or to quickly identify their sources.

Yep, the example is pretty much everything.

Several observations:

  • Say “Hi” to what happens when “glue” fails in its basic job
  • The elimination of gatekeepers is like pulling rods from a nuclear core. Stuff heats quickly, melts, burns, and eventually decides to take a trip to Entropy World
  • The Internet is a manifestation of online and is, therefore, one smaller component of the datasphere
  • You can’t go home again.

One of the most visible aspects of digitalization is disintermediation. The gatekeepers are sent packing. Everyone’s an expert in online search, including those who think that Google delivers high value, accurate, unbiased information to faculty and students 24×7.

Paper outputs leave “trails.” These trails can be followed, whether by Dr. Eugene Garfield’s citation analysis method or by forensic investigators looking at cancelled checks. Now try to find a hard copy of a technical journal in a public library or an institution of higher education. Then try to locate the backfiles. With the shift to digital there are some challenges in the Pathfinder approach:

  1. Gatekeepers cannot be trusted
  2. Digital content providers can filter content, delete it, or not include items
  3. Users cannot determine what information is on point and what is baloney
  4. Institutional structures which once assumed responsibility for accuracy have become less stable than the basements of Florida high rises
  5. Government entities struggle to perform basic functions. Hey, the IRS with its whizzy computer systems is years behind in processing tax returns.
  6. Kicking back has become the optimal mode for learning. Forget that “hot” approach: note taking, old-fashioned lectures, reading books printed on paper, and writing in longhand.

A cultural shift has occurred. This is not a gerund like rotting. Entropy can be calculated. The math I have done on the back of a 4×6 index card produces one of those cute equations which articulate an infinitesimal approach to the construction of linear models. The outputs of these models will be evidence of racing toward zero.
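Since the essay invokes entropy as something that can be calculated, a tiny illustration may help ground the metaphor. The Python sketch below is my own toy example, not the back-of-the-index-card math mentioned above; it computes the Shannon entropy of a character distribution, and the sample strings are hypothetical.

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Return the Shannon entropy, in bits per symbol, of the characters in text."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical samples: an orderly string versus a maximally mixed one.
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: one symbol, perfect order
print(shannon_entropy("abcdefgh"))  # 3.0 bits: eight equally likely symbols
```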

It won’t take 50 years to get a lot closer to the x axis.

Stephen E Arnold, July 8, 2021

Click Rattling: Tech Giants Explain Their Reality to China

July 6, 2021

Will this end well? Do US technology giants — Google, Facebook, Amazon, Apple and others — believe that operating in concert will alter Chinese policy? “American Internet Giants Hit Back at Hong Kong Doxxing Law” reports that “an industry group representing the largest American Internet companies warned Hong Kong’s government that changes to the city’s data-protection laws could impact companies’ ability to provide services in the city.” [You will have to pay up to read this Gray Lady confection, gentle reader.]

What? “Warn”, “could”, “data protection.”

I must be missing something. Isn’t China a nation state? Its citizens and companies wishing to operate within its boundaries must conform to its rules and regulations, or interesting things happen; for example, mobile death vans and a variation on adult day care.

It’s great that there is a Singapore outfit called the Asia Internet Coalition. I think that collaboration among largely unregulated, money-centric US corporations is able to take place for such noble purposes as selling ads. However, what nuance of “China is a nation state” eludes this association and its US technology company members?

The write up reports:

Shortly after the law was enacted, Facebook, Google and Twitter all said they had suspended responding to data requests from the Hong Kong authorities. Last month, police officers in the city invoked the law to briefly pull down a website that called for unity among expatriate Hong Kongers in the pro-democracy movement.

Will a refusal to respond to a nation state’s requests constitute behavior deemed illegal or seditious by a country like China?

If this news report is on the money, my hunch is that some American technology giants are legends in their own minds. They seem to be acting as if they were real countries, just minus the fungible apparatuses of a country. I have a suggestion. Why doesn’t the Asia Internet Coalition invite the top 12 senior managers of those big US companies to a cruise up the Yangtze? The execs can tour the Shanghai Qingpu Prison and check out the abandoned cities of China’s “forced resettlement” policy.

Issue some warnings in a big news conference before boarding the boat. Warn? Hey, great idea. Issue a news release too. Post on social media. Tweet pictures of interesting structures.

Stephen E Arnold, July 6, 2021

No Internet? No Problem. Well, Maybe a Small Almost Insignificant Hurdle

June 11, 2021

The Internet is an essential tool for modern life, but not everyone in the United States has ready access to broadband services. The US is one of the world’s most developed countries, so how many of its citizens cannot get online? In 2020, the Federal Communications Commission (FCC) estimated that 14.5 million Americans lacked Internet access; however, that number is nowhere near the truth.

The Daily Dot investigates the real number in the article “New Study Shows Digital Divide Is Much Worse Than The Government Says It Is.” BroadbandNow released a report stating that 42 million Americans were unable to access broadband services. The digital divide was a huge concern during the COVID pandemic as remote workers and students were forced to work in fast food parking lots and other locations with free Wi-Fi. BroadbandNow arrived at the 42 million figure as follows:

“BroadbandNow manually checked more than 58,000 addresses using “check availability” tools from 11 large internet service providers (ISPs) to see whether wired or fixed wireless service was available. The addresses were from areas that at least one of those 11 ISPs offered service according to a form the FCC has where ISPs self-report whether broadband is being served

That form, Form 477, has been criticized in the past because if an ISP offers service to just one home in a Census block, the FCC counts that entire area as having access from that provider. That is an issue because many Census blocks can be enormous, and counting one person as having access as serving an entire area leads to over-reporting of availability.”

The report also found that all types of Internet service were over-reported on broadband maps. The maps used to determine broadband access are known to contain errors. The FCC plans to design a new system to more accurately measure broadband needs in the US.
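To make the Form 477 issue concrete, here is a minimal sketch with an invented toy dataset, not BroadbandNow’s or the FCC’s actual data, showing how counting an entire Census block as served when only a single address has service inflates the reported coverage figure.

```python
# Toy illustration of Form 477 style over-reporting (hypothetical data).
# Each block lists its addresses and whether each one can actually get broadband.
blocks = {
    "block_A": [True, False, False, False],   # one served address out of four
    "block_B": [True, True, True, True],
    "block_C": [False, False, False, False],
}

total_addresses = sum(len(a) for a in blocks.values())

# Address-level truth: count only addresses that can actually be served.
served_actual = sum(sum(a) for a in blocks.values())

# Form 477 style: if any address in a block is served, count the whole block.
served_reported = sum(len(a) for a in blocks.values() if any(a))

print(f"Actual coverage:   {served_actual / total_addresses:.0%}")    # 42%
print(f"Reported coverage: {served_reported / total_addresses:.0%}")  # 67%
```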

Congress passed the “Broadband DATA Act” in March 2020, and funds for broadband mapping were included in a COVID relief bill. Despite the need for Internet services, Congress continues to argue over the last administration and waste taxpayer money.

Whitney Grace, June 11, 2021

Google Is Not the Cause of a Decline in Newspaper Revenue

June 9, 2021

At a Google function, I met the founder of Craigslist. Now, in a Silicon Valley way, Google has fingered that individual’s online service as the reason the newspaper industry collapsed. Well, maybe not completely collapsed, but deteriorated enough for the likes of Silicon Valley titans to become the arbiters of truth.

The article “Google Decodes What Actually Led to Fall in Newspaper Revenue” states:

As print media houses struggle to sustain in the digital news era, a Google-led study has revealed that the decline of newspaper revenue is not happening because of Search or social advertising but from the loss of newspaper classifieds to specialist online players.

I believe this. I believe absolutely everything I read online. I am not a thumbtyper or a TikTokker, but I do try.

The analysis from economists at Accenture, commissioned by Google, looks at the revenues of newspapers in Western Europe over nearly two decades to reveal exactly what broke the old business model for newspapers.

The bad news is:

While many readers are not in the habit of paying for access to news, between 2013 and 2018, digital circulation volumes increased by 307 per cent to reach 31.5 million paying subscribers in the Western Europe region, more than offsetting the decline in paid print subscriptions.

The article reports that the Google-funded research revealed:

Google is significantly contributing to that growth. Over the past 20 years, Google has collaborated closely with the news industry and is one of the world’s biggest financial supporters of journalism, providing billions of dollars to support the creation of quality journalism in the digital age…

As I said, I believe everything I read online. And what about that person who created Craigslist? He may regret gobbling down those Googley hors d’oeuvres. Will newspaper publishers? Probably, but those estimable titans of information may choke on the celery stick with weird sand-colored dip.

Stephen E Arnold, June 9, 2021

Misunderstanding Censorship: It Is Not Just Words

June 3, 2021

Popular words now are take down (killing servers), block (filtering users or items on a stop list), cancel (ignoring a person or terminating an API call), and a pride of synonyms like terminate with extreme prejudice. The idea is that censorship is the go-to method to cultivate a more pleasing digital garden. But who owns the garden? The answer is that “ownership” depends on one’s point of view. Big tech has one role to play. Those contributing content in different media have another. The person who reads, listens, or watches “information” gets in the act as well.

The popular words reflect an interesting development. Those “in charge” want to preserve their kingpin role. Those who have an audience want to remain popular and get even more popular if possible. Those users want to consume what they want and will use available tools to satisfy their wants and needs.

In short, censorship seems to be a way for someone in a position to be a gatekeeper to impose a particular view upon information, how something “works” in the datasphere, or what “content” can flow into, through, and out of a 2021 system.

The first example of this imposition of a viewpoint is articulated in “PayPal Shuts Down Long-Time Tor Supporter with No Recourse.” The main point is that an individual who contributed to the Tor project has been “booted” or “terminated with extreme prejudice” from the quasi-bank financial services operation PayPal. The article asserts:

For years, EFF has been documenting instances of financial censorship, in which payment intermediaries and financial institutions shutter accounts and refuse to process payments for people and organizations that haven’t been charged with any crime. Brandt shared months of PayPal transactions with the EFF legal team, and we reviewed his transactions in depth. We found no evidence of wrongdoing that would warrant shutting down his account, and we communicated our concerns to PayPal. Given that the overwhelming majority of transactions on Brandt’s account were payments for servers running Tor nodes, EFF is deeply concerned that Brandt’s account was targeted for shut down specifically as a result of his activities supporting Tor.

Does PayPal the company have strong feelings about software which obfuscates certain online activities? Tor emerged years ago from a government research project. Now it is one of the vehicles allowing some users to engage in cyber crime-like activities. The write up does not dig too deeply into the who, what, when, why, how, and circumstances of “financial persecution.” That’s not surprising because PayPal is a commercial enterprise and can mostly do what it wants. The main point for me is that this type of blocking action has nothing to do with words.

I also want to mention that Amazon Twitch has been wrestling with take downs too. A popular “content creator” named Amouranth was blocked. Also, a 21st century talk show host known as BadBunny was banned. Amouranth’s Twitch stream featured a kiddie pool, an interesting fashion statement in the form of a bathing suit, and lots of eye shadow. BadBunny’s “issue” was related to words. I am not sure what BadBunny is talking about, but apparently the Twitch “proctors” do. So she had to occupy herself with other content creation for two weeks until she was reinstated. At the same time, a content creator named ibabyrainbow (whom I featured in my April National Cyber Crime Conference talk) provides links to Twitch followers who want more intriguing videos of ibabyrainbow’s antics. Thus far, ibabyrainbow has not run afoul of Amazon’s “curators,” but Amazon may not know that ibabyrainbow provides other content on different services under the name of babyrainbow. Some of this content could be considered improper in certain countries.

Then I want to reference a remarkable essay about censorship called “How Censorship Became the New Crisis for Social Networks.” This write up states:

There are two strains of outrage related to censorship currently coursing through the platforms. The first are concerns related to governments enacting increasingly draconian measures to prevent their citizens from expressing dissent…. The second and perhaps more novel strain of outrage over censorship relates not to governments but to platforms themselves.

That’s tidy: a dichotomy, an either-or, good and evil, savage and civilized. Not exactly. I think the reality is messy and generating new complexities as each mouse click or finger swipe occurs.

People generally dislike change. If change is inevitable, some people prefer to experience the change at their own pace. Today the ease with which a threshold can be changed in an algorithm is disconcerting. Questions like “What happened to my Google photos?” and “Why can’t I access my iTunes account?” are part of everyday life. Where’s BadBunny, Mr. Twitch?

My view is that censorship, along with the synonyms used to polish up these actions designed to control information, has been a standard operating procedure for many, many years. Book burning, anyone? The motivation is to ensure that power is retained, money flows, and particular views are promulgated.

The datasphere is magnifying the ease, effectiveness, and intention of managing words, images, and actions. I prefer to think of censorship as “proaction”; that is, taking the necessary steps to allow those with their hands on the knobs and wheels to further their own ends.

Instead of “terminated with extreme prejudice,” employ “proactive measures.” Who is doing it? Maybe China, Iran, North Korea, Russia, and a number of other nation states? What commercial enterprises are practicing proaction? Maybe the FAANGs, the Bezos property Washington Post, the hip digital thing known as the New York Times, and anyone who can direct digital streams to benefit themselves.

Censorship — what I call proaction — is the new normal.

Adapt and avoid dichotomies. That type of thinking is for third graders.

Stephen E Arnold, June 3, 2021

An App Twist: Online Interaction and Dark Patterns May Pose a Threat to Users

May 12, 2021

I don’t know if this write up is spot on, but it does raise some interesting questions. Navigate to “Snapchat Can Be Sued over Its Speed Filter, Which Is Blamed in Death of 3.” The main point is that a popular app provides “points”. One reward is linked to moving rapidly. Examples include bike riding and walking one’s dog. The story points out:

The parents of two of the victims say the filter, which tells users how fast they are moving in real-time, encouraged users to drive recklessly in order to receive achievement points. Now, it appears the 9th U.S. Circuit Court of Appeals agrees that a lawsuit should be permitted. In a ruling on Tuesday, the court argued that Snapchat was not shielded by Section 230 of the Communications Decency Act (CDA), which protects social media companies from being held liable for the content posted by its users. The lawsuit was originally filed in 2019 and had been shot down just last year. But Circuit Judge Kim McLane Wardlaw agreed with the families this week who argued that the lawsuit was aimed at the app itself and not its content.

Was the issue judgment? No, according to the article:

Snapchat has been accused of “negligent design” for implementing the speed filter into its app.

The write up includes this statement from the court:

Their negligent design lawsuit treats Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter),” Wardlaw [the legal eagle hearing the case] wrote.

Here are the questions which crossed my mind:

  • Will “design” emerge as a factor in other litigation related to apps’ use?
  • Is the “reward” idea a Dark Pattern which is coded so that those using the apps are manipulated into certain behaviors?
  • How do innovators respond to “design” centric issues?
  4. Are the parents responsible for their progeny’s judgment? Schools?

On the surface, it seems that app design can lead to tragic consequences.  Life, liberty, and the pursuit of rewards echoes in the Beyond Search office.

Stephen E Arnold, May 12, 2021
