China and That Old Time Religion: Oil and Water?

September 22, 2021

Chairman Mao Zedong’s China embraced Karl Marx’s infamous dictum that “religion is the opiate of the people.” Since the communist takeover, the country’s government has not sanctioned any religion. In short, China does not like religion at all: not Christianity, Judaism, Hinduism, Buddhism, or Islam.

Islam is a hot-button issue for China because of its persecution of the Uyghur Muslims. China has not formally acknowledged the Uyghur genocide. China does not like the Uyghurs because this Muslim minority sets itself apart from the mainstream Chinese population. Under the Chinese government, all people are equal and the same, and the government does not like it when people separate themselves into ethnic or religious groups. Uyghur adults are being sent to extermination camps, while Uyghur children are separated from their parents and reeducated. China’s population crisis is another issue.

China banning a Koran reader app is not any different from banning the Bible, Torah, or other religious texts. China notoriously bans literature and other media that the government finds contrary to its ideals. A developer named Ameir tweeted that after he uploaded his Koran reader to the Chinese Apple App Store, he was told:

“I got notified from Apple that the Quran Reader has been removed from sale in China because it has ‘content that is illegal in China as determined by the Cyberspace Administration of China.’ It’s literally just the Quran.”

Another user replied that China does not allow the Bible online either.

Whitney Grace, September 22, 2021

Enlightened Newspaper Deletes Info

September 21, 2021

News media outlets usually post a retraction or correction if they delete something. The Daily Dot tattles on a popular British news outlet that deleted content without one: “‘This Is Astonishing’: The Guardian Removed A TERF-Critical Passage From An Article.” What is even more upsetting is that the Guardian removed the passage a few hours after it was posted.

The article in question was an interview with gender theorist Judith Butler, author of Gender Trouble: Feminism and the Subversion of Identity. The interview included Butler’s observations about a partnership between fascists and trans exclusionary radical feminists (TERFs), or anti-trans feminists. The Guardian did post an editorial note saying the piece was changed on September 7, 2021. The deleted portion was associated with an incident at Wi Spa in Los Angeles, where a purported trans woman was in the women’s-only nude section. That individual had been charged with indecent exposure in front of women and children in the past.

Jules Gleeson, the article’s author, asked a question that referenced the Wi Spa incident, but Butler’s response was a general answer and did not mention the spa. Gleeson offered to rewrite the article, but The Guardian declined. The entire interview has fallen victim to the Streisand effect: it has become popular because the Guardian tried to cover it up:

“In an email to the Daily Dot, Gleeson confirmed that she offered to revise the question. ‘Unfortunately, the Guardian editors decided to go ahead with their decision to censor Judith Butler,’ she said. ‘I can only hope that the overall point Judith Butler was making can receive some wider circulation, in light of this controversy,’ she continued. ‘The Heritage Foundation and Proud Boys (and those who collaborate with them) are threats to us that deserve more than online intrigue and editorial backpedalling.’”

The British media leans toward an anti-trans opinion, so the deleted passage upset readers. Gleeson’s point stands: the controversy draws more attention to trans people’s struggles and to approaching the trans-rights discussion with intellectual curiosity.

Whitney Grace, September 21, 2021

Wiki People: One Cannot Find Online Information If It Is Censored

September 2, 2021

Women have borne the brunt of erasure from history, but thanks to web sites like Wikipedia, their stories are shared more than ever. There is a problem with Wikipedia though, says CBC in the article: “Canadian Nobel Scientist’s Deletion From Wikipedia Points To Wider Bias, Study Finds.” Wikipedia is the most comprehensive, collaborative, and largest encyclopedia in human history. It is maintained by thousands of volunteer editors, who curate the content, verify information, and delete entries.

There are different types of Wikipedia editors. One type is the “inclusionist,” an editor who takes a broad view of what belongs in Wikipedia. The second type is the “deletionist,” who holds content to a higher bar. American sociologist Francesca Tripodi researched the pages editors deleted and discovered that women’s pages are deleted more often than men’s. Tripodi found that biographies of women made up 25% of the pages recommended for deletion, even though they account for only 19% of Wikipedia’s profiles.

Experts say it is either gender bias or a notability problem. Notability is the gauge Wiki editors use to determine whether a topic deserves a page, and they weigh notability against coverage in reliable sources. What makes a topic notable, Tripodi explained, leads to gender bias, because there is less published information about women. It also does not help that most editors are men, though there are attempts to add more women:

“Over the years, women have tried to fix the gender imbalance on Wikipedia, running edit-a-thons to change that ratio. Tripodi said these efforts to add notable women to the website have moved the needle — but have also run into roadblocks. ‘They’re welcoming new people who’ve never edited Wikipedia, and they’re editing at these events,’ she said. ‘But then after all of that’s done, after these pages are finally added, they have to double back and do even more work to make sure that the article doesn’t get deleted after being added.’”

Unfortunately, women editors complain they must do extra work to make sure their profiles are verifiable and stay published. The Wikimedia Foundation acknowledges the lack of pages about women, saying it reflects the world’s gender biases. The Foundation, however, is committed to increasing the number of pages about women and the number of women editors. The number of women editors has increased over 30% in the past year.

That is the problem when there is a lack of verifiable data about women or anyone erased from history due to biases. If there is not any information on them, they cannot be searched even by trained research librarians like me. Slick method, right?

Whitney Grace, September 2, 2021

Thailand Does Not Want Frightening Content

August 6, 2021

The prime minister of Thailand is Prayut Chan-o-cha. He is a retired Royal Thai Army officer, and he is not into scary content. What’s the fix? “PM Orders Internet Blocked For Anyone Spreading Info That Might Frighten People” reported:

Prime Minister Prayut Chan-o-cha has ordered internet service providers to immediately block the internet access of anyone who propagates information that may frighten people. The order, issued under the emergency situation decree, was published in the Royal Gazette on Thursday night and takes effect on Friday. It prohibits anyone from “reporting news or disseminating information that may frighten people or intentionally distorting information to cause a misunderstanding about the emergency situation, which may eventually affect state security, order or good morality of the people.”

So what’s “frightening?” I for one find the idea of having access to the Internet blocked frightening. Why not just put the creator of frightening content in one of Thailand’s exemplary and humane prisons? These, as I understand the situation, feature ample space, generous prisoner care services, and healthful food. With an occupancy level of 300 percent, what’s not to like?

Frightening, so take it offline I guess.

Stephen E Arnold, August 6, 2021

Facebook Lets Group Admins Designate Experts. Okay!

August 2, 2021

Facebook once again enlists the aid of humans to impede the spread of misinformation, only this time it has found a way to avoid paying anyone for the service. Tech Times reports, “Facebook Adds Feature to Let Admin in Groups Chose ‘Experts’ to Curb Misinformation.” The move also has the handy benefit of shifting responsibility for bad info away from the company. We wonder—what happened to that smart Facebook software? The article does not say. Citing an article from Business Insider, writer Alec G. does tell us:

“The people who run the communities on Facebook now have the authority to promote individuals within its group to gain the title of ‘expert.’ Then, the individuals dubbed as experts can be the voices of which the public can then base their questions and concerns. This is to prevent misinformation plaguing online communities for a while now.”

But will leaving the designation of “expert” up to admins make the problem worse instead of better? The write-up continues:

“The social platform now empowers specific individuals inside groups who are devoted to solely spreading misinformation-related topics. The ‘Stop the Steal’ group, for example, was created in November 2020 with over 365,000 members. They were convinced that the election for the presidency was a fraud. If Facebook didn’t remove the group two days later, it would continue to have negative effects. Facebook explained that the organization talked about ‘the delegitimization of the election process,’ and called for violence, as reported by the BBC. Even before that, other groups within Facebook promoted violence and calls to action that would harm the civility of the governments.”

Very true. We are reminded of the company’s outsourced Oversight Board created in 2018, a similar shift-the-blame approach that has not worked out so well. Facebook’s continued efforts to transfer responsibility for bad content to others fail to shield it from blame. They also do little to solve the problem and may even make it worse. Perhaps it is time for a different (real) solution.

Cynthia Murrell, August 2, 2021

Putin Has Kill Switch

July 26, 2021

“Russia Disconnected Itself from the Global Internet in Tests” shares an intriguing factoid. Mr. Putin can disconnect the country from the potato fields near Estonia to the fecund lands where gulags once bloomed. The write up reports:

State communications regulator Roskomnadzor said the tests were aimed at improving the integrity, stability and security of Russia’s Internet infrastructure…

If a pesky cyber gang shuts down the Moscow subway from Liechtenstein, it’s pull-the-plug time. The idea is that Russia will not have to look outside its territory to locate the malefactors. If outfits like Twitter refuse to conform to Russian law, the socially responsible company may lose some of its Russian content creators.

What other countries will be interested in emulating Russia’s action or licensing the technology? I can think of a few. The Splinter Net is starting to gain momentum. Those ideals about information wanting to be free and the value of distributed systems seem out of step with Mr. Putin’s kill switch.

Stephen E Arnold, July 26, 2021

Russia: Getting Ready for Noose Snugging

June 23, 2021

Tens of thousands of Russian citizens have taken to the streets in protest and the government is cracking down. On social media platforms, that is. Seattle PI reports, “Russia Fines Facebook, Telegram Over Banned Content.” The succinct write-up specifies that Facebook was just fined 17 million rubles (about $236,000) and messaging app Telegram 10 million rubles ($139,000) by a Moscow court. Though it was unclear what specific content prompted these latest fines, this seems to be a trend. We learn:

“It was the second time both companies have been fined in recent weeks. On May 25, Facebook was ordered to pay 26 million rubles ($362,000) for not taking down content deemed unlawful by the Russian authorities. A month ago, Telegram was also ordered to pay 5 million rubles ($69,000) for not taking down calls to protest. Earlier this year, Russia’s state communications watchdog Roskomnadzor started slowing down Twitter and threatened it with a ban, also over its alleged failure to take down unlawful content. Officials maintained the platform failed to remove content encouraging suicide among children and containing information about drugs and child pornography. The crackdown unfolded after Russian authorities criticized social media platforms that have been used to bring tens of thousands of people into the streets across Russia this year to demand the release of jailed Russian opposition leader Alexei Navalny, President Vladimir Putin’s most well-known critic. The wave of demonstrations has been a major challenge to the Kremlin. Officials alleged that social media platforms failed to remove calls for children to join the protests.”

Yes, Putin would have us believe it is all about the children. He has expressed to the police his concern for young ones who are tempted into “illegal and unsanctioned street actions” by dastardly grown-ups on social media. His concern is touching.

Beyond Search thinks Mr. Putin’s actions are about control. An article in Russian named “Sovereign DNS Is Already Here and You Haven’t Noticed” provides information that suggests Mr. Putin’s telecommunications authority has put the machinery in place to control Internet access within Russia.

Fines may be a precursor to more overt action against US companies and content the Russian authorities deem inappropriate.

Cynthia Murrell, June 23, 2021

China: More Than a Beloved Cuisine, Policies Are Getting Traction Too

June 16, 2021

As historical information continues to migrate from physical books to online archives, governments are given the chance to enact policies right out of Orwell’s 1984. And why limit those efforts to one’s own country? Quartz reports that “China’s Firewall Is Spreading Globally.” The crackdown on protesters in Tiananmen Square on June 4, 1989 is a sore spot for China. It would rather that those old enough to remember it forget, and that those too young to have seen it on the news never learn about it. The subject has been taboo within the country since it happened, but now China is harassing the rest of the world about it and other sensitive topics. Worse, the efforts appear to be working.

Writer Jane Li begins with the plight of activist group 2021 Hong Kong Charter, whose website is hosted by Wix. The site’s mission is to build support in the international community for democracy in Hong Kong. Though its authors now live in countries outside China and Wix is based in Israel, China succeeded in strong-arming Wix into taking the site down. The action did not stick—the provider apologized and reinstated the site after being called out in public. However, it is disturbing that it was disabled in the first place. Li writes:

“The incident appears to be a test case for the extraterritorial reach of the controversial national security law, which was implemented in Hong Kong one year ago. While Beijing has billed the law as a way to restore the city’s stability and prosperity, critics say it helps the authorities to curb dissent as it criminalizes a broad swathe of actions, and is written vaguely enough that any criticism of the Party could plausibly be deemed in violation of the law. In a word, the law is ‘asserting extraterritorial jurisdiction over every person on the planet,’ wrote Donald Clarke, a professor of law at George Washington University, last year. Already academics teaching about China at US or European universities are concerned they or their students could be exposed to greater legal risk—especially should they discuss Chinese politics online in sessions that could be recorded or joined by uninvited participants. By sending the request to Wix, the Hong Kong police are not only executing the expansive power granted to them by the security law, but also sending a signal to other foreign tech firms that they could be next to receive a request for hosting content offensive in the eyes of Beijing.”

One nation attempting to seize jurisdiction around the world may seem preposterous, but Wix is not the only tech company to take this law seriously. On the recent anniversary of the Tiananmen Square crackdown, searches for the event’s iconic photo “tank man” turned up empty on MS Bing. Microsoft blamed it on an “accidental human error.” Sure, that is believable coming from a company that is known to cooperate with Chinese censors within that country. Then there was the issue with Google-owned YouTube. The US-based group Humanitarian China hosted a ceremony on June 4 commemorating the 1989 event, but found the YouTube video of its live stream was unavailable for days. What a coincidence! When contacted, YouTube simply replied there may be a possible technical issue, what with Covid and all. Of course, Google has its own relationship to censorship in China.

Not to be outdone, Facebook suspended the live feed of the group’s commemoration with the auto-notification that it “goes against our community standards on spam.” Right. Naturally, when chastised, the platform apologized and called the move a technical error. We sense a pattern here. One more firm is to be mentioned, though to be fair some of these participants were physically in China: last year, Zoom disabled Humanitarian China’s account mid-meeting after the group hosted its Covid-safe June 4th commemoration on the platform. At least that company did not blame the action on a glitch; it made plain it was at the direct request of Beijing. The honesty is refreshing.

Cynthia Murrell, June 16, 2021

The Addiction Analogy: The Cancellation of Influencer Rand Fishkin

May 5, 2021

Another short item. I read a series of tweets which you may be able to view at this link. The main idea is that an influencer was to give a talk about marketing. The unnamed organizer did not like Influencer Fishkin’s content. And what was that content? Information and observations critical of the outstanding commercial enterprises Facebook and Google. The apparent points of irritation were Influencer Fishkin’s statements to the effect that the two estimable outfits (Facebook and Google) were not “friendly, in-your-corner partners.” Interesting, but for me that was only part of the story.

Here’s what I surmised from the information provided by Influencer Fishkin:

  1. Manipulation is central to the way in which these two lighthouse firms operate in the dark world of online
  2. Both venerated companies function without consequences for their actions designed to generate revenue
  3. The treasured entities apply the model and pattern to “sector after sector.”

Beyond Search loves these revered companies.

But there is one word which casts a Beijing-in-a-sandstorm color over Influencer Fishkin’s remarks. And that word is? Addiction.

The idea is that these cherished organizations use their market position (which some have described as a monopoly setup) and specific content to make it difficult for a “user” of the “free” service to kick the habit.

My hunch is that neither of these esteemed commercial enterprises wants to be characterized as a purveyor of gateway drugs, digital opioids, or artificers who put large monkeys on “users’” backs.

That’s not a good look.

Hence, cancellation is a pragmatic fix, is it not?

Stephen E Arnold, May 5, 2021

Selective YouTube Upload Filtering or Erratic Smart Software?

May 4, 2021

I received some information about a YouTuber named Aquachigger. I watched this person’s eight-minute video in which Aquachigger explained that his videos had been downloaded from YouTube. Then an individual (whom I shall describe as an alleged bad actor) uploaded those Aquachigger videos with the alleged bad actor’s voice over. I think the technical term for this is a copyright violation taco.

I am not sure who did what in this quite unusual recycling of user content. What’s clear is that YouTube has a mechanism to determine if an uploaded video violates Google rules (who really knows what these are other than the magic algorithms, which operate like tireless, non-human Amazon warehouse workers). Allegedly Google’s YouTube digital third-grade-teacher software can spot copyright violations and give the bad actor a chance to rehabilitate an offending video.

According to Aquachigger, his content was appropriated, and then YouTube, via logic which is crystalline to Googlers, notified Aquachigger that his channel would be terminated for copyright violation. Yep, the “creator” Aquachigger would be banned from YouTube, losing ad revenue and subscriber access, because an alleged bad actor took the Aquachigger content, slapped an audio track over it, and monetized it. The alleged bad actor is generating revenue by unauthorized appropriation of another person’s content. The key is that the alleged bad actor generates more clicks than the “creator” Aquachigger.

Following this?

I decided to test the YouTube embedded content filtering system. I inserted a 45 second segment from a Carnegie Mellon news release about one of its innovations. I hit the upload button and discovered that after the video was uploaded to YouTube, the Googley system informed me that the video with the Carnegie Mellon news snip required further processing. The Googley system labored for three hours. I decided to see what would happen if I uploaded the test segment to Facebook. Zippity-doo. Facebook accepted my test video.

What I learned from my statistically insignificant test is that I could formulate some tentative questions; for example:

  1. If YouTube could “block” my upload of the video PR snippet, why could it not block the alleged bad actor’s recycled Aquachigger content?
  2. Why would YouTube block a snippet of a news release video from a university touting its technical innovation?
  3. Why would YouTube create the perception that Aquachigger would be “terminated”?
  4. Would YouTube allow the unauthorized use of Aquachigger content in order to derive more revenue from that content than from the much smaller Aquachigger follower base?

Interesting questions. I don’t have answers, but this Aquachigger incident and my test indicate that consistency is the hobgoblin of some smart software. That’s why I laughed when I navigated to Jigsaw, a Google service, and learned that Google is committed to “protecting voices in conversation.” Furthermore:

Online abuse and toxicity stops people from engaging in conversation and, in extreme cases, forces people offline. We’re finding new ways to reduce toxicity, and ensure everyone can safely participate in online conversations.

I also learned:

Much of the world’s internet users experience digital censorship that restricts access to news, information, and messaging apps. We’re [Google] building tools to help people access the global internet.

Like I said, “Consistency.” Ho ho ho.

Stephen E Arnold, May 4, 2021
