Easier Targets for Letter Signers: Joe Rogan and Spotify

January 13, 2022

YouTube received a missive from fact checkers exhorting the online ad giant to do more to combat misinformation. Ah, would there were enough fact checkers. YouTube, despite having lots of money, is an easier target for government regulators. Poke Googzilla in the nose or pull its charming tail, and the beast does a few legal thrashes and then outputs money. France and Russia love this beast baiting. Fact checkers? Not exactly in the same horsepower class as the country with fancy chickens or hearty Siberians wearing hats made of furry creatures.

I noted “Scientists, Doctors Call on Spotify to Implement Misinformation Policy Over Claims on Joe Rogan Show.” Spotify is not yet a Google-type operation. Furthermore, the point of concern is a person who was a paid cheerleader for that outstanding and humane sporting activity, mixed martial arts. My recollection is that Mr. Rogan received some contractual inducements to provide content to the music service and cable TV wannabe. He allegedly has a nodding acquaintance with intravenous vitamin drips, creatine, and fish oil. You can purchase mugs from which one can guzzle quercetin liquid. Yum yum yum. Plus you can enjoy these Rogan-centric products wearing a Joe Rogan T shirt. (Is that a mystic symbol or an insect on the likeness’s forehead?)


The write up states:

More than 260 doctors, nurses, scientists, health professionals and others have signed an open letter calling on the streaming media platform Spotify to “implement a misinformation policy” in the wake of controversy over podcaster Joe Rogan’s promotion of an anti-vaccine rally with discredited scientist Robert Malone in an episode published on December 31st. Rogan has repeatedly spread vaccine misinformation and discouraged vaccine use. The December episode attracted attention in part because Dr. Malone falsely claimed millions of people were “hypnotized” to believe certain facts about COVID-19, and that people standing in line to get tested as the omicron variant has driven record new cases of the virus was an example of “mass formation psychosis,” a phenomenon that does not exist.

Impressive. The hitch in the git along is that Mr. Rogan attracts more eyeballs and listeners than some mainstream news outlets. He is an entertainer, and one might make the case that he is a comedian, pulling the leg of guests and of some listeners. I think of him as an intellectual Adam Carolla. Note that I am aware of the academic credentials of both of these stars.

The larger issue is that these letters beef up the résumés of the publicists working on these missives. Arguments and discussions in online fora whip up eddies of concern.

There are a few problems:

  1. Misinformation, disinformation, and reformation of factual data are standard functions of the human.
  2. Identifying and offering counter arguments depends upon one’s point of view.
  3. Spotify receives content and makes it available. Conduits are not well positioned to modify, in near real time, what an entertainer does before the entertainer entertains.

Why not tell Spotify to drop Mr. Rogan? Money, contracts, and the still functional freedom of speech thing.

Will more letters arrive this week? My hunch is that the French, Russian, et al. approach might ultimately be more pragmatic. Whom does the publicity for the control-Rogan letter benefit?

Maybe Mr. Rogan?

Stephen E Arnold, January 13, 2022

Some Want YouTube to Check Facts: A Fantastical Idea

January 12, 2022

I wanted to look up a function for the DaVinci Resolve “scripting” feature. I spotted a YouTube video about the subject. The information in the video was incorrect. Is Google responsible for this factual misstep? Is Resolve’s maker Blackmagic Design going to rush to the editing room to create an accurate programming video? Will DaVinci users revolt, hold a protest, burn a pile of Blackmagic video switchers? Nope.

“An Open Letter to YouTube’s CEO from the World’s Fact Checkers” states:

What we do not see is much effort by YouTube to implement policies that address the problem [Covid information]. On the contrary, YouTube is allowing its platform to be weaponized by unscrupulous actors to manipulate and exploit others, and to organize and fundraise themselves. Current measures are proving insufficient. That is why we urge you to take effective action against disinformation and misinformation, and to elaborate a roadmap of policy and product interventions to improve the information ecosystem – and to do so with the world’s independent, non-partisan fact-checking organizations.

Okay, facts about Covid. How are those “facts” about Covid weathering the often conflicting flow of data? Government officials and Covid experts descend into primary school playground arguments. I love the use of visual aids too. What about the factual errors in many videos on YouTube? Who exactly is able to identify an error and take or recommend a specific action?

This is a fantastical idea, and it is one that may lead to online discussions, legal kerfuffles, and some videos being removed.

The notion of free hosting and streaming of videos means that unless YouTube gates the uploads or starts charging for storage and streaming, the volume is likely to overwhelm the world’s fact checkers. My hunch is that there are more wannabe YouTube stars than fact checkers. Perhaps Google’s stellar content-divining machinery will automate the process using close-enough-for-horseshoes methods? Perhaps Google will just hire an editorial team and operate in the manner of the late and much-needed traditional newspaper industry, despite the taint of yellow journalism, advertorials, hobby horses, and reportorial bias.

Net net: Nice letter but after a meeting the missive will be handed over to Google legal and PR. YouTube shall accept and stream as usual.

Stephen E Arnold, January 12, 2022

Russia May Not Contribute to the Tor Project in 2022

December 28, 2021

This is probably not a surprise to those involved with the Tor Project. We noted some evidence of Russia’s view of anonymized Internet browsing in “Russia Blocks Privacy Service Tor In Latest Move To Control Internet.” The article reports:

Russia’s media regulator has blocked the online anonymity service Tor in what is seen as the latest move by Moscow to bring the Internet in Russia under its control. Roskomnadzor announced it had blocked access to the popular service on December 8, cutting off users’ ability to thwart government surveillance by cloaking IP addresses.

The Tor Project responded with some tech tips for ways to get around the Putin partition. (Think Tor bridge. Some details are at this link.)
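For the curious, a Tor bridge is enabled in the client’s torrc file with a handful of directives. The fragment below is a sketch, not a working configuration: the bridge line is a placeholder (real bridge addresses, fingerprints, and certificates come from the Tor Project’s bridge distribution service), and the plugin path is a typical Linux location.

```shell
# torrc sketch — connect through an unlisted bridge relay instead of
# a publicly known (and therefore blockable) entry node
UseBridges 1

# obfs4 obfuscates Tor traffic so it is harder to fingerprint;
# the obfs4proxy binary must be installed separately
ClientTransportPlugin obfs4 exec /usr/bin/obfs4proxy

# Placeholder only — substitute a real bridge line obtained from
# the Tor Project (address:port, fingerprint, cert, iat-mode)
Bridge obfs4 192.0.2.1:443 FINGERPRINT cert=PLACEHOLDER iat-mode=0
```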

Does this mean that Russia has no interest in Tor? Nope. We think that some of Mr. Putin’s fellow travelers are hosting Tor relay servers, but that’s just something we heard from a person yapping about freedom.

What’s next? How about blocking any service originating in nation states not getting with Mr. Putin’s Ukrainian program? It is unlikely that Sergey Brin’s flight on a Russian rocket ship will become a reality in 2022. We also heard that the Google Cloud hosts some services that Mr. Putin thinks may erode the freedoms enjoyed by Russian citizens.

Stephen E Arnold, December 28, 2021

Content Control: More and More Popular

December 7, 2021

A couple of recent articles emphasize that at least some effort is being made to control harmful content on social media platforms. Are these examples of responsible behavior or censorship? We are not sure. First up, a resource content creators may wish to bookmark—“5 Banned Content Topics You Can’t Talk About on YouTube” from MakeUseOf. Writer Joy Okumoko goes into detail on banned topics, from spam and deception to different types of sensitive or dangerous content. Check it out if you are curious about what will get a YouTube video taken down or an account suspended.

We also note an article at Engadget, “Personalized Warnings Could Reduce Hate Speech on Twitter, Researchers Say.” Researchers at NYU’s Center for Social Media and Politics set up Twitter accounts and used them to warn certain users that their language could get them banned. Just a friendly caution from a fellow user. Their results suggest such warnings could actually reduce hateful language on the platform. The more polite the warnings, the more likely users were to clean up their acts. Imagine that—civility begets civility. Reporter K. Bell writes:

“They looked for people who had used at least one word contained in ‘hateful language dictionaries’ over the previous week, who also followed at least one account that had recently been suspended after using such language. From there, the researchers created test accounts with personas such as ‘hate speech warner,’ and used the accounts to tweet warnings at these individuals. They tested out several variations, but all had roughly the same message: that using hate speech put them at risk of being suspended, and that it had already happened to someone they follow. … The researchers found that the warnings were effective, at least in the short term. ‘Our results show that only one warning tweet sent by an account with no more than 100 followers can decrease the ratio of tweets with hateful language by up to 10%,’ the authors write. Interestingly, they found that messages that were ‘more politely phrased’ led to even greater declines, with a decrease of up to 20 percent.”
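The screening step the researchers describe—flagging accounts that used at least one word from a “hateful language dictionary”—amounts to simple lexicon matching. A minimal sketch follows; the word list, function names, and sample data are invented stand-ins for illustration, not the study’s actual dictionary or code.

```python
import re

# Hypothetical stand-in for a hateful-language dictionary; real
# studies use curated lexicons with hundreds of entries.
HATE_LEXICON = {"slurword1", "slurword2"}

def uses_hate_term(tweet: str) -> bool:
    """Return True if any lexicon word appears as a whole token."""
    tokens = re.findall(r"[a-z0-9']+", tweet.lower())
    return any(tok in HATE_LEXICON for tok in tokens)

def flag_accounts(tweets_by_user: dict) -> set:
    """Collect users with at least one tweet matching the lexicon."""
    return {
        user
        for user, tweets in tweets_by_user.items()
        if any(uses_hate_term(t) for t in tweets)
    }
```

Note that whole-token matching (rather than substring search) avoids flagging innocent words that happen to contain a lexicon entry, a classic false-positive source in keyword filters.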

The research paper suggests such warnings may be even more effective if they came from Twitter itself or from another organization instead of their small, 100-follower accounts. Still, lead researcher Mustafa Mikdat Yildirim suspects:

“The fact that their use of hate speech is seen by someone else could be the most important factor that led these people to decrease their hate speech.”


Cynthia Murrell, December 7, 2021

MIT: Censorship and the New Approach to Learning

October 27, 2021

MIT is one of the top science and technology universities in the world. Like many universities in the United States, MIT has had its share of controversial issues related to cancel culture. The Atlantic discusses the most recent incident in the article, “Why The Latest Campus Cancellation Is Different.”

MIT invited geophysicist Dorian Abbot to deliver the yearly John Carlson Lecture about his new climate science research. When MIT students heard Abbot was invited to speak, they campaigned to disinvite him. MIT’s administration caved and Abbot’s invitation was rescinded. Unlike other cancel culture issues, when MIT disinvited Abbot it was not because he denied climate change or committed a crime. Instead, he gave his opinion about affirmative action and other ways minorities have advantages in college admission.

Abbot criticized affirmative action along with legacy and athletic admissions, which favor white applicants. He then compared these admission processes to 1930s Germany, and that is a big no-no:

“Abbot seemingly meant to highlight the dangers of thinking about individuals primarily in terms of their ethnic identity. But any comparison between today’s practices on American college campuses and the genocidal policies of the Nazi regime is facile and incendiary.

Even so, it is patently absurd to cancel a lecture on climate change because of Abbot’s article in Newsweek. If every cringe worthy analogy to the Third Reich were grounds for canceling talks, hundreds of professors—and thousands of op-ed columnists—would no longer be welcome on campus.”

Pew Research shows that the majority of Americans believe merit-based admissions and hiring is the best system. Even the liberal state of California voted to uphold a ban on affirmative action.

MIT’s termination of the Abbot lecture may be an example of how leading universities define learning, information, and discussion. People are no longer allowed to have opposing or controversial beliefs if it offends someone. It harms not only an academic setting, especially at a research heavy university like MIT, but all of society.

It is also funny that MIT was quick to cancel Abbot but happily accepted money from Jeffrey Epstein. Interesting.

Whitney Grace, October 27, 2021

Apple: Oh, One More Thing

October 15, 2021

I am not sure about the British Broadcasting Corporation. It’s a news maker, not just a news purveyor. Let’s assume that “Apple Takes Down Koran App in China” is on the money. The write up asserts:

Quran Majeed is available across the world on the App Store – and has nearly 150,000 reviews. However, Apple removed the app at the request of Chinese officials, for hosting illegal religious texts, the company said.

The Beeb tried its best to contact China and the iPhone outfit, saying:

Apple declined to comment, but directed the BBC to its Human Rights Policy, which states: “We’re required to comply with local laws, and at times there are complex issues about which we may disagree with governments.”

I find this interesting. If information is not available online, does it exist? If an entity removes content from an online service, how does one know that content has gone missing? Allegedly Microsoft faced a China moment. What did that company do? Disappeared the service. Here’s a report, but it will cost you money to read the news. (For a person without resources, is there a difference between disappearing content and services and censorship?)

That’s two more things but maybe just one at the end of a very long week in which more and more people are functioning as skilled, informed, and professional reference librarians.

Stephen E Arnold, October 15, 2021

A Compliance Hat Trick?

October 14, 2021

Apple, Google, and Microsoft have scored. I read “LinkedIn Caves Again, Blocks US Journalists’ Accounts in China.” I noted this passage:

LinkedIn — the business-oriented social media platform owned by Microsoft — has spent the last few years increasing its compliance with the Chinese government’s demands for censorship.

The write up points out that a reporter for Axios, another with-it online information service, has been disappeared.

The cited article provides links and more color for the Chinese action.

It appears that major US technology companies are complying with guidelines and regulations in the countries in which they operate. Why?

One possible answer is revenue. Another may be a desire to avoid legal consequences for the firms’ in-country employees.

It seems reasonable to conclude that the era of the Wild West Internet is ending. Some large countries want to manage certain aspects of information and data flows.

Is this a good thing or a bad thing? The answer depends on one’s point of view, where one lives, and how one generates revenue/income.

Stephen E Arnold, October 14, 2021

Google and the Russian Law: A Mismatch

October 8, 2021

I think this may have been a social visit. You know. A couple of people who wanted to snag a Google mouse pad or one of those blinking Google lapel pins. “Court Marshals Visit Google’s Moscow Office to Enforce Censorship Decision” asserts in “real” news fashion:

In the run-up to Russia’s parliamentary elections on Sept. 17-18-19, the Kremlin’s battle against online dissent brings new developments almost daily. Tech giants are not spared, with Google at the forefront earlier this week. Court marshals visited the company’s Moscow office to enforce an injunctive measure to remove the opposition-minded ‘Smart Vote’ site from search results. This online voting recommendation system was designed by the team of jailed Kremlin critic Alexey Navalny.

Yep, just a casual drop in. Fun. Censorship? I think it depends on whom one asks.

What happened? Google and Apple rolled over. I assume that digital countries understand that real countries have some powers that commercial enterprises lack?

Do you remember when Sergey Brin hoped to ride a Russian rocket into space? Not going to happen this week.

Stephen E Arnold, October 8, 2021

Internet Defreedoming: An Emerging and Surging Market Sector

October 6, 2021

Check out “The Global Drive to Control Big Tech.” The information in the article and the report suggest that Internet freedom is decreasing. The write up makes this point and supports it with data and research:

In the high-stakes battle between states and technology companies, the rights of internet users have become the main casualties.

The data can be viewed from a different perspective; namely, censorship is a growth business. Products and services needed to censor content at scale are available, but these are often clunkers or complex add-ins to network components which are under load and often less reliable than a used Lada.

Opportunities include:

  • Repurposed software designed for artificial intelligence operations; for example, identifying and flagging problematic content
  • Workflow software which can automate removal, post flags, or occasionally notify a person that his or her content is problematic
  • Tools for locating objectionable content and triggering removal, logging the issue, and locating other instances of the “problem.”
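In rough outline, the workflow item above—flag, log, notify, and sometimes remove—reduces to mapping a classifier’s score onto a set of actions. A minimal sketch, in which the thresholds and action names are invented for illustration rather than drawn from any real moderation product:

```python
from dataclasses import dataclass, field

# Invented thresholds; a real system would tune these per policy
# category and per jurisdiction.
NOTIFY_THRESHOLD = 0.5
REMOVE_THRESHOLD = 0.9

@dataclass
class Decision:
    item_id: str
    score: float
    actions: list = field(default_factory=list)

def triage(item_id: str, score: float) -> Decision:
    """Map a problem score to workflow actions for one content item."""
    decision = Decision(item_id, score)
    if score >= REMOVE_THRESHOLD:
        # High confidence: take the content down and tell the author.
        decision.actions = ["remove", "log", "notify_author"]
    elif score >= NOTIFY_THRESHOLD:
        # Uncertain: flag for human review and keep a record.
        decision.actions = ["flag", "log"]
    return decision
```

The clunkiness the write up complains about tends to live in exactly this layer: the thresholds are blunt instruments, and everything below the notify line silently passes through.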

After a decade of consistent censorship growth, opportunities abound.

Censorship appears to be a hot business segment.

Stephen E Arnold, October 6, 2021

China and That Old Time Religion: Oil and Water?

September 22, 2021

Karl Marx infamously said, “Religion is the opium of the people,” and Chairman Mao Zedong took the sentiment to heart. Since the communist takeover in China, the country’s government has not sanctioned any religion. In short, China does not like religion at all: not Christianity, Judaism, Hinduism, Buddhism, or Islam.

Islam is a hot button issue for China because of its persecution of Uyghur Muslims. China has not formally acknowledged the Uyghur genocide. China does not like the Uyghurs because this Muslim minority separates itself from the main Chinese population. Under the Chinese government, all people are equal and the same. The government does not like it when people separate themselves into ethnic or religious groups. Uyghur adults are being sent to internment camps, while Uyghur children are separated from their parents and reeducated. China’s population crisis is another issue.

China banning the Koran reader is no different from banning the Bible, Torah, or other religious documents. China notoriously bans literature and other media that the government finds contrary to its ideals. A developer named Ameir tweeted that he uploaded the Koran reader to the Chinese Apple App Store and was told:

“I got notified from Apple that the Quran Reader has been removed from sale in China because it has ‘content that is illegal in China as determined by the Cyberspace Administration of China.’ It’s literally just the Quran.”

Another user replied that China does not allow the Bible online either.

Whitney Grace, September 22, 2021
