Selective YouTube Upload Filtering or Erratic Smart Software?

May 4, 2021

I received some information about a YouTuber named Aquachigger. I watched this person’s eight-minute video in which Aquachigger explained that his videos had been downloaded from YouTube. Then an individual (whom I shall describe as an alleged bad actor) uploaded those Aquachigger videos with the alleged bad actor’s voice over added. I think the technical term for this is a copyright violation taco.

I am not sure who did what in this quite unusual recycling of user content. What is clear is that YouTube has a mechanism to determine whether an uploaded video violates Google rules (who really knows what these are other than the magic algorithms, which operate like tireless, non-human Amazon warehouse workers). Allegedly Google’s YouTube digital third-grade-teacher software can spot copyright violations and give the bad actor a chance to rehabilitate an offending video.

According to Aquachigger, his content was appropriated, and then YouTube, via logic which is crystalline to Googlers, notified Aquachigger that his channel would be terminated for copyright violation. Yep, the “creator” Aquachigger would be banned from YouTube, losing ad revenue and subscriber access, because an alleged bad actor took the Aquachigger content, slapped an audio track over it, and monetized that content. The alleged bad actor is generating revenue by the unauthorized appropriation of another person’s content. The key is that the alleged bad actor generates more clicks than the “creator” Aquachigger.

Following this?

I decided to test the YouTube embedded content filtering system. I inserted a 45-second segment from a Carnegie Mellon news release about one of its innovations. I hit the upload button and discovered that, after the video was uploaded to YouTube, the Googley system informed me that the video with the Carnegie Mellon news snip required further processing. The Googley system labored for three hours. I decided to see what would happen if I uploaded the test segment to Facebook. Zippity-doo. Facebook accepted my test video.

What I learned from my statistically insignificant test is that I could formulate some tentative questions; for example:

  1. If YouTube could “block” my upload of the video PR snippet, would YouTube be able to block the alleged bad actor’s recycled Aquachigger content?
  2. Why would YouTube block a snippet of a news release video from a university touting its technical innovation?
  3. Why would YouTube create the perception that Aquachigger would be “terminated”?
  4. Would YouTube be allowing the unauthorized use of Aquachigger content in order to derive more revenue from that content than it earns from the much smaller Aquachigger follower base?

Interesting questions. I don’t have answers, but this Aquachigger incident and my test indicate that consistency is the hobgoblin of some smart software. That’s why I laughed when I navigated to Jigsaw, a Google service, and learned that Google is committed to “protecting voices in conversation.” Furthermore:

Online abuse and toxicity stops people from engaging in conversation and, in extreme cases, forces people offline. We’re finding new ways to reduce toxicity, and ensure everyone can safely participate in online conversations.

I also learned:

Much of the world’s internet users experience digital censorship that restricts access to news, information, and messaging apps. We’re [Google] building tools to help people access the global internet.

Like I said, “Consistency.” Ho ho ho.

Stephen E Arnold, May 4, 2021
