Deep Fakes: A Tough Nut to Crack

February 8, 2019

If you are in the media or intelligence business, you undoubtedly already know about the potential of deep fakes or “deepfake” videos: clips that use AI technology to create realistic, completely fake videos from existing footage. The catch is that they are getting more and more convincing…and that’s not good, as we discovered in a recent article, “Misinformation Woes Could Multiply with Deepfake Videos.”

According to the story:

“As the technology advances, worries are growing about how deepfakes can be used for nefarious purposes by hackers or state actors. ‘A well-timed and thoughtfully scripted deepfake or series of deepfakes could tip an election, spark violence in a city primed for civil unrest, bolster insurgent narratives about an enemy’s supposed atrocities, or exacerbate political divisions in a society.’”

What’s “true” and what’s “false” is an issue which may not lend itself to zeros and ones. Google asserts that it is developing software that helps spot deepfakes. Does Google have a solution?

Does anyone?

If an artifact is created and someone labels it “false,” smart software has to decide. Humans, history suggests, struggle with defining the truth.

The problem is likely to be difficult to resolve. Censorship anyone?

Patrick Roland, February 8, 2019

Censorship: An Interesting View

December 7, 2018

I read “Former ‘Guardian’ Editor On Snowden, WikiLeaks And Remaking Journalism.”

I noted this passage:

In the modern world, it is very difficult to prevent good information (and sadly, bad information) … from being published, because it’s like water, and you can’t control it in the way that you could even 50 years ago. [emphasis added]

That 50-year date means that censorship was easy and presumably widely practiced in 1968.


How did I come to know about Prague Spring, the murder of Martin Luther King, the assassination of Senator Robert Kennedy, anti-Vietnam protests, Surveyor 7, Apollo 8’s trip around the moon, the strike in Paris, the Pueblo (remember Mogen David and the Grapes of Wrath), and my getting encouragement in my quest to index Latin sermons?

Telepathy? What did I miss?

Stephen E Arnold, December 7, 2018

Censorship: Deleted and Blocked Content Popular

November 7, 2018

The Internet is a tool, and companies harness the Internet to offer services such as social media, search, news, and commerce. These companies act as portals for users to post their information and content. Section 230 of the Communications Decency Act protects companies from being held liable for their users’ actions. This means that companies cannot be sued or prosecuted for what their users share. This could all change.

Inc. takes a look at how this could change in the article, “Facebook, Google, And Twitter Must Censor The Web, Demand Investors.” Why would this change? Because bad actors use social media and other services for illegal activities. The law that carves into that liability shield is the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), under which Web sites can be held liable for content posted on them. Content posted on, say, Facebook, Twitter, or Google that results in illegal activities could expose those companies to prosecution.

“FOSTA creates a legal precedent to hold Internet providers responsible for user-created content that drives other behaviors. Hate speech might lead to murder and terrorism, for instance. Therefore, it’s easy to imagine that the US government will pass laws similar to FOSTA holding Internet providers legally liable for that content. Other examples of user-content that might face FOSTA-style laws include sexual harassment, racism, fake news, and election interference.”

Investors are not happy about this inevitability and at future shareholder meetings they will demand these companies clean up their acts. Since nobody wants to see CEOs and other employees arrested, investors are pushing for censorship of user-generated content.

This would mean the end of free speech on the Internet, because everyone finds everything and anything offensive. Critics argue it would also trample the spirit of the First Amendment. The backlash is going to be huge, and we cannot wait to see how 4chan, YouTube, and Reddit react.

Whitney Grace, November 7, 2018

Google and Its Smart Software: Stupid?

October 16, 2018

I received an email from the owner of a Web site focused on providing consumers with automobile information. The individual shared with me an email sent to his company by Google’s smart system.

The letter was an AdSense Publisher Policy Violation Report. In short, Google’s smart software spotted an offensive article. The Google document said:

  • New violations were detected. As a result, ad serving has been restricted or disabled on pages where these violations of the AdSense Program Policies were found. To resolve the issues, you can either remove the violating content and request a review, or remove the ad code from the violating pages.

Translating the Google speak: “You are showing ads on a page which contains pornography, contraband, hate speech, etc. Make this right, or no AdSense money for you.”

Okay, I was intrigued. How can information about cars be about porn, contraband, hate speech, etc.?

The offensive item, my colleagues and I determined, was a review of a 2004 Saab 9-3 Arc Convertible, published about 14 years ago. The offense was that the review contained words of a sexual nature.


Does this vehicle and the height of its trunk or boot offend you? If it does, you are not Googley.

I read the review and noted that its author does indeed focus on an automobile. The problem is that the review is a long-tail news story; old content rarely gets clicks. So what’s Google doing? Processing historical data in order to locate porn, contraband, and hate speech? Must be. This suggests that the company is playing catch up. I thought Google was on top of offensive content and had been for more than a decade. Google’s forbidden-word lists have been kicking around for years.


I find this extremely suggestive. Perhaps that is why the reviewer described the tiny rear seating area as needful of a way to “ease rear seat access.” I am not sure my French bulldog would fit in the back seat of this Saab, nor could he engage in hanky panky.

I noted that the Saab convertible has a “high rear.” Looking at the picture, it looks as if the mechanical engineers did increase the height of the trunk or boot in order to accommodate the folding hard top for this model Saab. I would not have thought the phrase “high rear” was sexual, because I was reading about how the engineering team accommodated the solid convertible top. Who reads about trunk lids or boots as a sexual reference?

But wait. There’s more lingo about the car described about 14 years ago. Check out this passage:

While the convertible’s interior is similar to the sedan’s, with a semi-wraparound cockpit-style instrument panel, it has unique and very comfortable front seats, with the shoulder straps anchored to the seat frame to ease rear-seat access.

Can you spot the offensive language? Well, there’s the cockpit, which I assume could be interpreted in a way different from where the driver sits to drive the vehicle. Then there is “rear seat access.” My goodness. That is offensive. Imagine buying a convertible in which a person could sit in the back seat. Obviously “rear seat” is a trigger phrase. When combined with “cockpit,” the Google smart software becomes, what is the word? Oh, right. Stupid.
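It is easy to see how a context-free filter falls into this trap. Here is a toy sketch of naive blocklist matching; this is my own illustration, not Google’s actual AdSense system, and the blocklist terms are simply the phrases the review was apparently dinged for:

```python
# Toy illustration of naive blocklist matching -- my guess at the failure
# mode, not Google's actual filter. The "offensive" terms come from the
# phrases in the Saab review quoted above.

BLOCKLIST = {"cockpit", "rear-seat access", "high rear"}

def flag_offensive(text: str) -> list[str]:
    """Return blocklist terms found in the text, ignoring case and context."""
    lowered = text.lower()
    return sorted(term for term in BLOCKLIST if term in lowered)

review = ("While the convertible's interior is similar to the sedan's, with a "
          "semi-wraparound cockpit-style instrument panel, it has unique and "
          "very comfortable front seats, with the shoulder straps anchored to "
          "the seat frame to ease rear-seat access.")
print(flag_offensive(review))  # → ['cockpit', 'rear-seat access']
```

A human reads those two phrases and sees automotive engineering; the substring test sees trigger words. That gap is the whole story.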

Let’s step back. Some observations:

  • Google positions itself as having a whiz-bang system for preventing offensive content from reaching its “customers.” I must say that the system seems to be doing a less than brilliant job. (See. I did not use the word stupid again.) In my DarkCyber video news program for October 23, 2018, I point out that YouTube offers videos which explain to teens how to buy drugs on the Dark Web. The smart filters, I assume, think these vids are A Okay.
  • At the same time Google’s smart software is deciding that car reviews are filthy and offensive, the company is telling elected officials it does not know what it will do about its possible China search system. But today I noted “Sundar Pichai Spoke about Google’s China Plans for the First Time and It Doesn’t Look Like He’s Backing Down.” So Google is thinking more about assisting a government with its censorship effort when it cannot figure out that a car review is not pornographic? Stupid is not the word. Maybe mendacious?
  • The company seems to be expending resources to reprocess content which it had already identified, copied, parsed, and indexed. This Saab story was indexed and available 14 years ago. I wonder if Google realized that its index and Web archives are digital time bombs. Could the content become evidence in the event Google was subjected to a thorough investigation by European or US regulators? House cleaning before visitors arrive? Interesting because the smart software may be tweaked to be overzealous, not stupid at all.

Our view from Harrod’s Creek is simple. We think Google is a smart company. These minor, trivial, inconsequential filter failures are anomalies. In fact, the offensive auto reviews must go. What else must go? Another interesting question.

Google is great. Very intelligent.

I suppose one could pop the boot in the high rear and go for some rear seat access. I think there is a vernacular bound phrase for this sentiment.

Stephen E Arnold, October 16, 2018

Google Censorship Related Document

October 10, 2018

I am not sure this is a real Google document with the name “Google Leak.” If the link goes dead, you are on your own. Plus, it is a long one, chock full of quotes, images, and crunchy statements. Some Googlers like crunchy statements.

An entity named Allum Bokhari uploaded the document.

For me the main point is that Google can embrace censorship. Makes sense I suppose.

The images of the slides in a PowerPoint-type presentation could have been created by Google, a third party, or some combination of thinkers with a design firm added for visual spice.

The group through whose hands the artifact passed was Breitbart, a semi-famous outfit. I know this because the name Breitbart is overlaid in orange on each of the pages of the document. The document also contains the Google logo and the branding “Insights Lab.”

I know there is an Insights Lab in Colorado, but it is tough to figure out who created the document from what appears to be hours spent running queries against the Google search engine and fiddling with a PowerPoint-type presentation system.

But who exactly is responsible for the document? Anonymity is popular with the outputs of the New York Times, Bloomberg, and online postings like this one.

The who is a bit of a mystery.

To download the document from Scribd (yep, the service with the pop-ups, pleas for sign-ups, etc.), you have to sign up with Facebook or Google. Makes sense.

Plus, the document contains more than 80 pages, and it takes some time to dig through the lingo, the images, and the gestalt of the construct.

One image explains that the least free countries are China and Syria, and the most free are Estonia and Iceland. Estonia and Iceland are good places to be free. The downside of Estonia is the tension between Estonians and Russians, who are, if the chart is accurate, not into living without censorship. Plus, the border between Russia and Estonia is not formidable; it is a bit like a potato field in places. Iceland is super, particularly if one enjoys low-cost data center services, fishing, hiking, and brisk winters.


The future, it seems, is censorship. I noted the phrase “well ordered spaces for safety and civility.”

The document is worth a look if you can tolerate registering via Facebook or Google to download the alleged Google document. Viewing it, for now, does not require registration; downloading may invite endless appeals for cash.

Stephen E Arnold, October 10, 2018

Surf with Freedom: China, Iran, Russia, and Other Countries May Not Notice

October 5, 2018

How does this sound to you?

Intra included the following feature list:

• Free access to websites and apps blocked by DNS manipulation
• No limits on data usage and it won’t slow down your internet connection
• Open source
• Keep your information private – Intra doesn’t track the apps you use or websites you visit
• Customize your DNS server provider — use your own or pick from popular providers

You can get the scoop by reading “On Protected: Your Connection Is Protected from DNS Attacks.”

The service is provided by Jigsaw, an outfit under the wing of Google.

The article explains:

With Intra, they’ve created an app that protects against DNS manipulation. This is an app for the world to access the entire internet without, for example, government censorship.
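In practice, “protects against DNS manipulation” means sending lookups over an encrypted channel instead of plaintext UDP, so an on-path party cannot read or rewrite them. Intra uses DNS-over-HTTPS (DoH) for this. A minimal sketch of the idea, using the documented Google Public DNS JSON API shape; treat the details as illustrative rather than Intra’s actual code:

```python
import json

# Sketch of the DNS-over-HTTPS idea behind Intra: the resolver is queried
# over HTTPS rather than plaintext UDP port 53. The endpoint and JSON
# response shape follow the public Google DoH JSON API convention; this
# is an illustration, not Intra's implementation.

DOH_ENDPOINT = "https://dns.google/resolve"

def build_doh_query(name: str, rtype: str = "A") -> str:
    """Build the DoH JSON-API URL for a hostname and record type."""
    return f"{DOH_ENDPOINT}?name={name}&type={rtype}"

def parse_doh_answer(body: str) -> list[str]:
    """Pull the answer records out of a DoH JSON response body."""
    data = json.loads(body)
    return [answer["data"] for answer in data.get("Answer", [])]

# A response in the documented JSON shape (canned, not a live lookup):
sample = ('{"Status": 0, "Answer": [{"name": "example.com.", '
          '"type": 1, "data": "93.184.216.34"}]}')
print(build_doh_query("example.com"))   # → https://dns.google/resolve?name=example.com&type=A
print(parse_doh_answer(sample))         # → ['93.184.216.34']
```

Because the query and answer travel inside an ordinary HTTPS connection, a censor who rewrites plain DNS responses has nothing to rewrite.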

For now this is an Android app; as mobile operating systems go, Android may present less of a hurdle for some surveillance activities. Of course, authorities in China, Iran, and Russia will remain unaware of this Google-centric app. I wonder if anyone in the US will notice?

Nah, probably not. I like the warnings issued to me by my browsers about unsafe sites, and I think the outcomes of DNS manipulations are interesting.

Stephen E Arnold, October 5, 2018

Content Filtering Seeps Into Mainstream

August 27, 2018

Content filtering is a new trend. For those fearing fake news, or simply tired of bad news, Google is trying to brighten their day. Its home assistant will deliver just good news if you ask it, but is there a dark underbelly to such actions? We started wrestling with this topic after a Digital Trends story, “By Request, Google Assistant Makes it Easy to Find Good News.”

According to the story:

“Google sources the positive difference stories from the Solutions Journalism Network (SJN). The nonprofit, nonpartisan organization focuses on publishing stories about how people can make the world a better place — the practice is called “solutions journalism.” SJN gathers and summaries articles from a large and diverse range of media sources.”

While this seems like a cute news snippet, it is potentially dangerous. Take, for example, the news of an EU official who penned an op-ed about the dangers of filtering copyrighted works. A bot simply filed a complaint of copyright infringement and got the story wiped from the Internet. Google’s good-news filter is far from this kind of deviousness, but it is not so far off that one day we could all have important, yet unpleasant, news stripped from our world.

Patrick Roland, August 27, 2018

Twitter Bans Accounts

August 22, 2018

I read “Facebook and Twitter Ban over 900 Accounts in Bid to Tackle Fake News.” Twitter was founded about 12 years ago. The company found itself in the midst of the 2016 election messaging flap. The article reports:

Facebook said it had identified and banned 652 accounts, groups and pages which were linked to Iran and to Russia for “co-ordinated inauthentic behavior”, including the sharing of political material.

One of the interesting items of information which surfaced when my team was doing the research for CyberOSINT and the Dark Web Notebook, both monographs designed for law enforcement and intelligence professionals, was the ease with which Twitter accounts can be obtained.

For a program we developed for a conference organizer in Washington, DC, in 2015, we illustrated Twitter messages with links to information designed to attract young men and women to movements which advocated some activities which broke US laws.

In 2015, the challenge had several dimensions. Let me run down the ones the other speakers and I mentioned:

  • The ease with which an account could be created
  • The ease with which multiple accounts could be created
  • The ease with which messages could be generated with suitable index terms
  • The ease with which messages could be disseminated across multiple accounts via scripts
  • The lack of filtering to block weaponized content.
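The fourth bullet is the structural one: a few lines of script move a message across many accounts faster than any filter examines it. A schematic sketch of that fan-out pattern, using hypothetical helper functions and invented account names, with no real Twitter API calls:

```python
# Illustrative only: the fan-out pattern described in the bullets above,
# with hypothetical helpers. No real Twitter API is used; the account
# names and index terms are invented for the sketch.

def add_index_terms(message: str, terms: list[str]) -> str:
    """Append hashtag-style index terms so a message surfaces in searches."""
    return message + " " + " ".join("#" + t for t in terms)

def fan_out(accounts: list[str], message: str) -> list[tuple[str, str]]:
    """Pair one message with every account a posting script controls."""
    return [(account, message) for account in accounts]

# One message, suitable index terms, many accounts.
msg = add_index_terms("Read this before it disappears", ["breaking", "exclusive"])
queue = fan_out(["account_01", "account_02", "account_03"], msg)
print(len(queue))  # → 3
```

Banning one account removes one entry from the queue; the script that generated the queue is untouched.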

Back to the present.

Banning an account addresses one of these challenges.

The notion of low friction content dissemination, unrestricted indexing, and the ability to create accounts is one to ponder.

Killing an account or a group of accounts may not have the desired effect.

Compared to other social networks, Twitter has a strong following in certain socio-economic sectors. That in itself adds a bit of spice to the sauce.

Stephen E Arnold, August 22, 2018

Internet Platforms Are Something New. But What Does “New” Mean?

August 12, 2018

“New” is an interesting concept. A new car suggests a vehicle that emits the mix of polyvinyls, warm electronics, and snake oil. “New” in a camp in Yemen means a T-shirt abandoned by one person and claimed by another. “New” in a temple in Kyoto means repairs made a century ago.

But I learned in “Platforms Are Not Publishers”:

Google, Facebook, Twitter, and the internet are not media. They are something new we do not yet fully understand.

Would it be helpful to have the context and intended connotation of “new” defined?

Nah, after the Internet revolution, everyone knows the meaning of the word.

The problem generated when flows of data rip across the digital landscape is that these bits and bytes erode what they touch. The impact is more rapid but less easy to detect than the impact of a flash flood gushing through the streets of a Rio hillside slum.

The notion that these are merely commercial enterprises misses the context. The platforms emerged from the characteristics of digital technology; that is, concentration, velocity, disintermediation, etc.

The large platforms are like beavers. Put a beaver in the observation deck of the Chrysler Building in Manhattan, and the beaver is going to do what beavers do. It may die, but its beaverness makes it behave in a way that is, to some degree, predictable.

I like the idea that individuals in the “media”—another term which warrants defining—have to shoulder some of the blame. Better hurry. I am no longer sure how long the real media and the real journalists will survive.

Their future will be finding a way to exploit the digital flows.

In short, Internet platforms today are not much different from the BRS, DataStar, Dialog, and Lexis type systems before the Internet.

What’s different is the scope, scale, and speed of today’s digital flows. The context of the information environment (what I continue to call the datasphere) is unchanged.

The problem is that today’s digital experts have a limited perception of “new” and the context of online systems and services.

In short, too late folks. Russia, Turkey, Iran, and other countries have figured out that the shortest distance between A and B is censorship.

Censorship is now a content fashion trend. That’s “new” as in governments are punching the “off” button. The action may be futile, but it is a reminder that old school methods may deliver because responsible commercial organizations ignore what may be their “duty.” Publishing? What’s that?

Stephen E Arnold, August 12, 2018

Applique Logic: Alex Jones and Turbo Charging Magnetism

August 9, 2018

I am not sure I have read an Alex Jones essay or watched an Alex Jones video. In fact, he was one of the individuals of whom I was aware, but he was not on my knowledge radar. Now he is difficult to ignore.

Today’s New York Times corrected my knowledge gap. I noted in my dead tree edition today (August 9, 2018) these stories:

  • Facebook’s Worst Demons Have Come Home to Roost, page B1
  • Infowars App Is Trending As Platforms Ban Content, B6
  • The Internet Trolls Have Won. Get Used to It, B7

I want to mention “Rules Won’t Save Twitter. Values Will” at this online location.

From my vantage point in rural Kentucky, each of the write-ups contributes to the logic quilt for censoring the real Alex Jones.

Taken together, the information in the write ups provide a helpful example of what I call “appliqué logic.”

Appliqué, according to Google (which helpfully points to Wikipedia, another information source which may be questionable to some), means:

Appliqué is ornamental needlework in which pieces of fabric in different shapes and patterns are sewn or stuck onto a larger piece to form a picture or pattern. It is commonly used as decoration, especially on garments. The technique is accomplished either by hand or machine. Appliqué is commonly practiced with textiles, but the term may be applied to similar techniques used on different materials.

Appliqué logic is reasoning stuck onto something else. In this case, the “something else” is the set of online monopolies which control access to certain types of information.

The logic is that the monopolies are technology, which is assumed to be neutral. I won’t drag you through my Eagleton Award lecture from a quarter century ago to remind you that the assumption may not be correct.

The way to fix challenges like “Alex Jones” is to stick a solution onto the monopoly. This is similar to customizing a vehicle, say a school bus.

A school bus (a mundane vehicle) can be enhanced with appliqués: green paint, flame decals, big horns. The result does not change the functioning of the school bus, but it now has some sizzle. I suppose the appliqué logician could write a paper and submit the essay to an open access publisher to explain the needed improvements the horns add.

With the oddly synchronized actions against the Alex Jones content, we have the equivalent of a group of automobile customizers finding ways to “enhance” their system.

The result is to convert what no one notices into something that would make a Silicon Valley PR person delighted to promote. I assume that a presentation at a zippy new conference would be easy for the appliqué team to book.

The apparent censorship of Alex Jones is now drawing a crowd. Here I am in Harrods Creek writing about a person to whom I previously directed zero attention. The New York Times coverage is doing a better job than I could with a single write up in a personal blog. In the land of “free speech” the Alex Jones affair may become an Amazon Prime or Netflix original program. Maybe a movie is in the works?

Back to appliqué logic. When it comes to digital content, sticking on a solution may not have the desired outcome. The sticker wants one thing; the stickee is motivated to work around it. For example, the earthquake watcher Dutch Sinse has jumped from YouTube to Twitch to avoid censorship. He offered an explanation for this action and referenced the Washington Post. I don’t follow Dutch Sinse, so I don’t know what he is referencing, and, to be honest, I don’t care.

But the more interesting outcome of these Alex Jones-related actions is that the appliqué logic has to embrace the “stickoids.” These are the people who now have a rallying point. My hunch is that whatever information Alex Jones provides, he is in a position to ride a pretty frisky pony, at least for a moment in Internet time.

Why won’t appliqué logic work when trying to address the challenges companies like Facebook, Google, et al face?

  1. Stick-ons increase complexity. Complexity creates security issues which remain unknown until it is too late.
  2. Alex Jones-type actions rally the troops. I am not a troop, but here I am writing about this individual. Imagine the motivation for those who care about Mr. Jones’ messages.
  3. Opportunities for misinformation, disinformation, and reformation multiply. In short, the filtering and other appliqué solutions will increase computational costs, legal costs, and administrative costs. Facebook- and Google-type companies are not keen on increased costs, in my opinion.
  4. Alex Jones-type actions attract legal eagles.

What’s the fix? There is a spectrum of options available. On one end, believe that the experts running the monopolies will do the right thing. Hope is useful, maybe even in this case. At the other end, the Putin approach may be needed. Censorship, fines, jail time, and more extreme measures if the online systems don’t snap a crisp salute.

Appliqué solutions are what’s available. I await the final creation. I assume there will be something more eye-catching than green paint, white flame decoration, and (I don’t want to forget) the big green horns.

For Alex Jones, censorship may have turbocharged his messaging capability. What can one stick on him now? What will the stickoids do? Protest marches, Dark Web collections of his content, encrypted chat among fans?

I know one thing: Pundits and real journalists will come up with more appliqué fixes. Easy, fast, and cheap. Reasoning from the aisles of Hobby Lobby or Michael’s is better than other types of analytic thought.

Stephen E Arnold, August 9, 2018
