Google and Its Smart Software: Stupid?
October 16, 2018
I received an email from the owner of a Web site focused on providing consumers with automobile information. The individual shared with me an email sent to his company by the Google smart entity “publisher-policy-noreply.com”.
The letter was an AdSense Publisher Policy Violation Report. In short, Google’s smart software spotted an offensive article. The Google document said:
- New violations were detected. As a result, ad serving has been restricted or disabled on pages where these violations of the AdSense Program Policies were found. To resolve the issues, you can either remove the violating content and request a review, or remove the ad code from the violating pages.
Translating the Google speak: “You are showing ads on a page which contains pornography, contraband, hate speech, etc. Make this right, or no AdSense money for you.”
Okay, I was intrigued. How can information about cars be about porn, contraband, hate speech, etc.?
The offensive item, my colleagues and I determined, was a review of a 2004 Saab 9-3 Arc Convertible, published about 14 years ago. The offense was that the review contained words of a sexual nature.
Does this vehicle, with the height of its trunk or boot, offend you? If it does, you are not Googley.
I read the review and noted that the author does indeed focus on an automobile. The problem is that the review is a long tail news story; that means old content rarely gets clicks. So what’s Google doing? Processing historical data in order to locate porn, contraband, and hate speech? Must be. This suggests that the company is playing catch up. I thought Google was on top of offensive content and had been for more than a decade. Google’s forbidden word lists have been kicking around for years.
I find this extremely suggestive. Perhaps that is why the reviewer described the tiny rear seating area as needful of a way to “ease rear seat access.” I am not sure my French bulldog would fit in the back seat of this Saab, nor could he engage in hanky panky.
I noted that the Saab convertible has a “high rear.” Judging from the picture, the mechanical engineers did increase the height of the trunk or boot in order to accommodate the folding hard top for this model Saab. I am not sure I would have thought the phrase “high rear” was sexual because I was reading about how the solid convertible top had been accommodated by the engineering team. Who reads about trunk lids or boots as a sexual reference?
But wait. There’s more lingo in the car review written about 14 years ago. Check out this passage:
While the convertible’s interior is similar to the sedan’s, with a semi-wraparound cockpit-style instrument panel, it has unique and very comfortable front seats, with the shoulder straps anchored to the seat frame to ease rear-seat access.
Can you spot the offensive language? Well, there’s the cockpit, which I assume could be interpreted in a way different from where the driver sits to drive the vehicle. Then there is “rear seat access.” My goodness. That is offensive. Imagine buying a convertible in which a person could sit in the back seat. Obviously “rear seat” is a trigger phrase. When combined with “cockpit,” the Google smart software becomes, what is the word? Oh, right. Stupid.
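To be clear, no one outside Google knows how its classifier actually works. But a minimal sketch of the kind of context-free phrase matching the behavior suggests shows how an innocent car review gets flagged. The phrase list below is purely hypothetical:

```python
# A purely hypothetical keyword filter -- not Google's actual system -- showing
# how context-free phrase matching can turn an innocent car review into
# "offensive" content.
FLAGGED_PHRASES = {"cockpit", "rear seat access", "high rear"}  # illustrative list only

def flag_offensive(text: str) -> set[str]:
    """Return any flagged phrases found in the text, ignoring all context."""
    normalized = text.lower().replace("-", " ")
    return {phrase for phrase in FLAGGED_PHRASES if phrase in normalized}

review = ("While the convertible's interior is similar to the sedan's, with a "
          "semi-wraparound cockpit-style instrument panel, it has unique and very "
          "comfortable front seats, with the shoulder straps anchored to the seat "
          "frame to ease rear-seat access.")

print(flag_offensive(review))  # {'cockpit', 'rear seat access'} -- flagged, no porn required
```

Whatever Google is really running, the false positive suggests the logic is not much more context aware than this.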
Let’s step back. Some observations:
- Google positions itself as having a whiz bang system for preventing offensive content from reaching its “customers.” I must say that the system seems to be doing a less than brilliant job. (See? I did not use the word stupid again.) In my DarkCyber video news program for October 23, 2018, I point out that YouTube offers videos which explain to teens how to buy drugs on the Dark Web. The smart filters, I assume, think these vids are A Okay.
- At the same time Google’s smart software is deciding that car reviews are filthy and offensive, the company is telling elected officials it does not know what it will do about its possible China search system. But today I noted “Sundar Pichai Spoke about Google’s China Plans for the First Time and It Doesn’t Look Like He’s Backing Down.” So Google is thinking more about assisting a government with its censorship effort when it cannot figure out that a car review is not pornographic? Stupid is not the word. Maybe mendacious?
- The company seems to be expending resources to reprocess content which it had already identified, copied, parsed, and indexed. This Saab story was indexed and available 14 years ago. I wonder if Google realized that its index and Web archives are digital time bombs. Could the content become evidence in the event Google was subjected to a thorough investigation by European or US regulators? House cleaning before visitors arrive? Interesting because the smart software may be tweaked to be overzealous, not stupid at all.
Our view from Harrod’s Creek is simple. We think Google is a smart company. These minor, trivial, inconsequential filter failures are anomalies. In fact, the offensive auto reviews must go. What else must go? Another interesting question.
Google is great. Very intelligent.
I suppose one could pop the boot in the high rear and go for some rear seat access. I think there is a vernacular bound phrase for this sentiment.
Stephen E Arnold, October 16, 2018
Google Censorship Related Document
October 10, 2018
I am not sure this is a real Google document with the name “Google Leak.” If the link goes dead, you are on your own. Plus it is a long one, chock full of quotes and images and crunchy statements. Some Googlers like crunchy statements.
An entity named Allum Bokhari uploaded the document.
For me the main point is that Google can embrace censorship. Makes sense I suppose.
The images of the slides in a PowerPoint-type presentation could have been created by Google, a third party, or some combination of thinkers with a design firm added for visual spice.
The group through whose hands the artifact passed was Breitbart, a semi-famous outfit. I know this because the name Breitbart is overlaid in orange on each of the pages of the document. The document also contains the Google logo and the branding “Insights Lab.”
I know there is an Insights Lab in Colorado, but it is tough to figure out who created the document from what appears to be hours spent running queries against the Google search engine and fiddling with a PowerPoint-type presentation system.
But who exactly is responsible for the document? Anonymity is popular with the outputs of the New York Times, Bloomberg, and online postings like this one.
The who is a bit of a mystery.
To get the document from Scribd, yep, the service with the pop ups, pleas for sign ups, etc., you have to sign up with Facebook or Google. Makes sense.
Plus, the document contains more than 80 pages, and it takes some time to dig through the lingo, the images, and the gestalt of the construct.
Here’s an image, which explains that the least free countries are China and Syria. The most free countries are Estonia and Iceland. Estonia and Iceland are good places to be free. The downside of Estonia is the tension between Estonians and Russians, who are, if the chart is accurate, not into living without censorship. Plus, the border between Russia and Estonia is not formidable. It is a bit like a potato field in places. Iceland is super, particularly if one enjoys low cost data center services, fishing, hiking, and brisk winters.
The future, it seems, is censorship. I noted the phrase “well ordered spaces for safety and civility.”
The document is worth a look if you can tolerate Scribd’s pleas to register via Facebook or Google. Viewing the alleged Google document for now does not require registration; downloading may invite endless appeals for cash.
Stephen E Arnold, October 10, 2018
Surf with Freedom: China, Iran, Russia, and Other Countries May Not Notice
October 5, 2018
How does this sound to you?
Intra included the following feature list:
• Free access to websites and apps blocked by DNS manipulation
• No limits on data usage and it won’t slow down your internet connection
• Open source
• Keep your information private – Intra doesn’t track the apps you use or websites you visit
• Customize your DNS server provider — use your own or pick from popular providers
You can get the scoop by reading “On Protected: Your Connection Is Protected from DNS Attacks.”
The service is provided by Jigsaw, an outfit under the wing of Google.
The article explains:
With Intra, they’ve created an app that protects against DNS manipulation. This is an app for the world to access the entire internet without, for example, government censorship.
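The write up does not describe Intra’s internals, but the general technique this kind of app relies on is easy to illustrate: resolve hostnames through an encrypted DNS-over-HTTPS resolver instead of the network’s default DNS server, so an on-path operator cannot quietly rewrite the answers. Here is a minimal sketch in Python, using Google’s public DNS-over-HTTPS JSON endpoint as the resolver (my choice for the example, not necessarily what Intra uses):

```python
# A minimal sketch of DNS-over-HTTPS resolution -- the general idea behind
# Intra-style apps, not Intra's actual implementation.
import requests

def resolve_over_https(hostname: str, record_type: str = "A") -> list[str]:
    """Resolve a hostname via Google's public DNS-over-HTTPS JSON endpoint.

    Because the query rides inside an ordinary HTTPS request, a network
    operator manipulating plain-text DNS cannot easily see or rewrite it.
    """
    response = requests.get(
        "https://dns.google/resolve",
        params={"name": hostname, "type": record_type},
        timeout=10,
    )
    response.raise_for_status()
    return [answer["data"] for answer in response.json().get("Answer", [])]

if __name__ == "__main__":
    print(resolve_over_https("example.com"))  # IP addresses, fetched over HTTPS
```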
For now this is an Android app, a mobile phone operating system which may pose less of a hurdle for some surveillance activities. Of course, authorities in China, Iran, and Russia will remain unaware of this Google-centric app. I wonder if anyone in the US will notice?
Nah, probably not. I like the warnings issued to me by my browsers about unsafe sites, and I think the outcomes of DNS manipulations are interesting.
Stephen E Arnold, October 5, 2018
Content Filtering Seeps Into Mainstream
August 27, 2018
Content filtering is a new trend. For those fearing fake news, or simply tired of bad news, Google is trying to brighten their day. Its home assistant will deliver just good news if you ask it, but is there a dark underbelly to such actions? We started wrestling with this topic after a Digital Trends story, “By Request, Google Assistant Makes it Easy to Find Good News.”
According to the story:
“Google sources the positive difference stories from the Solutions Journalism Network (SJN). The nonprofit, nonpartisan organization focuses on publishing stories about how people can make the world a better place — the practice is called “solutions journalism.” SJN gathers and summarizes articles from a large and diverse range of media sources.”
While this seems like a cute news snippet, it is potentially dangerous. Take, for example, the news of an EU official who penned an op-ed about the dangers of filtering copyrighted works. Of course, a bot simply filed a complaint of copyright infringement and got the story wiped from the internet. Google’s good news filter is far from this kind of deviousness, but it’s also not so far off that one day we could all have important, yet unpleasant, news stripped from our world.
Patrick Roland, August 27, 2018
Twitter Bans Accounts
August 22, 2018
I read “Facebook and Twitter Ban over 900 Accounts in Bid to Tackle Fake News.” Twitter was founded about 12 years ago. The company found itself in the midst of the 2016 election messaging flap. The article reports:
Facebook said it had identified and banned 652 accounts, groups and pages which were linked to Iran and to Russia for “co-ordinated inauthentic behavior”, including the sharing of political material.
One of the interesting items of information which surfaced when my team was doing the research for CyberOSINT and the Dark Web Notebook, both monographs designed for law enforcement and intelligence professionals, was the ease with which Twitter accounts can be obtained.
For a program we developed for a conference organizer in Washington, DC, in 2015, we illustrated Twitter messages with links to information designed to attract young men and women to movements advocating activities which broke US laws.
The challenge had several dimensions in 2015. Let me run down the ones the other speakers and I mentioned:
- The ease with which an account could be created
- The ease with which multiple accounts could be created
- The ease with which messages could be generated with suitable index terms
- The ease with which messages could be disseminated across multiple accounts via scripts
- The lack of filtering to block weaponized content.
Back to the present.
Banning an account addresses one of these challenges.
The notion of low friction content dissemination, unrestricted indexing, and the ability to create accounts is one to ponder.
Killing an account or a group of accounts may not have the desired effect.
Compared to other social networks, Twitter has a strong following in certain socio-economic sectors. That in itself adds a bit of spice to the sauce.
Stephen E Arnold, August 22, 2018
Internet Platforms Are Something New. But What Does “New” Mean?
August 12, 2018
“New” is an interesting concept. A new car suggests a vehicle that emits the mix of polyvinyls, warm electronics, and snake oil. “New” in a camp in Yemen means a T-shirt abandoned by one person and claimed by another. “New” in a temple in Kyoto means repairs made a century ago.
But I learned in “Platforms Are Not Publishers”:
Google, Facebook, Twitter, and the internet are not media. They are something new we do not yet fully understand.
Would it be helpful to have the context and intended connotation of “new” defined?
Nah, after the Internet revolution, everyone knows the meaning of the word.
The problem generated when flows of data rip across the digital landscape is that these bits and bytes erode. The impact is more rapid but less easy to detect than the impact of a flash flood gushing through the streets of a Rio hillside slum.
The notion that commercial enterprises are the context is incomplete. The platforms emerged from the characteristics of digital technology; that is, concentration, velocity, disintermediation, etc.
The large platforms are like beavers. Put a beaver in the observation deck of the Chrysler Building in Manhattan and the beaver is going to do what beavers do. It may die, but its beaverness makes it behave in a way that to some degree is predictable.
I like the idea that individuals in the “media”—another term which warrants defining—have to shoulder some of the blame. Better hurry. I am no longer sure how long the real media and the real journalists will survive.
Their future will be finding a way to exploit the digital flows.
In short, Internet platforms today are not much different from the BRS, DataStar, Dialog, and Lexis type systems before the Internet.
What’s different is the scope, scale, and speed of today’s digital flows. The context of the information environment (what I continue to call the datasphere) is unchanged.
The problem is that today’s digital experts have a limited perception of “new” and the context of online systems and services.
In short, too late folks. Russia, Turkey, Iran, and other countries have figured out that the shortest distance between A and B is censorship.
Censorship is now a content fashion trend. That’s “new” as in governments are punching the “off” button. The action may be futile, but it is a reminder that old school methods may deliver because responsible commercial organizations ignore what may be their “duty.” Publishing? What’s that?
Stephen E Arnold, August 12, 2018
Applique Logic: Alex Jones and Turbo Charging Magnetism
August 9, 2018
I am not sure I have read an Alex Jones essay or watched an Alex Jones video. In fact, he was one of the individuals of whom I was aware, but he was not on my knowledge radar. Now he is difficult to ignore.
Today’s New York Times corrected my knowledge gap. I noted in my dead tree edition today (August 9, 2018) these stories:
- Facebook’s Worst Demons Have Come Home to Roost, page B1
- Infowars App Is Trending As Platforms Ban Content, B6
- The Internet Trolls Have Won. Get Used to It, B7
I want to mention “Rules Won’t Save Twitter. Values Will” at this online location.
From my vantage point in rural Kentucky, each of the write ups contributes to the logic quilt for censoring the real Alex Jones.
Taken together, the information in the write ups provides a helpful example of what I call “appliqué logic.”
Appliqué, according to Google, which helpfully points to Wikipedia, another information source which may be questionable to some, means:
Appliqué is ornamental needlework in which pieces of fabric in different shapes and patterns are sewn or stuck onto a larger piece to form a picture or pattern. It is commonly used as decoration, especially on garments. The technique is accomplished either by hand or machine. Appliqué is commonly practiced with textiles, but the term may be applied to similar techniques used on different materials.
Appliqué logic is reasoning stuck onto something else. In this case, the “something else” is the online monopolies which control access to certain types of information.
The logic is that the monopolies are technology, which is assumed to be neutral. I won’t drag you through my Eagleton Award lecture from a quarter century ago to remind you that the assumption may not be correct.
The way to fix challenges like “Alex Jones” is to stick a solution on the monopoly. This is similar to customizing a vehicle like this one:
Notice how the school bus (a mundane vehicle) has been enhanced with appliqués. The result does not change the functioning of the school bus, but it now has some sizzle. I suppose the appliqué logician could write a paper and submit the essay to an open access publisher to explain the needed improvements the horns add.
With the oddly synchronized actions against the Alex Jones content, we have the equivalent of a group of automobile customizers finding ways to “enhance” their system.
The result is to convert what no one notices into something that would make a Silicon Valley PR person delighted to promote. I assume that a presentation at a zippy new conference would be easy for the appliqué team to book.
The apparent censorship of Alex Jones is now drawing a crowd. Here I am in Harrods Creek writing about a person to whom I previously directed zero attention. The New York Times coverage is doing a better job than I could with a single write up in a personal blog. In the land of “free speech” the Alex Jones affair may become an Amazon Prime or Netflix original program. Maybe a movie is in the works?
Back to appliqué logic. When it comes to digital content, sticking on a solution may not have the desired outcome. The sticker wants one thing. The stickee is motivated to solve the problem in his own way; for example, the earthquake watcher Dutch Sinse has jumped from YouTube to Twitch to avoid censorship. He offered an explanation about this action and referenced the Washington Post. I don’t follow Dutch Sinse, so I don’t know what he is referencing, and, to be honest, I don’t care.
But the more interesting outcome of these Alex Jones related actions is that the appliqué logic has to embrace the “stickoids.” These are the people who now have a rallying point. My hunch is that whatever information Alex Jones provides, he is in a position to ride a pretty frisky pony, at least for a moment in Internet time.
Why won’t appliqué logic work when trying to address the challenges companies like Facebook, Google, et al face?
- Stick ons increase complexity. Complexity creates security issues which, until it is too late, remain unknown
- Alex Jones type actions rally the troops. I am not a troop, but here I am writing about this individual. Imagine the motivation for those who care about Mr. Jones’ messages
- Opportunities for misinformation, disinformation, and reformation multiply. In short, the filtering and other appliqué solutions will increase computational cost, legal costs, and administrative costs. Facebook and Google type companies are not keen on increased costs in my opinion.
- Alex Jones type actions attract legal eagles.
What’s the fix? There is a spectrum of options available. On one end, believe that the experts running the monopolies will do the right thing. Hope is useful, maybe even in this case. At the other end, the Putin approach may be needed. Censorship, fines, jail time, and more extreme measures if the online systems don’t snap a crisp salute.
Appliqué solutions are what’s available. I await the final creation. I assume there will be something more eye catching than green paint, white flame decoration, and (I don’t want to forget) the big green horns.
For Alex Jones, censorship may have turbocharged his messaging capability. What can one stick on him now? What will the stickoids do? Protest marches, Dark Web collections of his content, encrypted chat among fans?
I know one thing: Pundits and real journalists will come up with more appliqué fixes. Easy, fast, and cheap. Reasoning from the aisles of Hobby Lobby or Michael’s is better than other types of analytic thought.
Stephen E Arnold, August 9, 2018
Google and China: A New Management Approach to Silicon Valley Pragmatism
August 3, 2018
I read “While Pragmatist Pichai Ploughs into China, Google Workers Fume over Concession to Censorship.” The main point of the write up for me is that Google has a management challenge on its hands. I learned:
Co-founders Larry Page and Sergey Brin built Google to “organize the world’s information and make it universally available”. They viewed China as a threat to the company’s stance as a defender of the open web. Pichai, in contrast, sees China as a hotbed of engineering talent and an appealing market.
The only problem is that the omission of money is, I think, a modest flaw in the logic of the quoted passage.
I noted this statement:
People trust Google to share true information and the Chinese search app is a betrayal of that, the employee said. The Google workers asked not to be identified because they are not permitted to discuss internal matters.
I assume the nifty buzzword “pragmatism” (possibly a metaphor for “governance”) embraces this disconnect between what one or more Googlers perceive, and what the GOOG actually does to deliver “relevant” results.
I highlighted:
Dragonfly [the code name for the new China specific search app] was a popular topic on Memegen, an internal online photo messaging board and cultural barometer at the company. One meme cited a popular Google slogan – “Put the user first” – with an asterisk attached: “Chinese users excluded, because we do not agree with your government.” A second post questioned the merits of American staff deciding global policies. Westerners debating Google entering China “feels somehow like men debating regulating women’s bodies,” it read.
Yep, relevant results. Pragmatic results too.
Stephen E Arnold, August 3, 2018
Fake News: Maybe Deadly
July 25, 2018
Politics aside for a moment, a disturbing new trend is becoming more obvious thanks to social media and fake news. Human lives are being lost thanks to false news stories being circulated and it might just be the one arena in which everyone can agree there is a problem. This first came to our attention via an NBC News story, “Social Media Rumors Trigger Violence in India; 3 Killed by Mobs.”
According to the story:
“Mobs of villagers killed at least three people and attacked several others after social media messages warned that gangs of kidnappers were roaming southern India in search of children, police said ….Authorities said there was no indication that such gangs actually existed.”
This scourge of fake news leading to real world consequences has led to the Indian government stepping in, perhaps becoming an incubator for other nations going forward. The Indian government has reached out to WhatsApp and demanded that it begin filtering out fake news stories. Google and Facebook have already begun attempting to police themselves. If the Indian government’s move to take control over fake news proves successful, censorship dominoes may fall in many different nation states. In the July 31, 2018, DarkCyber video we report on recent developments in Kazakhstan. The video will be available on the 31st at www.arnoldit.com/wordpress.
Patrick Roland, July 25, 2018
Heigh Ho, Filter, Away: You Cannot Find Info If It Is Not in the Index
July 19, 2018
Google recently revealed some data about the effects of adjustments it made to its algorithm back in 2014 in an effort to minimize piracy. TorrentFreak shares these figures in, “Google Downranks 65,000 Pirate Sites in Search Results.” Writer Ernesto informs us:
“In a comment to Australian media, Google states that it has demoted 65,000 sites in search results, a list that’s still growing every week. In total, the company received DMCA takedown requests for over 1.8 million domain names, so a little under 4% of these are downranked. The result of the measures is that people are less likely to see a pirate site when they type ‘watch movie X’ or ‘download song Y.’ This means that these sites see a drop in visitors from Google and a quite significant one too. ‘Demotion results in sites losing around 90 percent of their visitors from Google Search,’ a Google spokesperson told The Age. Indeed, soon after the demotion signal was implemented, pirate sites were hit hard. However, pirates wouldn’t be pirates if they didn’t respond with their own countermeasures. In recent years, many infringing sites have hopped from domain to domain, in part to circumvent the downranking efforts. In addition, Google’s measures also created an opportunity for smaller, less reputable, sites to catch search traffic that would otherwise go to the main players.”
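Google has not published how the demotion signal is computed, so any code here is speculation on my part. Still, a toy sketch of the general idea, penalizing a page’s ranking score as DMCA notices against its domain pile up, makes the mechanics concrete. The formula and weights are invented for illustration:

```python
# A toy illustration only: Google has not disclosed its demotion signal.
# The idea sketched here is simply "more DMCA notices, lower ranking score."
import math

def demoted_score(relevance: float, dmca_notices: int, weight: float = 0.15) -> float:
    """Shrink a page's relevance score as notices against its domain accumulate.

    The logarithm keeps a handful of complaints from mattering much while
    hitting domains with thousands of notices hard.
    """
    penalty = weight * math.log1p(dmca_notices)
    return relevance / (1.0 + penalty)

# Same baseline relevance, very different notice counts:
print(round(demoted_score(1.0, 5000), 3))  # roughly 0.44
print(round(demoted_score(1.0, 3), 3))     # roughly 0.83
```

Whatever the real signal looks like, the 90 percent traffic drop the Google spokesperson describes implies something far more aggressive than this toy version.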
Still, it seems to be a net win against piracy, all told. Some still call for Google to completely remove sites guilty of piracy from its search results, a move Google has its reasons for refusing to make. We’re reminded the company has also described piracy as an “availability and pricing problem,” and says governments should be promoting new business models instead of laying blame at the search engine’s feet. That is an interesting argument.
Cynthia Murrell, July 19, 2018