Filtering: Facebook Asserts Filtering Progress
November 29, 2017
I read “Hard Questions: Are We Winning the War on Terrorism Online?” The main point is that Facebook is filtering terrorism-related content. Let’s assume that the assertion is correct. Furthermore, let’s assume that private group participants are reporting terror-related content, so that even material not visible to the general Facebook community is devoid of terror-related content.
This appears to be a step forward.
My thought is that eliminating the content may squeeze those whose messages are filtered to seek other avenues of information dissemination. For most people, the workarounds will be unfamiliar.
But options exist, and these options are becoming more widely used and robust. I remind myself that bad actors can be every bit as intelligent, resourceful, and persistent as the professionals working at companies like Facebook.
Within the last four months, the researchers assisting me on the second edition of the Dark Web Notebook have informed me:
- Interest in certain old-school methods of online communication has increased; for example, text messaging
- Encrypted apps are gaining wider use
- Peer-to-peer mechanisms show strong uptake by certain groups
- Dark Web or i2p communication methods are not perfect but some work despite the technical hassles and latency
- Burner phones and SIM cards bought with untraceable forms of payment are widely available from retail outlets like Kroger and Walgreens in the US.
Those interested in information which is filtered remind me of underground movements in the 1960s. At the university I attended, the surface looked calm. Then, bang, an event would occur. Everyone was surprised and wondered where that “problem” came from. Observing those events taught me that hiding the problem does not resolve the problem.
The surface is one thing. What happens below the surface is another. Squeezing in one place on a balloon filled with water moves the water to another place. When the pressure is too great, the balloon bursts. Water goes in unexpected places.
My view is that less well known methods of communication will attract more attention. I am not sure if this is good news or bad news. I know that filtering alone does not scrub certain content from digital channels.
Net net: Challenges lie ahead. Net neutrality may provide an additional lever, but there will be those who seek to circumvent controls. Most will fail, but some will succeed. Those successes may be difficult to anticipate, monitor, and address.
Facebook filtering is comparatively easy. Reacting to the consequences of filtering may be more difficult. It has taken many years to achieve the modest victory Facebook has announced. That reaction time, in itself, is a reminder that there is something called a Pyrrhic victory.
Stephen E Arnold, November 29, 2017
China: Online Behavior, Censorship, and Innovation
September 12, 2017
My recollection of China is fuzzy. The place is big, so details are tiny fragments. That’s why I find reports like “China’s Ever-Tighter Web Controls Jolt Companies, Scientists” useful. The operative words in the headline are “control” and “jolt.”
I noted these “real” facts:
- Consumer research firm GlobalWebIndex said a survey of Chinese Web surfers this year found 14 percent use a VPN daily.
- 8.8 percent of people in the survey use VPNs to look at “restricted sites”
- [China’s] government spokespeople refuse to acknowledge any site is blocked, though researchers say they can see attempts to reach sites such as Google stopped within servers operated by state-owned China Telecom Ltd., which controls China’s links to the global internet.
- The agency in charge of the crackdown [is] the Cyberspace Administration of China
As I noted in my short article “Dark Web Explained” in the Recorded Future blog, censorship will squeeze some online behaviors to the Dark Web. Perhaps China will take even more aggressive action to make the use of Tor and i2p an opportunity for corrective instruction. Developers may find the tighter controls a reason to innovate.
In short, the cat-and-mouse games are about to get underway in earnest.
Stephen E Arnold, September 12, 2017
Math Professor Alleges Google Has Disappeared His Equations
August 21, 2017
I read “One Statistics Professor Was Just Banned By Google: Here Is His Story.” The Beyond Search goose is baffled. We learned that Salil Mehta’s email and blog are no longer online. I did note that the blog was “ads free.” Hmmm. Even the Beyond Search goose does the Google ads thing. We noted this statement:
Now instead of mathematics, reporters have turned to this latest circus nightmare from Google as an example of how they are compounding bad decisions on good people anywhere and at any time. Can they not differentiate me from an evil person? Can they not see the large and reputable people and institutions that have relied on my work? Do they have better people who can coach them on how to make decisions with much better taste and finesse? What’s next, all CEOs and professors and politicians are going to be shut down from social media whenever it is least expected? Overnight hi-tech lynching squads are a thing of the past. We can’t have kangaroo courts and hope to lead with moral authority.
Keep Calm Studio will sell those stressed this excellent poster. Its message is germane to the allegations.
Oh, oh. This passage suggests to me that Google is a circus. But not any circus. A circus that invokes nightmares. Yikes. Google?
The passage does call attention to one of the very tiny issues some people have with smart software. Obviously the algorithm may drift a bit: it is possible for smart software to learn that sites like the Daily Stormer rate 0.000001 on the Google Quality Index and quite possibly to misconstrue a discussion of statistical methods as problematic. Google is doing its best to stamp out hate speech, but statistical procedures, even when informed by Big Data, can deliver off-point results.
The passage suggests that Google management needs a coach. Hey, that was Eric Schmidt’s job, and he did it well. Perhaps the author is unnecessarily critical of a company which makes an engineer into the technical equivalent of Lady Gaga.
The passage also raises the question of Google’s future endeavors. I don’t like to predict what Google will do, and I have mocked those who want to tell Google what to do. If Google asks, I output. If Google does not ask, I just note the activity and go back to the pond filled with mine drainage. (It looked nice in the ecliptic gloaming.)
I also note the phrase “hi-tech lynching squads.” This word choice will probably cause some types of analytic software to spit out an alert. (Maybe misspelling “high” will slip through the filter. Software, even Google’s, may have some idiosyncrasies.)
As the write up moved to its conclusion, I circled in anguished ocher this paragraph:
We are going to be looking back on this time in Google’s history and those of other social media and know that they have done some very immoral and confusing things, and it has hurt their public reputation with decent people who wanted to grow into the next future with them.
I am not too keen on saying that the GOOG has done “immoral and confusing” things. Here in Harrod’s Creek we are eagerly awaiting our Google Fiber T shirt with the message “Make the Internet More Googley.”
We don’t have any suggestions for rectifying the issue. If the author were a member of law enforcement or an intelligence professional, we could provide a “clean,” “untraceable” identity. But the person whose content disappeared is a professor, and I don’t provide untraceable identities to individuals who are disappeared.
May I suggest a new career? Microsoft Bing / LinkedIn may welcome the posts and the résumé?
Oh, the Daily Stormer is available on the Dark Web. My hunch is that not too many statisticians with disappeared content are into the Dot Onion thing.
Remember. The Beyond Search team is on board with the Google. We also try to stay on the search train if you get my drift because we don’t write articles that make Google look like the people from my high school’s machine shop class.
Stephen E Arnold, August 21, 2017
India Jumps on the Filtering Bandwagon
August 9, 2017
We noted “Internet Archive Contacted Indian Govt Regarding the Block but Got No Response.” The main point is that a repository (incomplete as its collection of Web pages may be) seems to be unavailable in India. Perhaps the Indian government has found a way to search for information in the service. We have noted that searching for rich media, including the collection of 78 rpm records, is a tough slog. It is tough to find information even when it is online. When services are filtered, locating facts, semi-facts, and outright hoohaw becomes impossible. We think the actions could impair the outstanding customer support services provided by the world’s second largest nation. Efficient delivery of information-centric services, however, is likely to improve in Mumbai. China, Indonesia, Russia, Turkey, and now India may be taking steps to put the data doggies in the kennel.
Stephen E Arnold, August 9, 2017
Online Filtering: China and “All” Rich Media
July 6, 2017
I read “China’s Bloggers, Filmmakers Feel Chill of Internet Crackdown.” The main idea is that control over Internet content is getting exciting. I noted this point in the “real” news story:
Over the last month, Chinese regulators have closed celebrity gossip websites, restricted what video people can post and suspended online streaming, all on grounds of inappropriate content.
Yep, an “all” in the headline and an “all” in the text of the story.
I also noted the point that emerges from the alleged statement of an academic whose travel to and from China is likely to become more interesting:
“According to these censorship rules, nothing will make it through, which will do away with audiovisual artistic creation,” Li Yinhe, an academic who studies sexuality at the government-run Chinese Academy of Social Sciences, wrote in an online post. Under the government rules, such works as Georges Bizet’s opera “Carmen” and Shakespeare’s “Othello” would technically have to be banned for depicting prostitution and overt displays of affection, she said.
What’s the key point? It seems to me that China wants to prevent digital content from eroding what the write up calls, via a quote from “an industry association,” “socialist values.” Yep, bad. Filtering and controls applied by commercial enterprises, therefore, must be better. Perhaps government filters applied by countries other than China are sort of better than China’s approach.
Hey, gentle reader, this is news. But does “news” exist if one cannot access it online? Perhaps actions designed to limit Surface Web online content will increase the use of encrypted systems such as sites accessible via Tor.
Presumably Thomson Reuters’ new incubator for smart software and big data will not do any of the filtering thing? On the other hand, my hunch is that Thomson Reuters will filter like the Dickens: from screening ideas to fund to guiding the development trajectories of the lucky folks who get some cash.
Worth watching the publishing giant which has been struggling to generate significant top line growth.
Stephen E Arnold, July 5, 2017
Facebook Factoid: Deleting User Content
July 6, 2017
Who knows if this number is accurate. I found the assertion of a specific number of Facebook deletions interesting. Plus, someone took the time to wrap the number in some verbiage about filtering, aka censorship. The factoid appears in “Facebook Deletes 66,000 Posts a Week to Curb Hate Speech, Extremism.”
Here’s the passage with the “data”:
Facebook has said that over the past two months, it has removed roughly 66,000 posts on average per week that were identified as hate speech.
My thought is that the roughly 3.2 million “content objects” per year implied by the weekly figure is neither high nor low. The number is without context other than my assumption that Facebook has two billion users per month. The method used to locate and scrub the data seems to be a mystical process powered by artificial intelligence and humans.
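A quick back-of-the-envelope annualization (my own arithmetic, not a figure from the article) suggests where a number like 3.2 million might come from:

```python
# Annualize Facebook's reported weekly removal rate.
# Rough sketch: assumes the 66,000-per-week rate holds steady all year.
weekly_removals = 66_000            # posts removed per week, per the cited article
weeks_per_year = 52
annual_removals = weekly_removals * weeks_per_year
print(f"{annual_removals:,} posts per year")  # 3,432,000 posts per year
```

Call it 3.2 to 3.4 million, depending on how many weeks one counts.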
One thing is clear to me: Figuring out what to delete seems to be a somewhat challenging task, both for the engineers writing the smart software and for the lucky humans who get paid to identify inappropriate content in the musings of billions of happy Facebookers.
What about those “failures”? Good question. What about that “context”? Another good question. Without context what have we with this magical 66,000? Not much in my opinion. One can’t find information if it has been deleted. That’s another issue to consider.
Stephen E Arnold, July 6, 2017
About That Freedom of Speech Thing
May 26, 2017
I read “G7 Summit: Theresa May to Ask World Leaders to Launch Internet Crackdown after Manchester Attack.” The Internet means online to me. Crackdowns trigger thoughts of filtering, graph analysis, and the interesting challenge of explaining why someone looked up an item of information.
The write up interpreted “online” as social media, which is interesting. Here’s a passage I highlighted:
The prime minister will ask governments to unite to regulate what tech companies like Google, Facebook and Twitter allow to be posted on their networks. By doing so, she will force them to remove “harmful” extremist content, she will suggest to G7 members at a meeting in Italy.
The named companies have been struggling to filter inappropriate content. On a practical level, certain inappropriate content may generate ad revenue. Losing ad revenue is not a popular notion in some of these identified companies.
The companies have also been doing some thinking about their role. Are these outfits supposed to be “responsible” for what their users and advertisers post? If the identified companies are indeed “responsible,” how will the mantle of responsibility hang on the frames of Wild West outfits in Silicon Valley? The phrase “It is easier to ask forgiveness than seek permission” is a pithy way of summing up some Silicon Valley action plans.
The write up enumerates the general types of digital information available on “the Internet.” I noted this statement:
She [Theresa May, Britain’s prime minister] will also call for industry guidelines to be revised by the tech companies to make absolutely clear what constitutes harmful material, with those that fail to do so being held to account.
The impact of Ms. May’s suggestion may create some interesting challenges for the companies facilitating the flow of real time information. Will Silicon Valley companies, which often perceive themselves as more important than nation states, respond in a manner congruent with Ms. May’s ideas?
My thought is that “responsibility” will be a moving target. What’s more important? Advertising revenue or getting bogged down in figuring out which item of information is okay and which is not?
At this moment, it looks to me as if revenue and self interest might be more important than broader political considerations. Maslow’s hierarchy of needs takes on a special significance when Silicon Valley constructs prioritize their behaviors.
What happens if I run an online query for “Silicon Valley” and “content filtering”? Bing wants me to personalize my results based on my interests and for me to save “things for later.” I decline and get this output:
I particularly liked the reference to Silicon Valley sending “its ambassador” to Appalachia. Sorry, Ms. May, my query does not encourage my thinking about your idea for responsible censorship.
Google displays an ad for social media monitoring performed by GFI Software in Malta. I am also directed to hits which do not relate to Ms. May’s ideas.
Google interprets the query as one related to third party software which blocks content. That’s closer to what Ms. May is suggesting.
Neither search giant points to itself as involved in this content filtering activity.
That tells me that Ms. May’s idea may be easy to articulate but a bit more difficult to insert into the Wild West of capitalistic constructs.
Digital information is a slippery beastie composed of zeros and ones, used by billions of people who don’t agree about what’s okay and what’s not okay, and operated by folks who may see themselves as knowing better than elected officials.
Interesting stuff.
Stephen E Arnold, May 26, 2017
China and Facebook: Coincidence, Trend, the Future?
May 23, 2017
I read “China Clamps Down on Online News With New Security Rules.” The main idea is that China is taking steps to make sure the right news reaches the happy Internet consumers in the middle kingdom. Forget the artificial intelligence approach. China may be heading down a more traditional water buffalo path. Human herders will keep those bovines in line. Bad bovines become Chinese beef with broccoli. The Great Firewall is, it seems, not so magnificent. VPNs are on the hit list too. Monitoring is the next big thing in making sure 1.2 billion Chinese are fully informed. The question is, “Didn’t the previous online intercept and filtering mechanism work?” Who knew?
I also noted “Facebook Is Hiring a Small Army to Block Murder and Suicide Videos.” The point of the write up is that the vaunted revolution in artificial intelligence is not so vaunted. To find and censor nasty videos, Facebook is embracing an old-fashioned approach—humans. The term for these digital “fast food” type workers is moderators. The moderators will be part of Facebook’s “community operations team.” If the “real journalism” outfit is correct, Facebook’s COT has a cadre of 4,500 people. For those lucky enough to work at the Taco Bell of deciding what’s “good,” “appropriate,” or “Facebooky,” I learned:
Facebook says the people hired to review Facebook content for the company will receive psychological support…
I would imagine that it might be easier to hire individuals who don’t worry about free speech and figuring out the answer to such questions as, “Exactly what is Facebooky?” Tom Aquinas, John Locke, Socrates, Bertrand Russell, and Descartes are not available to provide their input.
More intriguing is that Google is adding “workshops” for humans. Presumably, Google has cracked the problem of figuring out what’s in and what’s out under the US Constitution’s First Amendment. The high power Google smart software is getting a spring tune up. But humanoids will be working on identifying hate speech if the information in “Google Search Changes Tackle Fake News and Hate Speech” is accurate.
For a moment, I thought there was some similarity among the China, Facebook, and Google approaches. I realized that China is a real country and it is engaged in information control. Facebook and Google are “sort of countries”? Each is engaged in a process of deciding what’s okay and what’s not okay?
Am I worried? Not really. I think that nation states make decisions so their citizens are fully informed. I think that US monopolies operate for the benefit of their users.
The one issue which gives me a moment’s pause is the revolution in big thinking. China, Facebook, and Google have obviously resolved the thorny problem of censorship.
Those losers like Socrates deserved to die. Tom Aquinas had the right idea: Stay inside and focus on a greater being. Descartes was better at math than the “I think, therefore I am” silliness. Perhaps the spirit of John Locke has been revivified, and it is guiding the rationalists in China, Facebook, and Google in their quest to decide what’s good and what’s bad.
Three outfits have “Russell-ed” up answers to tough philosophical questions. Trivial, right?
Stephen E Arnold, May 23, 2017
Advocacy Groups Back Google over Right to Be Forgotten Conflict
January 31, 2017
Does a European’s “right to be forgotten” extend around the globe? (And if not, is one really “forgotten”?) Can one nation decide what the rest of the world is allowed to see about its citizens? Thorny questions are at the heart of the issue MediaPost examines in, “Google Draws Support in Showdown Over ‘Right to Be Forgotten’.”
Privacy-protection rights, established by European judges, demand Google remove search-result links that could embarrass a European citizen at the subject’s request (barring any public interest in the subject, of course). French regulators want Google to extend this censorship on its citizens’ behalf around the world, rather than restrict access just within that country’s borders. No way, says Google, and it has some noteworthy support—the Center for Democracy & Technology, Electronic Frontier Foundation, Human Rights Watch, and other organizations agree that what France is attempting sets a dangerous precedent. Writer Wendy Davis elaborates:
Google argues that it can comply with the ruling by preventing links from appearing in the results pages of search engines aimed at specific countries, like Google.fr, for French residents. But the French authorities say Google must delete the links from all of its search engines, including Google.com in the U.S. Earlier this year, France’s CNIL [Commission Nationale de l’Informatique et des Libertés] rejected Google’s position and fined the company $112,000. Google is now appealing that ruling, and the Center for Democracy & Technology and others are backing Google’s position.
The CDT argues in a blog post that authorities in one country shouldn’t be able to decide whether particular search results are available in other countries—especially given that authorities in some parts of the world often object to material that’s perfectly legal in many nations. For instance, Pakistan authorities recently asked Google (unsuccessfully) to take down videos that satirized politicians, while Thai authorities unsuccessfully asked Google to remove YouTube clips that allegedly insulted the royal family.
Google itself has argued that no one country should be able to censor the Web internationally. ‘In the end, the Internet would only be as free as the world’s least free place,’ global privacy counsel Peter Fleischer wrote on the company’s blog last year.
Indeed. As someone whose (most) foolish years occurred before the Web was a thing, I sympathize with folks who want to scrub the Internet of their embarrassing moments. However, trying to restrict what citizens of other countries can access simply goes too far.
Cynthia Murrell, January 31, 2017
Google Needs a Time-Out for Censorship, But Who Will Enforce Regulations?
January 26, 2017
The article on U.S. News and World Report titled The New Censorship offers a list of the ways in which Google is censoring its content, and builds a compelling argument for increased regulation of Google. Certain items on the list, such as pro-life music videos being removed from YouTube, might have you rolling your eyes, but the larger point is that Google simply has too much power over what people see, hear, and know. The most obvious problem is Google’s ability to squash a business simply by changing its search algorithm, but the myriad ways that it has censored content is really shocking. The article states,
No one company, which is accountable to its shareholders but not to the general public, should have the power to instantly put another company out of business or block access to any website in the world. How frequently Google acts irresponsibly is beside the point; it has the ability to do so, which means that in a matter of seconds any of Google’s 37,000 employees with the right passwords or skills could laser a business or political candidate into oblivion…
At times the article sounds like a sad conservative annoyed that the most influential company in the world tends toward liberal viewpoints. Hearing white male conservatives complain about discrimination is always a little off-putting, especially when you have politicians like Rand Paul still defending the right of businesses to refuse service based on skin color. But from a liberal standpoint, just because Google often supports left-wing causes like gun control or the pro-choice movement doesn’t mean that it deserves a free ticket to decide what people are exposed to. Additionally, the article points out that the supposed “moral stands” made by Google are often revealed to be moneymaking or anticompetitive schemes. Absolute power corrupts no matter who wields it, and companies must be scrutinized to protect the interests of the people.
Chelsea Kerwin, January 26, 2017