Google: Do Small Sites Need Anti-Terrorism Help or Is the Issue Better Addressed Elsewhere?

January 3, 2023

Are “little sites” really in need of Google’s anti-terrorism tool? Oh, let me be clear. Google is — according to “Google Develops Free Terrorism-Moderation Tool for Smaller Websites” — in the process of creating Googley software. This software will be:

a free moderation tool that smaller websites can use to identify and remove terrorist material, as new legislation in the UK and the EU compels Internet companies to do more to tackle illegal content.

And what institutions are working with Google on this future software? The article reports:

The software is being developed in partnership with the search giant’s research and development unit Jigsaw and Tech Against Terrorism, a UN-backed initiative that helps tech companies police online terrorism.

What’s interesting to me is that this to-be software or filtering system is still in development. The software, it seems, does not exist.

Why would Google issue statements about vaporware?

The article provides a clue:

The move comes as Internet companies will be forced to remove extremist content from their platforms or face fines and other penalties under laws such as the Digital Services Act in the EU, which came into force in November, and the UK’s Online Safety bill, which is expected to become law this year.

I understand. Google’s management understands that regulation and fines are not going away in 2023. It is logical, therefore, to get in front of the problem. How does Google propose to do this?

Yep, vaporware. (I have a hunch there is a demonstration available.) Nevertheless, the genuine article is not available to small Web sites, which need help in coping with terrorism-related content.

How will the tool work? The article states:

Jigsaw’s tool aims to tackle the next step of the process and help human moderators make decisions on content flagged as dangerous and illegal. It will begin testing with two unnamed sites at the beginning of this year.

Everything sounds good when viewed from the top of Mount Public Relations, where the vistas are clear and the horizons are unlimited.

I want to make one modest observation: Small Web sites run on hosting services. These hosting services are, in my opinion, more suitable locations for filtering software. The problem is that hosting providers comprise a complex and diverse group of enterprises. In fact, I have yet to receive from my research team a count of service providers that is accurate and comprehensive.

Pushing the responsibility to the operator of a single Web site strikes me as a non-functional approach. Would it make sense for Google’s tool to be implemented at the service-provider level? The content resides on the service providers’ equipment or co-located hardware and in the data streams of virtual private systems or virtual private servers. There, terrorism-related content would be easier to block.

Let’s take a reasonable hosting service; for example, Hetzner in Germany or OVHcloud in France. The European Union could focus on these enabling nodes and implement either the Google system, if and when it becomes available and actually works, or an alternative filtering method devised by a European team. (I would suggest that Europol or similar entities could develop the needed filters, test them, and maintain them.) Google has a tendency to create or talk about solutions and then walk away after a period of time. (Remember Google’s Web Accelerator?)

Based on our research for an upcoming presentation to a group of investigators focused on cyber crime, service providers (what I call enablers) should be the point of attention in an anti-terrorism action. Furthermore, these enablers are also pivotal in facilitating certain types of online crime. Examples abound, ranging from right-wing climate activists using services in Romania to child pornography hosted on what we call “shadow ISPs.” These shadow enablers operate specialized services specifically to facilitate illegal activities behind specialized software like The Onion Router and other obfuscation methods.

For 2023, I advocate ignoring PR-motivated “to be” software. I think the efforts of national and international law enforcement should be directed at the largely unregulated and often reluctant “enablers.” I agree that some small Web site operators could do more. But I think it is time for enablers operating from vacant lots in the Seychelles and service providers running cyber fraud operations to be held responsible.

Fixing the Internet requires consequences. Putting the focus on small Web sites is a useful idea. But turning up the enforcement and regulatory heat on the big outfits will deliver more heat where being “chill” has allowed criminal activity to flourish. I have not mentioned the US and Canada, and I have not forgotten that there are enablers operating in plain sight in such places as Detroit and Québec City. Google’s PR play is a way to avoid further legal and financial hassles.

It is time to move from “to be” software to “taking purposeful, intentional action.”

Stephen E Arnold, January 3, 2023
