AI Limits: The Wind Cannot Hear the Shouting. Sorry.

March 14, 2024

This essay is the work of a dumb dinobaby. No smart software required.

One of my teachers had a quote on the classroom wall. It was, I think, from a British novelist. Here’s what I recall:

Decide on what you think is right and stick to it.

I never understood the statement. In school, I was there to learn. How could I decide whether what I was reading was correct? Deciding based on what I already thought seemed stupid because I was uninformed. The notion of “stick” is interesting and also a little crazy. My family was going to move to Brazil, and I knew that sticking to what I did in the Midwest in the 1950s would have to change. For one thing, we had electricity. The town to which we were relocating had electricity a few hours each day. Change was necessary. Even as a young sprout, I sensed that trying to prevent something required more than talk, writing a Letter to the Editor, or getting a petition signed.

I thought about this crazy quote as soon as I read “AI Bioweapons? Scientists Agree to Policies to Reduce Risk of Human Disaster.” The fear-mongering note of the write up’s title intrigued me. Artificial intelligence is in what I would call morph mode. What this means is that getting a fix on what is new and impactful in the field of artificial intelligence is difficult. An electrical engineering publication reported that experts are not sure if what is going on is good or bad.


Shouting into the wind does not work for farmers or AI scientists. Thanks, MSFT Copilot. Busy with security again?

The “AI Bioweapons” essay is leaning into the bad side of the AI parade. The point of the write up is that “over 100 scientists” want to “prevent the creation of AI bioweapons.” The article states:

The agreement, crafted following a 2023 University of Washington summit and published on Friday, doesn’t ban or condemn AI use. Rather, it argues that researchers shouldn’t develop dangerous bioweapons using AI. Such an ask might seem like common sense, but the agreement details guiding principles that could help prevent an accidental DNA disaster.

That sounds good, but is it like the quote about “decide on what you think is right and stick to it”? In a dynamic environment, change appears to accelerate. Toss in technology and the potential for big wins (financial, professional, or political), and the likelihood of slowing down the rate of change is reduced.

To add some zip to the AI stew, much of the technology required to do some AI fiddling around is available as open source software or low-cost applications and APIs.

I think it is interesting that 100 scientists want to prevent something. The hitch in the git-along is that other countries have scientists who have access to AI research, tools, software, and systems. Those scientists may feel that reminders about doom being (maybe?) just around the corner carry about as much weight as a ruined building in an abandoned town on Route 66.

Here are a few observations about why individuals rally around a cause tied to what some of those in the money game perceive as the next big thing:

  1. The shouters’ perception of their own importance makes speaking out about danger an imperative.
  2. Getting a group of important, smart people to climb on a bandwagon makes the organizers perceive themselves as doing something important and as demonstrating their “get it done” mindset.
  3. Publicity is good. It is very good when a speaking engagement, a grant, or a consulting gig produces a little extra fame and money, preferably in a combo.

Net net: The wind does not listen to those shouting into it.

Stephen E Arnold, March 14, 2024
