Smart Software: Can Humans Keep Pace with Emergent Behavior?

November 29, 2022

For the last six months, I have been poking around the idea that certain behaviors are emergent; that is, give humans a capability or a dataspace, and those humans will develop novel features and functions. The examples we have been exploring relate to methods bad actors use to avoid takedowns by law enforcement. The emergent behaviors we have noted exploit domain name registry mechanisms and clever software able to obfuscate traffic from Tor exit nodes. The online dataspace, in short, produces unanticipated emergent behaviors: bad actors come up with something novel using the Internet’s furniture.

We noted “137 Emergent Abilities of Large Language Models.” If our understanding of this report is mostly accurate, large language models like those used by Google and other firms manifest emergent behavior. What’s interesting is that the write-up explains that there is not one type of emergent behavior. The article identifies a Rivian truck bed full of emergent behaviors.

Here are the behaviors associated with big data sets and LaMDA 137B, as mentioned in the Emergent Abilities paper. (LaMDA is a family of Transformer-based neural language models specialized for dialog. Correctly or incorrectly, we associate LaMDA with Google’s smart software work. See this Google blog post.) The items are:

Gender-inclusive sentences (German)

Irony identification

Logical arguments

Repeat copy logic

Sports understanding

Swahili-English proverbs

Word sorting

Word unscrambling

Another category of emergent behavior is what the paper calls “emergent prompting strategies.” The idea is that more general prompting strategies manifest themselves as models scale up. The system can perform certain functions that cannot be elicited from models trained on “small” data sets; for example, solving multi-step math problems in less widely used languages.
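To make the prompting idea concrete, here is a minimal sketch of a few-shot, chain-of-thought style prompt of the sort the paper associates with emergent multi-step reasoning. The worked example and the follow-up question are our own illustrations, not drawn from the paper, and the snippet only assembles the prompt text; it does not call any specific model or API.

```python
# Illustrative only: a few-shot, chain-of-thought style prompt. The worked
# example and the final question are invented for this sketch; the code just
# builds the prompt string and prints it.
COT_PROMPT = """Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 balls each is 6 balls.
5 + 6 = 11. The answer is 11.

Q: A warehouse holds 120 crates. Workers ship out 45 crates and receive 30 new
ones. How many crates are in the warehouse now?
A:"""

if __name__ == "__main__":
    # In practice this string would be sent to a large language model; the
    # paper's point is that the step-by-step pattern only pays off once the
    # underlying model crosses a certain scale.
    print(COT_PROMPT)
```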

The paper includes links so the different types of emergent behavior can be explored. The paper wraps up with questions researchers may want to consider. One question we found suggestive was:

What tasks are language models currently not able to perform, that we should evaluate on future language models of better quality?

The notion of emergent behavior is important for two reasons: [a] systems can manifest capabilities or behaviors not anticipated by their developers, and [b] novel capabilities may create additional unforeseen capabilities or actions.

If one thinks about emergent behaviors in any smart, big data system, humans may struggle to understand, keep up, and manage downstream consequences in one or more dataspaces.

Stephen E Arnold, November 29, 2022
