Laws, Rules, Regulations for Semantic AI (No, I Do Not Know What Semantic AI Means)

March 31, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I am not going to dispute the wisdom and insight in the Microsoft essay “Consider the Future of This Decidedly Semantic AI.” The author is Xoogler Sam Schillace, a CVP (corporate vice president) and now a bigly wizard at the world’s pre-eminent secure software firm. However, I am not sure to what the “this” refers. Let’s assume it is the Bing thing and not the Google thing, although some plumbing may be influenced by Googzilla’s open source contributions to “this.” How would you like to disambiguate that statement, Mr. Bing?

The essay sets forth some guidelines or bright, white lines in the lingo of the New Age search and retrieval fun house. The “Laws” number nine. I want to note some interesting word choices. The reason for my focus on these terms is that, taken as a group, they reveal more than I first thought.

Here are the terms I circled in True Blue (a Microsoft color selected for the blue screen of death):

  • Intent. Rules 1 and 3. The user’s intent at first glance. However, what if the intent is the hard wiring of a certain direction into the workflow of the smart software? Intent in separate parts of a model can and will have a significant impact on how the model arrives at certain decisions. Isn’t that a thumb on the scale?
  • Leverage. Rule 2. Okay, some type of Archimedes’ truism about moving the world, I think. Upon rereading the sentence in which the word is used, I think it means that old-school baloney like precision and recall is not going to move anything. The “this” world has no use for delivering on-point information using outmoded methods like string matching or Boolean statements. Plus, the old-school methods are too expensive, slow, and dorky.
  • Right. Rule 3. Don’t you love it when an expert explains that a “right” way to solve a problem exists? Why, then, did I have to suffer through calculus classes in which expressions had to be solved in different ways to get the “right” answer? Yeah, who is in charge here? Isn’t it wonderful to be a sophomore in high school again?
  • Brittle. Rule 4. Yep, peanut brittle or an old-school light bulb. Easily broken, cut fingers, and maybe blinded? Avoid brittleness by “not hard coding anything.” Is that why Microsoft software is so darned stable? How about those email vulnerabilities in the new smart Outlook?
  • Lack. Rule 5. Am I correct in interpreting the use of the word “lack” as a blanket statement that the “this” is just not very good? I do love the reference to GIGO; that is, garbage in, garbage out. What if that garbage is generated by Bard, the digital phantasm of ethical behavior?
  • Uncertainty. Rule 6. Hello, welcome to the wonderful world of statistical Fancy Dancing. Is that “answer” right? Sure, if it matches the “intent” of the developer and the smart software helping that individual. I love it when smart software is recursive and learns from errors, at least known errors.
  • Protocol. Rule 7. A protocol, according to the smart search system You.com, is:

In computer networking, a protocol refers to a set of rules and guidelines that define a standard way of communicating data over a network. It specifies the format and sequence of messages that are exchanged between the different devices on the network, as well as the actions that are taken when errors occur or when certain events happen.

Yep, more rules and a standard, something universal. I think I see the starred item on Microsoft’s agenda: the operating system for smart software in business, government, and education. (A toy sketch of what a protocol boils down to appears after this list.)

  • Hard. Rule 8. Yes, Microsoft is doing intense, difficult work. The task is to live up to the marketing unleashed at the World Economic Forum. Whew. Time for a break.
  • Pareidolia. Rule 9. The word means something along the lines of “some people see things that aren’t there.” Hello, Blake Lemoine, please. Oh, he’s on a date with a smart avatar. Okay, please, tell him I called. Also, some people may see a certain human quality in the actions of their French bulldog.
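For readers who want the You.com definition made concrete, here is a minimal, purely illustrative sketch in Python of what “a set of rules and guidelines for communicating data” amounts to: an agreed message format, a way to reject malformed input, and an implied request/response sequence. The message types and fields are invented for this example and have nothing to do with whatever Microsoft is actually building.

# Toy protocol sketch (hypothetical, for illustration only): every message is a
# 4-byte big-endian length prefix followed by UTF-8 JSON with "type" and "payload".
import json
import struct

def encode_message(msg_type, payload):
    # Format rule: serialize to JSON, then prepend the body length.
    body = json.dumps({"type": msg_type, "payload": payload}).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(data):
    # Error rule: fail loudly on anything that violates the format.
    if len(data) < 4:
        raise ValueError("truncated message: missing length prefix")
    (length,) = struct.unpack(">I", data[:4])
    body = data[4:4 + length]
    if len(body) != length:
        raise ValueError("truncated message: body shorter than declared length")
    return json.loads(body.decode("utf-8"))

# Sequence rule (by convention): a "query" message is answered by a "result" message.
request = encode_message("query", {"text": "what is semantic AI?"})
print(decode_message(request))  # {'type': 'query', 'payload': {'text': 'what is semantic AI?'}}

The point of the sketch is simply that a protocol is a standard both sides must obey, which is exactly the kind of universal plumbing a would-be operating system for smart software needs.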

If we step back and view these words in the context of the Microsoft view of semantic AI, can we see an unintentional glimpse into the inner workings of the company’s smart software? I think so. Do you see a shadowy figure eager to dominate while saying, “Ah, shucks, we’re working hard at an uncertain task. Our intent is to leverage what we can to make money”? I do.

Stephen E Arnold, March 31, 2023
