Google Smart Software: Lawyers to the Rescue
May 2, 2023
The article “Beginning of the End of OpenAI” in Analytics India raised an interesting point about Google’s smart software. The essay suggests that a legal spat over a trademark for “GPT” could allow Google to make a come-from-behind play in the generative software race. I noted this passage:
A lot of product names appear with the term ‘GPT’ in it. Now, if OpenAI manages to get its trademark application decided in favour, all of these applications would have to change their name, and ultimately not look appealing to customers.
Flip this idea to “if Google wins…”, and OpenAI could — note “could” — face a fleet of Google legal eagles and the might of Google’s prescient, forward-forward, quantumly supreme marketing army.
What about useful products, unbiased methods of generating outputs, and slick technology? Wait. I know the answer. “That stuff is secondary to our new core competency: the outputs of lawyers and marketing specialists.”
Stephen E Arnold, May 2, 2023
Digital Dumplings: AI Outputs Must Be Groomed, Trimmed, and Message Aligned
May 1, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
I read “Beijing Moves to Force AI Bots to Display Socialist Core Values.” I am not sure that the write up is spot on, but let’s assume that it is close enough for horseshoes. The main idea is that AI can chat. However, the AI must be steered so that it outputs content displaying “socialist core values.”
The write up states:
Companies will also have to make sure their chatbots create words and pictures that are truthful and respect intellectual property, and will be required to register their algorithms, the software brains behind chatbots, with regulators. The rules are not final, and regulators may continue to modify them, but experts said engineers building AI services in China were already figuring out how to incorporate the edicts into their products.
My reaction is that some will argue that training smart software plus post-training digital filters will do the job. Let’s assume that those subject to the edict achieve the objective. What about US smart software whose developers insist that objectivity is the real deal? China’s policy, if implemented and delivered upon, makes it clear that smart software is not objective. Developers can and will use its malleability to achieve their goals.
How about those students who reveal deep secrets on TikTok? Will these individuals be manipulated via smart software informed of their hot buttons?
Is that data dumpling a psychographic trigger with a payload different from ground pork, veggies, and spices?
Stephen E Arnold, May 1, 2023
Gmail: An Example of Control Addiction
May 1, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
I read “Is Gmail Killing Independent Email?” The main idea of the essay, written by an outfit called Tutanota, is to answer the question with a reasonably well-reasoned “Yes.” I am not going to work through the headaches caused by Google’s spam policies. Instead I want to present one statement from the write up and invite you to consider it in the context of “control addiction.”
I circled one statement which illustrates how Alphabet manifests what I call “control addiction.” My definition of the term: a firm in a position of power wants more power because it validates the company and creates revenue opportunities via lock-in. Addicts, I believe, generally feel compelled to keep buying from their supplier.
Is it okay that Gmail has the power to decide whether a business is sending spam or not? At the very least, Gmail support team should have listened to the company and looked into the issue to fix it. If Google is not willing to do this, it is just another sign of how Google can abuse their market power and hinder smaller services or – in this case – self-hosting emails, limiting the options people and businesses have when they want that their emails are reliably received by Gmail.
Several observations:
- Getting a human at Google is possible; however, some sort of positive relationship with a Googler of influence is necessary in my experience.
- That Googler may not know what to do about the problem. Command-and-control at the Alphabet, Google, YouTube construct is — how shall I phrase it? — quantumly supreme. The idea is that procedures and the staff responsible for something wink in and out of existence without warning and change state following the perturbations of mysterious dynamical forces.
- Google is not into customer service, user service, or any other type of other-directed service unless it benefits the Googler involved.
Net net: Decades of regulatory floundering have made life cushy for Googlers. Some others? Yeah, not so much.
Stephen E Arnold, May 1, 2023