Hit Delete. Save Money. Data Liability Is Gone. Is That Right?
July 17, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
The article “Reddit Removed Your Chat History from before 2023” stated:
… legacy chats were being migrated to the new chat platform and that only 2023 data is being brought over, adding that they “hope” a data export will help the user get back the older chats. The admin told another user asking whether there was an option to stay on the legacy chat that no, there isn’t, and Reddit is “working on making new chats better.”
A young attorney studies ancient Reddit data from 2023. That’s when information began, because a great cataclysm destroyed any previous, possibly useful data for a legal matter. But what about the Library of Congress? But what about the Internet Archive? But what about backup tapes at assorted archives? Yeah, right. Thanks for the data in amber, MidJourney.
The cited article does not raise the following obviously irrelevant questions:
- Are there backups which can be consulted?
- Are there copies of the Reddit chat data?
- Was the action taken to reduce costs or legal liability?
I am not a Reddit user, nor do I affix site:reddit or append the word “reddit” to my queries. Some may find the service useful, but I am a dinobaby, hopelessly out of touch with where the knowledge action is.
As an outsider, my initial reaction is that dumping data has two immediate paybacks: reducing storage costs and reducing the likelihood that a group of affable lawyers will ask for historic data about a Reddit user’s activity. My hunch is that users of a free service cannot fathom why a commercial enterprise would downgrade or eliminate that free service. Gee, why?
I think I would answer the question with one word, “Adulting.”
Stephen E Arnold, July 17, 2023
Financial Analysts, Lawyers, and Consultants Can See Their Future
July 17, 2023
It is the middle of July 2023, and I think it is time for financial analysts, lawyers, and consultants to spruce up their résumés. Why would a dinobaby make such a suggestion to millions of the beloved Millennials, GenXers, the adorable GenY folk, and the vibrant GenZ lovers of TikTok, BMWs, and neutral colors?
I read three stories helpfully displayed by my trusty news reader. Let’s take a quick look at each and offer a handful of observations.
The first article is “This CEO Replaced 90% of Support Staff with an AI Chatbot.” The write up reports:
The chief executive of an Indian startup laid off 90% of his support staff after the firm built a chatbot powered by artificial intelligence that he says can handle customer queries much faster than his employees.
Yep, better, faster, and cheaper. Pick all three, which is exactly what some senior managers will do. AI is now disrupting. But what about jobs requiring “higher skills” than talking on the phone and looking up information for a clueless caller?
The second article is newsy, or is it newsie? “OpenAI and Associated Press Announce Partnership to Train AI on News Articles” reports:
[The deal] will see OpenAI licensing text content from the AP archives that will be used for training large language models (LLMs). In exchange, the AP will make use of OpenAI’s expertise and technology — though the media company clearly emphasized in a release that it is not using generative AI to help write actual news stories.
Will these stories become the property of the AP? Does Elon Musk have confidence in himself?
Young professionals learn that they can find their future elsewhere. In the MidJourney confection are a lawyer, a screenwriter, and a consultant at a blue chip outfit selling MBAs at five times the cost of their final year at university.
I think that the move puts Google in a bit of a spot if it processes AP content and a legal eagle can find that content in a Bard output. More significantly, hasta la vista, reporters. Now the elimination of hard working, professional journalists will not happen immediately. However, from my vantage point in rural Kentucky, I hear the train a-rollin’ down the tracks. Whooo Whooo.
The third item is “Producers Allegedly Sought Rights to Replicate Extras Using AI, Forever, for Just $200.” The write up reports:
Hollywood’s top labor union for media professionals has alleged that studios want to pay extras around $200 for the rights to use their likenesses in AI – forever – for just $200.
Will the unions representing these skilled professionals refuse to cooperate? Does Elon Musk like Grimes’s music?
A certain blue chip consulting firm has made noises about betting $2 billion on smart software and Microsoft consulting. Oh, oh. Junior MBAs, it may not be too late to get an associate of arts degree in modern poetry so you can work as a prompt engineer. As a famous podcasting person says, “What say you?”
Several questions:
- Will trusted, reliable, research supporting real news organizations embrace smart software and say farewell to expensive humanoids?
- Will those making videos use computer generated entities?
- Will blue chip consulting firms find a way to boost partners’ bonuses standing on the digital shoulders of good enough software?
I sure hope you answered “no” to each of these questions. I have a nice two-cruzeiro collectible from Brazil, circa 1952, to sell you. Make me an offer. Collectible currency is an alternative to writing prompts or becoming a tour guide in Astana. Oh, that’s in Kazakhstan.
Smart software is a cost reducer because humanoids [a] require salaries and health care, [b] take vacations, [c] create security vulnerabilities or are security vulnerabilities, and [d] require more than high school science club management methods related to sensitive issues.
Money and good enough will bring changes in news, Hollywood, and professional services.
Stephen E Arnold, July 17, 2023
Need Research Assistance, Skip the Special Librarian. Go to Elicit
July 17, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
Academic databases are the bedrock of research. Unfortunately, most of them are hidden behind paywalls, and researchers who get past the paywalls encounter other problems with result accuracy and access to full texts. Databases have improved over the years, and AI algorithms promise to make them better still. Elicit is a new tool marketed as a digital research assistant: less intelligent than Alexa, Siri, or Google Assistant, but able to comprehend simple questions.
“This is indeed the research library. The shelves are filled with books. You know what a book is, don’t you? Also, you will find that this research library is not used too much any more. Professors just make up data. Students pay others to do their work. If you wish, I will show you how to use the card catalog. Our online public access terminal and library automation system does not work. The university’s IT department is busy moonlighting for a professor who is a consultant to a social media company,” says the senior research librarian.
What exactly is Elicit?
“Elicit is a research assistant using language models like GPT-3 to automate parts of researchers’ workflows. Currently, the main workflow in Elicit is Literature Review. If you ask a question, Elicit will show relevant papers and summaries of key information about those papers in an easy-to-use table.”
Researchers use Elicit to guide their research and discover papers to cite. In feedback, researchers stated they use Elicit to answer questions, find paper leads, and get better exam scores.
Elicit demonstrates its intuitiveness with its AI-powered research tools. Search results contain papers that do not match the keywords exactly but semantically match the query’s meaning. Keyword matching also lets researchers narrow or expand specific queries with filters. The summarization tool creates a custom summary based on the research query and simplifies complex abstracts. The citation graph semantically searches citations and returns more relevant papers. Results can be organized, and more information can be added without creating new queries.
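For readers curious how semantic matching of this sort generally works, here is a minimal sketch. This is not Elicit’s actual code; real systems presumably embed text with a neural language model, while this toy uses a bag-of-words vector so it runs with nothing but the standard library. The idea is the same: embed the query and each paper, then rank papers by cosine similarity rather than exact keyword overlap.

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words count vector.
    # A system like Elicit would use a learned language model instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_papers(query, abstracts):
    # Score every abstract against the query and sort, best match first.
    q = embed(query)
    scored = [(cosine(q, embed(a)), a) for a in abstracts]
    return [a for score, a in sorted(scored, reverse=True)]

papers = [
    "storing oysters safely in a refrigerator",
    "large language models for literature review",
    "semantic search over academic abstracts",
]
print(rank_papers("semantic literature search", papers)[0])
```

The point of the sketch: the top result is chosen by vector similarity, so a paper can rank first without containing every query keyword, which is the behavior described above.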
Elicit does have limitations, such as the inability to evaluate information quality. Also, Elicit is still a new tool, so mistakes will be made as development continues. Elicit does warn users about mistakes and advises them to fall back on tried-and-true, old-fashioned research methods of evaluation.
Whitney Grace, July 16, 2023
AI Analyzed by a Human from Microsoft
July 14, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
“Artificial Intelligence Doesn’t Have Capability to Take Over, Microsoft Boss Says” provides some words of reassurance when Sam AI-Man’s team is suggesting annihilation of the human race. Here are two passages I found interesting in the article-as-interview write up.
This is an illustration of a Microsoft training program for its smart future employees. Humans will learn or be punished by losing their Microsoft 365 account. The picture is a product of the gradient surfing MidJourney.
First snippet of interest:
“The potential for this technology to really drive human productivity… to bring economic growth across the globe, is just so powerful, that we’d be foolish to set that aside,” Eric Boyd, corporate vice president of Microsoft AI Platforms told Sky News.
Second snippet of interest:
“People talk about how the AI takes over, but it doesn’t have the capability to take over. These are models that produce text as output,” he said.
Now what about this passage posturing as analysis:
Big Tech doesn’t look like it has any intention of slowing down the race to develop bigger and better AI. That means society and our regulators will have to speed up thinking on what safe AI looks like.
I wonder if anyone is considering that AI in the hands of Big Tech might have some interest in controlling some of the human race. Smart software seems ideal as an enabler of predatory behavior. Regulators thinking? Yeah, that’s a posture sure to deal with smart software’s applications. Microsoft, do you believe this colleague’s marketing hoo hah?
Stephen E Arnold, July 14, 2023
What, Google? Accuracy Through Plagiarism
July 14, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
Now that AI is such a hot topic, tech companies cannot afford to hold back due to small flaws. Like a tendency to spit out incorrect information, for example. One behemoth seems to have found a quick fix for that particular wrinkle: simple plagiarism. Eager to incorporate AI into its flagship Search platform, Google recently released a beta version to select users. Forbes contributor Matt Novak was among the lucky few and shares his observations in, “Google’s New AI-Powered Search Is a Beautiful Plagiarism Machine.”
The blacksmith says, “Oh, oh, I think I have set my shop on fire.” The image is the original work of the talented MidJourney system.
The author takes us through his query and results on storing live oysters in the fridge, complete with screenshots of the Googlebot’s response. (Short answer: you can for a few days if you cover them with a damp towel.) He highlights passages that were lifted from websites, some with and some without tiny tweaks. To be fair, Google does link to its source pages alongside the pilfered passages. But why click through when you’ve already gotten what you came for? Novak writes:
“There are positive and negative things about this new Google Search experience. If you followed Google’s advice, you’d probably be just fine storing your oysters in the fridge, which is to say you won’t get sick. But, again, the reason Google’s advice is accurate brings us immediately to the negative: It’s just copying from websites and giving people no incentive to actually visit those websites. Why does any of this matter? Because Google Search is easily the biggest driver of traffic for the vast majority of online publishers, whether it’s major newspapers or small independent blogs. And this change to Google’s most important product has the potential to devastate their already dwindling coffers. … Online publishers rely on people clicking on their stories. It’s how they generate revenue, whether that’s in the sale of subscriptions or the sale of those eyeballs to advertisers. But it’s not clear that this new form of Google Search will drive the same kind of traffic that it did over the past two decades.”
Might Google be like a blacksmith who accidentally sets fire to his workshop? Content is needed to make the fires of revenue burn brightly. No content, problem?
Cynthia Murrell, July 14, 2023
Microsoft Causing Problems? Heck, No
July 14, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
I cruised through the headlines my smart news system prepared for me. I noted two articles on different subjects. The two write ups were linked with a common point of reference: Microsoft Corp., home of the Softies and the throbbing heart of a significant portion of the technology governments in North America and Western Europe find essential.
“What’s the big deal?” asks Mr. Microsoft. “You have Windows. You have Azure. Software has bugs. Get used to it. You can switch to Linux anytime.” This interesting scene is the fruit of MidJourney’s tree of creativity.
The first article appeared in TechRadar, an online real news outfit. The title was compelling; specifically, “Windows 11 Update Is Reportedly Slowing Down PCs and Breaking Internet Connections.” The write up reports:
KB5028185, the ‘Moment 3’ update, is proving seriously problematic for some users … The main bones of contention with patch KB5028185 for Windows 11 22H2 are instances of performance slowdown – with severe cases going by some reports – and problems with flaky internet connections.
The second story appeared on cable “real” news. I tracked down the item titled “US and Microsoft Sound Alarm about China-Based Cybersecurity Threat.” The main idea seems to be:
The U.S. and Microsoft say China-based hackers, focused on espionage, have breached email accounts of about two dozen organizations, including U.S. government agencies.
Interesting. Microsoft seems to face two challenges: Desktop engineering and cloud engineering. The common factor is obviously engineering.
I am delighted that Bing is improving with smart software. I am fascinated by Microsoft’s effort to “win” in online games. However, isn’t it time for someone with clout to point out that Microsoft may need to enhance its products’ stability, security, and reliability?
Due to many organizations’ and individuals’ dependence on Microsoft, the company seems to have a knack for creating a range of issues. Will someone step up and direct the engineering in a way that does not increase vulnerability and cause fiduciary loss for its customers?
Anyone? Crickets, I fear. Bad actors find Microsoft’s approach more satisfying than a stream of TikTok moments.
Stephen E Arnold, July 14, 2023
Refining Open: The AI Weak Spot during a Gold Rush
July 13, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
Nope, I will make no reference to selling picks and denim pants to those involved in a gold rush. I do want to highlight the essay “AI Weights Are Not Open Source.” There is a nifty chart with rows and columns setting forth some conceptual facets of smart software. Please navigate to the cited document so you can read the text in the rows and columns.
For me, the most important sentence in the essay is this one:
Many AI weights with the label “open” are not open source.
How are these “weights” determined or contrived? Are the weights derived by proprietary systems and methods? Are they assigned by a subject matter expert, by a software engineer using guess-timation, or by low wage workers pressed into the task?
The answers to these questions reveal how models are configured to generate “good enough” results. Present models are prone to providing incomplete, incorrect, or pastiche information.
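To make the question concrete: weights are not source code anyone wrote; they are numbers squeezed out of data by an optimization loop. Here is a toy illustration of my own (a one-parameter model and plain gradient descent, not any vendor’s actual method) showing how a “weight” gets determined:

```python
# Toy illustration: a "weight" emerges from gradient descent on data,
# not from code a human wrote. Fit y = w * x to data generated with w = 2.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0   # the single "weight" of this one-parameter model
lr = 0.01 # learning rate

for _ in range(500):
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # converges near 2.0
```

Even in this trivial case, releasing the final number `w` tells you nothing about the data it was trained on or the procedure that produced it, which is why publishing weights alone is not the same as publishing source.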
Furthermore, the popularity of obtaining images of Mr. Trump in an orange jumpsuit illustrates how “censorship” is applied to certain requests for information. Try it yourself. Navigate to MidJourney. Jump through the Discord hoops. Input the command “President Donald Trump in an orange jumpsuit.” Get the improper request flag. Then ask yourself, “How does BoingBoing keep creating Mr. Trump in an orange jumpsuit?”
Net net: The power of AI rests with the weights and controls which allow certain information and disallow other types of information. “Open” does not mean open as in “the door is open.” For AI, open is a means to obtain power and exert control, in my opinion.
Stephen E Arnold, July 13, 2023
What Is the Purpose of a Library? Maybe WiFi?
July 13, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
The misinformed believe libraries only offer access to free books, DVDs, and public computers in various states of obsolescence. Libraries are actually hubs for Internet access, including WiFi. The Internet is a necessary tool, and many people do not have reliable access, whether due to low income, homelessness, or rural location. KQED reports on how San Francisco is handling WiFi access and homelessness at a local library: “What Happens When Libraries Stop Sharing Wi-Fi?”
San Francisco, California is experiencing record high homelessness. Businesses and people are abandoning the city, crime is running rampant on the streets, and law enforcement’s hands are tied. Homeless people regularly visit libraries to use the computers and the WiFi. Libraries usually keep their WiFi on 24/7 so their parking lots and outside areas are active hotspots.
The Eureka Valley/Harvey Milk Memorial Branch Library turns off its WiFi at night. It is the only San Francisco library to do so, because homeless people visited the library after hours and the surrounding neighborhood saw an increase in crime. Pulling the WiFi plug is another way San Francisco clears sidewalks and prevents people from sleeping in public areas.
“ ‘Neighbors in that area have been dealing with repeated encampments, open-air drug sales and use, harassment of local businesses and all-around problematic situations going on for a decade at this point,” said [Supervisor Rafael Mandelman]. ‘It reached its nadir in the pandemic in 2020. There were encampments on both sides of the street, the sidewalk was impassable, and the historic AIDS mural had been wildly defaced. Neighbors were being threatened. It was bad.’ ”
The library faced homeless people camping on the roof, hacking into its electricity, and breaking into a closet. After the library shut off the WiFi, emergency services were called less often in the area. However, the improvement cannot be 100% attributed to the WiFi shutdown. Some homeless people in the area found permanent housing, a mural was repainted, and other services were enacted.
WiFi is an essential service, but if the people and places that provide it are harmed, the service is ruined for everybody. It is an ethical conundrum; still, if crime, drugs, debris, and homeless encampments make an area dangerous, then measures must be taken to resolve the problems.
Whitney Grace, July 13, 2023
Business Precepts for Silicon Valley: Shouting at the Grand Canyon?
July 13, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
I love people with enthusiasm and passion. What’s important about these two qualities is that they often act like little dumpsters at the Grand Canyon. People put a range of discarded items into them, and hard-working contractors remove the contents and dump them in a landfill. (I hear some countries no longer accept trash from the US. Wow. Imagine that.)
During one visit many years ago with the late industrial photographer John C Evans, we visited the Grand Canyon. We were visiting uranium mines and snapping pictures for a client. I don’t snap anything; I used to get paid to be in charge of said image making. I know. Quite a responsibility. I did know enough not to visit the uranium mine face. The photographer? Yeah, well, I did not provide too much information about dust, radiation, and the efficacy of breathing devices in 1973. Senior manager types are often prone to forgetting some details.
Back to the Grand Canyon.
There was an older person who was screaming into or at the Grand Canyon. Most visitors avoided the individual. I, however, walked over and listened to him. He was explaining that everyone had to embrace the sacred nature of the Grand Canyon and stop robbing the souls of the deceased by taking pictures. He provided other outputs about the evils of modern society, the cost of mule renting, and the prices in the “official” restaurants. Since I had no camera, he ignored me. He did yell at John C Evans, who smiled and snapped pictures.
I asked MidJourney to replicate this individual who thought the Grand Canyon, assorted unseen spirits, and the visitors were listening. Here’s what the estimable art system output:
I thought of this individual when I read “Seven Rules For Internet CEOs To Avoid Enshittification.” The write up, inspired by a real journalist, surfs on the professional terminology for ruining a free service. I find the term somewhat offensive, and I am amused at the broad use the neologism has found.
The article provides what I think are similar to the precepts outlined in a revered religious book or a collection of Ogden Nash statements. Let me point out that these statements contain elements of truth and would probably reduce philosophers like A.E.O. Taylor and William James to tears of joy because of their fundamental rational empiricism. Yeah. (These fellows would have told the photographer about the visit to the uranium mine face too.)
The write up lays out a Code of Conduct for some Silicon Valley-type companies. Let me present three of the seven statements and urge you to visit the original article to internalize the precepts as a whole. You may want to consider screaming these out loud in a small group of friends or possibly visiting a local park and shouting at the pedestal where a Civil War statue once stood ignored.
Selected precept one:
Tell your investors that you’re in this for the long haul and they need to be too.
Selected precept two:
Find ways to make money that don’t undermine the community or the experience.
Selected precept three and remember there are four more in the original write up:
Never charge for what was once free.
I selected three of these utterances because each touches upon money. Investors provide money to get more money in return. Power and fame are okay, but money is the core objective. Telling investors to wait or be patient is like telling a TikTok influencer to wait, stand in line like everyone else, or calm down. Tip: That does not work. Investors want money and in a snappy manner. Goals and timelines are part of the cost of taking their money. The Golden Rule: Those with the gold rule.
The idea of giving up money for community or the undefined experience is okay as long as it does not break the Golden Rule. If it does, those providing the funding will get someone who follows the Golden Rule. The mandate to never charge for what was once free is like a one-liner at a Comedy Club. Quite a laugh because money trumps customers and the fuzzy wuzzy notion of experience.
What’s my take on these and the full listing of precepts? Think about the notion of a senior manager retaining information for self preservation. Think about the crazy person shouting rules into the Grand Canyon. Now think about how quickly certain Silicon Valley-type outfits will operate in a different way. Free insight: The Grand Canyon does not listen. The trash is removed by contractors. The old person shouting eventually gets tired, goes to the elder care facility or back to van life, and Silicon Valley steps boldly toward enshittification. That’s the term, right?
Stephen E Arnold, July 12, 2023
Understanding Reality: A Job for MuskAI
July 12, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
I read “Elon Musk Launches His Own xAI Biz to Understand Reality.” Upon reading this article, I was immediately perturbed. The name of the company should be MuskAI (pronounced mus-key, like the lovable muskox, Ovibos moschatus). This imposing and aromatic animal can tip the scales at up to 900 pounds. Take that to the cage match and watch the opposition wilt or at least scrunch up its nose.
I also wanted to interpret the xAI as AIX. IBM, discharger of dinobabies, could find that amusing. (What happens when AIX memory is corrupted? Answer: Aches in the posterior. Snort snort.)
Finally, my thoughts coalesced around the name Elon-AI, illustrated below by the affable MidJourney:
Bummer. Elon AI is the name of a “coin.” And the proper name Elonai means “a person who has the potential to attain spiritual enlightenment.” A natural!
The article reports:
Elon Musk is founding his own AI company with some lofty ambitions. According to the billionaire, his xAI venture is being formed “to understand reality.” Those hoping to get a better explanation than Musk’s brief tweet by visiting xAI’s website won’t find much to help them understand what the company actually plans to do there, either. “The goal of xAI is to understand the true nature of the universe,” xAI said of itself…
I have a number of questions. Let me ask one:
Will Elon AI go after the Zuck AI?
And another:
Will the two AIs power an unmanned fighter jet, each loaded with live ordnance?
And the must-ask:
Will the AIs attempt to kill one another?
The mano-a-mano fight in Las Vegas (maybe in that weird venue appliquéd with itsy-bitsy LEDs) is less interesting to me than watching two warbirds from the Dayton Air Museum gear up and dog fight.
Imagine a YouTube video, then some TikToks, and finally a Netflix original released to the few remaining old-fashioned theaters.
That’s entertainment. Sigh. I mean xAI.
Stephen E Arnold, July 12, 2023