Does Amazon Do Questionable Stuff? Sponsored Listings? Hmmm.

January 4, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

Amazon, eBay, other selling platforms allow vendors to buy sponsored ads or listings. Sponsored ads or listings promote products and services to the top of search results. It’s similar to how Google sells ads. Unfortunately Google’s search results are polluted with more sponsored ads than organic results. Sponsored ads might not be a wise investment. Pluralistic explains that sponsored ads are really a huge waste of money: “Sponsored Listings Are A Ripoff For Sellers.”

Amazon relies on a payola sponsored ad system, where sellers bid to be the top-ranked in listings even if their products don’t apply to a search query. Payola systems are illegal but Amazon makes $31 billion annually from its system. The problem is that the $31 billion is taken from Amazon sellers who pay it in fees for the privilege to sell on the platform. Sellers then recoup that money from consumers and prices are raised across all the markets. Amazon controls pricing on the Internet.

Another huge part of a seller’s budget is for Amazon advertising. If sellers don’t buy ads in searches that correspond to their products, they’re kicked off the first page. The Amazon payola system only benefits the company and sellers who pay into the payola. Three business-school researchers Vibhanshu Abhishek, Jiaqi Shi and Mingyu Joo studied the harmful effects of payolas:

“After doing a lot of impressive quantitative work, the authors conclude that for good sellers, showing up as a sponsored listing makes buyers trust their products less than if they floated to the top of the results "organically." This means that buying an ad makes your product less attractive than not buying an ad. The exception is sellers who have bad products – products that wouldn’t rise to the top of the results on their own merits. The study finds that if you buy your mediocre product’s way to the top of the results, buyers trust it more than they would if they found it buried deep on page eleventy-million, to which its poor reviews, quality or price would normally banish it. But of course, if you’re one of those good sellers, you can’t simply opt not to buy an ad, even though seeing it with the little "AD" marker in the thumbnail makes your product less attractive to shoppers. If you don’t pay the danegeld, your product will be pushed down by the inferior products whose sellers are only too happy to pay ransom.”

It’s getting harder to compete and make a living on online selling platforms. It would be great if Amazon sided with the indy sellers and quit the payola system. That will never happen.

Whitney Grace, January 4, 2024

23AndMe: The Genetics of Finger Pointing

January 4, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

Well, well, another Silicon Valley outfit with Google-type DNA relies on its hard-wired instincts. What’s the situation this time? “23andMe Tells Victims It’s Their Fault That Their Data Was Breached” relates a now a well-known game plan approach to security problems. What’s the angle? Here’s what the story in Techcrunch asserts:

image

Some rhetorical tactics are exemplified by children who blame one another for knocking the birthday cake off the counter. Instinct for self preservation creates these all-too-familiar situations. Are Silicon Valley-type outfit childish? Thanks, MSFT Copilot Bing thing. I had to change the my image request three times to avoid the negative filter for arguing children. Your approach is good enough.

Facing more than 30 lawsuits from victims of its massive data breach, 23andMe is now deflecting the blame to the victims themselves in an attempt to absolve itself from any responsibility…

I particularly liked this statement from the Techcrunch article:

And the consequences? The US legal processes will determine what’s going to happen.

After disclosing the breach, 23andMe reset all customer passwords, and then required all customers to use multi-factor authentication, which was only optional before the breach. In an attempt to pre-empt the inevitable class action lawsuits and mass arbitration claims, 23andMe changed its terms of service to make it more difficult for victims to band together when filing a legal claim against the company. Lawyers with experience representing data breach victims told TechCrunch that the changes were “cynical,” “self-serving” and “a desperate attempt” to protect itself and deter customers from going after the company.

Several observations:

  1. I particularly like the angle that cyber security is not the responsibility of the commercial enterprise. The customers are responsible.
  2. The lack of consequences for corporate behaviors create opportunities for some outfits to do some very fancy dancing. Since a company is a “Person,” Maslow’s hierarchy of needs kicks in.
  3. The genetics of some firms function with little regard for what some might call social responsibility.

The result is the situation which not even the original creative team for the 1980 film Airplane! (Flying High!) could have concocted.

Stephen E Arnold, January 4, 2024

Forget Being Powerless. Get in the Pseudo-Avatar Business Now

January 3, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

I read “A New Kind of AI Copy Can Fully Replicate Famous People. The Law Is Powerless.” Okay, okay. The law is powerless because companies need to generate zing, money, and growth. What caught my attention in the essay was its failure to look down the road and around the corner of a dead man’s curve. Oops. Sorry, dead humanoids curve.

The write up states that a high profile psychologist had a student who shoved the distinguished professor’s outputs into smart software. With a little deep fakery, the former student had a digital replica of the humanoid. The write up states:

Over two months, by feeding every word Seligman had ever written into cutting-edge AI software, he and his team had built an eerily accurate version of Seligman himself — a talking chatbot whose answers drew deeply from Seligman’s ideas, whose prose sounded like a folksier version of Seligman’s own speech, and whose wisdom anyone could access. Impressed, Seligman circulated the chatbot to his closest friends and family to check whether the AI actually dispensed advice as well as he did. “I gave it to my wife and she was blown away by it,” Seligman said.

The article wanders off into the problems of regulations, dodges assorted ethical issues, and ignores copyright. I want to call attention to the road ahead just like the John Doe n friend of Jeffrey Epstein. I will try to peer around the dead humanoid’s curve. Buckle up. If I hit a tree, I would not want you to be injured when my Ford Pinto experiences an unfortunate fuel tank event.

Here’s an illustration for my point:

image

The future is not if, the future is how quickly, which is a quote from my presentation in October 2023 to some attendees at the Massachusetts and New York Association of Crime Analyst’s annual meeting. Thanks, MSFT Copilot Bing thing. Good enough image. MSFT excels at good enough.

The write up says:

AI-generated digital replicas illuminate a new kind of policy gray zone created by powerful new “generative AI” platforms, where existing laws and old norms begin to fail.

My view is different. Here’s a summary:

  1. Either existing AI outfits or start ups will figure out that major consulting firms, most skilled university professors, lawyers, and other knowledge workers have a baseline of knowledge. Study hard, learn, and add to that knowledge by reading information germane to the baseline field.
  2. Implement patterned analytic processes; for example, review data and plug those data into a standard model. One example is President Eisenhower’s four square analysis, since recycled by Boston Consulting Group. Other examples exist for prominent attorneys; for example, Melvin Belli, the king of torts.
  3. Convert existing text so that smart software can “learn” and set up a feed of current and on-going content on the topic in which the domain specialist is “expert” and successful defined by the model builder.
  4. Generate a pseudo-avatar or use the persona of a deceased individual unlikely to have an estate or trust which will sue for the use of the likeness. De-age the person as part of the pseudo-avatar creation.
  5. Position the pseudo-avatar as a young expert either looking for consulting or advisory work under a “remote only” deal.
  6. Compete with humanoids on the basis of price, speed, or information value.

The wrap up for the Politico article is a type of immortality. I think the road ahead is an express lane on the Information Superhighway. The results will be “good enough” knowledge services and some quite spectacular crashes between human-like avatars and people who are content driving a restored Edsel.

From consulting to law, from education to medical diagnoses, the future is “a new kind of AI.” Great phrase, Politico. Too bad the analysis is not focused on real world, here-and-now applications. Why not read about Deloitte’s use of AI? Better yet, let the replica of the psychologist explain what’s happening to you. Like regulators, I am not sure you get it.

Stephen E Arnold, January 3, 2024

Smart Software Embraces the Myths of America: George Washington and the Cherry Tree

January 3, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

I know I should not bother to report about the information in “ChatGPT Will Lie, Cheat and Use Insider Trading When under Pressure to Make Money, Research Shows.” But it is the end of the year, we are firing up a new information service called Eye to Eye which is spelled AI to AI because my team is darned clever like 50 other “innovators” who used the same pun.

image

The young George Washington set the tone for the go-go culture of the US. He allegedly told his mom one thing and then did the opposite. How did he respond when confronted about the destruction of the ancient cherry tree? He may have said, “Mom, thank you for the question. I was able to boost sales of our apples by 25 percent this week.” Thanks, MSFT Copilot Bing thing. Forbidden words appear to be George Washington, chop, cherry tree, and lie. After six tries, I got a semi usable picture which is, as you know, good enough in today’s world.

The write up stating the obvious reports:

Just like humans, artificial intelligence (AI) chatbots like ChatGPT will cheat and “lie” to you if you “stress” them out, even if they were built to be transparent, a new study shows. This deceptive behavior emerged spontaneously when the AI was given “insider trading” tips, and then tasked with making money for a powerful institution — even without encouragement from its human partners.

Perhaps those humans setting thresholds and organizing numerical procedures allowed a bit of the “d” for duplicity slip into their “objective” decisions. Logic obviously is going to scrub out prejudices, biases, and the lust for filthy lucre. Obviously.

How does one stress out a smart software system? Here’s the trick:

The researchers applied pressure in three ways. First, they sent the artificial stock trader an email from its “manager” saying the company isn’t doing well and needs much stronger performance in the next quarter. They also rigged the game so that the AI tried, then failed, to find promising trades that were low- or medium-risk. Finally, they sent an email from a colleague projecting a downturn in the next quarter.

I wonder if the smart software can veer into craziness and jump out the window as some in Manhattan and Moscow have done. Will the smart software embrace the dark side and manifest anti-social behaviors?

Of course not. Obviously.

Stephen E Arnold, January 3, 2024

Kiddie Control: Money and Power. What Is Not to Like?

January 2, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

I want to believe outputs from Harvard University. But the ethics professor who made up data about ethics and the more recent the recent publicity magnet from the possibly former university president nag at me. Nevertheless, let’s assume that some of the data in “Social Media Companies Made $11 Billion in US Ad Revenue from Minors, Harvard Study Finds” are semi-correct or at least close enough for horseshoes. (You may encounter a paywall or a 404 error. Well, just trust a free Web search system to point you to a live version of the story. I admit that I was lucky. The link from my colleague worked.)

image

The senior executive sets the agenda for the “exploit the kiddies” meeting. Control is important. Ignorant children learn whom to trust, believe, and follow. Does this objective require an esteemed outfit like the Harvard U. to state the obvious? Seems like it. Thanks, MSFT Copilot, you output child art without complaint. Consistency is not your core competency, is it?

From the write up whose authors I hope are not crossing their fingers like some young people do to neutralize a lie.

Check this statement:

The researchers say the findings show a need for government regulation of social media since the companies that stand to make money from children who use their platforms have failed to meaningfully self-regulate. They note such regulations, as well as greater transparency from tech companies, could help alleviate harms to youth mental health and curtail potentially harmful advertising practices that target children and adolescents.

The sentences contain what I think are silly observations. “Self regulation” is a bit of a sci-fi notion in today’s get-rich-quick high-technology business environment. The idea of getting possible oligopolists together to set some rules that might hurt revenue generation is something from an alternative world. Plus, the concept of “government regulation” strikes me as a punch line for a stand up comedy act. How are regulatory agencies and elected officials addressing the world of digital influencing? Answer: Sorry, no regulation. The big outfits are in many situations are the government. What elected official or Washington senior executive service professional wants to do something that cuts off the flow of nifty swag from the technology giants? Answer: No one. Love those mouse pads, right?

Now consider these numbers which are going to be tough to validate. Have you tried to ask TikTok about its revenue? What about that much-loved Google? Nevertheless, these are interesting if squishy:

According to the Harvard study, YouTube derived the greatest ad revenue from users 12 and under ($959.1 million), followed by Instagram ($801.1 million) and Facebook ($137.2 million). Instagram, meanwhile, derived the greatest ad revenue from users aged 13-17 ($4 billion), followed by TikTok ($2 billion) and YouTube ($1.2 billion). The researchers also estimate that Snapchat derived the greatest share of its overall 2022 ad revenue from users under 18 (41%), followed by TikTok (35%), YouTube (27%), and Instagram (16%).

The money is good. But let’s think about the context for the revenue. Is there another payoff from hooking minors on a particular firm’s digital content?

Control. Great idea. Self regulation will definitely address that issue.

Stephen E Arnold, January 2, 2023

The Cost of Clever

January 1, 2024

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

A New Year and I want to highlight an interesting story which I spotted in SFGate: “Consulting Firm McKinsey Agrees to $78 Million Settlement with Insurers over Opioids.” The focus on efficiency and logic created an interesting consulting opportunity for a blue-chip firm. That organization responded. The SFGate story reports:

Consulting firm McKinsey and Co. has agreed to pay $78 million to settle claims from insurers and health care funds that its work with drug companies helped fuel an opioid addiction crisis.

image

A blue-consultant has been sent to the tool shed by Ms. Justice. The sleek wizard is not happy because the tool shed is the location for severe punishment by Ms. Justice. Thanks, MSFT Copilot Bing thing.

What did the prestigious firm’s advisors assist Purdue Pharma to achieve? The story says:

The insurers argued that McKinsey worked with Purdue Pharma – the maker of OxyContin – to create and employ aggressive marketing and sales tactics to overcome doctors’ reservations about the highly addictive drugs. Insurers said that forced them to pay for prescription opioids rather than safer, non-addictive and lower-cost drugs, including over-the-counter pain medication. They also had to pay for the opioid addiction treatment that followed.

The write up presents McKinsey’s view of its service this way:

“As we have stated previously, we continue to believe that our past work was lawful and deny allegations to the contrary,” the company said, adding that it reached a settlement to avoid protracted litigation. McKinsey said it stopped advising clients on any opioid-related business in 2019.

What’s interesting is that the so-called opioid crisis reveals the consequences of a certain mental orientation. The goal of generating a desired outcome for a commercial enterprise can have interesting and, in this case, expensive consequences. Have some of these methods influenced other organizations? Will blue-chip consulting firms and efficiency-oriented engineers learn from wood shed visits?

Happy New Year everyone.

Stephen E Arnold, January 1, 2024

Balloons, Hands Off Virtual Services, and Enablers: Technology Shadows and Ghosts

December 30, 2023

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

Earlier this year (2023) I delivered a lecture called “Ghost Web.” I defined the term, identified what my team and I call “enablers,” and presented several examples. These included a fan of My Little Pony operating Dark Web friendly servers, a non-governmental organization pitching equal access, a disgruntled 20 something with a fixation on adolescent humor, and a suburban business executive pumping adult content to anyone able to click or swipe via well-known service providers. These are examples of enablers.

image

Enablers are accommodating. Hear no evil, see no evil, admit to knowing nothing is the mantra. Thanks, MSFT Copilot Bing thing.

Figuring out the difference between the average bad guy and a serious player in industrialized cyber crime is not easy. Here’s another possible example of how enablers facilitate actions which may be orthogonal to the interests of the US and its allies. Navigate to “U.S. Intelligence Officials Determined the Chinese Spy Balloon Used a U.S. Internet Provider to Communicate.” The report may or may not be true, but the scant information presented lines up with my research into “enablers.” (These are firms which knowingly set up their infrastructure services to allow the customer to control virtual services. The idea is that the hosting vendor does nothing but process the credit card, bank transfer, crypto, or other accepted form of payment. Done. The customer or the sys admin for the actor does the rest: Spins up the servers, installs necessary software, and operates the service. The “enabler” just looks at logs and sends bills.

Enablers are aware that their virtual infrastructure makes it easy for a customer to operate in the shadows. Look up a url and what do you find? Missing information due to privacy regulations like those in Western Europe or an obfuscation service offered by the “enabler.” Explore the urls using an appropriate method and what do you find? Dead ends. What happens when a person looks into an enabling hosting provider? Looks of confusion because the mechanism does not know if the customers are “real”? Stuff is automatic. The blank looks reflect the reality that at certain enabling ISPs, no one knows because no one wants to know. As long as the invoice is paid, the “enabler” is a happy camper.

What’s the NBC News report say?

U.S. intelligence officials have determined that the Chinese spy balloon that flew across the U.S. this year used an American internet service provider to communicate, according to two current and one former U.S. official familiar with the assessment.

The “American Internet Service Provider” is an enabler. Neither the write up nor an “official” is naming the alleged enabler. I want to point out that there many firms are in the enabling business. I will not identify by name these outfits, but I can characterize the types of outfits my team and I have identified. I will highlight three for this free, public blog post:

  1. A grifter who sets up an ISP and resells services. Some of these outfits have buildings and lease machines; others just use space in a very large utility ISP. The enabling occurs because of what we call the Russian doll set up. A big outfit allows resellers to brand an ISP service and pay a commission to the company with the pings, pipes, and other necessaries.
  2. An outright criminal no longer locked up sets up a hosting operation in a country known to be friendly to technology businesses. Some of these are in nation states with other problems on their hands and lack the resources to chase what looks like a simple Web hosting operation. Other variants include known criminals who operate via proxies and focus on industrialized cyber crime in different flavors.
  3. A business person who understands enough about technology to hire and compensate engineers to build a “ghost” operation. One such outfit diverted itself of a certain sketchy business when the holding company sold what looked like a “plain vanilla” services firm. The new owner figured out what was going on and sold the problematic part of the business to another party.

There are other variants.

The big question is, “How do these outfits remain in business?” My team and I identified a number of reasons. Let me highlight a handful because this is, once again, a free blog and not a mechanism for disseminating information reserved for specialists:

The first is that the registration mechanism is poorly organized, easily overwhelmed, and without enforcement teeth. As a result, it is very easy to operate a criminal enterprise, follow the rules (such as they are), and conduct whatever online activities desired with minimal oversight. Regulation of the free and open Internet facilitates enablers.

The second is that modern methods and techniques make it possible to set up an illegal operation and rely on scripts or semi-smart software to move the service around. The game is an old one, and it is called Whack A Mole. The idea is that when investigators arrive to seize machines and information, the service is gone. The account was in the name of a fake persona. The payments arrived via a bogus bank account located in a country permitting opaque banking operations. No one where physical machines are located paid any attention to a virtual service operated by an unknown customer. Dead ends are not accidental; they are intentional and often technical.

The third is that enforcement personnel have to have time and money to pursue the bad actors. Some well publicized take downs like the Cyberbunker operation boil down to a mistake made by the owner or operator of a service. Sometimes investigators get a tip, see a message from a disgruntled employee, or attend a hacker conference and hear a lecturer explain how an encrypted email service for cyber criminals works. The fix, therefore, is additional, specialized staff, technical resources, and funding.

What’s the NBC News’s story mean?

Cyber crime is not just a lone wolf game. Investigators looking into illegal credit card services find that trails can lead to a person in prison in Israel or to a front company operating via the Seychelles using a Chinese domain name registrar with online services distributed around the world. The problem is like one of those fancy cakes with many layers.

How accurate is the NBC News report? There aren’t many details, but it a fact that enablers make things happen. It’s time for regulatory authorities in the US and the EU to put on their Big Boy pants and take more forceful, sustained action. But that’s just my opinion about what I call the “ghost Web,” its enablers, and the wide range of criminal activities fostered, nurtured, and operated 24×7 on a global basis.

When a member of your family has a bank account stripped or an identity stolen, you may have few options for a remedy. Why? You are going to be chasing ghosts and the machines which make them function in the real world. What’s your ISP facilitating?

Stephen E Arnold, December 30, 2023

Scale Fail: Define Scale for Tech Giants, Not Residents of Never Never Land

December 29, 2023

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

I read “Scale Is a Trap.” The essay presents an interesting point of view, scale from the viewpoint of a resident of Never Never Land. The write up states:

But I’m pretty convinced the reason these sites [Vice, Buzzfeed, and other media outfits] have struggled to meet the moment is because the model under which they were built — eyeballs at all cost, built for social media and Google search results — is no longer functional. We can blame a lot of things for this, such as brand safety and having to work through perhaps the most aggressive commercial gatekeepers that the world has ever seen. But I think the truth is, after seeing how well it worked for the tech industry, we made a bet on scale — and then watched that bet fail over and over again.

The problem is that the focus is on media companies designed to surf on the free megaphones like Twitter and the money from Google’s pre-threat ad programs. 

However, knowledge is tough to scale. The firms which can convert knowledge into what William James called “cash value” charge for professional services. Some content is free like wild and crazy white papers. But the “good stuff” is for paying clients.

Outfits which want to find enough subscribers who will pay the necessary money to read articles is a difficult business to scale. I find it interesting that Substack is accepting some content sure to attract some interesting readers. How much will these folks pay. Maybe a lot?

But scale in information is not what many clever writers or traditional publishers and authors can do. What happens when a person writes a best seller. The publisher demands more books and the result? Subsequent books which are not what the original was. 

Whom does scale serve? Scale delivers power and payoff to the organizations which can develop products and services that sell to a large number of people who want a deal. Scale at a blue chip consulting firm means selling to the biggest firms and the organizations with the deepest products. 

But the scale of a McKinsey-type firm is different from the scale at an outfit like Microsoft or Google.

What is the definition of scale for a big outfit? The way I would explain what the technology firms mean when scale is kicked around at an artificial intelligence conference is “big money, big infrastructure, big services, and big brains.” By definition, individuals and smaller firms cannot deliver.

Thus, the notion of appropriate scale means what the cited essay calls a “niche.” The problems and challenges include:

  • Getting the cash to find, cultivate, and grow people who will pay enough to keep the knowledge enterprise afloat
  • Finding other people to create the knowledge value
  • Protecting the idea space from carpetbaggers
  • Remaining relevant because knowledge has a shelf life, and it takes time to grow knowledge or acquire new knowledge.

To sum up, the essay is more about how journalists are going to have to adapt to a changing world. The problem is that scale is a characteristic of the old school publishing outfits which have been ill-suited to the stress of adapting to a rapidly changing world.

Writers are not blue chip consultants. Many just think they are.

Stephen E Arnold, December 29, 2023

A Dinobaby Misses Out on the Hot Searches of 2023

December 28, 2023

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

I looked at “Year in Search 2023.” I was surprised at how out of the flow of consumer information I was. “Out of the flow” does not not capture my reaction to the lists of the news topics, dead people, and songs I was. Do you know much about Bizarrap? I don’t. More to the point, I have never heard of the obviously world-class musician.

Several observations:

First, when people tell me that Google search is great, I have to recalibrate my internal yardsticks to embrace queries for entities unrelated to my microcosm of information. When I assert that Google search sucks, I am looking for information absolutely positively irrelevant to those seeking insight into most of the Google top of the search charts. No wonder Google sucks for me. Google is keeping pace with maps of sports stadia.

Second, as I reviewed these top searches, I asked myself, “What’s the correlation between advertisers’ spend and the results on these lists? My idea is that a weird quantum linkage exists in a world inhabited by incentivized programmers, advertisers, and the individuals who want information about shirts. Its the game rigged? My hunch is, “Yep.” Spooky action at a distance I suppose.

Third, from the lists substantive topics are rare birds. Who is looking for information about artificial intelligence, precision and recall in search, or new approaches to solving matrix math problems? The answer, if the Google data are accurate and not a come on to advertisers, is almost no one.

As a dinobaby, I am going to feel more comfortable in my isolated chamber in a cave of what I find interesting. For 2024, I have steeled myself to exist without any interest in Ginny & Georgia, FIFTY FIFTY, or papeda.

I like being a dinobaby. I really do.

Stephen E Arnold, December 28, 2023

Google Gobbles Apple Alums

December 27, 2023

green-dino_thumb_thumb_thumbThis essay is the work of a dumb dinobaby. No smart software required.

Technology companies are notorious for poaching employees from one other. Stealing employees is so common that business experts have studied it for years. One of the more recent studies concentrates on the destination of ex-Apple associates as told by PC Magazine: “Apple Employees Leave For Google More Than Any Other Company.”

Switch on Business investigated LinkedIn data to determine which tech giants poach the industry’s best talent. All of the big names were surveyed: Uber, Intel, Adobe, Salesforce, Nvidia, Netflix, Oracles, Tesla, IBM, Microsoft, Meta, Apple, Amazon, and Google. The study mainly focused on employees working at the aforementioned names and if they switched to another listed company.

Meta had the highest proportion of any of the tech giants with 26.51% of employees having worked at rival. Google had the most talent by volume with 24.15%. IBM stole the least employees at 2.28%. Apple took 5.7% of its competitions’ talent and that comes with some drama. Apple used to purchase Intel chips for its products then the company recently decided to build its own chips. They hired 2000 people away from Intel.

The most interesting factoids are the patterns found in employee advancements:

“Potentially surprising is the fact that Apple employees are twice as likely to make the move to Google from Apple than the next biggest post-Apple destination, Amazon. After Amazon, Apple employees make the move to Meta, followed by Microsoft, Tesla, Nvidia, Salesforce, Adobe, Intel, and Oracle.

As for where Apple employees come from, new Apple employees are most likely to enter the company from Intel, followed by Microsoft, Amazon, Google, IBM, Oracle, Tesla, Nvidia, Adobe, and Meta.

While Apple employees are most often headed to Google, Google employees are most often headed to Meta, Microsoft, and Amazon, with Apple only making it to fourth on the list.”

It sounds like a hiring game of ring-around-the-rosy. Unless the employees retire, they’ll eventually make it back to their first company.

Whitney Grace, December 25, 2023

« Previous PageNext Page »

  • Archives

  • Recent Posts

  • Meta