Useless Search Results? Thank Advertising

September 17, 2021

We thought this was obvious. The Conversation declares, “Google’s ‘Pay-Per-Click’ Ad Model Makes it Harder to Find What You’re Looking For.” Writers Mohiuddin Ahmed and Paul Haskell-Dowland begin by pointing out “to google” has literally become synonymous with searching online via any online search platform. Indeed, Google has handily dominated the online search business, burying some competitors and leaving the rest in the dust. Not coincidentally, the company also rules the web browser and online advertising markets. As our dear readers know, Google is facing pushback from competition and antitrust regulators in assorted countries. However, this article addresses the impact on search results themselves. The authors report:

“More than 80% of Alphabet’s revenue comes from Google advertising. At the same time, around 85% of the world’s search engine activity goes through Google. Clearly there is significant commercial advantage in selling advertising while at the same time controlling the results of most web searches undertaken around the globe. This can be seen clearly in search results. Studies have shown internet users are less and less prepared to scroll down the page or spend less time on content below the ‘fold’ (the limit of content on your screen). This makes the space at the top of the search results more and more valuable. In the example below, you might have to scroll three screens down before you find actual search results rather than paid promotions. While Google (and indeed many users) might argue that the results are still helpful and save time, it’s clear the design of the page and the prominence given to paid adverts will influence behavior. All of this is reinforced by the use of a pay-per-click advertising model which is founded on enticing users to click on adverts.”

We are reminded Google-owned YouTube is another important source of information for billions of users, and it is perhaps the leading platform for online ads. In fact, these ads now intrude on videos at a truly annoying rate. Unless one pays for a Premium subscription, of course. Ahmed and Haskell-Dowland remind us alternatives to Google Search exist, with the usual emphasis on privacy-centric DuckDuckGo. They conclude by pointing out other influential areas in which Google plays a lead role: AI, healthcare, autonomous vehicles, cloud computing, computing devices, and the Internet of Things. Is Google poised to take over the world? Why not?

Cynthia Murrell, September 17, 2021

Lucky India. Google Wants to Help

September 16, 2021

Google seeks to clear up a misunderstanding. Odisha’s OrissaPost reports, “Google Says Firmly Sees Itself as Partner to India’s Financial Ecosystem.” At issue is Google Pay and its Spot platform. It sounds like some reports about its partnerships with banks may have given the impression Google is trying to supplant or undermine existing financial institutions in India. We learn:

“The company emphasized that in every geography where Google Pay is present, its stance is consistently one of partnering with the existing financial services and banking systems to help scale and enable frictionless delivery of financial products and services and contribute to the goal of financial inclusion. In a blogpost, Google India said there have been a few instances where these offerings have been reported as ‘Google Pay’s offerings’, which fuels misinterpretation. ‘To be clear, we have always looked at our role firmly as a partner to the existing financial ecosystem that brings unique skill sets and offerings to drive further adoption of digital payments in the country,’ it said. … The internet major also noted that its Spot platform works as an additional discovery channel for many businesses to build and offer new experiences to users to drive adoption of their services. The use cases span across ticket purchase, food ordering, paying for essential services like utility bills, shopping and getting access to various financial products.”

See the write-up or Google India’s blog post for more specific details. The company emphasizes that bringing partners onto the Google Pay platform connects them to customers around India who would otherwise be unable to access their services, helping to “level social inequalities.” Aw, Google, always looking out for the little guy, aren’t you?

Cynthia Murrell, September 16, 2021

Silicon Valley: Fraud or Fake Is an Incorrect Characterization

September 10, 2021

I read “Elizabeth Holmes: Has the Theranos Scandal Changed Silicon Valley?” The write up contains a passage I found interesting; to wit:

In Silicon Valley, hyping up your product – over-promising – isn’t unusual…

Marketing is more important than the technology sold by the cash hype artists. Notice that I don’t use the word “entrepreneur,” “innovator,” “programmer,” or the new moniker “AIOps” (that’s artificial intelligence operations).

The Theranos story went wrong because the company never delivered even a “good enough” method. The fact that Theranos could not cook up a marginally better way of testing blood is less interesting than the money angle. She had plenty of money, and her failure came in what I call the transition from PowerPoint to “good enough.”

Why not pull a me-too and change the packaging? Why not license a method from Eastern Europe or Thailand and rebrand it? Why not white label a system known to work, offer a discount, and convince the almost clueless Walgreens-type operation that the zirconia was dug out of a hole in a far-off country?

Each of these methods has been used to allow an exit strategy with honor and not a career-ending Tesla-like electric battery fire which burns for days.

The write up explains:

Particularly at an early stage, when a start-up is in its infancy, investors are often looking at people and ideas rather than substantive technology anyway. General wisdom holds that the technology will come with the right concept – and the right people to make it work. Ms Holmes was brilliant at selling that dream, exercising a very Silicon Valley practice: ‘fake it until you make it’. Her problem was she couldn’t make it work.

The transgression, in my opinion, was a failure to use a me-too model. That points to what I call a denial of reality.

Here are some examples of not-so-good solutions that delivered a disappointing product or service to users yet flourished. How many of these have entered your personal ionosphere?

  1. Proprietary app stores that offer mobile software which turns out to be malware? The purpose of the proprietary app store is to prevent malfeasance, right?
  2. Operating systems which cannot provide security? My newsfeed is stuffed full of breaches, intrusions, phishing scams, and cloud vulnerabilities. How about that Microsoft Exchange and Azure security or the booming business of NSO Group-types of surveillance functionality?
  3. Self-driving vehicles anyone? Sorry, not for me.
  4. Smart software which is tuned to deliver irrelevant advertising despite a service’s access to browser history, user location, and email? If I see one more ad for Grammarly or Ke Chava when I watch a Thomas Gast French Foreign Legion video in German, I may have a stroke. (Smart software is great, isn’t it? Just like ad-supported Web search results!)
  5. Palantir-type systems are the business intelligence solutions for everyone with a question and deep pockets.

The article is interesting, but it sidesteps the principal reason why Theranos has become a touchstone for some people. The primum movens from my vantage point is:

There are no meaningful consequences: For the funders. For the educational institutions. For the “innovators.”

The people who get hurt are not part of the technology club. Maybe Ms. Holmes, the “face” of Theranos, will go to jail, be slapped with a digital scarlet A, and end up begging in Berkeley?

I can’t predict the future, but I can visualize a Michael Milken-type or Kevin Mitnick-type of phoenixing after walking out of jail.

Theranos is a consequence of the have and have-not technology social construct. Technology is a tool. Ms. Holmes cut off her finger in woodworking class. That’s sort of embarrassing. Repurposing is so darned obvious and easy.

More adept pioneers have done the marketing thing and made a me-too approach to innovation work. But it does not matter. This year has been a good one for start-ups. Get your digital currency. Embrace AIOps. Lease a self-driving vehicle. Use TikTok. No problem.

Stephen E Arnold, September 10, 2021

More AI Bias? Seems Possible

September 10, 2021

Freddie Mac and Fannie Mae are stuck in the past—the mid-1990s, to be specific, when the Classic FICO loan-approval software was developed. Since those two quasi-government groups basically set the rules for the mortgage industry, their reluctance to change is bad news for many would-be home buyers and their families. The Markup examines “The Secret Bias Hidden in Mortgage-Approval Algorithms.” Reporters Emmanuel Martinez and Lauren Kirchner reveal what their organization’s research has uncovered:

“An investigation by The Markup has found that lenders in 2019 were more likely to deny home loans to people of color than to white people with similar financial characteristics — even when we controlled for newly available financial factors the mortgage industry for years has said would explain racial disparities in lending. Holding 17 different factors steady in a complex statistical analysis of more than two million conventional mortgage applications for home purchases, we found that lenders were 40 percent more likely to turn down Latino applicants for loans, 50 percent more likely to deny Asian/Pacific Islander applicants, and 70 percent more likely to deny Native American applicants than similar White applicants. Lenders were 80 percent more likely to reject Black applicants than similar White applicants. These are national rates. In every case, the prospective borrowers of color looked almost exactly the same on paper as the White applicants, except for their race.”
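The Markup’s code and full variable list are not reproduced in the article, but the “holding 17 different factors steady” language describes a regression that controls for covariates and reports adjusted odds. A minimal sketch of that kind of analysis, with a hypothetical file name and column names standing in for the real HMDA fields, might look like this:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names, used only to illustrate the method.
loans = pd.read_csv("hmda_2019_conventional.csv")

# Logistic regression of loan denial on race/ethnicity plus financial
# controls such as debt-to-income and loan-to-value ratios.
model = smf.logit(
    "denied ~ C(race, Treatment(reference='White')) + dti + ltv"
    " + np.log(income) + np.log(loan_amount)",
    data=loans,
).fit()

# Exponentiated coefficients are adjusted odds ratios: a value of 1.8 on a
# race indicator reads as roughly 80 percent higher odds of denial than
# for otherwise similar White applicants.
print(np.exp(model.params).round(2))
```

The Markup’s actual model reportedly holds many more factors constant than this toy version; the sketch is only meant to show how “X percent more likely, controlling for Y” statements are produced.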

Algorithmic bias is a known and devastating problem in several crucial arenas, but recent years have seen efforts to mitigate it with better data sets and tweaked machine-learning processes. Advocates as well as professionals in the mortgage and housing industries have been entreating Fannie and Freddie to update their algorithm since 2014. Several viable alternatives have been developed but the Federal Housing Finance Agency, which oversees those entities, continues to drag its heels. No big deal, insists the mortgage industry—bias is just an illusion caused by incomplete data, representatives wheedle. The Markup’s research indicates otherwise. We learn:

“The industry had criticized previous similar analyses for not including financial factors they said would explain disparities in lending rates but were not public at the time: debts as a percentage of income, how much of the property’s assessed worth the person is asking to borrow, and the applicant’s credit score. The first two are now public in the Home Mortgage Disclosure Act data. Including these financial data points in our analysis not only failed to eliminate racial disparities in loan denials, it highlighted new, devastating ones.”

For example, researchers found high-earning Black applicants with less debt get rejected more often than white applicants with similar income but more debt. See the article for more industry excuses and the authors’ responses, as well as some specifics on the mechanisms of systemic racism and how location affects results. There are laws on the books that should make such discrimination a thing of the past, but they are difficult to enforce. An outdated algorithm shrouded in secrecy makes it even more so. The Federal Housing Finance Agency has been studying its AI’s bias and considering alternatives for five years now. When will it finally make a change? Families are waiting.

Cynthia Murrell, September 10, 2021

Techno-Psych: Perception, Remembering a First Date, and Money

September 9, 2021

Navigate to “Investor Memory of Past Performance Is Positively Biased and Predicts Overconfidence.” Download the PDF of the complete technical paper at this link. What will you find? Scientific verification of a truism; specifically, people remember good times and embellish those memories with sprinkles.

The write up explains:

First, we find that investors’ memories for past performance are positively biased. They tend to recall returns as better than achieved and are more likely to recall winners than losers. No published paper has shown these effects with investors. Second, we find that these positive memory biases are associated with overconfidence and trading frequency. Third, we validated a new methodology for reducing overconfidence and trading frequency by exposing investors to their past returns.

The issue at hand is investors who know they are financial poobahs. Mix this distortion of reality with technology and what does one get? My answer to this question is, “NFTs for burned Banksy art.”

The best line in the academic study, in my view, is:

Overconfidence is hazardous to your wealth.

Who knew? My answer is the 2004 paper called “Overconfidence and the Big Five.” I also think of my 89-year-old great-grandmother, who told me when I was 13, “Don’t be overconfident.”

I wonder if the Facebook artificial intelligence wizards were a bit too overconfident in the company’s smart software. There was, if I recall, a question about metatagging a human as a gorilla.

Stephen E Arnold, September 9, 2021

Taliban: Going Dark

September 3, 2021

I spotted a story from the ever reliable Associated Press called “Official Taliban Websites Go Offline, Though Reasons Unknown.” (Note: I am terrified of the AP because quoting is an invitation for this outfit to let loose its legal eagles. I don’t like this type of bird.)

I can, I think, suggest you read the original write up. I recall that the “real” news story revealed some factoids I found interesting; for example:

  • Taliban Web sites “protected” by Cloudflare have been disappeared. (What’s that suggest about Cloudflare’s Web performance and security capabilities?)
  • Facebook has disappeared some Taliban info and maybe accounts.
  • The estimable Twitter keeps PR maven Z. Mujahid’s tweets flowing.

I had forgotten that the Taliban is not a terrorist organization. I try to learn something new each day.

Stephen E Arnold, September 3, 2021

Why Big Tech Is Winning: The UK Admission

August 31, 2021

I read “UK’s FCA Say It Is Not Capable of Supervising Crypto Exchange Binance.” This is a paywalled story, and I am not sure how much attention it will get. As Spotify is learning from locking up the estimable Joe Rogan, paywalls make sense to a tiny slice of one’s potential audience.

The story is an explanation of government helplessness when it comes to fintech, or financial technology. The FCA acronym means Financial Conduct Authority. Think about London. Think about the wizards who cooked up some nifty digital currency methods at assorted UK universities less than one hour from the Pickle. Think about a government agency with near instant access to the wonks at the National Crime Agency, the quiet ones at Canary Wharf, and the interesting folks in Cheltenham. Now consider this passage from the write up:

… the Financial Conduct Authority said that Binance’s UK affiliate had “failed to” respond to some of its basic queries, making it impossible to oversee the sprawling group, which has no fixed headquarters and offers services around the world. The admission underscores the scale of the challenge facing authorities in tackling potential risks to consumers buying frequently unregulated products through nimble crypto currency businesses, which can often circumvent national bans by giving users access to facilities based overseas.

Hello? Rural Kentucky calling, is anyone at work?

Let’s step back. I need to make one assumption; that is, government entities have authority and power. What this write up makes clear is that when it comes to technology, the tech outfits have the authority and the power.

Not good in my opinion for the “consumer” and maybe for some competitors. Definitely not good for enforcement authorities.

Who finds sun shining through the clouds after reading this Financial Times story? I would wager that tech-centric outfits are thinking about a day or more at the beach. No worries. And look. Here comes Snoop Dogg handing out free beer. What a day!

Stephen E Arnold, August 31, 2021

Fancy Code? Nope, Just Being Nice to Apple Customer Care

August 25, 2021

I continue to be fascinated by the number of cyber security companies reporting new exploits. If an exploit is a hot ticket, should not multiple cyber security threat identification services report a breach? Maybe, but the reality is that some expensive and often exotic smart software fumbles the ball.

How do bad actors gain access to what these individuals perceive as high value targets? It is not a team of hackers sponsored by a rogue state or a tech-literate oligarch. The crime often is the anti-security action of a single individual.

Lone wolves being nice is a technique not captured by artificially intelligent, over-hyped platforms. “La Puente Man Steals 620,000 iCloud Photos in Plot to Find Images of Nude Women” may be an example of the methods which can penetrate the security of outfits which tout their concerns about privacy and take pains to publicize how secure their online systems, services, and products are.

The allegedly accurate write up states:

Chi, who goes by David, admitted that he impersonated Apple customer support staff in emails that tricked unsuspecting victims into providing him with their Apple IDs and passwords, according to court records. He gained unauthorized access to photos and videos of at least 306 victims across the nation, most of them young women, he acknowledged in his plea agreement with federal prosecutors in Tampa, Fla.

The “real” news report added some color to this action:

Chi said he hacked into the accounts of about 200 of the victims at the request of people he met online. Using the moniker “icloudripper4you,” Chi marketed himself as capable of breaking into iCloud accounts to steal photos and videos, he admitted in court papers. Chi acknowledged in court papers that he and his unnamed co-conspirators used a foreign encrypted email service to communicate with each other anonymously. When they came across nude photos and videos stored in victims’ iCloud accounts, they called them “wins,” which they collected and shared with one another.

What’s happening in this example?

  • Social engineering
  • Pretending to be a concerned professional at a big company
  • A distributed group of anti security types who don’t know one another too well
  • Victims.

Net net: Fancy security systems are indeed fancy. The security part is different from what bad actors are doing. That’s a bit of a problem for outfits like Microsoft and T-Mobile, among others.

Stephen E Arnold, August 25, 2021

Amazon AWS: Personalization? What Is That? Who Cares?

August 23, 2021

I read the impassioned “AWS Doesn’t Know Who I Am. Here’s Why That’s A Problem.” The individual appears to perceive himself as an Amazon-savvy professional. I learned:

My name is Ben Kehoe. I’m an AWS Serverless Hero. I’ve spoken at re:Invent. I meet regularly with teams across AWS. I’m followed by @awscloud on Twitter. But AWS doesn’t know who I am.

There are examples of services which pay attention to the “identity” or “alleged identity” of a user. These are helpful examples, and I liked the inclusion of Microsoft GitHub as an outfit that appears to care about an individual’s or a persona’s identity.

The write up includes the many tokens used to keep track of an AWS user or account. There is, it seems, no meta-token basket. Thus, instead of being a single entity, there are many separate AWS entities.
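Ben Kehoe’s post walks through those tokens in detail; the short version is that every AWS credential answers “which account and role is this?” and never “which human is this?” Here is a minimal sketch, assuming boto3 and two hypothetical named credential profiles, of what that fragmentation looks like in practice:

```python
import boto3

# Each named profile is a separate set of credentials; AWS answers the
# "who am I" question per account and role, not per human being.
for profile in ("personal-sandbox", "work-dev"):  # hypothetical profile names
    session = boto3.Session(profile_name=profile)
    identity = session.client("sts").get_caller_identity()
    print(profile, identity["Account"], identity["Arn"])

# The two ARNs share nothing. There is no higher-level object on the AWS
# side that ties them to the same person.
```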

Several thoughts occurred to me:

  1. Fragmenting makes it easier to assess fees for the hard-to-track services that each part of an entity incurs. Why make it easy to manage AWS fees?
  2. Like security, Amazon AWS shifts the burden from the utility to the person, entity, or software process. My hunch is that the approach allows AWS to say, “Not our problem.”
  3. Amazon and AWS require that users and entities recognize that the company is, in effect, a person. Most people forget that a commercial enterprise may have more rights than a humanoid.

Net net: Amazon has no incentive to care about anyone, including Ben Kehoe, unless the corporate person benefits, in my opinion. Humans want to be perceived as unique. AWS is not mom. Thus, the problem is not Amazon’s.

Stephen E Arnold, August 23, 2021

Federated AI: A Garden of Eden. Will There Be a Snake or Two?

August 23, 2021

I read “Eden AI Launches Platform to Unify ML APIs.” I had two immediate reactions. The first was content marketing, and the second was that there was a dark side to the Garden of Eden, wasn’t there?

Eden is a company pulling a meta-play or leveling up. The idea is that one can pop up higher, pull disparate items together, and create a new product or service.

This works for outfits ranging from a plumbing supply company serving smaller towns to an outfit like the Bezos bulldozer. Why not apply this model to the rock-solid world of machine learning application programming interfaces?

The write up states:

… using Eden AI, a company could feed a document in Chinese into Google Cloud Platform’s optical character recognition service to extract its contents. Then it could have an IBM Watson model translate the extracted Chinese characters into English words and queue up an Amazon Web Services API to analyze for keywords. Eden AI makes money by charging providers a commission on the revenues generated by its platform.
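The article does not show Eden AI’s actual SDK, so the sketch below is only a hypothetical illustration of the meta-layer pattern it describes: one thin dispatch table in front of three vendors, with placeholder functions standing in for the real Google, IBM, and AWS clients.

```python
# Hypothetical sketch of a "meta-API" layer. The function names and stub
# bodies are placeholders, not Eden AI's real interface or the vendors'
# real SDK calls.
from typing import Callable, Dict, List

def google_ocr(document: bytes) -> str:
    # A real implementation would call Google Cloud's OCR service here.
    return "<extracted Chinese text>"

def watson_translate(text: str, target: str = "en") -> str:
    # A real implementation would call IBM Watson's translation service here.
    return "<English translation>"

def aws_keywords(text: str) -> List[str]:
    # A real implementation would call an AWS text-analysis API here.
    return ["<keyword one>", "<keyword two>"]

PROVIDERS: Dict[str, Callable] = {
    "ocr": google_ocr,
    "translate": watson_translate,
    "keywords": aws_keywords,
}

def run_pipeline(document: bytes) -> List[str]:
    # Chinese document in, English keywords out: three vendors, one call.
    text = PROVIDERS["ocr"](document)
    english = PROVIDERS["translate"](text)
    return PROVIDERS["keywords"](english)

print(run_pipeline(b"..."))
```

Every hop in a chain like this is another network round trip to another vendor, plus a dependency on that vendor’s API staying put.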

Latency? Apparently no problem. The cost of maintaining the meta-code as the underlying APIs change? Apparently no problem. Competition from outfits like Microsoft, which, whether the technology works or not, wants to maintain its role as the go-to place for advanced whatevers? No problem.

Someday.

Stephen E Arnold, August 23, 2021
