Truth and Justice the Amazon Apple Way
August 3, 2021
At the request of its good friend Amazon, Apple has come down on the side of preserving flawed and biased guidance. Mashable tells us that “Apple Boots App that Called BS on Fake Amazon Reviews from App Store.” Reporter Jack Morse writes:
“Publicly calling out frauds has always been a risky proposition. That reality came crashing down hard Friday for an app designed to spot fake Amazon reviews, after Apple kicked it out of its App Store. Apple confirmed in a statement that it removed the app after Amazon reached out. The news of Fakespot’s outing was first reported by The Verge. We reached out to Apple, Amazon, and Fakespot to confirm the Verge’s reporting. An Apple spokesperson provided a statement, attributed to the company, which says Amazon kicked off the inter-company beef early in June. It also insists that Apple attempted to give both parties time to work things out.”
Apple frames the issue as a matter of intellectual property rights, and insists it tried to work with Fakespot before removing the app. Saoud Khalifah, Fakespot’s founder and CEO, disagrees. He stated in a phone interview:
“Apple are claiming that they gave us a notice that they are going to take us down, but these are all template emails that seem to be from a robot. Anyone would be disappointed with this whole process, especially when your livelihood depends on it.”
Amazon claims Fakespot spreads misleading information, harms its sellers’ businesses, and even creates security risks. Sure. The app attracted attention in 2019 when it reported a surge in fake reviews around the much-hyped Amazon Prime Day. The existence of fake reviews is a known problem, one the FTC has taken action against. We are also reminded of all the counterfeit products that plague the commerce site and the fake reviews that keep them moving. Nevertheless, Apple has decided it is Fakespot that is in the wrong here. Ah, capitalism at its finest.
Cynthia Murrell, August 2, 2021
Is MIT Dissing Its CompSci Grads, Maybe the Google, IBM, and Possibly AI in General?
August 3, 2021
Hey, what does one expect from an outfit which did some Fancy Dancing with alleged human trafficker Jeffrey Epstein? I don’t expect much. It was amusing to me to read “Hundreds of AI Tools Have Been Built to Catch Covid. None of Them Helped.” Why am I laughing? Well, there are the MIT spawned smart software systems populating architecturally disappointing structures in the Boston area. There is also the really nifty teaming with IBM Watson (yep, the smart software system which is less exciting than Red Hat when it comes to tickling shareholders’ fancies). Watson, as you may recall, is allegedly the first artificial intelligence system to be placed on a patient trolley and bustled out of the emergency room exit by a clutch of cancer docs.
The referenced write up makes clear that Covid is either smarter than the smartest people in the world, or the smartest people in the world are dumber than their résumés suggest. The truth, I admit, might be somewhere in the middle of tenure squabbles, non-reproducible results, and good old P.T. Barnum malarkey.
The write up states:
The AI community, in particular, rushed to develop software that many believed would allow hospitals to diagnose or triage patients faster, bringing much-needed support to the front lines—in theory. In the end, many hundreds of predictive tools were developed. None of them made a real difference, and some were potentially harmful.
Yo, what’s this harm thing? Like Google’s brilliant progress on solving death, the hubris and rah rah about what a PhD demonstration implies are quite different from what those Rube Goldberg constructions of downloadable code actually deliver.
The write up drags a reader through case examples of baloney. The write up documents failure. Yep, F. Failure for whiz kids who never experienced a setback which a helicopter mom or proud PhD mentor couldn’t address. A phone call from donors like Mr. Epstein probably helped too.
So the big question is posed by the estimable MIT cuddled write up: What went wrong?
What’s the answer?
Guess what. Lots. Bad data, wonky algorithms, statistical drift, grant crazed researchers.
This is a surprise?
Nope. In a nutshell, the entire confection of smart software’s capabilities is deconstructed in my opinion:
In a sense, this is an old problem with research. Academic researchers have few career incentives to share work or validate existing results. There’s no reward for pushing through the last mile that takes tech from “lab bench to bedside.”
Should I bring up Mr. Epstein’s penchant for bedside activities? History will have to judge which is the more problematic social behavior.
Stephen E Arnold, August 3, 2021
NSO Group: Now the Women Allegedly Harmed Gain Media Traction. Wowza!
August 2, 2021
I read “I Will Not Be Silenced: Women Targeted in Hack and Leak Attacks Speak Out about Spyware.” My first reaction to the story was, “How many college sociology and poli-sci classes will make NSO Group, its product Pegasus, and the implications of ‘targeting’ a subject for a case study, discussion groups, and papers?” My second thought was, “NSO Group has been able to watch the ripples of intelware crashing against the awareness of the naïve, the clueless, and the mobile phone addicts.”
I don’t know if the peacock’s news report is accurate or just one of those weird bird noises made by the species. That probably doesn’t matter because the write up pulls in women and hooks intelware to a quite magnetic topic: The treatment of women.
The peacock squawked:
Female journalists and activists say they had their private photos shared on social media by governments seeking to intimidate and silence them.
Now that’s a heck of an assertion. True or not, the idea of “personal” pix nestling in distributed and local storage devices is not something that most people want to have happen.
Here’s a quote from the write up, and it will be interesting to watch how the crisis management advisors to NSO Group tap dance across this allegedly true statement:
“I am used to being harassed online. But this was different,” she added. “It was as if someone had entered my home, my bedroom, my bathroom. I felt so unsafe and traumatized.”
That’s a whiz bang statement which drags in nuances of privacy invasion and personal safety. Let’s call a meeting and maybe issue another feel good, make streets safer story. Yeah, how’s that working out?
The write up has another quote that glues NSO Group to the notion of freedom. Hello, Israel?
“Pegasus is a spyware tool and a weapon used against freedom of the press, freedom of expression, human rights activism and journalism,” said Rasha Abdul Rahim, director of Amnesty Tech, a division of Amnesty International focused on technology and surveillance tools. “Women’s freedom of expression is abused and targeted in a very specific way both online and offline. The focus is on silencing them, putting the attention on their bodies or what they should be wearing or saying,” she added.
I have noticed that more people are aware of intelware as a result of this NSO Group toe stubbing.
What about those intelligence conference organizers? How about those experts pitching intel-related conferences on LinkedIn? What about those nifty white papers on intelware vendors’ Web sites?
My thought is that as more content is downloaded and more of the journalists chasing NSO Group info punch their searches into the Google, the more those ripples will be agitated.
Yikes. No easy fix it seems. Chasing revenues and making intelware into a household word are problematic. Many entities are likely to be suffering the slings and arrows of outrageous fortune. PR is good until it is not.
Stephen E Arnold, August 2, 2021
NSO Group and an Alert Former French Diplomat: Observation Is Often Helpful
August 2, 2021
I read “French Ex-Diplomat Saw Potential for Misuse While Working at NSO.” The allegedly accurate write up reports that Gerard Araud [once a French ambassador] took a position at NSO Group. The write up adds:
His one-year mission from September 2019, along with two other external consultants from the United States, was to look at how the company could improve its human rights record after a host of negative news stories. Earlier that year, the group’s technology had been linked publicly to spying or attempted spying on the murdered Saudi journalist Jamal Khashoggi by Saudi Arabian security forces, which it denied. The group was acquired in 2019 by a London-based private equity group, Novalpina, which hired Araud to recommend ways to make the company’s safeguard procedures “more rigorous and a bit more systematic,” he said.
The write up explains how a prospect becomes an NSO Group customer:
Its [the Pegasus software and access credentials] export is regulated “like an arms sale,” said Araud, meaning NSO must seek approval from the Israeli government to sell it, and state clients then sign a lengthy commercial contract stipulating how the product will be used. They are meant to deploy Pegasus only to tackle organised crime or terrorism — the company markets itself this way — but Araud said “you could see all the potential for misuse, even though the company wasn’t always responsible.”
The argute veteran of the French ambassadorial team maybe, possibly, could have discerned the potential for misuse of the Pegasus system.
The write up includes this information, allegedly direct from the former diplomat, who obviously provides information diplomatically:
In a firm that practices “a form of extreme secrecy,” he says he nonetheless became convinced that NSO Group worked with Israel’s Mossad secret services, and possibly with the CIA. He said there were three Americans who sat on the group’s advisory board with links to the US intelligence agency, and the company has said that its technology cannot be used to target US-based numbers. “There’s a question about the presence of Mossad and the CIA. I thought it was both of them, but I have no proof,” he said. “But I suspect they’re both behind it with what you call a ‘backdoor’.” A “backdoor” is a technical term meaning the security services would be able to monitor the deployment of Pegasus and possibly the intelligence gathered as a result.
Interesting. Several years ago, the BBC published “When Is a Diplomat Really Just a Spy?” In that 2018 write up, the Beeb stated:
So where do you draw the line between official diplomacy and the murky world of espionage? “Every embassy in the world has spies,” says Prof Anthony Glees, director of the Centre for Security and Intelligence Studies at the University of Buckingham. And because every country does it, he says there’s “an unwritten understanding” that governments are prepared to “turn a blind eye” to what goes on within embassies.
Would French diplomats have some exposure to ancillary duties at a French embassy? Potentially.
Stephen E Arnold, August 3, 2021
News, Misios, Rejoice: Aid Has Arrived
August 2, 2021
Misio? Strange word. It means “street person.”
Is it me, or does this feel like a PR move? CanIndia reports, “Google Launches AI Academy for Small Newsrooms.” “We are from Google and we are here to help small news outfits.” Right. The brief write-up tells us about the project, dubbed JournalismAI:
“In a bid to help small media publishers reach new audiences and drive more traffic to their content, the Google News Initiative (GNI) has launched a training academy for 20 media professionals to learn how Artificial Intelligence (AI) can be used to support their journalism. Google is partnering with Polis, the London School of Economics and Political Science’s journalism think tank, to launch the training academy, it said in a statement on Thursday. The AI Academy for Small Newsrooms is a six-week long, free online programme taught by industry-leading journalists and researchers who work at the intersection of journalism and AI. It will start in September this year and will welcome journalists and developers from small news organisations in the Europe, Middle East, and Africa (EMEA) region.”
Wow, free to 20 professionals. Don’t be too generous, Google. We are told these lucky few will gain practical knowledge of AI technology’s challenges and opportunities like automating repetitive tasks and determining which content engages audiences. They will each emerge with an action plan for implementing AI projects. For journalists not fortunate enough to be enrolled in the course, the GNI has made its training modules available online. In fact, more than 110,000 folks have taken advantage of these materials. Then why bother with this “AI Academy?” I suspect because it reads better for PR purposes than “online learning module.” Just a hunch.
Cynthia Murrell, August 2, 2021
Facebook Lets Group Admins Designate Experts. Okay!
August 2, 2021
Facebook once again enlists the aid of humans to impede the spread of misinformation, only this time it has found a way to avoid paying anyone for the service. Tech Times reports, “Facebook Adds Feature to Let Admin in Groups Chose ‘Experts’ to Curb Misinformation.” The move also has the handy benefit of shifting responsibility for bad info away from the company. We wonder—what happened to that smart Facebook software? The article does not say. Citing an article from Business Insider, writer Alec G. does tell us:
“The people who run the communities on Facebook now have the authority to promote individuals within its group to gain the title of ‘expert.’ Then, the individuals dubbed as experts can be the voices of which the public can then base their questions and concerns. This is to prevent misinformation plaguing online communities for a while now.”
But will leaving the designation of “expert” up to admins make the problem worse instead of better? The write-up continues:
“The social platform now empowers specific individuals inside groups who are devoted to solely spreading misinformation-related topics. The ‘Stop the Steal’ group, for example, was created in November 2020 with over 365,000 members. They were convinced that the election for the presidency was a fraud. If Facebook didn’t remove the group two days later, it would continue to have negative effects. Facebook explained that the organization talked about ‘the delegitimization of the election process,’ and called for violence, as reported by the BBC. Even before that, other groups within Facebook promoted violence and calls to action that would harm the civility of the governments.”
Very true. We are reminded of the company’s outsourced Oversight Board created in 2018, a similar shift-the-blame approach that has not worked out so well. Facebook’s continued efforts to transfer responsibility for bad content to others fail to shield it from blame. They also do little to solve the problem and may even make it worse. Perhaps it is time for a different (real) solution.
Cynthia Murrell, August 2, 2021
YouTube and News Corp: BFFs Forever? Ah, No.
August 2, 2021
I read “Murdochs’ Sky News Australia Suspended From YouTube Over COVID-19 Misinformation.” Wow. I thought the Google and Australian publishers were best friends forever. Both are refined, elegant, and estimable organizations. Okay, there are those allegations about monopolistic behavior and the deft handling of the Dr. Timnit Gebru matter. But, hey, Google is great. And there is the Mr. Murdoch empire. The phone tapping thing is a mere trifle.
The write up explains:
The video hosting site said in a statement Sunday that the suspension was dealt over videos allegedly denying the existence of COVID-19 and encouraging people to use untested experimental drugs like hydroxychloroquine to treat the virus. “We apply our policies equally for everyone and in accordance with these policies and our long-standing strikes system, removed videos from and issued a strike to Sky News Australia’s channel,” a YouTube spokesperson said in a statement to Reuters.
Equality is good. Are employees at Google treated equally? The cafeteria thing is small potatoes because real employees can work from their home or vans or whatever.
Pretty exciting stuff. I thought Google and Australian publishers were in a happy place. But Covid obviously imposes stress on BFFs.
Stephen E Arnold, August 2, 2021