Bugged? Hey, No One Can Get Our Data

December 22, 2023

This essay is the work of a dumb dinobaby. No smart software required.

I read “The Obscure Google Deal That Defines America’s Broken Privacy Protections.” In the cartoon below, two young people are confident that their lunch will be undisturbed. No “bugs” will chow down on their hummus, sprout sandwiches, or their information. What happens, however, is that the young picnic fans cannot perceive what is out of sight. Are these “bugs” listening? Yep. They are. 24×7.


What the young fail to perceive is that “bugs” are everywhere. These digital creatures are listening, watching, harvesting, and consuming every scrap of information. The image of the picnic evokes an experience unfolding in real time. Thanks, MSFT Copilot. My notion of “bugs” is obviously different from yours. Good enough and I am tired of finding words you can convert to useful images.

The essay explains:

While Meta, Google, and a handful of other companies subject to consent decrees are bound by at least some rules, the majority of tech companies remain unfettered by any substantial federal rules to protect the data of all their users, including some serving more than a billion people globally, such as TikTok and Apple.

The situation is simple: Major centers of techno gravity remain unregulated. Lawmakers, regulators, and "users" either did not understand or just believed what lobbyists told them. The senior executives of certain big firms smiled, said "Senator, thank you for that question," and continued to build out their "bug" network. Do governments want to lose their pride of place with these firms? Nope. Why? Just reference bad actors who commit heinous acts and invoke "protect our children." When these refrains from the techno feudal playbook sound, calls to take meaningful action become little more than a faint background hum.

But the article continues:

…there is diminishing transparency about how Google’s consent decree operates.

I think I understand. Google-type companies pretend to protect “privacy.” Who really knows? Just ask a Google professional. The answer in my experience is, “Hey, dude, I have zero idea.”

How does Wired, the voice of the techno age, conclude its write up? Here you go:

The FTC agrees that a federal privacy law is long overdue, even as it tries to make consent decrees more powerful. Samuel Levine, director of the FTC’s Bureau of Consumer Protection, says that successive privacy settlements over the years have become more limiting and more specific to account for the growing, near-constant surveillance of Americans by the technology around them. And the FTC is making every effort to enforce the settlements to the letter…

I love the "every effort." The reality is that the handling of online data collection presages the trajectory for smart software. We live with bugs. Now those bugs can "think," adapt, and guide. And what's the direction in which we are now being herded? Grim, isn't it?

Stephen E Arnold, December 23, 2023

How about Fear and Paranoia to Advance an Agenda?

December 6, 2023

This essay is the work of a dumb dinobaby. No smart software required.

I thought sex sells. I think I was wrong. Fear seems to be the barn burner at the end of 2023. And why not? We have the shadow of another global pandemic. We have wars galore. We have craziness on US airplanes. We have a Cybertruck which spells the end for anyone hit by the behemoth.

I read (but did not shake like the delightful female in the illustration) "AI and Mass Spying." The author is a highly regarded "public interest technologist," an internationally renowned security professional, and a security guru. For me, the key factoid is that he is a fellow at the Berkman Klein Center for Internet & Society at Harvard University and a lecturer in public policy at the Harvard Kennedy School. Mr. Schneier is a board member of the Electronic Frontier Foundation and the most, most interesting organization AccessNow.


Fear speaks clearly to those in retirement communities, elder care facilities, and those who are uninformed. Let’s say, “Grandma, you are going to be watched when you are in the bathroom.” Thanks, MSFT Copilot. I hope you are sending data back to Redmond today.

I don’t want to make too much of the Harvard University connection. I feel it is important to note that the esteemed educational institution got caught with its ethical pants around its ankles, not once, but twice in recent memory. The first misstep involved an ethics expert on the faculty who allegedly made up information. The second is the current hullabaloo about a whistleblower allegation. The AP slapped this headline on that report: “Harvard Muzzled Disinfo Team after $500 Million Zuckerberg Donation.” (I am tempted to mention the Harvard professor who is convinced he has discovered fungible proof of alien technology.)

So what?

The article “AI and Mass Spying” is a baffler to me. The main point of the write up strikes me as:

Summarization is something a modern generative AI system does well. Give it an hourlong meeting, and it will return a one-page summary of what was said. Ask it to search through millions of conversations and organize them by topic, and it’ll do that. Want to know who is talking about what? It’ll tell you.

I interpret the passage to mean that smart software in the hands of law enforcement, intelligence operatives, investigators in one of the badge-and-gun agencies in the US, or a cyber lawyer is really, really bad news. Smart surveillance has arrived. Smart software can process masses of data. Plus the outputs may be wrong. I think this means the sky is falling. The fear one is supposed to feel is going to be the way a chicken feels when it sees the Chick-fil-A butcher truck pull up to the barn.
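The "organize millions of conversations by topic and tell you who is talking about what" workflow in the quoted passage can be made concrete with a toy sketch. The snippet below is not Mr. Schneier's method and uses no actual generative AI; the `TOPIC_KEYWORDS` map is an invented stand-in for what an LLM would infer, included only to illustrate the bucketing step of mass conversation analysis.

```python
from collections import defaultdict

# Hypothetical topic keywords. In the surveillance scenario described
# above, an LLM would infer topics; this crude map is a stand-in.
TOPIC_KEYWORDS = {
    "travel": {"flight", "hotel", "airport"},
    "finance": {"loan", "payment", "invoice"},
}

def bucket_by_topic(conversations):
    """Group (speaker, text) transcripts under whichever topics match."""
    buckets = defaultdict(list)
    for speaker, text in conversations:
        words = set(text.lower().split())
        for topic, keywords in TOPIC_KEYWORDS.items():
            if words & keywords:
                buckets[topic].append(speaker)
    return dict(buckets)

convos = [
    ("alice", "My flight lands at the airport at noon"),
    ("bob", "The invoice payment is overdue"),
    ("alice", "Need a hotel near the airport"),
]
print(bucket_by_topic(convos))
# {'travel': ['alice', 'alice'], 'finance': ['bob']}
```

Even this trivial version answers "who is talking about what"; the essay's point is that a real LLM does the same at scale without a hand-built keyword list.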

Several observations:

  1. Let’s assume that smart software grinds through whatever information is available to something like a spying large language model. Are those engaged in law enforcement unaware that smart software generates baloney along with the Kobe beef? Will investigators knock off the verification processes because a new system has been installed at a fusion center? The answer to these questions is, “Fear advances the agenda of using smart software for certain purposes; specifically, enforcement of rules, regulations, and laws.”
  2. I know that the idea that “all” information can be processed is a jazzy claim. Google made it, and those familiar with Google search results know that Google does not even come close to all. It can barely deliver useful results from the Railway Retirement Board’s Web site. “All” covers a lot of ground, and it is unlikely that a policeware vendor will be able to do much more than process a specific collection of data believed to be related to an investigation. “All” is for fear, not illumination. Save the categorical affirmatives for the marketing collateral, please.
  3. The computational cost for applying smart software to large domains of data — for example, global intercepts of text messages — is fun to talk about over lunch. But the costs are quite real. Then the costs of the computational infrastructure have to be paid. Then the cost of the downstream systems and people who have to figure out if the smart software is hallucinating or delivering something useful. I would suggest that Israel’s surprise at the unhappy events from October 2023 to the present day unfolded despite the baloney about smart security software, a great intelligence apparatus, and the tons of marketing collateral handed out at law enforcement conferences. News flash: The stuff did not work.

In closing, I want to come back to fear. Exactly what is accomplished by using fear as the pointy end of the stick? Is it insecurity about smart software? Are there other messages framed in a different way to alert people to important issues?

Personally, I think fear is a low-level technique for getting one’s point across. But when those affiliated with an outfit with the ethics matter and now the payola approach to information resort to it, how about putting on the big boy pants and selecting a rhetorical trope that is unlikely to do anything except remind people that the Covid thing could have killed us all? Err. No. And what is the agenda fear advances?

So, strike the sex sells trope. Go with fear sells.

Stephen E Arnold, December 6, 2023

Google: Privacy Is Number One?

September 19, 2023

Big tech companies like Google do not respect users’ privacy rights. Yes, these companies have privacy statements and other legal documents that state they respect individuals’ privacy but it is all smoke and mirrors. The Verge has the lowdown on a privacy lawsuit filed against Google and a judge’s recent decision: “$5 Billion Google Lawsuit Over ‘Incognito Mode’ Tracking Moves A Step Closer To Trial.”

Chasom Brown, William Byatt, Jeremy Davis, Christopher Castillo, and Monique Trujillo filed a class action lawsuit against Google for collecting user information while in “incognito mode.” Publicly known as Chasom Brown, et al. v. Google, the plaintiffs seek $5 billion in damages. Google requested a summary judgment, but Judge Yvonne Gonzalez Rogers of California denied it.

Judge Gonzalez Rogers noted that statements in the Chrome privacy notice, Privacy Policy, Incognito Splash Screen, and Search & Browse Privately Help page explain how Incognito mode limits information and how people can control what information is shared. The judge wants the court to decide if these notices act as a binding agreement between Google and users that the former would not collect users’ data when they browsed privately.

Google disputes the claims and states that every time a new incognito tab is opened, Web sites might collect user information. There are other issues the plaintiffs and judge want to discuss:

“Another issue going against Google’s arguments that the judge mentioned is that the plaintiffs have evidence Google ‘stores users’ regular and private browsing data in the same logs; it uses those mixed logs to send users personalized ads; and, even if the individual data points gathered are anonymous by themselves, when aggregated, Google can use them to uniquely identify a user with a high probability of success.’

She also responded to a Google argument that the plaintiffs didn’t suffer economic injury, writing that ‘Plaintiffs have shown that there is a market for their browsing data and Google’s alleged surreptitious collection of the data inhibited plaintiffs’ ability to participate in that market…Finally, given the nature of Google’s data collection, the Court is satisfied that money damages alone are not an adequate remedy. Injunctive relief is necessary to address Google’s ongoing collection of users’ private browsing data.’”
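The “anonymous by themselves, identifying when aggregated” point in the quoted passage is the core of browser fingerprinting, and a toy sketch makes it concrete. The snippet below is illustrative only, not Google’s logging pipeline; the attribute names are invented examples of data points that are individually bland.

```python
import hashlib

def fingerprint(attributes):
    """Combine individually bland data points into one near-unique ID.

    Each attribute alone (screen size, timezone, language...) matches
    many users; hashed together, the combination can single one out.
    Illustrative sketch only -- not any vendor's actual method.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

user_a = {"screen": "1920x1080", "tz": "UTC-5", "lang": "en-US"}
user_b = {"screen": "1920x1080", "tz": "UTC-5", "lang": "fr-FR"}

# Two of three attributes match, yet the combined IDs differ entirely,
# while the same user always hashes to the same ID.
print(fingerprint(user_a) != fingerprint(user_b))  # True
```

The design point is that no single field needs to be “personal” for the aggregate to behave like an identifier, which is exactly the plaintiffs’ argument about mixed logs.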

Will Chasom Brown, et al. v. Google go anywhere beyond the California court? Will the rest of the United States, or other jurisdictions with a large Google market such as the European Union, do anything?

Whitney Grace, September 19, 2023

Malware: The NSO Group and a Timeline

September 8, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

A flurry of NSO Group news appeared in my newsfeeds this morning. Citizen Lab issued an advisory. You can find that short item in “BLASTPASS: NSO Group iPhone Zero-Click, Zero-Day Exploit Captured in the Wild.” Recorded Future, a cyber security company, published “Apple Discloses Zero-Days Linked.” Variants of these stories are percolating, including in British tabloid newspapers like The Metro. One message comes through: Update your iPhones.

The information makes clear that a vulnerability “path” appears to be blocked. That’s good news. The firm which allegedly discovered the way into user mobile devices is the NSO Group. The important fact, at least for me, is that this organization opened its doors for business in 2010. The origin story, if one believes the information one can find using a free Web search engine, is that the company evolved from a mobile phone repair business. After repairing and tinkering, the founder set up a company to assist government agencies in obtaining information from mobile devices believed to be used by bad actors. Agree or disagree, the origin story is interesting.

What’s important for me is that the time between the company’s start up and the “good news” about addressing a vulnerability in certain devices has been a decade, maybe more. I don’t have an opinion about whether the time window could have been closed more quickly. What’s important to me is that the information is diffusing quickly. On one hand, that’s beneficial to those concerned about the security of their devices. On the other hand, that’s the starter’s gun for bad actors to deploy another hard-to-spot exploit.

I have several observations about this vulnerability:

  1. The challenge to those who create hardware and software is to realize that security issues are likely to exist. Those who discover these and exploit them blindside the company. The developers have to reverse engineer the exploit and then figure out what their colleagues missed. Obviously this is a time-consuming and difficult process. Perhaps 10 years is speedy or slow. I don’t know. But an error made many years ago can persist and affect millions of device owners.
  2. The bad actor acts and the company responsible for chasing down the flaw reacts. This is a cat-and-mouse game. As a result, the hardware and software developers are playing defense. The idea that a good defense is better than a good offense may not be accurate. Those initial errors are, by definition, unknown. The gap between the error and the exploit allows bad actors to do what they want. Playing defense allows the offense time to gear up something new. The “good guys” are behind the curve in this situation.
  3. The fact that the digital ecosystem is large means that the opportunity for mischief increases. In my lectures, I like to point out that technology yields benefits, but it also is an enabler of those who want to do mischief.

Net net: The steady increase in cyber crime and the boundary between systems and methods which are positive and negative becomes blurred. Have we entered a stage in technical development in which the blurred space between good and bad has become so large that one cannot tell what is right or wrong, correct or incorrect, appropriate or inappropriate? Are we living in a “ghost Web” or a “shadow land?”

Stephen E Arnold, September 8, 2023

India: Where Regulators Actually Try or Seem to Try

August 22, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “Data Act Will Make Digital Companies Handle Info under Legal Obligation.” The article reports that India’s regulators are beavering away in an attempt to construct a dam to stop certain flows of data. The write up states:

Union Minister of State for Electronics and Information Technology Rajeev Chandrasekhar on Thursday [August 17, 2023] said the Digital Personal Data Protection Act (DPDP Act) passed by Parliament recently will make digital companies handle the data of Indian citizens under absolute legal obligation.

What about certain high-technology companies operating with somewhat flexible methods? The article uses the phrase “punitive consequences of high penalty and even blocking them from operating in India.”


US companies’ legal eagles take off. Destination? India. MidJourney captures 1950s grade school textbook art quite well.

This passage caught my attention because nothing quite like it has progressed in the US:

The DPDP [Digital Personal Data Protection] Bill is aimed at giving Indian citizens a right to have his or her data protected and casts obligations on all companies, all platforms be it foreign or Indian, small or big, to ensure that the personal data of Indian citizens is handled with absolute (legal) obligation…

Will this proposed bill become law? Will certain US high-technology companies comply? I am not sure of the answer, but I have a hunch that a dust up may be coming.

Stephen E Arnold, August 22, 2023

What Does Apple Value? Money or Privacy?

January 18, 2023

To hear Apple tell it, the company makes protecting users’ privacy a top priority. While it does a better job than Google or Meta, that is not saying much. Gizmodo describes “10 Apple Privacy Problems that Might Surprise You.” Surprise? Nope, not us. Reporter Thomas Germain writes:

“Apple wants you to know that it cares about your privacy. For years, the company has emblazoned billboards with catchy slogans about its robust data protection practices, criticized tech rivals for their misuse of users’ personal information, and made big pronouncements about how it shields users. There’s no question that Apple handles your data with more care and respect than a lot of other tech companies. Unlike Google and Meta, parent company of Facebook and Instagram, Apple’s business doesn’t depend on mining and monetizing your data. But that doesn’t mean owning an iPhone spells perfect privacy. Apple harvest lots of personal information, often in ways that you might not expect if you buy into the company’s promise that ‘what happens on your iPhone, stays on your iPhone.’ It uses that information for advertising, developing new products, and more. Apple didn’t comment on the record for this story.”

Of course it didn’t. Germain describes each of the 10 privacy problems, complete with links to further reading on each one. Here are his headings: Apple appears to track you even with its own privacy settings turned off; Apple collects details about every single thing you do in the app store; A hidden map of everywhere you go; You ask your apps not to track you, but sometimes Apple lets them do it anyway; Apple collects enough data from your phone to track the people you hang out with; Apple makes iMessage less private on purpose; Targeted ads; Think your VPN hides all your data? think again; How private are your conversations with Siri?; and finally, Harvesting your music, movie and stocks data—and a whole lot more. Though none of these points actually surprise us, it is a bit startling to see them all laid out together. Navigate to the article for the details on each, including ways to lock down iDevices to the limited extent possible.

Cynthia Murrell, January 18, 2023

Another Lilting French Cash Register Chime

January 2, 2023

An outfit called SC Magazine reported that the French cash register — you know, the quaint one with brass letters and the cheery red enamel — has chimed again. “Microsoft Fined $64 Million by France over Cookies Used in Bing Searches” reports:

France’s privacy watchdog fined Microsoft €60 million ($64 million) for not offering clear enough instruction for users to reject cookies used for online ads, as part of the move to enforce Europe’s tightening data protection law.

The write up noted:

Microsoft has been ordered to solve the issue within three months by implementing a simplified cookie refusal mechanism, or it could face additional fines of €60,000 a day…

It seems that some US companies do not take those French and EU regulations seriously. My suggestion to the Softies: France is not the US. Get on a couple of special lists and you may find some quality time in a glass room at CDG next time you visit. The good news is that US embassy personnel can visit you without too much red tape bedecking those gray suits.

Stephen E Arnold, January 2, 2023

On the Path of a Super App for Crime

December 14, 2022

I know I am in the minority. In fact, I may be the only person in Harrod’s Creek, Kentucky, thinking about Telegram and its technical evolution. From a humble private messaging service, Telegram has become the primary mechanism for armchair experts to keep track of Russia’s special operation, send secret messages, and engage in a range of interesting pursuits. Is it possible to promote and sell CSAM via an encrypted messaging app like Telegram? Okay, that’s a good question.

I noted another Telegram innovation which has become public. “No-SIM Signup, Auto-Delete All Chats, Topics 2.0 and More” explains that a person can sign up for the encrypted messaging service without having a SIM card and its pesky identifiers tagging along. To make sure a message about a special interest remains secret, the service allegedly deletes messages on a heartbeat determined by the Telegram user. The Telegram group function makes it possible for those who join a group devoted to a “special” interest to break the group into sub-groups. The idea is that a special interest group has special special interests. I will leave these to your imagination in the event you are wondering where some of the i2p and Tor accessible content has gone in the last few years.
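The auto-delete-on-a-user-chosen-heartbeat behavior described above amounts to a time-to-live policy on a message store, which a toy sketch can illustrate. This is not Telegram’s implementation (real clients delete on both ends via the protocol); the class name and the injectable `now` parameter are assumptions made for the sake of a testable example.

```python
import time

class AutoDeleteStore:
    """Toy message store that drops messages once a per-chat TTL expires.

    A sketch of the auto-delete idea only, not Telegram's protocol.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._messages = []  # list of (timestamp, text)

    def send(self, text, now=None):
        """Record a message with its send time (clock injectable for tests)."""
        self._messages.append((now if now is not None else time.time(), text))

    def read(self, now=None):
        """Return surviving messages, purging any older than the TTL."""
        now = now if now is not None else time.time()
        self._messages = [(t, m) for t, m in self._messages if now - t < self.ttl]
        return [m for _, m in self._messages]

store = AutoDeleteStore(ttl_seconds=60)
store.send("hello", now=0)
store.send("later", now=50)
print(store.read(now=70))  # only "later" survives the 60-second TTL
```

The point for investigators is the same one the essay makes: once the heartbeat fires, the evidence is simply gone from the store.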

As Telegram approaches super app status for certain types of users, keep in mind that even the Telegram emoji have some new tricks. That little pony icon can do much more.

Stephen E Arnold, December 14, 2022

A Digital Schism: Is It the 16th Century All Over Again?

December 12, 2022

I noted “FBI Calls Apple’s Enhanced iCloud Encryption Deeply Concerning As Privacy Groups Hail It As a Victory for Users.” I am tempted to provide some historical color about Galileo, Jesuits, and infinitesimals. I won’t. I will point out that schisms appear to be evident today and may be as fraught as those when data flows were not ripping apart social norms. (How bad was it in the 16th century? Think in terms of toasting in fires those who did not go with the program. Quite toasty for some.)

The write up explains:

Apple yesterday [December 7, 2022] announced that end-to-end encryption is coming to even more sensitive types of iCloud data, including device backups, contacts, messages, photos, and more, meeting the longstanding demand of both users and privacy groups who have rallied for the company to take the significant step forward in user privacy.

Who is in favor of Apple’s E2EE push? The article says:

We [the Electronic Frontier Foundation] applaud Apple for listening to experts, child advocates, and users who want to protect their most sensitive data. Encryption is one of the most important tools we have for maintaining privacy and security online. That’s why we included the demand that Apple let users encrypt iCloud backups in the Fix It Already campaign that we launched in 2019.

Across the E2EE chess board is the FBI. The article points out:

In a statement to The Washington Post, the FBI, the largest intelligence agency in the world, said it’s “deeply concerned with the threat end-to-end and user-only-access encryption pose.” The bureau said that end-to-end encryption and Apple’s Advanced Data Protection make it harder for them to do their work and that they request “lawful access by design.”

I don’t have a dog in this commercial push for E2EE encryption which is one component in Apple’s marketing of itself as the Superman/Superwoman of truth, justice, and the American way. (A 30 percent app store tariff is part of this mythic set up as well.) I understand the concern of the investigators, but I am retired and sitting on the sidelines as I watch the Grim Reaper’s Rivian creep closer.

Several observations:

  1. In the boundary between these two sides or factions, the emergent behavior will get around the rules. That emergent behavior is a consequence of apparently irreconcilable differences. The impact of this schism will reverberate for an unknown amount of time.
  2. Absolutism makes perfect sense in a social setting where one side enjoys near total control of behavior, access, thoughts, etc. However we live in a Silicon Valley environment partially fueled by phenomenological existentialism. Toss in the digital flows of information, and the resulting mixture is likely to be somewhat unpredictable.
  3. Compromise will be painful but baby steps will be taken. Even Iran is reassigning morality police to less riot-inducing activities. China has begun to respond to increasingly unhappy campers in lock-down mode. Like I said, baby steps.

Net net: Security and privacy are a bit like love and Plato’s chair. Welcome to the digital Middle Ages. The emergent middle class may well be bad actors.

Stephen E Arnold, December 12, 2022

Study Concludes Apple Privacy Promises a Sham, Lawsuit Follows

December 2, 2022

Apple would have us believe it is a bastion of privacy protection. Though it talks a good game, Techdirt reports, “Apple Sued After Another Study Finds Its Well-Hyped Privacy Standards Are Often Theatrical.” Researchers at software firm Mysk found Apple’s data tracking basically ignores privacy settings altogether. The study prompted a lawsuit (pdf) under the California Invasion of Privacy Act. Writer Karl Bode notes:

“This isn’t the first time Apple’s new privacy features have been found to be a bit lacking. Several studies have also indicated that numerous app makers have been able to simply tap dancing around Apple’s heavily hyped do not track restrictions for some time, often without any penalty by Apple months after being contacted by reporters. That’s a notably different story than the one Apple has gotten many press outlets to tell. Apple desperately wants to differentiate its brand by a dedication to privacy (as you might have noticed from the endless billboards that simply say: ‘Privacy. That’s iPhone.’). And while the company may certainly be better on privacy than many other large tech giants, that’s simply not saying much.”

Good point. The lawsuit observes that details about app usage can be “intimate and potentially embarrassing.” Not to mention financially sensitive. This is why some of us have refused to bring our devices into every aspect of our lives; a suspicious nature pays off occasionally. Yep, Apple privacy… a bit lacking. No kidding?

Cynthia Murrell, December 2, 2022
