Veraset: Another Data Event

November 22, 2021

Here is a good example of how personal data, in this case tracking data, can be used without one’s knowledge. In its article “Files: Phone Data Shared,” the Arkansas Democrat-Gazette reports that data broker Veraset provided phone location data to the US Department of Health last year as part of a free trial. The transaction was discovered by the digital-rights group Electronic Frontier Foundation. The firm marketed the data as valuable for COVID research, but after the trial period was up, the agency declined to move forward with a partnership. The data was purportedly stripped of names and other personal details, and the researchers found no evidence it was misused. However, Washington Post reporter Drew Harwell writes:

“[Foundation technologist Bennett Cyphers] noted that Veraset’s location data includes sequences of code, known as ‘advertising identifiers,’ that can be used to pinpoint individual phones. Researchers have also shown that such data can be easily ‘de-anonymized’ and linked to a specific person. Apple and Google announced changes earlier this year that would allow people to block their ID numbers from being used for tracking. Veraset and other data brokers have worked to improve their public image and squash privacy concerns by sharing their records with public health agencies, researchers and news organizations.”

Amidst a pandemic, that tactic just might work. How do data brokers get this information in the first place? We learn:

“Data brokers pay software developers to include snippets of code in their apps that then sent a user’s location data back to the company. Some companies have folded their code into games and weather apps, but Veraset does not say which apps it works with. Critics have questioned whether users are aware that their data is being shared in such a way. The company is a spinoff of the location-data firm SafeGraph, which Google banned earlier this year as part of an effort to restrict covert location tracking.”

Wow, banned by Google—that is saying something. Harwell reports SafeGraph shared data with the CDC during the first few weeks of the pandemic. The agency used that data to track how many people were staying home for its COVID Data Tracker.

App users, often unwittingly, agree to data sharing in those opaque user agreements most of us do not read. The alternative, of course, is to deprive oneself of technology that is increasingly necessary to operate in today’s world. It is almost as if that were by design.

Cynthia Murrell November 22, 2021

The Boss of the DoubleClick Outfit Offers Some Advice

October 19, 2021

I read “Alphabet CEO Sundar Pichai Calls for Federal Tech Regulation, Investments in Cybersecurity.” What did the owner of DoubleClick talk about?

That’s easy. Big things like quantum computing, which is unlikely to arrive on the Google phone any time soon. And regulation. You know, the rules of the road, which the DoubleClick outfit follows like a super slick Waymo vehicle that rarely drives into a dead end or creates a thrill or two for those spotting one in a bus lane. Plus cybersecurity. Right. That’s why the DoubleClick outfit apparently alerted some Gmail users that a mere nation state or two or three were interested in their missives.

The write up reports that the boss of the DoubleClick systems and methods stated in an interview at a high class technology event:

Pichai additionally tied consumer privacy to security, even noting that “one of the biggest risks to privacy is the data getting compromised” — an interesting statement coming only days after Amazon, a top Google rival, saw its game streaming site Twitch hacked. As for where to draw the line in regulating tech, Pichai said the law shouldn’t encroach on the open internet.

Yep, DoubleClick’s owner did not mention online advertising as originally crafted by pay-to-play innovator Yahoo. Right? Yahoo, the pre-IPO settlement, and the GoTo.com/Overture business.

Nope, DoubleClick’s owner did not talk about online advertising and how that money machine has shaped Alphabet Google into the sleek, trustworthy, reliable, and Timnit Gebru-sensitive outfit it is today.

Minor omission. Understandable from the owner of the DoubleClick technology.

Following rules is the name of the game. The question is, “What rules is Alphabet Google following?”

Why new ones are important to the company is not particularly clear to me. But I just sit in my computer lab in rural Kentucky and marvel at how the owner of the DoubleClick technology can be so darned sincere and earnest.

As Oscar Wilde observed in The Importance of Being Earnest:

The truth is rarely pure and never simple.

That’s why it is challenging to delete old email on the Gmail system, why Android is a busy beaver in the data transfer stream, and why The Importance of Being Earnest is relevant to the mom-and-pop online advertising company and, of course, to quantum computing.

Stephen E Arnold, October 19, 2021

Human Editors and Subject Matter Experts? Dinosaurs but Just from a Previous Era

October 15, 2021

I read “Bugs in our Pockets: The Risks of Client-Side Scanning.” The embargo is amusing, and it underscores the issues related to confidential information and the notion that information wants to be free. Amusing, maybe not?

The write up looks a bit like a paper destined for a pay-to-play publisher or an outfit which cultivates a cabal-like approach to publishing. (Hello, ACM?) The paper lists 13 authors, and I suppose the idea is to convey consensus, or to let a lead author keep his or her head below the concrete bunker in order to avoid direct hits from those who don’t agree with the write up.

I neither agree nor disagree. I interpreted the write up as:

  • A clever bit of SEO, particularly the embargo and the availability of the paper to certain saucy online information services
  • A way to promote certain entities, complete with the titles and email contacts favored by some link hunters
  • A technical bit of push back against assorted government mumbling about privacy, security, and another assault on personal freedoms

Yep, the sky is falling.

Please, read the paper. One business executive allegedly said, “There is no return to normal. Today’s environment is the new normal.”

Is it possible this paper triggers Apple TV or YouTube to cue the 1973 hit “The Way We Were”?

Stephen E Arnold, October 15, 2021

99 Percent Accurate: Close Enough for PR Output

August 24, 2021

I am not entering a horse in this race, a dog in this fight, or a pigeon in this contest. I want to point to a write up in a newspaper in some way very tenuously connected to the former driver of the Bezos bulldozer. That write up is “Opinion: Apple’s New Child Safety Tool Comes with Privacy Trade-Offs — Just Like All the Others.”

Right off the bat I noted the word “all.” Okay, categorical affirmatives set my teeth on edge the same way Miss Blackburn’s fingernails scraping on the blackboard in calculus class did. “All.” Very tidy.

The write up contains an interesting statement or two. I circled this one in Bezos bulldozer orange:

The practice of on-device flagging may sound unusually violative. Yet Apple has a strong argument that it’s actually more protective of privacy than the industry standard. The company will learn about the existence of CSAM only when the quantity of matches hits a certain threshold, indicating a collection.

The operative word is threshold. Like “all”, threshold sparks a few questions in my mind. Does it yours? Let me provide a hint: Who or what sets a threshold? And under what conditions is a threshold changed? There are others, but I want to make this post readable to my TikTok-like readers.

I liked the conundrum angle too:

The benefit of nabbing abusers in this case may outweigh these hypothetical harms, especially if Apple holds itself to account — and the public keeps on the pressure. Yet the company’s conundrum emphasizes an unpleasant truth: Doing something to protect public safety in the Internet age is better than doing nothing — yet every “something” introduces issues of its own.

Fascinating. I am curious how Apple PR and marketing will respond. Hopefully with fewer unsupported assertions, info about thresholds, and the logician’s bane: A categorical affirmative.

Stephen E Arnold, August 24, 2021

Does Google Play Protect and Serve—Ads?

August 20, 2021

We hope, gentle reader, that you have not relied on the built-in Google Play Protect to safeguard your Android devices when downloading content from the Play store. MakeUseOf cites a recent report from AV-Test in, “Report: Google Play Protect Sucks at Detecting Malware.” Writer Gavin Phillips summarizes:

“With a maximum of 18 points on offer across the three test sections of Protection, Performance, and Usability, Google Play Protect picked up just 6.0—a full ten points behind the next option, Ikarus. AV-TEST pits each of the antivirus tools against more than 20,000 malicious apps. In the endurance test running from January to June 2021, there were three rounds of testing. Each test involved 3,000 newly discovered malware samples in a real-time test, along with a reference set of malicious apps using malware samples in circulation for around four weeks. Google Play Protect detected 68.8 percent of the real-time malware samples and 76.7 percent of the reference malware samples. In addition, AV-TEST installs around 10,000 harmless apps from the Play Store on each device, aiming to detect any false positives. Again, Google’s Play Protect came bottom of the pile, marking 70 harmless apps as malware.”

A chart listing the test’s results for each security solution can be found in the write up or the report itself. More than half of the solutions received the full 18 points, while the rest fell between 16 and 17.8 points. Except for Google—its measly 6 points really set it apart as the worst option by far. Since Google “Protect” is the default security option for Android app downloads, this is great news for bad actors. The rest of us would do well to study the top half of that list. iOS users excepted.
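For the arithmetic-minded, the AV-TEST figures quoted above reduce to two simple ratios: a detection rate over malicious samples and a false-positive rate over benign apps. A minimal sketch, using only the sample counts and percentages reported in the quote (the helper names are our own, not AV-TEST’s):

```python
# Back-of-the-envelope metrics from the AV-TEST figures quoted above.
# Sample counts and percentages are the reported ones; variable names are ours.

def detection_rate(detected: int, total: int) -> float:
    """Fraction of malicious samples correctly flagged as malware."""
    return detected / total

# Per the quote, each round used 3,000 newly discovered real-time samples,
# of which Play Protect caught 68.8 percent.
real_time_total = 3000
real_time_detected = round(real_time_total * 0.688)  # about 2,064 samples
print(f"real-time detection: {detection_rate(real_time_detected, real_time_total):.1%}")

# False positives: 70 of roughly 10,000 benign Play Store apps were flagged.
benign_total = 10_000
false_positives = 70
print(f"false-positive rate: {false_positives / benign_total:.2%}")  # 0.70%
```

A 0.7 percent false-positive rate sounds small until it is paired with a 68.8 percent detection rate; the combination is what landed Play Protect at the bottom of the chart.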

Based in Magdeburg, Germany, research institute AV-Test pits the world’s cyber security solutions against its large collection of digital malware samples and makes results available to private users for free. The firm makes its money on consulting for companies and government institutions. AV-Test was founded in 2004 and was just acquired by Ufenau Capital Partners in February of this year.

Cynthia Murrell, August 20, 2021

DuckDuckGo Produces Privacy Income

August 10, 2021

DuckDuckGo advertises that it protects user privacy and does not have targeted ads in search results. Despite its small size, protecting user privacy makes DuckDuckGo a viable alternative to Google. TechRepublic delves into DuckDuckGo’s profits and how privacy is a big money maker in the article, “How DuckDuckGo Makes Money Selling Search, Not Privacy.” DuckDuckGo has had profitable margins since 2014 and made over $100 million in 2020.

Google, Bing, and other companies interested in selling personal data say that such collection is a necessary evil in order for search and other services to work. DuckDuckGo says that’s not true. The company’s CEO Gabriel Weinberg said:

“It’s actually a big myth that search engines need to track your personal search history to make money or deliver quality search results. Almost all of the money search engines make (including Google) is based on the keywords you type in, without knowing anything about you, including your search history or the seemingly endless amounts of additional data points they have collected about registered and non-registered users alike. In fact, search advertisers buy search ads by bidding on keywords, not people….This keyword-based advertising is our primary business model.”

Weinberg continued that search engines do not need to track as much personal information as they do in order to personalize customer experiences or make money. Search engines and other online services could limit the amount of user data they track and still generate a profit.
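The keyword-auction model Weinberg describes can be sketched in a few lines: advertisers bid on keywords, and the query alone, not a user profile, determines which ad appears. This is a toy illustration of the general idea, not DuckDuckGo’s or anyone’s actual system; every advertiser name and bid below is hypothetical.

```python
# Toy sketch of keyword-based (non-behavioral) ad selection: the ad shown
# depends only on the words typed, never on who typed them. All advertiser
# names and bid amounts are invented for illustration.

# advertiser bids keyed by keyword (bid in dollars per click)
bids = {
    "running shoes": [("ShoeCo", 1.20), ("SneakerHut", 0.95)],
    "mortgage rates": [("BankX", 8.50)],
}

def select_ad(query: str):
    """Return the highest-bidding (advertiser, bid) whose keyword appears
    in the query, or None if no keyword matches."""
    best = None
    for keyword, offers in bids.items():
        if keyword in query.lower():
            top = max(offers, key=lambda offer: offer[1])
            if best is None or top[1] > best[1]:
                best = top
    return best

print(select_ad("best running shoes for trails"))  # ShoeCo outbids SneakerHut
print(select_ad("cat videos"))                     # no keyword match: no ad
```

Note that nothing in `select_ad` touches a user identifier or search history, which is exactly Weinberg’s point: the keyword carries the commercial intent.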

Google made over $147 billion in 2020, but DuckDuckGo’s $100 million is not a small number either. DuckDuckGo’s market share is greater than Bing’s and, if limited to the US market, its market share is second only to Google’s. DuckDuckGo is like the Little Engine That Could. It is a hard-working marketing operation, and it keeps chugging along while batting the privacy beach ball down the Madison Avenue sidewalk.

Whitney Grace, August 10, 2021

COVID Forces Google to Show Its Work and Cite Sources

August 10, 2021

Do you remember in math class when you were told to show your work, or when writing an essay you had to cite your sources? Google has decided to do the same thing with its search results, says Today Online in the article, “Google Is Starting To Tell You How It Found Search Results.” Google wants to share with users why they are shown particular results. Soon Google will display an option within search results that allows users to see how results were matched to their query. Google wants users to know where their search results come from to better determine relevancy.

Google might not respect users’ privacy, but it does want to offer better transparency in search results. Google wants to explain itself and help its users make better decisions:

“Google has been making changes to give users more context about the results its search engine provides. Earlier this year it introduced panels to tell users about the sources of the information they are seeing. It has also started warning users when a topic is rapidly evolving and search results might not be reliable.”

Google search makes money by selling ads and sponsoring content in search results. Google labels any sponsored results with an “ad” tag. However, one can assume that Google pushes more sponsored content into search results than it tells users. Helping users understand content and make informed choices is a great way to educate users. Google isn’t being altruistic, though. Misinformation about vaccines and COVID-19 has spread like wildfire since the previous US presidential administration. Users have demanded that Google, Facebook, and other tech companies be held accountable, as their platforms are used to spread misinformation. Google sharing the why behind search queries is a start, but how many people will actually read the explanations?

Whitney Grace, August 10, 2021

Another Perturbation of the Intelware Market: Apple Cores Forbidden Fruit

August 6, 2021

It may be tempting for some to view Apple’s decision as implementing a classic man-in-the-middle process. If the information in “Apple Plans to Scan US iPhones for Child Abuse Imagery” is correct, the maker of the iPhone has encroached on the intelware service firms’ bailiwick. The paywalled newspaper reports:

Apple intends to install software on American iPhones to scan for child abuse imagery

The approach — dubbed ‘neuralMatch’ — is on the iPhone device, thus providing functionality substantially similar to other intelware vendors’ methods for obtaining data about a user’s actions.

The article concludes:

According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.

Observations:

  1. The idea allows Apple to provide a function likely to be of interest to law enforcement and intelligence professionals; for example, responding to a request for a report about a phone’s filtered and flagged data and associated metadata
  2. Specialized software companies may have an opportunity to refine existing intelware or develop a new category of specialized services to make sense of data about on-phone actions
  3. The proposal, if implemented, would create a PR opportunity for either Apple or its critics to try to leverage
  4. Legal issues about the on-phone filtering and metadata (if any) would add friction to some legal matters.
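The “safety voucher” mechanism the article describes can be reduced to a simple pattern: match each uploaded item against a watch list, and act only once the count of matches crosses a threshold. The sketch below is an illustration of that thresholding logic only; Apple’s actual proposal used perceptual hashing (neuralMatch) and cryptographic threshold secret sharing so that Apple learns nothing below the threshold, none of which is reproduced here, and the threshold value is invented.

```python
# Toy sketch of threshold-based flagging as described in the quoted passage:
# each photo gets a "suspect or not" voucher, and the account is flagged only
# when suspect vouchers reach a threshold. Apple's real design used perceptual
# hashes and threshold cryptography, not the plain comparisons shown here.

THRESHOLD = 3  # hypothetical; the actual threshold was not disclosed

def voucher_is_suspect(photo_hash: str, known_hashes: set) -> bool:
    """True if this photo's hash matches the known-CSAM watch list."""
    return photo_hash in known_hashes

def should_flag_account(photo_hashes: list, known_hashes: set) -> bool:
    """Flag only once the number of suspect vouchers reaches the threshold,
    indicating a collection rather than a single match."""
    suspect_count = sum(voucher_is_suspect(h, known_hashes) for h in photo_hashes)
    return suspect_count >= THRESHOLD

known = {"h1", "h2", "h3", "h4"}  # stand-in for the watch-list hash set
print(should_flag_account(["h1", "x", "h2"], known))        # prints False (2 matches)
print(should_flag_account(["h1", "h2", "h3", "y"], known))  # prints True (3 matches)
```

Even in this toy form, the questions raised elsewhere in this post are visible: whoever controls `THRESHOLD` and the contents of `known` controls what gets flagged.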

One question: How similar is this proposed Apple service to the operation of intelware like that allegedly available from the Hacking Team, NSO Group, and other vendors? Another question: Is this monitoring a trial balloon or has the system and method been implemented in test locations; for example, China or an Eastern European country?

Stephen E Arnold, August 6, 2021

About Privacy? You Ask

July 30, 2021

Though the issue of privacy was not central to the recent US Supreme Court case TransUnion v. Ramirez, the Court’s majority opinion may have far-reaching implications for privacy rights. The National Law Review considers, “Did the US Supreme Court Just Gut Privacy Law Enforcement?” At issue is the difference between causing provable harm and simply violating a law. Writer Theodore F. Claypoole explains:

“The relevant decision in Transunion involves standing to sue in federal court. The court found that to have Constitutional standing to sue in federal court, a plaintiff must show, among other things, that the plaintiff suffered concrete injury in fact, and central to assessing concreteness is whether the asserted harm has a close relationship to a harm traditionally recognized as providing a basis for a lawsuit in American courts. The court makes a separation between a plaintiff’s statutory cause of action to sue a defendant over the defendant’s violation of federal law, and a plaintiff’s suffering concrete harm because of the defendant’s violation of federal law. It claims that under the Constitution, an injury in law is not automatically an injury in fact. A risk of future harm may allow an injunction to prevent the future harm, but does not magically qualify the plaintiff to receive damages. … This would mean that some of the ‘injuries’ that privacy plaintiffs have claimed to establish standing, like increased anxiety over a data exposure or the possibility that their data may be abused by criminals in the future, are less likely to resonate in some future cases.”

The opinion directly affects only the ability to sue in federal court, not on the state level. However, California aside, states tend to follow SCOTUS’ lead. Since when do we require proof of concrete harm before punishing lawbreakers? “Never before,” according to dissenting Justice Clarence Thomas. It will be years before we see how this ruling affects privacy cases, but Claypoole predicts it will harm plaintiffs and privacy-rights lawyers alike. He notes it would take an act of Congress to counter the ruling, but (of course) Democrats and Republicans have different priorities regarding privacy laws.

Cynthia Murrell, July 30, 2021

Facial Recognition: More Than Faces

July 29, 2021

Facial recognition software is not just for law enforcement anymore. Israel-based firm AnyVision’s clients include retail stores, hospitals, casinos, sports stadiums, and banks. Even schools are using the software to track minors with, it appears, nary a concern for their privacy. We learn this and more from, “This Manual for a Popular Facial Recognition Tool Shows Just How Much the Software Tracks People” at The Markup. Writer Alfred Ng reports that AnyVision’s 2019 user guide reveals the software logs and analyzes all faces that appear on camera, not only those belonging to persons of interest. A representative boasted that, during a week-long pilot program at the Santa Fe Independent School District in Texas, the software logged over 164,000 detections and picked up one student 1100 times.

There are a couple of privacy features built in, but they are not turned on by default. “Privacy Mode” only logs faces of those on a watch list, and “GDPR Mode” blurs non-watch-listed faces on playbacks and downloads. (Of course, what is blurred can be unblurred.) Whether a client uses those options depends on its use case and, importantly, local privacy regulations. Ng observes:

“The growth of facial recognition has raised privacy and civil liberties concerns over the technology’s ability to constantly monitor people and track their movements. In June, the European Data Protection Board and the European Data Protection Supervisor called for a facial recognition ban in public spaces, warning that ‘deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places.’ Lawmakers, privacy advocates, and civil rights organizations have also pushed against facial recognition because of error rates that disproportionately hurt people of color. A 2018 research paper from Joy Buolamwini and Timnit Gebru highlighted how facial recognition technology from companies like Microsoft and IBM is consistently less accurate in identifying people of color and women. In December 2019, the National Institute of Standards and Technology also found that the majority of facial recognition algorithms exhibit more false positives against people of color. There have been at least three cases of a wrongful arrest of a Black man based on facial recognition.”

Schools that have implemented facial recognition software say it is an effort to prevent school shootings, a laudable goal. However, once in place it is tempting to use it for less urgent matters. Ng reports the Texas City Independent School District has used it to identify one student who was licking a security camera and to have another removed from his sister’s graduation because he had been expelled. As Georgetown University’s Clare Garvie points out:

“The mission creep issue is a real concern when you initially build out a system to find that one person who’s been suspended and is incredibly dangerous, and all of a sudden you’ve enrolled all student photos and can track them wherever they go. You’ve built a system that’s essentially like putting an ankle monitor on all your kids.”

Is this what we really want as a society? Never mind, it is probably a bit late for that discussion.

Cynthia Murrell, July 29, 2021
