Allegations That Canadian Officials Are Listening

December 13, 2023

This essay is the work of a dumb dinobaby. No smart software required.

Widespread Use of Phone Surveillance Tools Documented in Canadian Federal Agencies

It appears a baker’s dozen of Canadian agencies are ignoring a longstanding federal directive on privacy protections. Yes, Canada. According to CBC/Radio-Canada, “Tools Capable of Extracting Personal Data from Phones Being Used by 13 Federal Departments, Documents Show.” The trend surprised even York University associate professor Evan Light, who filed the original access-to-information request. Reporter Brigitte Bureau shares:


Many people, it seems, are listening to Grandma’s conversations in a suburb of Calgary. (Nice weather in the winter.) Thanks, MSFT Copilot. I enjoyed the flurry of messages that you were busy creating my other image requests. Just one problemo. I had only one image request.

“Tools capable of extracting personal data from phones or computers are being used by 13 federal departments and agencies, according to contracts obtained under access to information legislation and shared with Radio-Canada. Radio-Canada has also learned those departments’ use of the tools did not undergo a privacy impact assessment as required by federal government directive. The tools in question can be used to recover and analyze data found on computers, tablets and mobile phones, including information that has been encrypted and password-protected. This can include text messages, contacts, photos and travel history. Certain software can also be used to access a user’s cloud-based data, reveal their internet search history, deleted content and social media activity. Radio-Canada has learned other departments have obtained some of these tools in the past, but say they no longer use them. … ‘I thought I would just find the usual suspects using these devices, like police, whether it’s the RCMP or [Canada Border Services Agency]. But it’s being used by a bunch of bizarre departments,’ [Light] said.”

To make matters worse, none of the agencies had conducted the required Privacy Impact Assessments. A federal directive issued in 2002 and updated in 2010 required such PIAs to be filed with the Treasury Board of Canada Secretariat and the Office of the Privacy Commissioner before launching any new activity that involves collecting or handling personal data. Light is concerned that agencies’ flat-out disregard of the directive means digital surveillance of citizens has become normalized. Join the club, Canada.

Cynthia Murrell, December 13, 2023

23andMe: Fancy Dancing at the Security Breach Ball

December 11, 2023

This essay is the work of a dumb dinobaby. No smart software required.

Here’s a story I found amusing. Very Sillycon Valley. Very high school science clubby. Navigate to “23andMe Moves to Thwart Class-Action Lawsuits by Quietly Updating Terms.” The main point of the write up is that the firm’s security was breached. How? Probably those stupid customers or a cyber security vendor installing smart software that did not work.


How some influential wizards work to deflect actions hostile to their interests. In the cartoon, the Big Dog tells a young professional, “Just change the words.” Logical, right? Thanks, MSFT Copilot. Close enough for horseshoes.

The article reports:

Following a hack that potentially ensnared 6.9 million of its users, 23andMe has updated its terms of service to make it more difficult for you to take the DNA testing kit company to court, and you only have 30 days to opt out.

I have spit in a 23andMe tube. I’m good, at least, for this most recent example of hard-to-imagine security missteps. The article cites other publications but drives home what I think is a useful insight into the thought process of big-time Sillycon Valley firms:

customers were informed via email that “important updates were made to the Dispute Resolution and Arbitration section” on Nov. 30 “to include procedures that will encourage a prompt resolution of any disputes and to streamline arbitration proceedings where multiple similar claims are filed.” Customers have 30 days to let the site know if they disagree with the terms. If they don’t reach out via email to opt out, the company will consider their silence an agreement to the new terms.

No more neutral arbitrators, please. To make the firm’s intentions easier to understand, the cited article concludes:

The new TOS specifically calls out class-action lawsuits as prohibited. “To the fullest extent allowed by applicable law, you and we agree that each party may bring disputes against the other party only in an individual capacity, and not as a class action or collective action or class arbitration” …

I like this move for three reasons:

  1. It provides another example of how certain Information Highway contractors view the Rules of the Road. In a word, “flexible.” In another word, “malleable.”
  2. The maneuver is one that seems to be — how shall I phrase it — elephantine, not dainty and subtle.
  3. The “fix” for the problem is to make the estimable company less likely to get hit with massive claims in a court. Courts, obviously, are not to be trusted in some situations.

I find the entire maneuver chuckle-invoking. Am I surprised at the move? Nah. You can’t kid this dinobaby.

Stephen E Arnold, December 11, 2023

How about Fear and Paranoia to Advance an Agenda?

December 6, 2023

This essay is the work of a dumb dinobaby. No smart software required.

I thought sex sells. I think I was wrong. Fear seems to be the barn burner at the end of 2023. And why not? We have the shadow of another global pandemic. We have wars galore. We have craziness on US airplanes. We have a Cybertruck, which spells the end for anyone hit by the behemoth.

I read (but did not shake like the delightful female in the illustration) “AI and Mass Spying.” The author is a highly regarded “public interest technologist,” an internationally renowned security professional, and a security guru. For me, the key factoid is that he is a fellow at the Berkman Klein Center for Internet & Society at Harvard University and a lecturer in public policy at the Harvard Kennedy School. Mr. Schneier is a board member of the Electronic Frontier Foundation and the most, most interesting organization AccessNow.


Fear speaks clearly to those in retirement communities, elder care facilities, and those who are uninformed. Let’s say, “Grandma, you are going to be watched when you are in the bathroom.” Thanks, MSFT Copilot. I hope you are sending data back to Redmond today.

I don’t want to make too much of the Harvard University connection. I feel it is important to note that the esteemed educational institution got caught with its ethical pants around its ankles, not once, but twice in recent memory. The first misstep involved an ethics expert on the faculty who allegedly made up information. The second is the current hullabaloo about a whistleblower allegation. The AP slapped this headline on that report: “Harvard Muzzled Disinfo Team after $500 Million Zuckerberg Donation.” (I am tempted to mention the Harvard professor who is convinced he has discovered fungible proof of alien technology.)

So what?

The article “AI and Mass Spying” is a baffler to me. The main point of the write up strikes me as:

Summarization is something a modern generative AI system does well. Give it an hourlong meeting, and it will return a one-page summary of what was said. Ask it to search through millions of conversations and organize them by topic, and it’ll do that. Want to know who is talking about what? It’ll tell you.

I interpret the passage to mean that smart software in the hands of law enforcement, intelligence operatives, investigators in one of the badge-and-gun agencies in the US, or a cyber lawyer is really, really bad news. Smart surveillance has arrived. Smart software can process masses of data. Plus, the outputs may be wrong. I think this means the sky is falling. The fear one is supposed to feel is going to be the way a chicken feels when it sees the Chick-fil-A butcher truck pull up to the barn.
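To make the “organize them by topic” claim concrete, here is a minimal sketch of that step using plain TF-IDF and k-means from scikit-learn rather than a generative model; the four snippets of chatter and the two topics are invented for illustration.

```python
# A minimal sketch of the "organize conversations by topic" step, using classical
# TF-IDF plus k-means instead of a generative model. The transcripts and the
# number of topics are made up for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

transcripts = [
    "Move the shipment to the warehouse on Friday night.",
    "Grandma's doctor appointment is at noon on Tuesday.",
    "The warehouse keys are with the night manager.",
    "Remind Grandma to bring her insurance card to the clinic.",
]

# Turn each conversation into a weighted bag of words.
features = TfidfVectorizer(stop_words="english").fit_transform(transcripts)

# Group the conversations into two rough topics.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

for topic, text in sorted(zip(labels, transcripts)):
    print(f"topic {topic}: {text}")
```

Swap the toy list for millions of intercepted messages and the mechanics stay the same; only the bill and the error rate grow.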

Several observations:

  1. Let’s assume that smart software grinds through whatever information is available to something like a spying large language model. Are those engaged in law enforcement unaware that smart software generates baloney along with the Kobe beef? Will investigators knock off the verification processes because a new system has been installed at a fusion center? The answer to these questions is, “Fear advances the agenda of using smart software for certain purposes; specifically, enforcement of rules, regulations, and laws.”
  2. I know that the idea that “all” information can be processed is a jazzy claim. Google made it, and those familiar with Google search results know that Google does not even come close to all. It can barely deliver useful results from the Railroad Retirement Board’s Web site. “All” covers a lot of ground, and it is unlikely that a policeware vendor will be able to do much more than process a specific collection of data believed to be related to an investigation. “All” is for fear, not illumination. Save the categorical affirmatives for the marketing collateral, please.
  3. The computational cost of applying smart software to large domains of data — for example, global intercepts of text messages — is fun to talk about over lunch. But the costs are quite real. The computational infrastructure has to be paid for. Then come the costs of the downstream systems and of the people who have to figure out whether the smart software is hallucinating or delivering something useful. I would suggest that Israel’s surprise at the unhappy events from October 2023 to the present day unfolded despite the baloney about smart security software, a great intelligence apparatus, and the tons of marketing collateral handed out at law enforcement conferences. News flash: The stuff did not work.

In closing, I want to come back to fear. Exactly what is accomplished by using fear as the pointy end of the stick? Is it insecurity about smart software? Are there other messages framed in a different way to alert people to important issues?

Personally, I think fear is a low-level technique for getting one’s point across. But for those affiliated with an outfit saddled with the ethics matter and now the payola approach to information, how about putting on the big boy pants and selecting a rhetorical trope that does something other than remind people that the Covid thing could have killed us all? Err. No. And what is the agenda fear advances?

So, strike the sex sells trope. Go with fear sells.

Stephen E Arnold, December 6, 2023

23andMe: Those Users and Their Passwords!

December 5, 2023

This essay is the work of a dumb dinobaby. No smart software required.

Silicon Valley and health are a match fabricated in heaven. Not long ago, I learned about the estimable management of Theranos. Now I find out that “23andMe confirms hackers stole ancestry data on 6.9 million users.” If one follows the logic of some Silicon Valley outfits, the data loss is the fault of the users.


“We have the capability to provide the health data and bioinformation from our secure facility. We have designed our approach to emulate the protocols implemented by Jack Benny and his vault in his home in Beverly Hills,” says the enthusiastic marketing professional from a Silicon Valley success story. Thanks, MSFT Copilot. Not exactly Jack Benny, Ed, and the foghorn, but I have learned to live with “good enough.”

According to the peripatetic Lorenzo Franceschi-Bicchierai:

In disclosing the incident in October, 23andMe said the data breach was caused by customers reusing passwords, which allowed hackers to brute-force the victims’ accounts by using publicly known passwords released in other companies’ data breaches.

Users!
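Credential stuffing works because passwords reused by customers are already floating around in older breach dumps. Here is a minimal sketch, assuming the requests library and the public Pwned Passwords range API, of the kind of reuse check either the users or the company could have run; the sample password is a placeholder.

```python
# A minimal sketch of a password-reuse check against the public Pwned Passwords
# range API. Only the first five characters of the SHA-1 hash are sent
# (k-anonymity); the sample password below is a placeholder.
import hashlib
import requests

def times_seen_in_breaches(password: str) -> int:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# A password reused across sites is exactly the kind attackers stuff first.
print(times_seen_in_breaches("password123"))
```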

What’s more interesting is that 23andMe provided estimates of the number of customers (users) whose data somehow magically flowed from the firm into the hands of bad actors. In fact, the numbers, when added up, totaled almost seven million users, not the original estimate of 14,000 23andMe customers.

I find the leak estimate inflation interesting for three reasons:

  1. Smart people in Silicon Valley appear to struggle with simple concepts like adding and subtracting numbers. This gap in one’s education becomes notable when the discrepancy is off by millions. I think “close enough for horseshoes” is a concept which is wearing out my patience. The difference between 14,000 and almost seven million is not horseshoe scoring.
  2. The concept of “security” continues to suffer some setbacks. “Security,” one may ask?
  3. The intentional dribbling of information reflects another facet of what I call high school science club management methods. The logic in the case of 23andMe in my opinion is, “Maybe no one will notice?”

Net net: Time for some regulation, perhaps? Oh, right, it’s the users’ responsibility.

Stephen E Arnold, December 5, 2023 

Cyber Security Responsibility: Where It Belongs at Last!

December 5, 2023

This essay is the work of a dumb dinobaby. No smart software required.

I want to keep this item brief. Navigate to “CISA’s Goldstein Wants to Ditch ‘Patch Faster, Fix Faster’ Model.”

CISA means the US government’s Cybersecurity and Infrastructure Security Agency. The “Goldstein” reference points to Eric Goldstein, the executive assistant director of CISA.

The main point of the write up is that big technology companies have to be responsible for cleaning up their cyber security messes. The write up reports:

Goldstein said that CISA is calling on technology providers to “take accountability” for the security of their customers by doing things like enabling default security controls such as multi-factor authentication, making security logs available, using secure development practices and embracing memory safe languages such as Rust.
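The first control on that list, multi-factor authentication turned on by default, is not exotic engineering. Below is a minimal sketch, assuming the third-party pyotp library and a made-up account, of what time-based one-time passwords involve.

```python
# A minimal sketch of default-on multi-factor authentication with time-based
# one-time passwords; assumes the third-party pyotp library. The account name
# and issuer are placeholders, and a real service would store the secret
# server side at enrollment.
import pyotp

secret = pyotp.random_base32()      # generated once per account, on by default
totp = pyotp.TOTP(secret)

# The provisioning URI is what ends up in the user's authenticator app,
# usually rendered as a QR code.
print(totp.provisioning_uri(name="user@example.com", issuer_name="ExampleCo"))

# At login, the service rejects the session unless the submitted code verifies.
submitted = totp.now()              # stand-in for whatever the user types
print("MFA passed" if totp.verify(submitted) else "MFA failed")
```

The hard part is not the code; it is shipping it enabled and eating the support calls.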

I may be incorrect, but I picked up a signal that the priorities of some techno feudalists are not security. Perhaps these firms’ goals are maximizing profit, market share, and power over their paying customers. Security? Maybe it is easier to describe in a slide deck or a short YouTube video?


The use of a parental mode seems appropriate for a child. Will it work for techno feudalists who have created a digital mess in kitchens throughout the world? Thanks, MSFT Copilot. You must have ingested some “angry mommy” data when you were but a wee sprout.

Will this approach improve the security of mission-critical systems? Will the enjoinder make a consumer’s mobile phone more secure?

My answer? Without meaningful consequences, security is easier to talk about than deliver. Therefore, minimal change in the near future. I wish I were wrong.

Stephen E Arnold, December 5, 2023

Omegle: Hasta La Vista

November 30, 2023

This essay is the work of a dumb dinobaby. No smart software required.

In the Internet’s early days, users could sign into chatrooms and talk with strangers. While chatrooms have fallen out of favor, the idea of talking with strangers hung on, but now it is accompanied by video. Chatroulette and Omegle are popular chatting applications that allow users to video chat with random individuals. The apps are notorious for pranks and NSFW content, including child sexual abuse. The Independent shared a story about one of the two: “Omegle Anonymous Chat App Shuts Down After 14 Years.”

Omegle had a simple concept: sign in, be connected to another random person, and video chat for as long as you like. Leif K-Brooks launched the chat platform with good intentions in 2009, but it didn’t take long for bad actors to infiltrate it. K-Brooks tried to stop criminal activity on Omegle with features such as “monitored chats” with moderators. They didn’t work, and Omegle continued to receive flak. K-Brooks doesn’t want to deal with the criticism anymore:

“The intensity of the fight over use of the site had forced him to decide to shut it down, he said, and it will stop working straight away. ‘As much as I wish circumstances were different, the stress and expense of this fight – coupled with the existing stress and expense of operating Omegle, and fighting its misuse – are simply too much. Operating Omegle is no longer sustainable, financially nor psychologically. Frankly, I don’t want to have a heart attack in my 30s,’ wrote Leif K-Brooks, who has run the website since founding it.”

Omegle’s popularity rose during the pandemic. The sudden popularity surge highlighted the criminal acts on the video chat platform. K-Brooks believes that his critics used fear to shut down the Web site. He also acknowledged that people are quicker to attack and slower to recognize shared humanity. He theorizes that social media platforms are being labeled negatively because of small groups of bad actors.

Whitney Grace, November 30, 2023

Who Benefits from Advertising Tracking Technology? Teens, Bad Actors, You?

November 23, 2023

This essay is the work of a dumb humanoid. No smart software required.

Don’t get me wrong. I absolutely love advertising. When I click to Sling’s or Tubi’s free TV, watch a YouTube video about an innovation in physics, or visit the UK’s Daily Mail, I see just a little bit of content. The rest, it seems to this dinobaby, is advertising. For some reason, YouTube this morning (November 17, 2023) is showing me ads for a video game or undergarments for a female-oriented person before I can watch an update on the solemnity of Judge Engoron’s courtroom.

However, there are some people who are not “into” advertising. I want to point out that these individuals are in the minority; otherwise, people flooded with advertising would not disconnect or navigate to a somewhat less mercantile souk. Yes, a few exist; for example, government Web sites. (I acknowledge that some governments’ Web sites are advertising, but there’s not much I can do about that fusion of pitches and objective information about the location of a nation’s embassy.)

But to the matter at hand. I read a PDF titled “Europe’s Hidden Security Crisis.” The document is a position paper, a white paper, or a special report. The terminology varies depending on the entities involved in the assembly of the information. The group apparently nudging the intrepid authors to reveal the “hidden security crisis” could be the Irish Council for Civil Liberties. I know zero about the group, and I know even less about the authors, Dr. Johnny Ryan and Wolfie Christl. Dr. Ryan has written for the newspaper which looks like a magazine, and Mr. Christl is a principal of Cracked Labs.

So what’s the “hidden security crisis”? There is a special operation underway in Ukraine. The open source information crowd is documenting assorted facts and developments on X.com. We have the public Telegram channels outputting a wealth of information about the special operation and the other unhappy circumstances in Europe. We have the Europol reports about cyber crime, takedowns, and multi-nation operations. I receive in my newsfeed pointers to “real” news about a wide range of illegal activities. In short, what’s hidden?


An evil Web bug is capturing information about a computer user. She is not afraid. She is unaware… apparently. Thanks Microsoft Bing. Ooops. Strike that. Thanks, Copilot. Good Grinch. Did you accidentally replicate a beloved character or just think it up?

The report focuses on what I have identified as my first love — commercial messaging aka advertising.

The “hidden,” I think, refers to data collected when people navigate to a Web site and click, drag a cursor, or hover on a particular region. Those data, along with date, time, browser used, and similar information, are knitted together into a tidy bundle. These data can be used to have other commercial messages follow a person to another Web site, trigger an email urging the surfer to buy more or just buy something, or populate one of the cross-tabulation companies’ databases.

The write up uses the lingo RTB, or real-time bidding, to describe the data collection. The report states:

Our investigation highlights a widespread trade in data about sensitive European personnel and leaders that exposes them to blackmail, hacking and compromise, and undermines the security of their organizations and institutions.  These data flow from Real-Time Bidding (RTB), an advertising technology that is active on almost all websites and apps. RTB involves the broadcasting of sensitive data about people using those websites and apps to large numbers of other entities, without security measures to protect the data. This occurs billions of times a day. Our examination of tens of thousands of pages of RTB data reveals that EU military personnel and political decision makers are targeted using RTB.
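To make the mechanism less abstract, here is a simplified, hypothetical sketch of the sort of bid request RTB broadcasts before an ad loads. The field names loosely follow the OpenRTB convention, and every value is invented.

```python
# A simplified, hypothetical RTB bid request; field names loosely follow the
# OpenRTB convention, and all values are invented for illustration.
import json

bid_request = {
    "id": "req-0001",
    "imp": [{"id": "1", "banner": {"w": 300, "h": 250}}],
    "site": {"page": "https://example.com/defense-policy-news"},
    "device": {
        "ua": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "ip": "203.0.113.7",
        "geo": {"lat": 50.8503, "lon": 4.3517},  # precise enough to place a person
    },
    "user": {"id": "abc123-persistent-cookie-id"},
}

# The payload is broadcast to many bidders at once, winners and losers alike.
print(json.dumps(bid_request, indent=2))
```

That broadcast, repeated billions of times a day, is the leak the report is worried about.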

In the US, the sale of data gathered via advertising cookies, beacons, and related technologies is a business with nearly 1,000 vendors offering data. I am not sure about the “hidden” idea, however. If the term applies to an average Web user, most of those folks do not know about changing defaults. That is not a hidden function; that is an indication of how much the user knows about a specific piece of software.

If you are interested in the report, navigate to this link. You may find the “security crisis” interesting. If not, keep in mind that one can eliminate such tracking with fairly straightforward preventative measures. For me, I love advertising. I know the beacons and bugs want to do the right thing: Capture and profile me to the nth degree. Advertising! It is wonderful and its data exhaust informative and useful.

Stephen E Arnold, November 23, 2023

Is Your Phone Secure? Think Before Answering, Please

November 21, 2023

This essay is the work of a dumb dinobaby. No smart software required.

I am not going to offer my observations and comments. The article, its information, and the list of companies from The Times of India’s “11 Dangerous Spywares Used Globally: Pegasus, Hermit, FinFisher and More” speak for themselves. The main point of the write up is that mobile phone security should be considered in the harsh light of digital reality. The write up provides a list of outfits and components which can be used to listen to conversations, intercept text and online activity, as well as exfiltrate geolocation data, contact lists, logfiles, and imagery. Some will say, “This type of software should be outlawed.” I have no comment.


Are there bugs waiting to compromise your mobile device? Yep. Thanks, MSFT Copilot. You have a knack for capturing the type of bugs with which many are familiar.

Here’s the list. I have alphabetized it by the name of the malware and provided a possible entity name for the owner:

  • Candid. Maybe a Verint product? (Believed to be another product developed by former Israeli cyber warfare professionals)
  • Chrysaor. (Some believe it was created by NSO Group or NSO Group former employees)
  • Dark Tequila. (Requires access to the targeted device or for the user to perform an action. More advanced methods require neither access to the device nor a click from the user)
  • FinFisher. Gamma Group (The code is “in the wild,” and the German unit may be on vacation or working under a different name in the UK)
  • Hawkeye, Predator, or Predator Pain (Organization owning the software is not known to this dinobaby)
  • Hermit. RCS Lab (Does RCS mean “remote control service”?)
  • Pegasus. NSO Group (now with a new president who worked at NSA and Homeland Security)
  • RATs (Remote Access Trojans). This is a general class of malware. Many variants.
  • Sofacy. APT28 (allegedly)
  • XKeyscore (allegedly developed by a US government agency)

Is the list complete? No.

Stephen E Arnold, November 21, 2023

Why Suck Up Health Care Data? Maybe for Cyber Fraud?

November 20, 2023

This essay is the work of a dumb humanoid. No smart software required.

In the US, medical care is an adventure. Last year, my “wellness” check-up required a visit to another specialist. I showed up at the appointed place on the day and time my printed form stipulated. I stood in line for 10 minutes as two “intake” professionals struggled to match those seeking examinations with the information available to the check-in desk staff. The intake professional called my name and said, “You are not a female.” I said, “That is correct.” The intake professional replied, “We have the medical records from your primary care physician for a female named Tina.” Nice Health Insurance Portability and Accountability Act compliance, right?


A moose in Maine learns that its veterinary data have been compromised by bad actors, probably from a country in which the principal language is not moose grunts. With those data, the shocked moose can be located using geographic data in his health record. Plus, the moose’s credit card data is now on the loose. If the moose in Maine is scared, what about the humanoids with the fascinating nasal phonemes?

That same health care outfit later reported that it had been compromised by a hacker. The health care outfit floundered around and now, months later, struggles to update prescriptions and keep appointments straight. How’s that for security? In my book, that’s about par for health care managers who [a] know zero about confidentiality requirements and [b] know even less about system security. Horrified? You can read more about this one-horse travesty in “Norton Healthcare Cyber Attack Highlights Record Year for Data Breaches Nationwide.” I wonder if the grandparents of the Norton operation were participants on Major Bowes’ Amateur Hour radio show.

Norton Healthcare was a poster child for the Commonwealth of Kentucky. But the great state of Maine (yep, the one with moose, lovable black flies, and citizens who push New York real estate agents’ vehicles into bays) managed to lose the personal data for 2,192,515 people. You can read about that “minor” security glitch in the Office of the Maine Attorney General’s Data Breach Notification.

What possible use is health care data? Let me identify a handful of bad actor scenarios enabled by inept security practices. Note, please, that these are worse than being labeled a girl or failing to protect the personal information of what could be most of the humans and probably some of the moose in Maine.

  1. Identity theft. Those newborns and entries identified as deceased can be converted into usable personas for a range of applications, like applying for Social Security numbers, passports, or government benefits.
  2. Access to bank accounts. With a complete array of information, a bad actor can engage in a number of maneuvers designed to withdraw or transfer funds.
  3. Bundle up the biological data and sell it via one of the private Telegram channels focused on such useful information. Bioweapon researchers could find some of the data fascinating.

Why am I focusing on health care data? Here are the reasons:

  1. Enforcement of existing security guidelines seems to be lax. Perhaps it is time to conduct audits and penalize those outfits which find security easy to talk about but difficult to do?
  2. Should one or more Offices of Inspector General collect some data on the practices of state and Federal health care security professionals, their competencies, and their on-the-job performance? Some humans and probably a moose or two in Maine might find this idea timely.
  3. Should the vendors of health care security systems demonstrate to one of the numerous Federal cyber watchdog groups the efficacy of their systems and then allow one or more of the Federal agencies to probe those systems to verify that the systems do, in fact, work?

Without meaningful penalties for security failures, it may be easier to post health care data on a Wikipedia page and quit the crazy charade that health information is secure.

Stephen E Arnold, November 20, 2023

Smart Software for Cyber Security Mavens (Good and Bad Mavens)

November 17, 2023

This essay is the work of a dumb humanoid. No smart software required.

One member of my research team (who wishes to maintain a low profile) called my attention to the “Awesome GPTs (Agents) for Cybersecurity” repository. The list on GitHub says:

The "Awesome GPTs (Agents) Repo" represents an initial effort to compile a comprehensive list of GPT agents focused on cybersecurity (offensive and defensive), created by the community. Please note, this repository is a community-driven project and may not list all existing GPT agents in cybersecurity. Contributions are welcome – feel free to add your own creations!


Open source cyber security tools and smart software can be used by good actors to make people safe. The tools can be used by less good actors to create some interesting situations for cyber security professionals, the elderly, and clueless organizations. Thanks, Microsoft Bing. Does MSFT use these tools to keep people safe or unsafe?

When I viewed the list, it contained more than 30 items. Let me highlight three and invite you to check out the rest at the repository link:

  1. The Threat Intel Bot. This is a specialized GPT for advanced persistent threat intelligence.
  2. The Message Header Analyzer. This dissects email headers for “insights.” (A minimal parsing sketch follows this list.)
  3. Hacker Art. The software generates hacker art and nifty profile pictures.
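As for item two, a header analyzer is less magical than it sounds. Here is a minimal sketch using only Python’s standard email module; the raw message is a made-up phishing sample.

```python
# A minimal sketch of what a message header analyzer does, using only the
# standard library; the raw message below is a made-up phishing sample.
from email import message_from_string
from email.policy import default

raw = (
    "Received: from mail.example.net (203.0.113.9) by mx.example.com;"
    " Fri, 17 Nov 2023 09:15:02 +0000\r\n"
    "From: \"IT Support\" <helpdesk@examp1e.net>\r\n"
    "Reply-To: attacker@example.org\r\n"
    "Subject: Your password expires today\r\n"
    "\r\n"
    "Click here immediately.\r\n"
)

msg = message_from_string(raw, policy=default)

# Pull out the fields an analyst eyeballs first: path, sender, reply routing.
print("Received chain:", msg.get_all("Received"))
print("From:          ", msg["From"])
print("Reply-To:      ", msg["Reply-To"])

# A From/Reply-To mismatch is a classic phishing tell.
if msg["Reply-To"] and str(msg["Reply-To"]) not in str(msg["From"]):
    print("Flag: Reply-To does not match the From address")
```

A GPT agent presumably layers natural-language glue on top of exactly this kind of extraction.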

Several observations:

  • More tools and services will be forthcoming; thus, the list will grow.
  • Bad actors and good actors will find software to help them accomplish their objectives.
  • A for-fee bundle of these will be assembled and offered for sale, probably on eBay or Etsy. (Too bad, fr0gger.)

Useful list!

Stephen E Arnold, November 17, 2023
