A Rare Moment of Constructive Cooperation from Tech Barons

November 23, 2023

This essay is the work of a dumb dinobaby. No smart software required.

Platform-hopping is one way bad actors have been able to cover their tracks. Now several companies are teaming up to limit that avenue for one particularly odious group. TechNewsWorld reports, “Tech Coalition Launches Initiative to Crackdown on Nomadic Child Predators.” The initiative is named Lantern, and the Tech Coalition includes Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Such cooperation is essential to combat a common tactic for grooming and/or sextortion: predators engage victims on one platform, then move the discussion to a more private forum. Reporter John P. Mello Jr. describes how Lantern works:

“Participating companies upload ‘signals’ to Lantern about activity that violates their policies against child sexual exploitation identified on their platform.

Signals can be information tied to policy-violating accounts like email addresses, usernames, CSAM hashes, or keywords used to groom as well as buy and sell CSAM. Signals are not definitive proof of abuse. They offer clues for further investigation and can be the crucial piece of the puzzle that enables a company to uncover a real-time threat to a child’s safety.

Once signals are uploaded to Lantern, participating companies can select them, run them against their platform, review any activity and content the signal surfaces against their respective platform policies and terms of service, and take action in line with their enforcement processes, such as removing an account and reporting criminal activity to the National Center for Missing and Exploited Children and appropriate law enforcement agency.”
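The write-up describes the mechanics only at a high level, so here is a minimal, purely hypothetical sketch in Python of what “running a signal against a platform” might look like. The Signal record, the example values, and the matching logic are illustrative assumptions on my part; the coalition has not published Lantern’s actual data model or code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Signal:
    """A hypothetical cross-platform signal: a category plus an opaque value."""
    kind: str   # e.g. "email", "username", "csam_hash", "keyword"
    value: str  # the identifier or hash itself

# Signals another participant uploaded to the shared repository (made-up values).
uploaded_signals = [
    Signal("email", "badactor@example.com"),
    Signal("csam_hash", "f3a9c1d2e4b5a6f7"),  # placeholder hash, not a real one
]

# An index a receiving platform might keep over its own accounts (also made up).
local_index = {
    "email": {"badactor@example.com": "account_1029"},
    "csam_hash": {},
}

def review_queue(signals, index):
    """Return (account, signal) pairs that need human review.

    A match is only a lead for investigation, not proof of abuse,
    which mirrors the caveat in the passage quoted above.
    """
    hits = []
    for sig in signals:
        account = index.get(sig.kind, {}).get(sig.value)
        if account:
            hits.append((account, sig))
    return hits

for account, sig in review_queue(uploaded_signals, local_index):
    print(f"Flag {account} for policy review (matched a {sig.kind} signal)")
```

The point the sketch tries to capture is that matching only surfaces leads; each participating company still reviews the flagged activity against its own policies before removing an account or filing a report.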

The visually oriented can find an infographic of this process in the write-up. We learn Lantern has been in development for two years. Why did it take so long to launch? Part of the time went to designing the program to be effective. Another part went to ensuring it was managed responsibly: the project was subjected to a Human Rights Impact Assessment by Business for Social Responsibility. Experts on child safety, digital rights, advocacy for marginalized communities, government, and law enforcement were also consulted. Finally, we’re told, measures were taken to ensure transparency and victims’ privacy.

In the past, companies hesitated to share such information lest they be considered culpable. However, some hope this initiative represents a perspective shift that will extend to other bad actors, like those who spread terrorist content. Perhaps. We shall see how much tech companies are willing to cooperate. They wouldn’t want to reveal too much to the competition just to help society, after all.

Cynthia Murrell, November 23, 2023

Predictive Analytics and Law Enforcement: Some Questions Arise

October 17, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

We wish we could prevent crime before it happens. With AI and predictive analytics, that seems possible, but Wired shares that “Predictive Policing Software Terrible At Predicting Crimes.” Plainfield, NJ’s police department purchased Geolitica predictive software, and it was not a wise use of taxpayer money. The Markup, a nonprofit investigative organization that wants technology to serve the common good, reported Geolitica’s accuracy:

“We examined 23,631 predictions generated by Geolitica between February 25 and December 18, 2018, for the Plainfield Police Department (PD). Each prediction we analyzed from the company’s algorithm indicated that one type of crime was likely to occur in a location not patrolled by Plainfield PD. In the end, the success rate was less than half a percent. Fewer than 100 of the predictions lined up with a crime in the predicted category, that was also later reported to police.”

The Markup also analyzed predictions for robberies and aggravated assaults that would occur in Plainfield; the success rate was 0.6%. Burglary predictions were worse at 0.1%.
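The headline number is easy to sanity-check. Treating the quoted “fewer than 100” as an upper bound, a few lines of Python reproduce the sub-half-percent figure:

```python
total_predictions = 23_631   # predictions The Markup examined (Feb 25 - Dec 18, 2018)
matched = 100                # upper bound: "fewer than 100" lined up with a reported crime

hit_rate = matched / total_predictions
print(f"Overall success rate: {hit_rate:.2%}")  # ~0.42%, i.e., less than half a percent
```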

The police weren’t really interested in using Geolitica either. They wanted to be accurate in predicting and reducing crime. The Plainfield PD hardly used the software and discontinued the program. Geolitica charged $20,500 for a one-year subscription, then $15,500 for yearly renewals. Geolitica’s information was also inconsistent. Police found training and experience to be as effective as the predictions the software offered.

Geolitica will go out of business at the end of 2023. The law enforcement technology company SoundThinking hired Geolitica’s engineering team and will acquire some of its intellectual property too. Police software companies are changing their products and services to manage police department data.

Crime data are important. Where crimes and victimization occur should be recorded and analyzed. Newark, New Jersey, used risk terrain modeling (RTM) to identify areas where aggravated assaults would occur. Analysts used land-use data and found that vacant lots were major crime locations.
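For readers unfamiliar with the technique, risk terrain modeling scores small geographic cells by the environmental features they contain (vacant lots, bus stops, liquor outlets, and so on) and by nearby incident history. The snippet below is a heavily simplified, hypothetical illustration of the counting idea only; real RTM implementations use spatial statistics and model selection rather than raw tallies, and the sample records are invented.

```python
from collections import Counter

# Invented incident records: (grid_cell_id, nearest_land_use_feature)
incidents = [
    ("cell_12", "vacant_lot"),
    ("cell_12", "vacant_lot"),
    ("cell_07", "bus_stop"),
    ("cell_12", "liquor_outlet"),
    ("cell_33", "vacant_lot"),
]

# Tally aggravated assaults by the land-use feature nearest to each incident.
feature_counts = Counter(feature for _, feature in incidents)
print(feature_counts.most_common())
# A disproportionate share for "vacant_lot" is the kind of pattern Newark's analysis surfaced.
```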

Predictive methods have value, but their application is limited to specific use cases. Math is not the answer to some challenges.

Whitney Grace, October 17, 2023

Europol Focuses on Child Centric Crime

October 16, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Children are the most vulnerable and exploited population in the world. The Internet unfortunately aids bad actors by allowing them to distribute child sexual abuse material (CSAM) while evading censors. Europol (the European Union’s law enforcement agency) wants to end CSAM by overriding Europeans’ privacy rights. Tech Dirt explores the idea in the article, “Europol Tells EU Commission: Hey, When It Comes To CSAM, Just Let Us Do Whatever We Want.”

Europol wants unfiltered access to an EU-proposed AI algorithm, and the data it generates, programmed to scan online content for CSAM. The police agency also wants to use the same AI to detect other crimes. This information came from a July 2022 high-level meeting that involved Europol Executive Director Catherine De Bolle and the European Commission’s Director-General for Migration and Home Affairs Monique Pariat. Europol pitched this idea when the EU believed it would mandate client-side scanning on service providers.

Privacy activists and EU member nations vetoed the idea because it would allow anyone to eavesdrop on private conversations. They also found it violated privacy rights. Europol used the familiar “for the children” or “save the children” rhetoric to justify the proposal. Law enforcement, politicians, religious groups, and parents have spouted that rhetoric for years, and it makes anyone who raises nuanced objections appear to side with pedophiles.

“It shouldn’t work as well as it does, since it’s been a cliché for decades. But it still works. And it still works often enough that Europol not only demanded access to combat CSAM but to use this same access to search for criminal activity wholly unrelated to the sexual exploitation of children… Europol wants a police state supported by always-on surveillance of any and all content uploaded by internet service users. Stasi-on-digital-steroids. Considering there’s any number of EU members that harbor ill will towards certain residents of their country, granting an international coalition of cops unfiltered access to content would swiftly move past the initial CSAM justification to governments seeking out any content they don’t like and punishing those who dared to offend their elected betters.”

There’s also evidence that law enforcement officials and politicians push anti-privacy laws while in the public sector, then leave for the private sector. Once there, they work at companies that sell surveillance technology to governments. Is that a type of insider trading or nefarious influence?

Whitney Grace, October 16, 2023

Cognitive Blind Spot 4: Ads. What Is the Big Deal Already?

October 11, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Last week, I presented a summary of Dark Web Trends 2023, a research update my team and I prepare each year. I showed a visual of the ads on a Dark Web search engine. Here’s an example of one of my illustrations:

[Image: the TorLanD Dark Web search engine, showing a search box and advertisements]

The TorLanD service, when it is accessible via Tor, displays a search box and advertising. What is interesting about this service and a number of other Dark Web search engines is the ads. The search results are so-so, vastly inferior to those information retrieval solutions offered by intelware vendors.

Some of the ads appear on other Dark Web search systems as well; for example, Bobby and DarkSide, among others. The advertisements offer a range of interesting content. The TorLanD screenshot pitches carding, porn, drugs, gadgets (skimmers and software), and illegal substances. I pointed out that the ads on TorLanD looked a lot like the ads on Bobby; for instance:

[Image: advertisements on the Bobby Dark Web search engine]

I want to point out that the Silk Road 4.0 and the Gadgets, Docs, Fakes ads are identical. Notice also that TorLanD advertises on Bobby. The Helsinki Drug Marketplace on the Bobby search system offers heroin.

Most of these ads are trade-outs. The idea is that one Dark Web site will display an ad for another Dark Web site. There are often links to Dark Web advertising agencies as well. (For this short post, I won’t be listing these vendors, but if you are interested in this research, contact benkent2020 at yahoo dot com. One of my team will follow up and explain our for-fee research policy.)

The point of these two examples is to make clear that advertising has become normalized, even among bad actors. Furthermore, few are surprised that bad actors (or alleged bad actors) communicate, pat one another on the back, and support an ecosystem to buy and sell space on the increasingly small Dark Web. Please note that advertising appears in public and private Telegram groups focused on the topics referenced in these Dark Web ads.

Can you believe the ads? Some people do. Users of the Clear Web and the Dark Web are conditioned to accept ads and to believe that these are true, valid, useful, and intended to make it easy to break the law and buy a controlled substance or CSAM. Some ads emphasize “trust.”

People trust ads. People believe ads. People expect ads. In fact, one can poke around and identify advertising and PR agencies touting the idea that people “trust” ads, particularly those with brand identity. How does one build brand? Give up? Advertising and weaponized information are two ways.

The cognitive bias that operates is that people embrace advertising. Look at a page of Google results. Which are ads, and which are ads that are not identified as such? What happens when ads are indistinguishable from plausible messages? Some online companies offer stealth ads. On the Dark Web pages illustrating this essay, some of the ads are law enforcement agencies masquerading as bad actors. Can you identify one such ad? What about messages on Twitter designed to be difficult to spot as paid messages or weaponized content? For one take on Twitter technology, read “New Ads on X Can’t Be Blocked or Reported, and Aren’t Labeled as Advertisements.”

Let me highlight some of the functions of online ads like those on the Dark Web sites. I will ignore the Clear Web ads for the purposes of this essay:

  1. Click on the ad and receive malware
  2. Visit the ad and explore the illegal offer so that the site operator can obtain information about you
  3. Sell you a product and obtain the identifiers you provide, a delivery address (either physical or digital), or plant a beacon on your system to facilitate tracking
  4. Gather emails for phishing or other online initiatives
  5. Blackmail.

I want to highlight advertising as a vector of weaponization for three reasons: [a] People believe ads. I know it sounds silly, but ads work. People suspend disbelief when an ad on a service offers something that sounds too good to be true; [b] many people do not question the legitimacy of an ad or its message. Ads are good. Ads are everywhere; and [c] ads are essentially unregulated.

What happens when everything drifts toward advertising? The cognitive blind spot kicks in and one cannot separate the false from the real.

Public service note: Before you explore Dark Web ads or click links on social media services like Twitter, consider that these are vectors which can point to quite surprising outcomes. Intelligence agencies outside the US use Dark Web sites as a way to harvest useful information. Bad actors use ads to rip off unsuspecting people, like the doctor who once lived two miles from my office; she used a Dark Web ad to order a hitman to terminate an individual.

Ads are unregulated and full of surprises. But the cognitive blind spot for advertising guarantees that the technique will flourish and gain technical sophistication. Are those objective search results useful information or weaponized? Will the Dark Web vendor really sell you valid stolen credit cards? Will the US postal service deliver an unmarked envelope chock full of interesting chemicals?

Stephen E Arnold, October 11, 2023

Traveling to France? On a Watch List?

August 25, 2023

The capacity for surveillance has been lurking in our devices all along, of course. Now, reports Azerbaijan’s Azernews, “French Police Can Secretly Activate Phone Cameras, Microphones, and GPS to Spy on Citizens.” The authority to remotely activate devices was part of a larger justice reform bill recently passed. Officials insist, though, this authority will not be used willy-nilly:

“A judge must approve the use of the powers, and the recently amended bill forbids use against journalists, lawyers, and other ‘sensitive professions.’ The measure is also meant to limit use to serious cases, and only for a maximum of six months. Geolocation would be limited to crimes that are punishable by at least five years in prison.”

Surely, law enforcement would never push those limits. Apparently the Orwellian comparisons are evident even to officials, since Justice Minister Éric Dupond-Moretti preemptively batted them away. Nevertheless, we learn:

“French digital rights advocacy group, La Quadrature du Net, has raised serious concerns over infringements of fundamental liberties, and has argued that the bill violates the ‘right to security, right to a private life and to private correspondence’ and ‘the right to come and go freely.’ … The legislation comes as concerns about government device surveillance are growing. There’s been a backlash against NSO Group, whose Pegasus spyware has allegedly been misused to spy on dissidents, activists, and even politicians. The French bill is more focused, but civil liberties advocates are still alarmed at the potential for abuse. The digital rights group La Quadrature du Net has pointed out the potential for abuse, noting that remote access may depend on security vulnerabilities. Police would be exploiting security holes instead of telling manufacturers how to patch those holes, La Quadrature says.”

Smartphones, laptops, vehicles, and any other connected devices are all fair game under the new law. But only if one has filed the proper paperwork, we are sure. Nevertheless, progress.

Cynthia Murrell, August 25, 2023

Is It Lights Out on the Information Superhighway?

April 26, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

We just completed a lecture about the shadow web. This is our way of describing a number of technologies specifically designed to keep law enforcement, tax authorities, and other entities charged with enforcing applicable laws in the dark.

Among the tools available are roulette services. These can be applied to domain proxies so it is very difficult to figure out where a particular service is at a particular point in time. Tor has uttered noises about supporting the Mullvad browser and baking in a virtual private network. But there are other VPNs available, and one of the largest infrastructure service providers is under what appears to be “new” ownership. Change may create a problem for some enforcement entities. Other developers work overtime to provide services primarily to those who want to deploy what we call “traditional Dark Web sites.” Some of these obfuscation software components are available on Microsoft’s GitHub.
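Conceptually, a “roulette” setup just spreads a service across a pool of short-lived mirror addresses so that no single address reveals where the service actually lives at any moment. The fragment below is a deliberately trivial, hypothetical sketch of that idea from the client side; real obfuscation tooling rotates domains, hosts, and proxies automatically and is far more elaborate, and the addresses shown are invented.

```python
import random

# Hypothetical pool of mirror addresses; a roulette service swaps these frequently.
mirror_pool = [
    "mirror-one.example.onion",
    "mirror-two.example.onion",
    "mirror-three.example.onion",
]

def pick_mirror(pool):
    """Choose a mirror at random so repeated lookups rarely hit the same address."""
    return random.choice(pool)

for _ in range(3):
    print("Routing request via", pick_mirror(mirror_pool))
```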

I want to point to “Global Law Enforcement Coalition Urges Tech Companies to Rethink Encryption Plans That Put Children in Danger from Online Abusers.” The main idea behind the joint statement (the one to which I point is from the UK’s National Crime Agency) is:

The announced implementation of E2EE on META platforms Instagram and Facebook is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe. META is currently the leading reporter of detected child sexual abuse to NCMEC. The VGT has not yet seen any indication from META that any new safety systems implemented post-E2EE will effectively match or improve their current detection methods.

From my point of view, a questionable “player” has an opportunity to make it possible to enforce laws related to human trafficking, child safety, and related crimes like child pornography. The “player” seems interested in implementing encryption that would make government enforcement more difficult, if not impossible in some circumstances.

The actions of this “player” illustrate part of a fundamental change in the Internet. What was underground is now moving above ground. The implementation of encryption in messaging applications is a big step toward turning the “regular” Internet, or what some call the Clear Web, into a new version of the Dark Web. Not surprisingly, the Dark Web will not go away, but why develop Dark Web sites when Clear Web services provide communications, secrecy, the ability to transmit images and videos, and the means to perform financial transactions related to these data? Thus the Clear Web is falling into the shadows.

My team and I are not pleased when a company ignores appropriate, what we call “ethical,” behavior and takes specific actions that increase risks to average Internet users. In fact, some of the “player’s” actions are specifically designed to make the player’s service more desirable to a market segment once largely focused on the Dark Web.

More than suggestions are needed in my opinion. Direct action is required.

Stephen E Arnold, April 26, 2023

NSO Group: How Easy Are Mobile Hacks?

April 25, 2023

I am at the 2023 US National Cyber Crime Conference, and I have been asked, “What companies offer NSO-type mobile phone capabilities?” My answer is, “Quite a few.” Will I name these companies in a free blog post? Sure, just call us at 1-800-YOU-WISH.

A more interesting question is, “Why is Israel-based NSO Group the pointy end of a three-meter stick aimed at mobile devices?” (To get some public information about newly recognized NSO Group (Pegasus) tricks, navigate to “Triple Threat. NSO Group’s Pegasus Spyware Returns in 2022 with a Trio of iOS 15 and iOS 16 Zero-Click Exploit Chains.” I would point out that the reference to Access Now is interesting, and a crime analyst may find a few minutes examining what the organization does, its “meetings,” and its hosting services time well spent. Will I provide that information in a free blog post? Please, call the 800 number listed above.)

Now let’s consider the question regarding the productivity of the NSO technical team.

First, Israel’s defense establishment contains many bright people and a world-class training program. What happens when you combine well-educated people, the threat of war without warning, and an outstanding in-service instructional setup? The answer is, “Ideas get converted into exercises. Exercises become test code. Test code gets revised. And the functional software becomes weaponized.”

Second, the “in our foxhole” mentality persists once trained military specialists leave formal service and enter the commercial world. As a result, individuals who studied, worked, and in some cases, fought together set up companies. These individuals are a bit like beavers. Beavers do what beavers do. Some of these firms replicate functionality similar to that developed under the government’s watch and sell those products. Please note that NSO Group is an exception of sorts. Some of the “insights” originated when the founders were repairing mobile phones. The idea, however, is the same: learning, testing, deploying, and hiring individuals with specialized training from the Israeli government. Keep in mind the “in my foxhole” notion, please.

Third, important firms in Israel or, in some cases, government-assisted development programs directly or indirectly provide: [a] money; [b] meet-up opportunities like “tech fests” in Tel Aviv; and [c] suggestions about whom to hire, partner with, consult with, or be aware of.

Do these conditions exist in other countries? In my experience, this approach to mobile technology exploits does exist elsewhere to some degree. There are important differences. If you want to know what these are, you know the answer. Buzz that 800 number.

My point is that the expertise, insights, systems, and methods of what the media calls “the NSO Group” have diffused. As a result, there are more choices than ever before when it comes to exploiting mobile devices.

Where’s Apple? Where’s Google? Where’s Samsung? The firms, in my opinion, are in reactive mode, and, in some cases, they don’t know what they don’t know.

Stephen E Arnold, April 25, 2023

Killing Wickr

January 26, 2023

Encrypted messaging services are popular with privacy-conscious users as well as freedom fighters in authoritarian countries. Tech companies consider these messaging services to be a wise investment, so Amazon purchased Wickr in 2021. Wickr is an end-to-end encrypted messaging app, and it was made available to AWS users. Gizmodo explains that Wickr will soon be nonexistent in the article, “Amazon Plans To Close Up Wickr’s User-Centric Encrypted Messaging App.”

Amazon no longer wants to be part of the consumer encrypted messaging business, which got as saturated as the ugly Christmas sweater market. Amazon is killing the Wickr Me app and limiting use to business and public-sector customers through AWS Wickr and Wickr Enterprise. New registrations end on December 31, and the app will be retired by the end of 2023.

Wickr was worth $60 million when Amazon purchased it. Amazon, however, lost $1 trillion in stock value in November 2022, becoming the first company in history to claim that “honor.” Amazon is laying off employees and working through company buyouts. Changing Wickr’s target market could recoup some of the losses:

“But AWS apparently wants Wickr to focus on its business and government customers much more than its regular users. Among those public entities using Wickr is U.S. Customs and Border Protection. That contract was reportedly worth around $900,000 when first reported in September last year. Sure, the CBP wants encrypted communications, but Wickr can delete all messages sent via the app, which is an increasingly dangerous proposition for open government advocates.”

Wickr, like other encryption services, does not have a clean record. It has been used for illegal drug sales and trade in other illicit items via the Dark Web. At one time, Wickr might have been a source of useful metadata. Not now. Odd.

Whitney Grace, January 26, 2023

Encryption Will Be an Issue in 2023

January 18, 2023

The United Kingdom is making plans to weaken encryption within its boundaries. The UK government claims the country will become the safest place in the world to be online, but others fear free speech and privacy will be the ultimate victims. The Next Web explores the situation in “Privacy Advocates Are Aghast at UK’s Anti-Encryption Plans.” The UK’s plans are part of the Online Safety Bill currently in parliament.

Privacy advocates are most worried about end-to-end encrypted (E2EE) messenger apps, because the new measures would force Internet providers to scan private messages for illegal content. Another clause would require Internet providers to use “accredited technology” to keep people from spreading terrorist content or child pornography.

The bill would make everything viewable and exploitable by anyone with technical knowledge, including government surveillance agencies and bad actors. Some UK lawmakers are worried about the bill’s implications:

“The proposals have also raised the eyebrows of legal experts. In November, barrister Matthew Ryder of Matrix Chambers, who was commissioned by the Index on Censorship campaign group to analyze the bill, asserted that the proposals could breach human rights laws.

‘No communications in the UK — whether between MPs, between whistleblowers and journalists, or between a victim and a victims support charity — would be secure or private,” said Ryder. “In an era where Russia and China continue to work to undermine UK cybersecurity, we believe this could pose a critical threat to UK national security.’”

The lawmakers pointed out that it could encourage authoritarian governments to implement their own policies against E2EE.

The UK government wants to force big tech companies to create backdoors to their products and services. The US has been after big tech companies to do the same for years, especially when there is a terrorist attack or mass shooting. Governments want more information access, but big tech wants to protect its technology. No one is concerned about privacy rights.

Whitney Grace, January 18, 2023

The Intelware Sector: In the News Again

January 13, 2023

It’s Friday the 13th. Bad luck day for Voyager Labs, an Israel-based intelware vendor. But maybe there is bad luck for Facebook or Meta or whatever the company calls itself. Will there be more bad luck for outfits chasing specialized software and services firms?

Maybe.

The number of people interested in the savvy software and systems which comprise Israel’s intelware industry is small. In fact, even among some of the law enforcement and intelligence professionals whom I have encountered over the years, awareness of the number of firms, their professional and social linkages, and the capabilities of these systems is modest. NSO Group became the poster company for how some of these systems can be used. Not long ago, the Brennan Center made available some documents obtained via legal means about a company called Voyager Labs.

Now the Guardian newspaper (now begging for dollars with blue and white pleas) has published “Meta Alleges Surveillance Firm Collected Data on 600,000 Users via Fake Accounts.” The main idea of the write-up is that an intelware vendor created sock puppet accounts with phony names. Under these fake identities, the investigators gathered information. The write-up refers to “fake accounts” and says:

The lawsuit in federal court in California details activities that Meta says it uncovered in July 2022, alleging that Voyager used surveillance software that relied on fake accounts to scrape data from Facebook and Instagram, as well as Twitter, YouTube, LinkedIn and Telegram. Voyager created and operated more than 38,000 fake Facebook accounts to collect information from more than 600,000 Facebook users, including posts, likes, friends lists, photos, comments and information from groups and pages, according to the complaint. The affected users included employees of non-profits, universities, media organizations, healthcare facilities, the US armed forces and local, state and federal government agencies, along with full-time parents, retirees and union members, Meta said in its filing.

Let’s think about this fake account thing. How difficult is it to create a fake account on a Facebook property? About eight years ago, as a test, my team created a fake account for a dog. Not once in those eight years was any attempt made to verify the humanness or the dogness of the animal. The researcher (a special librarian, in fact) set up the account and demonstrated to others on my research team how the Facebook sign-up system worked, or did not work, in this particular example. Once logged in, faithful and trusting Facebook seemed to keep our super user logged into the test computer. For all I know, Tess is still logged in, with Facebook doggedly tracking her every move. Here’s Tess:

[Photo: Tess, the canine behind the test Facebook account]

Tough to see that Tess is not a true Facebook type, isn’t it?

Is the accusation directed at Voyager Labs a big deal? From my point of view, no. The reason that intelware companies use Facebook is that Facebook makes it easy to create a fake account, exercises minimal administrative review of registered users, and prioritizes other activities.

I personally don’t know what Voyager Labs did or did not do. I don’t care. I do know that other firms providing intelware have the capability of setting up, managing, and automating some actions of accounts for either a real human, an investigative team, or another software component or system. (Sorry, I am not at liberty to name these outfits.)

Grab your Tums bottle and consider these points:

  1. What other companies in Israel offer similar alleged capabilities?
  2. Where and when were these alleged capabilities developed?
  3. What entities funded start ups to implement alleged capabilities?
  4. What other companies offer software and services which deliver similar alleged capabilities?
  5. When did Facebook discover that its own sign-up systems had become a go-to source of social action for these intelware systems?
  6. Why did Facebook ignore the failings of its sign-up procedures?
  7. Are other countries developing and investing in similar systems with these alleged capabilities? If so, name a company in England, France, China, Germany, or the US?

These one-shot “intelware is bad” stories chop indiscriminately. The vendors get slashed. The social media companies look silly for having little interest in “real” identification of registrants. The licensees of intelware look bad because investigations are somehow “wrong.” I think the media reporting on intelware look silly because the depth of the information on which they craft stories strikes me as shallow.

I am pointing out that a bit more diligence is required to understand the who, what, why, when, and where of specialized software and services. Let’s do some heavy lifting, folks.

Stephen E Arnold, January 13, 2023
