A Rare Moment of Constructive Cooperation from Tech Barons
November 23, 2023
This essay is the work of a dumb dinobaby. No smart software required.
Platform-hopping is one way bad actors have been able to cover their tracks. Now several companies are teaming up to limit that avenue for one particularly odious group. TechNewsWorld reports, “Tech Coalition Launches Initiative to Crackdown on Nomadic Child Predators.” The initiative is named Lantern, and the Tech Coalition includes Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Such cooperation is essential to combat a common tactic for grooming and/or sextortion: predators engage victims on one platform, then move the discussion to a more private forum. Reporter John P. Mello Jr. describes how Lantern works:
“Participating companies upload ‘signals’ to Lantern about activity that violates their policies against child sexual exploitation identified on their platform.
Signals can be information tied to policy-violating accounts like email addresses, usernames, CSAM hashes, or keywords used to groom as well as buy and sell CSAM. Signals are not definitive proof of abuse. They offer clues for further investigation and can be the crucial piece of the puzzle that enables a company to uncover a real-time threat to a child’s safety.
Once signals are uploaded to Lantern, participating companies can select them, run them against their platform, review any activity and content the signal surfaces against their respective platform policies and terms of service, and take action in line with their enforcement processes, such as removing an account and reporting criminal activity to the National Center for Missing and Exploited Children and the appropriate law enforcement agency.”
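The matching step described above amounts to a lookup: each participating platform compares shared signals (hashed identifiers, usernames, and so on) against its own records and flags matches for human review. Here is a minimal sketch of that idea. This is purely illustrative; the field names, data structures, and example values are invented, and Lantern's actual implementation is not public.

```python
import hashlib

# Hypothetical shared signals, as a program like Lantern might distribute them.
# Real signals would include CSAM hashes, flagged usernames, and keywords.
shared_signals = {
    "email_hashes": {hashlib.sha256(b"bad.actor@example.com").hexdigest()},
    "usernames": {"groomer_123", "seller_456"},
}

# A platform's own account records (invented for illustration).
platform_accounts = [
    {"username": "groomer_123", "email": "other@example.com"},
    {"username": "normal_user", "email": "bad.actor@example.com"},
    {"username": "another_user", "email": "ok@example.com"},
]

def flag_for_review(accounts, signals):
    """Return usernames matching any shared signal. As the article stresses,
    matches are clues for human review, not proof of abuse."""
    flagged = []
    for acct in accounts:
        email_hash = hashlib.sha256(acct["email"].encode()).hexdigest()
        if (acct["username"] in signals["usernames"]
                or email_hash in signals["email_hashes"]):
            flagged.append(acct["username"])
    return flagged

print(flag_for_review(platform_accounts, shared_signals))
# -> ['groomer_123', 'normal_user']
```

Hashing identifiers before sharing lets platforms exchange clues without circulating raw personal data, which squares with the privacy measures the write-up mentions.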
The visually oriented can find an infographic of this process in the write-up. We learn Lantern has been in development for two years. Why did it take so long to launch? Part of the time went to designing the program to be effective. Another part went to ensuring it was managed responsibly: the project was subjected to a Human Rights Impact Assessment by Business for Social Responsibility. Experts on child safety, digital rights, advocacy for marginalized communities, government, and law enforcement were also consulted. Finally, we’re told, measures were taken to ensure transparency and protect victims’ privacy.
In the past, companies hesitated to share such information lest they be considered culpable. However, some hope this initiative represents a perspective shift that will extend to other bad actors, like those who spread terrorist content. Perhaps. We shall see how much tech companies are willing to cooperate. They wouldn’t want to reveal too much to the competition just to help society, after all.
Cynthia Murrell, November 23, 2023
Predictive Analytics and Law Enforcement: Some Questions Arise
October 17, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
We wish we could prevent crime before it happens. With AI and predictive analytics, that seems possible, but Wired shares that “Predictive Policing Software Terrible At Predicting Crimes.” Plainfield, NJ’s police department purchased Geolitica predictive software, and it was not a wise use of taxpayer money. The Markup, a nonprofit investigative organization that wants technology to serve the common good, reported on Geolitica’s accuracy:
“We examined 23,631 predictions generated by Geolitica between February 25 and December 18, 2018, for the Plainfield Police Department (PD). Each prediction we analyzed from the company’s algorithm indicated that one type of crime was likely to occur in a location not patrolled by Plainfield PD. In the end, the success rate was less than half a percent. Fewer than 100 of the predictions lined up with a crime in the predicted category that was also later reported to police.”
The Markup also analyzed predictions for robberies and aggravated assaults in Plainfield; the success rate was 0.6%. Burglary predictions were worse at 0.1%.
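The arithmetic behind The Markup's headline number is simple to check: even taking the upper bound of 100 successful predictions out of 23,631, the overall hit rate stays well under half a percent.

```python
# Figures reported by The Markup for Geolitica's Plainfield, NJ predictions.
total_predictions = 23_631
matched = 100  # upper bound: "fewer than 100" lined up with a reported crime

overall_rate_pct = matched / total_predictions * 100
print(f"Overall success rate: under {overall_rate_pct:.2f}%")  # under 0.42%

# Category-level rates cited in the article:
#   robbery / aggravated assault predictions: 0.6%
#   burglary predictions:                     0.1%
```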
The police weren’t much invested in using Geolitica either; they wanted to predict and reduce crime accurately. The Plainfield PD hardly used the software and discontinued the program. Geolitica charged $20,500 for a one-year subscription, then $15,500 for yearly renewals. Geolitica’s data was also inconsistent. Police found training and experience to be as effective as the predictions the software offered.
Geolitica will go out of business at the end of 2023. The law enforcement technology company SoundThinking hired Geolitica’s engineering team and will acquire some of its IP too. Police software companies are changing their products and services to manage police department data.
Crime data are important. Where crimes and victimization occur should be recorded and analyzed. Newark, New Jersey, used risk terrain modeling (RTM) to identify areas where aggravated assaults would occur. Using land data, the city found that vacant lots were major crime locations.
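The RTM approach mentioned above can be sketched in a few lines: score each cell of a map grid by its proximity to a risk factor such as vacant lots, then flag the highest-scoring cells. This is a toy illustration only; the coordinates, radius, and single-factor scoring are invented, and real RTM layers many factors with statistical validation.

```python
# Hypothetical grid coordinates of vacant lots on a 10x10 city grid.
vacant_lots = [(2, 3), (7, 8)]

def risk_score(cell, factors, radius=2):
    """Count risk factors within `radius` cells (Chebyshev distance)."""
    x, y = cell
    return sum(
        1 for fx, fy in factors
        if max(abs(fx - x), abs(fy - y)) <= radius
    )

# Score every cell and collect the highest-risk ones.
grid = [(x, y) for x in range(10) for y in range(10)]
scores = {cell: risk_score(cell, vacant_lots) for cell in grid}
top = max(scores.values())
hotspots = [cell for cell, s in scores.items() if s == top]
print(len(hotspots))  # -> 45 cells near a vacant lot
```

Patrols or interventions would then target the hotspot cells rather than chasing individual past incidents, which is the key difference between RTM and person-based prediction.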
Predictive methods have value, but they apply best to specific use cases. Math is not the answer to some challenges.
Whitney Grace, October 17, 2023
Europol Focuses on Child Centric Crime
October 16, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
Children are the most vulnerable and exploited population in the world. The Internet unfortunately aids bad actors by allowing them to distribute child sexual abuse material (CSAM) while evading censors. Europol (the European Union’s law enforcement agency) wants to end CSAM by overriding Europeans’ privacy rights. Tech Dirt explores the idea in the article “Europol Tells EU Commission: Hey, When It Comes To CSAM, Just Let Us Do Whatever We Want.”
Europol wants unfiltered access to an EU-proposed AI algorithm and its data, programmed to scan online content for CSAM. The police agency also wants to use the same AI to detect other crimes. This information came from a July 2022 high-level meeting between Europol Executive Director Catherine De Bolle and the European Commission’s Director-General for Migration and Home Affairs Monique Pariat. Europol pitched this idea when the EU believed it would mandate client-side scanning on service providers.
Privacy activists and EU member nations vetoed the idea because it would allow anyone to eavesdrop on private conversations; they also found it violated privacy rights. Europol used the familiar “for the children” or “save the children” framing to justify the proposal. Law enforcement, politicians, religious groups, and parents have spouted that rhetoric for years, and it makes anyone with a more nuanced view appear to side with pedophiles.
“It shouldn’t work as well as it does, since it’s been a cliché for decades. But it still works. And it still works often enough that Europol not only demanded access to combat CSAM but to use this same access to search for criminal activity wholly unrelated to the sexual exploitation of children… Europol wants a police state supported by always-on surveillance of any and all content uploaded by internet service users. Stasi-on-digital-steroids. Considering there’s any number of EU members that harbor ill will towards certain residents of their country, granting an international coalition of cops unfiltered access to content would swiftly move past the initial CSAM justification to governments seeking out any content they don’t like and punishing those who dared to offend their elected betters.”
There’s also evidence that law enforcement officials and politicians work in the public sector to enforce anti-privacy laws, then leave for the private sector. Once there, they work at companies that sell surveillance technology to governments. Is that a type of insider trading or nefarious influence?
Whitney Grace, October 16, 2023
Cognitive Blind Spot 4: Ads. What Is the Big Deal Already?
October 11, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
Last week, I presented a summary of Dark Web Trends 2023, a research update my team and I prepare each year. I showed a visual of the ads on a Dark Web search engine. Here’s an example of one of my illustrations:
The TorLanD service, when it is accessible via Tor, displays a search box and advertising. What is interesting about this service and a number of other Dark Web search engines is the ads. The search results are so-so, vastly inferior to those information retrieval solutions offered by intelware vendors.
Some of the ads appear on other Dark Web search systems as well; for example, Bobby and DarkSide, among others. The advertisements offer a range of interesting content. The TorLanD screenshot pitches carding, pornography, drugs, gadgets (skimmers and software), and illegal substances. I pointed out that the ads on TorLanD looked a lot like the ads on Bobby; for instance:
I want to point out that the Silk Road 4.0 and the Gadgets, Docs, Fakes ads are identical. Notice also that TorLanD advertises on Bobby. The Helsinki Drug Marketplace on the Bobby search system offers heroin.
Most of these ads are trade outs. The idea is that one Dark Web site will display an ad for another Dark Web site. There are often links to Dark Web advertising agencies as well. (For this short post, I won’t be listing these vendors, but if you are interested in this research, contact benkent2020 at yahoo dot com. One of my team will follow up and explain our for-fee research policy.)
The point of these two examples is to make clear that advertising has become normalized, even among bad actors. Furthermore, few are surprised that bad actors (or alleged bad actors) communicate, pat one another on the back, and support an ecosystem to buy and sell space on the increasingly small Dark Web. Please note that advertising appears in public and private Telegram groups focused on the topics referenced in these Dark Web ads.
Can you believe the ads? Some people do. Users of the Clear Web and the Dark Web are conditioned to accept ads and to believe that these are true, valid, useful, and intended to make it easy to break the law and buy a controlled substance or CSAM. Some ads emphasize “trust.”
People trust ads. People believe ads. People expect ads. In fact, one can poke around and identify advertising and PR agencies touting the idea that people “trust” ads, particularly those with brand identity. How does one build brand? Give up? Advertising and weaponized information are two ways.
The cognitive bias that operates here is that people embrace advertising. Look at a page of Google results. Which are ads, and which are ads not identified as such? What happens when ads are indistinguishable from plausible messages? Some online companies offer stealth ads. On the Dark Web pages illustrating this essay, law enforcement agencies masquerade as bad actors. Can you identify one such ad? What about messages on Twitter designed to be difficult to spot as paid messages or weaponized content? For one take on Twitter technology, read “New Ads on X Can’t Be Blocked or Reported, and Aren’t Labeled as Advertisements.”
Let me highlight some of the functions on online ads like those on the Dark Web sites. I will ignore the Clear Web ads for the purposes of this essay:
- Click on the ad and receive malware
- Visit the ad and explore the illegal offer so that the site operator can obtain information about you
- Sell you a product and obtain the identifiers you provide, a delivery address (either physical or digital), or plant a beacon on your system to facilitate tracking
- Gather emails for phishing or other online initiatives
- Blackmail.
I want to highlight advertising as a vector of weaponization for three reasons: [a] People believe ads. I know it sounds silly, but ads work. People suspend disbelief when an ad on a service offers something that sounds too good to be true; [b] many people do not question the legitimacy of an ad or its message. Ads are good. Ads are everywhere; and [c] ads are essentially unregulated.
What happens when everything drifts toward advertising? The cognitive blind spot kicks in and one cannot separate the false from the real.
Public service note: Before you explore Dark Web ads or click links on social media services like Twitter, consider that these are vectors which can point to quite surprising outcomes. Intelligence agencies outside the US use Dark Web sites as a way to harvest useful information. Bad actors use ads to rip off unsuspecting people, like the doctor who once lived two miles from my office and ordered a hitman on the Dark Web to terminate an individual.
Ads are unregulated and full of surprises. But the cognitive blind spot for advertising guarantees that the technique will flourish and gain technical sophistication. Are those objective search results useful information or weaponized? Will the Dark Web vendor really sell you valid stolen credit cards? Will the US postal service deliver an unmarked envelope chock full of interesting chemicals?
Stephen E Arnold, October 11, 2023
Traveling to France? On a Watch List?
August 25, 2023
The capacity for surveillance has been lurking in our devices all along, of course. Now, reports Azerbaijan’s Azernews, “French Police Can Secretly Activate Phone Cameras, Microphones, and GPS to Spy on Citizens.” The authority to remotely activate devices was part of a larger justice reform bill recently passed. Officials insist, though, this authority will not be used willy-nilly:
“A judge must approve the use of the powers, and the recently amended bill forbids use against journalists, lawyers, and other ‘sensitive professions.’ The measure is also meant to limit use to serious cases, and only for a maximum of six months. Geolocation would be limited to crimes that are punishable by at least five years in prison.”
Surely, law enforcement would never push those limits. Apparently the Orwellian comparisons are evident even to officials, since Justice Minister Éric Dupond-Moretti preemptively batted them away. Nevertheless, we learn:
“French digital rights advocacy group, La Quadrature du Net, has raised serious concerns over infringements of fundamental liberties, and has argued that the bill violates the ‘right to security, right to a private life and to private correspondence’ and ‘the right to come and go freely.’ … The legislation comes as concerns about government device surveillance are growing. There’s been a backlash against NSO Group, whose Pegasus spyware has allegedly been misused to spy on dissidents, activists, and even politicians. The French bill is more focused, but civil liberties advocates are still alarmed at the potential for abuse. The digital rights group La Quadrature du Net has pointed out the potential for abuse, noting that remote access may depend on security vulnerabilities. Police would be exploiting security holes instead of telling manufacturers how to patch those holes, La Quadrature says.”
Smartphones, laptops, vehicles, and any other connected devices are all fair game under the new law. But only if one has filed the proper paperwork, we are sure. Nevertheless, progress.
Cynthia Murrell, August 25, 2023
Is It Lights Out on the Information Superhighway?
April 26, 2023
Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.
We just completed a lecture about the shadow web. This is our way of describing a number of technologies specifically designed to keep law enforcement, tax authorities, and other entities charged with enforcing applicable laws in the dark.
Among the tools available are roulette services. These can be applied to domain proxies so it is very difficult to figure out where a particular service is at a particular point in time. Tor has uttered noises about supporting the Mullvad browser and baking in a virtual private network. But there are other VPNs available, and one of the largest infrastructure service providers is under what appears to be “new” ownership. Change may create a problem for some enforcement entities. Other developers work overtime to provide services primarily to those who want to deploy what we call “traditional Dark Web sites.” Some of these obfuscation software components are available on Microsoft’s GitHub.
I want to point to “Global Law Enforcement Coalition Urges Tech Companies to Rethink Encryption Plans That Put Children in Danger from Online Abusers.” The main idea behind the joint statement (the one to which I point is from the UK’s National Crime Agency) is:
The announced implementation of E2EE on META platforms Instagram and Facebook is an example of a purposeful design choice that degrades safety systems and weakens the ability to keep child users safe. META is currently the leading reporter of detected child sexual abuse to NCMEC. The VGT has not yet seen any indication from META that any new safety systems implemented post-E2EE will effectively match or improve their current detection methods.
From my point of view, a questionable “player” has an opportunity to make it possible to enforce laws related to human trafficking, child safety, and related crimes like child pornography. The “player” seems interested in implementing encryption that would make government enforcement more difficult, if not impossible in some circumstances.
The actions of this “player” illustrate a fundamental change in the Internet: what was underground is now moving above ground. The implementation of encryption in messaging applications is a big step toward making the “regular” Internet, or what some call the Clear Web, into a new version of the Dark Web. Not surprisingly, the Dark Web will not go away, but why develop Dark Web sites when Clear Web services provide communications, secrecy, the ability to transmit images and videos, and the means to perform financial transactions related to these data? Thus the Clear Web is falling into the shadows.
My team and I are not pleased when a player ignores appropriate, what we call “ethical,” behavior and takes specific actions that increase risks to average Internet users. In fact, some of the player’s actions are specifically designed to make its service more desirable to a market segment once largely focused on the Dark Web.
More than suggestions are needed in my opinion. Direct action is required.
Stephen E Arnold, April 26, 2023
NSO Group: How Easy Are Mobile Hacks?
April 25, 2023
I am at the 2023 US National Cyber Crime Conference, and I have been asked, “What companies offer NSO-type mobile phone capabilities?” My answer is, “Quite a few.” Will I name these companies in a free blog post? Sure, just call us at 1-800-YOU-WISH.
A more interesting question is, “Why is Israel-based NSO Group the pointy end of a three-meter stick aimed at mobile devices?” (To get some public information about newly recognized NSO Group (Pegasus) tricks, navigate to “Triple Threat. NSO Group’s Pegasus Spyware Returns in 2022 with a Trio of iOS 15 and iOS 16 Zero-Click Exploit Chains.” I would point out that the reference to Access Now is interesting, and a crime analyst may find a few minutes examining what the organization does, its “meetings,” and its hosting services time well spent. Will I provide that information in a free blog post? Please call the 800 number listed above.)
Now let’s consider the question regarding the productivity of the NSO technical team.
First, Israel’s defense establishment contains many bright people and a world-class training program. What happens when you take well-educated people, the threat of war without warning, and an outstanding in-service instructional setup? The answer is, “Ideas get converted into exercises. Exercises become test code. Test code gets revised. And the functional software becomes weaponized.”
Second, the “in our foxhole” mentality persists once trained military specialists leave formal service and enter the commercial world. As a result, individuals who studied, worked, and, in some cases, fought together set up companies. These individuals are a bit like beavers: beavers do what beavers do. Some of these firms replicate functionality similar to that developed under the government’s watch and sell those products. Please note that NSO Group is an exception of sorts; some of its “insights” originated when the founders were repairing mobile phones. The idea, however, is the same: learning, testing, deploying, and hiring individuals with specialized training from the Israeli government. Keep the “in my foxhole” notion in mind, please.
Third, important firms in Israel and, in some cases, government-assisted development programs directly or indirectly provide: [a] money, [b] meet-up opportunities like “tech fests” in Tel Aviv, and [c] suggestions about whom to hire, partner with, consult with, or be aware of.
Do these conditions exist in other countries? In my experience, this approach to mobile technology exploits does, to some degree. There are important differences. If you want to know what they are, you know the answer: buzz that 800 number.
My point is that the expertise, insights, systems, and methods of what the media calls “the NSO Group” have diffused. As a result, there are more choices than ever before when it comes to exploiting mobile devices.
Where’s Apple? Where’s Google? Where’s Samsung? The firms, in my opinion, are in reactive mode, and, in some cases, they don’t know what they don’t know.
Stephen E Arnold, April 25, 2023
Killing Wickr
January 26, 2023
Encrypted messaging services are popular with privacy-conscious users as well as freedom fighters in authoritarian countries. Tech companies consider these messaging services a wise investment, so Amazon purchased Wickr in 2021. Wickr is an end-to-end encrypted messaging app, and it was made available to AWS users. Gizmodo explains that Wickr will soon be nonexistent in the article “Amazon Plans To Close Up Wickr’s User-Centric Encrypted Messaging App.”
Amazon no longer wants to be part of the consumer encrypted messaging market, which got as saturated as the ugly Christmas sweater market. Amazon is killing the Wickr Me app, limiting use to business and public-sector customers through AWS Wickr and Wickr Enterprise. New registrations end on December 31, and the app will be shut down by the end of 2023.
Wickr was worth $60 million when Amazon purchased it. Amazon, however, lost $1 trillion in stock value in November 2022, becoming the first company in history to claim that “honor.” Amazon is laying off employees and working through company buyouts. Changing Wickr’s target market could recoup some of the losses:
“But AWS apparently wants Wickr to focus on its business and government customers much more than its regular users. Among those public entities using Wickr is U.S. Customs and Border Protection. That contract was reportedly worth around $900,000 when first reported in September last year. Sure, the CBP wants encrypted communications, but Wickr can delete all messages sent via the app, which is an increasingly dangerous proposition for open government advocates.”
Wickr, like other encryption services, does not have a clean record. It has been used for illegal drug sales and other illicit transactions via the Dark Web. At one time, Wickr might have been a source of useful metadata. Not now. Odd.
Whitney Grace, January 26, 2023
Encryption Will Be an Issue in 2023
January 18, 2023
The United Kingdom is making plans to weaken encryption within its borders. The UK government claims the country will become the safest place in the world to be online, but others fear free speech and privacy will be the ultimate victims. The Next Web explores the situation in “Privacy Advocates Are Aghast At UK’s Anti-Encryption Plans.” The UK’s plans are part of the Online Safety Bill currently in Parliament.
Privacy advocates are most worried about end-to-end encrypted (E2EE) messenger apps, because the new measures would force Internet providers to scan private messages for illegal content. Another clause would require Internet providers to use “accredited technology” to prevent people from encountering terrorist content or child pornography.
The bill would make everything viewable and exploitable by anyone with technical knowledge, including government surveillance agencies and bad actors. Some UK lawmakers are worried about the bill’s implications:
“The proposals have also raised the eyebrows of legal experts. In November, barrister Matthew Ryder of Matrix Chambers, who was commissioned by the Index on Censorship campaign group to analyze the bill, asserted that the proposals could breach human rights laws.
‘No communications in the UK — whether between MPs, between whistleblowers and journalists, or between a victim and a victims support charity — would be secure or private,’ said Ryder. ‘In an era where Russia and China continue to work to undermine UK cybersecurity, we believe this could pose a critical threat to UK national security.’”
The lawmakers pointed out that it could encourage authoritarian governments to implement their own policies against E2EE.
The UK government wants to force big tech companies to create backdoors into their products and services. The US has pressed big tech companies to do the same for years, especially after a terrorist attack or mass shooting. Governments want more access to information, but big tech wants to protect its technology. No one is concerned about privacy rights.
Whitney Grace, January 18, 2023
The Intelware Sector: In the News Again
January 13, 2023
It’s Friday the 13th. Bad luck day for Voyager Labs, an Israel-based intelware vendor. But maybe there is bad luck for Facebook or Meta or whatever the company calls itself. Will there be more bad luck for outfits chasing specialized software and services firms?
Maybe.
The number of people interested in the savvy software and systems which comprise Israel’s intelware industry is small. In fact, even among some of the law enforcement and intelligence professionals whom I have encountered over the years, awareness of the number of firms, their professional and social linkages, and the capabilities of these systems is modest. NSO Group became the poster company for how some of these systems can be used. Not long ago, the Brennan Center made available some documents obtained via legal means about a company called Voyager Labs.
Now the Guardian newspaper (currently begging for dollars with blue and white pleas) has published “Meta Alleges Surveillance Firm Collected Data on 600,000 Users via Fake Accounts.” The main idea of the write-up is that an intelware vendor created sock puppet accounts with phony names. Under these fake identities, the investigators gathered information. The write-up refers to “fake accounts” and says:
The lawsuit in federal court in California details activities that Meta says it uncovered in July 2022, alleging that Voyager used surveillance software that relied on fake accounts to scrape data from Facebook and Instagram, as well as Twitter, YouTube, LinkedIn and Telegram. Voyager created and operated more than 38,000 fake Facebook accounts to collect information from more than 600,000 Facebook users, including posts, likes, friends lists, photos, comments and information from groups and pages, according to the complaint. The affected users included employees of non-profits, universities, media organizations, healthcare facilities, the US armed forces and local, state and federal government agencies, along with full-time parents, retirees and union members, Meta said in its filing.
Let’s think about this fake account thing. How difficult is it to create a fake account on a Facebook property? About eight years ago, as a test, my team created a fake account for a dog. Not once in those eight years was any attempt made to verify the humanness or the dogness of the animal. The researcher (a special librarian, in fact) set up the account and demonstrated to others on my research team how the Facebook sign-up system worked, or did not work, in this particular example. Once logged in, faithful and trusting Facebook seemed to keep our super user logged into the test computer. For all I know, Tess is still logged in, with Facebook doggedly tracking her every move. Here’s Tess:
Tough to see that Tess is not a true Facebook type, isn’t it?
Is the accusation directed at Voyager Labs a big deal? From my point of view, no. The reason intelware companies use Facebook is that Facebook makes it easy to create a fake account, exercises minimal administrative review of registered users, and prioritizes other activities.
I personally don’t know what Voyager Labs did or did not do. I don’t care. I do know that other firms providing intelware have the capability of setting up, managing, and automating some actions of accounts for either a real human, an investigative team, or another software component or system. (Sorry, I am not at liberty to name these outfits.)
Grab your Tums bottle and consider these points:
- What other companies in Israel offer similar alleged capabilities?
- Where and when were these alleged capabilities developed?
- What entities funded start-ups to implement alleged capabilities?
- What other companies offer software and services which deliver similar alleged capabilities?
- When did Facebook discover that its own sign-up systems had become a go-to source of social action for these intelware systems?
- Why did Facebook ignore its sign-up procedures’ failings?
- Are other countries developing and investing in similar systems with these alleged capabilities? If so, can you name a company in England, France, China, Germany, or the US?
These one-shot “intelware is bad” stories chop indiscriminately. The vendors get slashed. The social media companies look silly for having little interest in “real” identification of registrants. The licensees of intelware look bad because investigations are somehow “wrong.” I think the media reporting on intelware look silly because the depth of the information on which they craft stories strikes me as shallow.
I am pointing out that a bit more diligence is required to understand the who, what, why, when, and where of specialized software and services. Let’s do some heavy lifting, folks.
Stephen E Arnold, January 13, 2023