DHS Turns to Commercial Cellphone Data Vendors for Tracking Intelligence

November 18, 2020

Color us completely unsurprised. BuzzFeed News reports, “DHS Authorities Are Buying Moment-By-Moment Geolocation Cellphone Data to Track People.” In what privacy advocates are calling a “surveillance partnership” between government and corporations, the Department of Homeland Security is buying cellphone data in order to track immigrants at the southern border. This is likely to go way beyond the enforcement of immigration laws—once precedent is set, agencies across the law enforcement spectrum are apt to follow suit.

Citing a memo that came into their possession, reporters Hamed Aleaziz and Caroline Haskins reveal that DHS lead attorney Chad Mizelle believes ICE officials are free to access location and cellphone activity data without obtaining a warrant and without violating the Fourth Amendment (protection against unreasonable search and seizure). His reasoning? Because such data is commercially available, originally collected for advertising purposes, no warrant is required. Consider that loophole as you ponder how much personal information most citizens’ cell phones hold, from our daily movement patterns to appointments with doctors and other professionals, to our communications. Aleaziz and Haskins write:

“When DHS buys geolocation data, investigators only know that phones and devices visited certain places — meaning, they don’t automatically know the identities of people who visited those locations. Investigators have to match a person’s visited locations with, say, property records and other data sets in order to determine who a person is. But this also means that, technically, moment-by-moment location tracking could happen to anyone, not just people under investigation by DHS. In particular, lawyers, activists, nonprofit workers, and other essential workers could get swept up into investigations that start with geolocation data. DHS officials said they do not comment on alleged leaked documents. The agency is aware of potential legal vulnerabilities under the Fourth Amendment. Mizelle states in his memo that there are ways for CBP and ICE to ‘minimize the risk’ of possible constitutional violations, pointing out that they could limit their searches to defined periods, require supervisors to sign off on lengthy searches, only use the data when more ‘traditional’ techniques fail, and limit the tracking of one device to when there is ‘individualized suspicion’ or relevance to a ‘law enforcement investigation.’”
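
The matching step the reporters describe, pairing anonymous device pings with property records or other data sets to put a name to a phone, is essentially a spatial join. Here is a minimal sketch of that idea in Python; the data, column names, and 50-meter threshold are all invented for illustration and imply nothing about any vendor’s actual schema.

```python
# Hypothetical illustration of matching anonymous device pings to a named
# address record. Data, column names, and the 50-meter threshold are all
# invented for this sketch; no real data source or vendor schema is implied.
import pandas as pd
import numpy as np

pings = pd.DataFrame({
    "device_id": ["ad-4f21", "ad-4f21", "ad-9c03"],
    "lat": [29.4241, 29.4243, 29.4301],
    "lon": [-98.4936, -98.4939, -98.4811],
})

property_records = pd.DataFrame({
    "owner": ["J. Doe", "A. Roe"],
    "lat": [29.4242, 29.5000],
    "lon": [-98.4937, -98.4000],
})

def approx_meters(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; adequate at city scale for a sketch.
    dx = (lon2 - lon1) * 111_320 * np.cos(np.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * 110_540
    return np.sqrt(dx**2 + dy**2)

# For each ping, find property records within roughly 50 meters.
matches = []
for _, p in pings.iterrows():
    d = approx_meters(p.lat, p.lon, property_records.lat, property_records.lon)
    for _, rec in property_records[d < 50].iterrows():
        matches.append((p.device_id, rec.owner))

print(set(matches))  # e.g. {('ad-4f21', 'J. Doe')}
```

Repeat that join over weeks of pings and a device identifier quickly resolves to a home, an office, and a daily routine, which is the privacy concern the article raises.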

Earlier this year, The Wall Street Journal reported that DHS was purchasing this data for ICE and CBP. Federal records show both agencies have bought licenses and software from mobile device data vendor Venntel. The House Committee on Oversight and Reform is now investigating the company for selling data to government agencies.

Interesting dynamics.

Cynthia Murrell, November 18, 2020

Size of the US Secret Service?

November 16, 2020

I read “Expansive White House Covid Outbreak Sidelines 10% of Secret Service.” If the headline is accurate, the US Secret Service consists of 1,300 officers in the “uniformed division.” The key phrase is “uniformed division.” To the untrained eye, these officers appear in uniforms similar to those of other police. However, there are non-uniformed Secret Service officers. A list of USSS field offices is here. A year ago I learned at a law enforcement conference that there were more than 7,000 employees in the USSS. Net net: The USSS has a reasonably deep roster and can cooperate with the US Capitol Police to deal with events of interest. (The USCP is responsible for Congress; the USSS, the White House. When the vice president moves from the White House to Capitol Hill, the protective duties shift as well.) The article left me with the impression that Covid has impaired the USSS. In my opinion, the USSS is on duty and robust.

Stephen E Arnold, November 16, 2020

Germany Raids Spyware Firm FinFisher

November 3, 2020

Authorities in Germany have acted on suspicions that spyware firm FinFisher, based in Munich, illegally sold its software to the Turkish government. It is believed that the regime used the tools to spy on anti-government protesters in 2017. The independent Turkish news site Ahval summarizes the raid and the accusations in, “Spyware Company that Allegedly Sold Spyware to Turkey Raided by German Police.” We’re told:

“Germany’s Customs Investigation Bureau (ZKA) searched 15 properties last week, both in Germany and other countries. Public prosecutors told German media that directors and employees of FinFisher and other companies were being investigated. The investigation follows complaints filed by NGOs Reporters Without Borders, Netzpolitik.org, the Society for Civil Rights (Gesellschafft für Freiheitsrechte, GFF) and the European Center for Constitutional and Human Rights. The NGOs believe that a spyware product used in 2017 to target anti-government protesters in Turkey was FinFisher’s FinSpy. Germany’s Economy Ministry has issued no new permits for spyware since 2015, while the software in question was written in 2016, meaning that if it was used, it must have been exported in violation of government license restrictions.”

Activist group CitizenLab asserts the Turkish government spread the spyware to protesters through Twitter accounts. These accounts, we’re told, masqueraded as sources of information about upcoming protests. As far back as 2011, FinFisher was suspected of supplying regimes in the Middle East with spyware to track Arab Spring protesters. The software has since been found in use by several authoritarian governments, including Bahrain, Ethiopia, and the UAE. Just this September, Amnesty International reported FinFisher’s spyware was being used by Egypt. For its part, of course, the company denies making any sales to countries not approved by German law. We shall see what the investigation turns up.

Cynthia Murrell, November 3, 2020

Amazon Rekognition: Helping Make Work Safer

October 22, 2020

DarkCyber noted Amazon’s blog post “Automatically Detecting Personal Protective Equipment on Persons in Images Using Amazon Rekognition.” Amazon discloses:

With Amazon Rekognition PPE detection, you can analyze images from your on-premises cameras at scale to automatically detect if people are wearing the required protective equipment, such as face covers (surgical masks, N95 masks, cloth masks), head covers (hard hats or helmets), and hand covers (surgical gloves, safety gloves, cloth gloves). Using these results, you can trigger timely alarms or notifications to remind people to wear PPE before or during their presence in a hazardous area to help improve or maintain everyone’s safety.

The examples in the Amazon write-up make sense. However, applications in law enforcement and security are also possible. For instance, consider saying, “Hands up” to a person of interest.


The system can detect objects held by an individual. You can get more information in the blog post. Policeware and intelware vendors working with Amazon at this time may generate other use cases.
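
For readers curious what a call to the feature looks like, below is a minimal sketch using the boto3 detect_protective_equipment operation. The bucket name, image key, confidence threshold, and alerting logic are placeholders of my own, not Amazon’s reference implementation; treat it as a sketch rather than production code.

```python
# Minimal sketch of Amazon Rekognition PPE detection via boto3.
# The S3 bucket, object key, and thresholds below are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_protective_equipment(
    Image={"S3Object": {"Bucket": "example-camera-frames", "Name": "frame-001.jpg"}},
    SummarizationAttributes={
        "MinConfidence": 80,
        "RequiredEquipmentTypes": ["FACE_COVER", "HEAD_COVER", "HAND_COVER"],
    },
)

# The summary lists the index numbers of detected persons missing the
# required gear; a caller could wire this to an alarm or notification.
missing = response["Summary"]["PersonsWithoutRequiredEquipment"]
if missing:
    print(f"Alert: persons {missing} appear to be missing required PPE")
```

The same pattern, a per-frame API call followed by a threshold check, is what would let a policeware integrator repurpose the service for detecting objects or gestures rather than safety gear.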

Stephen E Arnold, October 22, 2020

Exclusive: Interview with DataWalk’s Chief Analytics Officer Chris Westphal, Who Guides an Analytics Rocket Ship

October 21, 2020

I spoke with Chris Westphal, Chief Analytics Officer for DataWalk about the company’s string of recent contract “wins.” These range from commercial engagements to heavy lifting for the US Department of Justice.

Chris Westphal, founder of Visual Analytics (acquired by Raytheon), brings his one-click approach to advanced analytics.

The firm provides what I have described as an intelware solution. DataWalk ingests data and outputs actionable reports. The company has leap-frogged a number of investigative solutions, including IBM’s Analyst’s Notebook and the much-hyped Palantir Technologies’ Gotham products. This interview took place in a Covid-compliant way. In my previous Chris Westphal interviews, we met at intelligence or law enforcement conferences. Now the experience is virtual, but as interesting and informative as in July 2019. In my most recent interview with Mr. Westphal, I sought to get more information on what is causing some competitors to take notice of DataWalk and its use of smart software to deliver what customers want: results, not PowerPoint presentations and promises. We spoke on October 8, 2020.

DataWalk is an advanced analytics tool with several important innovations. On one hand, the company’s information processing system performs IBM i2 Analyst’s Notebook and Palantir Gotham type functions — just with a more sophisticated and intuitive interface. On the other hand, Westphal’s vision for advanced analytics has moved past what he accomplished with his previous venture Visual Analytics. Raytheon bought that company in 2013. Mr. Westphal has turned his attention to DataWalk. The full text of our conversation appears below.


Facebook: Interesting Data If Accurate

October 16, 2020

DarkCyber spotted a factoid of interest to law enforcement professionals in “Facebook Responsible for 94% of 69 Million Child Sex Abuse Images Reported by US Tech Firms.”

Facebook has previously announced plans to fully encrypt communications in its Messenger app, as well as its Instagram Direct service – on top of WhatsApp, which is already encrypted – meaning no one apart from the sender and recipient can read or modify messages.

Now, what about Facebook’s content curation procedures? End-to-end encryption of ad-supported private messaging services appears to benefit bad actors.
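
As a reminder of what end-to-end encryption means in practice, here is a toy sketch using the PyNaCl library: only a key held on the recipient’s device can open the message, so a relay server, ad-supported or not, handles nothing but opaque ciphertext. This illustrates the general technique, not WhatsApp’s or Messenger’s actual Signal-based protocol.

```python
# Toy end-to-end encryption sketch with PyNaCl (libsodium bindings).
# Illustrates the concept only; messaging apps use more elaborate protocols.
from nacl.public import PrivateKey, Box

sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with its private key and the recipient's public key.
sending_box = Box(sender_key, recipient_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place")

# A relay server only ever sees ciphertext it cannot decrypt.
print(ciphertext.hex()[:32], "...")

# Only the recipient's private key (plus the sender's public key) opens it.
receiving_box = Box(recipient_key, sender_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at the usual place'
```

The platform operator, sitting in the middle, is in the same position as the relay in the sketch, which is exactly why encrypted messaging frustrates the content-scanning that produced those abuse reports.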

Stephen E Arnold, October 16, 2020

Facebook WhatsApp: Disappearing Media. Really? Gone for Good?

September 28, 2020

Facebook is endlessly entertaining. On one tentacle, the octopus company seeks to lessen the competitive threats from next-generation social media like TikTok-type videos. On another tentacle, Facebook suggests that those in the European Union can do without Facebook. DarkCyber thinks of this as “the take my ball and go home” tactic. Ten-year-olds with minimal physical coordination but a stock of high-end athletic equipment have been known to trot out this argumentative chestnut. Another tentacle semi-cooperates with government officials. Another tentacle balances on a knife edge in order to keep employees happy with the wonderful social environment within the helpful company’s digital walled garden. There are other tentacles too, but I want to focus your attention on “WhatsApp Expiring Media Feature Details Tipped via New Beta Version.” Keep in mind that “beta” does not mean something a thumbtyper will be able to use.

The write up explains:

WhatsApp 2.20.201.6 beta for Android has been released with further references for a new feature called “Expiring Media.” The feature, as its name suggests, would make media files such as images, videos, and GIFs sent to the recipient’s phone over WhatsApp disappear once they are viewed.

Interesting. Just one question:

If media are disappeared for users, are those data deleted from the Facebook servers?

One hopes not; otherwise, some investigations will be slowed or halted.
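
The question turns on a distinction a toy model makes clear: “expiring” typically means the recipient’s client discards its local copy after viewing, while whatever the operator keeps server-side is governed by a separate retention policy. The sketch below is purely illustrative and implies nothing about how WhatsApp actually stores media.

```python
# Toy model of "expiring media": client-side deletion after viewing is a
# separate question from server-side retention. Purely illustrative.
from dataclasses import dataclass, field


@dataclass
class MediaMessage:
    media_id: str
    payload: bytes


@dataclass
class Client:
    inbox: dict = field(default_factory=dict)

    def receive(self, msg: MediaMessage):
        self.inbox[msg.media_id] = msg.payload

    def view(self, media_id: str) -> bytes:
        # "Expiring" behavior: the local copy is dropped once viewed.
        return self.inbox.pop(media_id)


@dataclass
class Server:
    store: dict = field(default_factory=dict)
    purge_on_view: bool = False  # the open question the post raises

    def relay(self, msg: MediaMessage, client: Client):
        self.store[msg.media_id] = msg.payload
        client.receive(msg)


server, recipient = Server(), Client()
server.relay(MediaMessage("m1", b"\x89PNG..."), recipient)
recipient.view("m1")

print("m1" in recipient.inbox)  # False: gone from the device
print("m1" in server.store)     # True unless the operator's policy purges it
```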

Stephen E Arnold, September 28, 2020

Predictive Policing: A Work in Progress or a Problem in Action?

September 2, 2020

Amid this year’s protests of police brutality, makers of crime-predicting software took the occasion to promote their products as a solution to racial bias in law enforcement. The Markup ponders, “Data-Informed Predictive Policing Was Heralded as Less Biased. Is It?” Writer Annie Gilbertson observes, as we did, that more than 1,400 mathematicians signed on to boycott predictive policing systems. She also describes problems discovered by researchers at New York University’s AI Now Institute:

“‘Police data is open to error by omission,’ [AI Now Director Rashida Richardson] said. Witnesses who distrust the police may be reluctant to report shots fired, and rape or domestic violence victims may never report their abusers. Because it is based on crime reports, the data fed into the software may be less an objective picture of crime than it is a mirror reflecting a given police department’s priorities. Law enforcement may crack down on minor property crime while hardly scratching the surface of white-collar criminal enterprises, for instance. Officers may intensify drug arrests around public housing while ignoring drug use on college campuses. Recently, Richardson and her colleagues Jason Schultz and Kate Crawford examined law enforcement agencies that use a variety of predictive programs. They looked at police departments, including in Chicago, New Orleans, and Maricopa County, Ariz., that have had problems with controversial policing practices, such as stop and frisk, or evidence of civil rights violations, including allegations of racial profiling. They found that since ‘these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies,’ it raised ‘the risk of creating inaccurate, skewed, or systemically biased data.’”

The article also looks at a study from 2016 by the Royal Statistical Society. Researchers supplied PredPol’s algorithm with arrest data from Oakland, California, a city where estimated drug use is spread fairly evenly across its diverse areas. The software’s results would have had officers target Black neighborhoods at about twice the rate of white ones. The team emphasized the documented harm over-policing can cause. The write-up goes on to cover a few more studies on the subject, so navigate there for those details. Gilbertson notes that concerns about these systems are so strong that police departments in at least two major cities, Chicago and Los Angeles, have decided against them. Will others follow suit?
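
The Oakland finding is easy to reproduce in miniature: if the underlying offense rate is uniform but historical enforcement generated more reports in one area, a predictor trained on report counts keeps sending officers back to that area. The toy simulation below uses invented numbers and a deliberately naive count-based “predictor”; it is not the PredPol algorithm.

```python
# Toy illustration of how biased report data skews a count-based hotspot
# predictor. Numbers are invented; this is not the PredPol algorithm.
import random

random.seed(42)

DAYS = 365
TRUE_RATE = 0.10          # same underlying offense rate in both areas
REPORT_PROB = {"area_a": 0.60, "area_b": 0.20}  # historical policing bias

reports = {"area_a": 0, "area_b": 0}
for _ in range(DAYS):
    for area in reports:
        offense_occurred = random.random() < TRUE_RATE
        if offense_occurred and random.random() < REPORT_PROB[area]:
            reports[area] += 1

# A naive "predictor" allocates patrols in proportion to past reports.
total = sum(reports.values())
for area, count in reports.items():
    print(f"{area}: {count} reports -> {count / total:.0%} of patrols")
# Despite identical true rates, area_a draws roughly three times the patrols,
# which in turn generates more reports there: a feedback loop.
```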

Cynthia Murrell, September 2, 2020

Lexipol: Facing Scrutiny?

September 1, 2020

Should a private company be writing policies for police departments? Increasingly that is the case and, many say, is a major reason it is so difficult to hold police accountable for using excessive force. Mother Jones invites us to “Meet the Company that Writes the Policies that Protect Cops.” Founded in 2003, Lexipol’s focus is unabashedly on crafting policies that protect officers and departments against lawsuits. Much hinges on definitions of words one would think should be straightforward, like “necessary” and “imminent.” In fact, company co-founder (and former cop) Bruce Praet seems especially proud of the slippery language that gives police “flexibility.”

When pressed, Lexipol insists it bases its policies on federal and state standards, laws, court rulings, and “best practices.” However, reporter Madison Pauly writes:

“Some of the company’s policies … depart in significant ways from recommendations by mainstream policing organizations. The National Consensus Policy on Use of Force, a collaboration between 11 major law enforcement groups, requires cops to try de-escalation techniques before using force if possible. Lexipol discourages police departments from requiring them. Lexipol’s policy allows officers to shoot at moving vehicles in some circumstances, a practice that the Police Executive Research Forum recommends against because it may injure or kill civilians and officers. The ACLU has contested Lexipol’s rules for handling immigration violations, which in some states include a provision allowing cops to consider ‘a lack of English proficiency’ when deciding whether someone may have entered the country illegally. Despite these challenges, the company has marketed its policies as a way to decrease cities’ liability in police misconduct lawsuits. In its communications with potential clients, Lexipol has claimed that agencies that use its policies are sued less frequently and pay out smaller settlements, according to a Texas Law Review analysis of public records. The company’s critics argue that it accomplishes this with vague or permissive rules that meet bare-minimum legal requirements rather than holding officers to a higher standard.”

According to the company, Lexipol has vended its policies, training, customizable handbooks, or other services to more than 8,000 public safety agencies, including several large cities. These include corrections, fire, and EMS agencies alongside police departments. In California, it is estimated that about 95 percent of law enforcement agencies use Lexipol policies. See the article for examples where, we’re told, these policies have stood in the way of justice. As with the crafting of state legislation, we suspect many citizens are unaware how much influence these public agencies have handed over to a third party.

Cynthia Murrell, September 1, 2020

The Child Protection System Catches Pedophiles

August 11, 2020

Child pornography plagues the Internet’s underbelly, the Dark Web, peer-to-peer sharing networks, and even simple Google searches. Law enforcement officials want to protect children and stop the spread of child pornography, so new software called the Child Protection System was created. NBC News shares details in the article, “Inside the Surveillance Software Tracking Child Porn Offenders Across the Globe.”

The Child Protection System was designed by the Florida nonprofit Child Rescue Coalition. It is a forensic tool that scans file sharing networks and chatrooms to locate computers that download child pornography. It is programmed to search for over two hundred terms related to child sex abuse. These scans are then used as probable cause to obtain search warrants. The Child Protection System’s scans have been used to arrest over 12,000 people. The software can search down to the county level, and it also looks for images of children deemed twelve and under. It saves a lot of investigation time:

“The Child Protection System ‘has had a bigger effect for us than any tool anyone has ever created. It’s been huge,’ said Dennis Nicewander, assistant state attorney in Broward County, Florida, who has used the software to prosecute about 200 cases over the last decade. ‘They have made it so automated and simple that the guys are just sitting there waiting to be arrested.’ The Child Rescue Coalition gives its technology for free to law enforcement agencies, and it is used by about 8,500 investigators in all 50 states. It’s used in 95 other countries, including Canada, the U.K. and Brazil. Since 2010, the nonprofit has trained about 12,000 law enforcement investigators globally.”

The Child Rescue Coalition wants to partner with social media platforms, schools, and more in order to discover who is downloading child pornography. These platforms often contain information about people discussing suspicious behavior, but such chatter does not by itself indicate criminal activity. If data from the Child Protection System and these platforms were cross-matched, it might point to possible bad actors.
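
The cross-matching idea amounts to intersecting identifiers that surface in network scans with identifiers attached to platform reports; anything in both sets becomes a lead for closer review. A minimal sketch with invented, anonymized identifiers follows; no real scan data or platform schema is implied.

```python
# Hypothetical sketch of cross-matching two independent signal sources.
# Identifiers are invented; no real scan data or platform schema is implied.

# Devices flagged by a network-scanning tool (e.g., by GUID).
scan_hits = {"guid-03a1", "guid-77b2", "guid-19cd"}

# Accounts flagged in platform reports, mapped to device identifiers.
platform_reports = {
    "user_481": "guid-77b2",
    "user_902": "guid-55ef",
}

# The intersection is a lead for investigators, not proof of a crime.
candidates = {
    user for user, guid in platform_reports.items() if guid in scan_hits
}
print(candidates)  # {'user_481'}
```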

Some assert that this surveillance software breaks privacy laws. Handing over this much surveillance power to governments requires safeguards to protect individuals’ privacy.

Whitney Grace, August 11, 2020
