Facebook WhatsApp: Disappearing Media. Really? Gone for Good?

September 28, 2020

Facebook is endlessly entertaining. On one tentacle, the octopus company seeks to lessen the competitive threats from next-generation social media like TikTok-type videos. On another tentacle, Facebook suggests that those in the European Union can do without Facebook. DarkCyber thinks of this as the “take my ball and go home” tactic. Ten-year-olds with minimal physical coordination but a stock of high-end athletic equipment have been known to trot out this argumentative chestnut. Another tentacle semi-cooperates with government officials. Another tentacle balances on a knife edge in order to keep employees happy with the wonderful social environment within the helpful company’s digital walled garden. There are other tentacles too, but I want to focus your attention on “WhatsApp Expiring Media Feature Details Tipped via New Beta Version.” Keep in mind that “beta” does not mean something a thumbtyper will be able to use.

The write up explains:

WhatsApp 2.20.201.6 beta for Android has been released with further references for a new feature called “Expiring Media.” The feature, as its name suggests, would make media files such as images, videos, and GIFs sent to the recipient’s phone over WhatsApp disappear once they are viewed.

Interesting. Just one question:

If media are disappeared for users, are those data deleted from the Facebook servers?

One hopes not; otherwise, some investigations will be slowed or halted.
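
The distinction matters at the protocol level. Here is a minimal sketch in Python of view-once semantics (all class and method names are hypothetical, not WhatsApp’s implementation), showing that client-side deletion says nothing about what a relay server keeps:

    from dataclasses import dataclass

    @dataclass
    class MediaMessage:
        media_id: str
        payload: bytes

    class RecipientDevice:
        """Hypothetical client: the media file is purged the moment it is viewed."""
        def __init__(self):
            self.inbox = {}

        def receive(self, msg: MediaMessage):
            self.inbox[msg.media_id] = msg.payload

        def view(self, media_id: str) -> bytes:
            return self.inbox.pop(media_id)  # view once, then gone from the device

    class RelayServer:
        """Hypothetical server: nothing in the feature obliges it to delete its copy."""
        def __init__(self):
            self.store = {}

        def relay(self, msg: MediaMessage, device: RecipientDevice):
            self.store[msg.media_id] = msg.payload  # retention is a separate policy choice
            device.receive(msg)

Note that the “disappearing” behavior lives entirely in RecipientDevice.view(); whether RelayServer.store is ever purged is a separate engineering and policy decision, which is exactly the question above.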

Stephen E Arnold, September 28, 2020

Predictive Policing: A Work in Progress or a Problem in Action?

September 2, 2020

Amid this year’s protests of police brutality, makers of crime-predicting software took the occasion to promote their products as a solution to racial bias in law enforcement. The Markup ponders, “Data-Informed Predictive Policing Was Heralded as Less Biased. Is It?” Writer Annie Gilbertson observes, as we did, that more than 1,400 mathematicians signed on to boycott predictive policing systems. She also describes problems discovered by researchers at New York University’s AI Now Institute:

“‘Police data is open to error by omission,’ [AI Now Director Rashida Richardson] said. Witnesses who distrust the police may be reluctant to report shots fired, and rape or domestic violence victims may never report their abusers. Because it is based on crime reports, the data fed into the software may be less an objective picture of crime than it is a mirror reflecting a given police department’s priorities. Law enforcement may crack down on minor property crime while hardly scratching the surface of white-collar criminal enterprises, for instance. Officers may intensify drug arrests around public housing while ignoring drug use on college campuses. Recently, Richardson and her colleagues Jason Schultz and Kate Crawford examined law enforcement agencies that use a variety of predictive programs. They looked at police departments, including in Chicago, New Orleans, and Maricopa County, Ariz., that have had problems with controversial policing practices, such as stop and frisk, or evidence of civil rights violations, including allegations of racial profiling. They found that since ‘these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies,’ it raised ‘the risk of creating inaccurate, skewed, or systemically biased data.’”

The article also looks at a 2016 study published by the Royal Statistical Society. Researchers supplied PredPol’s algorithm with arrest data from Oakland, California, a city where estimated drug use is spread fairly evenly across its diverse neighborhoods. The software’s results would have had officers target Black neighborhoods at about twice the rate of white ones. The team emphasized the documented harm over-policing can cause. The write-up goes on to cover a few more studies on the subject, so navigate there for those details. Gilbertson notes that concerns about these systems are so strong that police departments in at least two major cities, Chicago and Los Angeles, have decided against them. Will others follow suit?
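
The feedback loop these studies describe can be reproduced in a few lines. The toy simulation below (illustrative only, not PredPol’s actual algorithm) starts with two neighborhoods that offend at the same true rate but differ in historical arrest counts; because patrols are sent where past arrests were highest, only the over-policed neighborhood generates new data:

    import random

    random.seed(1)
    TRUE_OFFENSE_RATE = 0.1               # identical in both neighborhoods
    arrests = {"A": 40, "B": 20}          # history: "A" was policed twice as hard

    def predict_hotspot(history):
        # "Predict" crime wherever past arrests were highest.
        return max(history, key=history.get)

    for day in range(1000):
        target = predict_hotspot(arrests)  # always "A", given the initial skew
        if random.random() < TRUE_OFFENSE_RATE:
            arrests[target] += 1           # patrols only record crime where they go

    print(arrests)  # "A" grows by roughly 100 arrests; "B" stays frozen at 20

Even though the underlying behavior is identical in both neighborhoods, the model’s output ratifies the original skew, which is the mechanism the researchers flagged.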

Cynthia Murrell, September 2, 2020

Lexipol: Facing Scrutiny?

September 1, 2020

Should a private company be writing policies for police departments? Increasingly, that is the case, and, many say, it is a major reason it is so difficult to hold police accountable for using excessive force. Mother Jones invites us to “Meet the Company that Writes the Policies that Protect Cops.” Founded in 2003, Lexipol’s focus is unabashedly on crafting policies that protect officers and departments against lawsuits. Much hinges on definitions of words one would think should be straightforward, like “necessary” and “imminent.” In fact, company co-founder (and former cop) Bruce Praet seems especially proud of the slippery language that gives police “flexibility.”

When pressed, Lexipol insists it bases its policies on federal and state standards, laws, court rulings, and “best practices.” However, reporter Madison Pauly writes:

“Some of the company’s policies … depart in significant ways from recommendations by mainstream policing organizations. The National Consensus Policy on Use of Force, a collaboration between 11 major law enforcement groups, requires cops to try de-escalation techniques before using force if possible. Lexipol discourages police departments from requiring them. Lexipol’s policy allows officers to shoot at moving vehicles in some circumstances, a practice that the Police Executive Research Forum recommends against because it may injure or kill civilians and officers. The ACLU has contested Lexipol’s rules for handling immigration violations, which in some states include a provision allowing cops to consider ‘a lack of English proficiency’ when deciding whether someone may have entered the country illegally. Despite these challenges, the company has marketed its policies as a way to decrease cities’ liability in police misconduct lawsuits. In its communications with potential clients, Lexipol has claimed that agencies that use its policies are sued less frequently and pay out smaller settlements, according to a Texas Law Review analysis of public records. The company’s critics argue that it accomplishes this with vague or permissive rules that meet bare-minimum legal requirements rather than holding officers to a higher standard.”

According to the company, Lexipol has vended its policies, training, customizable handbooks, or other services to more than 8,000 public safety agencies, including several large cities. These include corrections, fire, and EMS agencies alongside police departments. In California, an estimated 95 percent of law enforcement agencies use Lexipol policies. See the article for examples where, we’re told, these policies have stood in the way of justice. As with the crafting of state legislation, we suspect many citizens are unaware how much influence these public agencies have handed over to a third party.

Cynthia Murrell, September 1, 2020

The Child Protection System Catches Pedophiles

August 11, 2020

Child pornography plagues the Internet’s underbelly: the Dark Web, peer-to-peer sharing networks, and even simple Google searches. Law enforcement officials want to protect children and stop the spread of child pornography, so new software called the Child Protection System was created. NBC News shares details in the article, “Inside the Surveillance Software Tracking Child Porn Offenders Across the Globe.”

The Child Protection System was designed by the Florida nonprofit Child Rescue Coalition. It is a forensic tool that scans file sharing networks and chatrooms to locate computers that download child pornography. It is programmed to search for over two hundred terms related to child sex abuse. These scans are then used as probable cause to gain search warrants. The Child Protection System’s scans have been used to arrest over 12,000 people. The software can search down to the county level, and it also looks for images of children deemed to be twelve and under. It saves a lot of investigation time:

“‘The Child Protection System has had a bigger effect for us than any tool anyone has ever created. It’s been huge,’ said Dennis Nicewander, assistant state attorney in Broward County, Florida, who has used the software to prosecute about 200 cases over the last decade. ‘They have made it so automated and simple that the guys are just sitting there waiting to be arrested.’ The Child Rescue Coalition gives its technology for free to law enforcement agencies, and it is used by about 8,500 investigators in all 50 states. It’s used in 95 other countries, including Canada, the U.K. and Brazil. Since 2010, the nonprofit has trained about 12,000 law enforcement investigators globally.”

The Child Rescue Coalition wants to partner with social media platforms, schools, and others in order to discover who is downloading child pornography. These platforms often contain posts in which people discuss suspicious behavior, but such chatter does not by itself indicate criminal activity. If data from the Child Protection System and these platforms were cross-matched, it might point to possible bad actors.
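
Mechanically, this kind of tool combines watchlist matching over network metadata with cross-referencing against a second data source. A minimal sketch in Python (the terms, function names, and data shapes are placeholders; the Child Protection System’s actual code and search list are not public):

    WATCHLIST = {"term1", "term2"}  # placeholders; the real list reportedly has 200+ entries

    def flag_downloads(observations):
        """observations: iterable of (ip_address, filename) pairs seen on a sharing network."""
        hits = {}
        for ip, filename in observations:
            if any(term in filename.lower() for term in WATCHLIST):
                hits.setdefault(ip, []).append(filename)
        return hits

    def cross_match(network_hits, platform_reports):
        """Intersect flagged network addresses with accounts a platform has already flagged."""
        return {ip: files for ip, files in network_hits.items() if ip in platform_reports}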

Some assert that this kind of surveillance software breaks privacy laws. Handing over this much surveillance power to governments requires safeguards to protect individuals’ privacy.

Whitney Grace, August 11, 2020

WhatsApp: Expiring Messages

August 3, 2020

DarkCyber noted “WhatsApp Is Working on Message Deletion Feature.” Encrypted messaging is a communications channel with magnetism. I pointed out in one of my recent lectures that:

messaging can provide many of the functions associated with old-style Dark Web sites.

Messaging applications permit encrypted groups, in-app ecommerce, and links that effectively deliver digital content to insiders or customers.

Facebook, owner of WhatsApp, according to the article:

is working on an Expiring Messages feature.

The idea is that the Facebook system will:

“automatically delete a particular message for the sender and the receiver after a particular time.”

The innovation, if it is indeed under development, raises the question, “Will Facebook retain copies of the deleted content?”
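
Timed expiry is, mechanically, a time-to-live check, and the question above comes down to which copies the check applies to. A minimal sketch (an assumed design in Python, not WhatsApp’s code):

    import time

    class ExpiringStore:
        """Message store that purges entries older than a fixed time-to-live."""
        def __init__(self, ttl_seconds: float):
            self.ttl = ttl_seconds
            self.messages = {}  # msg_id -> (created_at, text)

        def put(self, msg_id: str, text: str):
            self.messages[msg_id] = (time.time(), text)

        def get(self, msg_id: str):
            created_at, text = self.messages.get(msg_id, (None, None))
            if created_at is None:
                return None
            if time.time() - created_at > self.ttl:
                del self.messages[msg_id]  # expired: purge on access
                return None
            return text

Run one ExpiringStore on the sender’s device and one on the receiver’s, and the advertised behavior is complete; any copy the operator keeps in a third store is untouched by this logic.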

Stephen E Arnold, August 3, 2020

European Union Tries Panenka to Score Against Encrypted Data

July 31, 2020

Let’s assume this write up is accurate: “EU Plans to Use Supercomputers to Break Encryption But Also Wants Platforms to Create Opportunities to Snoop on End-to-End Communications.”

The “going dark” argument is not moving fast enough for European Union regulators. The fix is a “decryption platform.” The idea is to decrypt certain messages. The interesting part of the tactic is summarized in this passage:

Internet service providers such as Google, Facebook and Microsoft are to create opportunities to read end-to-end encrypted communications. If criminal content is found, it should be reported to the relevant law enforcement authorities. To this end, the Commission has initiated an “expert process” with the companies in the framework of the EU Internet Forum, which is to make proposals in a study. This process could later result in a regulation or directive that would force companies to cooperate.

The article points out:

There’s no way to “create opportunities” to read end-to-end encrypted communications without weakening the latter.
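
That is a property of the mathematics, not a policy preference. A short demonstration with the PyNaCl library (a sketch, not any provider’s production code) shows that only a holder of one of the two endpoint private keys can decrypt, so any “opportunity” for a provider to read the traffic means handing a key, or an escrowed copy, to someone other than the endpoints:

    from nacl.public import PrivateKey, Box
    from nacl.exceptions import CryptoError

    alice, bob = PrivateKey.generate(), PrivateKey.generate()
    ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

    # The intended recipient decrypts with his private key.
    assert Box(bob, alice.public_key).decrypt(ciphertext) == b"meet at noon"

    # A third party holding only public keys cannot.
    eve = PrivateKey.generate()
    try:
        Box(eve, alice.public_key).decrypt(ciphertext)
    except CryptoError:
        print("snooping failed: no endpoint private key")

The only way to grant that access is to encrypt an extra copy to a third key, which turns end-to-end into end-to-many.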

Worth monitoring: the idea, its implementation, and its opportunities.

Stephen E Arnold, July 31, 2020

Are High-Technology Companies Obstructionist?

June 11, 2020

When one talks about high-technology companies, the conversation skips over outfits like Siemens, China Mobile, and Telefónica. The parties to the conversation understand that the code phrase “high tech” refers to Amazon, Apple, Facebook, Google, Twitter, and similar outfits. Baidu- and Tencent-type outfits cooperate in some countries, less so in others.

“ASIO Chief Hits Out at Obstructive Tech Companies” is an amplification of US Department of Justice calls for providing enforcement officials with backdoors when data are encrypted. Australia’s intelligence agency is called the Australian Security Intelligence Organisation, shortened to ASIO. The article points out that in December 2018 laws came into effect that added “encryption busting powers.”

The bulk of the article consists of statements made by Australia’s spy agency chief Mike Burgess. The selected comments make clear that some high-technology companies have been slow to cooperate. The write up reports:

“As a society, whether we know it or not, we’ve accepted the fact that the police or ASIO can get a warrant to bug someone’s car or someone’s house. Why should cyberspace be any different? “Yet every time we have these conversations with the private sector companies they kind of push back and say, ‘Uh, no, we’re not so sure about that’.”—Mr. Burgess.

The main point is that high-technology companies often adopt the proverbial mañana approach when minutes count.

DarkCyber anticipates increased requests for encryption backdoors from other members of the Five Eyes. Some of those involved with this group are not amused by the go-slow approach.

Stephen E Arnold, June 11, 2020

DarkCyber Exclusive: Steele Aims for the Hearts of Wall Street Short Sellers

May 23, 2020

We posted a follow-up interview with Robert David Steele, a former CIA professional. This video expands on the allegations of widespread, systemic fraud. Steele explains why a government task force is needed. He describes the scope of the audit, involving six financial giants and a back office operation. If you are interested in learning about alleged skyscraper-sized financial misbehavior, you can view the video on Vimeo at this link.

Stephen E Arnold, May 23, 2020

German Intelligence Handcuffed

May 20, 2020

DarkCyber noted an interesting news story published on DW.com. The article? “German Intelligence Can’t Spy on Foreigners Outside Germany.” The DarkCyber research team talked about this and formed a collective response: “What the heck?”

The write up reports as actual factual news:

The German government must come up with a new law regulating its secret services, after the country’s highest court ruled that the current practice of monitoring telecommunications of foreign citizens at will violates constitutionally-enshrined press freedoms and the privacy of communications.

The article continued:

The ruling said that non-Germans were also protected by Germany’s constitutional rights, and that the current law lacked special protection for the work of lawyers and journalists. This applied both to the collection and processing of data as well as passing on that data to other intelligence agencies.

This is an interesting development if it is indeed accurate. Some countries’ intelligence agencies do conduct activities outside of their home countries’ borders. Furthermore, there are specialized service and device vendors headquartered in Germany that facilitate cross-border data collection, monitoring, tracking, and sense making. These range from Siemens to software and hacking companies.

Restricting the activities of an intelligence unit to a particular geographic “space” sounds like a difficult task. There are “need to know” operations which may not be disclosed to an elected body except under quite specific circumstances. Electronic monitoring and intercepting range freely in the datasphere. Telecommunications hardware and service providers like T-Mobile have a number of connections with certain German government entities.

Plus, DarkCyber surmises that operations are currently underway in certain parts of the world that are hostile to the German state, its citizens, and its commercial enterprises.

Will these operations be stopped? Turning off a covert operation is not like flicking a button on a remote control to kill a Netflix program.

What if the German intelligence community, known to be one of the best in the European Community, goes dark?

The related question is, “What if secret agencies operate in secret?” Who will know? Who will talk? Who will prosecute? Who decides what’s important to protect citizens?

Stephen E Arnold, May 20, 2020

LAPD Shutters Predictive Policing During Shutdown

May 7, 2020

Police departments are not immune to the economic impact of this pandemic. We learn the Los Angeles Police Department is shutting down its predictive policing program, at least for now, in TechDirt’s write-up, “LAPD’s Failed Predictive Policing Program the Latest COVID-19 Victim.” Writer Tim Cushing makes it perfectly clear he has never been a fan of the analytics approach to law enforcement:

“For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into ‘actionable intel’ by laundering it through a bunch of proprietary algorithms. More than half a decade ago, early-ish adopters were expressing skepticism about the tech’s ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they’d made arrests, while still coming nowhere close to the reasonable suspicion needed to declare nearly everyone in a high crime area a criminal suspect. The Los Angeles Police Department’s history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources to violate the rights of Los Angeles residents. The only explanations for the LAPD’s continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department’s biased policing.”

Now, though, an April 15 memo from the LAPD declares the department is ceasing to use the PredPol software immediately due to COVID-19 related financial constraints. As one might suppose, Cushing hopes the software will remain off the table once the shutdown is lifted. Hey, anything is possible.

Cynthia Murrell, May 7, 2020
