Facial Recognition: Not for LE and Intel Professionals? What? Hello, Reality Calling

July 30, 2018

I read “Facial Recognition Gives Police a Powerful New Tracking Tool. It’s Also Raising Alarms.” The write up is one of many pointing out that using technology to spot persons of interest is not a good idea. The Telegraph has a story which suggests that Amazon is having some doubts about its Rekognition facial recognition system. What? Hello, reality calling.

The “Raising Alarms” story includes statements from an interview with the chief executive of an outfit called Kairos. I circled these:

“Time is winding down but it’s not too late for someone to take a stand and keep this from happening,” said Brian Brackeen, the CEO of the facial recognition firm Kairos, who wants tech firms to join him in keeping the technology out of law enforcement’s hands. Brackeen, who is black, said he has long been troubled by facial recognition algorithms’ struggle to distinguish faces of people with dark skin, and the implications of its use by the government and police. If they do get it, he recently wrote, “there’s simply no way that face recognition software will be not used to harm citizens.”

The write up points out:

Many law enforcement agencies — including the FBI, the Pinellas County Sheriff’s Office in Florida, the Ohio Bureau of Criminal Investigation and several departments in San Diego — have been using those databases for years, typically in static situations — comparing a photo or video still to a database of mug shots or licenses. Maryland’s system was used to identify the suspect who allegedly massacred journalists at the Capital Gazette newspaper last month in Annapolis and to monitor protesters following the 2015 death of Freddie Gray in Baltimore.

Yep, even the Hollywood gangster films have featured a victim flipping through a collection of mug shots. The idea is pretty simple. Bad actors who end up in a collection of mug shots are often involved in other crimes. Looking at images is one way for LE and intel professionals to figure out if there is a clue to be followed.

Now what’s different when software looks for the matches? Software can locate similar fingerprints. Software can locate similar images, maybe even the image of the person who committed a crime. The idea of a 50-year-old man robbed at an ATM flipping through images of bad actors in a Chicago police station is, from my point of view, a bridge too far. The 50-year-old will either lose concentration or just point at some image and say, “Yeah, yeah, that looks like the guy.”
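For those who like to see the wiring, here is a minimal sketch of how most modern facial recognition systems do the matching: each face is reduced to a vector of numbers (an embedding), and two faces “match” when the vectors are close enough. The embeddings below are random stand-ins for a real model’s output, and the 0.8 threshold is illustrative, not a vendor recommendation.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, candidate: np.ndarray, threshold: float = 0.8) -> bool:
    """Declare a match when the embeddings are close enough.

    The threshold trades false positives against false negatives;
    real deployments tune it per use case.
    """
    return cosine_similarity(probe, candidate) >= threshold

# Hypothetical usage: one probe face against a small gallery of mug shot embeddings.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)                         # stand-in for embed(probe_image)
gallery = [rng.normal(size=128) for _ in range(5)]   # stand-ins for mug shot embeddings
hits = [i for i, g in enumerate(gallery) if is_match(probe, g)]
print("candidate matches:", hits)                    # likely empty: random vectors rarely align
```

The single number that matters is the threshold: lower it and the system volunteers more “matches,” raise it and it stays quiet. That knob is exactly what is at issue in the Amazon Rekognition story below.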

Let’s go with software because there are a lot of bad actors, there are some folks on Facebook who are bad actors, and there are bad actors wandering around in a crowd. Don’t believe me? Go to Rio, stay in a fancy hotel, and wander around on a Saturday night. How long before you are robbed? Maybe never, but maybe within 15 minutes. Give this test a try.

Software, like humans, makes errors. However, it seems to make sense to use available technology to take actions required by government rules and regulations. That means that big companies are going to chase government contracts. That means that stopping companies from providing facial recognition technology is pretty much impossible.

I would suggest that the barn is on fire, the horses have escaped, and Costco built a new superstore on the land. Well, maybe I will suggest that this has happened.

Facial recognition systems are tools which have been and will continue to be used. Today’s systems can be fooled. In my DarkCyber video a couple of months ago, I showed a pair of glasses which can baffle most facial recognition systems.

The flaws in the algorithms will be fixed slowly. The challenges of crowds, lousy lighting, disguises, hats, shadows, and the other impediments to higher accuracy will be reduced over time.

But let’s get down to basics: facial recognition systems are here to stay in the US, the UK, and most countries on the planet. Go to a small city in Ecuador. Guess what? There is a Chinese-developed facial recognition system monitoring certain areas of most cities. Why? Flipping through a book with hundreds of thousands of images in an attempt to identify a suspect doesn’t work too well. Toss in Snapchat and YouTube. Software is the path forward. Period.

Facial recognition systems, despite imperfect accuracy, provide a useful tool. Here’s the shocker: these systems have been around for decades. Remember the Rand Tablet? That was the 1960s. Progress is being made.

Outrage is arriving a little late.

Stephen E Arnold, July 30, 2018

Amazon Rekognition: The View from Harrods Creek

July 29, 2018

I read the stories about Amazon’s facial recognition system. A representative example of this genre is “Amazon’s Facial Recognition Tool Misidentified 28 Members of Congress in ACLU Test.” The write up explains the test: photos of members of Congress were matched against a database of arrest photos with the confidence threshold left at the default of 80 percent. Amazon recommends at least 95 percent for law enforcement use.

The result? Twenty-eight members of Congress were misidentified.
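For the technically curious, here is a minimal sketch of the kind of Rekognition call that sits behind a test like the ACLU’s, using Amazon’s boto3 SDK. The “mugshots” collection, the image file, and the region are hypothetical; the point is that FaceMatchThreshold is simply a parameter the caller sets, and leaving it at the default (80) versus raising it to 95 changes what counts as a match.

```python
import boto3

# Hypothetical client and inputs; AWS credentials and a face collection
# named "mugshots" are assumed to exist already.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("member_of_congress.jpg", "rb") as f:
    probe_bytes = f.read()

response = rekognition.search_faces_by_image(
    CollectionId="mugshots",       # hypothetical collection of arrest photos
    Image={"Bytes": probe_bytes},
    MaxFaces=5,
    FaceMatchThreshold=95.0,       # Amazon suggests 95 or higher for LE; the default is 80
)

for match in response["FaceMatches"]:
    face_id = match["Face"]["FaceId"]
    print(f"{face_id}: similarity {match['Similarity']:.1f}%")
```

Run the same call with FaceMatchThreshold left at its default and more mug shots clear the bar, which is how a test at 80 percent produces “matches” that a 95 percent setting would suppress.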

At a breakfast meeting this morning (Sunday, July 29, 2018) one uninformed Kentucky resident asked:

What if these individuals are criminals?

Another person responded:

Just 28?

I jotted down the remarks on my mobile phone. Ah, the Bluegrass state.

Stephen E Arnold, July 29, 2018

DarkCyber for June 5, 2018: Amazon and Its LE and Intelligence Services

June 5, 2018

The DarkCyber for June 5, 2018, is now available at www.arnoldit.com/wordpress or on Vimeo at https://vimeo.com/273170550.

This week’s DarkCyber presents an extract from Stephen E Arnold’s lectures at the Prague Telestrategies ISS conference. The conference is designed for security, intelligence, and law enforcement professionals in Europe.

Stephen’s two lectures provided attendees with a snapshot of the services Amazon’s streaming data marketplace offers to customers, developers, and entrepreneurs.

Stephen said:

The Amazon platform is positioned to provide a robust, innovative way to deanonymize digital currency transactions and perform the type of analyses needed to deal with bad actors and their activities.

The information was gleaned from Amazon conference lectures, Amazon’s Web logs and documentation, and open source documents.

For example, one public document stated:

“… A law enforcement agency may be a customer and may desire to receive global Bitcoin transactions, correlated by country, with ISP data to determine source IP addresses and shipping addresses that correlate to Bitcoin addresses.”
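Strip away the patent prose and the idea is a join: one stream of Bitcoin transactions, one enrichment stream that maps addresses to observed IP and shipping data, correlated on a shared key. A toy sketch of that correlation follows; the field names and records are invented for illustration and do not describe any actual Amazon service.

```python
import pandas as pd

# Hypothetical inputs: a feed of Bitcoin transactions and an enrichment feed
# that claims to map wallet addresses to observed source IPs.
transactions = pd.DataFrame({
    "btc_address": ["1abc", "1abc", "3def"],
    "tx_time": pd.to_datetime(["2018-06-01 10:02", "2018-06-02 11:15", "2018-06-01 10:05"]),
    "amount_btc": [0.5, 0.1, 2.0],
})
enrichment = pd.DataFrame({
    "btc_address": ["1abc", "3def"],
    "source_ip": ["203.0.113.7", "198.51.100.9"],
    "country": ["BR", "US"],
})

# Correlate the transaction stream with the enrichment stream on the shared key.
correlated = transactions.merge(enrichment, on="btc_address", how="left")
print(correlated[["btc_address", "tx_time", "source_ip", "country"]])
```

The value of a streaming data marketplace in this scenario is simply that both feeds arrive continuously and the join runs continuously, instead of an analyst stitching exports together by hand.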

Coupled with the Rekognition facial recognition service and Amazon’s wide array of technical capabilities, the streaming data marketplace positions Amazon to provide specialized content processing and data services.

Stephen stated:

Instead of learning how to use many different specialized systems, the Amazon approach offers a unified capability available with a Kindle-style interface. This is a potential game changer for LE, intel, and security service providers.

In this week’s DarkCyber video, Stephen provides an eight-minute summary of his research, including the mechanisms by which new functions can be added to or integrated with the system.

A for-fee lecture about what Stephen calls “Amazon’s intelligence services” is available. For information, write darkcyber333 at yandex dot com.

Kenny Toth, June 5, 2018

DarkCyber for May 15, 2018, Now Available

May 15, 2018

DarkCyber for May 15, 2018, is now available at www.arnoldit.com/wordpress and on Vimeo at https://vimeo.com/268758291

Stephen E Arnold’s DarkCyber is a weekly video news and analysis program about the Dark Web and lesser known Internet services.

The stories in the May 15, 2018, program are: another Dark Web murder-for-hire scam gone wrong; the “Terror in the Dark” report, which explains how bad actors use the hidden Internet; a run-down of manufacturers of cell site simulators; a new map of the Dark Web; and a New Zealand teen who ran a drug dealing business from his parents’ home.

Please note that Stephen will be lecturing the week of June 4, 2018, at the Telestrategies ISS conference. He will produce and release a special report about one of our team’s research findings on June 5, 2018. Due to time zones, the go-live date for the program may be different. We will announce schedule shifts in Beyond Search.

Kenny Toth, May 15, 2018

Law Enforcement and Big Data

May 11, 2018

The job of being an officer of the law has never been harder, but many on the tech side are trying to make it easier. But, as with most innovations, this might make life harder. Confused? Join the club. A recent spate of big data law enforcement innovations is due to become a hot button issue for the foreseeable future. The latest example comes from a Boing Boing piece, “Raleigh Cops are Investigating Crime by Getting Google to Reveal the Identity of Every Mobile User Within Acres of the Scene.”

According to the story:

“Public records requests have revealed that on at least four occasions, the Raleigh-Durham police obtained warrants forcing Google to reveal the identities of every mobile user within acres of a crime scene, sweeping up the personal information of thousands of people in a quest to locate a single perp.”
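The mechanics behind such a warrant are not exotic. Once a provider holds timestamped location pings, the query is roughly “every device within some radius of a point during a time window.” A minimal sketch of that filter follows, with made-up device IDs, coordinates, and times; this is the geometry of a reverse location request, not Google’s actual system.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# Hypothetical location pings: (device_id, lat, lon, unix_time)
pings = [
    ("device_a", 35.7796, -78.6382, 1_522_000_000),
    ("device_b", 35.7810, -78.6400, 1_522_000_120),
    ("device_c", 35.9000, -78.9000, 1_522_000_200),
]

crime_scene = (35.7800, -78.6390)        # illustrative coordinates
radius_m = 300                           # "within acres" of the scene
window = (1_521_999_900, 1_522_000_500)  # illustrative time window

swept_up = [
    device for device, lat, lon, t in pings
    if window[0] <= t <= window[1]
    and haversine_m(lat, lon, *crime_scene) <= radius_m
]
print(swept_up)  # every device near the scene, guilty or not
```

The sweep is indiscriminate by design: anyone whose phone pinged inside the circle during the window lands in the result set, which is exactly what makes these warrants controversial.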

Such a double-edged sword. On one hand, we all want wrongdoers to be handled in a lawful way; on the other, this is all getting too close to science fiction. Couple that with the recent news that voice assistants like Alexa listen to conversations and may someday supply evidence in court, and the science fiction feeling only grows.

In Stephen E Arnold’s “Making Sense of Chat” presentation for the Telestrategies ISS conference in Prague in June 2018, he will highlight three commercial systems which can process large flows of data. He said:

The efficiencies of the new systems mean that needed information can be identified and displayed to an investigator. Smart software, not a team of analysts, scans digital information, identifies content with a probability of being germane to a case, and presents that data in an easy-to-understand report. The result is that the hand waving about invasive analysis of information is often different from the actual functioning of a modern system. Today’s newest systems deliver benefits that were simply not possible with older, often manual methods.
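What “identifies content with a probability of being germane” usually means in practice is a trained classifier emitting a score per message, with only the high scorers reaching an investigator. Here is a toy sketch with scikit-learn and invented training examples; real systems are far larger, but the shape is the same.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training data: messages labeled germane (1) or not (0) by an analyst.
messages = [
    "meet at the drop point friday with the package",
    "payment sent to the usual wallet address",
    "are we still on for lunch tomorrow",
    "happy birthday, see you at the party",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

# New traffic gets a probability of being germane; only top scorers reach an investigator.
new_traffic = ["wallet address confirmed, send the package", "lunch at noon works"]
for msg, p in zip(new_traffic, model.predict_proba(new_traffic)[:, 1]):
    print(f"{p:.2f}  {msg}")
```

The scoring is probabilistic, not a verdict, which is why the output is a ranked report for a human rather than an arrest list.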

He plans to offer webinars on the chat topic as well as his deanonymizing blockchain lecture. Watch for details in Beyond Search and in his weekly DarkCyber video.

Patrick Roland, May 11, 2018

Policeware Lights Up Venture World

May 8, 2018

Spy agencies have recently begun taking on a different look: that of a Silicon Valley startup. That’s because some of the world’s most secretive organizations have started to publicly proclaim that they are investing in digital spying tools. The most recent example popped up in a Jerusalem Times story, “Start-Up Spies? Mossad Enters the World of Venture Capitalism.”

The story focuses on the Israeli spy agency, Mossad, publicly starting a VC fund.

“In June, the fund was made public for the first time and previous announcements have indicated that it would invest NIS 10 million per year in five companies following a similar model to the CIA in this arena.
“The CIA’s parallel outfit is called Q-Tel, which is defined as the ‘strategic investor for the US intelligence and defense communities that identifies and adapts cutting-edge technologies.’”

This pairing of spy agencies and tech companies might seem like a dream combination on the surface, but it is highly flawed. As the New York Times pointed out, being investors is not exactly what an organization like the CIA or Mossad is known for. Perhaps they have bright people handling the money in these organizations, but we wouldn’t count on it.

Patrick Roland, May 8, 2018

Metadata Collection Spike: Is There a Reason?

May 6, 2018

I read “NSA Triples Metadata Collection Numbers Sucking Up over 500 Million Call Records in 2017.” Interesting report, but it raised several questions here in Harrod’s Creek. But first, let’s look at the “angle” of the story.

I noted this statement:

The National Security Agency revealed a huge increase in the amount of call metadata collected, from about 151 million call records in 2016 to more than 530 million last year — despite having fewer targets.

The write up pointed out that pen register and trap and trace orders declined. That’s interesting as well.

The write up focused on what’s called “call detail records.” These, the write up explained, are:

things like which numbers were called and when, the duration of the call, and so on…

The write up then reminds the reader that “one target can yield hundreds or thousands of sub-targets.”
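A call detail record is just structured metadata, and the sub-target point is a graph effect: collect one number’s records, and every number it touched becomes a candidate for the next hop. A minimal sketch with invented records follows; the numbers and timestamps are made up.

```python
from dataclasses import dataclass

@dataclass
class CallDetailRecord:
    caller: str       # originating number
    callee: str       # number called
    start: str        # ISO timestamp of the call
    duration_s: int   # call length in seconds; no content, only metadata

# Invented records for illustration.
cdrs = [
    CallDetailRecord("555-0100", "555-0111", "2017-03-01T09:00:00", 120),
    CallDetailRecord("555-0111", "555-0122", "2017-03-01T10:30:00", 45),
    CallDetailRecord("555-0122", "555-0133", "2017-03-02T14:10:00", 300),
    CallDetailRecord("555-0199", "555-0188", "2017-03-02T15:00:00", 60),
]

def contacts_of(number: str) -> set:
    """Every number that appears on a record with the given number."""
    out = set()
    for r in cdrs:
        if r.caller == number:
            out.add(r.callee)
        elif r.callee == number:
            out.add(r.caller)
    return out

# One target, two hops: the collection balloons quickly.
target = "555-0100"
first_hop = contacts_of(target)
second_hop = set().union(*(contacts_of(n) for n in first_hop)) - first_hop - {target}
print("first hop:", first_hop)     # {'555-0111'}
print("second hop:", second_hop)   # {'555-0122'}
```

With four toy records the expansion is tiny; with a real target’s call history, each hop multiplies the record count, which is how one target yields thousands of sub-targets.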

The article ends without any information about why the number jumped. My impression from the write up is that the government agency is doing something that’s not quite square.

My initial reaction to the data in the write up was, “That does not seem like such a big number.” A crawl of the Dark Web, which is a pretty tiny digital space, often generates quite a bit of metadata. Stuffing the tiny bit of Dark Web data into a robust system operated by companies from Australia to the United States can produce terabytes of data. In fact, one Israeli company uploads new data in zipped blocks to its customers multiple times a day. The firm of which I am thinking performs this work for outfits engaged in marketing consumer products. In comparison, the NSA effort strikes me as modest.

My first question is, “Why so little data?” Message, call, image, and video data are going up. The corresponding volume of metadata is going up. Toss in link analysis pointers, and that’s a lot of data. In short, the increase reported seems modest.

The second question is, “What factors contributed to the increase?” Based on our research, we think that some of the analytic systems are bogged down due to the wider use of message encryption technology. I will be describing one of these systems in my June 2018 Telestrategies ISS lecture related to encrypted chat. I wonder if the change in the volume reported in the write up is related to encryption.

My third question is, “Is government analysis of message content new or different?” Based on the information I have stumbled upon here in rural Kentucky, my thought is that message traffic analysis has been chugging along for decades. I heard an anecdote when I worked at a blue chip consulting firm. It went something like this:

In the days of telegrams, the telegraph companies put paper records in a bag, took them to the train station in Manhattan, and sent them to Washington, DC.

Is the anecdote true or false? My hunch is that it is mostly true.

My final question triggered by this article is, “Why does the government collect these data?” I suppose one reason is nosiness, but my perception is that the data are analyzed in order to get a sense of who is doing what that might harm the US financial system or the country itself.

My point is that numbers without context are often not helpful. In this case, the 2010 Pew data reported that the average adult with a mobile makes five calls per day. Text message volume is higher. With 300 million people in the US in 2010 and assuming 30 percent mobile phone penetration, the call volume eight years ago works out to roughly 450 million calls per day. Flash forward to the present. The “number” cited in the article seems low.
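Here is the back-of-envelope arithmetic with the assumptions spelled out; all three inputs are rough estimates, not measurements.

```python
population = 300_000_000        # rough US population, 2010
mobile_penetration = 0.30       # assumed share of people making mobile calls
calls_per_user_per_day = 5      # Pew's 2010 average for adults with a mobile

calls_per_day = population * mobile_penetration * calls_per_user_per_day
calls_per_year = calls_per_day * 365

print(f"{calls_per_day:,.0f} calls per day")      # 450,000,000
print(f"{calls_per_year:,.0f} calls per year")    # 164,250,000,000
# Against that volume, the roughly 530 million call detail records the NSA
# reported for all of 2017 is a sliver, which is the point of the comparison.
```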

Perhaps the author of the article could provide more context, do a bit of digging to figure out why the number is what it is, and explain why these data are needed in the first place.

One can criticize the US government. But I want to know a bit more.

Net net: It seems that the NSA is showing quite a bit of focus or restraint in its collection activities. In the May 15, 2018, DarkCyber program, I report the names of some of the companies manufacturing cell site simulators. These gizmos are an interesting approach to data collection. Some of the devices seem robust. To me, capturing 500 million calls seems well within the specifications of these devices.

But what do I know? I can see the vapor from a mine drainage ditch from my back window. Ah, Kentucky.

Stephen E Arnold, May 6, 2018

Removing Drug Information from Social Media May Be Difficult

April 27, 2018

There is an opioid dealer nearby. In fact, this drug kingpin is not standing on the corner or lurking on a college campus; this supplier is right at your fingertips. Thanks to a recent article, the plague of drug sales through popular and public social media platforms has caught the attention of some powerful people. We learned about these developments in a recent Wired article, “One Woman Got Facebook to Police Opioid Sales on Instagram.”

While it’s a little confusing, the basic story goes that one woman discovered opioid sales on Instagram (which is owned by Facebook) and used a rival social platform, Twitter, to urge Facebook to take action. The tactic worked, even getting the FDA involved.

According to the story:

“It shouldn’t take this much effort to get people to realize that you have some responsibility for the stuff on your platform…A 13 year old could do this search and realize there’s bad stuff on your platform — and probably has — you don’t need the commissioner of the FDA to tell you that.”

However, policing drug sales on social media platforms and the Dark Web is not as easy as one might think. Yes, the platforms shut down offending accounts, but beyond that there is little that can be done. According to the story, Instagram also banned certain hashtags, as it had done before: “Instagram previously restricted the drug-related hashtags, #Xanax and #Xanaxbar and banned #weedforsale and #weed4sale.”
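Part of the reason hashtag bans accomplish so little is that an exact-match blocklist is trivial to slip past. A toy sketch follows; the tags and the list are illustrative only.

```python
BLOCKED_TAGS = {"#xanax", "#xanaxbar", "#weedforsale", "#weed4sale"}

def is_blocked(hashtag: str) -> bool:
    """Exact-match blocklist check, the simplest possible filter."""
    return hashtag.lower() in BLOCKED_TAGS

posts = ["#xanax", "#xanaxbars", "#x4nax", "#weedforsale", "#weed_for_sale"]
for tag in posts:
    print(tag, "blocked" if is_blocked(tag) else "slips through")
```

Add an “s,” swap a letter for a digit, or insert an underscore, and the post sails past the filter, which is why enforcement keeps chasing sellers rather than stopping them.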

It’s a small step, but hopefully one that will lead to greater and greater progress. For more information, see our CyberOSINT and Dark Web coverage.

Patrick Roland, April 27, 2018

Commercial Solutions for Government: A Path Forward

April 13, 2018

I often hear grumbling when I tell law enforcement and intelligence professionals to use commercial tools. Some LE and intel professionals are confident that, with open source tools like Maltego, a little midnight oil, and their in house technical staff, they can build a system better than commercial offerings. In my 50 year work career, I have seen that happen. But it does not happen often. The 18F alternative to Squarespace is a good example of spending money on software which falls short of low cost, widely available commercial tools.

Cybercrime has become a serious hurdle for police. It seems that under-funded departments and agencies find that procurement cycles and technological advances by bad actors combine to make certain tasks difficult. We noted the PC Magazine story, “Feds Bust Black Market Forum Behind $530M in Cybercrimes.”

According to the article:

“The Department of Justice on Wednesday announced the indictments of 36 suspects allegedly responsible for the black market Infraud forum, which sold stolen credit card details, malware, and information that could be used for identity theft, including Social Security numbers.”

This is a win for cybercrime cops. Several of the American suspects have been arrested and several more international criminals are being extradited. However, we believe that only the private sector can adequately combat clever cybercrime. We recently heard about what seems to be a positive plan from Entrepreneur magazine.

Google’s new Chronicle cyber security company may offer LE a useful tool. One specialty for Chronicle is zero day attacks, which exploit flaws unknown to the software vendor, so no patch exists when the attack lands; that is a different problem from, say, a ransomware campaign built on known weaknesses. This is just one small piece of a massive private sector puzzle that can help bring cybercrime under control for good.

Combine the capabilities of Google with Recorded Future (a company in which Google has a stake), and the open source alternatives may come up short.

Patrick Roland, April 13, 2018

Yikes! Google Kiddie YouTube a Target

April 12, 2018

I thought Google and its kiddie YouTube had figured out how to show age appropriate videos to children. If the information in the story “Child Advocates Ask FTC to Investigate YouTube” is accurate, the GOOG may face some PR challenges. Nothing is quite as volatile as an online advertising site displaying videos which can be perceived as inappropriate. Because the write up is branded “AP,” which once meant Associated Press, I am unwilling to quote from the write up. If my understanding of the assertions in the “news” story is accurate, I recall learning:

  • “Child advocate groups” — no, I don’t know what outfits these are — want Google to be “investigated.”
  • Google apparently profits from showing ads to children. (Who knew?)
  • Google has an app but it is not too popular with parents. (I don’t know who does not use the app because the AP story did not tell me as I recall.)
  • Google has channels aimed at children. One of these may be named ChuChuTV. (Nifty spelling of “choo”.)
  • Advertisers can get access to children but if the child says, “Googzilla, I am not 13” some content is blocked. (If I were a child, I would probably figure out how to get access to the video about unicorn slime pretty quickly.)

Among the entities I recall seeing identified in the article are:

  • Georgetown University law clinic
  • Jeff Chester, The Center for Digital Democracy
  • Josh Golin, Campaign for a Commercial Free Childhood
  • Senator Edward Markey
  • Juliana Gruenwald Henderson, an FTC professional
  • Kandi Parsons, once an FTC lawyer

What’s missing? Links, examples of bad videos, data about what percent of kiddie YouTube programming is objectionable, and similar factual data.

I don’t want to be suspicious, but regardless of filtering method, some content may be viewed as offensive because subjective perception is not what smart software does well at this point in time.

In March 2018 I was appointed to a Judicial Commission focused on human trafficking and child sex abuse. My hope is that the documents and data which flow to me do not include assertions without specific entities being identified or with constraints that make me fearful of quoting from these documents in my writings.

After 50 years of professional work, I am not easily surprised. Therefore, I am not surprised that online ad vendors similar to Google would focus on generating revenue. I am not surprised that smart software vetting videos makes mistakes when “close enough for horseshoes” or “good enough” thresholds are implemented for decision making. I am not surprised that individuals who spend time watching kiddie videos find content which is inappropriate.

Perhaps follow up stories from the “Associated Press” will beef up the details and facts about Google’s problems with kiddie YouTube. Quotes from folks are what “real” journalists do. Links, facts, and data are different from quotes. Make enough phone calls, and one can probably get a statement that fits the “real” news template.

Net net: I think more specifics would be helpful, particularly if the goal is to find Google “guilty” of breaking a law, wrongdoing, or some other egregious behavior. For now, however, the matter warrants monitoring. Accusations about topics like trafficking and child sex abuse and related issues are inflammatory. Quotes don’t cut it for me.

Stephen E Arnold, April 12, 2018
