German Intelligence Handcuffed

May 20, 2020

DarkCyber noted an interesting news story published on DW.com. The article? “German Intelligence Can’t Spy on Foreigners Outside Germany.” The DarkCyber research team talked about this and formed a collective response: “What the heck?”

The write up reports as actual factual news:

The German government must come up with a new law regulating its secret services, after the country’s highest court ruled that the current practice of monitoring telecommunications of foreign citizens at will violates constitutionally-enshrined press freedoms and the privacy of communications.

The article continued:

The ruling said that non-Germans were also protected by Germany’s constitutional rights, and that the current law lacked special protection for the work of lawyers and journalists. This applied both to the collection and processing of data as well as passing on that data to other intelligence agencies.

This is an interesting development if it is indeed accurate. Some countries’ intelligence agencies do conduct activities outside of their home countries’ borders. Furthermore, there are specialized service and device vendors headquartered in Germany which facilitate cross-border data collection, monitoring, tracking, and sense making. These range from Siemens to software and hacking companies.

Restricting the activities of an intelligence unit to a particular geographic “space” sounds like a difficult task. There are “need to know” operations which may not be disclosed to an elected body except under quite specific circumstances. Electronic monitoring and intercepting ranges freely in the datasphere. Telecommunications hardware and service providers like T-Mobile have a number of connections with certain German government entities.

Plus DarkCyber surmises that there are current operations underway in certain parts of the world which operate in a way that is hostile to the German state, its citizens, and its commercial enterprises.

Will these operations be stopped? Turning off a covert operation is not like flicking a button on a remote control to kill a Netflix program.

What if the German intelligence community, known to be one of the best in the European Community, goes dark?

The related question is, “What if secret agencies operate in secret?” Who will know? Who will talk? Who will prosecute? Who decides what’s important to protect citizens?

Stephen E Arnold, May 20, 2020

LAPD Shutters Predictive Policing During Shutdown

May 7, 2020

Police departments are not immune to the economic impact of this pandemic. We learn the Los Angeles Police Department is shutting down its predictive policing program, at least for now, in TechDirt’s write-up, “LAPD’s Failed Predictive Policing Program the Latest COVID-19 Victim.” Writer Tim Cushing makes it perfectly clear he has never been a fan of the analytics approach to law enforcement:

“For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into ‘actionable intel’ by laundering it through a bunch of proprietary algorithms. More than half a decade ago, early-ish adopters were expressing skepticism about the tech’s ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they’d made arrests, while still coming nothing close to the reasonable suspicion needed to declare nearly everyone in a high crime area a criminal suspect. The Los Angeles Police Department’s history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources to violate the rights of Los Angeles residents. The only explanations for the LAPD’s continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department’s biased policing.”

Now, though, an April 15 memo from the LAPD declares the department is ceasing to use the PredPol software immediately due to COVID-19 related financial constraints. As one might suppose, Cushing hopes the software will remain off the table once the shutdown is lifted. Hey, anything is possible.

Cynthia Murrell, May 7, 2020

IBM Suffers a Setback in South Africa: Datawalk Stomps on Big Blue

April 21, 2020

IBM Analyst’s Notebook at one time enjoyed near total market dominance for investigative software, what I call policeware. IBM owns Analyst’s Notebook, and it has a sustainable revenue stream from some governments. Once installed — even though there may be no or very few qualified operators who can use the system — the money continues to roll in. Furthermore, IBM has home-grown technology, and Big Blue has acquired smaller firms with particularly valuable technology; for example, CyberTap.

Maybe not in South Africa? Datawalk has strolled into the country’s key integrator and plopped itself down in the catbird seat.

Under the original i2 founders’ leadership, losing South Africa was not in the game plan. IBM may have misplaced the three-ring binder containing the basic strategy of i2 Ltd. To make matters worse, IBM could have asked its Watson (right, the super smart technology tackling cancer and breaking its digital ankle in a wild play) about the South African account.

Also affected are downstream, third-party products and services. Analyst’s Notebook has been available for more than two decades. There are training and support professionals like Tovek in Prague; there are add-ins; and there are enhancements like Sintelix which could be considered an out-and-out replacement. What happened?

If the information reported by ISB News is accurate, a company headquartered in Poland captured the account and the money. The article asserts that a key third party reseller doing business as SSG Group and its partner TechFINIUM (a Datawalk partner in South Africa) have stepped away from IBM and SAS. These are, in the view of DarkCyber, old school solutions.

John Smit, president of SSG Group, allegedly said of the Datawalk replacement:

“DataWalk is a powerful solution that will allow us to combine all data in one repository and then conduct detailed investigations. We often use unstructured data that we receive from our partners. DataWalk will provide us with the previously unattainable ability to view this data in the full context of our own databases.”

Datawalk is characterized as a solution that is “more suited to current challenges.”

According to the article:

DataWalk (formerly PiLab) is a technological entity that … connects billions of objects from many sources, finding application in forensic analytics in the public and financial sectors, including in the fight against crime (US agencies), scams (insurers) and fraud identification (central administration).

These are aggressive assertions. IBM may well ask Watson or maybe a human involved with Analyst’s Notebook sales, “What happened?”

Stephen E Arnold, April 21, 2020

Apple and Google: Teaming Up for a Super Great Reason?

April 21, 2020

In a remarkable virtue signaling action, Apple and Google joined forces to deal with coronavirus. The approach is not the invention of a remedy, although both companies have dabbled in health. The mechanism is surveillance-centric in the view of DarkCyber.

“Google Apple Contact Tracing (GACT): A Wolf in Sheep’s Clothes” provides an interesting opinion about the Google Apple Contact Tracing method. The idea seems to be that there are two wolves amongst the sheep. The sheep cooperate because that’s the nature of sheep. The wolves have the system, data, and methodology to make the sheep better. Are there other uses of the system? It is too soon to tell. But we can consider what the author asserts.

But the bigger picture is this: it creates a platform for contact tracing that works all across the globe for most modern smart phones (Android Marshmallow and up, and iOS 13 capable devices) across both OS platforms.

The write up states:

Whenever a user tests positive, the daily keys his or her devices used the last 14 days can be retrieved by the app through the GACT API, presumably only after an authorised request from the health authorities. How this exactly works, and in particular how a health authority gets authorised to sign such request or generate a valid confirmation code is not clear (yet). The assumption is that these keys are submitted to a central server set up by the contact tracing app. Other instances of the same app on other people’s phones are supposed to regularly poll this central server to see if new daily keys of phones of recently infected people have been uploaded. Another function in the GACT API allows the app to submit these daily keys to the operating system for analysis. The OS then uses these keys to derive all possible proximity identifiers from them, and compares each of these with the proximity identifiers it has stored in the database of identifiers recently received over Bluetooth. Whenever a match is found, the app is informed, and given the duration and time of contact (where the time may be rounded to daily intervals).
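
The derive-and-match loop described in the passage above can be sketched in a few lines of Python. This is a toy illustration of the general scheme only: the number of intervals per day, the 16-byte truncation, and the use of HMAC-SHA256 here are assumptions chosen for clarity, not the actual GACT key schedule.

```python
import hashlib
import hmac


def proximity_ids_for_day(daily_key: bytes, intervals: int = 144) -> set:
    """Derive one rotating proximity identifier per ten-minute
    interval of the day (144 intervals) from a single daily key."""
    return {
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    }


def check_exposure(infected_daily_keys, observed_ids) -> bool:
    """Return True if any identifier derived from an infected person's
    published daily keys matches an identifier heard over Bluetooth."""
    observed = set(observed_ids)
    return any(proximity_ids_for_day(k) & observed for k in infected_daily_keys)
```

The point of the construction is that each phone stores only the opaque identifiers it hears; those identifiers can be linked to an infected person, and to a time of contact, only after that person’s daily keys are published.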

The author includes this observation about the procedure:

Google and Apple announced they intend to release the API’s in May and build this functionality into the underlying platforms in the months to follow. This means that at some point in time operating system updates (through Google Play Services updates in the case of Android) will contain the new contact tracing code, ensuring that all users of a modern iPhone or Android smartphone will be tracked as soon as they accept the OS update. (Again, to be clear: this happens already even if you decide not to install a contact tracing app!) It is unclear yet how consent is handled, whether there will be OS settings allowing one to switch on or off contact tracing, what the default will be.

The write up concludes with this statement:

We have to trust Apple and Google to diligently perform this strict vetting of apps, to resist any coercion by governments, and to withstand the temptation of commercial exploitation of the data under their control. Remember: the data is collected by the operating system, whether we have an app installed or not. This is an awful amount of trust….

DarkCyber formulated several observations:

  1. The system appears to be more accessible than existing specialized services now available to some authorities.
  2. Apple’s and Google’s cooperation seems mature in terms of operational setup. When did work on this method begin?
  3. Systems operated by private companies on behalf of government agencies rely on existing legal and contractual methods to persist through time; that is, once funded or supported in a fungible manner, the programs operate in an increasingly seamless manner.

It is worth monitoring how this somewhat rapid and slightly interesting tag team duo defeats its opponent.

Stephen E Arnold, April 21, 2020

Intelware/Policeware Vendors Face Tough Choices and More Sales Pressure

April 20, 2020

The wild and crazy reports about the size of the lawful intercept market, the policeware market, and the intelware market may require some recalculation. Research and Markets is offering a for-fee report which explains the $8.8 billion lawful interception market. The problem is that the report was issued in March 2020, and it addresses neither changes in the financing of intelware and policeware companies nor the impact of the coronavirus matter. You can get more information about the report from this link.

As you know, it is 2020. Global investments have trended down. Estimates range from a few percent to double digits. Now there is news from Israel that the funding structures for high technology companies are not just sagging. The investors are seeking different paths and payoffs.

“Post Covid-19, Exits May Seem Like a Distant Dream But Exercising Options May Become Easier” states:

With Israeli tech companies having to cut employees’ salaries by up to 40%, many have turned to repricing stock options as a means of maintaining their talent.

Repricing means that valuations go down.

Gidi Shalom Bendor, founder and CEO of IBI Capital subsidiary S-Cube Financial Consulting, allegedly said:

You can see the valuations of public companies decreasing and can assume private companies are headed the same way,” Shalom Bendor said. Companies that are considering repricing have been around for several years and have a few dozen employees, so even though an exit is not around the corner for them it is still in sight, he explained. “In some cases, these companies have even had acquisition offers made, so options are a substantial issue.

Ayal Shenhav, head of the tech department at Israel-based firm GKH Law Offices, allegedly said:

Pay cuts and the repricing of options go hand in hand.

Let’s step back. What are the implications of repricing, if indeed it becomes a trend that reaches from Israel to Silicon Valley?

First, the long sales cycles for certain specialized software put more financial pressure on the vendors. Providing access to software is not burdensome. What is expensive is providing the professional support required for proof of concept, training, and system tuning. Larger companies like BAE Systems and Verint will have an advantage over smaller, possibly more flexible alternatives.

Second, the change in compensation is likely to hamper hiring and retaining employees. The work harder, work longer approach in some startups means that the payoffs have to be juicy. Without the tasty cash at the end of a 70 hour work week, the best and brightest may leave the startup and join a more established firm. Thus, innovation can be slowed.

Third, specialized service providers can flourish in regions/countries which operate with a different approach to funding. Stated simply, Chinese intelware and policeware vendors may be able to capture more customers in markets coveted by some Israeli and US companies.

These are major possibilities. Evidence of change can be discerned. In my DarkCyber video for April 14, 2020, I pointed out that Geospark Analytics was doing a podcast. That’s a marketing move of note as was the firm’s publicity about hiring a new female CEO, who was a US Army major, a former SAIC senior manager, and a familiar figure in some government agencies. LookingGlass issues a steady stream of publicity about its webinars. Recorded Future, since its purchase by Insight, has become more vocal in its marketing to the enterprise. The claims of cyber threat vendors about malware, hacks, and stolen data are flowing from companies once content with a zero profile approach to publicity.

Why?

Sales are being made, but according to the DarkCyber research team the deals are taking longer, have less generous terms, and require proofs of concept. Some police departments are particularly artful with proofs of concept and are able to tap some high value systems for their analysts through repeated proofs of concept.

To sum up, projections about the size of the lawful intercept, intelware, and policeware market will continue to be generated. But insiders know that the market is finite. Governments have to allocate funds, work with planning windows open for months if not a year or more, and then deal with unexpected demands. Example? The spike in coronavirus related fraud, misdirection of relief checks, and growing citizen unrest in some sectors.

Net net: The change in Israel’s financing, the uptick in marketing from what were once invisible firms, and the environment of the pandemic are disruptive factors. No quick resolution is in sight.

Stephen E Arnold, April 21, 2020

Wolfcom, Body Cameras, and Facial Recognition

April 5, 2020

Facial recognition is a controversial topic and is becoming more so as the technology advances. Top weapons and security companies will not go near facial recognition software due to the cans of worms it would open. Law enforcement agencies want these companies to add it. Wolfcom is actually adding facial recognition to its cameras. Techdirt has the scoop on the story, “Wolfcom Decides It Wants To Be The First US Body Cam Company To Add Facial Tech To Its Products.”

Wolfcom makes body cameras for law enforcement, and it wants to add facial recognition technology to its products. Currently Wolfcom is developing facial recognition for its newest body cam, Halo. Around 1,500 police departments have purchased Wolfcom’s body cameras.

If Wolfcom is successful with its facial recognition development, it would be the first company to have body cameras that use the technology. The technology is still in development according to Wolfcom’s marketing. Right now, its facial recognition technology rests on taking individuals’ photos, then matching them against a database. The specific database is not mentioned.

Wolfcom obviously wants to be an industry leader, but it is also being careful about not making false promises or drumming up bad publicity:

“About the only thing Wolfcom is doing right is not promising sky high accuracy rate for its unproven product when pitching it to government agencies. That’s the end of the “good” list. Agencies who have been asked to beta test the “live” facial recognition AI are being given free passes to use the software in the future, when (or if) it actually goes live. Right now, Wolfcom’s offering bears some resemblance to Clearview’s: an app-based search function that taps into whatever databases the company has access to. Except in this case, even less is known about the databases Wolfcom uses or if it’s using its own algorithm or simply licensing one from another purveyor.”

Wolfcom could eventually offer realtime facial recognition technology and that could affect some competitors.

Whitney Grace, April 5, 2020

Clearview: More Tradecraft Exposed

March 26, 2020

After years of dancing around the difference between brain dead products like enterprise search, content management, and predictive analytics, anyone can gain insight into the specialized software provided by generally low profile companies. Verint is publicly traded. Do you know what Verint does? Sure, look it up on Bing or Google.

I read with some discomfort “I Got My File From Clearview AI, and It Freaked Me Out.”

Here are some factoids from the write up. Are these true? DarkCyber assumes that everything the team sees on the Internet meets the highest standards of integrity, objectivity, and truthiness. DarkCyber’s comments are in italic:

  1. “Someone really has been monitoring nearly everything you post to the public internet. And they genuinely are doing “something” with it. The someone is Clearview AI. And the something is this: building a detailed profile about you from the photos you post online, making it searchable using only your face, and then selling it to government agencies and police departments who use it to help track you, identify your face in a crowd, and investigate you — even if you’ve been accused of no crime.”
  2. “Clearview AI was founded in 2017. It’s the brainchild of Australian entrepreneur Hoan Ton-That and former political aide Richard Schwartz. For several years, Clearview essentially operated in the shadows.”
  3. “The Times, not usually an institution prone to hyperbole, wrote that Clearview could “end privacy as we know it.” [This statement is a reference to a New York Times intelware article. The New York Times continues to hunt for real news that advances an agenda of “this stuff is terrible, horrible, unconstitutional, pro anything the NYT believes in, etc.”]
  4. “the company [Clearview] scrapes public images from the internet. These can come from news articles, public Facebook posts, social media profiles, or multiple other sources. Clearview has apparently slurped up more than 3 billion of these images.” [The images are those which are available on the Internet and possibly from other sources; for example, commercial content vendors.]
  5. “The images are then clustered together which allows the company to form a detailed, face-linked profile of nearly anyone who has published a picture of themselves online (or has had their face featured in a news story, a company website, a mug shot, or the like).” [This is called enrichment, context, or machine learning indexing and — heaven help DarkCyber — social graphs or semantic relationships. Jargon varies according to fashion trends.]
  6. “Clearview packages this database into an easy-to-query service (originally called Smartcheckr) and sells it to government agencies, police departments, and a handful of private companies….As of early 2020, the company had more than 2,200 customers using its service.” [DarkCyber wants to point out that law enforcement entities are strapped for cash, and many deals are little more than proofs-of-concept. Some departments cycle through policeware and intelware in order to know what the systems do versus what the marketing people say the systems do. Big difference? Yep, yep.]
  7. “Clearview’s clients can upload a photo of an unknown person to the system. This can be from a surveillance camera, an anonymous video posted online, or any other source.”
  8. “In a matter of seconds, Clearview locates the person in its database using only their face. It then provides their complete profile back to the client.”
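
The lookup described in items 7 and 8 amounts to a nearest-neighbor search over face embeddings: the query photo is converted to a numeric vector and compared against every stored vector. A minimal sketch, assuming a cosine-similarity metric and an arbitrary match threshold (Clearview’s actual algorithm, database, and thresholds are not public):

```python
import math


def cosine(a, b) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)


def identify(query_embedding, gallery: dict, threshold: float = 0.9):
    """Return the id of the best-matching stored face, or None if no
    stored embedding clears the similarity threshold."""
    best_id, best_score = None, threshold
    for person_id, embedding in gallery.items():
        score = cosine(query_embedding, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

In a production system the linear scan would be replaced by an approximate nearest-neighbor index, but the privacy-relevant point is the same: one photo is enough to key into the entire profile.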

Now let’s look at what the write up reported that seemed to DarkCyber to be edging closer to “real news.”

This is the report the author obtained:

[Image: the author’s Clearview profile report, redacted.]

The article reports that the individual who obtained this information from Clearview was surprised. DarkCyber noted this series of statements:

The depth and variety of data that Clearview has gathered on me is staggering. My profile contains, for example, a story published about me in my alma mater’s alumni magazine from 2012, and a follow-up article published a year later. It also includes a profile page from a Python coders’ meet up group that I had forgotten I belonged to, as well as a wide variety of posts from a personal blog my wife and I started just after getting married. The profile contains the URL of my Facebook page, as well as the names of several people with connections to me, including my faculty advisor and a family member (I have redacted their information and images in red prior to publishing my profile here).

The write up includes commentary on the service, its threats to individual privacy, and similar sentiments.

DarkCyber’s observations include:

  • Perhaps universities could include information about applications of math, statistics, and machine learning in their business and other courses? At a lecture DarkCyber gave at the University of Louisville in January 2019, cluelessness among students and faculty was the principal takeaway for the DarkCyber team.
  • Clearview’s technology is not unique, nor is it competitive with the integrated systems available from other specialized software vendors, based on information available to DarkCyber.
  • The summary of what Clearview does captures information that would have been considered classified and may still be considered classified in some countries.
  • Clearview does not appear to have video capability like other vendors with richer, more sophisticated technology.

Why did DarkCyber experience discomfort? Some information is not — at this time or in the present environment — suitable for wide dissemination. A good actor with technical expertise can become a bad actor because the systems and methods are presented in sufficient detail to enable certain activities. Knowledge is power, but knowledge in the hands of certain individuals can yield unexpected consequences. DarkCyber is old fashioned and plans to stay that way.

Stephen E Arnold, March 26, 2020

The Google: Geofence Misdirection a Consequence of Good Enough Analytics?

March 18, 2020

What a surprise—the use of Google tracking data by police nearly led to a false arrest, we’re told in the NBC News article, “Google Tracked his Bike Ride Past a Burglarized Home. That Made him a Suspect.” Last January, programmer and recreational cyclist Zachary McCoy received an email from Google informing him, as it does, that the cops had demanded information from his account. He had one week to try to block the release in court, yet McCoy had no idea what prompted the warrant. Writer Jon Schuppe reports:

“There was one clue. In the notice from Google was a case number. McCoy searched for it on the Gainesville Police Department’s website, and found a one-page investigation report on the burglary of an elderly woman’s home 10 months earlier. The crime had occurred less than a mile from the home that McCoy … shared with two others. Now McCoy was even more panicked and confused.”

After hearing of his plight, McCoy’s parents sprang for an attorney:

“The lawyer, Caleb Kenyon, dug around and learned that the notice had been prompted by a ‘geofence warrant,’ a police surveillance tool that casts a virtual dragnet over crime scenes, sweeping up Google location data — drawn from users’ GPS, Bluetooth, Wi-Fi and cellular connections — from everyone nearby. The warrants, which have increased dramatically in the past two years, can help police find potential suspects when they have no leads. They also scoop up data from people who have nothing to do with the crime, often without their knowing, which Google itself has described as ‘a significant incursion on privacy.’ Still confused, and very worried, McCoy examined his phone. An avid biker, he used an exercise-tracking app, RunKeeper, to record his rides.”

Aha! There was the source of the “suspicious” data—RunKeeper tapped into his Android phone’s location service and fed that information to Google. The records show that, on the day of the break-in, his exercise route had taken him past the victim’s house three times in an hour. Eventually, the lawyer was able to convince the police his client (still not unmasked by Google) was not the burglar. Perhaps ironically, it was RunKeeper data showing he had been biking past the victim’s house for months, not just proximate to the burglary, that removed suspicion.
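
Mechanically, a geofence warrant is a spatial and temporal filter over stored location pings: every device seen within a radius of the crime scene during a time window is returned. A simplified sketch, assuming a flat table of anonymized pings (the `Ping` structure and field names are illustrative, not Google’s actual schema):

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt


@dataclass(frozen=True)
class Ping:
    device_id: str  # anonymized until a follow-up request unmasks it
    lat: float
    lon: float
    ts: int         # Unix seconds


def haversine_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle distance between two points, in meters."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))


def geofence_hits(pings, lat, lon, radius_m, t_start, t_end) -> set:
    """Return the anonymized device ids seen inside the fence during the window."""
    return {
        p.device_id
        for p in pings
        if t_start <= p.ts <= t_end
        and haversine_m(p.lat, p.lon, lat, lon) <= radius_m
    }
```

The sketch makes the civil-rights problem concrete: the query says nothing about suspicion, so anyone who happened to pass through the fence, like a cyclist on his regular route, lands in the result set.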

Luck, and a good lawyer, were on McCoy’s side, but the larger civil rights issue looms large. Though such tracking data is anonymized until law enforcement finds something “suspicious,” this case illustrates how easy it can be to attract that attention. Do geofence warrants violate our protections against unreasonable searches? See the article for more discussion.

Cynthia Murrell, March 18, 2020

Banjo: A How To for Procedures Once Kept Secret

March 13, 2020

DarkCyber wrote about BlueDot and its making reasonably clear what steps it takes to derive actionable intelligence from open source and some other types of data. Ten years ago, the processes implemented by BlueDot would have been shrouded in secrecy.

From Secrets to Commercial Systems

Secret and classified information seems to find its way into social media and the mainstream media. DarkCyber noted another example of a company utilizing some interesting methods written up in a free online publication.

DarkCyber can visualize old-school companies depending on sales to law enforcement and the intelligence community asking themselves, “What’s going on? How are commercial firms getting this know how? Why are how to and do it yourself travel guides to intelligence methods becoming so darned public?”

It puzzles DarkCyber as well.

Let’s take a look at the revelations in “Surveillance Firm Banjo Used a Secret Company and Fake Apps to Scrape Social Media.” The write up explains:

  • A company called Pink Unicorn Labs created apps which obtained information from users. Users did not know their data were gathered, filtered, and cross correlated.
  • Banjo, an artificial intelligence firm that works with police, used a shadow company to create an array of Android and iOS apps that looked innocuous but were specifically designed to secretly scrape social media. The developer of the apps was Pink Unicorn. Banjo CEO Damien Patton created Pink Unicorn.
  • Why create apps that seemed to do one thing while performing data inhalation? Dataminr received an investment from Twitter. Dataminr has access to the Twitter fire hose. Banjo, the write up says, “did not have that sort of data access.” The fix? Create apps that sucked data.
  • The apps obtained information from Facebook, Twitter, Instagram, Russian social media app VK, FourSquare, Google Plus, and Chinese social network Sina Weibo.
  • The article points out: “Once users logged into the innocent looking apps via a social network OAuth provider, Banjo saved the login credentials, according to two former employees and an expert analysis of the apps performed by Kasra Rahjerdi, who has been an Android developer since the original Android project was launched. Banjo then scraped social media content.”
  • The write up explains, Banjo, via a deal with Utah, has access to the “state’s traffic, CCTV, and public safety cameras. Banjo promises to combine that input with a range of other data such as satellites and social media posts to create a system that it claims alerts law enforcement of crimes or events in real-time.”

Discussion

Why social media? On the surface and to most parents and casual users of Facebook, Twitter, and YouTube, there are quite a few cat posts. But via the magic of math, an analyst or a script can look for data which fills in missing information.

The idea is to create a record of a person, leave blanks where desirable information is not yet plugged in, and then rely on software to spot the missing item. How is this accomplished? The idea is simple. One known fact appears in the profile, and that fact appears in another unrelated item of content. Then the correlated item of content is scanned by a script, and any information missing from the profile is plugged in. Using this method and content from different sources, a clever system can compile a dossier on an entity.

Open source information yields numerous gems; for example, a cute name applied to a boyfriend might become part of a person of interest’s Dark Web handle. Phone numbers, geographic information, friends, and links to other interesting content surface. Scripts work through available data. Data can be obtained in many ways. The methods are those which were shrouded in secrecy before the Internet started publishing essays revealing what some have called “tradecraft.”
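
The fill-in-the-blanks method described above, link any record that shares a known fact with the profile, then copy its other fields into the blanks, can be sketched as a simple transitive merge. This is a toy illustration of the general idea, not Banjo’s actual system; the field names are invented for the example.

```python
def enrich(profile: dict, records: list) -> dict:
    """Fill blank fields in a profile by linking records from other
    sources on any shared known fact, then copying their fields in."""
    merged = dict(profile)
    known = {v for v in profile.values() if v}
    for record in records:
        # A record is linked if it shares at least one known fact.
        if known & {v for v in record.values() if v}:
            for field, value in record.items():
                if value and not merged.get(field):
                    merged[field] = value
            # Newly learned facts can link further records (transitivity).
            known |= {v for v in record.values() if v}
    return merged
```

Note the transitive step: a record that shares nothing with the original profile can still be pulled in once an intermediate record supplies the connecting fact, which is how a pet name from one source ends up attached to a handle from another.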

Net Net

Banjo troubles DarkCyber on a number of levels:

  1. Secrecy has significant benefits. Secrets, once let loose, have interesting consequences.
  2. Users are unaware of the risks apps pose. Cluelessness is in some cases problematic.
  3. The “now” world looks more like an intelligence agency than a social construct.

Stephen E Arnold, March 13, 2020

Sintelix Adds Unstructured Text to IBM i2 Solutions

March 12, 2020

DarkCyber noted that IBM is promoting the Sintelix text and data analytics software. The tie up makes it easier for i2 users to make sense of unstructured text. Sintelix does not compete with IBM. Sintelix has filled a gap in IBM’s presentation of the i2 solutions. For more information, navigate to this IBM page. No pricing details. Sintelix’s headquarters are in Australia.

Stephen E Arnold, March 12, 2020
