Surveillance Footage Has Value

September 10, 2020

It is not a secret that Google, Facebook, Apple, Instagram, and other large technology companies gather user data and sell it to the highest bidder. It is an easy way to pad their bottom line, especially when users freely give away this information. The Russian city of Moscow wants to add more revenue to the city’s coffers, so it came up with an ingenious way to get more cash, says Yahoo Finance in “Moscow May Sell Footage From Public Security Cameras: Report.”

According to the report, Moscow’s tech branch plans to broadcast videos captured on cameras in public areas. Technically, at least within the United States, if you are in a public place, you are free to be filmed, and whoever does the filming can do whatever they want with the footage. Russia must be acting on the same principle, so Moscow’s Department of Information Technologies purchased cameras to install outside 539 hospitals. It might also be a way to increase security.

All of the footage will be stored in a central database, and people will be able to purchase it. The footage will also be shown on the Internet.

What is alarming is that MBK Media wrote in December 2019 that footage from Moscow’s street cameras was available for purchase on black markets, with options to access individual cameras or an entire system of cameras. This fact is scarier, however:

“The same department organized the blockchain-based electronic voting in Moscow and one more Russian region this summer when Russians voted to amend the country’s constitution. The voting process was criticized for the weak data protection.”

Moscow wants more ways to keep track of citizens in public areas, and it wants to make some quick rubles off the process. Companies in the US do the same thing, and so does the government.

Whitney Grace, September 10, 2020


Oh, Oh, Millennials Want Their Words and Services Enhanced. Okay, Done!

September 9, 2020

A couple of amusing items caught my attention this morning. The first is Amazon’s alleged demand that a Silicon Valley real news outlet modify its word choice.


The Bezos bulldozer affects the social environment. The trillion horsepower Prime machine wants to make sure that its low cost gizmos are not identified with surveillance. Why is that? Perhaps because those gizmos’ microphones, arrays, and assorted software designed to deal with voices in far corners perform surveillance? DarkCyber does not know. The solution? Amazon = surveillance. Now any word will do, right?

The second item is mentioned in “Microsoft Confirms Why Windows Defender Can’t Be Disabled via Registry.” The idea is that Microsoft’s system is now becoming Bob’s mom. You remember Bob, don’t you? User controls? Ho ho ho.

The third item is a rib tickler. You worry about censorship for text and videos, don’t you? Now you can worry about Google’s new user-centric ability to filter your phone calls. That’s a howler. What if the call is from a person taking Google to court? Filtered. This benefits everyone. You can get the allegedly full story in “Google New Verified Calls Feature Will Tell You Why a Business Is Calling You.” Helpful.

Each of these examples amuses me. Shall we complain about Chinese surveillance apps?

These outfits are extending their perimeters as far as possible before the ever-vigilant, lobbyist-influenced political animals begin the great monopoly game.

Stephen E Arnold, September 9, 2020

Consumer Control of Personal Data: Too Late, Chums

September 3, 2020

“The Economics of Social Data” is an interesting write up by a Yale graduate student (how much time did you put into this work, Tan Gan?), a Yale professor (George Bush’s stomping grounds), and an MIT professor (yes, the outfit that accepted money from an alleged human trafficker and then stumbled through truth thickets).

What did these esteemed individuals discover? I like this sentence:

Platforms focus on ensuring consumers’ control over their individual data. Regulators hope that ownership and control over one’s own data will result in appropriate compensation for the data one chooses to reveal. However, economists need to consider the social aspect of data collection. Because an individual user’s data is predictive of the behavior of others, individual data is in practice social data. The social nature of data leads to an externality: an individual’s purchase on Amazon, for example, will convey information about the likelihood of purchasing a certain product among other consumers with similar purchase histories.
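The externality described in that passage can be pictured with a toy similarity calculation: one user’s disclosed purchases let a platform make predictions about similar users who disclosed nothing. The names, purchases, and scoring below are invented for illustration and are not from the paper.

```python
# Toy illustration of the "social data" externality: alice's revealed purchases
# are informative about bob, who has a similar history but revealed nothing new.
purchase_history = {
    "alice": {"hiking boots", "tent", "camp stove"},
    "bob":   {"hiking boots", "tent"},        # similar history to alice
    "carol": {"lipstick", "paperback novel"}, # dissimilar history
}

def similarity(a: set, b: set) -> float:
    """Jaccard overlap between two purchase sets."""
    return len(a & b) / len(a | b)

# Alice buys a sleeping bag. Whose future behavior does that purchase predict?
for user, history in purchase_history.items():
    if user != "alice":
        print(user, round(similarity(purchase_history["alice"], history), 2))
# bob scores 0.67, carol 0.0: alice's choice "leaks" a prediction about bob
# even though bob never disclosed any interest in camping gear.
```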

Does this imply that a light bulb has flickered to life in the research cubbies of these influential scholars? Let’s grind forward:

While consumers can experience positive externalities, such as real-time traffic information, very little curbs the platform from trading data for profit in ways that harm consumers. Therefore, data ownership is insufficient to bring about the efficient use of information, since arbitrarily small levels of compensation can induce a consumer to relinquish her personal data.

Remember. I reside in rural Kentucky and most of my acquaintances go barefoot or wear work boots. It seems that after decades of non-regulation, governmental hand waving, and sitting on the porch watching monopolies thrive — a problem?

The fix? Here you go:

In terms of policy implications, our results on the aggregation of consumer information suggest that privacy regulation must move away from concerns over personalized prices at the individual level. Most often, firms do not set prices in response to individual-level characteristics. Instead, segmentation of consumers occurs at the group level (e.g. as in the case of Uber) or at the temporal and spatial levels (e.g. Staples, Amazon). Thus, our analysis points to the significant welfare effects of group-based price discrimination and of uniform prices that react in real time to changes in market-level demand.

Translation: Too late, chums.

Stephen E Arnold, September 3, 2020

Bad Actors Rejoice: Purrito Is Kitten with Claws

September 1, 2020

The Internet has taught us many things about people, particularly tech geeks. Technology geeks love challenging themselves with hacking tricks, possess off-base senses of humor, and love their fur babies. They particularly love cats.

A “purrito” is a term coined by the animal rescue community for tiny kittens swaddled in tiny blankets, like burritos. It goes without saying that purritos are adorable.

It is also not surprising that potential bad actors, who love cats, would purloin the Purrito for an “ultra fast, minimalistic, encrypted command line paste-bin.” Purrito Bin even uses characters to make a tabby kitten face: (=???=).

Reading through the instructions for Purrito shows the developer made it even cuter by calling the standard client “meow” and a companion client “purr.” Purrito Bin is a simple way to encrypt files:

“In an encrypted storage setting, the paste is encrypted before sending it to the server.

Now the server will only be used as a storage bin and even in case of a non-https connection, you are guaranteed that no one else will be able to read the data that you have sent.

How does it work?

Steps automatically done by the provided clients, on the client side:

• Randomly generate an encryption key.

• Encrypt your data using said key, the encrypted data is called the cipher.

• Send the cipher to PurritoBin and get a standard paste url as above, which will be converted to the form”
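A minimal sketch of that client-side flow, assuming a generic paste server: the server URL, upload call, and response handling below are placeholders for illustration, not PurritoBin’s or meow’s actual interface.

```python
# Sketch of the encrypted-paste flow described above: generate a random key,
# encrypt locally, and ship only the ciphertext to the storage server.
import requests                         # assumes the requests package is installed
from cryptography.fernet import Fernet  # assumes the cryptography package

PASTE_SERVER = "https://paste.example.com"  # placeholder, not the real service

def encrypted_paste(plaintext: bytes) -> tuple[str, bytes]:
    key = Fernet.generate_key()              # 1. randomly generate an encryption key
    cipher = Fernet(key).encrypt(plaintext)  # 2. encrypt; the result is the "cipher"
    response = requests.post(PASTE_SERVER, data=cipher)  # 3. send only the cipher
    paste_url = response.text.strip()        # server replies with a standard paste URL
    return paste_url, key                    # share the URL plus the key out of band

# Even over a non-HTTPS connection, the server only ever sees ciphertext;
# without the key it is just a storage bin, which is the guarantee quoted above.
```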

The concept of Purrito Bin is itself genius, but is it a good idea for it to be posted publicly where bad actors can use it?

Whitney Grace, September 1, 2020

Another Data Marketplace: Amazon, Microsoft, Oracle, or Other Provider for This Construct?

August 31, 2020

The European Union is making a sharp U-turn on data privacy, we learn from MIT Technology Review’s article, “The EU Is Launching a Market for Personal Data. Here’s What That Means for Privacy.” The EU has historically protected its citizens’ online privacy with vigor, fighting tooth and nail against the commercial exploitation of private information. As of February, though, the European Commission has decided on a completely different data strategy (PDF). Reporter Anna Artyushina writes:

“The Trusts Project, the first initiative put forth by the new EU policies, will be implemented by 2022. With a €7 million [8.3 million USD] budget, it will set up a pan-European pool of personal and nonpersonal information that should become a one-stop shop for businesses and governments looking to access citizens’ information. Global technology companies will not be allowed to store or move Europeans’ data. Instead, they will be required to access it via the trusts. Citizens will collect ‘data dividends,’ which haven’t been clearly defined but could include monetary or nonmonetary payments from companies that use their personal data. With the EU’s roughly 500 million citizens poised to become data sources, the trusts will create the world’s largest data market. For citizens, this means the data created by them and about them will be held in public servers and managed by data trusts. The European Commission envisions the trusts as a way to help European businesses and governments reuse and extract value from the massive amounts of data produced across the region, and to help European citizens benefit from their information.”

It seems shifty that they have yet to determine just how citizens will benefit from this data exploitation, I mean, value-extraction. There is no guarantee people will have any control over their information, and there is currently no way to opt out. This change is likely to ripple around the world, as the way the EU approaches data regulation has long served as an example to other countries.

The concept of data trusts has been around since 2018, when Sir Tim Berners-Lee proposed it. Such a trust could be for-profit, for a charitable cause, or simply for data storage and protection. As Artyushina notes, whether this particular trust actually protects citizens depends on the wording of its charter and the composition of its board of directors. See the article for examples of other trusts gone wrong, as well as possible solutions. Let us hope this project is set up and managed in a way that puts citizens first.

Cynthia Murrell, August 31, 2020

OnionFruit Revamps With New Browser Version

August 26, 2020

Remaining anonymous online is impossible, especially with all the cookies we “eat.” Instead of an all-cookie diet, try using a browser made from onions and fruit! Major Geeks revealed its latest harvest with an update to the popular TOR connection tool OnionFruit Connect, version 2020.730.0.

TOR browsers work because they encrypt a user’s browsing data in many security layers, like an onion. In order to identify the user, one has to peel back layers of encrypted data. This makes hacking someone with a TOR browser tedious and extremely difficult. TOR browsers also allow people to connect to the Dark Web, which uses encrypted and random web addresses.

OnionFruit guarantees its users are protected:

“Having the ability to use a browser that you are already comfortable with makes using TOR more of a seamless process. OnionFruit Connect will initiate the TOR service and then configures your proxy settings allowing your apps to be routed through TOR’s tunnel. You will be notified that you’re protected, confirming that all your internet traffic is being passed through the TOR tunnel safely encrypted. This process ensures that every single site you visit gets routed through multiple servers to help mask your actions, making them difficult to track.”
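What “configures your proxy settings” amounts to can be sketched in a few lines: point an application at a local TOR SOCKS listener. The port below is the TOR client’s conventional default (9050); OnionFruit’s own configuration may differ, and the snippet assumes the requests package (with PySocks) is installed.

```python
# Route one application's traffic through a locally running TOR client by
# pointing its proxy settings at the SOCKS listener (default 127.0.0.1:9050).
import requests  # needs requests[socks] (PySocks) for SOCKS proxy support

TOR_SOCKS = "socks5h://127.0.0.1:9050"  # socks5h also resolves DNS inside TOR

session = requests.Session()
session.proxies = {"http": TOR_SOCKS, "https": TOR_SOCKS}

# Requests made through this session hop across multiple relays, so the
# destination site sees an exit relay's address instead of the user's own.
resp = session.get("https://check.torproject.org/")
print(resp.status_code)
```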

OnionFruit is simple to set up on a computer and makes it easy to access the TOR network. The best thing is that it works with favored browsers (Chrome, Firefox, Edge, Opera, and others) without extra configuration. OnionFruit updates itself, offers custom landing pages, and includes a download speed monitor.

It is an easy way to encrypt Web browsing and also learn more about the TOR network.

Whitney Grace, August 26, 2020

Surprising Google Data

August 20, 2020

DarkCyber is not sure if these data are accurate. We have had some interesting interactions with NordVPN, and we are skeptical about this outfit. Nevertheless, let’s look beyond a dicey transaction with the NordVPN outfit and focus on the data in “When Looking for a VPN, Chinese Citizens Search for Google.”

The article asserts:

New research by NordVPN reveals that when looking for VPN services on Baidu, the local equivalent of Google, the Chinese are mostly trying to get access to Google – in fact, 40,35% of all VPN service-related searches have to do with Google. YouTube comes second on the list, accounting for 31,58% of all searches. Other research by NordVPN has shown that YouTube holds the most desired restricted content, with 82,7% of Internet users worldwide searching for how to unblock this video sharing platform.

If valid, these data suggest that Google’s market magnetism is powerful. Perhaps a type of quantum search entanglement?

Stephen E Arnold, August 20, 2020

Telegram: Friendly Outfit for Russia, Other Places, Not So Much

August 18, 2020

There’s nothing like encrypted communications for bad actors. Some law enforcement and regulatory professionals are less enthusiastic. Russia has worked a deal with Telegram. Details about what Telegram’s side of the bargain includes are sparse. Russia’s side of the deal is equally fuzzy. One might surmise a mechanism for accessing encrypted content. Is this possible? The answer depends on whom one asks.

Telegram, in its quest to remain in business, is dimming the lights for some law enforcement and regulatory units, according to “Telegram Launches One-on-One Video Calls with End-to-End Encryption.” The write up reports:

The video calls on Telegram support picture-in-picture mode, so that people can check and reply to their messages while talking to a friend. The calls are also protected with end-to-end encryption, with the security confirmed by matching emojis on the screen on either end of the line. Telegram continues to work on more features and improvements for its video call offering, saying that it is working to launch group video calls in the coming months. The upcoming feature will allow the app to jump into the videoconferencing market, which has become more crucial as people stay at home amid the COVID-19 pandemic.
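The “matching emojis” check can be pictured as a short fingerprint of the negotiated call key, rendered as symbols both parties compare. The digest scheme and symbol table below are invented for illustration; they are not Telegram’s actual derivation.

```python
# Toy fingerprint check: both ends hash the same negotiated call key and
# compare the resulting symbols; a mismatch suggests a man in the middle
# swapped keys. Hash choice and symbol alphabet are illustrative only.
import hashlib

SYMBOLS = ["cat", "dog", "fox", "panda", "frog", "octopus", "turtle", "owl"]

def key_fingerprint(shared_key: bytes, count: int = 4) -> list[str]:
    digest = hashlib.sha256(shared_key).digest()
    return [SYMBOLS[b % len(SYMBOLS)] for b in digest[:count]]

# Both callers run this on the key their clients negotiated; the four symbols
# (emojis, in Telegram's UI) should match on both screens.
print(key_fingerprint(b"example-negotiated-call-key"))
```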

What’s the big deal? Encrypted messaging poses a cost and time hurdle for government authorities. Bad actors find these encrypted services more useful than some other forms of information exchange. For example, the razzle dazzle of the Dark Web (despite its modest size in terms of sites and users) is losing ground to encrypted messaging services. And why not?

From a mobile device, encrypted messaging can replicate many of the more interesting facets of the Dark Web; for example:

  • Encryption and anonymity. Check.
  • In-app payment. Check.
  • Private groups. Check.
  • Social media functions. Semi check.

“Going dark” is no longer a bit of in-crowd jargon. It is a reality. And for Russian authorities, maybe not so dark.

Stephen E Arnold, August 18, 2020

Amazon Alexa Is Sensitive

August 17, 2020

The gimmick behind digital assistants is that, with a simple vocal command to a smart speaker like the Amazon Echo, users have access to knowledge and can complete small tasks. Amazon is either very clever or very clumsy when it comes to its digital assistant Alexa. Venture Beat reports on a recent study in “Researchers Identify Dozens of Words That Accidentally Trigger Amazon Echo Speakers.”

The problem with digital assistants, other than their recording conversations and storing them in the cloud, is who will have access to these conversations. LeakyPick is a platform that investigates microphone-equipped devices and monitors network traffic for audio transmissions. LeakyPick was developed by researchers at the University of Darmstadt, North Carolina State University, and the University of Paris-Saclay.

LeakyPick was designed to test when smart speakers are activated:

“LeakyPick — which the researchers claim is 94% accurate at detecting speech traffic — works for both devices that use a wakeword and those that don’t, like security cameras and smoke alarms. In the case of the former, it’s preconfigured to prefix probes with known wakewords and noises (e.g., “Alexa,” “Hey Google”), and on the network level, it looks for “bursting,” where microphone-enabled devices that don’t typically send much data cause increased network traffic. A statistical probing step serves to filter out cases where bursts result from non-audio transmissions.”
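The “bursting” heuristic in that description can be sketched as a simple outlier test on a device’s outbound traffic after an audio probe is played. The thresholds, traffic figures, and function below are assumptions for illustration, not LeakyPick’s actual code.

```python
# Flag a microphone-equipped device whose outbound traffic spikes well above
# its idle baseline right after a probe word is played near it.
from statistics import mean, stdev

def looks_like_audio_upload(baseline_bytes: list[int],
                            post_probe_bytes: int,
                            sigma: float = 3.0) -> bool:
    """True if post-probe traffic is a statistical outlier versus the baseline."""
    return post_probe_bytes > mean(baseline_bytes) + sigma * stdev(baseline_bytes)

# Invented example: a smart speaker idles at roughly 2 KB per interval, then
# pushes about 250 KB after the probe word, which looks like an audio upload.
idle_traffic = [2_000, 1_800, 2_200, 1_900, 2_100]
print(looks_like_audio_upload(idle_traffic, 250_000))  # True -> possible activation
```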

To identify words that could activate smart speakers, LeakyPick uses all the words in a phoneme dictionary. After testing smart speakers (an Echo Dot, a HomePod, and a Google Home) over fifty-two days, LeakyPick discovered that the Echo Dot reacted to eighty-nine words, some phonetically quite different from “Alexa.”

Amazon responded that it built privacy deep into Alexa and that smart speakers sometimes respond to words other than their wake words.

LeakyPick, however, does show potential for testing smart home devices’ privacy and for figuring out how to prevent these accidental activations.

Whitney Grace, August 17, 2020

TikTok: Exploiting, Exploited, or Exploiter?

August 12, 2020

I read “TikTok Tracked Users’ Data with a Tactic Google Banned.” [Note: You will have to pay to view this article. Hey, the Murdoch outfit has to have a flow of money to offset its losses from some interesting properties, right?]

The write up reveals that TikTok, the baffler for those over 50, tricked users. Those lucky consumers of 30-second videos allegedly had one of their mobile devices’ ID numbers sucked into the happy outfit’s data maw. Those ID numbers — unlike the other codes in mobile devices — cannot be changed. (At least, that’s the theory.)

What can one do with a permanent ID number? Let us count some of the things:

  1. Track a user
  2. Track a user
  3. Track a user
  4. Obtain information to pressure a susceptible person into taking an action otherwise not considered by that person?

I think that covers the use cases.

The write up states, with non-phone-tap seriousness (phone tapping being a business practice of one of the Murdoch progeny):

The identifiers collected by TikTok, called MAC addresses, are most commonly used for advertising purposes.

Whoa, Nellie. This here is real journalism. “MAC” is shorthand for “media access control.” I think of the MAC address as a number tattooed on a person’s forehead. Sure, it can be removed… mostly. But once a user watches 30-second videos and chases around for “real” information on a network, that unique number can be used to hook together otherwise disparate items of information. The MAC is similar to one of those hash keys that allow fast access to data in a relational structure or maybe an interest graph. One can answer the question, “What are the sites with this MAC address in log files?” The answer can be helpful to some individuals.
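That “hook together” step is just a join on a stable key. A toy sketch: the log records and field names below are invented for illustration only.

```python
# Group otherwise unrelated log records on the stable MAC address field to
# answer "which sites appear with this identifier?" All records are invented.
from collections import defaultdict

logs = [
    {"mac": "3D:F2:C9:A6:B3:4F", "site": "video-app.example", "day": "2020-08-01"},
    {"mac": "98:01:A7:11:22:33", "site": "shop.example",      "day": "2020-08-02"},
    {"mac": "3D:F2:C9:A6:B3:4F", "site": "news-site.example", "day": "2020-08-03"},
]

profiles = defaultdict(list)
for record in logs:
    profiles[record["mac"]].append((record["day"], record["site"]))

# Because the MAC never changes, every visit sharing it collapses into one
# profile, which is exactly what makes the identifier valuable for tracking.
for mac, visits in profiles.items():
    print(mac, sorted(visits))
```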

There are some issues bubbling beneath the nice surface of the Murdoch article; for example:

  1. Why did Google prohibit access to a MAC address, yet leave a method to access the MAC address available to those in the know? (Those in the know include certain specialized services that support US government agencies, ByteDance, and just maybe Google. You know Google. That is the outfit which wants to create a global seismic system using every Android device whose owner gives permission to monitor earthquakes. Yep, is that permission really needed? Ho, ho, ho.)
  2. What vendors are providing MAC address correlations across mobile app content and advertising data? The WSJ is chasing some small fish who have visited these secret data chambers, but are there larger, more richly robust outfits in the game? (Yikes, that’s actually going to take more effort than calling a university professor who runs an advertising company as a side gig. Effort? Yes, not too popular among some “real” Murdoch reporters.)
  3. What are the use cases for interest graphs based on MAC address data? In this week’s DarkCyber video available on Facebook at this link, you can learn about one interesting application: Targeting an individual who is susceptible to outside influence to take an action that individual otherwise would not take. Sounds impossible, no? Sorry, possible, yes.

To summarize: interesting but superficial coverage. Deeper research was needed to steer the writing into useful territory and away from the WSJ’s tendency to drift closer to News of the World-type information. Bad TikTok, okay. Bad Google? Hmmmm.

Stephen E Arnold, August 12, 2020
