Telegram: Friendly Outfit for Russia, Other Places, Not So Much

August 18, 2020

There’s nothing like encrypted communications for bad actors. Some law enforcement and regulatory professionals are less enthusiastic. Russia has worked a deal with Telegram. Details about Telegram’s side of the bargain are sparse. Russia’s side of the deal is equally fuzzy. One might surmise a mechanism for accessing encrypted content. Is this possible? The answer depends on whom one asks.

Telegram, in its quest to remain in business, is, according to “Telegram Launches One-on-One Video Calls with End-to-End Encryption,” dimming the lights for some law enforcement and regulatory units. The write up reports:

The video calls on Telegram support picture-in-picture mode, so that people can check and reply to their messages while talking to a friend. The calls are also protected with end-to-end encryption, with the security confirmed by matching emojis on the screen on either end of the line. Telegram continues to work on more features and improvements for its video call offering, saying that it is working to launch group video calls in the coming months. The upcoming feature will allow the app to jump into the videoconferencing market, which has become more crucial as people stay at home amid the COVID-19 pandemic.
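The “matching emojis” check is a key-fingerprint comparison: both ends derive a short fingerprint from the shared call key and render it as pictures, so a man-in-the-middle who substituted keys would produce mismatched emojis. Below is a minimal sketch of the general idea, assuming a simple SHA-256 derivation and a hypothetical emoji table; Telegram’s actual key exchange and emoji mapping are its own.

```python
import hashlib

# Hypothetical emoji alphabet; Telegram's real mapping uses its own table.
EMOJI = ["😀", "🐱", "🚗", "🌻", "🎲", "🔑", "🍀", "📎",
         "🎈", "🐟", "🌙", "⚽", "🎹", "🧩", "🚲", "🕰️"]

def emoji_fingerprint(shared_key: bytes, count: int = 4) -> list[str]:
    """Derive a short emoji fingerprint from a shared session key.

    Both parties run this on the same key material; if the emojis on the
    two screens match, no man-in-the-middle has substituted keys.
    """
    digest = hashlib.sha256(shared_key).digest()
    return [EMOJI[b % len(EMOJI)] for b in digest[:count]]

# Example: both ends derive the same four emojis from the same key.
key = bytes.fromhex("a3f1c2d4e5b6a7f8091a2b3c4d5e6f70" * 2)
print(emoji_fingerprint(key))
```

The security of the check rests on the two humans actually comparing the pictures; nothing is sent to a third party for verification.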

What’s the big deal? Encrypted messaging poses a cost and time hurdle for government authorities. Bad actors find these encrypted services more useful than some other forms of information exchange. For example, the razzle dazzle of the Dark Web (despite its modest size in terms of sites and users) is losing ground to encrypted messaging services. And why not?

From a mobile device, encrypted messaging can replicate many of the more interesting facets of the Dark Web; for example:

  • Encryption and anonymity. Check.
  • In-app payment. Check.
  • Private groups. Check.
  • Social media functions. Semi check.

“Going dark” is no longer a bit of in-crowd jargon. It is a reality. And for Russian authorities, maybe not so dark.

Stephen E Arnold, August 18, 2020

Amazon Alexa Is Sensitive

August 17, 2020

The gimmick behind digital assistants is that, with a simple vocal command to a smart speaker like the Amazon Echo, users can access knowledge and complete small tasks. Amazon is either very clever or very clumsy when it comes to its digital assistant Alexa. Venture Beat covers a recent study in “Researchers Identify Dozens Of Words That Accidentally Trigger Amazon Echo Speakers.”

The problem with digital assistants, beyond their recording conversations and storing them in the cloud, is who will have access to these conversations. LeakyPick is a platform that investigates microphone-equipped devices and monitors network traffic for audio transmissions. It was developed by researchers at the University of Darmstadt, North Carolina State University, and the University of Paris-Saclay.

LeakyPick was designed to test when smart speakers are activated:

“LeakyPick — which the researchers claim is 94% accurate at detecting speech traffic — works for both devices that use a wakeword and those that don’t, like security cameras and smoke alarms. In the case of the former, it’s preconfigured to prefix probes with known wakewords and noises (e.g., “Alexa,” “Hey Google”), and on the network level, it looks for “bursting,” where microphone-enabled devices that don’t typically send much data cause increased network traffic. A statistical probing step serves to filter out cases where bursts result from non-audio transmissions.”
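The “bursting” heuristic the researchers describe is straightforward to picture in code. Here is a minimal sketch, assuming per-second outbound byte counts captured for one device and a hypothetical threshold; the real LeakyPick system adds wakeword probing and a statistical test on top of this.

```python
from statistics import median

def find_bursts(bytes_per_second: list[int], factor: float = 5.0) -> list[int]:
    """Flag seconds where a normally quiet device suddenly sends far more
    traffic than its typical level -- a hint that audio may be streaming out.

    bytes_per_second: outbound byte counts for one device, one entry per second.
    factor: hypothetical multiple of the baseline that counts as a burst.
    """
    baseline = median(bytes_per_second) or 1.0  # robust baseline; avoid zero
    return [i for i, b in enumerate(bytes_per_second) if b > factor * baseline]

# Example: a smart speaker idles around 200 B/s, then spikes after a probe word.
traffic = [200, 210, 190, 205, 200, 9500, 12000, 11800, 300, 210]
print(find_bursts(traffic))  # -> [5, 6, 7]
```

Watching traffic at the network level like this is what lets LeakyPick cover devices with no wakeword at all, such as cameras and smoke alarms.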

To identify words that could activate smart speakers, LeakyPick draws on all the words in a phoneme dictionary. After testing the Echo Dot, HomePod, and Google Home over fifty-two days, LeakyPick discovered that the Echo Dot reacted to eighty-nine words, some phonetically quite different from “Alexa.”

Amazon responded that it built privacy deep into Alexa and that smart speakers sometimes respond to words other than their wake words.

LeakyPick, however, does show potential for testing smart home privacy and for preventing accidental activations.

Whitney Grace, August 17, 2020

TikTok: Exploiting, Exploited, or Exploiter?

August 12, 2020

I read “TikTok Tracked Users’ Data with a Tactic Google Banned.” [Note: You will have to pay to view this article. Hey, The Murdoch outfit has to have a flow of money to offset its losses from some interesting properties, right?]

The write up reveals that TikTok, the baffler for those over 50, tricked users. Those lucky consumers of 30-second videos allegedly had one of their mobile device’s ID numbers sucked into the happy outfit’s data maw. Those ID numbers — unlike the other codes in mobile devices — cannot be changed. (At least, that’s the theory.)

What can one do with a permanent ID number? Let us count some of the things:

  1. Track a user
  2. Track a user
  3. Track a user
  4. Obtain information to pressure a susceptible person into taking an action otherwise not considered by that person?

I think that covers the use cases.

The write up states with non-phone-tap seriousness (phone tapping once being a business practice of one of the Murdoch progeny):

The identifiers collected by TikTok, called MAC address, are most commonly used for advertising purposes.

Whoa, Nellie. This here is real journalism. A MAC address is shorthand for “media access control.” I think of the MAC address as a number tattooed on a person’s forehead. Sure, it can be removed… mostly. But once a user watches 30-second videos and chases around for “real” information on a network, that unique number can be used to hook together otherwise disparate items of information. The MAC is similar to one of those hash codes which allow fast access to data in a relational structure or maybe an interest graph. One can answer the question, “What are the sites with this MAC address in log files?” The answer can be helpful to some individuals.
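To make the “hook together otherwise disparate items” point concrete, here is a minimal sketch of the kind of join an analyst could run across log files keyed on a device identifier. The records and field names are hypothetical; real correlation happens across far larger and messier data sets.

```python
from collections import defaultdict

# Hypothetical log records from different sources, each carrying a device ID.
app_logs = [
    {"mac": "3C:2E:FF:01:AB:CD", "app": "video_app", "video": "dance_clip_42"},
    {"mac": "3C:2E:FF:01:AB:CD", "app": "video_app", "video": "protest_clip_7"},
]
ad_logs = [
    {"mac": "3C:2E:FF:01:AB:CD", "site": "news-site.example", "topic": "politics"},
]
wifi_logs = [
    {"mac": "3C:2E:FF:01:AB:CD", "location": "cafe_downtown", "time": "2020-08-11T09:14"},
]

def build_profile(*log_sources):
    """Merge otherwise disparate records that share the same permanent MAC."""
    profile = defaultdict(list)
    for source in log_sources:
        for record in source:
            profile[record["mac"]].append({k: v for k, v in record.items() if k != "mac"})
    return dict(profile)

# One identifier now ties viewing habits, ad interests, and physical location together.
print(build_profile(app_logs, ad_logs, wifi_logs))
```

Because the MAC does not change, the profile keeps accumulating no matter how often the user resets an advertising ID.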

There are some issues bubbling beneath the nice surface of the Murdoch article; for example:

  1. Why did Google prohibit access to a MAC address, yet leave a method to access the MAC address available to those in the know? (Those in the know include certain specialized services supporting US government agencies, ByteDance, and just maybe Google. You know Google. That is the outfit which wants to create a global seismic system using every Android device whose owner gives permission to monitor earthquakes. Yep, is that permission really needed? Ho, ho, ho.)
  2. What vendors are providing MAC address correlations across mobile app content and advertising data? The WSJ is chasing some small fish who have visited these secret data chambers, but are there larger, more richly robust outfits in the game? (Yikes, that’s actually going to take more effort than calling a university professor who runs an advertising company as a side gig. Effort? Yes, not too popular among some “real” Murdoch reporters.)
  3. What are the use cases for interest graphs based on MAC address data? In this week’s DarkCyber video available on Facebook at this link, you can learn about one interesting application: Targeting an individual who is susceptible to outside influence to take an action that individual otherwise would not take. Sounds impossible, no? Sorry, possible, yes.

To summarize, interesting superficial coverage but deeper research was needed to steer the writing into useful territory and away from the WSJ’s tendency to drift closer to News of the World-type information. Bad TikTok, okay. Bad Google? Hmmmm.

Stephen E Arnold, August 12, 2020

Huawei and Its Sci-Fi Convenience Vision

July 9, 2020

One of the DarkCyber research team spotted what looked like a content marketing, rah rah article called “Huawei’s 1+8+N Strategy Will Be a Big Success in China As It Has No Competitors.”

We talked about the article this morning and dismissed its words as less helpful than most recycled PR. The gem in the write up is this diagram which was tough to read in the original. We poked around and came across a Huawei video which you can view on the Sparrow News Web site.

Here’s a version of the 1+8+N diagram. If you are trying to read the word “sphygmomanometer,” it means blood pressure gizmo. The term is shorthand for “smart medical devices.”

[Image: Huawei’s 1+8+N strategy diagram]

The idea is that the smartphone is the de facto surveillance device. It provides tags for the device itself and a “phone number” for the device owner. Burner phones registered to sock puppets require extra hoops, and government authorities are going to come calling when the identity of the burner phone’s owner is determined via cross correlation of metadata.

The diagram has three parts, right? Sort of. First, the “plus” sign in the 1+8+N is Huawei itself. Think of Huawei as the Ma Bell, just definitely very cozy with the Chinese government. The “plus” means glue. The glue unites or fuses the data from the little icons.

The focal point of the strategy is the individual.

From the individual, the diagram fans out to non-phone computing devices. There are nine devices identified, but more can be added. These nine devices connected to an individual are all smart; that is, Internet of Things, mobile aware, surveillance centric, and related network connected products.

The 1

The “1” refers to the smartphone.

The 8

The “8” refers to the smart devices an individual uses. (The smartphone interacts with these eight devices either directly or indirectly as long as there is battery or electrical power.)

  • Augmented / virtual reality “glasses”
  • Earphones
  • Personal computers
  • Speakers
  • Tablets
  • Televisions
  • Watches
  • Vehicles

The connection between and among the devices is enabled by Huawei HiLink or mobile WiFi, although Bluetooth and other wireless technologies are an option.

The N

The N, like the math symbol, refers to any number of ecologies. An ecology could be a person riding in a vehicle, watching a presentation displayed by a connected projector, a smart printer, a separate but modern smart camera, a Chinese Roomba-type robot, a smart scale for weighing a mobile phone owner, a medical device connected to or embedded in an individual, a device streaming a video, a video game played on a device or online, or a digital map.

These use cases cluster; for example, mobile, smart home, physical health, entertainment, and travel. Other categories can, of course, be added.
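One hedged way to see why the model matters for analysis is to treat 1+8+N as a data structure: one individual and phone at the root, eight always-on device categories hanging off it, and an open-ended set of ecosystem devices feeding in more signals. The sketch below uses hypothetical field names and signal types; Huawei publishes no such schema.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    category: str          # e.g. "watch", "vehicle", "smart scale"
    signals: list          # the kinds of data it can report

@dataclass
class OnePlusEightPlusN:
    owner_id: str                      # the individual, keyed to the phone number
    phone: Device                      # the "1": the de facto surveillance hub
    core_devices: list = field(default_factory=list)   # the "8"
    ecosystem: list = field(default_factory=list)      # the "N"

    def all_signals(self) -> list:
        """Everything the hub can fuse about one person, across every device."""
        devices = [self.phone, *self.core_devices, *self.ecosystem]
        return [s for d in devices for s in d.signals]

profile = OnePlusEightPlusN(
    owner_id="user-001",
    phone=Device("smartphone", ["location", "contacts", "app usage"]),
    core_devices=[Device("watch", ["heart rate"]), Device("vehicle", ["routes"])],
    ecosystem=[Device("smart scale", ["weight"])],
)
print(profile.all_signals())
```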

Is 1+8+N the 21st Century E=MC^2?

Possibly. What is clear is that Huawei has done a very good job of mapping out the details of the Chinese intelligence and surveillance strategy. By extension, one can view the diagram as one that could be similar to those developed by the governments of Iran, North Korea, Russia, and a number of other nation states.

The smartphone delivers on its potential in the 1+8+N diagram, if the Huawei vision gets traction.

Observations

The 1+8+N equation has been around since 2019. Its resurfacing may have more to do with Huawei’s desire to be quite clear about what its phones and other products and services can deliver.

The company uses the phrase “full scene” instead of the American jargon of a 360 degree view.

Neither phrase captures the import of data in multiple dimensions. Tracking and analyzing data through time enables a number of interesting dependent features, services, and functions.

The 1+8+N may be less about math and more about intelligence than some of the write ups about the diagram suggest.

Stephen E Arnold, July 9, 2020

Do It Huawei, Please

July 9, 2020

Believe it or not.

Huawei is a mobile device brand not well known in the United States, but it provides Android-based devices to millions of consumers in the eastern hemisphere. Huawei devices are manufactured in China, and in May the company held its seventeenth annual analyst summit. Ameyaw Debrah shares the story in the article, “Huawei Analyst Summit: Security And Privacy In A Seamless AI Life-Only You Control Your Personal Data.”

The Vice President of Consumer Cloud Services, Eric Tan, delivered a keynote speech called “Rethink the Seamless AI Experience with the Global HMS Ecosystem,” which covered Huawei’s privacy and security practices across the cloud, hardware, application development, and global certifications. Tan stated that Huawei abides by GDPR, GAPP, and local laws to guarantee privacy compliance.

Another speaker, Dr. Wang Chenglu, spoke about “Software-Powered, Seamless AI Experiences and Ecosystems.” He explained how distributed security builds trust between people, data, and devices to protect user privacy and data:

“He explained that firstly, ensure that users are using the correct devices to process data and Huawei has developed a comprehensive security and privacy management system that covers smart phone chips, kernels, EMUI, and applications. This allows devices to establish trusted connections and transfer data based on end-to-end encryption.

Secondly, ensure the right people are accessing data and operating services via the distributed security architecture which makes coordinated, multi-device authentication possible. An authentication capability resource pool is established by combining the hardware capabilities of different devices. The system provides the best security authentication measures based on authentication requests and security level requirements in different business scenarios.”
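The “authentication capability resource pool” idea reads as: gather whatever authenticators nearby devices can offer, then pick one that satisfies the security level the requesting scenario demands. Here is a minimal sketch of that selection logic with hypothetical device names and strength scores; Huawei does not publish the actual algorithm.

```python
from typing import Optional

# Hypothetical strength ranking for authentication methods nearby devices expose.
AUTH_STRENGTH = {"pin": 1, "fingerprint": 2, "face_3d": 3, "face_3d+watch_presence": 4}

def pick_authenticator(available: dict, required_level: int) -> Optional[str]:
    """Choose a method from the capability pool that meets the business
    scenario's security requirement.

    available: maps device name -> authentication method it can perform.
    required_level: minimum acceptable strength for this operation.
    """
    candidates = sorted(available.items(), key=lambda kv: AUTH_STRENGTH[kv[1]])
    for device, method in candidates:
        if AUTH_STRENGTH[method] >= required_level:
            return f"{method} on {device}"
    return None  # no nearby device can satisfy the requirement

pool = {"phone": "face_3d", "watch": "pin", "tablet": "fingerprint"}
print(pick_authenticator(pool, required_level=2))   # a payment-style scenario
print(pick_authenticator(pool, required_level=4))   # nothing in the pool is strong enough
```

In this reading, the notable design choice is that the pool spans devices, so one gadget on the user can vouch for an operation started on another.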

Huawei stressed that privacy and security are its MO, but can one believe that “only you control your private life” when a country-supported company is coding up a storm?

Whitney Grace, July 9, 2020

Neeva: To the Rescue?

July 2, 2020

After the 2017 scandal involving YouTube ads, Google’s head of advertising left the company. However, Sridhar Ramaswamy was not finished with search; he promised then to find another way that did not depend on ads. Now, from Ars Technica’s article “Search Engine Startup Asks Users to Be the Customer, not the Product,” we learn that subscription service Neeva is that promised approach. Not only does paying to search through Neeva allow one to avoid ads, the platform vows to respect user privacy as well.

There are just a couple of fundamental problems. First, will enough users actually pay to search when they are used to Googling for free? Critics suspect most users will opt to accept ads over paying a fee. As for the privacy promise, we already have (ad-supported) privacy-centric search platforms DuckDuckGo and Startpage. Besides, though the “Digital Bill of Rights” that dominates Neeva’s About page sounds nice, the official Privacy Policy linked in the site’s footer prompts doubt. Reporter Jim Salter writes:

“Neeva opens that section by saying it does not share, disclose, or sell your personal information with third parties ‘outside of the necessary cases below’—but those necessary cases include ‘Affiliates,’ with the very brusque statement that Neeva ‘may share personal information with our affiliated companies.’ Although the subsections on both Service Providers and Advertising Partners are hedged with usage limitations, there are no such limits given for data shared with ‘Affiliates.’ The document also provides no concrete definition of who the term ‘Affiliates’ might refer to, or in what context.”

We noted:

“More security-conscious users should also be aware of Neeva’s Data Retention policy, which simply states ‘we store the personal information we receive as described in this Privacy Policy for as long as you use our Services or as necessary to fulfill the purposes for which it was collected… [including pursuit of] legitimate business purposes.’ Given that the data collection may include direct connection to a user’s primary Google or Microsoft email account, this might amount to a truly unsettling volume of personal data—data that is now vulnerable to compromise of Neeva’s services, as well as use or sale (particularly in the case of acquisition or merger) by Neeva itself.”

Neeva is currently in beta testing, but anyone still interested can sign up to be an early tester via the waitlist at the bottom of this blog post. Though Neeva has yet to set a price for its subscription, we’re told it should be under $10 per month.

Cynthia Murrell, July 2, 2020

Smartphones: Surveillance Facilitated?

May 22, 2020

A recent study published in the Journal of Marketing suggests we tend to reveal more about ourselves when we communicate through our smartphones than when we are on our desktops. The research was performed at the University of Pennsylvania by Shiri Melumad and Robert Meyer. Scienmag explores the tendency in, “Why Smartphones Are Digital Truth Serum.” We learn:

“For example, Tweets and reviews composed on smartphones are more likely to be written from the perspective of the first person, to disclose negative emotions, and to discuss the writer’s private family and personal friends. Likewise, when consumers receive an online ad that requests personal information (such as phone number and income), they are more likely to provide it when the request is received on their smartphone compared to their desktop or laptop computer.”

But why would we do this? For one thing, users seem to be subconsciously affected by the challenges inherent in using a smaller device:

“[The smaller size] makes viewing and creating content generally more difficult compared with desktop computers. Because of this difficulty, when writing or responding on a smartphone, a person tends to narrowly focus on completing the task and become less cognizant of external factors that would normally inhibit self-disclosure, such as concerns about what others would do with the information.”

Then there is the fact that most of us keep our phones on our person or near us constantly—they have become a modern comfort item (or “adult pacifiers,” as Melumad puts it). The article explains:

“The downstream effect of those feelings shows itself when people are more willing to disclose feelings to a close friend compared to a stranger or open up to a therapist in a comfortable rather than uncomfortable setting. As Meyer says, ‘Similarly, when writing on our phones, we tend to feel that we are in a comfortable “safe zone.” As a consequence, we are more willing to open up about ourselves.’”

The researchers analyzed thousands of social media posts and online reviews, responses to web ads, and controlled laboratory studies using both natural-language processing and human analysts. They also examined responses to nearly 20,000 “call to action” web ads that asked users for private info—such ads deployed on smartphones were consistently more successful at raking in personal data than those aimed at PCs. So consumers beware—do not give in to the tendency to get too chummy with those on the other end of your phone just because you are comfortable with the phone itself.

Cynthia Murrell, May 22, 2020

Survey Says, Make the Content Go Away, Please

May 19, 2020

TechRadar states the obvious—“Want to Remove Information About Yourself Online? You’re Not Alone.” The write-up cites a recent Kaspersky survey of over 15,000 respondents. It confirms people are finally taking notice that their personal data has been making its way across the Web. The findings show a high percentage of Internet users have tried to erase personal information online, and for good reason, but many have met with little success. Writer Mike Moore reports:

“Four in five people (82 percent) surveyed in a major study by Kaspersky said they had tried to remove private information which had been publicly available, either from websites or social media channels, recently. However a third (37 percent) of those surveyed had no idea of how to remove details about themselves online. … [The survey] found that over a third (34 percent) of consumers have faced incidents where their private information was accessed by someone who did not have their consent. Of these incidents, over a quarter (29 percent) resulted in financial losses and emotional distress, and more than a third (35 percent) saw someone able to gain access to personal devices without permission. This rises to 39 percent among those aged between 25 and 34, despite younger internet users often being expected to have higher levels of technological literacy. Overall, one in five people say they are concerned about the personal data that organizations are collecting about them and their loved ones.”

The standard recommendation to protect privacy in the first place has been to use a VPN, but even that may be inadequate. A study performed by TechRadar Pro found that nearly half of all VPN services are based in countries that are part of the Fourteen Eyes international surveillance alliance. Looking for alternatives? Moore shares this link to a TechRadar article on what they say are the most secure VPN providers.

Cynthia Murrell, May 19, 2020

Work from Home: Trust but Use Monitoring Software

May 19, 2020

As the COVID-19 pandemic keeps offices closed and employees continue to work from home, bosses want to be sure their subordinates are working. According to the Washington Post article “Managers Turn To Surveillance Software, Always-On Webcams To Ensure Employees Are (Really) Working From Home,” bosses are “replicating the office” with webcams, microphones, and surveillance software.

Harking back to the chatrooms of yesteryear, employees log into digital work spaces with customizable avatars and chatroom cubicles, with instructions to keep webcams and microphones on all day. The digital workspace designed by Pragli is meant to encourage spontaneous conversation. Some adapt quickly to the technology change; others have difficulty.

While some companies do not replicate the office with such programs, they are using other tools such as always-on webcams, check-ins, and mandatory digital meetings. There is the concern that companies are being invasive:

“Company leaders say the systems are built to boost productivity and make the quiet isolation of remote work more chipper, connected and fun. But some workers said all of this new corporate surveillance has further blurred the lines between their work and personal lives, amping up their stress and exhaustion at a time when few feel they have the standing to push back.”

Since COVID-19 forced the American workforce into quarantine, companies want to confirm their workers’ productivity and report on how they are spending their business hours. There has also been an increase in the amount of time Americans spend working each day.

InterGuard is software that can be hidden on computers and creates a log of everything a worker did during the day. The software records what a worker does as frequently as every five seconds. It ranks apps and Web sites as “productive” or “unproductive,” then tallies a “productivity score.”
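The scoring mechanics described are simple to approximate. Here is a minimal sketch, assuming five-second app-in-focus samples and a hypothetical productive/unproductive classification; InterGuard’s actual lists and weighting are configurable by the employer and not public.

```python
# Hypothetical classification; the vendor's real lists are configurable per employer.
PRODUCTIVE = {"outlook", "excel", "vscode", "salesforce"}
UNPRODUCTIVE = {"youtube", "twitter", "solitaire"}

def productivity_score(samples: list) -> float:
    """Turn a day's worth of app-in-focus samples (one every five seconds)
    into a 0-100 score: share of classified time spent in 'productive' apps."""
    productive = sum(1 for app in samples if app in PRODUCTIVE)
    unproductive = sum(1 for app in samples if app in UNPRODUCTIVE)
    classified = productive + unproductive
    if classified == 0:
        return 0.0
    return round(100 * productive / classified, 1)

# Example: 400 samples in Excel, 100 on YouTube, 300 in VS Code.
day = ["excel"] * 400 + ["youtube"] * 100 + ["vscode"] * 300
print(productivity_score(day))  # -> 87.5
```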

Many employees do not like the surveillance software, saying the need to prove they are actually working disrupts their work flow. Pragli, on the other hand, says the replication of human interaction brings employees closer and allows them to connect more frequently.

A new meaning for the phrase “trust but verify.”

Whitney Grace, May 19, 2020

Sensors and Surveillance: A Marriage Made in Sci Fi

May 4, 2020

We can expect the volume of data available for analyses, tracking, and monitoring to skyrocket. EurekaAlert!, a site operated by the American Association for the Advancement of Science, reports, “Tiny Sensors Fit 30,000 to a Penny, Transmit Data from Living Tissue.” The project out of the Cornell Center for Materials Research was described in the team’s paper, published in PNAS on April 16. The optical wireless integrated circuits (OWICs) are a mere 100 microns in size. The news release explains:

“[The sensors] are equipped with an integrated circuit, solar cells and light-emitting diodes (LEDs) that enable them to harness light for power and communication. And because they are mass fabricated, with up to 1 million sitting on an 8-inch wafer, each device costs a fraction of that same penny. The sensors can be used to measure inputs like voltage and temperature in hard-to-reach environments, such as inside living tissue and micro fluidic systems. For example, when rigged with a neural sensor, they would be able to noninvasively record nerve signals in the body and transmit findings by blinking a coded signal via the LED. … The OWICS are essentially paramecium-size smartphones that can be specialized with apps. But rather than rely on cumbersome radio frequency technology, as cell phones do, the researchers looked to light as a potential power source and communication medium.”
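“Blinking a coded signal via the LED” amounts to serializing a measurement into on/off pulses that an external photodetector or camera can read back. Here is a minimal sketch of one possible framing and encoding; the layout below is a hypothetical illustration, not the modulation scheme from the paper.

```python
def encode_reading(reading: int, bits: int = 10) -> list:
    """Encode a sensor reading as a framed on/off pulse train for an LED.

    Hypothetical frame layout: start marker 1,1, then the reading as
    fixed-width binary (MSB first), then stop marker 0,0.
    """
    if not 0 <= reading < 2 ** bits:
        raise ValueError("reading out of range for the chosen bit width")
    payload = [(reading >> i) & 1 for i in range(bits - 1, -1, -1)]
    return [1, 1] + payload + [0, 0]

def decode_reading(pulses: list, bits: int = 10) -> int:
    """Recover the reading from a received pulse train (inverse of encode_reading)."""
    payload = pulses[2:2 + bits]
    return sum(bit << (bits - 1 - i) for i, bit in enumerate(payload))

reading = 642  # e.g. a nerve-signal level in arbitrary units
pulses = encode_reading(reading)
assert decode_reading(pulses) == reading
print(pulses)
```

A real implementation would add error detection and clock recovery, but the principle of using light as both the power source and the uplink is what lets the devices skip radio hardware entirely.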

The researchers have already formed a company, OWiC Technologies, to market the sensors and have applied for a patent. The first planned application is a line of e-tags for product identification. The write-up predicts many different uses will follow for these micro sensors that can track more complicated data with less power for fewer dollars. Stay tuned.

Cynthia Murrell, May 4, 2020
