Listening to Mobile Calls: Maybe? Maybe Not

August 18, 2020

An online publication called Hitb.org has published “Hackers Can Eavesdrop on Mobile Calls with $7,000 Worth of Equipment.” Law enforcement and other government entities often pay more for equipment which performs similar functions. Maybe $7,000 is a bargain, assuming the technology works and does not lead to an immediate visit from government authorities.

According to the write up, you can listen to mobile calls using a method called “ReVoLTE,” a play on VoLTE, the voice service that runs over LTE (long term evolution) cellular technology. The article reports:

Now, researchers have demonstrated a weakness that allows attackers with modest resources to eavesdrop on calls. Their technique, dubbed ReVoLTE, uses a software-defined radio to pull the signal a carrier’s base station transmits to a phone of an attacker’s choosing, as long as the attacker is connected to the same cell tower (typically within a few hundred meters to few kilometers) and knows the phone number. Because of an error in the way many carriers implement VoLTE, the attack converts cryptographically scrambled data into unencrypted sound. The result is a threat to the privacy of a growing segment of cell phone users. The cost: about $7,000.
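The “error” the excerpt mentions is, per the published ReVoLTE research, keystream reuse: some base stations encrypt two consecutive VoLTE calls on the same radio connection with the same keystream. A toy sketch (pure Python, invented byte strings, a random stand-in for the real LTE counter-mode cipher) shows why that is fatal:

```python
# Toy demonstration of the keystream-reuse flaw behind ReVoLTE.
# Real VoLTE encrypts radio frames with a counter-mode cipher; a random
# keystream stands in here. The audio bytes are invented for illustration.
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

keystream = os.urandom(32)  # the bug: the base station reuses this

victim_audio = b"secret call audio............"    # hypothetical plaintext
attacker_audio = b"known audio from second call."  # attacker's own call

c_victim = xor(victim_audio, keystream)    # recorded over the air
c_attacker = xor(attacker_audio, keystream)

# The attacker knows their own plaintext, so the keystream falls out...
recovered_ks = xor(c_attacker, attacker_audio)
# ...and decrypts the victim's call.
recovered = xor(c_victim, recovered_ks)
print(recovered == victim_audio)  # True
```

In the attack, the eavesdropper simply places a second call to the victim over the same connection, records both encrypted streams, and the known plaintext of the second call hands over the keystream for the first.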

Ah, ha, a catch. One has to be a researcher, which implies access to low-cost, highly motivated students eager to get an A. Also, the word “researcher” makes it clear that one cannot order the needed equipment with one click on Amazon’s ecommerce site.

How realistic is this $7,000 claim? DarkCyber thinks that a person interested in gaining access to mobile calls may want to stay in school. Caltech or Georgia Tech may be institutions to consider. Then, after getting an appropriate degree, work for one of the specialized services firms developing software and hardware for law enforcement.

On the other hand, if you can build these devices in your bedroom, why not skip school and contact one of the enforcement agencies in the US or elsewhere? DarkCyber has a suggestion: unlawful intercept can lead to some interesting learning experiences with government authorities. Too bad similar enforcement does not kick in for misleading headlines on articles which contain fluff. That sounds like I am pointing out flaws in Silicon Valley-style reporting. Okay, okay, I am.

Stephen E Arnold, August 18, 2020

TikTok: Exploiting, Exploited, or Exploiter?

August 12, 2020

I read “TikTok Tracked Users’ Data with a Tactic Google Banned.” [Note: You will have to pay to view this article. Hey, The Murdoch outfit has to have a flow of money to offset its losses from some interesting properties, right?]

The write up reveals that TikTok, the baffler for those over 50, tricked users. Those lucky consumers of 30-second videos allegedly had one of their mobile devices’ ID numbers sucked into the happy outfit’s data maw. Those ID numbers — unlike the other codes in mobile devices — cannot be changed. (At least, that’s the theory.)

What can one do with a permanent ID number? Let us count some of the things:

  1. Track a user
  2. Track a user
  3. Track a user
  4. Obtain information to pressure a susceptible person into taking an action otherwise not considered by that person?

I think that covers the use cases.

The write up states, with non-phone-tap seriousness (phone tapping being a business practice of one of the Murdoch progeny):

The identifiers collected by TikTok, called MAC address, are most commonly used for advertising purposes.

Whoa, Nellie. This here is real journalism. “MAC” is shorthand for “media access control.” I think of the MAC address as a number tattooed on a person’s forehead. Sure, it can be removed… mostly. But once a user watches 30-second videos and chases around for “real” information on a network, that unique number can be used to hook together otherwise disparate items of information. The MAC is similar to one of those hash codes which allow fast access to data in a relational structure or maybe an interest graph. One can answer the question, “What are the sites with this MAC address in log files?” The answer can be helpful to some individuals.
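The correlation trick described above (a stable hardware ID used as a join key across otherwise unrelated logs) can be sketched in a few lines of Python; the MAC values and log entries are invented for illustration:

```python
# Sketch: a stable hardware ID lets disparate logs be joined into one profile.
# All MAC values and log entries below are invented for illustration.
from collections import defaultdict

app_log = [
    {"mac": "a4:5e:60:01:02:03", "event": "watched_video", "topic": "politics"},
    {"mac": "dc:a6:32:aa:bb:cc", "event": "watched_video", "topic": "cooking"},
]
ad_log = [
    {"mac": "a4:5e:60:01:02:03", "ad": "campaign_x", "clicked": True},
]

profiles = defaultdict(list)
for entry in app_log + ad_log:
    profiles[entry["mac"]].append({k: v for k, v in entry.items() if k != "mac"})

# One device, one merged dossier -- exactly what a resettable ID would prevent.
print(len(profiles["a4:5e:60:01:02:03"]))  # 2
```

The same join works across any data sets that log the identifier, which is why a permanent ID is so much more valuable than a resettable advertising ID.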

There are some issues bubbling beneath the nice surface of the Murdoch article; for example:

  1. Why did Google prohibit access to a MAC address, yet leave a method to access the MAC address available to those in the know? (Those in the know include certain specialized services firms supporting US government agencies, ByteDance, and just maybe Google. You know Google. That is the outfit which wants to create a global seismic system using every Android device whose owner gives permission to monitor earthquakes. Yep, is that permission really needed? Ho, ho, ho.)
  2. What vendors are providing MAC address correlations across mobile app content and advertising data? The WSJ is chasing some small fish who have visited these secret data chambers, but are there larger, more robust outfits in the game? (Yikes, that’s actually going to take more effort than calling a university professor who runs an advertising company as a side gig. Effort? Yes, not too popular among some “real” Murdoch reporters.)
  3. What are the use cases for interest graphs based on MAC address data? In this week’s DarkCyber video available on Facebook at this link, you can learn about one interesting application: Targeting an individual who is susceptible to outside influence to take an action that individual otherwise would not take. Sounds impossible, no? Sorry, possible, yes.

To summarize, interesting superficial coverage but deeper research was needed to steer the writing into useful territory and away from the WSJ’s tendency to drift closer to News of the World-type information. Bad TikTok, okay. Bad Google? Hmmmm.

Stephen E Arnold, August 12, 2020

More about India App Banning

July 23, 2020

India and China are not likely to hold a fiesta to celebrate the digital revolution in the next month or two. “Government Said to Ask Makers of 59 Banned Chinese Apps to Ensure Strict Compliance” explains that India has some firm ideas about the potential risks of Chinese-centric and Chinese-developed mobile applications. The risks include actions “prejudicial to sovereignty, integrity and security of the country.”

The write up states:

If any app in the banned list is found to be made available by the company through any means for use within India, directly or indirectly, it would be construed as a violation of the government orders…

It is not clear what action the Indian government can take, but obviously the issue is perceived as important; specifically, the accusation relates to the:

stealing and surreptitiously transmitting users’ data in an unauthorized manner to servers which have locations outside India.

Among the nearly 60 banned apps are:

  • Club Factory
  • TikTok
  • UC Browser
  • WeChat
  • Xiaomi

Plus, some less high profile services:

  • Bigo Live
  • CamScanner
  • Helo
  • Likee
  • Shein

There will be workarounds, of course. It is not clear what the consequences will be if a citizen persists in using a Xiaomi phone and its baked-in apps (some of which route interesting information through data centers in Singapore).

Censorship of the Internet is thriving and becoming an active measure in India and other countries. Why? Because Internet, of course.

Stephen E Arnold, July 23, 2020

Do It Huawei, Please

July 9, 2020

Believe it or not.

Huawei is a mobile device brand not well known in the United States, but it provides an Android-based device to millions of consumers in the eastern hemisphere. Huawei devices are manufactured in China, and in May the company held its seventeenth annual analyst summit. Ameyaw Debrah shares the story in the article, “Huawei Analyst Summit: Security And Privacy In A Seamless AI Life-Only You Control Your Personal Data.”

The Vice President of Consumer Cloud Services, Eric Tan, delivered a keynote speech called “Rethink the Seamless AI Experience with the Global HMS Ecosystem,” covering Huawei’s privacy and security practices across the cloud, hardware, application development, and global certifications. Tan stated that Huawei abides by GDPR, GAPP, and local laws to guarantee privacy compliance.

Another speaker, Dr. Wang Chenglu, spoke about “Software-Powered, Seamless AI Experiences and Ecosystems.” He explained how distributed security builds trust between people, data, and devices to protect user privacy and data:

“He explained that firstly, ensure that users are using the correct devices to process data and Huawei has developed a comprehensive security and privacy management system that covers smart phone chips, kernels, EMUI, and applications. This allows devices to establish trusted connections and transfer data based on end-to-end encryption.

Secondly, ensure the right people are accessing data and operating services via the distributed security architecture which makes coordinated, multi-device authentication possible. An authentication capability resource pool is established by combining the hardware capabilities of different devices. The system provides the best security authentication measures based on authentication requests and security level requirements in different business scenarios.”

Huawei stressed that privacy and security are its MO, but can one believe that “only you control your private life” when a country-supported company is coding up a storm?

Whitney Grace, July 9, 2020

Geospatial: Context and Opinions

June 24, 2020

DarkCyber spotted a sequence of tweets published by that well managed, completely coherent, and remarkable outfit Twitter. Twitter disseminated brief emissions from Joe Morrison who uses the handle “mouth of Morrison.” Love that Twitter thing!

The write up in Quibi style chunks is about geospatial technology. As it turns out, mobile devices and smart gizmos output geographic coordinates. These are useful to many.

The observations in the stream of tweets explain that geospatial is mostly a bad idea. DarkCyber says, “Ho, ho, ho.”

Two warrant highlighting, but you may find other faves in the list.

Let’s begin:

The most successful and ambitious mapping project of all time, Google Maps, is an advertising platform. There is no “geospatial industry,” only industries with spatial problems.

Yep, the Google. Nevertheless, one must give the GOOG credit for buying Keyhole, morphing an intelligence operation into a cog in ad sales, and then building a large scale geospatial data vacuum cleaner. Remember the comment about capturing Wi-Fi data: “Wow, no idea how that happened”? Does that help you jog down memory lane?

The second emission we noted is:

In geo, you either die a hero or live long enough to make the majority of your revenue from defense and intelligence.

This is sort of accurate. Including law enforcement might be a more accurate characterization of where the money is, however.

These earthworm emissions are amusing; for example, “ESRI is a petty, anti competitive bully”. Are any lawyers paying attention? Also, big companies use open source software and don’t give back. No kidding? Ever hear of code cost reduction?

Worth a look. More context, explanation, and details would add some muscle to the tweeter bones.

Stephen E Arnold, June 24, 2020

Mobile Security Is Possible, But It Is Work

June 10, 2020

Ads are a pain on desktop devices, but they are even more annoying on mobile devices. The worst ads are the ones where the X is hidden, making them impossible to close. Mobile ads are only getting worse as mobile devices become SOP, and IT-Online shares more insight in “Mobile Adware: The Silent Plague With No Origin.”

The article focuses on research from Check Point’s Cyber Security Report 2020, and the insights are alarming. According to the report, 27% of companies experienced a security breach through a mobile device. What is even worse is that most companies do not prioritize mobile security, making mobile devices the most vulnerable area. Check Point’s regional director stated:

“ ‘It only takes one compromised mobile device for cybercriminals to steal confidential information and access an organisation’s corporate network,’ explains Pankaj Bhula, Regional Director: Africa at Check Point. ‘More and more mobile threats are created each day, with higher levels of sophistication and larger success rates. Mobile adware, a form of malware designed to display unwanted advertisements on a user’s screen, is utilised by cybercriminals to execute sixth-generation cyberattacks.’”

Adware is like a plague because it can be secretly downloaded onto a phone and collect a user’s personal information, from location data to banking details. Adware is designed to sneak onto a phone, and deleting it is harder than finding the X on an annoying ad. Adware sneaks onto mobile devices through applications, usually via a device’s app store.

It is smart advice not to download third-party apps from unverified companies, especially ones promoted in ads or with low download rates. Do not trust anything without researching it first.

Whitney Grace, June 10, 2020

Smartphones: Surveillance Facilitated?

May 22, 2020

A recent study published in the Journal of Marketing suggests we tend to reveal more about ourselves when we communicate through our smartphones than when we are on our desktops. The research was performed at the University of Pennsylvania by Shiri Melumad and Robert Meyer. Scienmag explores the tendency in, “Why Smartphones Are Digital Truth Serum.” We learn:

“For example, Tweets and reviews composed on smartphones are more likely to be written from the perspective of the first person, to disclose negative emotions, and to discuss the writer’s private family and personal friends. Likewise, when consumers receive an online ad that requests personal information (such as phone number and income), they are more likely to provide it when the request is received on their smartphone compared to their desktop or laptop computer.”

But why would we do this? For one thing, users seem to be subconsciously affected by the challenges inherent in using a smaller device:

“[The smaller size] makes viewing and creating content generally more difficult compared with desktop computers. Because of this difficulty, when writing or responding on a smartphone, a person tends to narrowly focus on completing the task and become less cognizant of external factors that would normally inhibit self-disclosure, such as concerns about what others would do with the information.”

Then there is the fact that most of us keep our phones on our person or near us constantly—they have become a modern comfort item (or “adult pacifiers,” as Melumad puts it). The article explains:

“The downstream effect of those feelings shows itself when people are more willing to disclose feelings to a close friend compared to a stranger or open up to a therapist in a comfortable rather than uncomfortable setting. As Meyer says, ‘Similarly, when writing on our phones, we tend to feel that we are in a comfortable “safe zone.” As a consequence, we are more willing to open up about ourselves.’”

The researchers analyzed thousands of social media posts and online reviews, responses to web ads, and controlled laboratory studies using both natural-language processing and human analysts. They also examined responses to nearly 20,000 “call to action” web ads that asked users for private info—such ads deployed on smartphones were consistently more successful at raking in personal data than those aimed at PCs. So consumers beware—do not give in to the tendency to get too chummy with those on the other end of your phone just because you are comfortable with the phone itself.

Cynthia Murrell, May 22, 2020

Google Apple Contact Tracing Interface

May 9, 2020

Now Toronto published “Here’s What Apple and Google’s COVID-19 Contact Tracing App Looks Like.” The article includes sample screenshots and some explanation about the data displayed. Worth a look. Much more is possible in terms of tracking, contact mapping, and analytics, of course. Who or what will have access to these more useful views of the collected data?
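For context, the published Apple/Google Exposure Notification design keeps the matching on the device: phones broadcast short-lived identifiers derived from a daily key, and only the daily keys of confirmed cases are ever published. A simplified sketch, substituting HMAC-SHA256 for the spec’s actual HKDF/AES derivation:

```python
# Simplified sketch of rotating proximity identifiers, loosely modeled on
# the Apple/Google Exposure Notification idea. The real spec derives IDs
# with HKDF and AES; HMAC-SHA256 stands in here for brevity.
import hmac, hashlib, os

def rolling_ids(daily_key: bytes, intervals: int = 144):
    # One ID per ~10-minute interval; without the daily key, the IDs
    # reveal nothing about one another, so bystanders cannot track a phone.
    return [hmac.new(daily_key, i.to_bytes(2, "big"),
                     hashlib.sha256).digest()[:16]
            for i in range(intervals)]

# Phone A broadcasts its IDs; phone B merely records what it hears.
key_a = os.urandom(16)
heard_by_b = {rolling_ids(key_a)[37]}  # B overheard one beacon

# Later, A tests positive and publishes key_a. B re-derives A's IDs
# locally and checks for overlap -- no central server sees the contacts.
matches = heard_by_b & set(rolling_ids(key_a))
print(len(matches))  # 1
```

The richer tracking, contact mapping, and analytics mentioned above would require data the protocol deliberately withholds, which is exactly why the access question matters.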

Stephen E Arnold, May 9, 2020

Cambridge Analytica Alum: Social Media Is Like Bad, You Know

April 4, 2020

A voice of (in)experience describes how tech companies can be dangerous when left unchecked. Channel News Asia reports, “Tech Must Be Regulated Like Tobacco, says Cambridge Analytica Whistleblower.” Christopher Wylie is the data scientist who exposed Cambridge Analytica’s use of Facebook data to manipulate the 2016 presidential election, among others. He declares society has yet to learn the lesson of that scandal. Yes, Facebook was fined a substantial sum, but it and other tech giants continue to operate with little to no oversight. The article states:

“Wylie details in his book how personality profiles mined from Facebook were weaponised to ‘radicalise’ individuals through psychographic profiling and targeting techniques. So great is their potential power over society and people’s lives that tech professionals need to be subject to the same codes of ethics as doctors and lawyers, he told AFP as his book was published in France. ‘Profiling work that we were doing to look at who was most vulnerable to being radicalised … was used to identify people in the US who were susceptible to radicalisation so that they could be encouraged and catalysed on that path,’ he said. ‘You are being intentionally monitored so that your unique biases, your anxieties, your weaknesses, your needs, your desires can be quantified in such a way that a company can seek to exploit that for profit,’ said the 30-year-old. Wylie, who blew the whistle to British newspaper, The Guardian, in Mar 2018, said at least people now realise how powerful data can be.”

As in any industry, tech companies are made up of humans, some of whom are willing to put money over morality. And as in other consequential industries like construction, engineering, medicine, and law, Wylie argues, regulations are required to protect consumers from that which they do not understand.

Cynthia Murrell, April 4, 2020

NSO: Back in the News Again

April 3, 2020

Let’s assume that the Beeb is on the money. “Coronavirus: Israeli Spyware Firm Pitches to Be Covid 19 Saviors” is a bit of British snark. First, the word “coronavirus” is newsy, and it is clickbait. Second, “Israeli spyware pitches” converts the use of specialized software into a carnival barker’s shout. (One might ask, “Why?” I think I know the answer. The British Cervantes is on the gallop perhaps?)

The point of the story, which contains some loaded words like “controversial,” is that NSO has technology which can assist governments in gathering useful information about the virus. The write up states, after the Beeb explains that Facebook and NSO are in a legal wrestling match:

NSO says its employees will not have access to any data, but its software will work best if a government asks local mobile phone operators to provide the records of every subscriber in the country. Each person known to be infected with Covid-19 could then be tracked, with the people they had met and the places they had visited, even before showing symptoms, plotted on a map.

Scary, ominous, Orwellian, something that British government agencies would never, ever in a million years consider.

The reality is that monitoring a population is happening in quite a few countries. Perhaps even merrie olde Land of the Angles?
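The tracking the Beeb excerpt describes reduces to a co-location query over carrier records. A minimal sketch, with invented records of (subscriber, cell tower, hour):

```python
# Sketch: finding contacts of an infected subscriber from carrier records.
# The (subscriber, cell_tower, hour) records are invented for illustration.
records = [
    ("patient_0", "tower_12", 9),
    ("alice",     "tower_12", 9),   # same tower, same hour -> contact
    ("bob",       "tower_07", 9),   # different tower
    ("carol",     "tower_12", 14),  # same tower, different hour
]

infected = "patient_0"
infected_slots = {(tower, hour)
                  for who, tower, hour in records if who == infected}

contacts = {who for who, tower, hour in records
            if who != infected and (tower, hour) in infected_slots}
print(contacts)  # {'alice'}
```

Note that this requires every subscriber’s records, which is precisely the ask attributed to NSO’s pitch.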

A news story is okay. Shading the coverage to advance the agenda “NSO is just not such a fine piece of British wool” is unsettling — possibly more so than specialized service firms’ software.

Stephen E Arnold, April 3, 2020
