Cybersecurity Lapse: Lawyers and Technology

October 29, 2020

“Hackers Steal Personal Data of Google Employees after Breaching US Law Firm” is an interesting article. First, it reminds the reader that security at law firms may not be a core competency. Second, it makes clear that personal information can be captured in order to learn about every elected official’s favorite company, Google.

The write up states:

Fragomen, Del Rey, Bernsen & Loewy LLP, a law firm that offers employment verification compliance services to Google in the United States, suffered unauthorized access into its computer systems in September that resulted in hackers accessing the personal information of present and former Google employees.

The article quotes a cybersecurity professional from ImmuniWeb. I learned:

According to Ilia Kolochenko, Founder & CEO of ImmuniWeb, the fact that hackers targeted a law firm that stores a large amount of data associated with present and former Google employees is not surprising as law firms possess a great wealth of the most confidential and sensitive data of their wealthy or politically-exposed clients, and habitually cannot afford the same state-of-the-art level of cybersecurity as the original data owners.

This law firm, according to the write up, handles not just the luminary Google. It also does work for Lady Gaga and Run DMC.

Stephen E Arnold, October 29, 2020

Music and Moods: Research Verifies the Obvious

October 21, 2020

It has been proven that music can have positive or negative psychological impacts on people. Following this train of research, Business Line reports that playlists are a better reflection of mood than once thought, “Your Playlist Mirrors Your Mood, Confirms IIIT-Hyderabad Study.”

The newest study on music and its effect on mood, titled “Tag2risk: Harnessing Social Music Tags For Characterizing Depression Risk” and covering more than 500 individuals, comes from the International Institute of Information Technology in Hyderabad (IIIT-H). The study discovered that people who listen to sad music can be thrown into depression. Vinoo Alluri and her students from IIIT-H’s cognitive science department investigated whether they could identify music listeners with depressive tendencies from their music listening habits.

Over five hundred people’s music listening histories were studied. The researchers discovered that repeatedly listening to sad music was used as an avoidance tool and a coping mechanism. These practices, however, also kept people in depressive moods. Music listeners in the study were also drawn to music subgenres tagged with “sadness” and “tenderness.”

We noted:

“ ‘While it can be cathartic sometimes, repeatedly being in such states may be an indicator of potential underlying mental illness and this is reflected in their choice and usage of music,’ Vinoo Alluri points out. She feels that music listening habits can be changed. But, in order to do that, they need to be identified first by uncovering their listening habits. It is possible to break the pattern of “ruminative and repetitive music usage”, which will lead to a more positive outcome.”

Alluri’s study is an amazing investigation into the power and importance of music. Her research, however, only ratifies what music listeners and teenagers have known for decades.

Whitney Grace, October 21, 2020

Infohazards: Another 2020 Requirement

October 20, 2020

New technologies that become society staples have risks and require policies to rein in potential dangers. Artificial intelligence is a developing technology, and governing policies have yet to catch up with the emerging tool. Experts in computer science, government, and other controlling organizations need to discuss how to control AI, says Vanessa Kosoy in the Less Wrong blog post “Needed: AI Infohazard Policy.”

Kosoy approaches her discussion about the need for a controlling AI information policy with the standard science fiction warning argument: “AI risk is that AI is a danger, and therefore research into AI might be dangerous.” It is good to draw caution from science fiction to prevent real-world disaster. Experts must develop a governing body of AI guidelines to determine what learned information should be shared and how to handle results that are not published.

Individuals and single organizations cannot make these decisions alone, even if they do have their own governing policies. Governing organizations and people must coordinate their knowledge regarding AI and develop consensual policies to control AI information. Kosoy determines that any AI policy should consider the following:

• “Some results might have implications that shorten the AI timelines, but are still good to publish since the distribution of outcomes is improved.

• Usually we shouldn’t even start working on something which is in the should-not-be-published category, but sometimes the implications only become clear later, and sometimes dangerous knowledge might still be net positive as long as it’s contained.

• In the midgame, it is unlikely for any given group to make it all the way to safe AGI by itself. Therefore, safe AGI is a broad collective effort and we should expect most results to be published. In the endgame, it might become likely for a given group to make it all the way to safe AGI. In this case, incentives for secrecy become stronger.

• The policy should not fail to address extreme situations that we only expect to arise rarely, because those situations might have especially major consequences.”

She continues that any AI information policy should determine the criteria for what information is published, what channels should be consulted to determine publication, and how to handle potentially dangerous information.

These questions are universal for any type of technology and information that has potential hazards. However, specificity of technological policies weeds out any pedantic bickering and sets standards for everyone, individuals and organizations. The problem is getting everyone to agree on the policies.

Whitney Grace, October 20, 2020

Covid Trackers Are Wheezing in Europe

October 19, 2020

COVID-19 continues to roar across the world. Health professionals and technologists have combined their intellects attempting to provide tools to the public. The Star Tribune explains how Europe wanted to use apps to track the virus: “As Europe Faces 2nd Wave Of Virus, Tracing Apps Lack Impact.”

Europe expected mobile apps tracking the locations of infected COVID-19 individuals to be integral to battling the virus. As 2020 nears its end, the apps have failed because of privacy concerns, lack of public interest, and technical problems. The latter is not a surprise given the demand for a rush job. The apps were supposed to notify people when they were near infected people.

Health professionals predicted that 60% of European country populations would download and use the apps, but adoption rates are low. The Finnish, however, reacted positively: one-third of the country downloaded its COVID-19 tracking app. Ironically, Finland’s population resists wearing masks in public.

The apps keep infected people’s identities secret. Their data remains anonymous, and the apps only alert others if they come in contact with a virus carrier. Whether the information provides any help to medical professionals remains to be seen:

“We might never know for sure, said Stephen Farrell, a computer scientist at Trinity College Dublin who has studied tracing apps. That’s because most apps don’t require contact information from users, without which health authorities can’t follow up. That means it’s hard to assess how many contacts are being picked up only through apps, how their positive test rates compare with the average, and how many people who are being identified anyway are getting tested sooner and how quickly. ‘I’m not aware of any health authority measuring and publishing information about those things, and indeed they are likely hard to measure,’ Farrell said.”

Are these apps actually helpful? Maybe. But they require maintenance and constant updating. They could prevent some spread of the virus, but sticking to the tried and true methods of social distancing, wearing masks, and washing hands works better.

Whitney Grace, October 19, 2020

Apple and AWS: Security?

October 13, 2020

DarkCyber noted an essay-style report called “We Hacked Apple for 3 Months: Here’s What We Found.” The write up contains some interesting information. One particular item caught our attention:

AWS Secret Keys via PhantomJS iTune Banners and Book Title XSS

The data explorers located potential vulnerabilities that would allow such alleged actions as:

  • Obtain what are essentially keys to various internal and external employee applications
  • Disclose various secrets (database credentials, OAuth secrets, private keys) from the various design.apple.com applications
  • Likely compromise the various internal applications via the publicly exposed GSF portal
  • Execute arbitrary Vertica SQL queries and extract database information
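The “secrets” item above points at a familiar bug-hunting pattern: credentials leaking into page output. As a generic illustration (our own sketch, not the researchers’ actual tooling), a first pass is often a regex scan for the well-known AWS access-key-ID format, shown here against Amazon’s documented example key:

```python
import re

# AWS access key IDs follow a well-known format: "AKIA" plus 16
# uppercase letters or digits. Scanning response text for that
# pattern is a common first pass when hunting disclosed credentials.
AWS_KEY_ID = re.compile(r"\b(AKIA[0-9A-Z]{16})\b")

def find_key_ids(text: str):
    """Return all AWS-style access key IDs found in the text."""
    return AWS_KEY_ID.findall(text)

# Hypothetical page fragment using Amazon's documented example key.
page = "banner.js?key=AKIAIOSFODNN7EXAMPLE&size=300x250"
print(find_key_ids(page))  # ['AKIAIOSFODNN7EXAMPLE']
```

A real hunt would also look for OAuth secrets and private key headers, but the idea is the same: secrets have recognizable shapes.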

Other issues are touched upon in the write up.

Net net: The emperor has some clothes; they are just filled with holes and poorly done stitching if the write up is correct.

Stephen E Arnold, October 13, 2020

Amazon: The Bulldozer Grinds Forward

October 7, 2020

It is hard to tell whether the company is shameless or clueless. Either way, SlashGear observes, “Amazon Has A Creepiness Problem.” The increasingly ubiquitous tech giant recently unveiled two products that will make privacy enthusiasts shiver. Writer Chris Davies reports:

“The Echo Show 10, for example, brings movement to Amazon’s smart displays, with a rotating base that promises to track you as you wander around the room. The result? A perfectly-centered video call, or a more attentive Alexa, whether you’re stood at the sink or raiding the refrigerator. Echo Show 10 seems positively pedestrian, though, in comparison to the Ring Always Home Cam. Part drone, part security camera, it launches out of a base station that resembles a fancy fragrance diffuser and then buzzes around your home to spot intruders or misbehaving pets. Never mind wondering whether the microphone on your Echo is disabled: now, the cameras themselves will be airborne.”

Naturally, Amazon offers reassurances that users are in complete control of what the devices observe and transmit. The Ring drone maintains a certain hum so one can hear it coming, and users can limit its flight area. Also, when it is docked, the camera is physically blocked. The Echo Show 10 relies on visual and audio cues to keep the user center stage, but we’re assured that data is processed locally and immediately deleted. But there is no easy way to verify the devices respect these restrictions. Users will just have to take Amazon’s word. Davies considers:

“Rationally, nothing Amazon has announced today is any more intrusive or dangerous to privacy than, well, any other smart speaker or connected camera the company has offered before. All the same, there’s a gulf between perception and reality. I could understand you being skeptical about Amazon’s intentions – and its technology – simply because it’s, well, Amazon. The company that knows so much about your shopping habits it can make pitch-perfect recommendations; the company that wants to put microphones and cameras all over your house, in your car, and in your hotel room. […]”

The man has a point. Apparently, many consumers do trust Amazon enough to place these potential spies in their homes and offices. Others, though, do not. Will a day come when it will be difficult to function in society without them? We think Amazon hopes so.

Cynthia Murrell, October 7, 2020

TikTok: Maybe Some Useful Information?

September 19, 2020

US President Donald Trump banned Americans from using TikTok because of potential information leaks to China. In an ironic twist, The Intercept explains “Leaked Documents Reveal What TikTok Shares With Authorities—In The U.S.” It is not a secret in the United States that social media platforms from TikTok to Facebook collect user data as ways to spy and sell products.

While the US monitors its citizens, it does not take the same censorship measures China does with its people. The amount of data TikTok gathers for the Chinese is alarming, but leaked documents show that the US also accesses that data. Data privacy has been a controversial topic for years within the United States, and experts argue that TikTok collects the same type of information as Google, Amazon, and Facebook. The documents reveal that ByteDance, TikTok’s parent company, as well as the FBI and the Department of Homeland Security, monitored the platform.

Law enforcement officials used TikTok to monitor social unrest related to the death of George Floyd. Floyd suffocated when a police officer cut off his oxygen while attempting to restrain him during arrest. TikTok users post videos about Black Lives Matter, police protests, tips for disarming law enforcement, and even jokes about the US’s current upheaval. TikTok’s user agreement says it collects information and will share it with third parties. The third parties include law enforcement if TikTok feels there is an imminent danger.

TikTok, however, also censors videos, particularly those the Chinese government dislikes. These videos include political views, the Hong Kong protests, Uyghur internment camps, and people considered poor, disabled, or ugly.

Trump might try to make the US appear as the better country, but:

“‘The common concern, whether we’re talking about TikTok or Huawei, isn’t the intentions of that company necessarily but the framework within which it operates,’ said Elsa Kania, an expert on Chinese technology at the Center for a New American Security. ‘You could criticize American companies for having an opaque relationship to the U.S. government, but there definitely is a different character to the ecosystem.’ At the same time, she added, the Trump administration’s actions, including a handling of Portland protests that brought to mind the police crackdown in Hong Kong, have undercut official critiques of Chinese practices: ‘At a moment when we’re seeing attempts by the administration to draw a contrast in terms of values and ideology with China, these eerie parallels that keep recurring do really undermine that.’”

The issue is contentious. Information does not have to be used at the time of collection. The actions of youth can be used to exert pressure at a future time. That may be the larger risk.

Whitney Grace, September 19, 2020

Apple, Google Make it Easier for States to Adopt Virus Tracing App

September 12, 2020

Google and Apple created an app that would, with the cooperation of state governments, aid in tracing the spread of the coronavirus and notify citizens if they spent time around someone known to have tested positive. It is nice to see these rivals working together for the common good. So far, though, only a few states have adopted the technology. In order to encourage more states to join in, AP News reveals, “Apple, Google Build Virus-Tracing Tech Directly into Phones.” Reporter Matt O’Brien writes:

“Apple and Google are trying to get more U.S. states to adopt their phone-based approach for tracing and curbing the spread of the coronavirus by building more of the necessary technology directly into phone software. That could make it much easier for people to get the tool on their phone even if their local public health agency hasn’t built its own compatible app. The tech giants on Tuesday launched the second phase of their ‘exposure notification’ system, designed to automatically alert people if they might have been exposed to the coronavirus. Until now, only a handful of U.S. states have built pandemic apps using the tech companies’ framework, which has seen somewhat wider adoption in Europe and other parts of the world.”

In states that do adopt the system, iPhone users will be prompted for consent to run it on their phones. Android users will have to download the app, which Google will auto-generate for each public health agency that participates. Early adopters are expected to be Maryland, Nevada, Virginia, and Washington D.C. Virginia was the first to use the framework to launch a customized app in early August, followed by North Dakota, Wyoming, Alabama, and Nevada. O’Brien describes how it works:

“The technology relies on Bluetooth wireless signals to determine whether an individual has spent time near anyone else who has tested positive for the virus. Both people in this scenario must have signed up to use the Google-Apple technology. Instead of geographic location, the app relies on proximity. The companies say the app won’t reveal personal information either to them or public health officials.”
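The token exchange described above can be sketched in a toy simulation (a hypothetical derivation for illustration only, not the actual Google-Apple cryptography): each phone derives short-lived tokens from a secret daily key and logs tokens it hears nearby; after a positive test, only the daily keys are published, so other phones can check for matches locally without revealing who or where.

```python
import hashlib
import os

def rolling_ids(daily_key: bytes, intervals: int = 144):
    """Derive short-lived broadcast tokens from a daily key
    (an illustrative derivation, not the real protocol's crypto)."""
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Two phones, each holding its own secret daily key.
alice_key = os.urandom(16)
bob_key = os.urandom(16)

# Bob's phone logs tokens it "hears" over Bluetooth; no names, no GPS.
bob_heard = set(rolling_ids(alice_key)[10:20])  # ten chance encounters

# Alice tests positive; only her daily key is published.
published_keys = [alice_key]

# Bob's phone re-derives tokens from published keys and matches locally.
exposed = any(t in bob_heard
              for key in published_keys
              for t in rolling_ids(key))
print(exposed)  # True
```

The design choice the companies emphasize follows from this shape: the server only ever sees random-looking daily keys from positive cases, never an encounter log or a location.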

This all sounds helpful. However, the world being what it is today, we must ask: does this have surveillance applications? Perhaps. Note we’re promised the app won’t “reveal” personal data, but will it retain it? If it does, will agencies be able to resist this big, juicy pile of data? Promises about surveillance have a way of being broken, after all.

Cynthia Murrell, September 12, 2020

Surveillance Footage Has Value

September 10, 2020

It is not a secret that Google, Facebook, Apple, Instagram, and other large technology companies gather user data and sell it to the highest bidder. It is an easy way to pad their bottom lines, especially when users freely give away this information. The Russian city of Moscow wants to add more revenue to the city’s coffers, so it came up with an ingenious way to get more cash, says Yahoo Finance: “Moscow May Sell Footage From Public Secret Camera: Report.”

According to the report, Moscow’s tech branch plans to broadcast videos captured on cameras in public areas. Technically, at least within the United States, if you are in a public place you are free to be filmed and whoever does the filming can do whatever they want with the footage. Russia must be acting on the same principle, so Moscow’s Department of Information Technologies purchased cameras to install outside of 539 hospitals. It might also be a way to increase security.

All of the footage will be stored on a central database and people will be able to purchase footage. The footage will also be shown on the Internet.

What is alarming is that MBK Media wrote in December 2019 that footage from Moscow’s street cameras was available for purchase on black markets with options to access individual or an entire system of cameras. This fact is scarier, however:

“The same department organized the blockchain-based electronic voting in Moscow and one more Russian region this summer when Russians voted to amend the country’s constitution. The voting process was criticized for the weak data protection.”

Moscow wants more ways to keep track of citizens in public areas, and it wants to make some quick rubles off the process. Companies in the US do the same thing, and so does the government.

Whitney Grace, September 10, 2020


Oh, Oh, Millennials Want Their Words and Services Enhanced. Okay, Done!

September 9, 2020

A couple of amusing items caught my attention this morning. The first is Amazon’s alleged demand that a Silicon Valley real news outlet modify its word choice.


The Bezos bulldozer affects the social environment. The trillion horsepower Prime machine wants to make sure that its low cost gizmos are not identified with surveillance. Why is that? Perhaps because their inclusion of microphones, arrays, and assorted software designed to deal with voices in far corners performs surveillance? DarkCyber does not know. The solution? Amazon = surveillance. Now any word will do, right?

The second item is mentioned in “Microsoft Confirms Why Windows Defender Can’t Be Disabled via Registry.” The idea is that Microsoft’s system is now becoming Bob’s mom. You remember Bob, don’t you? User controls? Ho ho ho.

The third item is a rib tickler. You worry about censorship for text and videos, don’t you? Now you can worry about Google’s new user-centric ability to filter your phone calls. That’s a howler. What if the call is from a person taking Google to court? Filtered. This benefits everyone. You can get the allegedly full story in “Google’s New Verified Calls Feature Will Tell You Why a Business Is Calling You.” Helpful.

Each of these examples amuses me. Shall we complain about Chinese surveillance apps?

These outfits are extending their perimeters as far as possible before the ever-vigilant, lobbyist-influenced political animals begin the great monopoly game.

Stephen E Arnold, September 9, 2020
