Information Manipulation: A Rich Tradition

September 21, 2020

“Scientists Use Big Data to Sway Elections and Predict Riots — Welcome to the 1960s” is an interesting write up. The essay begins with a quote from a high profile Xoogler, Anthony Levandowski. He’s the engineer who allegedly found information in his possession which was not supposed to be in his possession. Things just happen, of course. The quote in the write up reminded me that Sillycon Valley is an interesting place.

The point of the write up is to romp through information manipulations related to elections in the US. One company — Simulmatics — applied systems and methods refined by other experts. I am not comfortable naming these people because it is 2020. Proper nouns can be tricky business.

The write up asserts:

The press called Simulmatics scientists the “What-If Men”, because their work — programming an IBM 704 — was based on endless what-if simulations. The IBM 704 was billed as the first mass-produced computer capable of doing complex mathematics. Today, this kind of work is much vaunted and lavishly funded. The 2018 Encyclopedia of Database Systems describes ‘what-if analysis’ as “a data-intensive simulation”. It refers to it as “a relatively recent discipline”. Not so.

The “not so” nails down the obvious. Information manipulation has been around for more years than Silicon Valley’s luminaries have been reshaping the world with digital services.

This quote warranted a check mark:

Although none of the researchers he had met “had malignant political designs on the American public”, Burdick warned, their very lack of interest in contemplating the possible consequences of their work stood as a terrible danger. Indeed, they might “radically reconstruct the American political system, build a new politics, and even modify revered and venerable American institutions — facts of which they are blissfully innocent”.

Yep, Simulmatics. The other thought the write up evoked is, “When and to what does one pay attention?” Thumbtypers, what do you think?

Stephen E Arnold, September 21, 2020

Social Media: Inherently Corrosive?

September 16, 2020

DarkCyber noted “The Inevitable Corruption of Social Systems on the Web.” [You may be asked to pay to view this write up. Sigh.] The article invests some effort into explaining a Captain Obvious point: Amazon reviews cannot be trusted. Okay. Insight.

The essay makes one interesting point, which DarkCyber dubs SMIC’s Law. The Law of SMIC (Social Media Inherently Corrosive) is, according to the write up:

Any system with a social base will experience, from a certain level of popularity, a corruption of its operations that will tend to destroy the value of the metrics used in it.

DarkCyber agrees. Too bad users and regulators choose to ignore it.

Stephen E Arnold, September 16, 2020

Facebook a PR Firm? What about a Silicon Valley Cash Rich Intelligence Agency?

September 1, 2020

DarkCyber noted “Facebook, The PR Firm.” The main idea seems to be:

I [Can Duruk maybe?] saw on Twitter a leak from Facebook where the comms people were pleading to their coworkers to stop leaking to the press. The comms folks, in a weird form of irony, were so inundated with their moderation work that they had to ask people whose main task is to create more moderation work for the poorly paid and mentally traumatized people to please create less work for them.

That statement combines the best of Escher and Kafka.

The write up notes:

I read Facebook less as a tech company, but instead a communications one. Not a telecom communications, but more like a PR / marketing consultancy. There’s nothing original about Facebook. It’s a company that hires people to build others’ ideas, and, more often than not, it does that better and faster than them too. And when it can’t do that, it just buys them outright. There is a lot of building, but the ideas are outsourced. But what Facebook is really good at is actually doing all this while fighting what seems to be a never-ending, at least since 2016 or so, PR battle while not giving an inch.

Is the entrepreneur Mark Zuckerberg a reincarnation of Ivy Lee, a metempsychosis in the realm of online social media?

And the final line of the write up reminds the reader:

This, after all, is a company that once thought comparing itself to a chair was a good idea.

DarkCyber thinks the write up makes some helpful points. However, several observations emerged from our morning Zoom “meeting” among the team members who had the energy to click a mouse button:

  1. Facebook has internalized the mechanisms used by some intelligence agencies and specialized services firms; for example, the dalliance in and out of court with NSO Group.
  2. Facebook can perform what can be called “beam forming.” The idea is to take digital bits, focus them on a topic or issue, and then aim the beam of content at individuals and groups. The beam works like a wood carver’s oblique knife. The “targets” are shaped as needed.
  3. The company can exert threats in order to apply pressure to entities with a perceived intention of doing Facebook harm; for example, the threats made to Australia if the social media giant has to pay for news.

To sum up, DarkCyber believes that Facebook has more in common with an intelligence operation than a PR firm. I mean public relations. Really? Does Facebook care about relating to the public? Money, clicks, users, tracking, and data for sure. But public relations?

Stephen E Arnold, September 1, 2020

Telegram: Friendly Outfit for Russia, Other Places, Not So Much

August 18, 2020

There’s nothing like encrypted communications for bad actors. Some law enforcement and regulatory professionals are less enthusiastic. Russia has worked a deal with Telegram. Details about what Telegram’s side of the bargain include are sparse. Russia’s side of the deal is equally fuzzy. One might surmise a mechanism for accessing encrypted content. Is this possible? The answer depends on whom one asks.

Telegram, in its quest to remain in business, is, according to “Telegram Launches One-on-One Video Calls with End-to-End Encryption,” dimming the lights for some law enforcement and regulatory units. The write up reports:

The video calls on Telegram support picture-in-picture mode, so that people can check and reply to their messages while talking to a friend. The calls are also protected with end-to-end encryption, with the security confirmed by matching emojis on the screen on either end of the line. Telegram continues to work on more features and improvements for its video call offering, saying that it is working to launch group video calls in the coming months. The upcoming feature will allow the app to jump into the videoconferencing market, which has become more crucial as people stay at home amid the COVID-19 pandemic.

What’s the big deal? Encrypted messaging poses a cost and time hurdle for government authorities. Bad actors find these encrypted services more useful than some other forms of information exchange. For example, the razzle dazzle of the Dark Web (despite its modest size in terms of sites and users) is losing ground to encrypted messaging services. And why not?

From a mobile device, encrypted messaging can replicate many of the more interesting facets of the Dark Web; for example:

  • Encryption and anonymity. Check
  • In app payment. Check
  • Private groups. Check
  • Social media functions. Semi check.

“Going dark” is no longer a bit of in-crowd jargon. It is a reality. And for Russian authorities, maybe not so dark.

Stephen E Arnold, August 18, 2020

Twitter Adulting: Copyright and the President of the United States

July 21, 2020

Imagine. Twitter has procedures which automate a portion of its copyright vigilance. (DarkCyber is not so sure about Twitter’s hiring practices and the internal security of its system, but the copyright function may be working.)

“Twitter Disables Trump Tweet over Copyright Complaint” presents as accurate and “real” news this statement:

Twitter removed the video, which Trump had retweeted from White House social media director Dan Scavino, after it received a Digital Millennium Copyright Act notice from Machine Shop Entertainment, according to a notice posted on the Lumen Database which collects requests for removal of online materials. Machine Shop is a management company owned by the rock band Linkin Park, according to its LinkedIn page.

DarkCyber hopes that Twitter will bring similar diligence to its security, management, and governance. The firm occupies an interesting, if not secure, place in the pantheon of social media luminaries.

As Linkin Park sang:

Go, stop the show
Choppy words…

Indeed, but the DarkCyber team would substitute the word “tweety” for “choppy.” But we are not songwriters or exceptional tweeters.

Stephen E Arnold, July 21, 2020

Orkut Version 6, Maybe 7?

July 15, 2020

Alphabet Inc. had a dismal failure with Google+, but the company is ready to try again with a new social media platform says the Bandwidth Blog in: “With Its Experimental ‘Keen’ Service, Google Hopes To Tackle Pinterest.” Google’s new social media platform is called Keen, and it is impossible not to assume the company is playing me-too with this innovation. Can it even be called innovation?

Pinterest has its devoted users, who post everything from dream wedding albums to their favorite fan art. Google wants Keen to one-up Pinterest, but its description sounds innocuous:

“Launched under the auspices of Google’s in-house incubator, Area 120, Keen is a Pinterest-like social network that is designed to marry Google’s machine learning strengths, user data hoarded, and insights from its existing Google Alerts system to craft together a smorgasbord of content that evolves as its users interests take shape.”

Google has developed powerful AI that collects user data and successfully makes personalized recommendations. Google also sells user data to advertisers, so it is not a stretch for them to use it for another social media project.

Keen is already available on Android. Users’ Keen pinboards feature content based on recent Google searches. The content includes YouTube videos, purchase suggestions, and articles. The biggest thing is that when a user becomes interested in a topic, Keen will recommend more content for deeper dives. Users can pin their items to create their own ‘Keen’ interest, then share them with others.

The experience is similar to Pinterest, except the topics on Keen are generated by Google searches and other activity. It would be easy for Google to add Keen to its other free services, especially by making it an iOS app or a browser extension for Chrome. Unlike with Google+, Alphabet is not concentrating as much on Keen. It is hard to say if Keen will emerge beyond Android, but anything is possible.

Whitney Grace, July 15, 2020

Social Media Helps Trolls Roll

July 9, 2020

Even social-media researcher Jeanna Matthews has to be vigilant to keep from being fooled, we learn from her article in Fast Company, “Bots and Trolls Control a Shocking Amount of Online Conversation.” Armies of hackers maliciously swaying public opinion through social media have only grown larger, and their methods more sophisticated, since they started making news in 2016. These bad actors game the algorithms that decide which posts to circulate heavily, choices based largely on which ones get the most reactions (“likes,” “votes,” sad/laughing/angry faces, etc.). It has been shown, however, that lies spread faster than truths. Any middle-school girl could have told us that. Matthews writes:

“But who is doing this ‘voting’? Often it’s an army of accounts, called bots, that do not correspond to real people. In fact, they’re controlled by hackers, often on the other side of the world. For example, researchers have reported that more than half of the Twitter accounts discussing COVID-19 are bots. As a social media researcher, I’ve seen thousands of accounts with the same profile picture ‘like’ posts in unison. I’ve seen accounts post hundreds of times per day, far more than a human being could. I’ve seen an account claiming to be an ‘All-American patriotic army wife’ from Florida post obsessively about immigrants in English, but whose account history showed it used to post in Ukrainian. Fake accounts like this are called ‘sock puppets’—suggesting a hidden hand speaking through another identity. In many cases, this deception can easily be revealed with a look at the account history. But in some cases, there is a big investment in making sock puppet accounts seem real.”

One example is the much-followed Jenna Abrams Twitter account that turned out to be run by Russian trolls. These imposters have their favorite subjects—Covid-19 and Black Lives Matter are two examples—but their goals go beyond the issues. They practice divide and conquer: sowing mistrust, pitting us against each other, and building a society in which objective truth no longer matters. Social media platforms, which (sadly) profit from the spread of misinformation, have been slow to act against these manipulators. They often brandish the freedom of speech argument to defend their inaction.
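The behavioral signals Matthews describes—implausible posting volume and many accounts sharing one profile picture—lend themselves to simple heuristics. Here is a minimal sketch in Python using entirely hypothetical account data (the handles, rates, and threshold below are illustrative, not from the article):

```python
from collections import Counter

# Hypothetical account records: (handle, posts_per_day, avatar_hash)
accounts = [
    ("patriot_wife_fl", 480, "a1f3"),   # posts far more than a human plausibly could
    ("casual_reader", 6, "9c2e"),
    ("deal_hunter", 520, "a1f3"),       # same avatar as another heavy poster
    ("local_news_fan", 11, "77b0"),
]

HUMAN_POSTING_LIMIT = 150  # rough, assumed ceiling for plausible human activity


def flag_suspicious(accounts):
    """Flag accounts exhibiting two of the signals described above:
    implausible posting rates and profile pictures shared across accounts."""
    avatar_counts = Counter(avatar for _, _, avatar in accounts)
    flagged = []
    for handle, rate, avatar in accounts:
        reasons = []
        if rate > HUMAN_POSTING_LIMIT:
            reasons.append("posting rate")
        if avatar_counts[avatar] > 1:
            reasons.append("shared avatar")
        if reasons:
            flagged.append((handle, reasons))
    return flagged


print(flag_suspicious(accounts))
# → [('patriot_wife_fl', ['posting rate', 'shared avatar']),
#    ('deal_hunter', ['posting rate', 'shared avatar'])]
```

Real detection systems combine many more signals (account age, coordinated timing, language history), but even this crude filter illustrates why, as Matthews notes, a look at account history often gives the game away.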

Matthews suggests some ways to protect ourselves from being swayed by these deceivers. We can use social media sparingly, and when we do visit, be more deliberate—navigate to particular pages instead of just consuming the default feed. We can also pressure platforms to delete accounts with sure signs of automation, provide more controls over what crosses our feed, and provide more transparency about how choices are made and who is placing ads. Some may want to contact legislators to demand regulation. Finally, we must take it all with a grain of salt. We know the trolls are out there, and we know how active they are. Do not fall for their tricks.

Cynthia Murrell, July 9, 2020

Techno-Grousing: A New Analytic Method?

July 3, 2020

Two items snagged my attention as my team and I were finishing the pre-recorded lecture about Amazon policeware for the upcoming National Cyber Crime Conference.

The first is a mostly context free item from a Silicon Valley type “real” news outfit. The article’s title is:

Hany Farid Says a Reckoning Is Coming for Toxic Social Media

The item comes from one of the technology emission centers in the San Francisco / Silicon Valley region: a professor at the University of California, Berkeley.

What’s interesting is that Hany Farid is activating a klaxon that hoots:

In five years, I expect us to have long since reached the boiling point that leads to reining in an almost entirely unregulated technology sector to contend with how technology has been weaponized against individuals, society, and democracy.

Insight? Prediction? Anticipatory avoidance?

After decades of supporting, advocating, and cheerleading technology — now, this moment, is the time to be aware that change is coming. Who is responsible? Candidates include the media, people who disseminate misinformation, and bad actors.

Sounds good. What about educators? Well, not mentioned.

The other item comes from the Jakarta Post. You can find the story at this link. I have learned that mentioning the entity the story discusses results in my blog post being skipped by certain indexing systems. Hey, that’s a surprise, right?

The point of the write up is that a certain social media site is now struggling with increased feistiness among otherwise PR influenced users.

What’s interesting is that suddenly, like the insight du jour from the Berkeley professor, nastiness is determined to be undesirable.

The fix for the social media outfit is simple: Get out of line and you will be blocked from the service. There’s nothing so comforting as hitting the big red cancel button.

Turning battleships quickly can have interesting consequences. The question is, “What if the battleship’s turn has unforeseen consequences?”

Stephen E Arnold, July 3, 2020

TikTok Actually Manages but Who Decides?

June 22, 2020

TikTok’s older sibling TopBuzz has not been nearly as successful as the wildly popular (though problematic) short-video app, but it held its own for a while. The product, launched in 2015, recommended AI-personalized news articles to subscribers around the world. However, reports Reuters, “TikTok Owner ByteDance Shuts Down Overseas News Aggregator TopBuzz.” Writers Yingzhi Yang and Brenda Goh tell us:

“The closure of TopBuzz underlines how ByteDance’s moves into international markets have not been entirely smooth in spite of TikTok’s success. … TopBuzz’s downloads declined to 1.2 million in the first half of 2019 from 7 million in all of 2018 on the App Store and Google Play combined, according to researcher Sensor Tower. TikTok had 345.2 million downloads in the first half of 2019. TopBuzz began shrinking its operations last year, according to two sources familiar with the matter. The app used to have operations in multiple languages including Spanish and Portuguese, one of the sources said. But now its website only shows English and Japanese versions.”

But why the decline? Perhaps this has something to do with it—the write-up continues:

“ByteDance is currently under a U.S national security inquiry into TikTok’s handling of user data, and also facing tightened scrutiny from regulators around the world.”

There is a global news app, however, that is still going strong in China. Last month News Break, founded by former Yahooer Zheng Zhaohui and funded by Chinese investors, outperformed both Twitter and Reddit on Google Play according to SimilarWeb.

ByteDance was founded in 2012 and is based in Beijing, China.

Cynthia Murrell, June 22, 2020

Is It Facebookization or Goodellization? Either Way Zation Is a Thing

June 6, 2020

DarkCyber noted the article “Facebook’s Zuckerberg Vows to Review Content Policies.” Interesting. Mr. Zuckerberg, the supreme and respected Great Leader of Facebook, is backtracking with a red herring. A vow. Wow. Not an actual action but a vow, a promise, an assurance of rethink-ization. The write up reports in “real news” fashion:

Facebook Inc. Chief Executive Officer Mark Zuckerberg said the company will review content policies after employees blasted their leader for his decision to leave up controversial posts … The company will review policies on posts that promote or threaten state use of force or voter suppression techniques, and will also look into options for flagging or labeling posts that are a violation but shouldn’t necessarily be removed entirely, the CEO wrote on Facebook. He also pledged to study Facebook’s review structure “to make sure the right groups and voices are at the table.”

Facebook has been a stellar example of appropriate behavior for years. There have been some slips twixt the cup and the lip. Cambridge Analytica, the role of the firm’s Board of Directors, and testimony before the US Congress. No biggies.

A “zation” for sure. Facebookization appears to mean the act of emitting statements that semi-approach issues of governance and related matters. Change could be afoot. Baloney-ization remains a possibility.

Then there is the non technical Goodellization of mental frameworks. Walt Disney’s “real news” company published “NFL Players Spoke, and Roger Goodell Responded. Now What? Here’s What We Know.” No mouse ears were included as illustrative touch points.

In a video message released Friday night (June 5, 2020), NFL commissioner Roger Goodell responded to a video released Thursday night (June 4, 2020) by a collection of NFL stars, including Michael Thomas, Patrick Mahomes and Deshaun Watson. Goodell’s video included three specific statements the players in Thursday’s video asked the NFL to make about racism, social injustice and peaceful protests. “We, the National Football League, condemn racism and the systematic oppression of black people,” Goodell said. “We, the National Football League, admit we were wrong for not listening to NFL players earlier and encourage all players to speak out and peacefully protest. We, the National Football League, believe that black lives matter.”

The Goodellization of a contentious issue arrives with the timeliness and possibly the sincerity of the Facebookization event.

Several observations:

  1. Employee push back now is more effective than an internal ethical compass for guiding a corporate construct. DarkCyber thought that fuzzy stuff like subjective data was irrelevant in today’s go go business world.
  2. No actual change has taken place in the isolated, self congratulating worlds of Facebook social media or the voracious maw of video.
  3. A threat to money and power is more effective than employees posting grumpies on an email system or fencing with attorney Mark Geragos, handwaving, and emulating Roman nobles.

Facebookization and Goodellization. New words. Maybe new behaviors in the online and video constructs?

We’ll see because this social media and TV sports watching produces money. A threat to the cash flow puts the cards on the table, the fingers on the buttons, and the thought processes of Big Wheels in a different gear. Money-ization?

Stephen E Arnold, June 6, 2020
