Social Media Helps Trolls Roll

July 9, 2020

Even social-media researcher Jeanna Matthews has to be vigilant to keep from being fooled, we learn from her article in Fast Company, “Bots and Trolls Control a Shocking Amount of Online Conversation.” Armies of hackers maliciously swaying public opinion through social media have only grown larger, and their methods more sophisticated, since they started making news in 2016. These bad actors game the algorithms that decide which posts to circulate heavily, choices based largely on which ones get the most reactions (“likes,” “votes,” sad/laughing/angry faces, etc.). It has been shown, however, that lies spread faster than truths. Any middle-school girl could have told us that. Matthews writes:

“But who is doing this ‘voting’? Often it’s an army of accounts, called bots, that do not correspond to real people. In fact, they’re controlled by hackers, often on the other side of the world. For example, researchers have reported that more than half of the Twitter accounts discussing COVID-19 are bots. As a social media researcher, I’ve seen thousands of accounts with the same profile picture ‘like’ posts in unison. I’ve seen accounts post hundreds of times per day, far more than a human being could. I’ve seen an account claiming to be an ‘All-American patriotic army wife’ from Florida post obsessively about immigrants in English, but whose account history showed it used to post in Ukrainian. Fake accounts like this are called ‘sock puppets’—suggesting a hidden hand speaking through another identity. In many cases, this deception can easily be revealed with a look at the account history. But in some cases, there is a big investment in making sock puppet accounts seem real.”

One example is the much-followed Jenna Abrams Twitter account that turned out to be run by Russian trolls. These imposters have their favorite subjects—Covid-19 and Black Lives Matter are two examples—but their goals go beyond the issues. They practice divide and conquer: sowing mistrust, pitting us against each other, and building a society in which objective truth no longer matters. Social media platforms, which (sadly) profit from the spread of misinformation, have been slow to act against these manipulators. They often brandish the freedom of speech argument to defend their inaction.

Matthews suggests some ways to protect ourselves from being swayed by these deceivers. We can use social media sparingly, and when we do visit, be more deliberate—navigate to particular pages instead of just consuming the default feed. We can also pressure platforms to delete accounts with sure signs of automation, provide more controls over what crosses our feed, and provide more transparency about how choices are made and who is placing ads. Some may want to contact legislators to demand regulation. Finally, we must take it all with a grain of salt. We know the trolls are out there, and we know how active they are. Do not fall for their tricks.
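As a rough illustration of what “sure signs of automation” could mean in practice, here is a minimal, hypothetical sketch in Python built around the signals Matthews describes: posting volume far beyond human capacity, one profile image shared across many accounts, and an account history that abruptly changes language. The field names and thresholds are invented for illustration; real platforms rely on far richer signals.

    # Hypothetical bot-likeness heuristics based on the signals described above.
    # Account fields and thresholds are invented for illustration only.
    from collections import Counter

    def looks_automated(account, profile_image_counts, max_daily_posts=150):
        """Return a list of red flags for a single account."""
        flags = []
        if account["posts_per_day"] > max_daily_posts:
            flags.append("posting volume beyond plausible human activity")
        if profile_image_counts[account["profile_image_hash"]] > 50:
            flags.append("profile image shared by many other accounts")
        if len(set(account.get("language_history", []))) > 1:
            flags.append("post language changes across the account history")
        return flags

    accounts = [
        {"posts_per_day": 400, "profile_image_hash": "abc123", "language_history": ["uk", "en"]},
        {"posts_per_day": 6, "profile_image_hash": "zzz999", "language_history": ["en"]},
    ]
    image_counts = Counter({"abc123": 3200, "zzz999": 1})
    for acct in accounts:
        print(looks_automated(acct, image_counts))  # first prints three flags, second prints []

None of these checks is conclusive on its own; Matthews’ point is that the combination, visible in the account history, is what gives sock puppets away.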

Cynthia Murrell, July 9, 2020

Techno-Grousing: A New Analytic Method?

July 3, 2020

Two items snagged my attention as my team and I were finishing the pre-recorded lecture about Amazon policeware for the upcoming National Cyber Crime Conference.

The first is a mostly context-free item from a Silicon Valley type “real” news outfit. The article’s title is:

Hany Farid Says a Reckoning Is Coming for Toxic Social Media

The item comes from one of the technology emission centers in the San Francisco / Silicon Valley region: A professor at the University of California, Berkeley.

What’s interesting is that Hany Farid is activating a klaxon that hoots:

In five years, I expect us to have long since reached the boiling point that leads to reining in an almost entirely unregulated technology sector to contend with how technology has been weaponized against individuals, society, and democracy.

Insight? Prediction? Anticipatory avoidance?

After decades of supporting, advocating, and cheerleading technology, now, this moment, is the time to be aware that change is coming. Who is responsible? The media is a candidate, along with people who disseminate misinformation and bad actors.

Sounds good. What about educators? Well, not mentioned.

The other item comes from the Jakarta Post. You can find the story at this link. I have learned that mentioning the entity the story discusses results in my blog post being skipped by certain indexing systems. Hey, that’s a surprise, right?

The point of the write up is that a certain social media site is now struggling with increased feistiness among otherwise PR-influenced users.

What’s interesting is that suddenly, like the insight du jour from the Berkeley professor, nastiness is determined to be undesirable.

The fix for the social media outfit is simple: Get out of line and you will be blocked from the service. There’s nothing so comforting as hitting the big red cancel button.

Turning a battleship quickly can have interesting consequences. The question is, “What if the turn produces consequences no one foresaw?”

Stephen E Arnold, July 3, 2020

TikTok Actually Manages but Who Decides?

June 22, 2020

TikTok’s older sibling TopBuzz has not been nearly as successful as the wildly popular (though problematic) short-video app, but it held its own for a while. The product, launched in 2015, recommended AI-personalized news articles to subscribers around the world. However, reports Reuters, “TikTok Owner ByteDance Shuts Down Overseas News Aggregator TopBuzz.” Writers Yingzhi Yang and Brenda Goh tell us:

“The closure of TopBuzz underlines how ByteDance’s moves into international markets have not been entirely smooth in spite of TikTok’s success. … TopBuzz’s downloads declined to 1.2 million in the first half of 2019 from 7 million in all of 2018 on the App Store and Google Play combined, according to researcher Sensor Tower. TikTok had 345.2 million downloads in the first half of 2019. TopBuzz began shrinking its operations last year, according to two sources familiar with the matter. The app used to have operations in multiple languages including Spanish and Portuguese, one of the sources said. But now its website only shows English and Japanese versions.”

But why the decline? Perhaps this has something to do with it—the write-up continues:

“ByteDance is currently under a U.S national security inquiry into TikTok’s handling of user data, and also facing tightened scrutiny from regulators around the world.”

There is another news app with Chinese backing, however, that is still going strong. Last month News Break, founded by former Yahooer Zheng Zhaohui and funded by Chinese investors, outperformed both Twitter and Reddit on Google Play, according to SimilarWeb.

ByteDance was founded in 2012 and is based in Beijing, China.

Cynthia Murrell, June 22, 2020

Is It Facebookization or Goodellization? Either Way Zation Is a Thing

June 6, 2020

DarkCyber noted the article “Facebook’s Zuckerberg Vows to Review Content Policies.” Interesting. Mr. Zuckerberg, the supreme and respected Great Leader of Facebook, is backtracking with a red herring. A vow. Wow. Not an actual action, but a vow, a promise, an assurance of rethink-ization. The write up reports in “real news” fashion:

Facebook Inc. Chief Executive Officer Mark Zuckerberg said the company will review content policies after employees blasted their leader for his decision to leave up controversial posts … The company will review policies on posts that promote or threaten state use of force or voter suppression techniques, and will also look into options for flagging or labeling posts that are a violation but shouldn’t necessarily be removed entirely, the CEO wrote on Facebook. He also pledged to study Facebook’s review structure “to make sure the right groups and voices are at the table.”

Facebook has been a stellar example of appropriate behavior for years. There have been some slips twixt the cup and the lip. Cambridge Analytica, the role of the firm’s Board of Directors, and testimony before the US Congress. No biggies.

A “zation” for sure. Facebookization appears to mean the act of emitting statements that semi-approach issues of governance and related matters. Change could be afoot. Baloney-ization remains a possibility.

Then there is the non-technical Goodellization of mental frameworks. Walt Disney’s “real news” company published “NFL Players Spoke, and Roger Goodell Responded. Now What? Here’s What We Know.” No mouse ears were included as illustrative touch points.

In a video message released Friday night (June 5, 2020), NFL commissioner Roger Goodell responded to a video released Thursday night (June 4, 2020) by a collection of NFL stars, including Michael Thomas, Patrick Mahomes and Deshaun Watson. Goodell’s video included three specific statements the players in Thursday’s video asked the NFL to make about racism, social injustice and peaceful protests. “We, the National Football League, condemn racism and the systematic oppression of black people,” Goodell said. “We, the National Football League, admit we were wrong for not listening to NFL players earlier and encourage all players to speak out and peacefully protest. We, the National Football League, believe that black lives matter.”

The Goodellization of a contentious issue arrives with the timeliness and possibly the sincerity of the Facebookization event.

Several observations:

  1. Employee pushback now is more effective than an internal ethical compass for guiding a corporate construct. DarkCyber thought that fuzzy stuff like subjective data was irrelevant in today’s go-go business world.
  2. No actual change has taken place in the isolated, self congratulating worlds of Facebook social media or the voracious maw of video.
  3. A threat to money and power is more effective than employees posting grumpies on an email system or fencing with attorney Mark Geragos, handwaving, and emulating Roman nobles.

Facebookization and Goodellization. New words. Maybe new behaviors in the online and video constructs?

We’ll see, because social media and TV sports watching produce money. A threat to the cash flow puts the cards on the table, the fingers on the buttons, and the thought processes of Big Wheels in a different gear. Money-ization?

Stephen E Arnold, June 6, 2020

Jack Benny Tropes Return: Tweets Are Making the Oooooold New Again

June 5, 2020

The Jack Benny Radio Show. A character tagged Frank Nelson, who says, “Yeeeeeesssss.” Funny, yep. When? A half century ago. So what?

Even with the breadth of emojis, GIFs, videos, and other accoutrements, it is hard to express emotional intent through text. Wired investigated how and why emotions are expressed in the article, “Whoooaaa Duuuuude: Why We Stretch Words In Tweets And Texts.”

Researchers at the University of Vermont studied tweets to learn why elongated words are used so much on the social media platform. They discovered that stretching a word is a linguistic device conveying a varied emotional range, from excitement to sarcasm. Exclamation points are the old dead-tree way to express anything from excitement to fear, but apparently they are old fashioned, and it shows restraint not to use one. People turn to stretched words to add more meaning to their tweets.

The researchers examined 10% of tweets sent between 2008 and 2016 for elongated words. Their research yielded interesting patterns, but the most obvious is how complex human emotion is for AI:

“Because stretched words can be embedded with so much extra meaning beyond the words themselves, understanding them is critical for artificial intelligences that analyze text, like chatbots. At the moment, a stretched word may be so perplexing for an AI that the program just skips over it entirely. We don’t want to have to bold or italicize words to emphasize them for the chatbot to parse—and even then, such formatting can’t replicate the range of emotions that stretched words convey.”

Studies like this help AI and machine learning understand the subtle nuances involved in human language. It will be decades before machines are entirely capable of understanding human language patterns, but the more data they have, the closer they come.
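To make the chatbot problem concrete, here is a minimal sketch (not the Vermont team’s method) of how a text pipeline might handle stretched words: collapse runs of three or more repeated characters into one and keep the run length as a feature, so a model gets both a dictionary-friendly token and a signal for the extra emphasis. The function name and threshold are assumptions for illustration.

    import re

    # Collapse character runs of three or more into a single character and
    # report the longest run, e.g. "duuuuude" -> ("dude", 5). The collapsed
    # token is easier to look up; the run length preserves the emphasis cue.
    STRETCH = re.compile(r"(.)\1{2,}")

    def normalize_stretch(token):
        longest_run = max((len(m.group(0)) for m in STRETCH.finditer(token)), default=0)
        return STRETCH.sub(r"\1", token), longest_run

    print(normalize_stretch("whoooaaa"))  # ('whoa', 3)
    print(normalize_stretch("duuuuude"))  # ('dude', 5)

Even a toy normalizer like this shows the tradeoff the article describes: collapsing the stretch makes the word legible to the machine, but the feeling is lost unless the stretch itself is kept as a separate signal.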

Oh, Rochester, yessssss bossssss.

Whitney Grace, June 6, 2020

Grammar? You Must Be Joking!

June 5, 2020

Perhaps the set of rules many of us worked so hard to master has become but a quaint convention. Write to Edit discusses the question, “Does Grammar Even Matter Anymore?” Writing practices are changing so fast, it is a natural question to ponder. However, states writer Amelia Zimmerman, that very question misses the point. It is the old prescriptivism vs. descriptivism issue—is grammar a set of fixed rules to be adhered to or an evolving account of how language is used? Zimmerman writes:

“Neither side is entirely wrong. Although correct grammar is important for clarity and often determines your reputation on the page, language is an evolving thing, not a static rulebook. Things people said in Shakespeare’s day would hardly be said now; even the spelling and meaning of words changes over time (literally doesn’t mean literally anymore). Now, the internet, text messages and emojis are changing the English language faster than ever. But this divide focuses on the small-picture topic of grammar without addressing the big-picture idea which is meaning. Grammar is a tool that, when used correctly, creates clarity and delivers meaning. But that’s all it is — a tool. Whether grammar matters is the wrong question. The right question is whether meaning matters — whether clarity matters — and that answer will never change.”

Of course, the answer there is yes; clarity is the cardinal quality of any good editor. The article goes on to examine what grammar rules really are (most are more like guidelines, really) and when one might choose to break them. Sometimes breaking a convention makes the meaning clearer, other times doing so makes a sentence more appealing, persuasive, or succinct. Zimmerman concludes:

“Most grammar guidelines have been constructed and are adhered to in such a way that they do help transmit your meaning clearly. … But sometimes adhering too strictly to old notions of grammar can get in the way of comprehension, make your writing too long-winded or ridiculous, or restrict creative expression and poetic effect. That’s when a mix of common sense and your own gut should prevail.”

This descriptivist heartily concurs. Remember: “the number” takes a singular verb, “a number” takes a plural verb, and “none” is singular, so “none is” is the agreeing form. Bummer.

Cynthia Murrell, June 6, 2020

GeoSpark Analytics: Real Time Analytics

April 6, 2020

In late 2017, OGSystems chopped out some of the firm’s analytics capabilities. The new company was Geospark Analytics. The service enabled customers like the US Department of Defense and FEMA to obtain information about important new events. “Events” is jargon for an alert plus data about something important.

“FEMA Contractor Tracing Coronavirus Deaths Uses Web Scraping, Social Media Monitoring” explains one use of the system. The write up says:

Geospark Analytics combines machine learning and big data to analyze events in real-time and warn of potential disruptions to the businesses of high-dollar private and public clientele…

Like Bluedot in Canada, Geospark was one of the monitoring companies analyzing open source and some specialized data to find interesting events. The write up continues:

Geospark Analytics’ product, called Hyperion, the namesake of the Titan son of Uranus (meaning, “watcher from above”), fingered Wuhan as a “hotspot,” in the company’s parlance, within hours after news of the virus first broke. “Hotspots tracks normal patterns of activity across the globe and provides a visual cue to flag disruptive events that could impact your employees, operations, and investments and result in billions of dollars in economic losses,” the company’s website says.

Engadget points out that there are a couple of companies with the name “Geospark.” DarkCyber finds this interesting. This statement provides more color about the Geospark approach:

Geospark Analytics claims to have processed “6.8 million” sources of information; everything from tweets to economic reports. “We geo-position it, we use natural language processing, and we have deep learning models that categorize the data into event and health models,” Goolgasian [Geospark’s CEO] said. It’s through these many millions of data points that the company creates what it calls a “baseline level of activity” for specific regions, such as Wuhan. A spike of activity around any number of security-, military-, or health-related topics and the system flags it as a potential disruption.
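The “baseline level of activity” plus spike idea in that passage can be sketched in a few lines. This is an illustrative toy, not Geospark’s Hyperion pipeline: keep a trailing window of daily event counts for a region and flag any day that sits several standard deviations above the recent baseline. The threshold and sample counts are assumptions.

    from statistics import mean, stdev

    # Toy "baseline plus spike" detector: flag today's event count for a region
    # if it exceeds the rolling baseline by z_threshold standard deviations.
    # The threshold and the sample counts below are invented for illustration.
    def is_hotspot(daily_counts, today_count, z_threshold=3.0):
        baseline = mean(daily_counts)
        spread = stdev(daily_counts) or 1.0  # guard against a perfectly flat history
        return (today_count - baseline) / spread >= z_threshold

    history = [12, 9, 14, 11, 10, 13, 12]  # trailing week of health-related event counts
    print(is_hotspot(history, 55))  # True: a spike well above the baseline
    print(is_hotspot(history, 15))  # False: within normal variation

A production system would presumably weight sources, deduplicate, and model seasonality, which is where the millions of sources and the deep learning models mentioned in the quote would come in.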

How does Geospark avoid the social media noise, bias, and disinformation that finds its way into open source content? The article states:

“We rely more on traditional data sources and we don’t do anything that isn’t publicly available,” Goolgasian said, echoing a common refrain among data firms that fuel surveillance products by mining the internet itself.

Providing specialized services to government agencies is not much of a surprise in DarkCyber’s opinion. Financial firms can also be avid consumers of real-time data. The idea is to get the jump on the competition, which probably has its own sources of digital insights.

Other observations:

  • The apparent “surprise” threading through the Engadget article is a bit off-putting. DarkCyber is aware of a number of social media and specialized content monitoring services. In fact, there is a surplus of these operations, and not all will survive in the present business climate.
  • Detecting and alerting are helpful but the messengers failed to achieve impact. How does DarkCyber know? Well, there is the lockdown.
  • Publicizing what companies like Geospark and others do to generate income can have interesting consequences.

Net net: Some types of specialized services are difficult to explain in a way that reduces blowback. Some of that blowback can have a significant impact on social media analytics companies. The Geofeedia case is a reminder. I know. I know. “What’s a Geofeedia?” some may ask.

Good question and DarkCyber thinks few know the answer. Plucking insights from information many people believe to be privileged can be fraught with business shock waves.

Stephen E Arnold, April 6, 2020

Cambridge Analytica Alum: Social Media Is Like Bad, You Know

April 4, 2020

A voice of (in)experience describes how tech companies can be dangerous when left unchecked. Channel News Asia reports, “Tech Must Be Regulated Like Tobacco, says Cambridge Analytica Whistleblower.” Christopher Wylie is the data scientist who exposed Cambridge Analytica’s use of Facebook data to manipulate the 2016 presidential election, among others. He declares society has yet to learn the lesson of that scandal. Yes, Facebook was fined a substantial sum, but it and other tech giants continue to operate with little to no oversight. The article states:

“Wylie details in his book how personality profiles mined from Facebook were weaponised to ‘radicalise’ individuals through psychographic profiling and targeting techniques. So great is their potential power over society and people’s lives that tech professionals need to be subject to the same codes of ethics as doctors and lawyers, he told AFP as his book was published in France. ‘Profiling work that we were doing to look at who was most vulnerable to being radicalised … was used to identify people in the US who were susceptible to radicalisation so that they could be encouraged and catalysed on that path,’ he said. ‘You are being intentionally monitored so that your unique biases, your anxieties, your weaknesses, your needs, your desires can be quantified in such a way that a company can seek to exploit that for profit,’ said the 30-year-old. Wylie, who blew the whistle to British newspaper, The Guardian, in Mar 2018, said at least people now realise how powerful data can be.”

As in any industry, tech companies are made up of humans, some of whom are willing to put money over morality. And as in other consequential industries like construction, engineering, medicine, and law, Wylie argues, regulations are required to protect consumers from that which they do not understand.

Cynthia Murrell, April 4, 2020

Biased? You Betcha

March 11, 2020

Fact checkers probably have one of the hardest jobs, especially with today’s 24/7 news stream. Determining what the facts are is difficult and requires proper research. Fact checkers, however, have an even tougher nut to crack in confirmation bias, the subject of this article from Nieman Lab: “The Fact-Checker’s Dilemma: Humans Are Hardwired To Dismiss Facts That Don’t Fit Their Worldview.”

The article opens with a poignant statement about polarized, insulated ideological communities ratified by their own beliefs. Examples of these communities include those convinced that autism is caused by vaccines, that global warming is a hoax, and assorted political mishmash.

Refuting false information should be simple, especially with cold, hard facts, but that is not the case. Politics, religion, ethnicity, nationality, and other factors influence how and what people believe. What is the cause of this behavior?

“The interdisciplinary study of this phenomenon has exploded over just the past six or seven years. One thing has become clear: The failure of various groups to acknowledge the truth about, say, climate change, isn’t explained by a lack of information about the scientific consensus on the subject. Instead, what strongly predicts denial of expertise on many controversial topics is simply one’s political persuasion.”

What is astonishing is this:

“A 2015 metastudy showed that ideological polarization over the reality of climate change actually increases with respondents’ knowledge of politics, science, and/or energy policy. The chances that a conservative is a climate change denier is significantly higher if he or she is college-educated. Conservatives scoring highest on tests for cognitive sophistication or quantitative reasoning skills are most susceptible to motivated reasoning about climate science.”

While the above example is about conservatives, liberals have their own confirmation bias dilemmas. This behavior is also linked to primal human instincts: in order to join a social group, humans had to assimilate the group’s beliefs and habits. Personally held prejudices do affect factual beliefs, whether they involve politics, religion, or anything else.

Unwelcome information also makes people cling more tightly to wrong information. Anything that threatens an established system encourages closed-minded thinking. It also gives rise to denial and to conspiracy theories that come to be regarded as fact even when there is no information to support them.

It is basic human behavior to reject anything that threatens strongly held interests, dogmas, or creeds, giving way to denial. Politicians manipulate that behavior to their benefit, and the average individual does not realize it. “Waking up,” or becoming aware of how the human brain works in relation to confirmation bias, is key to overcoming false facts.

Whitney Grace, March 11, 2020

Facebook Is Definitely Evil: Plus or Minus Three Percent at a 95 Percent Confidence Level

March 2, 2020

The Verge Tech Survey 2020 allegedly and theoretically reveals the deepest thoughts, preferences, and perceptions of people in the US. The details of these people are sketchy, but that’s not the point of the survey. The findings suggest that Facebook is a problem. Amazon is a problem. Other big tech companies are problems. Trouble right here in digital city.

The findings come from a survey of 1,123 people “nationally representative of the US.” There was no information about income, the groups with which the subjects identify, or methodology. But the result is reported as plus or minus three percent at a 95 percent confidence level. That sure seems okay despite DarkCyber’s questions about:

  • Sample selection. Who pulled the sample, from where, were people volunteers, etc.
  • “Nationally representative” means what? Was it the proportional representation method? How many people from Montana and the other “states”? What about Puerto Rico? Who worked for which company?
  • Plus or minus three percent. That’s a swing at a 95 percent confidence level (a back-of-the-envelope check follows this list). In terms of optical character recognition, that works out to three to six errors per page about 95 percent of the time. Is this close enough for a drone strike or an enforcement action? Oh, right, this is a survey about big tech. Big tech doesn’t think the DarkCyber way, right?
  • What were the socio economic strata of the individuals in the sample?
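For what it’s worth, plus or minus three percent is roughly what the textbook formula gives for a simple random sample of 1,123 people at a worst-case proportion of 0.5. The quick check below is a back-of-the-envelope sketch, not a claim about how The Verge actually selected or weighted its panel.

    import math

    # Margin of error for a proportion from a simple random sample at a
    # 95 percent confidence level (z ~= 1.96), assuming worst case p = 0.5.
    def margin_of_error(sample_size, p=0.5, z=1.96):
        return z * math.sqrt(p * (1 - p) / sample_size)

    print(round(margin_of_error(1123) * 100, 1))  # ~2.9 percent

The formula says nothing about the questions raised above: sample selection, weighting, and who counts as “nationally representative” are exactly what a confidence interval cannot fix.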

What’s revealed or discovered?

First, people love most of the high profile “names” or “brands.” Amazon is numero uno, the Google is number two, and YouTube (which is the Google, in case you have forgotten) is number three. So far, the data look like a name recognition test. “Do you prefer this unknown lye soap or Dove?” Yep, people prefer Dove. But lye soap may be making a comeback.

The stunning finding is that Facebook and Twitter impact society in a negative way. Contrast this to lovable Google and Amazon: 72 percent are favorable to the Google and 70 percent are favorable to Amazon.

Here’s the data about which companies people trust. Darned amazing. People trust Microsoft and Amazon the most.

[Chart: which companies people trust]

Which companies do the homeless and people in rural West Virginia trust?

Plus 72 percent of the sample believe Facebook has too much “power.” What does power mean? No clue for the context of this survey.

Gentle reader, please, examine the article containing these data. I want to go back in time and reflect on the people who struggled in my statistics classes. Painful memories but I picked up some cash tutoring. I got out of that business because some folks don’t grasp numerical recipes.

Stephen E Arnold, March 2, 2020
