Facebook and Twitter: Battle Platforms

February 16, 2018

Social media is, according to an analysis by Lt. Col. Jarred Prier (USAF), a component of information warfare. His “Commanding the Trend: Social Media As Information Warfare” explains how social media trends can function as a lever for action and ideas. Highly recommended. The analysis suggests that social media is more than a way to find a companion and keep up with the kids.

Stephen E Arnold, February 16, 2018

EU Considers Making Platforms Pay for News Content

February 13, 2018

European journalists are sick of giant internet companies profiting from their labor without recompense, we learn from Yahoo News’ article, “Net Giants ‘Must Pay for News’ From Which They Make Billions.” The declaration from nine press agencies comes in support of a proposed EU directive that would require companies like Facebook, Google, and Twitter to pay for the articles that bring so much ad revenue to their platforms. The write-up shares part of the agencies’ plea:

“Facebook has become the biggest media in the world,” the agencies said in a plea published in the French daily Le Monde. “Yet neither Facebook nor Google have a newsroom… They do not have journalists in Syria risking their lives, nor a bureau in Zimbabwe investigating Mugabe’s departure, nor editors to check and verify information sent in by reporters on the ground. Access to free information is supposedly one of the great victories of the internet. But it is a myth,” the agencies argued. “At the end of the chain, informing the public costs a lot of money.”

News, the declaration added, is the second reason after catching up on family and friends for people to log onto Facebook, which tripled its profits to $10 billion (€8.5 billion) last year. Yet it is the giants of the net who are reaping vast profits “from other people’s work” by soaking up between 60 and 70 percent of advertising revenue, with Google’s jumping by a fifth in a year. Meanwhile, ad revenue for news media fell nine percent in France alone last year, “a disaster for the industry”.

Indeed it is. And, we are reminded, a robust press is crucial for democracy itself. Some attempts have been made in France, Germany, and Spain to obtain compensation from these companies, but the limited results were disappointing. The press agencies suggest granting journalists copyright-like “related rights” and assure a concerned European Parliament that citizens will still be able to access information for free online. The only difference, they insist, would be that an appropriate chunk of that ad revenue would go to the people who actually researched and created the content. That sounds reasonable to this writer.

Cynthia Murrell, February 13, 2018

 

The Future of Social Media Is Old School

February 8, 2018

Before social media, the only way to express yourself online was through a mostly anonymous collection of blogs and sites that could not go viral, because virality did not yet exist. Oddly, some bright minds are going back to this approach with txt.fyi, a platform where you can post anything you want without it showing up in search engines. This old-fashioned publishing tool was examined in a recent Wired article, “This Stripped-Down Blogging Tool Exemplifies Antisocial Media.”

“I wanted something where people could publish their thoughts without any false game of social manipulation, one-upmanship, and favor-trading,” [the inventor] says. This is what I found so interesting about his creation. Its antivirality doesn’t necessarily prevent a post from becoming wildly popular. (A txt.fyi URL shared on, say, Facebook could perhaps go viral.) But its design favors messages to someone, not everyone.

 

[The inventor] discovered someone using txt.fyi to write letters to a deceased relative. It was touching and weirdly human, precisely the sort of unconventional expression we used to see a lot more of online. But today we sand down those rough edges, those barbaric yawps, in the quest for social spread. Even if you don’t want to share something, Medium or Tumblr or Snapchat tries to make you. They have the will to virality baked in.

This is a neat idea and might have a longer shelf life than you’d think. That’s because we are firm believers that every good idea on the internet gets retooled for awfulness. (Reddit, anyone?) This quasi-dark web blogging approach is almost certain to be used for nefarious purposes and will become a tool for hate speech and crime.
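The Wired write-up does not explain how txt.fyi keeps posts out of search results, so the following is a guess at a common mechanism rather than a description of the actual service: pages can be served with a noindex directive that well-behaved crawlers honor. A minimal sketch using Python’s standard library; the handler, port, and page content are hypothetical.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PostHandler(BaseHTTPRequestHandler):
    """Serve a stored post while asking crawlers not to index it."""

    def do_GET(self):
        body = b"<html><body><p>a quiet post</p></body></html>"
        self.send_response(200)
        # Hypothetical approach: the X-Robots-Tag header asks compliant
        # search engine crawlers to skip indexing and link-following.
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PostHandler).serve_forever()
```

A directive like this only keeps a page out of indexes; anyone holding the URL can still read and share it, which is consistent with the article’s point that a txt.fyi link could still spread on Facebook.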

Patrick Roland, February 8, 2018

Facebook: Regulate It

January 26, 2018

I read “Facebook Is Addictive and Should Be Regulated Like a Cigarette Company: Salesforce CEO.” Yes, another call to regulate online services. I noted this statement in the authoritative USA Today “real journalism” article:

“I think that you do it exactly the same way that you regulated the cigarette industry. Here’s a product: Cigarettes. They’re addictive. You know, they’re not good for you…There’s a lot of parallels,” Salesforce CEO Marc Benioff told CNBC’s Squawk Alley.

I wonder if Salesforce has forgotten the advice it received from pundit Steve Gillmor about attention and the role it would play in boosting Salesforce’s customer activity.

Must be a different thing. Email, phone calls, and reports about “sales” calls. Different stuff with contacts, analyses, mechanisms for capturing data from multi-tenant systems, etc. Different, right?

Stephen E Arnold, January 26, 2018

How SEO Has Shaped the Web

January 19, 2018

With the benefit of hindsight, big-name thinker Anil Dash has concluded that SEO has contributed to the ineffectiveness of Web search. He examines how we got here in his Medium article, “Underscores, Optimization & Arms Races.” Starting with the year 2000, Dash traces the development of Internet content management systems (CMSes), of which he was a part. (It is a good, brief summary for anyone who was not following along at the time.) WordPress is an example of a CMS.

As Google’s influence grew, online publishers became aware of an opportunity: they could game the search algorithm to move their sites to the top of “relevant” results by playing around with keywords and other content details. The question of whether websites should bow to Google’s whims seemed to go unasked as site after site fell into this pattern, later known as search engine optimization. For Dash, the matter was symbolized by the question of whether hyphens or underscores should represent spaces in web addresses. Now, of course, one can use either without upsetting Google’s algorithm, but that was not the case at first. When Google’s Matt Cutts stated a preference for the hyphen in 2005, most publishers fell in line. Dash eventually, and very reluctantly, followed suit; for him, the choice represented nothing less than the very nature of the Internet.
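For readers who did not wrestle with CMS permalink settings, the dispute boiled down to how a title becomes a URL slug. Here is a minimal sketch of the hyphen convention Cutts endorsed; the function below is purely illustrative and is not Dash’s code.

```python
import re

def slugify(title: str) -> str:
    """Build a URL slug, joining words with hyphens rather than underscores."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

print(slugify("Underscores, Optimization & Arms Races"))
# -> underscores-optimization-arms-races
```

The practical argument for hyphens was that Google treated them as word separators, so a hyphenated slug could match each word in a query, while underscores were historically treated as characters that joined words together.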

He writes:

You see, the theory of how we felt Google should work, and what the company had often claimed, was that it looked at the web and used signals like the links or the formatting of webpages to indicate the quality and relevance of content. Put simply, your search ranking with Google was supposed to be based on Google indexing the web as it is. But what if, due to the market pressure of the increasing value of ranking in Google’s search results, websites were incentivized to change their content to appeal to Google’s algorithm? Or, more accurately, to appeal to the values of the people who coded Google’s algorithm?

Eventually, even Dash and his CMS caved and switched to hyphens. What he did not notice at the time, he muses, was the unsettling growth of an entire SEO community centered on appeasing these algorithms. He concludes:

By the time we realized that we’d gotten suckered into a never-ending two-front battle against both the algorithms of the major tech companies and the destructive movements that wanted to exploit them, it was too late. We’d already set the precedent that independent publishers and tech creators would just keep chasing whatever algorithm Google (and later Facebook and Twitter) fed to us. Now, the challenge is to reform these systems so that we can hold the big platforms accountable for the impacts of their algorithms. We’ve got to encourage today’s newer creative communities in media and tech and culture to not constrain what they’re doing to conform to the dictates of an opaque, unknowable algorithm.

Is that doable, or have we gone too far toward appeasing the Internet behemoths to turn back?

Cynthia Murrell, January 19, 2018

Facebook Experiment Harming Democracy

January 16, 2018

Facebook might seem like the last place on the Web to undermine democratic governments, but according to The Guardian, in “‘Downright Orwellian’: Journalists Decry Facebook Experiment’s Impact On Democracy,” it is doing just that. Facebook is being compared to Big Brother over a news feed experiment that removed professional media stories from users’ feeds in six countries. Let the article break it down for you:

The experiment, which began 19 October and is still ongoing, involves limiting the core element of Facebook’s social network to only personal posts and paid adverts.

So-called public posts, such as those from media organisation Facebook pages, are being moved to a separate “explore” feed timeline. As a result, media organisations in the six countries containing 1% of the world’s population – Sri Lanka, Guatemala, Bolivia, Cambodia, Serbia and Slovakia – have had one of their most important publishing platforms removed overnight.

In other words, “Eek!” These countries have volatile governments, and any threat to their news outlets is a serious blow to free speech. Also, the news outlets in these countries do not have the budgets to pay Facebook’s post-boosting fees. Facebook had been a free way to spread the news, but outlets’ reach on the platform fell more than 50 percent in many of the countries where this experiment was tested.

Even if Facebook were to stop the experiment, some of the media outlets would not recover. It is curious that Facebook did not test the news feed experiment in another country. Oh wait, we know why. It did not want to deal with the backlash from Western countries and the countless people who whine on the Internet. In the smaller countries, there is less accountability but more damage on the home front. Nice job, Facebook!

Whitney Grace, January 16, 2018

Indian Regulators Pursue Market Manipulators Around Web

January 11, 2018

Apparently, efforts by India’s market watchdog have driven manipulators in that country to explore alternative methods of communication. So we learn from the article “Market Manipulators Take To Dark Web, Whatsapp As Sebi Steps Up Surveillance” at India’s NDTV. Note that a “multibagger” is a stock tip promising multi-fold returns. We’re told:

Market manipulators have hooked onto dark web and private chat groups on messaging apps like WhatsApp and Telegram for sharing ‘multibagger’ stock tips and unpublished price sensitive information about listed firms. This has prompted the exchanges and the regulator to beef up the ‘whistleblower’ framework to encourage people, including investors and those working with various market intermediaries, to anonymously give a tip-off on such groups. The shift to these platforms follow an enhanced vigil by the capital markets watchdog Sebi (Securities and Exchange Board of India) and the stock exchanges on social media platforms like Facebook and Twitter, while the regulator can also seek call data records from telecom firms for its probe.

The article notes that both the National Stock Exchange of India and the Bombay Stock Exchange have tip-off systems in place and that officials are considering ways to reward whistleblowers. Both exchanges are using social media analytics to monitor for rumors and news reports about companies they have listed. They are also analyzing the last year’s worth of trade data for such companies, hoping to spot any breaches of norms. So far, Sebi has taken action against some parties for providing investment advice without registration. The article observes that last year Sebi suggested banning the exchange of “unauthorized trading tips” through chat apps, social media, and securities-related games and competitions; however, no such regulation has been put in place yet.
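The article does not describe how the exchanges’ social media analytics actually work, so the sketch below is only a rough illustration of what rumor monitoring might look like at its simplest: matching posts against listed-company names and tip-style keywords. The company names, keywords, and sample posts are all made up.

```python
# Hypothetical rumor-monitoring sketch; real surveillance systems are far
# more sophisticated (entity resolution, volume anomalies, network analysis).
LISTED_COMPANIES = {"ACME Industries", "Bharat Widgets"}
TIP_KEYWORDS = {"multibagger", "sure shot", "target price", "inside info"}

def flag_post(text: str) -> bool:
    """Return True when a post names a listed firm and uses tip-style language."""
    lowered = text.lower()
    names_company = any(name.lower() in lowered for name in LISTED_COMPANIES)
    sounds_like_tip = any(keyword in lowered for keyword in TIP_KEYWORDS)
    return names_company and sounds_like_tip

posts = [
    "ACME Industries is a guaranteed multibagger, buy before Monday!",
    "Quarterly results for Bharat Widgets are due next week.",
]
for post in posts:
    if flag_post(post):
        print("Flag for review:", post)
```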

Cynthia Murrell, January 11, 2018

LinkedIn: Not Just for Job Seekers and Attention Junkies

January 8, 2018

Last year I spotted this write up: “Spies Are Watching … on LinkedIn.” My first reaction was, “This is news?” I set the item aside and watched my newsfeeds to see if the story had “legs.” It did not. I thought I would document the existence of the write up and invite you, gentle reader, to figure out whether this is old news, new news, or just flimflam news.

The gist is that an outfit known as the BfV (shorthand for Bundesamt für Verfassungsschutz, Germany’s domestic intelligence agency) monitors LinkedIn for espionage actors. The key claim of the write up strikes me as:

Chinese intelligence has used LinkedIn to target at least 10,000 Germans, possibly to recruit them as informants.

I wonder if other intelligence agencies monitor LinkedIn. I suppose that is a possibility.

The write up includes these faked profiles:

“Rachel Li”, identified as a “headhunter” at “RiseHR”

“Alex Li”, a “Project Manager at Center for Sino-Europe Development Studies”

“Laeticia Chen”, a manager at the “China Center of International Politics and Economy” whose attractive photo was reportedly swiped from an online fashion catalog, according to a BfV official

I have not spotted any recent information about the number of faked profiles on LinkedIn. My hunch is that most of the résumés on the service might qualify as faked, but that’s just my supposition.

With Microsoft, LinkedIn’s owner, making small yet meaningful changes to the service, I wonder how these “fake” spy-related profiles and discussions, if any, will be filtered.

Next time you accept a “friend” on LinkedIn, will you ask yourself, “Is this fine person a spy?”

Stephen E Arnold, January 8, 2018

Sisyphus Gets a Digital Task: Defining Hate Speech, Fake News, and Illegal Material

January 2, 2018

I read “Germany Starts Enforcing Hate Speech Law.” From my point of view in Harrod’s Creek, Kentucky, defining terms and words is tough. When I was a debate team member, our coach Kenneth Camp insisted that each of the “terms” in our arguments and counterarguments be defined. When I went to college and joined the debate team, our coach, a person named George Allen, added a new angle to the rounded corners of definitions. The idea was “framing.” As I recall, one not only defined terms but also selected factoids, sources, and signs which would put our opponents in a hen house from which they could escape only with scratches and maybe a nasty cut or two.

The BBC (and, of course, the author of the article), along with Germany’s lawmakers, were not thinking about definitions (high school), framing (setting up the argument so winning was easier), or the nicks and bumps incurred when working free of the ramshackle structure.

The write up states:

Germany is set to start enforcing a law that demands social media sites move quickly to remove hate speech, fake news and illegal material.

So what are hate speech, fake news, and illegal material? The BBC does not raise this question.

I noted:

Germany’s justice ministry said it would make forms available on its site, which concerned citizens could use to report content that violates NetzDG or has not been taken down in time.

And what do the social media outfits have to do?

As well as forcing social media firms to act quickly, NetzDG requires them to put in place a comprehensive complaints structure so that posts can quickly be reported to staff.

Is a mini trend building in the small pond of clear thinking? The BBC states:

The German law is the most extreme example of efforts by governments and regulators to rein in social media firms. Many of them have come under much greater scrutiny this year as information about how they are used to spread propaganda and other sensitive material has come to light. In the UK, politicians have been sharply critical of social sites, calling them a “disgrace” and saying they were “shamefully far” from doing a good job of policing hate speech and other offensive content. The European Commission also published guidelines calling on social media sites to act faster to spot and remove hateful content.

Several observations:

  1. I am not sure if there are workable definitions for the concepts. I may be wrong, but point of view, political orientation, and motivation may be spray painting gray over already muddy concepts.
  2. Social media giants do not have the ability to move quickly. I would suggest that the largest of these targeted companies are not sure what is happening amidst their programmers, algorithms, and marketing professionals. How can one react quickly when one does not know who, what, or where an action occurs?
  3. Attempts to shut down free flowing information will force those digital streams into the murky underground of hidden networks with increasingly labyrinthine arabesques of obfuscation used to make life slow, expensive, and frustrating for enforcement authorities.

Net net: We know that the BBC does not think much about these issues; otherwise, a hint of the challenges would have filtered into the write up. We know that the legislators are interested in getting control of social media communications, and filtering looks like a good approach. We know that the social media “giants” are little more than giant, semi-organized ad machines designed to generate data and money. We know that those who allegedly create and disseminate “hate speech, fake news and illegal material” will find communication channels, including old-fashioned methods like pinning notes on a launderette’s bulletin board or marking signs on walls.

Worth watching how these “factors” interact, morph, and innovate.

Stephen E Arnold, January 2, 2018

If You Want Search Engines to Eliminate Fake News, Cautiously Watch Russia

December 21, 2017

There is a growing rallying cry for social media and search companies to better police fake news. This is an admirable plan, because nobody should be misled by false information and propaganda. However, as history has shown, those in charge of misinformation and propaganda can often turn changes like this to their advantage. Take, for example, the recent Motherboard story, “How Russia Polices Yandex, Its Most Popular Search Engine,” which details how Russia aimed to get rid of “fake news” but really only encouraged more of it.

The story says,

This year, the “news aggregator law” came into effect in Russia. It requires websites that publish links to news stories with over one million daily users (Yandex.News has over six million daily users) to be responsible for all the content on their platform, which is an enormous responsibility.

 

“Our Yandex.News team has been actively working to retain a high quality service for our users following new regulations that impacted our service this past year,” Yandex told Motherboard in a statement, adding that to comply with new regulations, it reduced the number of sources that it aggregated from 7,000 to 1,000, which have official media licenses.

In short, since the government oversees part of Yandex, it can make it harder to publish stories that are not favorable to the state. That is food for thought, especially for the Mark Zuckerbergs of the world calling for more government oversight of social media. You might not get exactly what you hoped for when a third party starts calling the shots.

Patrick Roland, December 21, 2017
