Australia: Facebook and Google Will Not Be Allowed to Kill News

April 20, 2020

“Australia to Force Technology Giants Facebook and Google to Pay for News Content” expresses something News Corp’s Rupert Murdoch has long desired: money for real news.

The write up reports:

Social media giants Facebook and Google will be forced to pay Australian media companies for sharing their content or face sanctions under a landmark decision by the Morrison government. The move comes as the media industry reels from tumbling advertising revenue, already in decline before the Covid-19 coronavirus outbreak collapsed the market.

Several questions may soon be answered:

  • Will Facebook and Google tie up the “pay for news” effort in the courts?
  • If the invoices are sent, will Facebook and Google pay them, or will they stall, negotiate, or simply ignore the demands?
  • Will the law cause Facebook and Google to set up their own news gathering operations and subsidize them via ad revenue; that is, reinvent traditional news? (Remember: Apple and Google have teamed up to deal with coronavirus. The “pay for news” effort may force a similar shotgun marriage.)
  • Will other countries, like the members of the Five Eyes, get with this “pay for news” program?

Net net: Facebook and Google face a management moment that could become “real news.”

Stephen E Arnold, April 20, 2020

Facebook: Disappearing Snapchat Content?

March 24, 2020

Ever-vigilant TechCrunch published “Instagram Prototypes Snapchat-Style Disappearing Text Messages.” The article reports:

Instagram has prototyped an unreleased ephemeral text messaging feature that clears the chat thread whenever you leave it.

The function seems to complement WhatsApp’s disappearing content.

Will there be unintended consequences of these measures? DarkCyber believes that Facebook has a knack for sparking discussion about its policies, goals, and intentions among some customer segments.

Stephen E Arnold, March 24, 2020

WhatsApp: Indexed by Google

March 11, 2020

The Orissa Post reports, “Google Indexes Private WhatsApp Group Chat Links.” As a result of the search indexing, assorted private chat groups were summarily forced open for anyone to join. The IANS newswire reports,

“According to a report in Motherboard, invitations to WhatsApp group chats were being indexed by Google. The team found private groups using specific Google searches and even joined a group intended for NGOs accredited by the UN and had access to all the participants and their phone numbers. Journalist Jordan Wildon said on Twitter that he discovered that WhatsApp’s ‘Invite to Group Link’ feature lets Google index groups, making them available across the internet since the links are being shared outside of WhatsApp’s secure private messaging service. ‘Your WhatsApp groups may not be as secure as you think they are,’ Wildon tweeted Friday, adding that using particular Google searches, people can discover links to the chats. According to app reverse-engineer Jane Wong, Google has around 470,000 results for a simple search of ‘chat.whatsapp.com’, part of the URL that makes up invites to WhatsApp groups.”

A spokesperson for WhatsApp confirmed that publicly posted invite links would indeed be available to other WhatsApp users, and insisted that folks need not worry that invites shared privately will be exposed this way. On the other hand, Google’s public search liaison seemed to place the blame squarely on WhatsApp. He tweeted:

“Search engines like Google & others list pages from the open web. That’s what’s happening here. It’s no different than any case where a site allows URLs to be publicly listed. We do offer tools allowing sites to block content being listed in our results.”

Perhaps both companies could have handled this issue with more consideration. We wonder whether WhatsApp has since taken advantage of those content-blocking tools.
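
What do those “tools” usually amount to? Below is a minimal Python sketch, using the standard library’s robots.txt parser, of how a Disallow rule steers a well-behaved crawler away from invite URLs. The “/invite/” path and the idea that WhatsApp would rely on robots.txt at all are illustrative assumptions, not a description of what the company actually deployed.

```python
# A minimal sketch (not WhatsApp's actual setup) of the kind of blocking
# Google's liaison refers to: a robots.txt Disallow rule that well-behaved
# crawlers honor. The "/invite/" path is an illustrative assumption.
from urllib.robotparser import RobotFileParser

hypothetical_robots_txt = """
User-agent: *
Disallow: /invite/
""".splitlines()

parser = RobotFileParser()
parser.parse(hypothetical_robots_txt)

# A crawler that respects robots.txt would skip the invite links...
print(parser.can_fetch("Googlebot", "https://chat.whatsapp.com/invite/abc123"))  # False
# ...while remaining free to fetch ordinary pages on the same host.
print(parser.can_fetch("Googlebot", "https://chat.whatsapp.com/faq"))            # True
```

Note that a robots.txt rule only asks crawlers to stay away; a noindex directive is the stronger option for keeping already-discovered URLs out of search results.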

Cynthia Murrell, March 11, 2020

Facebook: A Blunder Down Under?

March 10, 2020

DarkCyber noted “Australia sues Facebook over Cambridge Analytica, fine could scale to $529BN.” The modest fine imposed by Britain has not dissuaded Australia from boosting the cost of data impropriety. Facebook — yes, the Cambridge Analytica matter — may incur a hefty fine. The write up states:

The suit alleges the personal data of Australian Facebook users was disclosed to the This is Your Digital Life app for a purpose other than that for which it was collected — thereby breaching Australia’s Privacy Act 1988. It further claims the data was exposed to the risk of being disclosed to Cambridge Analytica and used for political profiling purposes, and passed to other third parties.
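
How does a privacy action scale to a headline figure like AU$529 billion? The arithmetic is a statutory per-contravention maximum multiplied by the number of affected accounts. Here is a back-of-the-envelope sketch; both inputs are the figures cited in the TechCrunch piece and should be treated as reported values, not amounts established in court.

```python
# Back-of-the-envelope version of the headline number. Both inputs are
# figures reported in the TechCrunch article, used here for illustration only.
per_breach_maximum = 1_700_000    # AU$1.7M maximum penalty per contravention
affected_accounts = 311_127       # Australian accounts said to be involved

theoretical_maximum = per_breach_maximum * affected_accounts
print(f"Theoretical maximum: AU${theoretical_maximum / 1e9:.0f} billion")
# -> Theoretical maximum: AU$529 billion
```

The scale comes from treating every affected account as a separate potential contravention; no one expects a regulator to collect a sum that large.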

The potential fine is sufficiently large to catch the attention of the “connect everyone” company. In NBC News’ math that is about $20.00, right?

On the other hand, nothing has applied the brakes to Facebook’s activities for years. Money alone may not be the brake that finally slows the machine.

Stephen E Arnold, March 10, 2020

Facebook Is Definitely Evil: Plus or Minus Three Percent at a 95 Percent Confidence Level

March 2, 2020

The Verge Tech Survey 2020 allegedly and theoretically reveals the deepest thoughts, preferences, and perceptions of people in the US. The details of these people are sketchy, but that’s not the point of the survey. The findings suggest that Facebook is a problem. Amazon is a problem. Other big tech companies are problems. Trouble right here in digital city.

The findings come from a survey of 1,123 people “nationally representative of the US.” There was no information about income, the groups with which the respondents identify, or methodology. But the reported margin of error is plus or minus three percent at a 95 percent confidence level. That sure seems okay despite DarkCyber’s questions about:

  • Sample selection. Who pulled the sample, from where, were people volunteers, etc.
  • “Nationally representative” means what? Was it the proportional representation method? How many people from Montana and the other “states”? What about Puerto Rico? Who worked for which company?
  • Plus or minus three percent. That’s quite a swing at a 95 percent confidence level. In terms of optical character recognition, that works out to three to six errors per page about 95 percent of the time. Is this close enough for a drone strike or an enforcement action? Oh, right, this is a survey about big tech. Big tech doesn’t think the DarkCyber way, right? (A quick back-of-the-envelope check of the figure appears after this list.)
  • What were the socio economic strata of the individuals in the sample?
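
Here is the back-of-the-envelope check promised above. For a sample of 1,123, the textbook margin of error at a 95 percent confidence level does come out near three percent, but only under the generous assumption of a simple random sample, which the write up does not confirm.

```python
# Textbook margin of error for n = 1,123 respondents at a 95 percent
# confidence level, assuming (generously) a simple random sample and the
# worst-case proportion p = 0.5.
import math

n = 1123    # sample size reported by The Verge
p = 0.5     # worst-case proportion; it maximizes the margin of error
z = 1.96    # z-score for 95 percent confidence

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {margin_of_error * 100:.1f} percent")
# -> Margin of error: +/- 2.9 percent
```

The arithmetic is consistent with the advertised plus or minus three percent; it answers none of the sampling questions above.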

What’s revealed or discovered?

First, people love most of the high profile “names” or “brands.” Amazon is numero uno, the Google is number two, and YouTube (which is the Google, in case you have forgotten) is number three. So far, the data look like a name recognition test. “Do you prefer this unknown lye soap or Dove?” Yep, people prefer Dove. But lye soap may be making a comeback.

The stunning finding is that Facebook and Twitter impact society in a negative way. Contrast this with lovable Google and Amazon: 72 percent view the Google favorably and 70 percent view Amazon favorably.

The survey also includes data about which companies people trust. Darned amazing. People trust Microsoft and Amazon the most.

Which companies do the homeless and people in rural West Virginia trust?

Plus, 72 percent of the sample believe Facebook has too much “power.” What does power mean? The survey offers no clue.

Gentle reader, please examine the article containing these data. I want to go back in time and reflect on the people who struggled in my statistics classes. Painful memories, but I picked up some cash tutoring. I got out of that business because some folks don’t grasp numerical recipes.

Stephen E Arnold, March 2, 2020

Clever Teens and a Less Than Clever Instagram

March 1, 2020

Teenagers are young, inexperienced, and will do anything for a laugh. Most of the time their antics result in trouble with horrible consequences, but this time the victim is Instagram. Instagram is one of the most popular social media platforms for teenagers and, being a generation that never knew a world without the Internet, they have figured out how to hack, a.k.a. mess with, the algorithm. CNET has the story: “Teens Have Figured Out How To Mess With Instagram’s Tracking Algorithm.”

Teenagers may post their entire lives on social media, but some of them are concerned about platforms such as Instagram tracking their data. They especially do not like being tracked, so they formed a plan. Using groups of trusted friends with access to multiple accounts, teenagers are fooling Instagram. Here is how:

“First, make multiple accounts. You might have an Instagram account dedicated to you and friends, or another just for your hobby. Give access to one of these low-risk accounts to someone you trust.

Then request a password reset, and send the link to that trusted friend who’ll log on from a different device. Password resets don’t end Instagram sessions, so both you and the second person will be able to access the same account at the same time.

Finally, by having someone else post the photo, Instagram grabs metadata from a new, fresh device. Repeat this process with a network of, say, 20 users in 20 different locations with 20 different devices? Now you’re giving Instagram quite the confusing cocktail of data.”

The hilarious part is that, while the practice is not against Instagram’s policies, parent company Facebook advises against it because of security risks. It is laughable that Facebook is worried about privacy when that company and others collect user data to tailor Internet experiences with personalized ads. However, if one person on a shared account posts something malicious, the entire group is accountable.

In order to have access to one of these “hacking” accounts, users must follow strict rules: post only content that the original owner approves, do not accept follow requests or follow other accounts, and understand that any violation results in dismissal from the group.

Clever teens. Less clever Instagram and, by extension, the fun folks at Facebook.

Whitney Grace, March 1, 2020

Facebook PR: Lean In, Reframe, and Output Word Salad

February 28, 2020

“Facebook’s Sheryl Sandberg Defends Her Company and Her Reputation in Wide-Ranging Interview” is an interesting example of corporate PR, “running for office” preparation, and really heartfelt, super sincere explanations.

A techno journalist of significant stature wrote a book about Facebook. Allegedly the effort involved hundreds of interviews with Facebookers past and present. DarkCyber has not read Facebook: The Inside Story. DarkCyber does not use Facebook to locate “friends”—although one of our deceased dogs has a Facebook page.

This interview with an NBC journalist appears to be a little bit about the book, a little bit about Facebook; for example:

“I wish so much that the world could see the Mark I know,” Sandberg said. “Mark is an enormously, enormously talented guy. He has a great product sense. … People think he doesn’t understand people. That’s just clearly wrong.”

Right, clearly wrong. Facebook is suing NSO Group. Facebook is suing a partner, OneAudience. Facebook is using legal weapons to demonstrate how Facebook “understands people.” Right. DarkCyber thinks it recalls a bit of an issue with Cambridge Analytica, a zippy researcher, and a whistleblower with a keen fashion sense. But maybe that was a hallucination. People do have them. People also reshape facts into confections of delight.

We noted this statement from the NBC story:

Sandberg also offered her most robust defense to date of Facebook’s business model and its vast collection of personal data, which she said was necessary to offer users a better content and advertising experience. “There is growing concern, which is based on a lack of understanding, that we are using people’s information in a bad way. We are selling it. We are giving it away. We are violating it. None of that’s true. We do not sell data,” she said. “Here’s what we do: We take your information and we show you personalized ads … [to give you] a much better experience.”

Yep, experience.

Lean in. Be sincere. Deliver factoids. Let the lawyers do their work.

Mr. Zuckerberg understands people. Right.

Stephen E Arnold, February 28, 2020

Want Facebook Statistics?

February 19, 2020

If you want a roundup of Facebook statistics, take a look at “Facebook Statistics You Need to Know.” The data come from secondary sources. You may want to verify the factoids before you head to a job interview at Facebook. If you are applying for work at a social media company or a mid tier consulting firm, go with the numbers. Here are three that DarkCyber noted:

An okay, boomer number: People aged 65 and over are the fastest-growing demographic on Facebook

An Amazon wake up call: In the U.S., 15% of social media users use Facebook to shop

TV executive, are you in touch with viewer preferences? Square Facebook videos get 35% more views than landscape videos

No data are presented about the percentage of Mr. Zuckerberg’s neighbors in Palo Alto who dislike him, however.

Stephen E Arnold, February 19, 2020

Facebook: Chock Full of Good Ideas

December 31, 2019

Investigators are not a priority for Facebook. How does DarkCyber know this? “WhatsApp to Add ‘Disappearing Messages’ Feature Soon” explained a function that may give those managing interesting groups more control over content.

Here’s the statement which caught the attention of our alert service:

With the ‘Delete Messages’ feature, group admins will be able to select a specific duration for messages on the group and once a message crosses the duration, it will be automatically deleted, news portal GSMArena reported recently. Initially, the new feature was expected to be available for both individual chats and group chats, but now the report claims that the feature will be limited to group chats only. The ‘Delete Messages’ feature for group chats will make it easy for the admins to manage old messages and chats.

How many coordinators will find this new feature helpful? Too many.

Stephen E Arnold, December 31, 2019

Business 101: Incentives Work at Facebook. Talk, Not So Much

December 18, 2019

Many years ago, I worked on a project for a very large, quite paranoid company. I am not sure how I landed the job of interviewing about two dozen unit CEOs about technology. As I recall, my task was to group the CEOs into three categories:

Bluebirds—These were the CEOs who understood technology germane to their business unit, evidenced no particular fear of testing and integrating such technology, and were following the company’s marching orders.

Canaries—These were executives who evidenced fear of technology. These individuals were unlikely to use technology to reduce costs and staff while increasing revenue and profits for the company.

Sparrows—These were hapless commodity CEOs who did not know much about technology, were happy snacking near careless MBAs lunching in the park, and generally reacted to what most other CEOs were doing with regard to technology.

I had a bunch of fancy criteria, scoring sheets, prepared and consistent questions, plus other odds and ends required for such a subjective job.

My findings, I believe, revealed that the technology question was stupid. The CEOs were accountants and lawyers. Knowledge of technology was abysmal. The CEOs as a group responded to one thing—bonuses and raises. Chatter about technology was essentially irrelevant.

Whatever DNA this group of big time “leaders” had was warped in the intense radiation of benchmarks needed to take home a fat pay packet and get a bonus big enough to choke an investment banker.

I thought of this project when I read “Facebook Is Still Prioritizing Scale over Safety.” There’s quite a bit of yada yada in the write up, but this segment explains what drives Facebook:

Facebook calls its product managers’ ability to hit their metric “impact,” and impact can count for high percentages of product managers’ evaluations, though it varies by position and level. At the end of the evaluation process, each individual is assigned a rating by a manager — ranging from “doesn’t meet expectations” to “redefines expectations” — which is algorithmically tied to their compensation. Managers at Facebook aren’t given discretionary raise pools (raises are handed out evenly based on ratings) and there is no appeals process for evaluations, making a good rating paramount if you work at Facebook.

In order to be a bluebird, Facebook managers follow the incentive breadcrumbs. Why? Money. Public statements and other interesting Facebook behavior are irrelevant.

Why? The explanation may be found in the precepts of high school science club management methods. These are not taught in MBA school; these are learned in high school science club meetings and late night dorm sessions among programmers and assorted engineering wizards.

To fix Facebook, change the incentives.

Stephen E Arnold, December 18, 2019
