Instagram: Another Facebook Property in the News

March 22, 2019

Instagram (IG or Insta) has become an important social media channel. Here’s a quick example:

My son and his wife have opened another exercise studio in Washington, DC. How was the service promoted? Instagram.

Did the Instagram promotions for the new facility work? Yes, quite well.

The article “Instagram Is the Internet’s New Home for Hate” attempts to explain that Facebook’s Instagram is more than a marketing tool: it is a source of misinformation.

The write up states:

Instagram is teeming with these conspiracy theories, viral misinformation, and extremist memes, all daisy-chained together via a network of accounts with incredible algorithmic reach and millions of collective followers—many of whom, like Alex, are very young. These accounts intersperse TikTok videos and nostalgia memes with anti-vaccination rhetoric, conspiracy theories about George Soros and the Clinton family, and jokes about killing women, Jews, Muslims, and liberals.

We also noted this statement:

The platform is likely where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny. Part of this is due to its reputation among older users, who generally use it to post personal photos, follow inspirational accounts, and keep in touch with friends. Many teenagers, however, use the platform differently—not only to connect with friends, but to explore their identity, and often to consume information about current events.

Is it time to spend more time on Instagram? How do intelligence-centric software systems index Instagram content? What non-obvious information can be embedded in a picture or a short video? Who or what examines content posted on the service? Can images with hashtags be used to pass information about possibly improper or illegal activities?

Stephen E Arnold, March 22, 2019

Facebook: Ripples of Confusion, Denial, and Revisionism

March 18, 2019

Facebook contributed to an interesting headline about the video upload issue related to the bad actor in New Zealand. Here’s the headline I noted as it appeared on Techmeme’s Web page:


The Reuters’ story ran a different headline:


What caught my attention is the statement “blocked at upload.” If a video were blocked at upload, were those videos removed? If blocked, then the number of videos removed after the fact drops to 300,000.

This type of information is typical of the coverage of Facebook, a company which has become the embodiment of social media.

There were two other interesting Facebook stories in my news feed this morning.

The first concerns a high profile Silicon Valley investor, Marc Andreessen. The write up reports and updates a story whose main point is:

Facebook Board Member May Have Met Cambridge Analytica Whistleblower in 2016.


When the Best and the Brightest Tech Stars Fail

March 15, 2019

Two outages. Two explanations.

Google’s March 12, 2019, outage was explained this way on the Google Cloud Status Dashboard:

On Monday 11 March 2019, Google SREs were alerted to a significant increase in storage resources for metadata used by the internal blob service. On Tuesday 12 March, to reduce resource usage, SREs made a configuration change which had a side effect of overloading a key part of the system for looking up the location of blob data. The increased load eventually led to a cascading failure.

I like the phrase “cascading failure.” Sounds inevitable.

Facebook’s explanation of its one-day-plus outage appeared in “Biggest Facebook Outage in its History Due to Database Issues.” The explanation was:

The company’s databases were “overloaded.”

Concentration, just like in the mainframe days, can create some challenges for those downstream. If the big outfits cannot deal with failure, I don’t feel bad when my Android phone complains it cannot connect to the Google Play store where malware may still live.

Stephen E Arnold, March 15, 2019

Building Data Sets

March 14, 2019

I read “Why Is It Legal to Collect Data on Kids, Let Alone Sell It?” The write up comes from a person who contributed in some way to the fine operation that Facebook embodies. Now that person is asking questions about building databases.

I noticed this quote, allegedly made by one of the Facebookers past:

“Why is it okay for credit card companies to sell financial records?” McNamee said at the South by Southwest conference in Austin over the weekend. “Why is it legal for cell companies to sell location data? Why is it legal for companies that make apps for health and wellness to sell or trade our data? Why is it legal for anybody on the web to transact in our web history? Why is it legal to collect data on kids under 18, much less sell it?”

You can read the rest of the article, but I want to offer some answers to these questions; to wit:

  1. Because we can collect and build databases. Users are too stupid to know what we are doing.
  2. Because there are no consequences. Regulators and lawyers are as clueless as the users.
  3. Because it is easy if you are smart like us. Anyone not working at a Google- or Facebook-type company or an adviser to one of these outfits is not going to be able to keep up with us.
  4. Because we want to do things which make people like us feel cool. Snort, snort, snort.
  5. Because we never understood the silliness related to any philosophical bedrock other than the mantra of “me, me, me” or what I call digital existentialism. One is what one does to attract attention from those just like “one.”
  6. Because it is cool to do the mea culpa thing in public.

Stephen E Arnold, March 14, 2019

Facebook Tracking Amidst Privacy Assertions

March 7, 2019

Privacy International published “Guess What? Facebook Still Tracks You on Android Apps (Even If You Don’t Have a Facebook Account).”

I am not particularly surprised. The chatter about Facebook and its privacy initiative is one of those “pivot” plays. Talk is cheap, unlike online advertising.

The write up states:

seven apps, including Yelp, the language-learning app Duolingo and the job search app Indeed, as well as the King James Bible app and two Muslim prayer apps, Qibla Connect and Muslim Pro, still send your personal data to Facebook before you can decide whether you want to consent or not. Keep in mind: these are apps with millions of installs.

There are some recommendations in the write up. DarkCyber suggests you read these before spending much time on statements like this one from Facebook: “A Privacy-Focused Vision for Social Networking.”

Stephen E Arnold, March 7, 2019

A Hip Bro Excuse: We Cannot Modify Our System and Software

March 5, 2019

I was zipping through news this morning, and I spotted “Google to Ban Political Ads Ahead of Federal Election, Citing New Transparency Rules.” The “rules” apply to Canada, not the United States. Google will not sell ads. That’s interesting.

The main point of the article for me was the reason Google will turn down money and leave a giant pile of cash on the table. That reason appears in this sentence from the write up (which I assume is true, of course):

Google is banning political advertising on its platforms ahead of the Canadian federal election because of new ad transparency rules it says would be too challenging to comply with.

Challenge, when I hear the word, means “too darned difficult.” A connotation for me is “what a waste of time and effort.” Another is a variation on the Bezos basic, “Just walk away”; for instance, “Hasta la vista, Nueva York.”

Is adapting Google’s AdSense too challenging for a company which has a boatload of talented programmers?

What I find interesting is that Facebook has the same limitation. Do you recall that Facebook users were going to get a control that would allow them to delete some of their data? The delay, I heard, is a consequence of figuring out how to make delete work.

Net net: Two outfits with smart people are unable to modify their respective systems.

Do I believe that technical modifications are too difficult?

Yeah, I believe the moon is made of green cheese as well. The questions these technical challenges beg include:

  • What is the specific problem?
  • Is the system intractable so that other changes are too great a challenge? If so, what functions cannot be altered?
  • What is the engineering approach at Google which renders its software unfixable?
  • Are Google’s (and Facebook’s) engineers less effective than technical personnel at other companies; for example, Apple or Microsoft?
  • What’s the personnel problem? Is underpaying certain ethnic groups an issue?

Maybe regulations are the optimal way to deal with companies unable to comply with government regulations?

Stephen E Arnold, March 5, 2019

AI: The Facebook View for the Moment

February 21, 2019

We get some insight into the current trajectory of AI from Fortune’s article, “Facebook’s Chief A.I. Scientist Yann LeCun On the Future of Computer Chips, Lawnmowers, and Deep Learning.” The write-up points to a talk on AI hardware LeCun gave at the recent International Solid-State Circuits Conference in San Francisco.

Writer Jonathan Vanian highlights three points. First, he notes the advent of specialized chips designed to save energy, which should facilitate the use of more neural networks within data centers. This could mean faster speech translation, for example, or more effective image analysis. The tech could even improve content moderation, a subject much on Facebook’s mind right now. Then there are our “smart” devices, which can be expected to grow more clever as their chips get smaller. For instance, Vanian envisions a lawn mower that could identify and pull weeds. He notes, though, that battery capacity is another conundrum altogether.

Finally, we come to the curious issue of “common sense”—so far, AIs tend to fall far short of humans in that area. We’re told:

“Despite advances in deep learning, computers still lack common sense. They would need to review thousands of images of an elephant to independently identify them in other photos. In contrast, children quickly recognize elephants because they have a basic understanding about the animals. If challenged, they can extrapolate that an elephant is merely a different kind of animal—albeit a really big one. LeCun believes that new kinds of neural networks will eventually be developed that gain common sense by sifting through a smorgasbord of data. It would be akin to teaching the technology basic facts that it can later reference, like an encyclopedia. AI practitioners could then refine these neural networks by further training them to recognize and carry out more advanced tasks than modern versions.”

The chips to facilitate that leap are not yet on the market, of course. However, LeCun seems to believe they will soon be upon us. I do hope so; perhaps these super chips will bring some much needed sense to our online discourse.

Cynthia Murrell, February 21, 2019

UK Report about Facebook, the Digital Gangster

February 18, 2019

The hot read this morning is the UK’s report about a highly successful US company, Facebook. You can obtain a copy of the report at this link.

Coverage of the report is extensive, and DarkCyber anticipates more analyses, explanations, and Twitterverse excitement as the report diffuses.

Here are five items to note in the report:

First, the use of the phrase “digital gangster” is brilliant. That’s a superior two-word summary of the document and its implicit indictment of the Silicon Valley way and of America, the home of the bad actors. The subtext is that the US has fostered, aided, and abetted a 21st-century Al Capone who operates a criminal cartel on a global scale. DarkCyber expects more anti-US business pushback, with “digital gangsterism” becoming a fertile field of study in business schools. Who will write the book “Principles of Digital Gangsterism”?


Second, the idea of linking “data ethics and algorithms” is an interesting one. Like the goal of having software identify deepfakes (videos and images which depict a fictional or false reality), dealing with a fuzzy concept like data ethics and the equally fuzzy world of algorithm thresholds may lead to a rebirth of old-school philosophical debates. Who will be the 21st-century Plato? The experts who chop through the notional wilderness of ethics and making money could expose what, for want of a better phrase, I will call “deep stupid.” Like “deepfake,” the precise definition of “deep stupid” has to be crafted.

Third, regulation is an interesting idea. But the UK report provides compelling evidence that the digital “cat is out of the bag” with regard to data collection, analysis, and use of information. Regulations can put people in jail. Regulations can shut down a company operating in a country. But regulation of zeros and ones on a global scale in a distributed computing environment boils down to taxes and coordinated direct actions. Will war against Facebook and Facebook-type companies be put on the table? Fines or nano drones with a nano warhead?

Fourth, the document does not focus on what I call a Brexit-scale issue: Destabilizing a country. The report offers no path forward when a country has been blasted with digital flows operating outside of conventional social norms. The message, as I understand it, is, “We have a problem so let’s ignore it.”

Finally, the report itself is proof that flows of digital information decompose and disintermediate established institutions, and allow new social norms to grow in the datasphere. Facebook is Mark Zuckerberg, and Facebook is a product of the US business environment. What do these two facts sum to? No pleasant answer.

Let’s check Facebook, shall we?

Stephen E Arnold, February 18, 2019

Apple Sends Facebook To The Principal’s Office

February 8, 2019

Facebook was wearing a dunce cap. According to Recode, Apple is not happy with the social media giant: “Apple Says It’s Banning Facebook’s Research App That Collects Users’ Personal Information.” Apple is accusing Facebook of breaching an agreement with a new “research” app. Basically, Facebook paid users for sharing their personal information with the app, such as private messages, location data, etc. The big sticking point is that users as young as thirteen were targeted.

It is against Apple’s privacy policy to collect this kind of data, and apps of this nature are not available in the Apple App Store. Facebook found a loophole through Apple’s “Developer Enterprise Program,” through which Apple partners can release apps for testing, mostly for their own employees. These apps are not available to the general public, and Facebook used this channel to distribute the app and pay users to install it.

Facebook’s options are similar to country-to-country negotiations: do what’s necessary to reduce tensions. Facebook can figure out how to work around the “problem.” I learned:

“The story also shows how important it is for Facebook to collect data on other apps people use on their phones. It’s a big competitive advantage, and collecting this kind of data isn’t foreign to Facebook. The company actually collected similar user data through a separate app Facebook owns called Onavo Protect, which was just removed from the App Store in August for violating Apple’s guidelines. (It’s still available for Android users.)”

User data tell social media sites like Facebook about users’ habits, and that information can then be sold to advertisers. The question is how long Apple will abide by its privacy guidelines, or whether Apple is flexing its muscles for another reason.

Whitney Grace, February 8, 2019

Whoa, Facebook

February 3, 2019

We know that Facebook has been facing criticism for playing fast and loose with user privacy. Now Fortune examines the issue in its piece, “Forcing Facebook to Behave: Why Consent Decrees Are Not Enough.” Writer Jeff John Roberts observes that the FTC may levy a significant fine on the company for violating a consent decree. (Facebook, of course, asserts it did no such thing.) This is a step in the right direction, perhaps, but will it do any good? We’re told:

“Facebook executives appear to have calculated long ago that a fine, even one for $1 billion, was the price of rapid growth and one that it could well afford. The calculation has paid off: Not only has Facebook turned user data into an advertising gold mine, it has also used it to squelch competitors and maintain a monopoly. Why should it have acted any differently? For companies to take privacy seriously, the U.S. requires a different legal regime. Right now, regulators must rely on the consent decree system, which gives companies a pass on their first major privacy violation, and then lets them quibble about subsequent violations. Vladeck points out consent decrees are a relatively new policy tool to oversee privacy, and the FTC is still navigating how to use them. This may be the case, but the law that underlies them—known as Section 5, which forbids ‘unfair or deceptive acts’—still feels like a clumsy tool to police data regulation.”

On the other hand, Roberts notes, other countries deal more directly with the issue—with very specific privacy laws and significant consequences for those that break them. There is hope for common sense at home, too: a national privacy law has been proposed by an alliance of retail, finance, and tech companies. We shall see what becomes of it.

Cynthia Murrell, February 3, 2019
