A Hip Bro Excuse: We Cannot Modify Our System and Software
March 5, 2019
I was zipping through news this morning, and I spotted “Google to Ban Political Ads Ahead of Federal Election, Citing New Transparency Rules.” The “rules” apply to Canada, not the United States. Google will not sell ads. That’s interesting.
The main point of the article for me was the reason Google will turn down money and leave a giant pile of cash on the table: this sentence in the write up (which I assume is true, of course):
Google is banning political advertising on its platforms ahead of the Canadian federal election because of new ad transparency rules it says would be too challenging to comply with.
Challenge, when I hear the word, means “too darned difficult.” A connotation for me is “what a waste of time and effort.” Another is a variation on the Bezos basic, “Just walk away”; for instance, “Hasta la vista, Nueva York.”
Is adapting Google’s AdSense too challenging for a company which has a boatload of talented programmers?
What I find interesting is that Facebook has the same limitation. Do you recall that Facebook users were going to get a control that would allow them to delete some of their data? The delay, I heard, is a consequence of figuring out how to make delete work.
Net net: Two outfits with smart people are unable to modify their respective systems.
Do I believe that technical modifications are too difficult?
Yeah, I believe the moon is made of green cheese as well. The questions these technical challenges raise include:
- What is the specific problem?
- Is the system intractable so that other changes are too great a challenge? If so, what functions cannot be altered?
- What is the engineering approach at Google which renders its software unfixable?
- Are Google’s (and Facebook’s) engineers less effective than technical personnel at other companies; for example, Apple or Microsoft?
- What’s the personnel problem? Is underpaying certain ethnic groups an issue?
Maybe regulations are the optimal way to deal with companies unable to comply with government regulations?
Stephen E Arnold, March 5 2019
AI: The Facebook View for the Moment
February 21, 2019
We get some insight into the current trajectory of AI from Fortune’s article, “Facebook’s Chief A.I. Scientist Yann LeCun On the Future of Computer Chips, Lawnmowers, and Deep Learning.” The write-up points to a talk on AI hardware LeCun gave at the recent International Solid-State Circuits Conference in San Francisco.
Writer Jonathan Vanian highlights three points. First, he notes the advent of specialized chips designed to save energy, which should facilitate the use of more neural networks within data centers. This could mean faster speech translation, for example, or more effective image analysis. The tech could even improve content moderation, a subject much on Facebook’s mind right now. Then there are our “smart” devices, which can be expected to grow more clever as their chips get smaller. For instance, Vanian envisions a lawn mower that could identify and pull weeds. He notes, though, that battery capacity is another conundrum altogether.
Finally, we come to the curious issue of “common sense”—so far, AIs tend to fall far short of humans in that area. We’re told:
“Despite advances in deep learning, computers still lack common sense. They would need to review thousands of images of an elephant to independently identify them in other photos. In contrast, children quickly recognize elephants because they have a basic understanding about the animals. If challenged, they can extrapolate that an elephant is merely a different kind of animal—albeit a really big one. LeCun believes that new kinds of neural networks will eventually be developed that gain common sense by sifting through a smorgasbord of data. It would be akin to teaching the technology basic facts that it can later reference, like an encyclopedia. AI practitioners could then refine these neural networks by further training them to recognize and carry out more advanced tasks than modern versions.”
The chips to facilitate that leap are not yet on the market, of course. However, LeCun seems to believe they will soon be upon us. I do hope so; perhaps these super chips will bring some much needed sense to our online discourse.
Cynthia Murrell, February 21, 2019
UK Report about Facebook, the Digital Gangster
February 18, 2019
The hot read this morning is the UK’s report about a highly successful US company, Facebook. You can obtain a copy of the report at this link.
Coverage of the report is extensive, and DarkCyber anticipates more analyses, explanations, and Twitterverse excitement as the report diffuses.
Here are five items to note in the report:
First, the use of the phrase “digital gangster” is brilliant. That’s a superior two-word summary of the document and its implicit indictment of the Silicon Valley way and America, the home of the bad actors. The subtext is that the US has fostered, aided, and abetted a 21st century Al Capone who operates a criminal cartel on a global scale. DarkCyber expects more anti-US business pushback, with “digital gangsterism” becoming a fertile field for study in business school. Who will write the book “Principles of Digital Gangsterism”?
Second, the idea of linking “data ethics and algorithms” is an interesting one. Like the goal of having software identify deepfakes (videos and images which depict a fictional or false reality), dealing with a fuzzy concept like data ethics and the equally fuzzy world of algorithm thresholds may lead to a rebirth of old-school philosophical debates. Who will be the 21st-century Plato? The experts who will chop through the notional wilderness of ethics and making money could expose what, for want of a better phrase, I will call “deep stupid.” Like “deepfake,” the precise definition of “deep stupid” has to be crafted.
Third, regulation is an interesting idea. But the UK report provides compelling evidence that the digital “cat is out of the bag” with regard to data collection, analysis, and use of information. Regulations can put people in jail. Regulations can shut down a company operating in a country. But regulation of zeros and ones on a global scale in a distributed computing environment boils down to taxes and coordinated direct actions. Will war against Facebook and Facebook-type companies be put on the table? Fines or nano drones with a nano warhead?
Fourth, the document does not focus on what I call a Brexit-scale issue: Destabilizing a country. The report offers no path forward when a country has been blasted with digital flows operating outside of conventional social norms. The message, as I understand it, is, “We have a problem so let’s ignore it.”
Finally, the report itself is proof that flows of digital information decompose and disintermediate established institutions, and allow new social norms to grow in the datasphere. Facebook is Mark Zuckerberg, and Facebook is a product of the US business environment. What do these two facts sum to? No pleasant answer.
Let’s check Facebook, shall we?
Stephen E Arnold, February 18, 2019
Apple Sends Facebook To The Principal’s Office
February 8, 2019
Facebook was wearing a dunce cap. According to Recode, Apple is not happy with the social media giant: “Apple Says It’s Banning Facebook’s Research App That Collects Users’ Personal Information.” Apple is accusing Facebook of breaching an agreement with a new “research” app. Basically, Facebook paid users for sharing their personal information with the app, such as private messages, location data, etc. The big stickler is that users as young as thirteen were targeted.
It is against Apple’s privacy policy to collect this kind of data, and apps of this nature are not available in the Apple App Store. Facebook found a loophole in Apple’s “Developer Enterprise Program,” through which Apple partners can release apps for testing, mostly for their own employees. The apps are not available to the general public, and Facebook used this channel to pay users to download the app.
Facebook’s options are similar to country-to-country negotiations: Do what’s necessary to reduce tensions. Facebook can figure out how to work around the “problem.” I learned:
“The story also shows how important it is for Facebook to collect data on other apps people use on their phones. It’s a big competitive advantage, and collecting this kind of data isn’t foreign to Facebook. The company actually collected similar user data through a separate app Facebook owns called Onavo Protect, which was just removed from the App Store in August for violating Apple’s guidelines. (It’s still available for Android users.)”
User data tell social media sites like Facebook about users’ habits, and that information can then be sold to advertisers. The question is how long Apple will abide by its privacy guidelines, or whether Apple is flexing its muscles for another reason.
Whitney Grace, February 8, 2019
Whoa, Facebook
February 3, 2019
We know that Facebook has been facing criticism for playing fast and loose with user privacy. Now Fortune examines the issue in its piece, “Forcing Facebook to Behave: Why Consent Decrees Are Not Enough.” Writer Jeff John Roberts observes that the FTC may levy a significant fine on the company for violating a consent decree. (Facebook, of course, asserts it did no such thing.) This is a step in the right direction, perhaps, but will it do any good? We’re told:
“Facebook executives appear to have calculated long ago that a fine, even one for $1 billion, was the price of rapid growth and one that it could well afford. The calculation has paid off: Not only has Facebook turned user data into an advertising gold mine, it has also used it to squelch competitors and maintain a monopoly. Why should it have acted any differently? For companies to take privacy seriously, the U.S. requires a different legal regime. Right now, regulators must rely on the consent decree system, which gives companies a pass on their first major privacy violation, and then lets them quibble about subsequent violations. Vladeck points out consent decrees are a relatively new policy tool to oversee privacy, and the FTC is still navigating how to use them. This may be the case but the law that underlies them—known as Section 5, which forbids ‘unfair or deceptive acts’— still feels like a clumsy tool to police data regulation.”
On the other hand, Roberts notes, other countries deal more directly with the issue—with very specific privacy laws and significant consequences for those that break them. There is hope for common sense at home, too: a national privacy law has been proposed by an alliance of retail, finance, and tech companies. We shall see what becomes of it.
Cynthia Murrell, February 3, 2019
Facebook: Quite a Slippery Fish
February 1, 2019
Apple nuked Facebook. Finally, a company took action, which seems to have been overdue because of the endless, “Gee, we’re sorry” statements.
I noted an article in TNW called “A Handy List of Ways Facebook Has Tried to Sneakily Gather Data about You.” Most people know about young kids who spend money via their parents’ credit cards and carry the moniker “whales.” But there is a useful list to remind me why our Tess the Dog Facebook page receives so many friend requests. Tess died in 2016, but people still want her to be their pal.
Perhaps someone in a US regulatory agency will print out a copy of the TNW article and, maybe, think about its information.
Stephen E Arnold, February 1, 2019
Facebook: Will Its Artificial Intelligence Understand France, Russia, and the EU?
January 25, 2019
According to one prominent expert, Facebook has gone all in on AI. A write-up at Analytics India Magazine reports, “Yann LeCun Says Facebook is ‘Dust’ Without Deep Learning, and No One is Disagreeing.” Writer Abhijeet Katte reports on comments LeCun made in a recent CNN interview. Katte writes:
“The French AI expert who played a pivotal role in setting up Facebook’s lab in Paris shared in an interview to CNN, ‘If you take the deep learning out of Facebook today, Facebook’s dust. It’s entirely built around it now.’ The statement typifies the current stand on technology by the Mark Zuckerberg-led Menlo Park giant which has been embattled over its role in US elections. Even though the statement has been largely dubbed controversial, LeCun summed up Facebook’s transformation around AI and ML succinctly. Over the last three to four years, Facebook has been slowly but surely transforming its business around intelligent systems technology. The change can be seen in features such as posts, translations and newsfeed algorithms which are the core of the social network platform. Facebook applied deep learning to combat hate speech and misinformation in countries like Myanmar. The social media giant was criticized when the platform was said to have fueled ethnic violence against the Rohingya population. The company also said AI application created by Facebook is now capable of flagging 52 percent of all content it gets rid of in Myanmar before it is reported by users.”
The article goes on to list several ways Facebook has incorporated deep learning—FB Learner Flow, the “backbone” of the company’s AI; the convolutional neural network system Building Perception; and text-comprehension-engine DeepText, which is said to go beyond traditional NLP and can work on multiple languages. Though LeCun’s observation seems to have stirred some controversy, would it be so surprising for Facebook to become almost entirely dependent on AI tech?
Facebook may have to be dependent upon human attorneys.
Cynthia Murrell, January 25, 2019
Facebook User Awareness: Two Views
January 17, 2019
What happens when Silicon Valley-centric “real” journalists contemplate the question, “How much do Americans know about data slurping, reusing, and monetizing?”
For one view, navigate to “Most Facebook Users Still in the Dark about Its Creepy Ad Practices, Pew Finds.” The headline tells the story. I learned:
Pew found three-quarters (74%) of Facebook users did not know the social networking behemoth maintains a list of their interests and traits to target them with ads, only discovering this when researchers directed them to view their Facebook ad preferences page.
Now for another view. Navigate to “Don’t Underestimate Americans’ Knowledge of Facebook’s Business Model.” I learned from this write up:
But let’s take another look at the numbers. According to Pew, 26 percent of Americans are aware that Facebook records a list of their interests and uses it to target ads at them. There are roughly 214 million Americans with Facebook profiles. If that’s the case, then over the past decade, 55.6 million people have educated themselves about how ad targeting works. Facebook itself has played no small role in this effort, regularly describing their ad targeting system in software and marketing materials, and recently even started building pop-up events around it.
And to add beef to the argument:
Pew surveyed more than 3,400 U.S. Facebook users in May and June, and found that a whopping 44 percent of those ages 18 to 29 say they’ve deleted the app from their phone in the last year. Some of them may have reinstalled it later. Overall, 26 percent of survey respondents say they deleted the app, while 42 percent have “taken a break” for several weeks or more, and 54 percent have adjusted their privacy settings.
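The 55.6 million figure in the first quoted passage is simple arithmetic: 26 percent awareness applied to roughly 214 million US Facebook profiles, both numbers taken from the article. A quick back-of-the-envelope check:

```python
# Sanity check of the "55.6 million" claim quoted above.
# Both inputs are the figures cited in the write up, not independent data.
us_profiles = 214_000_000   # approximate US Facebook profiles, per the article
aware_share = 0.26          # Pew: share aware of interest-based ad targeting

aware_users = us_profiles * aware_share
print(f"{aware_users / 1_000_000:.1f} million")  # → 55.6 million
```

The arithmetic holds; whether 26 percent awareness counts as a glass half full or three-quarters empty is the part the two write ups disagree about.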
Nothing like interpreting data from a survey from the left coast.
Stephen E Arnold, January 17, 2019
Facebook Starts 2019 with Some Advice to Consider
January 2, 2019
Ah, the Guardian. I read “‘Resign from Facebook’: experts offer Mark Zuckerberg advice for 2019.” My hunch is that the Zuck will ignore the input. But it is fun for an outfit struggling with revenue and technology to provide suggestions to an organization with plenty of dough and a new advertising business to flog. Jealous much?
Facebook makes headlines because of allegations or discoveries like those reported in “Facebook Collects Information from Your Android Even If You Don’t Have Facebook.” One with capitalist DNA automatically responds to this type of business method.
The Guardian begs for dollars with increasingly large yellow banner pleas. Facebook gathers data and goes about its business.
The Guardian reported:
Tom Watson, deputy leader of the Labour party:

Will be: Continue to evade parliamentary scrutiny and personal responsibility for Facebook’s problems.

Should be: Have a productive life having resigned from the company he founded, leaving a new leadership team to clean up his mess.
Unlikely. Money can be messy when there is a lot of it.
Stephen E Arnold, January 2, 2019
Facebook: The Fallacy of Rules in an Open Ended Datasphere
December 29, 2018
I read “Inside Facebook’s Secret Rulebook for Global Political Speech.” Yogi Berra time: “It’s déjà vu all over again.”
Some history, gentle reader.
Years ago I met with a text analytics company. The purpose of the meeting was to discuss how to identify problematic content; for example, falsified reports related to a warfighting location.
I listened as the 20-somethings and a couple of MBA types bandied about ideas for creating a set of rules that would identify the ways in which information is falsified. There was the inevitable knowledgebase, a taxonomy of terms and jargon, and rules. “If then” stuff.
The big idea was to filter the content with a front end of old-school lookups and then feed the outputs into the company’s smart system. I listened and suggested that fiddling with rules would consume the available manpower, time, and money.
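For readers who want the shape of that architecture, here is a minimal sketch: a hand-written rule front end screening content before anything reaches the downstream classifier. Every name, term, and score in it is invented for illustration; none of it reflects the actual vendor’s system.

```python
# Sketch of a rules-plus-lookup front end feeding a "smart system."
# All terms, rules, and scores are hypothetical illustrations.

SUSPECT_TERMS = {"unconfirmed", "eyewitness says", "sources claim"}

def rule_front_end(text: str) -> bool:
    """Return True if a hand-written rule flags the text."""
    lowered = text.lower()
    # "If then" stuff: flag text containing any watch-list term.
    return any(term in lowered for term in SUSPECT_TERMS)

def smart_system(text: str) -> float:
    """Stand-in for the downstream classifier; returns a placeholder score."""
    return 0.5

def pipeline(texts):
    # Only rule-flagged items reach the expensive classifier.
    return [(t, smart_system(t)) for t in texts if rule_front_end(t)]
```

The maintenance problem lives in `SUSPECT_TERMS`: every new way of falsifying a report means another term or rule, forever, which is exactly the cost sink the whiz kids waved off.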
Ho, ho, ho was the response. Listen to the old goose from rural Kentucky.
Yeah, that was in 2005, and where is that system now? It’s being used as a utility for IBM’s staggering mountain of smart software and for finding items of interest for a handful of financial clients.
Ho, ho, ho. The joke is on the whiz kids and the investors, who are going to run out of patience when the light bulb goes on and someone says:
“Yo, folks, figuring out what’s fake, shaped, disinformationized, or reformationized content is what makes search difficult.”
I read a scoop from the New York Times. Yep, that’s the print newspaper which delivers to my door each day information that is two or three days old. I see most of the stories online in one form or another. Tip: 85 percent of news is triggered by AP or Reuters feeds.
The article reveals that Facebook’s really smart people cannot figure out how to deal with various types of speech: political and other types. The child porn content on WhatsApp is a challenge as well, I would add.
The write up says:
An examination of the files revealed numerous gaps, biases and outright errors. As Facebook employees grope for the right answers, they have allowed extremist language to flourish in some countries while censoring mainstream speech in others.
Yep, a scoop.
Facebook’s hubris, like the text processing company which dragged me into a series of bull sessions, allows the company to demonstrate that it cannot cope with filtering within a datasphere in which controls are going to be tough to enforce.
The fix is to create a for-fee country club. If a person does not meet the criteria, no membership for you. Then each member gets the equivalent of a US Social Security number which is linked to the verified identity, the payment mechanism, and other data the system can link.
Amazon has this type of system available, but I am not sure the Facebookers are going to pay Amazon to use its policeware to create a clean, well-lit place. (Sorry, Ernest, not “lighted”.)
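To make the country-club idea concrete, here is a hypothetical sketch of the membership record: an opaque member number that links verified identity, payment mechanism, and whatever other data the system cares to attach. All field and function names are invented for illustration; no real system is described.

```python
# Hypothetical sketch of the "for-fee country club" membership record.
from dataclasses import dataclass, field
import uuid

@dataclass
class Member:
    """One member: an opaque number linking identity, payment, other data."""
    verified_name: str
    payment_token: str                  # reference to a payment mechanism
    linked_data: dict = field(default_factory=dict)
    member_number: str = field(default_factory=lambda: uuid.uuid4().hex)

def admit(identity_verified: bool, name: str, payment_token: str):
    # If a person does not meet the criteria, no membership for them.
    if not identity_verified:
        return None
    return Member(verified_name=name, payment_token=payment_token)
```

The point of the sketch is the linkage, not the fields: once the member number exists, everything else the system sees can be joined to it.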
As a final point, may I suggest that rules-based systems where big data floweth are going to be tough to create, update, and pay for.
On the other hand, why not hire the New York Times to set up an old-school editorial board to do the work? News is not ringing the financial bell at the NYT, so maybe becoming the Facebook solution is a path forward. The cost may put Facebook in the dog house with investors, but the NYT regains its position as the arbiter of what’s in and what’s out.
Power again!
Stephen E Arnold, December 29, 2018