A Useful but Brief Taxonomy of Dark Patterns
December 29, 2020
The idea that one can lead a hapless Web surfer by the nose has fascinated me for years. The scenario: a person tries to purchase something online; for example, a downloadable Photoshop macro or a package of batteries on a deal site. The system then pops up requests for information or displays “Other customers liked” messages. The person who follows these digital breadcrumbs discovers that a surprise has been delivered. The single macro is a bundle, and the batteries include a flashlight.
“Types of Dark Patterns” provides a handy list of the specific methods for getting that nose going in the direction it may not have wanted to go. The write up includes 12 types of dark patterns. I am not confident that the list is exhaustive, but it is a good start. Here are three dark patterns briefly explained in the article, and be sure to consult the original write up for the complete list:
- Privacy Zuckering. This pattern is named in honor of that estimable innovator Mark Zuckerberg.
- Hidden costs. A popular method for some eBay sellers and a vendor who sells big bars of soap.
- Forced continuity. The name could be better, but the idea is that a single purchase leads to monthly charges because you unwittingly subscribed to something. Perhaps it is a good idea to check PayPal to see whether a merchant has been billing a modest amount each month even though you purchased a single service on Fiverr, the Israel-based gig site.
There’s nothing like cleverness combined with duplicity in the wild, wonderful world of online.
Stephen E Arnold, December 29, 2020
Thumb Typers Know Exactly What to Do: Okay, Not So Much
December 29, 2020
I read “More Info is Available about Which College Majors Pay Off, But Students Aren’t Using It.” This is a surprise? Nope, but it is “real” news. I noted this statement:
“What we find is that they’re not changing their majors,” Troutman [an expert in this subject] said. “They’re following their passions.”
Passions like van life, a digital emulation of riding a camel in the desert waiting for their Lawrence of Arabia to deliver a payoff?
The Bezos publication points out:
But even as this information becomes more readily available, there’s consensus that students generally aren’t consulting it when deciding where to go and what to study.
But what about students who don’t pick a major which “pays off”?
The write up states:
That students don’t know their likely future incomes well before they graduate is particularly surprising given that getting a good job is now the No. 1 reason they say they go to college, according to a nationwide survey of freshmen by an institute at the University of California at Los Angeles — edging out “learn[ing] more about things that interest me” — and that 84 percent said it was very important or essential to them to be financially very well off.
Maybe journalism? Alternatively, another Bezos-linked entity is hiring for warehouse work or artificial intelligence development.
Stephen E Arnold, December 29, 2020
Google Pins HR Hopes on New Executive
December 29, 2020
Perhaps this move will help Google recover some much-needed goodwill. The Times Union reports, “Google Hires New Personnel Head Amid Rising Worker Tensions.” The company has hired Fiona Cicconi, formerly the executive VP of HR at pharmaceutical company AstraZeneca. One major challenge for Cicconi will be overseeing Google’s roughly 130,000 employees as most continue to work from home until sometime between July and September of next year. She will also have to make their transition back to Googley offices around the world as smooth as possible. But working around the pandemic may be the least of her worries. Writer Michael Liedtke reminds us:
“She is also walking into a company that has seen its relationship with its workforce change dramatically in the past few years as more employees have become convinced that it has strayed far away from the ‘Don’t Be Evil’ motto that co-founders Larry Page and Sergey Brin embraced in its early years. In 2018, thousands of Google employees walked off the job and staged public protests in a backlash spurred by concerns about how the company had been handling sexual harassment claims against top executives and managers. Google has also faced employee outrage about potential bids on military contracts and, more recently, the murky circumstances surrounding the abrupt departure of a respected artificial intelligence scholar, Timnit Gebru. After a dispute over a research paper examining the societal dangers of an emerging branch of artificial intelligence, Gebru said Google fired her earlier this month. Google maintains the company accepted her offer to resign. The rift incensed hundreds of Google employees who have signed a public letter of protest.”
Google has apologized for the way it treated Gebru, but hard feelings linger. We hope Cicconi will be able to help the company maintain a better relationship with its many employees, but the head of personnel can only do so much. The rest depends on other executives behaving well. Will the culture change?
Cynthia Murrell, December 29, 2020
DarkCyber for December 29, 2020, Is Now Available
December 29, 2020
DarkCyber for December 29, 2020, is now available on YouTube at this link or on the Beyond Search blog at this link. This week’s program includes seven stories. These are:
A Chinese consulting firm publishes a report about the low profile companies indexing the Dark Web. The report is about 114 pages long and does not include Chinese companies engaged in this business.
A Dark Web site easily accessible with a standard Internet browser promises something that DarkCyber finds difficult to believe. The Web site contains what are called “always” links to Dark Web sites; that is, those with Dot Onion addresses.
Some pundits have criticized the FBI and Interpol for their alleged failure to take down Jokerstash. This Dark Web site sells access to “live” credit cards and other financial data. Among those suggesting that the two law enforcement organizations are falling short of the mark are four cyber security firms. DarkCyber explains one reason for this alleged failure.
NSO Group, a specialized services company, has been identified as the company providing technology to “operators” surveilling dozens of Al Jazeera journalists. DarkCyber points out that a commercial firm is not in a position to approve or disapprove the use of its technology by the countries which license the Pegasus platform.
Facebook has escalated its dispute with Apple regarding tracking. Now the social media company has alleged that contractors to the French military are using Facebook in Africa via false accounts. What’s interesting is that Russia is allegedly engaged in a disinformation campaign in Africa as well.
The drone news this week contains two DJI items. DJI is one of the world’s largest vendors of consumer and commercial drones. The US government has told DJI that it may no longer sell its drones in the US, although DJI products remain available there. Also, DJI drones have been equipped with flame throwers to destroy wasp nests. The flame throwing drones appear formidable.
DarkCyber is a twice a month video news program reporting on the Dark Web, lesser known Internet services, and cyber crime. The program is produced by Stephen E Arnold and does not accept advertising or sponsorships.
Kenny Toth, December 29, 2020
Failure: The Reasons Are Piling Up
December 28, 2020
Years ago I read a monograph by some big wig in Europe. As I recall, that short book boiled down failure to one statement: “Little things add up.” The book contained a number of interesting industrial examples. “How Complex Systems Fail” is a modern take on the failure of systems. The author has cataloged 18 reasons. Here are three of the reasons, and it may be worth your time to check out the other 15.
- Complex systems contain changing mixtures of failures latent within them.
- Change introduces new forms of failure.
- Failure free operations require experience with failure.
I am not an expert on failure although I have failed. I have had a couple of wins, but the majority of my efforts are total, complete flops. I am not sure I have learned anything. The witness to my ineptitude is this Web log.
Nevertheless, I would like to add a couple of additional reasons for failure:
- Those involved deny the likelihood of failure. I suppose this is just the old “know thyself” thing. Thumb typers seem to be even more unaware of risks than I, the old admitted failure.
- Impending failure emits signals which those involved cannot hear or actively choose to ignore.
The list of reasons will be expanded by an MBA pursuing a career in consulting. That, in itself, is one of those failure signals.
Little things still add up. Knowing about these little things is often difficult. I am not aware of a hearing aid to assist whiz kids in detecting the exciting moment when the digital construct goes boom.
Stephen E Arnold, December 28, 2020
Google and Its Smart Software
December 28, 2020
I spotted “What AlphaGo Can Teach Us About How People Learn.” The subtitle is Google friendly:
David Silver of DeepMind, who helped create the program that defeated a Go champion, thinks rewards are central to how machines—and humans—acquire knowledge.
The write up contains a number of interesting statements. You will want to work through the essay and excavate those which cause your truth meter to vibrate with excitement. I noted this segment:
I don’t want to put a timescale on it [general artificial intelligence], but I would say that everything that a human can achieve, I ultimately think that a machine can. The brain is a computational process, I don’t think there’s any magic going on there.
I noted the “everything.” That’s an encompassing term. In fact, the term “everything” effectively recalls the old saw from Paradise Lost:
O sun, to tell thee how I hate thy beams, That bring to my remembrance from what state I fell; how glorious once above thy sphere; Till pride and worse ambition threw me down, Warring in heaven against heaven’s matchless King. (IV, 37–41)
I also noted this Venture Beat write up called “DeepMind’s Big Losses and the Questions around Running an AI Lab.” The MBA speak cannot occlude this factoid (which I assume is close enough for horse shoes):
According to its annual report filed with the UK’s Companies House register, DeepMind has more than doubled its revenue, raking in £266 million in 2019, up from £103 million in 2018. But the company’s expenses continue to grow as well, increasing from £568 million in 2018 to £717 million in 2019. The overall losses of the company grew from £470 million in 2018 to £477 million in 2019.
Doing “everything” does seem to be expensive. It was expensive for IBM to get Watson on the Jeopardy show. Google has pumped money into DeepMind to nuke a hapless human Go player.
I also noted this write up: “Google Told Scientists to Use a Positive Tone in AI Research, Documents Show.” I noted this passage:
Four staff researchers, including the senior scientist Margaret Mitchell, said they believe Google is starting to interfere with crucial studies of potential technology harms.
Beyond Search believes that these write ups make clear:
- Google is in the midst of a public relations offensive. Perhaps it is more of a singularity than Google’s announcements about quantum computing. My hunch is that Timnit Gebru’s experience may be an example of Google-entanglement.
- Google is trotting out the big dogs to provide an explainer about “everything.” Wait. Isn’t that a logical impossibility like the Gödel thing?
- Google is in the midst of another high school science club management moment. The effort is amusing in a high school science club way.
Net net: My take is that Google announced that it would “solve death.” This did not happen. “Everything”, therefore, is another example of the Arnold Law of Online: “Online fosters a perception that one is infallible, infinite, and everlasting.” Would anyone wager some silver on the veracity of my Law?
Stephen E Arnold, December 28, 2020
SolarWinds: One Interesting Message
December 28, 2020
I read “Wave of Cyberattacks Exposes the Powerlessness of IT Security Chiefs.” With all the hoohah about cyber superiority from government officials and commercial enterprises, one troubling fact is clear: if neither the advanced systems nor the top secret security systems monitoring possible bad actors could detect the attack, the defensive and alerting methods are broken. The write up points out that security suffers from a widespread weak link:
Splunk, a U.S. company, publishes an annual list of “10 things that keeps CISOs up at night,” and this year’s includes the expanded “attack surface” created by the growing use of the internet of things (web-connected devices) and the growing use of cloud computing, “malicious insiders” and the “alert fatigue” resulting from so many layers of data security inside a big organization. But apart from that, Splunk notes the lack of money to ensure data security. “CISOs continue to face challenges in securing substantial budgets, largely because they have difficulty forecasting threats and achieving measurable results from security investments…
He said 66% of CISOs surveyed said they didn’t have adequate staff. Others cited increasingly onerous regulations and their lack of access to top management.
Something in the cyber security establishment enables breaches.
Stephen E Arnold, December 28, 2020
Satellites Are Upgraded Peeping Toms
December 28, 2020
Satellites have had powerful cameras for decades. Camera technology for satellites has advanced from being able to read credit card numbers in the early 2000s to peering inside buildings. Futurism shares details about the new (possible) invasion of privacy in: “A New Satellite Can Peer Inside Some Buildings, Day Or Night.”
Capella Space launched a new type of satellite with a state-of-the-art camera capable of taking clear radar images at high resolution. It even has the ability to take photos inside buildings, such as airplane hangars. Capella Space assures people that while its satellite does take powerful, high-resolution photos, it can only “see” through lightweight structures. The new satellite camera cannot penetrate dense buildings, like residential houses and high rises.
The new satellite can also take pictures from space of either Earth’s daytime or nighttime side. Capella also released a new photo imaging platform that allows governments or private customers to request images of anything in the world. Most satellites orbiting the Earth use optical image sensors, which make it hard to take photos when it’s cloudy. Capella’s new system uses synthetic aperture radar that can peer through cloud cover and night skies.
The resolution for the SAR images is extraordinary:
“Another innovation, he says, is the resolution at which Capella’s satellites can collect imagery. Each pixel in one of the satellite’s images represents a 50-centimeter-by-50-centimeter square, while other SAR satellites on the market can only get down to around five meters. When it comes to actually discerning what you’re looking at from space, that makes a huge difference.
Cityscapes are particularly intriguing. Skyscrapers poke out of the Earth like ghostly, angular mushrooms — and, if you look carefully, you notice that you can see straight through some of them, though the company clarified that this is a visual distortion rather than truly seeing through the structures.”
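The resolution figures in the quote are easy to sanity check with back-of-the-envelope arithmetic: a 50-centimeter pixel covers a quarter of a square meter of ground, while a five-meter pixel covers 25 square meters, so each Capella pixel carries roughly 100 times more spatial detail per unit area. A minimal sketch of that calculation, using only the numbers from the quote:

```python
# Ground area covered per pixel at the two SAR resolutions cited in the quote.
capella_px = 0.5   # meters per pixel side (Capella, per the article)
typical_px = 5.0   # meters per pixel side (other SAR satellites, per the article)

capella_area = capella_px ** 2   # 0.25 square meters of ground per pixel
typical_area = typical_px ** 2   # 25.0 square meters of ground per pixel

ratio = typical_area / capella_area
print(capella_area, typical_area, ratio)  # → 0.25 25.0 100.0
```

In other words, where a conventional SAR image shows one pixel, Capella's imagery shows about one hundred, which is why the article says the difference matters for discerning what you are looking at from space.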
Capella’s new satellite has a variety of uses. Governments can use it to track enemy armies, while scientists can use it to monitor fragile ecosystems like the Amazon rainforest. Capella has assured the public that its new satellite cannot spy into dense buildings, but if the technology improves, that may become a possibility. Hopefully bad actors will not use Capella’s new satellite.
Whitney Grace, December 28, 2020
Arthur.ai Designed To Ensure Accuracy In Machine Learning Models
December 28, 2020
Most tech companies are investing their capital in designing machine learning models, but Arthur.ai decided to do something different. TechCrunch reveals Arthur.ai’s innovation in “Arthur.ai Snags $15M Series A To Grow Machine Learning Monitoring Tool.” The Arthur.ai tool is designed to ensure that machine learning models retain their accuracy over time.
Despite being fine-tuned algorithms, machine learning models need maintenance like any other technology. Index Ventures saw the necessity of such a tool and led a Series A round of funding with investments from Homebrew, AME Ventures, Workbench, Acrew, and Plexo Capital:
“Investor Mike Volpi from Index certainly sees the value proposition of this company. ‘One of the most critical aspects of the AI stack is in the area of performance monitoring and risk mitigation. Simply put, is the AI system behaving like it’s supposed to?’ he wrote in a blog post announcing the funding.”
Arthur.ai has doubled its employee count since its founding, and founder and CEO Adam Wenchel wants to continue the expansion. AWS released a similar tool called SageMaker Clarify, but Wenchel views the potential competition as affirmation: if other products provide the same service, there is a market for it. He is also not worried about the larger cloud companies, because Arthur.ai will focus entirely on its monitoring tools while the larger players spread their attention across many products.
Whitney Grace, December 28, 2020
Another Somewhat Obvious Report about Hippy Dippy Learning
December 25, 2020
What’s “hippy dippy”? That’s my code word for expecting students to sit in front of a computing device to learn. When students are freed from the classroom and a motivated instructor, they kick into screw-around mode: games, porn, TikTok, and digital mischief. Am I the only person in rural Kentucky aware of this fact? I don’t think so.
I read with some amusement (short-lived, very short-lived) “Kids Are Failing Online Learning.” The write up reports:
… Students are still struggling with the switch to online learning months after in-person classrooms shuttered.
I noted these factoids:
Around the United States, as grades trickle in, it’s become clear how devastating the switch to remote learning has been for many students. In Austin, early data released to local reporters noted that failing grades had increased by 70%. (A spokesperson for the Austin Independent School District, Cristina Nguyen, said more recently updated data showed the district overall didn’t see a statistically significant increase in failing grades, although secondary schools did see an increase.) One notably detailed report from Fairfax, Va., on first-quarter grades found that F’s had increased from 6% the prior year to 11% this year. The report concluded that there was a “widening gap” among students…
Online has been around for decades. The shift to online learning has made clear that putting students in a classroom with a teacher works better than thumb typing.
Is this dismal report important? Yes, it is. The write up confirms that a technology shift does teach something: students learn how to excel at displacement activities. Those activities are okay but may not be helpful in making informed decisions.
Sure, there will be exceptions. Is that why there is an elite in today’s social construct? In-person classroom instruction may reduce the gap between thumb typers who wander and the few who can suck in data and generate high-value outputs.
Computing devices are not magical online teaching systems, because what students may learn is how to build islands of ignorance. Each island, however, perceives its knowledge empire as comprehensive, robust, and informed.
Stephen E Arnold, December 25, 2020