The Famous Google Paper about Attention, a Code Word for Transformer Methods

June 20, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Wow, many people are excited about a Bloomberg article called "The AI Boom Has Silicon Valley on Another Manic Quest to Change the World: A Guide to the New AI Technologies, Evangelists, Skeptics and Everyone Else Caught Up in the Flood of Cash and Enthusiasm Reshaping the Industry."

In the tweets and LinkedIn posts, one small factoid is omitted from the second-hand content. If you want to read the famous DeepMind-centric paper which doomed the Google Brain folks to watch their future from the cheap seats, you can find "Attention Is All You Need," branded with the imprimatur of the Neural Information Processing Systems Conference held in 2017. Here's the link to the paper.

For those who read the paper, I would like to suggest several questions to consider:

  1. What economic gain does Google derive from the proliferation of its transformer system and method; for example, the open sourcing of the code?
  2. What does "attention" mean for [a] the cost of training and [b] the ability to steer the system and method? (Please consider the question from the point of view of the user's attention, the system and method's attention, and a third-party meta-monitoring system such as advertising.) A minimal sketch of the attention operation itself appears just after this list.
  3. What other tasks of humans, software, and systems can benefit from the use of the Transformer system and methods?
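
For readers who want a concrete handle on what the paper means by "attention" (question two above), here is a minimal, illustrative sketch of the scaled dot-product attention operation the paper describes, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V. The NumPy code and the function name are mine, not Google's; treat it as a toy rendering of the formula, not a production implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy sketch of scaled dot-product attention from
    "Attention Is All You Need" (Vaswani et al., 2017):
    Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # how strongly each query attends to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                               # weighted mix of the values

# Hypothetical toy example: 3 query positions over 4 key/value positions, d_k = 8
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```

The point of the sketch is simply that "attention" is a learned weighting over inputs; the questions above ask who benefits from controlling those weights.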

I am okay with excitement for a 2017 paper, but including a link to the foundation document might be helpful to some, not many, but some.

Net net: Think about Google's use of the words "trust" and "responsibility" when you answer the three suggested questions.

Stephen E Arnold, June 20, 2023
