A Flashing Yellow Light for GitHub: Will Indifferent Drivers Notice?
November 9, 2022
I read “We’ve Filed a Lawsuit Challenging GitHub Copilot, an AI Product That Relies on Unprecedented Open-Source Software Piracy. Because AI Needs to Be Fair & Ethical for Everyone.” The write up reports:
… we’ve filed a class-action lawsuit in US federal court in San Francisco, CA on behalf of a proposed class of possibly millions of GitHub users. We are challenging the legality of GitHub Copilot (and a related product, OpenAI Codex, which powers Copilot). The suit has been filed against a set of defendants that includes GitHub, Microsoft (owner of GitHub), and OpenAI.
My view of GitHub is that it presents a number of challenges. On one hand, Microsoft is a pedal-to-the-metal commercial outfit and GitHub is an outfit with some roots in the open source “community” world. Many intelware solutions depend on open source software. In my experience, it is difficult to determine whether cyber security vendors or intelware vendors offer software free of open source code. I am not sure the top dogs in these firms know. Big commercial companies love open source software because these firms see a way to avoid the handcuffs proprietary code vendors use for lock in and lock down without a permission slip. These permissions can be purchased. This fee irritates many of the largest companies which are avid users of open source software.
A second challenge of GitHub is that it serves bad actors in two interesting ways. First, those eager to compromise networks, automate phishing attacks, and probe the soft underbelly of companies “protected” by somewhat Swiss Cheese like digital moats rely on open source tools. Second, the libraries for some code on GitHub are fiddled so that those who use libraries but never check too closely about their plumbing are super duper attack and compromise levering vectors. When I was in Romania, “Hooray for GitHub” was, in my opinion, one of the more popular youth hang out disco hits.
The write up adds a new twist: Allegedly inappropriate use of the intellectual property of open source software on GitHub. The write up states:
As far as we know, this is the first class-action case in the US challenging the training and output of AI systems. It will not be the last. AI systems are not exempt from the law. Those who create and operate these systems must remain accountable. If companies like Microsoft, GitHub, and OpenAI choose to disregard the law, they should not expect that we the public will sit still. AI needs to be fair & ethical for everyone.
This issue is an important one. The friction for this matter is that the US government is dependent on open source to some degree. Microsoft is a major US government contractor. A number of Federal agencies are providing money to companies engaged in strategically significant research and development of artificial intelligence.
The different parties to this issue may exert or apply influence.
Worth watching because Amazon- and Google-type companies want to be the Big Dog in smart software. Once the basic technology has been appropriated, will these types of companies pull the plug on open source support and go cloud commercial? Will attorneys benefit while the open source community suffers? Will this legal matter mark the start of a sharp decline in open source software?
Stephen E Arnold, November 9, 2022