Another GitHub Copilot Detractor Emerges, a California Lawyer Eyeing Lawsuit
GitHub Copilot, described as an "AI pair programmer" coding assistant, made big waves in the software development space when it launched last year, lauded for its advanced code-completion capabilities. However, detractors soon emerged -- and continue to emerge, the latest being a California lawyer who's investigating a potential lawsuit against GitHub's owner, Microsoft.
GitHub calls it an "AI pair programmer" for its ability to provide advanced code-completion functionality and suggestions similar to IntelliSense/IntelliCode in IDEs and code editors like Visual Studio and Visual Studio Code. However, it goes beyond those Microsoft offerings thanks to OpenAI Codex, a cutting-edge AI system developed by Microsoft partner OpenAI, which lets it turn typed natural-language prompts into working code.
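To illustrate the kind of comment-to-code suggestion described above, here is a hypothetical sketch of what a Copilot-style completion might look like. The prompt comment and the suggested function body are invented for illustration; they are not actual Copilot output, and real suggestions vary by context.

```python
# The developer types only the comment below; an assistant like Copilot
# might then suggest the entire function body that follows.

# compute the average of a list of numbers, returning 0.0 for an empty list
def average(numbers):
    if not numbers:
        return 0.0
    return sum(numbers) / len(numbers)


print(average([2, 4, 6]))  # prints 4.0
print(average([]))         # prints 0.0
```

In practice the suggestion appears inline in the editor as ghost text the developer can accept, reject, or cycle through.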
The tool has definitely increased programmer productivity, according to a report last month (see "A Year In, GitHub Measures AI-Based Copilot's Productivity Boost"), but along the way it has come under a fair share of criticism.
For example, the Free Software Foundation (FSF) last year deemed GitHub Copilot to be "unacceptable and unjust."
And in June, another open source-focused organization, Software Freedom Conservancy (SFC), piled on, listing many grievances about GitHub's behavior, especially pertaining to the release of a paid service based on the tool, whose AI model is trained on source code hosted in GitHub repos.
Along with complaints, there were warnings, such as one last year from researchers who, testing multiple scenarios, found that some 40 percent of Copilot-assisted projects included security vulnerabilities.
And that's not to mention that GitHub Copilot renewed developers' angst about losing their coding jobs to AI constructs.
Now, on top of all that, a "GitHub Copilot investigation" site has been published by Matthew Butterick, a lawyer who has teamed up with a law firm to conduct the investigation. His site solicits commentary on GitHub Copilot as part of the investigation into a potential lawsuit.
The site features this heading: "Maybe you don't mind if GitHub Copilot used your open source code without asking. But how will you feel if Copilot erases your open source community?"
Butterick said that after writing about GitHub Copilot problems last year -- mostly about the handling of open source licenses -- he reactivated his California bar membership and teamed up with the Joseph Saveri Law Firm for the investigation. The site lists concerns about the use of GitHub Copilot and about the training of its backing AI system.
"By offering Copilot as an alternative interface to a large body of open source code, Microsoft is doing more than severing the legal relationship between open source authors and users," the site says in a lengthy treatise. "Arguably, Microsoft is creating a new walled garden that will inhibit programmers from discovering traditional open source communities. Or at the very least, remove any incentive to do so. Over time, this process will starve these communities. User attention and engagement will be shifted into the walled garden of Copilot and away from the open source projects themselves -- away from their source repos, their issue trackers, their mailing lists, their discussion boards. This shift in energy will be a painful, permanent loss to open source."
The site says Butterick and his team want to hear from people if:
- You have stored open source code on GitHub (in a public or private repo), or if you otherwise have reason to believe your code was used to train OpenAI's Codex or Copilot.
- You own -- or represent an entity that owns -- one or more copyrights, patents, or other rights in open source code.
- You represent a group that advocates for open source code creators.
- You are a current or past GitHub Copilot user.
- You have other information about Copilot you'd like to bring to our attention.
David Ramel is an editor and writer for Converge360.