Another GitHub Copilot Detractor Emerges, a California Lawyer Eyeing Lawsuit

GitHub Copilot, described as an "AI pair programmer" coding assistant, made big waves in the software development space when it launched last year, lauded for its advanced code-completion capabilities. However, detractors soon emerged -- and are still emerging -- the latest being a California lawyer who is investigating a potential lawsuit against GitHub's owner, Microsoft.

GitHub calls it an "AI pair programmer" for its ability to provide advanced code-completion functionality and suggestions, similar to IntelliSense/IntelliCode in IDEs and code editors like Visual Studio and Visual Studio Code. However, it goes beyond those Microsoft offerings thanks to OpenAI Codex, a cutting-edge AI system developed by Microsoft partner OpenAI, which lets it turn natural-language prompts into actual code.
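To illustrate the kind of completion described above, consider a hypothetical sketch (not actual Copilot output): a developer types a natural-language comment and a function signature, and the assistant suggests a complete body.

    # The developer types only the comment and the function signature;
    # a Copilot-style assistant might suggest the body and the example call.
    from datetime import datetime

    def day_of_week(iso_date: str) -> str:
        """Return the weekday name for an ISO 8601 date string like '2022-10-17'."""
        return datetime.fromisoformat(iso_date).strftime("%A")

    print(day_of_week("2022-10-17"))  # prints "Monday"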

The tool has definitely increased programmer productivity, according to a report last month (see "A Year In, GitHub Measures AI-Based Copilot's Productivity Boost"), but along the way it has come under a fair share of criticism.

[Figure: GitHub Copilot (source: GitHub).]

For example, the Free Software Foundation (FSF) last year deemed GitHub Copilot to be "unacceptable and unjust."

And in June, another open source-focused organization, the Software Freedom Conservancy (SFC), piled on, listing many grievances about GitHub's behavior, especially pertaining to the release of a paid service based on the tool, whose AI model is trained on publicly available GitHub source code repos.

Along with the complaints came warnings, such as one last year from researchers who, testing multiple scenarios, found that some 40 percent of tested Copilot-assisted projects included security vulnerabilities.

And that's not to mention that GitHub Copilot renewed developers' angst about losing their coding jobs to AI constructs.

Now, on top of all that, a "GitHub Copilot investigation" site has been published by Matthew Butterick, a lawyer who has teamed up with a law firm to conduct the investigation. His site solicits commentary on GitHub Copilot as part of the investigation into a potential lawsuit.

The site features this heading: "Maybe you don't mind if GitHub Copilot used your open source code without asking. But how will you feel if Copilot erases your open source community?"

Butterick said that after writing about GitHub Copilot's problems last year -- mostly about its handling of open source licenses -- he reactivated his California bar membership and teamed up with the Joseph Saveri Law Firm for the investigation. The site lists concerns about the use of GitHub Copilot and about the training of its backing AI system.

"By offering Copilot as an alternative interface to a large body of open source code, Microsoft is doing more than severing the legal relationship between open source authors and users," the site says in a lengthy treatise. "Arguably, Microsoft is creating a new walled garden that will inhibit programmers from discovering traditional open source communities. Or at the very least, remove any incentive to do so. Over time, this process will starve these communities. User attention and engagement will be shifted into the walled garden of Copilot and away from the open source projects themselves -- away from their source repos, their issue trackers, their mailing lists, their discussion boards. This shift in energy will be a painful, permanent loss to open source."

The site says Butterick and his team want to hear from people if:

  • You have stored open source code on GitHub (in a public or private repo), or if you otherwise have reason to believe your code was used to train OpenAI's Codex or Copilot.
  • You own -- or represent an entity that owns -- one or more copyrights, patents, or other rights in open source code.
  • You represent a group that advocates for open source code creators.
  • You are a current or past GitHub Copilot user.
  • You have other information about Copilot you'd like to bring to our attention.

About the Author

David Ramel is an editor and writer at Converge 360.
