This Copyright Lawsuit Could Shape the Future of Generative AI


Of course, programmers have always studied, learned from, and copied each other’s code. But not everyone is sure it’s right for AI to do the same, especially if it can then produce vast amounts of valuable code itself without adhering to the licensing requirements of the source material. “As a technologist, I’m a big fan of AI,” Butterick says. “I look forward to all the possibilities of these tools. But they must be fair to everyone.”

Thomas Dohmke, CEO of GitHub, says Copilot now comes with a feature designed to prevent it from copying existing code. “When you enable this and Copilot’s suggestion matches code published on GitHub (without even looking at the license), it won’t make that suggestion,” he says.

Whether this will provide sufficient legal protection remains to be seen, and the upcoming trial could have broader implications. “Assuming this isn’t settled, it will certainly be a landmark case,” says Luis Villa, a coder turned lawyer who specializes in cases related to open source.

Villa, who personally knows GitHub co-founder Nat Friedman, doesn’t think it’s clear that tools like Copilot go against the philosophy of open source and free software. “The free software movement in the ’80s and ’90s talked a lot about reducing the power of copyright in order to increase people’s ability to code,” he says. “I find it a little frustrating that we are now in a position where some people are rushing to say that we need as much copyright as possible in order to protect these communities.”

Whatever the outcome of the Copilot case, Villa says it could shape the destiny of other areas of generative AI. If the outcome hinges on how similar AI-generated code is to its training material, the ruling could have implications for systems that reproduce images or music matching the style of the works in their training data.

Anil Dash, CEO of Glitch and board member of the Electronic Frontier Foundation, says the legal debate is only part of a larger adjustment driven by generative AI. “When people see AI creating art, creating writing, and creating code, they ask themselves, ‘What is all this? What does it mean for my business, and what does it mean for society?’” he says. “I don’t think every organization has thought about it in depth, and I think it’s kind of the next frontier.” As more people start thinking about and experimenting with generative AI, there will likely be more lawsuits.
