AI and copyright: Breaking the ‘Napster’ mindset

Artificial Intelligence and Digital Innovation Minister Evan Solomon, pictured at a Coding for Veterans event. ‘As we advance AI for All, we are making sure no one is left behind. Veterans bring leadership, resilience, and experience that strengthen every sector they enter. This is how we build a stronger Canada for the future,’ he said on social media. / TWITTER PHOTO

The government must protect intellectual property rights and ensure there is transparency when it comes to the development of AI in the creative industries, a new report by the House of Commons Heritage Committee says. 

The report, Impacts of Artificial Intelligence on the Creative Industries, tabled in the House on April 14, called on the government to confirm that the Copyright Act covers AI-generated content, require AI developers to disclose what copyrighted material they use to train their models, and establish an opt-in system so that creators must give explicit consent before their work can be used in AI training.

“Witnesses stated that the pillars of Authorization, Remuneration and Transparency should ground the use of copyrighted content by generative AI models. They called for upholding the Copyright Act as it stands, rejecting the introduction of text and data mining exceptions and the extension of copyrightability to machine-generated outputs. They also expressed their support for a voluntary, market-based licensing regime that they said would ensure respect for creators’ rights without stifling innovation. Finally, they insisted on the need for transparency, both for training datasets and to identify AI-generated outputs,” the report says.

After hearing from 43 witnesses between October and November 2025, the committee made 13 recommendations. The proposals touch on digital infrastructure investment, regulatory design and Canada's broader positioning in the global AI economy.

The European Union's approach permits text and data mining unless rights holders actively opt out. Several witnesses urged the committee not to follow the EU model.

"These companies refuse to come to the table to negotiate, and they are counting on you to change copyright in their favour," Margaret McGuffin, CEO of Music Publishers Canada, told the committee.

The report also quoted Patrick Rogers of Music Canada who warned that tech companies are "jurisdiction shopping" in the hope that different markets will adopt an exception, and urged Canadian MPs not to "cave to demands" for one.

Technology company representatives argued, however, that model training does not engage copyright interests. Rachel Curran of Meta Platforms told the committee the company's AI models don't "store or reproduce any content," but instead "extract what we believe are unprotectable facts, statistics, patterns and relationships." Her colleague Kevin Chan said that "learning about information and developing the patterns and relationships to build these models" does not touch on copyright interests.

Rather than new regulation, many creative industry witnesses called on the government to uphold existing copyright principles and allow a voluntary licensing market to emerge, effectively an opt-in mechanism consistent with how intellectual property has traditionally been managed.

‘We’re in the Napster era of AI’

Music industry witnesses pointed to the sector's experience navigating the transition from piracy to licensed streaming as a relevant precedent.

“We are in the Napster era of AI in the marketplace,” Rogers told the committee. “We need to get to the iTunes stage.”

McGuffin added: “If you were to look for headlines about licensing coming together, you would find them.”

Meanwhile, Alexandra Kearney, co-founder of Edmonton-based AI company Artificial Agency, stressed the importance of "introducing technical people into the room," noting that AI is a "nuanced" technology and that policy initiatives should be "securely grounded in technological realities." She cautioned against treating AI as a "monolith," saying many of its existing and potential applications could be impeded by frameworks that are too broad.

Stephanie Enders of the Alberta Machine Intelligence Institute advised against "horizontal" regulation of AI as such, suggesting governance be structured by applying an "AI lens" to existing sectoral regulations rather than creating a "universal path" that would capture all AI technologies.

Rudyard Griffiths of The Hub cautioned that any attempt to "freeze the Canadian economy in place" to protect certain industries from disruption would be "prohibitively expensive." 

Brown pushed back on the innovation-versus-protection framing: "Respect for copyright does not stifle innovation. If you stream music on your smart phone, you have proof in your pocket that compensation for creators and technical innovation can successfully coexist."

Government has 120 days to respond

Several witnesses called for the cultural sector to be included in AI policy processes from which it has largely been absent to date, including the government's Strategic Task Force on AI. The committee recommended adding cultural community representatives to the Advisory Council on Artificial Intelligence and establishing a dedicated working group on cultural impacts.

The committee requested a government response, which is due within 120 days.

On the same day the report was released, Artificial Intelligence and Digital Innovation Minister Evan Solomon appeared at a Senate Committee of the Whole to speak about his portfolio. He said the long-awaited AI Strategy is “upcoming” but, after several delays, did not say when it is expected. He told the Senate the government’s core principle is “AI for all.”

This means the technology will work for everyone across the country, he said. 

“Canadian AI must not be something invented here and developed elsewhere. It has to help build a stronger Canada right here at home,” he said. “Too often, we plant the seed and water and grow the plant, and then someone else harvests it. That will stop. Our plan will take that under consideration.”

Meanwhile, the Conservative MPs on the committee included a supplementary report noting that AI is an “enormous opportunity for Canada to lead globally, drive innovation, strengthen our economy, and improve the lives of everyday Canadians, especially those in the creative sector.”

The report warns that overregulation could discourage investment and talent, citing concerns that restrictive policies may make Canada appear “not AI-friendly” in a fast-moving global market. Conservatives instead urged governments to target harmful uses such as fraud and impersonation, rather than regulating AI technology itself, and to rely on technical experts when crafting new laws.

Bea Vongdouangchanh

Bea Vongdouangchanh is Editor-in-Chief of Means & Ways. Bea covered politics and public policy as a parliamentary journalist for The Hill Times for more than a decade and served as its deputy editor, online editor and the editor of Power & Influence magazine, where she was responsible for digital growth. She holds a Master of Journalism from Carleton University.
