How AI and copyright turned into a political nightmare for UK Labour


LONDON — It was never meant to be this hard.

In the wake of Labour’s decisive election victory in July, ministers in the party’s tech team were determined to grip an issue they felt the previous Conservative government had failed to address: how to protect copyright holders from artificial intelligence companies’ voracious appetite for content to train their AI models.

Instead, Labour’s handling of the issue has snowballed into a PR nightmare that has transformed a largely uncontroversial data bill into a political football.

The Data (Use and Access) Bill has ricocheted between the Commons and the Lords in an extraordinarily long bout of parliamentary ping-pong, with both Houses digging in their heels amid a frenzied lobbying battle on all sides.

As one tech industry insider put it: “Everyone has fought dirty and everyone is going to walk away covered in shit.”

Opting out

Many in the creative sector, which has long viewed Labour as its natural ally in Westminster, hoped the party’s July election victory would work in its favor. 

In a manifesto for the creative sectors published while in opposition, the party had vowed to “support, maintain, and promote the U.K.’s strong copyright regime,” stating: “The success of British creative industries to date is thanks in part to our copyright framework.”

Instead, just five months later, creatives’ worst fears were realized when ministers proposed allowing AI developers to scrape copyrighted content freely unless artists, publishers and creators “opt out” — putting the onus on creatives to protect their work.

The ensuing backlash sparked broadsides from the likes of Paul McCartney and Elton John and a rearguard effort by peers to enshrine protections for creatives.

On Monday, peers in the House of Lords voted overwhelmingly to defy the Commons for the fourth time by amending a data bill to enshrine protections for creators, despite Department for Science, Innovation and Technology minister Maggie Jones saying businesses want “certainty, not constitutional crises.”

Perhaps more troublingly for Labour, the rancor has fed a perception that the party has allied itself too closely to foreign tech giants in its bid for economic growth — a narrative that could make it more difficult for the government to deliver its plans.

How did we get here?

POLITICO spoke to several people familiar with discussions inside the government to understand how a complex — if fiercely contested — debate over intellectual property law became front page news. Many were granted anonymity to speak freely.


They all agreed that significant missteps had turned a genuine attempt to resolve the matter into a political nightmare.

Creative sector representatives pointed POLITICO to the outsized influence of Matt Clifford, a prolific tech investor who was tapped by Technology Secretary Peter Kyle to draft an “AI Opportunities Action Plan” within days of Labour taking office.

Clifford had previously argued that the U.K. should reform its copyright laws to attract AI investment in a report for the previous government on “Pro-innovation regulation” authored with Patrick Vallance, now the U.K.’s science minister.

The Tony Blair Institute, which had helped shape the incoming Labour government’s AI policy, also backed the idea.

But it was Chris Bryant, a joint minister across the technology and culture departments, who was the driving force behind efforts to resolve a deadlock left behind by the previous government, according to two people close to the process.

One creative sector lobbyist accused Bryant of “naiveté” for assuming the main problem was that the “other lot were rubbish,” rather than acknowledging the extent of the technical and political hurdles to a solution.

They also argued that Bryant’s role meant there was no distinct voice from the culture department — which acted as a bulwark against reform in the previous government — on what was ostensibly a shared policy.

Ministers and officials in the culture department were “slow to organize themselves,” the person said, allowing the technology department to own the issue.

It’s a stitch-up

When POLITICO revealed in October that the government was planning to propose an “opt out” model, it seemed that creatives had been outmaneuvered.

A subsequent government consultation in December described an “opt out” system, alongside increased transparency obligations on AI firms, as its “preferred option.” 

The consultation was “a pivotal opportunity to ensure that sustained growth and innovation for the U.K.’s AI sector continues to benefit creators, businesses and consumers alike,” Bryant said, adding: “We want to provide legal certainty for all.”

For copyright holders, the suggestion that the law on AI training was unclear undermined their efforts to extract potentially lucrative licensing deals from AI firms, which partially relied on the threat of pursuing legal action.


But by stating a “preferred option” — and accepting all the recommendations in Clifford’s AI plan, which called for copyright reform, one month later — the government created a “target” for itself, a tech industry figure said.

The creative sector was able to portray itself as the victim of a stitch-up. In a coordinated campaign that stretched from newspaper publishers to record labels, trade bodies called on their A-list networks and sympathetic lawmakers to generate a steady stream of damning headlines for the government.

In May, Elton John labeled Kyle a “moron” on the BBC’s Sunday news show. (“My family thought it was the best thing ever,” Kyle joked this week during an appearance at SXSW in London.)

Briefings to the press from figures involved in the campaign highlighted Clifford’s personal AI investments, as well as Kyle’s refusal to meet creatives despite taking a bevy of meetings with Big Tech lobbyists. Kyle’s decision to paint the sector as trying to “resist change” in an interview with the Financial Times didn’t help.

Campaigners also accused Kyle of only seeking advice from a small circle of advisers with strong views on AI. Kyle “refuses to go one inch” beyond what’s required in order not to imperil investment from AI firms, one said.

Most importantly, the sector found a champion in Beeban Kidron, a former film director and crossbench peer in the House of Lords. A formidable campaigner, Kidron had previous form holding ministers’ feet to the fire over online safety.

In coordination with the wider campaign, Kidron tabled a series of amendments to a data bill before parliament to push for immediate transparency duties on AI firms that would force them to disclose how they train their AI models.

“What the government is doing is bad politics, bad economics and bad for the culture and reputation of the U.K.,” Kidron said. “They will live to regret their short-sighted awe and the failure to be adults in the room.”

Face the music

In private, ministers railed against what they felt was misleading and inflammatory coverage of their genuine attempt to resolve a knotty issue.

But “there was no way we could have campaigned the way we did without the ‘preferred option,’” the creative sector lobbyist quoted at the top of the article said.

Liberal Democrat peer Tim Clement-Jones, who has backed Kidron’s efforts, told POLITICO the government “put the cart before the horse.” “It completely destroyed trust,” he said.

In response to Kidron’s campaigning, ministers have promised to publish technical reports on transparency, technical solutions to an “opt out,” licensing, and other subjects within nine months. Cross-industry working groups will be formed to weigh in on the questions and seek to cultivate consensus.


Most significantly, ministers now say they no longer have a “preferred option” on the way forward, and insist the U.K.’s existing law is clear — though Kyle maintains the U.K.’s laws are “not fit for purpose” in the AI era.

“We’re open-minded,” a DSIT official said.

In May, Kyle told MPs he “regrets” the timing and framing of the government’s proposals, accepting they inflamed creatives and almost derailed the government’s legislative agenda.

“We all should have done things differently,” a senior tech executive agreed.

Where next?

The senior tech executive argued that an “opt out” with proportionate transparency duties on AI firms ultimately remains the best way forward.

Getting there won’t be easy, however.

It is unclear if, and when, technology will emerge that could allow rights holders to easily and effectively “opt out” of AI firms training new models on the full range of media spread across the web.

And finding consensus on what constitutes adequate transparency — which ministers say will be “the foundation” of any legislation that could emerge within the next two years — will also be challenging.

Leading AI companies OpenAI and Google have already made it clear that they view “disproportionate” transparency requirements as a threat to their business models.

Tim Flagg, CEO of UKAI, which represents U.K. businesses adopting AI, said the trade body “has been on a journey” over the issue.

After initially supporting liberalization of the law, he told POLITICO he now believes the U.K. stands to benefit most by carving out a niche developing smaller, more specialized AI systems using high quality, licensed content.

But others in the tech world have only hardened their position.

Some of the largest tech firms have responded to the creative sector’s campaign by adopting even more extreme positions in a bid to “balance” the debate, according to multiple people familiar with their thinking. It is understood that the vast majority of the 11,500 responses submitted to the government’s consultation are from creators opposed to the government’s plans.

Industry bodies including TechUK, which previously advocated for reforms mirroring the government’s original “preferred option,” now describe it as a reluctant “compromise.”

But rights holders may also struggle to sell any eventual compromise to artists and creators after taking such an uncompromising public position, the tech industry figure cited above said.

“What people [in the creative sector] are saying in private is very different to what they are saying in public,” they said, noting that many of the groups involved in the campaign had been in discussions with tech firms until relations soured.

Ahead of a near-unprecedented fourth round of ping-pong on the data bill, a tearful Chris Bryant on Tuesday begged peers to allow the bill to “run its course,” saying he had “heard the concerns” and would address them in the round in future legislation.

Lobbyists are rolling up their sleeves.

In a statement to POLITICO, a DSIT spokesperson said:

“We recognize how pressing these issues are and we truly want to solve them, which is why we have committed to bring forward the publication of our report exploring the breadth of issues raised in the AI and copyright debate, alongside an economic impact assessment covering a range of possible options. 

“We have also been clear the way to arrive at a solution is to ensure we are meeting the needs of both sectors, rather than trying to force through piecemeal changes to unrelated legislation which could quickly become outdated.

“As you would rightly expect, we are taking the time to consider the 11,500 responses to our consultation, but no changes to copyright law will be considered unless we’re completely satisfied they work for creators.”
