Below are the ten key takeaways from our engaging discussion.
The EU AI Act, which became law in August last year, is being implemented in stages, allowing businesses time to adapt. In a parallel with the GDPR, it provides for strict fines for non-compliance - up to €35 million or 7% of global annual turnover (whichever is higher) for the most serious breaches - signalling the EU’s commitment to responsible AI development.
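To make the “whichever is higher” mechanic concrete, here is a minimal illustrative sketch in Python (a simplified illustration for this post, not legal advice; the applicable cap and percentage depend on the breach category and the regulator’s assessment):

```python
# Illustrative only: the EU AI Act's headline cap for the most serious breaches
# is the higher of EUR 35 million or 7% of global annual turnover.

def maximum_fine_eur(global_annual_turnover_eur: float) -> float:
    """Theoretical maximum fine for the most serious breaches (simplified)."""
    fixed_cap = 35_000_000                             # EUR 35 million
    turnover_cap = 0.07 * global_annual_turnover_eur   # 7% of global turnover
    return max(fixed_cap, turnover_cap)

# Example: a business with EUR 1 billion in global turnover faces a cap of
# EUR 70 million, because 7% of turnover exceeds the EUR 35 million floor.
print(f"EUR {maximum_fine_eur(1_000_000_000):,.0f}")  # EUR 70,000,000
```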
AI systems under the EU AI Act fall into four risk-based categories: unacceptable risk (prohibited practices), high risk (subject to strict compliance obligations), limited risk (subject to transparency requirements) and minimal risk (largely unregulated).
Unlike the EU’s structured approach, the UK’s regulatory landscape remains fluid. While the previous government favoured a light-touch approach, the new Labour government seems to be moving towards a more structured framework.
Global approaches vary widely, including:
Many businesses are using AI for customer insights, surveillance and automation. However, transparency is critical. Capturing behavioural data is one thing, but tracking identifiable personal data ventures into GDPR territory, requiring careful legal navigation.
With AI models often trained on vast datasets, many companies struggle to pinpoint exact data sources. A key takeaway was the necessity for AI-driven businesses to develop clear, accessible explanations for their data usage to maintain trust with regulators and consumers.
The conversation highlighted growing concerns about AI’s impact on IP. Many AI models scrape vast amounts of data, often without explicit permission, raising questions about originality, ownership, attribution and fair use. Traditional IP frameworks may be inadequate for the AI era. This is expected to remain a major area of legal contention for the foreseeable future.
Companies that proactively engage with regulatory changes - by setting up risk registers, compliance frameworks and internal accountability measures - are better positioned for growth. External accountability, via advisors, board members, certification bodies and professional networks, can also help improve compliance. Overall, you should develop a “culture of compliance” across the organisation, not just in the legal and compliance teams.
Companies are increasingly leveraging AI-driven digital twins for customer profiling and predictive analytics. While this enhances efficiency, it also raises ethical and privacy considerations, requiring thoughtful regulation to balance innovation with consumer rights.
The roundtable event illuminated the evolving AI regulatory landscape and its direct impact on businesses. While compliance can seem daunting, those who embrace AI governance as a strategic advantage rather than a burden will be best positioned for long-term success.
As AI continues to shape industries, staying informed and engaged in regulatory conversations will be critical.
Need help navigating the opportunities and challenges of AI?
Our AI Advisory Group brings together specialists from across Kingsley Napley to support clients who are building, scaling, selling or using AI technologies. Get in touch to find out how we can help you manage the risks and make the most of the benefits.
Chris is a highly experienced solicitor who leads the Corporate, Commercial and Finance team’s general Commercial & Technology Contracts, Outsourcing & Data legal advisory services.
We welcome views and opinions about the issues raised in this blog. Should you require specific advice in relation to personal circumstances, please use the form on the contact page.