Generative AI & Employee IP: Who Owns Work Created by Employees Using AI Tools?

Christina Chouta
Marketing Executive, Eurofast

Generative AI tools like ChatGPT and Microsoft Copilot are increasingly used by employees to create reports, marketing copy, code, and presentations. However, this widespread adoption raises a pressing legal and human resources question: who owns the work produced when employees use AI? This article explores the legal landscape, current statistics on workplace AI use, and practical contract and policy considerations to clarify ownership rights, data protection, and workflow control.

What the Data Shows

  1. A TELUS Digital Experience survey of 1,000 U.S. enterprise employees found that 57% had entered sensitive business data into public AI tools, and 68% used personal accounts rather than company-managed platforms.
  2. The KPMG‑University of Melbourne 2025 global study of 48,340 workers across 47 countries found that 57% conceal their AI use from employers and 58% use AI intentionally at work, yet only 47% have received formal guidance.
  3. According to U.S. data, 57% of workers have tried ChatGPT, but only 16% use it regularly—and just 17% say their employer has a clear AI policy.

These figures highlight two core risks: hidden AI usage and lack of contractual clarity about content ownership and confidentiality.

The Legal Landscape on AI-Generated IP

AI Tools Lack Legal “Authorship”
Across major jurisdictions—including the U.S., UK, China, and EU—only humans can be legal authors of creative works. Purely AI-generated outputs are not eligible for copyright protection.

Human Input Matters
The U.S. Copyright Office states that copyright can apply to AI-assisted works when there is sufficient human input or “creative arrangement.” The European Commission similarly recognises that AI-generated content may qualify for copyright only when there is notable human involvement.

Training Data Liability
There is growing legal scrutiny on AI models trained on copyrighted data without permission. Lawsuits from The New York Times, Authors Guild, and several artists allege infringement and challenge “fair use” defenses.

No IP by AI
Generative AI systems themselves cannot own IP rights. Any rights to AI output typically reside with the human user or employer under contract law.

The Need for Contracts
Organisations, including WIPO, advise clear contractual allocation of rights when employees use AI tools. HR policies and employment contracts should explicitly set out who owns the output—and clarify data usage and confidentiality.

Eurofast’s Take — Action Points & Client Implications

Eurofast helps employers establish clear frameworks when adopting generative AI, ensuring employee-created content does not blur ownership or confidentiality lines. Here’s how we add value:

Contractual Clarity
We draft or update employment and contractor agreements to explicitly state:

  • The company owns any work created with the help of generative AI.
  • Employees must declare AI use and log which tools they used.
  • Confidential data must not be entered into public models, subject to any defined exceptions.

AI & IP Policy Framework
We support organisations in drafting AI use policies that:

  • Restrict or approve tools (e.g., company‑managed vs. public apps).
  • Require disclosure of AI assistance in deliverables.
  • Mandate data privacy and compliance protocols.

Protecting IP & Minimising Legal Risk
We advise on IP strategies for AI-assisted inventions and branding. Since courts may not protect content absent human creativity, we help clients document workflows to demonstrate authorship.

➡️ By ensuring robust contract terms, clear policies, and staff training, Eurofast enables companies to confidently adopt generative AI tools while securing legal rights and reducing IP or data risks.

Next Steps – What You Can Do Today

If your organisation is using or considering generative AI tools:

  • Update employment contracts with AI-specific IP provisions.
  • Roll out AI use policies to cover confidentiality, data security, and disclosure.
  • Train staff on AI benefits and pitfalls—usage without guidance leads to legal exposure.
  • Audit tool usage and establish disclosure mechanisms to enforce policies.

For expert guidance or tailored workshops on these topics, please contact us at [email protected]
