Position Paper on Audiovisual, Animation, Post-Production and AI-Assisted Creation
By Del-York Creative Academy (DCA)
In collaboration with Youth in Animation and Post-Production Initiative (YAPPI)
Lagos, Nigeria | December 2025
Executive summary
Nigeria’s creative industries are standing at a decisive inflection point. Film, animation, post‑production, digital content and design have moved from the margins of the economy into the heart of how the country generates jobs, attracts investment and projects soft power to the world. At the same time, generative artificial intelligence (AI) and other digital tools are quietly rewiring creative workflows, from the way ideas are visualised to how voices, characters and worlds are produced, raising new questions about who owns what and on what terms.
This paper argues that Nigeria already has enough legal architecture to protect its creators in an AI‑driven world, but that the way those rules are interpreted and operationalised will determine whether the country remains a low‑margin service provider or emerges as a global source of original, bankable IP. It advances a simple but powerful foundation: copyright protection rests on human authorship; purely machine‑generated material, left untouched, does not qualify as a protected work; and, in the absence of AI‑specific legislation, contracts, platform terms and documentation practices are doing much of the heavy lifting.
Building on Nigerian legislation, key international treaties and practical experience in the audiovisual and animation sectors, the paper develops a “rights‑first, evidence‑driven” framework for IP in creative pipelines. It draws on global disputes around training data, image synthesis and deepfakes to illustrate the risks of silence, while proposing concrete steps for creators, studios, regulators, funders and development partners to strengthen chain‑of‑title, clarify AI usage and build a culture of trust that lowers transaction costs across the ecosystem. The goal is not merely alignment with existing global norms, but to position Nigeria as a thought leader in how human creativity and AI can coexist on fair, transparent and commercially viable terms.
Nigeria at a creative crossroads
Over the past two decades, Nigeria’s creative economy has shifted from a “promise” narrative to a measurable reality. Nollywood productions sit on major global streaming platforms; Nigerian music shapes global charts and festival line‑ups; and local animation and post‑production studios now routinely service international clients. In cities such as Lagos, Abuja and Port Harcourt, young teams are building studios, collectives and start‑ups that approach storytelling and design with a distinctly Nigerian voice and a global mindset.
Yet beneath the visible success of high‑profile projects lies a quieter, more fragile reality. Many creators still work on the basis of verbal understandings, informal messages and template contracts that do not reflect the complexity of modern production chains. Rights are often assumed rather than documented, and when a project suddenly attracts attention, whether through a viral clip, an international festival selection or a licensing offer, questions of ownership and entitlement surface at precisely the moment when clarity is most needed.
This structural fragility is magnified by the arrival of AI tools in creative workflows. A young animation team in Lagos might now use an image generator to quickly visualise character concepts, a language model to suggest dialogue variations, and audio tools to create temp voices or refine recordings. These tools accelerate production, lower costs and democratise experimentation, but they also introduce new lines of risk: what happens if an AI‑generated background closely resembles a foreign artist’s work? Who owns an AI‑assisted storyboard that was refined by multiple hands? What does it mean to license a show if its key assets were produced using a model trained on unlicensed data?
Nigeria therefore stands at a crossroads defined by three intersecting forces: a rapidly modernising creative economy, an evolving but still under‑utilised legislative framework, and a technological disruption that ignores borders and moves faster than traditional policymaking. The question is not whether AI will be used in Nigerian creative work; it already is. The question is whether the country will shape usage norms in a way that protects creators, reassures investors and inspires regulators, or allows a vacuum in which rights are eroded by default.
Legal and policy foundations for a human‑centred AI era
The starting point for any serious conversation about IP and AI in Nigeria is the existing law. At the constitutional level, property rights are explicitly protected, and the courts have long recognised that this extends beyond land and physical objects to include intangible assets such as copyright, trademarks and related rights. Building on this foundation, the Copyright Act 2022, the Trademarks Act, the Patents and Designs Act and the Evidence Act 2011 together define the core rules that govern who owns creative outputs, how those rights can be recorded and transferred, and how they can be proven in a dispute.
Within this architecture, one doctrinal point carries particular weight in the age of AI: copyright subsists only in original works created by a human author. The Copyright Act 2022 requires that a work originate from a “qualified person,” and that requirement is echoed and reinforced by the international instruments to which Nigeria is party, including the Berne Convention, the TRIPS Agreement, the WIPO Copyright Treaty, the WIPO Performances and Phonograms Treaty, the Beijing Treaty on Audiovisual Performances and the emerging AfCFTA IP Protocol. Across these instruments, authors and performers are understood to be natural persons whose intellectual contributions merit protection and remuneration.
In practice, this means that purely machine‑generated content, produced without meaningful human creative input, falls outside the current definition of a “protected work.” An AI system may produce an image, a melody or a line of dialogue, but unless a human creator selects, arranges, modifies or integrates that output in a way that reflects their own skill and judgment, there is no copyright in the result under Nigerian law as it stands. This position does not prohibit AI use; it simply clarifies that legal protection follows human creativity, not computational capacity.
At the same time, Nigerian law does not operate in isolation. Global disputes around AI training and output, ranging from claims by stock‑image libraries against AI developers for training on their catalogues, to artists alleging that AI tools replicate their distinctive styles, are shaping expectations about what is acceptable and where liability should fall. While Nigerian courts are not bound by those foreign cases, they will be aware of the reasoning and may draw on their logic when interpreting local statutes in AI‑related disputes. The opportunity for Nigeria is to respond not as a passive rule‑taker, but as a jurisdiction that uses its own laws to craft a coherent approach: one that honours human authorship, respects international obligations and acknowledges technological realities.
In this sense, the real policy choice is not whether to “allow” AI in creative work, but how to embed it within a rights framework that ensures Nigerian creators remain visible and empowered. That requires moving beyond generic enthusiasm or alarm and focusing instead on the concrete points where law, technology and practice intersect: ownership, consent, similarity, and evidence.
Intellectual property in creative pipelines: from idea to evidence
Understanding how IP functions in an AI‑enabled environment requires looking closely at how creative projects actually unfold. Take a typical animated series developed in Lagos. It may begin with a small writers’ room shaping a story bible, character arcs and episodic outlines. Visual development artists then translate those ideas into character sheets, environment concepts and mood boards. Producers assemble financing and talent, animators and VFX artists build assets and scenes, sound designers and composers craft the sonic world, and editors bring everything together into coherent episodes.
At each stage, layers of IP emerge. The script and treatment embody literary rights; character designs and concept art give rise to artistic works; the final footage, with its integrated visuals and audio, constitutes a cinematograph film. Names, logos and distinctive character appearances may function as trademarks, while certain physical embodiments—figurines, merchandise or interface designs—could fall under industrial designs law. These are not theoretical categories; they determine who can license, adapt or merchandise the project over time.
Where AI enters this pipeline, it typically does so as a tool: an image generator used to prototype backgrounds; a writing assistant used to explore dialogue variants; an AI‑driven plug‑in used to clean audio or automate rotoscoping. If a concept artist feeds a prompt into an image generator, receives several options, and then painstakingly redraws and refines one of them to align with the show’s visual language, the final design reflects original human authorship, even if AI helped generate the starting point. Conversely, if a studio simply lifts raw AI outputs into a production without human transformation, those elements may contribute to the look and feel of the show but may not themselves be protected works under current doctrine.
For studios, the challenge is to ensure that the legal record keeps pace with this creative complexity. The Copyright Act 2022 generally vests initial ownership in the author unless a contract states otherwise, while the Evidence Act 2011 requires that digital records be shown to be authentic and reliably produced. In an AI‑enabled pipeline, that means studios must go beyond having “some contracts” and move towards an “evidence by design” approach: keeping signed agreements with all key contributors; storing dated versions of scripts, designs and project files; preserving AI prompts and outputs linked to particular assets; and backing up everything on at least two independent systems.
What may seem like administrative overhead today will determine tomorrow’s bargaining power. When a distributor or funder asks “Do you own this?” they are really asking “Can you show us the chain‑of‑title?” In a world where AI is part of the process, that chain must include evidence of who did what, how AI was used and where human creativity made the difference.
Generative AI, training data and emerging risk patterns
The most visible debates around AI in creative industries have not been about helpful tools that clean audio or speed up rotoscoping, but about generative systems that create images, text, music and video based on vast training datasets. Artists, photographers, voice actors and studios have raised a series of concerns: that their work has been used to train models without consent or compensation; that AI outputs sometimes reproduce distinctive styles or even specific works; and that synthetic voices and likenesses can be deployed in ways that undermine performers’ careers or reputations.
Even though many of the headline disputes are currently being litigated in North America and Europe, the underlying issues are directly relevant to Nigeria. If a model has been trained on global image or audio datasets that include African works, Nigerian creators may find their distinctive aesthetics reflected in outputs with no recognition or remuneration. If a studio uses a commercial AI tool that promises “royalty‑free” outputs but was trained on unlicensed content, that studio may be exposed to infringement claims when exporting the work. If a voice model can convincingly imitate a well‑known Nigerian actor, the risk is not abstract; it is immediate.
Under current Nigerian law, these questions will likely be analysed through a combination of copyright, performers’ rights, privacy, data protection and passing‑off doctrines. Did the use of underlying works respect their economic and moral rights? Is an AI‑generated output “substantially similar” to a protected work or performance? Was there informed consent for the capture and reuse of voice or likeness data? Do platform terms fairly allocate risk between AI providers and creative users?
A leadership posture for Nigeria does not require waiting for foreign courts to settle every dispute. It involves articulating a principled stance based on existing law: that large‑scale use of Nigerian cultural content for AI training should not occur in a legal vacuum; that creators have legitimate expectations of control and compensation when their works are commercially exploited; and that performers’ voices and likenesses are not mere raw materials but attributes that deserve protection and respect. It also means insisting, in contracts with AI vendors and cloud providers, that training on uploaded project materials be clearly addressed rather than hidden in fine print.
In practical terms, this suggests three immediate best‑practice moves for Nigerian studios and creators. First, treat AI platform terms as serious legal instruments and seek clarification where rights and responsibilities are unclear. Second, document how AI is used in each project, including prompts, settings and human modifications, so that if a question arises, there is an evidential trail. Third, avoid relying exclusively on tools whose training and licensing status is opaque for high‑value, export‑oriented projects, particularly where distinctive character designs, music or voices are involved.
Sector responsibilities and shared opportunities
Nigeria’s ability to turn creative talent into durable intellectual property will depend on how each segment of the ecosystem chooses to act over the next few years. The law sets the frame, but day‑to‑day practices in studios, freelance relationships, platforms, funding agreements and development programmes will determine whether that frame is filled with enforceable rights or with uncertainty.
- Studios and production companies
Studios sit at the centre of creative pipelines and are often the first point of contact for regulators, platforms and investors. Their responsibility goes beyond making great work; it includes designing processes that produce clean, provable rights. This begins with treating IP governance as a production function, not a purely legal afterthought. Writers, animators, editors, VFX artists, sound designers and performers should all work under written agreements that clearly define ownership, licensing, moral rights and how AI tools may or may not be used in the project.
A forward‑looking studio culture also integrates documentation into workflow tools. Version control, asset libraries and project management systems can be configured to preserve timestamps, authorship information and AI usage logs as part of ordinary work rather than as a separate “legal task.” This turns the studio’s catalogue into an asset that can withstand external scrutiny, making it more attractive to distributors and co‑production partners who increasingly demand evidence that rights are secure and AI use has been responsible and transparent.
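For a catalogue to “withstand external scrutiny,” the records must be independently checkable years later. As an illustrative sketch (assuming, purely for illustration, a JSON manifest listing each asset’s filename and a SHA‑256 hash of the file as recorded), a due‑diligence pass could flag anything that has drifted since recording:

```python
import hashlib
import json
from pathlib import Path

def verify_manifest(manifest: Path, asset_dir: Path) -> list[str]:
    """Return the names of assets whose current bytes no longer match the
    hash recorded when the asset was logged; missing files are reported too."""
    problems = []
    for entry in json.loads(manifest.read_text()):
        f = asset_dir / entry["asset"]
        if not f.exists():
            problems.append(f"{entry['asset']}: missing")
        elif hashlib.sha256(f.read_bytes()).hexdigest() != entry["sha256"]:
            problems.append(f"{entry['asset']}: modified after recording")
    return problems
```

An empty result is what a distributor’s lawyers want to see; a non‑empty one tells the studio exactly where its chain‑of‑title evidence needs repair.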
- Freelancers and small teams
Freelancers, independent creators and small studios often carry the greatest creative burden while holding the least contractual power. Yet they are also the group that can most quickly shift norms from the bottom up. By insisting on written agreements for their work, keeping detailed records of their own AI usage and source materials, and offering clients “clean rights” as a selling point, freelancers can position themselves not merely as talent, but as reliable partners in IP management.
This is particularly important in AI‑assisted projects, where a freelancer may be asked to deliver concept art, voice samples or scripts produced with the help of generative tools. In such cases, freelancers should document how AI was used, what platforms were involved, and what human modifications they introduced, and they should ensure their contracts explicitly address ownership and warranties relating to that process. Over time, these practices can help shift expectations, so that transparent AI use and strong documentation become hallmarks of professionalism in the Nigerian creative sector.
- Platforms, distributors and broadcasters
Distributors, broadcasters and digital platforms control critical gateways between creators and audiences. Their business models depend on trust: audiences trust them to deliver authentic content; regulators trust them to comply with law and policy; investors and advertisers trust them to avoid legal and reputational risk. In an AI‑enabled environment, that trust is increasingly tied to how well platforms can show that content has clean rights and that AI use is disclosed and managed responsibly.
One practical response is to make IP and AI questions a standard part of intake and onboarding. Platforms and broadcasters can require producers to provide brief rights summaries, chain‑of‑title documentation and AI usage declarations for new projects. This does not mean rejecting AI‑assisted works; rather, it creates a structured way to gather information, manage risk and, where necessary, request additional assurances or documentation. By doing so consistently, platforms help lift the baseline of practice across the sector and provide regulators with confidence that AI is not being used in ways that undermine performers’ rights, audience trust or Nigeria’s international reputation.
- Investors, financiers and sponsors
For investors, financiers and sponsors, IP is not a theoretical concern; it is a core asset that underpins revenue projections, collateral and exit strategies. Where rights are uncertain, valuations fall, legal costs rise and insurance becomes more complex. Conversely, where a project or studio can demonstrate robust IP governance—clear agreements, documented AI use, preserved evidence—investment risk decreases and the terms of finance can improve.
Investment documents can play a powerful standard‑setting role. Term sheets, loan agreements and equity documents can incorporate IP due‑diligence conditions, require AI usage disclosures and warranties, and encourage the adoption of policies on performer consent and data protection. Development agencies and impact investors, in particular, can combine capital with technical assistance to help studios build the legal and operational capacity needed to meet these expectations. In doing so, they are not merely protecting their own interests; they are helping to professionalise a sector whose long‑term health depends on strong, enforceable rights.
Policy, regulatory and development recommendations
Transforming Nigeria’s creative IP landscape in the age of AI will require coordinated action. This paper proposes a set of pragmatic recommendations that build on existing law, respect creative practice and recognise the constraints under which regulators and funders operate.
- For government and regulators
Regulators can take a light‑touch but decisive approach by clarifying how existing rules apply to AI‑assisted creativity and by integrating disclosure requirements into processes that already exist. Without rewriting the entire copyright framework, guidance can reaffirm that human authorship remains the legal cornerstone of protection and set out factors that indicate when human contribution in an AI‑assisted work is likely to reach the originality threshold.
Content registration, classification and broadcasting regimes can require simple AI‑usage declarations for new film, television and animation projects, along with brief summaries of rights ownership. Such declarations do not need to be burdensome; they can take the form of short forms or digital checklists. Over time, this will create a valuable dataset on how AI is being used in Nigerian creative work, informing more targeted policy interventions and helping regulators distinguish between responsible use and risky practices.
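As a sketch of how lightweight such a digital checklist could be, the snippet below validates a hypothetical AI‑usage declaration. The required fields and the example form are assumptions made for illustration, not an existing regulatory instrument; a real regime would define its own fields and submission channel.

```python
# Illustrative sketch of a digital AI-usage declaration checklist.
# Field names are assumptions for this paper, not an existing regulatory form.
REQUIRED_FIELDS = {
    "project_title",
    "rights_holder",
    "ai_used",                    # True/False
    "ai_tools",                   # list of platforms; empty if ai_used is False
    "human_authorship_summary",
}

def validate_declaration(form: dict) -> list:
    """Return a list of problems; an empty list means the form is complete."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - form.keys())]
    if form.get("ai_used") and not form.get("ai_tools"):
        problems.append("ai_used is True but no ai_tools listed")
    return problems

# Hypothetical declaration for a new animation project
form = {
    "project_title": "Animated short (pilot)",
    "rights_holder": "Example Studio Ltd",
    "ai_used": True,
    "ai_tools": ["voice-synthesis platform (hypothetical)"],
    "human_authorship_summary": "Original script, storyboards and final edit by studio staff",
}
print(validate_declaration(form))  # → []
```

The point of the sketch is proportionality: a declaration of this shape can be completed in minutes at registration or classification stage, while still producing the structured data regulators would need over time.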
Regulators can also support training and capacity‑building initiatives by partnering with institutions like Del‑York Creative Academy and networks such as YAPPI to deliver structured IP and AI literacy programmes for creators, lawyers and officials. This investment in human capital is as important as any statutory change, because even the best‑drafted rules will fail if the people applying them do not understand how creative workflows and technologies operate in practice.
- For development partners and agencies
Development agencies and international partners play a significant role in funding training, infrastructure and policy reform in Nigeria’s creative industries. Integrating a strong IP and AI component into these programmes can magnify their impact. Support can be directed towards national or regional hubs that offer legal clinics for creatives, hands‑on workshops on digital evidence preservation and AI documentation, and incubator programmes that help studios build internal IP systems.
At the policy level, development partners can facilitate dialogue between Nigerian regulators, courts and counterparts in other African and global jurisdictions who are grappling with similar questions. This can help avoid fragmented approaches and ensure that Nigerian creatives are not disadvantaged by uneven rules when their work crosses borders. By backing research and multi‑stakeholder forums on AI, IP and cultural rights, development agencies can position Nigeria as a reference point for other countries seeking to balance innovation and protection in the creative economy.
- For investors and funders
Investors and funders can incorporate IP and AI considerations into their evaluation frameworks without turning every project into a legal audit. Simple tools—a standard IP questionnaire, a checklist of key contracts, a requirement for a basic chain‑of‑title file—can quickly reveal whether a studio or production has taken rights seriously. Where gaps exist but the underlying creative potential is strong, funders can make support conditional on remedial steps, such as formalising contributor agreements or improving documentation practices.
Impact‑oriented funders and donors may also consider dedicated lines of support for legal and IP advisory services in funded projects, recognising that many small teams cannot yet afford specialised counsel. By tying capital to improved governance rather than punishing projects for initial weaknesses, investors can help raise the overall standard of practice and reduce the perceived tension between creativity and compliance.
Conclusion
Nigeria’s creative industries are already reshaping how the country is seen and heard around the world. The question now is whether the law, business models and daily practices around intellectual property will evolve fast enough to protect and amplify that achievement in an era of pervasive AI. The foundations exist: a legal framework that centres human authorship, a new generation of creators fluent in digital tools, and a growing recognition among regulators and funders that culture is not peripheral to development but central to it.
For creatives, the call to action is to treat rights as part of the creative process rather than a post‑production formality. Document your work, understand the terms of the tools you use, insist on clear agreements, and see yourself not just as a storyteller or designer, but as the originator of assets that carry long‑term value. For studios, the imperative is to build “evidence by design” into your pipelines so that your catalogues can withstand scrutiny and travel confidently across borders.
For regulators, this is an opportunity to lead with clarity rather than fear: to interpret existing laws in ways that preserve human creativity at the centre, require transparency in AI use and give courts and agencies the tools they need to act when rights are threatened.
For investors and development partners, it is a chance to back not only projects but also the infrastructure of trust (contracts, systems, skills) that will make Nigeria’s creative IP bankable at scale.
For enthusiasts and audiences, there is a role as well: to value originality, to ask where stories and images come from, and to support creators and companies that treat rights and identities with respect. The more these expectations permeate the market, the harder it becomes for exploitative practices to hide behind technology or complexity.
Del‑York Creative Academy and the YAPPI initiative commit to remaining active conveners in this conversation, bringing together creators, regulators, funders and technologists to refine the ideas set out in this paper and translate them into practical tools and collaborations. The path to a fair, AI‑aware creative ecosystem will not be shaped by one document or one institution, but by a community willing to experiment, learn and adjust while keeping one principle fixed: that the value generated by Nigerian creativity must flow first and foremost to the people and communities who create it.
References
African Union. (2023). Draft protocol on intellectual property rights to the Agreement Establishing the African Continental Free Trade Area (AfCFTA).
Beijing Treaty on Audiovisual Performances, June 24, 2012.
Berne Convention for the Protection of Literary and Artistic Works, Sept. 9, 1886 (as revised).
Del‑York Creative Academy, & Youth in Animation and Post‑Production Initiative. (2025). Intellectual property protection in Nigeria’s creative future: A position paper on audiovisual, animation, post‑production and AI‑assisted creation. Del‑York Creative Academy.
Federal Republic of Nigeria. (1999). Constitution of the Federal Republic of Nigeria (as amended).
Federal Republic of Nigeria. (2004). Trademarks Act (Cap. T13 LFN 2004).
Federal Republic of Nigeria. (2004). Patents and Designs Act (Cap. P2 LFN 2004).
Federal Republic of Nigeria. (2011). Evidence Act (No. 18 of 2011).
Federal Republic of Nigeria. (2015). Cybercrimes (Prohibition, Prevention, etc.) Act, 2015.
Federal Republic of Nigeria. (2020). Companies and Allied Matters Act, 2020.
Federal Republic of Nigeria. (2022). Copyright Act, 2022.
Federal Republic of Nigeria. (2023). Nigeria Data Protection Act, 2023.
World Intellectual Property Organization. (1996). WIPO Copyright Treaty (WCT).
World Intellectual Property Organization. (1996). WIPO Performances and Phonograms Treaty (WPPT).
World Trade Organization. (1994). Agreement on Trade‑Related Aspects of Intellectual Property Rights (TRIPS).