Indie Game Awards AI Rules: What Devs Must Know to Submit

Let's cut to the chase. If you're an indie developer eyeing a submission to major festivals like the Independent Games Festival (IGF) or The Game Awards' indie categories, you've probably heard the rumblings. New rules about artificial intelligence are popping up in submission guidelines, and they're causing a mix of confusion, anxiety, and outright frustration in Discord servers and Twitter threads. I've been through the submission grind myself, and now, watching from the other side, I see teams making the same avoidable mistakes. The core issue isn't that awards are "anti-AI." It's that they're scrambling to define what ethical, transparent AI use looks like in a creative field built on originality.

What Are the Indie Game Awards AI Rules?

It's not one universal law. Different awards have different phrasing, but they converge on a few key principles. The Independent Games Festival (IGF), for instance, updated its rules to require disclosure of AI-generated content. The Game Awards haven't published explicit AI-only rules yet, but their general rules emphasize originality and ownership, which now implicitly covers AI. The common threads are transparency and provenance.

Most rules ask you to declare if any significant portion of your game's assets—code, narrative, artwork, audio, music—was created using generative AI tools. "Significant" is the tricky word. Using an AI tool to upscale a final texture? Probably fine. Using it to generate all your character concept art from prompts? That's what they want to know about.

The Core Ask: Awards bodies want to understand the human creative footprint in your game. They need to assess originality and ensure submissions don't infringe on copyrighted material used to train the AI models. It's a due diligence step, not an automatic disqualifier.

Breaking Down the Specific Requirements

Based on current guidelines and insider conversations, here’s what you're likely being asked to account for:

  • Asset Disclosure: Listing which assets (2D art, 3D models, dialogue, sound effects, code snippets) involved AI generation.
  • Tool Identification: Naming the specific tools used (e.g., Midjourney v6, Stable Diffusion, ChatGPT (GPT-4), GitHub Copilot).
  • Workflow Description: Briefly explaining how the AI output was integrated, edited, and transformed by the human team. This is crucial—it shows creative direction.
  • Training Data Acknowledgment: Some forms may ask about your knowledge of the tool's training data, especially regarding copyright and consent. This is the murkiest area.

Why These Rules Matter (It's Not Just About Cheating)

On the surface, it's about fairness. No one wants a bot to win a "Best Narrative" award. But dig deeper, and there are legal and philosophical currents at play.

I talked to a colleague who sat on a recent jury. Their biggest headache wasn't spotting AI art; it was the legal uncertainty. If a game winning a major award later faces a copyright lawsuit because its AI-generated assets were trained on copyrighted works without permission, the award's credibility takes a hit. They're insulating themselves.

There's also the preservation of the indie spirit. Awards like IGF were founded to celebrate the unique, personal voice of small teams. When a team uses AI as a collaborative brush, that voice can still be strong. When AI becomes a replacement for core creative decisions, that voice gets muffled. The rules are an attempt to find that line.

A mistake I see: Developers think "disclosure" means "penalty." It doesn't. Non-disclosure, however, is a surefire path to disqualification if discovered later. Honesty is your only viable policy.

A Practical, Step-by-Step Guide for Your Submission

Let's make this actionable. Imagine you're part of "Pixel Forge," a team of three making a retro-styled RPG. You used AI for some background concept art and to help brainstorm dialogue branches. Here’s your game plan.

Step 1: The Pre-Submission AI Audit

Before you even look at the submission form, conduct an internal audit. Go through every asset bucket. This isn't a one-hour task. Create a simple spreadsheet:

| Asset Type | Specific Asset | AI Tool Used? | Degree of Human Modification | Notes for Submission |
| --- | --- | --- | --- | --- |
| Concept Art | "Forest Temple" mood board images | Yes (Midjourney) | High. Used as base, heavily painted over in Photoshop; color palette changed, composition altered. | Disclose as AI-assisted concept. Provide a 1-2 sentence description of workflow. |
| Dialogue | First draft for side-quest NPCs | Yes (ChatGPT) | Medium. AI generated verbose paragraphs; we edited for brevity, added character-specific slang, and rewrote 70% of lines for pacing. | Disclose as AI-assisted writing. Emphasize human rewriting and direction. |
| Code | Procedural dungeon generation algorithm | Yes (GitHub Copilot) | Low. Used for code suggestions and boilerplate; core logic and architecture are original. | Disclose as AI-assisted programming. Standard practice, low risk. |
| Music | Main theme melody | No | N/A. Composed by our team member. | No disclosure needed. |
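If spreadsheets aren't your thing, the same audit can live in your repo as a versionable CSV. Here's a minimal Python sketch; the column names, asset names, and row contents are illustrative (modeled on the fictional "Pixel Forge" team above), not fields from any official submission form:

```python
import csv
import io

# Columns mirroring the audit table: one row per asset.
FIELDS = [
    "asset_type",
    "specific_asset",
    "ai_tool_used",
    "human_modification",
    "submission_notes",
]

# Hypothetical example rows for the fictional "Pixel Forge" team.
ROWS = [
    {
        "asset_type": "Concept Art",
        "specific_asset": "Forest Temple mood board",
        "ai_tool_used": "Midjourney",
        "human_modification": "High: painted over, palette and composition changed",
        "submission_notes": "Disclose as AI-assisted concept",
    },
    {
        "asset_type": "Music",
        "specific_asset": "Main theme melody",
        "ai_tool_used": "None",
        "human_modification": "N/A: composed by team member",
        "submission_notes": "No disclosure needed",
    },
]


def write_audit_csv(rows, fileobj):
    """Write the audit rows as CSV so the disclosure is diffable and versionable."""
    writer = csv.DictWriter(fileobj, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)


buf = io.StringIO()
write_audit_csv(ROWS, buf)
print(buf.getvalue())
```

Committing this file alongside your assets gives you a dated, diffable record of exactly when and how AI entered your pipeline, which feeds directly into Step 3 below.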

This audit does two things: it prepares your disclosure, and it forces you to critically evaluate your own creative process. How much of "your" game is truly yours?

Step 2: Crafting Your Disclosure Statement

Don't write a novel. Be concise, clear, and factual. Use the language from your audit.

Bad Example (Vague & Defensive): "We used some AI like everyone else, but we changed it a lot so it's fine."

Good Example (Clear & Professional): "Our project utilized AI tools in specific, limited capacities. Midjourney was used to generate initial concept art for environmental design, which was then extensively edited and painted over by our artist. ChatGPT assisted in brainstorming dialogue for secondary characters, with all output being heavily revised and rewritten to fit character voice and narrative tone. GitHub Copilot was used for code completion. All core creative direction, design, art finalization, narrative, and code architecture originated with our human team."

Step 3: Document, Document, Document

Keep your audit spreadsheet. Save early AI outputs and your finalized versions side-by-side. This isn't just for the awards; it's for your own portfolio. It demonstrates your skill in directing and refining AI, which is becoming a valuable skill in itself.

The Subtle Pitfalls Most Indie Teams Miss

Here's where my decade of watching submissions pays off. The obvious mistake is hiding AI use. The subtle ones are more interesting.

Pitfall 1: The "Style Mimicry" Trap. You use an AI tool fine-tuned on the art style of a famous indie game (e.g., "in the style of Hollow Knight"). Even with editing, the award jurors—who have played everything—will spot the derivative core. It kills originality points instantly. Use AI for inspiration, not imitation.

Pitfall 2: Over-reliance on AI Narrative Tools. AI can generate functional dialogue, but it often lacks subtext, unique voice, and thematic depth. If your "Best Narrative" submission relied heavily on AI for key story beats, jurors will feel the emotional hollowness. The human touch in storytelling is still the differentiator.

Pitfall 3: Ignoring the Tool's Terms. Did you know some AI tools claim partial ownership of their output? Or restrict commercial use? If you didn't read the terms of service of the AI tool you used, you might not fully own what you're submitting. That's a direct violation of award rules requiring you to own your IP. Always check.

The Future: Where Are Indie Game Awards AI Rules Heading?

This is just the first draft. The rules will evolve from blanket "disclosure" to more nuanced categories. I predict we'll see:

  • Dedicated AI Categories: Awards for "Best Use of Generative AI in Game Development" that celebrate the toolcraft itself.
  • Technical Standards: Push for AI tools to include embedded metadata (like a creative commons license for training data) that travels with the asset, making provenance clear.
  • Increased Scrutiny on Training Data: Awards may start favoring tools trained on ethically sourced, licensed, or public domain data. The SPDX standard for software bill of materials might inspire similar "AI bill of materials" demands.
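No such "AI bill of materials" standard exists yet. Purely as a thought experiment, a per-asset provenance record might look like this; every field name here is invented for illustration and comes from no award body, SPDX, or any other standard:

```python
import json

# Hypothetical per-asset provenance record. All field names are invented
# for illustration -- no award body or standard currently defines this.
ai_bom_entry = {
    "asset": "textures/forest_temple_01.png",
    "generator": {"tool": "Midjourney", "version": "v6"},
    "training_data_license": "unknown",   # still the murkiest field today
    "human_modification": "high",         # e.g. none / low / medium / high
    "prompt_logged": True,
}

# Minimum fields a record would need to make provenance assessable.
REQUIRED_KEYS = {"asset", "generator", "training_data_license", "human_modification"}


def validate_entry(entry):
    """Return True if the entry carries the minimum provenance fields."""
    return REQUIRED_KEYS.issubset(entry)


print(json.dumps(ai_bom_entry, indent=2))
print(validate_entry(ai_bom_entry))
```

The point of the sketch is the shape, not the syntax: a machine-readable record that travels with the asset would let juries (and lawyers) check provenance without interrogating the team.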

The goal isn't to freeze out AI. It's to integrate it responsibly into the creative evaluation framework. The indie game community has always been about innovation, and this is the next frontier to navigate.

Your Burning Questions, Answered

If my game used an AI tool that's just a fancy Photoshop filter (like an AI upscaler or denoiser), do I need to declare it?
Most guidelines focus on generative AI—tools that create new content from prompts. Purely assistive or enhancing tools (upscalers, denoisers, smart fill for removing objects) are generally considered part of the standard digital art toolkit. You're not generating new artistic content; you're refining existing human-created work. When in doubt, a brief note like "Standard digital enhancement tools (including AI-based upscaling) were used in asset polishing" shows transparency without overcomplicating things.
What's the real risk of using an AI model trained on copyrighted art? Could our game get disqualified months later?
The risk is shifting from the award body to the legal realm. The award might disqualify you if a copyright infringement is proven, but the bigger threat is a lawsuit from rights holders. The current legal landscape, as seen in cases like Getty Images v. Stability AI, is untested for final game assets. The awards' rules are a pre-emptive shield for them. Your best mitigation is to use AI output as a compositional starting point and transform it significantly with your own art direction. The more derivative the final asset is of a specific AI training input, the higher the risk.
We're a tiny team. AI let us create a visually rich world we couldn't have made otherwise. Will judges hold that against us compared to a team with a full-time artist?
This is the core dilemma. A savvy jury will judge the final product's coherence, style, and artistry. If your AI-assisted art is seamlessly integrated and serves a strong, unified creative vision, it will be evaluated on its merits. However, if the art feels like a patchwork of different AI styles or lacks a cohesive hand, it will score lower on artistic achievement. The "how" matters more than the "what." Use your submission statement to frame the narrative: "As a two-person team, we used AI to expand our artistic scope, but maintained strict creative control over palette, composition, and world-building to ensure a cohesive experience." Turn your constraint into part of your story.
Are there any major indie awards that currently have no AI rules at all?
As of now, explicit, detailed AI rules are still emerging. However, the overarching principle of originality and ownership in every award's terms and conditions implicitly covers AI-generated content. Submitting an asset you don't fully own or control is a violation. It's safer to assume all major awards now require some level of disclosure regarding generative AI. Always, always read the latest official rules for the specific award you're targeting. Don't rely on forum posts from six months ago.

The landscape is changing fast. Last year's controversy is this year's submission checkbox. By understanding the intent behind the Indie Game Awards AI rules—promoting transparency, protecting originality, and navigating legal gray areas—you're not just complying. You're positioning yourself as a thoughtful, ethical creator in a new era. That's a reputation worth building, award or not.

Now go check your asset folders.