How to Build an Ethical AI Policy for Your Charity
- Alexander Reid
- Nov 18
- 2 min read
Most charities are already using AI informally — drafting emails, outlining reports, or summarising meetings. Without a clear policy, this can quickly become inconsistent, risky, or confusing for staff.
An ethical AI policy doesn’t have to be long. It should simply give people clarity, boundaries, and confidence so they can use AI safely and effectively.
1. What an AI policy should do
A good policy:
- explains your organisation’s approach to AI
- protects supporter trust and data
- gives staff simple guidance
- ensures humans stay responsible
It does not need to be technical or restrictive.
2. The essential components
Purpose & Principles
Focus on values: trust, accessibility, accountability.
Approved and unapproved use
List the types of tools staff can use (e.g., Copilot, approved chat tools) and those they shouldn’t (public AI tools that store data). Keep it general, not tool-by-tool.
Safe data handling
This is the most important section. Make it clear that staff must not upload:
- personal or supporter information
- confidential documents
- anything sensitive
Encourage the use of enterprise or closed versions where possible.
Human oversight
AI can assist with drafting, structure, and summaries — but humans remain responsible for accuracy, tone, and decisions.
Transparency
Include a simple statement for internal documents:
“This draft was assisted by AI and reviewed by [Name].”
This builds trust and avoids confusion.
3. Useful rules to include
Short policies work best when they offer clear, memorable rules, such as:
- No personal data in public AI tools.
- Always review AI output before using it.
- Keep tone aligned with organisational voice.
- Don’t present AI content as factual without checking.
- Update the policy every six months as tools change.
Simple, predictable guidelines reduce risk and encourage responsible use.
4. How to write it quickly
You can build a usable policy in a week:
1. Gather examples of how staff already use AI.
2. Identify risks (data, tone, accuracy).
3. Draft the five core sections above.
4. Share it with fundraising, comms, HR, and IT for feedback.
5. Add a 1-page “quick guide” for everyday use.
6. Publish it and introduce it at a team meeting.
Keep the tone supportive, not restrictive.
5. Why it matters
Charities rely on trust. AI can genuinely help teams save time — often several hours a week — but only when people understand how to use it safely.
A clear, ethical AI policy gives staff confidence to experiment responsibly and protects the organisation when tools evolve.
If your organisation wants help understanding the risks and opportunities of AI for charities, you’re welcome to join our next free Bitesize AI for Charities session:
👉 https://www.theaitrainingcompany.com/events/bitesize-ai-a-free-webinar-for-charities-3