Article: Practical AI and Ethics – Lessons from Our February Workshop


Artificial Intelligence is changing how charities and community organisations work. Our Practical AI Implementation and Ethics workshop on 11 February 2026, led by Kate Watson of AskMrsWatson.com, helped local charities and community groups understand how to use it safely, responsibly and with confidence.

The session mixed live demonstrations of tools such as ChatGPT and Microsoft Copilot with open discussions about ethics, privacy, and digital wellbeing. Participants saw how these tools can help reduce repetitive tasks like drafting reports, bids or meeting minutes, while keeping the human voice and judgement that funders and communities trust.

Top tips with practical examples

Treat AI like a keen trainee

  • Use AI to beat “blank page” nerves by asking it to suggest structures or bullet points for a funding bid, newsletter, safeguarding update or meeting agenda, then fill in the real details yourself.
  • Think of the output as a rough draft: keep what is useful, delete what is wrong or bland, and add your own examples, local context and lived experience.
  • For example, one practical suggestion was to ask AI for a simple three-part structure for a funding bid (need, activity, impact), then write the actual content in your own words.

Always keep a human in the loop

  • AI can sound confident while being completely wrong, so always check key facts, statistics, legal references and links before using them.
  • Make sure someone with subject knowledge reviews important outputs such as policies, funding bids, public communications or anything involving risk.
  • A useful habit is to build checkpoints into your process: for example, “AI draft, staff edit, final review by manager or subject specialist.”

Use AI to tidy notes and draft documents

  • You can photograph handwritten meeting notes and ask AI to turn them into typed minutes, action lists or a summary for trustees or volunteers.
  • AI can also suggest headings and bullet points for policies (such as AI use, safeguarding or data protection), which you then adapt to your organisation’s existing templates and tone.
  • Use AI to create one-page summaries of longer documents, highlight changes between versions, or produce overviews tailored to different audiences.

Protect people and data

  • Do not paste counselling notes, case records or commercially sensitive information into public AI tools; use anonymous or fictional examples instead if you want to practise.
  • Prefer organisation-approved tools and accounts, with clear rules about what staff and volunteers can and cannot upload.
  • If you use AI to help with recruitment, funding or other high-impact decisions, make sure you have strong checks for fairness, bias and accuracy and clear human accountability.

Write for funders, not for the tool

  • Funders increasingly recognise the signs of generic AI text. They want to hear your organisation’s real voice, stories and outcomes.
  • A helpful pattern is to use AI for structure, lists of possible points and wording ideas, but then rewrite in your own style and add local examples and real results.
  • You can also ask AI to suggest questions to check whether a draft still feels “from the heart” and clearly linked to your mission.

Build simple rules and prompts for your team

  • An AI usage policy does not have to be complicated: it can cover what tools are allowed, what data must never be shared, who reviews what, and how often the policy is checked.
  • Creating a small “prompt library” can help staff get started: for example, prompts for “rewrite this email in a calm, clear tone,” “summarise this policy for volunteers,” or “suggest three headings for a newsletter article.”

Take security and scams seriously

  • Criminals can now use AI to create very convincing emails, fake voices and fake videos, so spelling mistakes are no longer a reliable warning sign.
  • One practical tip from the workshop was to agree a “safe word” or verification process with key staff and family members so that urgent requests for money, passwords or gift cards are always checked by phone or through a second channel.
  • Organisations should also keep good backups, use two-factor authentication where possible and run simple awareness activities, such as example phishing emails, to build staff confidence.

Remember accessibility and the environment

  • Faster content creation increases the need to check accessibility: add headings, alt text, captions and readable layouts to anything you publish or share.
  • AI tools rely on large data centres that use energy and water, so it helps to keep prompts focused, avoid unnecessary generations and choose tools and providers that take sustainability seriously.

How organisations can put this into practice

  • Drafting or updating an AI usage policy and linking it to existing policies such as data protection, social media and incident response.
  • Using AI to create first drafts of newsletters, agendas, risk assessments and summaries, then editing for accuracy, accessibility and tone.
  • Setting up simple training and support, such as a shared prompt library, mentoring for beginners and regular conversations about what is and is not working.
  • Planning for security incidents such as phishing and ransomware, with clear checklists, offline backups and communication plans.

Kate also encouraged everyone to see digital skills as part of wellbeing and fairness, helping staff and volunteers feel less overwhelmed and more able to focus on the parts of their work that need human judgement, empathy and creativity.

Further Support

VODA’s draft AI policy is available to organisations in the North Tyneside voluntary, community and social enterprise sector; for a copy and additional support, email development@voda.org.uk.

 
