Artificial intelligence (AI) is increasingly being adopted across not-for-profit workplaces, but alongside the productivity gains from this new technology comes a growing concern: what is the cognitive cost for the people using it?
Research shows that relying heavily on AI can degrade critical thinking and intellectual independence – precisely the skills NFPs need to solve complex problems.
So how can NFP leaders model and embed responsible AI use for their teams while still benefiting from the productivity gains AI can offer?
How are NFPs using AI now?
Beth Kanter is an internationally recognised thought leader on digital transformation in NFPs.
She says that NFPs are most commonly using generative AI for creating content, analysing data, summarising meeting notes, conducting research and brainstorming ideas.
Research shows that AI tools are also helping NFPs identify and reach potential funders, prepare board packs and management reports, personalise communications at scale and extract insights from large datasets – tasks that were once highly laborious for many resource-constrained organisations.
How much thinking should you hand over to AI?
Every time you open an AI tool, you face a small but important decision: what will you do yourself, and what will you hand over to AI?
That choice, says Kanter, shapes your brain over time.
“The more you hand over your thinking to AI, the less practice your brain gets at thinking,” she says. “Thinking is a skill, and like any skill it can get rusty.”
Delegating thinking to AI is an example of cognitive offloading, which essentially means using something external to do mental work for you.
The term also applies to actions like storing phone numbers in a device, using calculators, relying on calendars and following GPS for directions – all of which are generally considered “healthy” forms of cognitive offloading.
Generative AI goes much further in that it doesn’t just store information or handle routine tasks; it generates analysis, proposes ideas and reworks entire bodies of text – tasks that would otherwise demand a deeper level of mental effort and critical thinking from humans.
This is where the risk of unhealthy cognitive offloading often begins.
“The tricky part is that when we’re under pressure in the workplace, we’re often the worst judges of whether we’ve crossed the line into unhealthy cognitive offloading,” Kanter says.
“A 2025 study from Carnegie Mellon found that as people gain confidence using AI, they delegate more, their brains become less engaged and they use less critical thinking – especially when automating tasks they used to do themselves.
“What we need to avoid is the habit of mindless productivity with AI – letting it do all of our thinking instead of using it to stretch our thinking.”
She adds that the best practice for retaining cognitive skills in the workplace is to avoid default delegation for every task and engage your brain in problem-solving before using AI.
For NFP leaders, this means setting simple norms around AI use, such as:
- “Think first, prompt second” – encouraging staff to outline the problem they have or jot down key ideas for a brainstorm before turning to AI to expand on them.
- Use AI to challenge thinking – prompting AI to critique a plan, suggest alternative approaches or identify gaps in an argument.
- Treat outputs as drafts – requiring actual people to review, edit and fact-check AI-generated text before it is shared or published, especially for anything public-facing.
What should you do with all that saved time?
Of course, the big boon from using AI for more routine purposes is all that glorious time it gives back. While studies vary, many suggest the savings can amount to roughly one hour per workday.
Kanter refers to these savings as the “time dividend”. When she asks NFP employees how they would best like to use this time, the most common responses she gets include building deeper relationships, focusing on self-care, and making space for more strategic thinking and learning.
But in practice, she says, the time dividend often disappears into more of the same work.
“Berkeley researchers recently followed a company for eight months after it adopted AI, and what they found was that staff voluntarily took on more tasks, worked through breaks, and were multitasking constantly because AI made doing the work feel effortless and fast,” she says.
“Nobody told them to overwork – they just did it, and it was leading to burnout.”
For NFP leaders adopting AI, the challenge is to ensure the time dividend is deliberately reinvested in human-centred activities.
A good place to start is by talking openly with staff about the time AI is saving and agreeing on how some of that time should be used. It could go to:
- Relationship building – having deeper conversations with workmates, donors, volunteers and community partners about where they’re at and how they can work more impactfully together.
- Strategic thinking – making space for individual reflection on a project or issue, or running team brainstorming sessions to map out ideas and direction.
- Self-care and burnout prevention – encouraging people to step away from their screens, take proper breaks or go for a walk, rather than filling every spare moment with more work.
What skills matter most when working with AI?
Kanter says the skill of discernment is becoming increasingly important in AI-assisted workflows.
“Human and AI collaboration comes down to the skill of figuring out who does what, and how much human agency a task needs,” she says. “There is also the skill of knowing when not to use AI at all. Discernment has become a core AI fluency skill.”
She gives the example of hiring a new team member to illustrate the various levels of human-AI collaboration and where human judgement remains central.
“To create the job description, you might provide AI with some rough thinking, notes from a meeting and examples of previous descriptions,” she says.
“AI may produce or revise a draft from that, but a human needs to refine it for culture and fit. This is what we call AI augmentation – using artificial intelligence to enhance human capability rather than replace it.
“When it comes to choosing who to hire, though – that remains a human-only task.”
Many of the skills Kanter believes are essential for working effectively with AI are not technical, but human – including judgement, emotional intelligence, adaptability, ethical decision-making and cultural sensitivity.
“There are also metacognitive skills that are needed,” she adds. “Such as self-awareness, self-regulation and understanding how you think.”
How NFP leaders can model responsible AI use
Kanter offers the following tips to NFP leaders for using AI while staying firmly in the driver’s seat:
1. Define your human-only tasks
Identify the decisions only you can make – the ones that require your unique judgement and that make you a better leader by doing them yourself.
Ask: Will doing this task strengthen my leadership? Does it require my judgement? Am I avoiding thinking?
If a task strengthens you as a leader, or builds your professional “muscle”, avoid delegating it to AI so you don’t lose the ability to navigate complex human dynamics yourself.
2. Design your AI workflow deliberately
Use personalised instructions in your AI tool of choice to guide how it interacts with you. For example, you might create a workflow that requires you to share a messy draft before the AI produces a polished answer, so your own thinking comes first.
You might also instruct the AI not to offer praise, which can make some people more likely to accept answers without questioning them.
If you work in a team, consider sharing your thought-through AI workflow at your next team meeting. Explain why particular tasks require your unique human judgement and the value that brings.
And invite your team to do the same. Ask them which parts of their roles they believe should remain “human-only” to preserve the integrity of your mission.
3. Use physical thinking tools
Step away from the screen when you need clarity. Take a walk, practise recalling information instead of immediately asking your AI tool, or write ideas and interview notes by hand.
Research shows handwriting can improve thinking, learning, memory and clarity. So rather than defaulting to AI note-taking tools when you’re interviewing job applicants, consider making the extra effort to take notes by hand.
4. Use AI to organise ideas, not replace them
Bring your own rough thinking into the conversation when you use AI tools. Rather than asking the AI to write things for you, paste in your messy notes or a “brain dump” and ask AI to help organise themes, group ideas into categories or ask questions that help refine your thinking.
–
AI is probably the most powerful technology invented in the 21st century so far, but because it’s still in its infancy, there’s a long way to go before we know the best way to use it in every situation.
Responsible AI adoption isn’t just a technical shift; it’s a cultural one. If your team sees you using AI to “shortcut” deep thinking, they’ll likely follow suit. But if you choose to use it only to sharpen your thinking, you can help set the standard for excellence in your organisation.
Not-For-Profit People is an initiative of EthicalJobs.com.au — Australia’s top job-search site for the not-for-profit sector and beyond. 10,000 Australian charities, not-for-profits and social enterprises use EthicalJobs.com.au to find dedicated and passionate staff and volunteers to help them work for a better world. Find out more at EthicalJobs.com.au/advertise