Fundraising finding: Revealing AI use can hurt nonprofit results and relationships (unless you explain why)

AI tools can save time and money. Many teams use them every day. But new 2026 research shows a real downside for charities: donors react badly when a nonprofit says it used AI.

Here’s what the experiments found.

  • First test: Same charity appeal on Facebook. Same image. One change. One group saw a note that the image was AI-generated. Clicks fell by about 25%.
  • Next test: People read a donation appeal. Half were told the nonprofit staff wrote it. Half were told the nonprofit used AI to write it. When the charity revealed its use of AI, the intent to donate dropped. Why? People rated the AI-use charity as less authentic, less genuine, and less sincere. They also felt the charity cared more about hitting external financial targets than helping people.
  • Third test: Only the nonprofit description changed. One version said staff handles routine tasks (like targeting donors and designing fundraising activities). The other said the nonprofit often uses AI tools for these tasks. Again, donations fell when AI use was revealed. The reason was the same: lower perceived authenticity and more external financial motives.

So how do you fix it? Add a purpose statement.

In a follow-up test, the charity explained it used AI “to reduce costs and provide more help for children.” That one phrase removed the negative effect.

One more important finding: This penalty applied to charities, not companies.

In the last experiment, one group could pay any amount to a company for a canvas bag; whatever they paid went to a children’s charity. The other group received the bag directly from the charity in exchange for a donation of any amount. Revealing AI use hurt contributions when the charity ran the bag campaign. The same AI disclosure had no effect when the company ran it.

Bottom line: Donors hold charities to different rules than businesses. The rules are different because the relationship is different. Neurologically, charitable decisions engage brain regions used in relationships with close friends and family. Purely financial decisions don’t. If a company uses AI to reach out to us, we expect that. If a friend uses AI to reach out to us, that’s disappointing. That’s not what a real friend does. It can feel like they’re “less authentic, less genuine, and less sincere.” Unless, of course, they explain why they needed to use it. Then, it’s fine.

Practical takeaway:
If you disclose AI use, don’t stop there. Say why you used it. Tie it to mission or relationship. Make the reason concrete: lower costs, faster help, more services delivered.
And anytime you’re thinking about a new fundraising tactic, start with this: “What would a good friend do?” If it fits, you’re safe. If it doesn’t, be careful!

Link to original study: Zhao, Y., Zhou, P. P., & Chen, Z. (2026). AI’s Hidden Price: AI Tools Reduce Donor Engagement Through Extrinsic Motivation Inferences. Journal of Business Ethics, 1-18. https://link.springer.com/article/10.1007/s10551-026-06247-2


Russell James, J.D., Ph.D., CFP®️ is a professor at Texas Tech University. He directs the on-campus and online graduate program in Charitable Financial Planning and also teaches Charitable Gift Law at the Texas Tech University School of Law. Dr. James has over 100 publications in academic journals, conference proceedings, professional periodicals, and books including 20 on neuroimaging and neuroeconomics. He has been quoted in a variety of news sources including The New York Times, The Wall Street Journal, CNN, MSNBC, CNBC, ABC News, U.S. News & World Report, USA Today, the Associated Press, Bloomberg News and the Chronicle of Philanthropy.
