How to Make the Most of Artificial Intelligence and Other Technologies: Advice From Experts

MAR 28, 2022

Technology is often presented as the solution to many problems for nonprofits: reducing staff burnout, targeting fundraising more effectively, and improving budgeting, to name just a few. It can help with all those things, but there are pitfalls to avoid.

The Chronicle invited tech experts Beth Kanter and Allison Fine, co-authors of The Smart Nonprofit: Staying Human-Centered in an Automated World, to a virtual forum to help nonprofit professionals better understand where investments in technology make the most sense and how to avoid some of the traps that ensnare the unwary. The session, Smart Tech: How to Use AI and Other Advances to Meet Your Mission, was hosted by Margie Fleming Glennon, director of learning and editorial products for the Chronicle.

As just one example of the power of technology, Fine cited the Rainforest Action Network, which used technology and reams of data to analyze the interests of new donors. It then reached out to them in targeted ways intended to turn them into monthly donors, with phenomenal success: the number of monthly donors grew by 866 percent.

“We know that hitting the jackpot with donors is getting them to move from being a one-time donor to being a monthly recurring donor,” Fine says. “By customizing the communications, like what kind of story would interest this person, they were able to make that leap for those donations.”

Read on for highlights of the discussion, or watch the video to get all the insights Kanter and Fine shared.

Don’t let past bad experiences get in the way. Fine says that nonprofits’ experience with social media, which can be very “noisy” and produce a lot of data that isn’t always helpful, may have soured them on the next waves of technology.

“We know technology’s not a panacea for the problems that organizations have,” Fine says. But Kanter and Fine say that the right kinds of technology, if applied well and monitored carefully, can improve a nonprofit’s fundraising while making life better for its employees. Technology is too powerful and its potential to improve the world too great for anyone to sit back and say it’s not their thing, Fine says.

Kanter adds, “We have this great moment, this once-in-a-lifetime opportunity to remake, revitalize, and rehumanize nonprofit work, and we’ll all benefit.”

Start small and learn as you go. If you freeze up when the topic turns to tech, do whatever is necessary to get comfortable with it, says Fine. Find a friend who can mentor you. Read a book. Take advantage of the learning opportunities offered by NTEN, a group of nonprofit professionals focused on technology. “You cannot just leave the idea of automating systems and processes just to technical people,” says Fine.

As you get started, take the time to make sure the technology is ready when you roll it out. Test it on a small group of users to get their feedback and make improvements, says Kanter.

Be as selective about technology as any other aspect of your organization. Do your own research, check online reviews, and ask peers about their experience with any tech product.

For example, Kanter says several automated programs can be bought to analyze websites and recommend ways to make them more accessible to people with disabilities. A nonprofit she was advising discovered through a simple online search that the seller of a product the nonprofit was considering was the subject of lawsuits filed by disability-rights organizations citing problems with the algorithm the software used.

The lesson, says Kanter: “The technology changes, but the due diligence doesn’t.”

Fine cautions nonprofits to avoid any product where the vendor refuses to explain how the tool was built.

“If they say, ‘Oh, that’s proprietary, it’s a black box, you can’t look,’ then I say, ‘No, I’m not going to work with you,’” says Fine. “There are plenty of other, you know, places I can go. I need to know what assumptions were built into this product, and what data sets were used to train it, to see what problems we might have with it.”

Be aware that software and data are not always value neutral. Using the latest software and the most robust data sets available isn’t enough to ensure fair processes and outcomes, Fine and Kanter say. “This is a leadership challenge, not a technical challenge,” says Fine.

For example, she says, a tool intended to help your human-resources department screen résumés may have biases that reinforce old, unfair methods of hiring.

“It may have, built into the code by some coder at some point, assumptions about race and gender,” says Fine. “Those data sets, particularly in the social sector, have historically been racist in, say, housing or food benefits or hiring. The language that we’re using for job descriptions not only gets people in, but it keeps people out as well.”

Keep humans involved. Smart technologies are meant to assist you, not take over jobs entirely. Beyond watching for bias in how technology is applied, humans are often needed to make sense of the data collected and to decide how best to apply it.

Kanter gave the example of the Trevor Project, which provides a crisis line and counseling for LGBTQ youths. The nonprofit, facing a shortage of trained counselors, created a bot named Riley that uses sophisticated technology to learn as it interacts with people. “But they didn’t use it to replace the counselors on the front line, who work directly with youth, because they saw that piece of the job as being very human centered,” says Kanter.

Instead, Riley was used to help train those counselors by simulating common questions they were likely to encounter. “It balances letting the counselors do the human work that they do so well and letting the bot help train them,” Kanter says.

Chatbots also can assist fundraisers in determining where best to target their outreach. Bots can efficiently answer thousands of basic questions from online visitors to a nonprofit, and they quickly “come back with some suggestions to the fundraiser, so they can then shift their time into actually working with the donor, cultivating the donor, and maybe not exhausting themselves looking at so many open-ended comments,” says Kanter.

Be vigilant about the ethics of technology as you step up its use. As an example of where technology can lead organizations astray, Fine cited facial-recognition technology that has been used to track and trace Covid. Some of that same technology has been misused by law enforcement in ways that negatively affect people of color, so nonprofits must be wary of abuse.

Organizations also must be wary of how much data they are compiling on donors and clients, Fine says, and how those people may or may not want their personal data stored and used.

“We want nonprofits to raise the bar and say, what is the most we can do to protect our users’ privacy, to use the technology responsibly and well, to make sure that the technology is not out in front of our people and that the bots aren’t overwhelming the humans in our system,” says Fine.

Get started now, and don’t be intimidated. Take one small step at a time, advises Fine. Check out some of the available chatbots, or software that can automate budgeting tasks. Kanter calls this approach “learning snacking.”

“The technology is becoming very quickly commercialized and inexpensive, and stupid simple to use, so this is not going to be technology that only an advanced Ph.D. can use, which it was until just a few years ago,” Fine says. “This is technology for everyday use.”



Dan Parks

Dan joined the Chronicle of Philanthropy in 2014. He previously was managing editor of Bloomberg Government. He also worked as a reporter and editor at Congressional Quarterly.
