AI Governance in Practice, Part 1: A View from the Non-Profit Sector
This is the first of two commentaries providing practice-based insights into how AI is being used and governed around the world. In Part 1, I had the pleasure of speaking with Claudia Juech of Bloomberg Philanthropies' Government Innovation portfolio on January 18, 2023 about how AI is being put to work in the non-profit world. This is part of an EGG commentary series exploring how AI’s development is affecting economic, social and political decision-making around the world.
Much has been made of AI’s potential to enhance social good, and commentators have suggested that non-profits, as proponents of social good, may reap benefits as well. For instance, machine learning could help enhance fundraising or automate repetitive tasks, thereby freeing up resources and personnel for other work. But how does this play out in practice?
Claudia notes that a crucial starting point for understanding the trajectory of AI in non-profits is to clarify what kinds of data analysis are needed to achieve their strategic goals. “It’s sometimes machine learning, sometimes more sophisticated statistical methods or data science, but self-learning, imitating human behavior? I would be very hard pressed to point to non-profit organizations I have worked with” that use these advanced AI technologies.
In fact, the use of AI technologies in non-profits is very much an ongoing process. Claudia noted that, in the initial phases, there was a dichotomous view of these technologies. Some actors, drawing on the hype about AI as a tool and looking towards tech companies and the private sector, “believed that there’s outsized potential to realize opportunities and impact” by using AI. Others, often involved in advocacy or with concerns about surveillance capitalism, were more hesitant, worrying about data rights and privacy violations that would accompany these technologies. These “camps weren’t talking to one another in the beginning” but have now “become more realistic and less entrenched,” focused on how you can use AI while “adhering to high standards of ethical and responsible behavior.”
Challenges of AI in practice
Nonetheless, Claudia says her conversations with more than 400 partners around the world over the past six years have revealed that, despite evolving understandings of AI and its social uses, many non-profits continue to face significant challenges in employing these technologies. Sometimes there are issues of organizational culture, as employees question whether these technologies are the “best way how we can help people.” Other challenges relate to problems of implementation. “Many [non-profits] are very early in their digital journey. They don’t have a digital strategy. They have to develop their vision of data use across the organization. They, especially the smaller ones, have very few resources, whether financial or talent.”
Funding comes from a variety of sources. For instance, some foundations provide non-profits with funding because they’re focused on technological issues, others because a project could benefit from AI. These philanthropic pots of money “are often small…but tend to be the most flexible” from a use perspective. Public agencies offer larger grants. However, these “might require more of an administrative effort on behalf of the nonprofits and processes might take more time.” Alternatively, they may be too large for non-profits unless working through a collective. The private sector offers resources as well, often as part of corporate social responsibility initiatives. These are diverse, including money, products, software licenses – resources which Claudia describes as “very common and non-negligible.” Where the money comes from can affect how AI is used in the short term, the extent to which human capital is developed, and strategic implications for employing AI in future projects.
Other challenges arise from the approach a non-profit takes when choosing a vendor as well as the conditions under which help is received and implemented. Like other actors, non-profits face strong pressure to choose a vendor based on price. As a result, the non-profit might “end up with a solution that is not the best for their problem” because that offer was accompanied by the best conditions. Relevant conditions include whether help comes in the form of a donation as well as whether “in-kind implementation help” is part of the deal. Moreover, in some instances, non-profits may “hire a consultant to manage a project because they might not have funding to bring someone on full-time.” This runs the risk of “losing that knowledge once the consultant leaves.” Even if they do develop sufficient technological skills during the project, “they have to think about how to deal with now being locked into a specific technology and the efforts it might take to migrate the solution to another vendor or product line” once project funding and license donations end.
Data security is another challenge looming large for non-profits using AI and digital technologies, albeit one not unique to non-profits. As Claudia noted, credit rating agencies also have data breaches, and we hear about them in the news for a few days after the leak. But such leaks are likely to have more permanent consequences for non-profits, both due to the nature of non-profits’ data and their financial dependence on others for their work. “When [a data breach] happens to a non-profit, it’s almost a permanent stain on their track record as these organizations deal with vulnerable populations and donor expectations are high.” Moreover, it can negatively affect multiple funding sources. “We have seen small individual donors as well as foundations adjust funding after data breaches (or in response to questions around data use).”
Finally, social and technological contexts affect how non-profits engage with AI and digital technologies. Claudia noted that, while in technologically more advanced African countries like Nigeria, South Africa or Kenya “there’s more activity evolving [regarding how non-profits can use technology], there are a lot of countries where even internet access continues to still be a problem in rural areas.” Non-profits around the world seek to “adopt solutions developed in the corporate sector, for example, in health,” but they must do so realizing that using them in some contexts will be more difficult than in others. For instance, in India, with its wealth of technological talent, there are global players such as the AAPTI Institute, which works on data governance issues alongside the Open Data Institute (ODI) in the UK and many others. Yet bringing these “world-class people and world-class efforts to a farmers’ collaborative or women’s collaborative in rural India” remains a challenge. Importantly, this is also true in the United States, where Claudia noted organizations like Pecan Street also seek to provide farmers with data products. Nonetheless, a relative lack of technological skill and infrastructure may make such challenges more difficult to solve in developing countries than in other parts of the world.
What role for regulation?
The immensity of the opportunities and hurdles facing non-profits interested in using AI raises the question of whether and how regulatory bodies and governments can facilitate the former and mitigate the latter. A traditional approach toward regulation would see governments setting limits and creating incentives to encourage non-profits to use these technologies to amplify their impact. In fact, regulations like the EU’s proposed AI Act and its General Data Protection Regulation as well as (sub-)national spin-offs like the California Privacy Rights Act “are on non-profits’ minds […] and are part of their compliance work.” Similarly, we have seen initiatives like UNICEF’s Manifesto on Good Governance of Children’s Data, which addresses data privacy issues related to children. As Claudia notes, “those things have become quasi standards in the sector and others are definitely looking to that and applying that” in their work.
Yet non-profits are also taking different paths to influence government regulations and are setting their own standards through bottom-up processes. Larger non-profits, such as Germany’s Arbeiterwohlfahrt, are engaging deeply with issues of data use, considering both “how can they use these tools to increase their [social] impact, but also how can they play a role in the societal discourse around data.” In other settings, we see so-called “implementation agencies” going to countries in Africa without a fully developed data strategy and “implementing digital health solutions on the ground on behalf of the government. But they are doing it in this space where there is no GDPR, there is no kind of data privacy act and similar things. So, in a way these non-profits are setting quasi standards from the bottom up.”
Nonetheless, it’s important to note that, for non-profits, AI is “an instrument and not the solution for every problem. We need many different instruments, and data is just one of them. But it has enormous potential to change how we operate.” How? Claudia notes that much non-profit work “still treats a lot of people the same way” regardless of the social, medical or other challenges they are facing. “If we have more data, we can understand what’s working in certain contexts and work with people to deliver better solutions for them.”
Claudia Juech leads program development and program design for the Government Innovation portfolio at Bloomberg Philanthropies. As a former Senior Executive at the Rockefeller Foundation, the founding CEO of the Cloudera Foundation, and most recently, the VP of Data and Society at the Patrick J. McGovern Foundation (PJMF), Claudia has more than 15 years of experience leading programs and operations across geographies. At Cloudera and PJMF, she implemented a new foundation model designed to deliver the key elements nonprofits need to use data effectively: appropriate funding in combination with direct access to technology, data expertise, and long-term technical support.
Laura Mahrenbach is an adjunct professor at the School of Social Sciences and Technology at the Technical University of Munich. Her research explores how global power shifts and technology interact, with a special focus on the implications of this interaction for the countries of the Global South as well as related governance dilemmas at the national and global levels. More information available at www.mahrenbach.com. This work was supported by Deutsche Forschungsgemeinschaft (Grant No. 3698966954).