From gig work to classrooms, Professor Nagla Rizk explains how feminist AI offers solutions to systemic bias in data and algorithms.
Can data be sexist? Can artificial intelligence discriminate? As AI has rapidly evolved over the past decade, researchers have revealed how biased data can disproportionately harm women and marginalized groups. Through the Access to Knowledge for Development Center (A2K4D) at the Onsi Sawiris School of Business and its flagship initiative, the MENA Observatory on Responsible AI, Nagla Rizk ’83, ’87, professor of economics and founding director of A2K4D, leads the Feminist AI Research Network’s MENA hub. The network works to develop inclusive AI systems that create opportunities while addressing inequality.
"If we don't adopt a feminist-sensitive approach to technology, we risk leaving behind a key part of the population."
So what is feminist AI? Rizk explained, “Feminist AI refers to the act of deconstructing oppressive systems, dismantling historic biases and ingrained inequalities, then building inclusive AI structures that are based on principles of justice, transparency, agency, pluralism and more.”
In short, it is the development of AI systems that ensure fairness across genders. Rather than simply identifying bias, feminist AI seeks to address injustices at the level of data and algorithm design.
Feminist AI is closely linked to the principle of “intersectionality,” the concept that social categorizations such as race, class and gender overlap to create interconnected systems of discrimination. “It is, in short, when oppression is linked and layered,” Rizk affirmed.
AI: Friend or Foe?
Human biases persist in AI. When models rely on large datasets, those biases can be amplified. Rizk focuses on identifying gaps, bias and invisibility in data — and addressing them at the root.
“Technology has the potential to advance development and inclusion, and to move us toward achieving the Sustainable Development Goals. At the same time, technology embodies a peril,” Rizk stated. “As humans build AI models — with data and algorithms at their core — every link in this chain carries the potential to trigger inequality. This could negatively impact women and marginalized groups, so it’s important to think about inclusion when designing AI models.”
"If we don't adopt a feminist-sensitive approach to technology, we risk leaving behind a key part of the population."
Data can be biased against women at both micro and macro levels, and AI tends to amplify these biases. Google image searches for “doctor,” for instance, return significantly fewer women, while searches for “domestic helper” overwhelmingly feature them. Similarly, AI hiring tools trained on male-dominated datasets have favored male candidates. In one instance, a credit scoring model granted men 10 to 20 times more credit than their wives despite shared assets, owing to algorithmic bias and structural flaws in how financial data was weighted.
“These inherent deficiencies in the data compound systemic issues that women already face, such as gender-based hiring, pay gaps and lack of financial security,” said Rizk.
Data also has ways of “forgetting” women. Early health technologies often overlooked women’s reproductive health, and AI diagnostic systems have replicated gender biases in areas like cardiovascular disease, where women’s symptoms are frequently dismissed.
“If women are invisible in the data, they will be invisible in the solution, and eventually in policy,” Rizk warned. “If we don't adopt a feminist-sensitive approach to technology, we risk leaving behind a key part of the population. We also risk running into problems that will need fixing after they’ve already caused damage. The important point is that feminist AI is proactive. It is transformational.”

A MENA-Specific Approach
The Feminist AI MENA hub operates within the broader Inclusive AI Research Network, supported by Canada’s International Development Research Centre. Its work focuses on dismantling patriarchal structures, oppressive systems and historical inequalities in technology and society, while building inclusive systems grounded in feminist principles, an acknowledgment of intersectionality and equitable representation across the design, deployment and impact of AI.
Encouraging women in STEM is key to balancing gender representation in technology design. In the MENA region, the gender gap is far wider in STEM careers than in STEM education, a phenomenon known as “the gender paradox.” The resulting absence of women from the tech workforce creates a feedback loop: algorithms designed without them become less gender-sensitive, which in turn reinforces inequality.
The hub’s research highlights gendered biases in MENA's gig work. In ride-sharing apps, bonuses are frequently tied to hours worked, which can penalize women who log fewer hours because of caregiving responsibilities. To compensate, many drive during odd surge hours, exposing themselves to safety risks, particularly in remote areas where poor connectivity compounds the danger. These algorithmic pressures add to the challenges of already precarious work that lacks job security, social protection and insurance. In a region with the world's highest female unemployment rate and lowest female labor force participation rate, such structural biases trap women between a rock and a hard place.
By conducting evidence-based research in the region, the Feminist AI MENA hub supports transformational technology development and shares its findings with the international feminist AI network. “Technology is a product of society and should respond to its needs,” Rizk said.
“Context matters. Technology for the MENA region has to be informed by the region’s realities and should cater to its requirements.”
Rizk and colleagues continue to develop region-sensitive research, bringing their findings to policymakers, civil society and the international research network. She is also taking these principles into the classroom through her course, Feminist AI: Technology, Gender and Development.
“It gave students a different perspective on using technology,” Rizk explained.
Emphasizing the need for more nuanced, balanced and accessible tech design, Rizk noted, “We want to raise awareness and deliver a message of fairness, justice and inclusion. To be a feminist, you must be sympathetic to all marginalized communities, not just women. Technology must be inclusive to all. We work toward that future.”