Eva Gengler

Eva Gengler is an author, speaker, and co-founder of feminist AI and enableYou. She holds a PhD.

What's the most important question right now?

Are we using AI to automate an oppressive past – or to build a more empowering future?

AI is largely built, governed, and monetized by a small, privileged part of society, predominantly white men, while many others are reduced to consumers and data sources. This narrow group decides where the future is heading. When we apply AI to existing systems, we risk scaling the injustices already embedded in them. Hiring, lending, policing, the justice system, military technology: all of these already reflect structural inequalities. Automating them does not make them more just.

This is especially dangerous for people who are already marginalized: women, queer people, People of Color, people with disabilities, activists, and those facing multiple forms of discrimination at once.

But AI can also be used differently. It can help us question broken systems instead of optimizing them: make recruiting fairer, expose pay and leadership gaps, protect biodiversity, detect misinformation, and assist victims of domestic violence. Technology reflects the objectives we give it, the people who shape it, the data we train it with, and the power structures it serves. AI should be feminist: power-critical, inclusive, and oriented toward justice. Built collectively, not by a few for the many.

What's one fact about AI that everyone should know?

AI systems are trained on human decisions, human language, human institutions, and human histories, so they inherit our biases, our power structures, and our mistakes. When we pretend otherwise, we make its politics invisible and harder to challenge. AI is neither neutral nor objective.

This is not unique to AI. Journalism is not neutral. Scientific studies are not neutral. Everything is shaped by individual, institutional, political, and economic objectives. The question is not whether AI has values. The question is whose values it serves. If AI is built within neoliberal, neocolonial, and patriarchal systems, it will reproduce those logics. But this is not inevitable. We can change the objectives, institutions, laws, teams, and power relations behind these systems. AI is not a technical destiny. It is a political choice, and now is the time to make a different one.

We need strong legislation, real accountability from management teams, and a broader societal shift in how we understand technology and power.

A powerful book unpacking the ideology behind today’s AI and Silicon Valley is “More Everything Forever” by Adam Becker.

What future are you looking forward to?

One in which we learn from the past to build more just, diverse, and inclusive futures, rather than locking in the hierarchies that shaped it.

I want a future that understands people do not start from the same place, and that this shapes our opportunities, privileges, vulnerabilities, and choices. One that treats inequality as a structural reality, not an individual failure. A future driven by care for each other and for the planet, rooted in mutual support rather than domination, in justice rather than efficiency, and in responsibility rather than endless growth.

AI will not be the single answer to this future. No technology can be. But we will also not achieve this future by ignoring AI. The question is whether we use it to repeat and reinforce systems of oppression and privilege or to dismantle them.