An interactive reflection on bias and inequality through artificial intelligence
Milan, May 2025 – As part of the 24th International Exhibition of Triennale Milano, the installation Not For Her has been unveiled—an immersive and interactive work that uses artificial intelligence to explore and make visible gender discrimination in the workplace. The project was conceived and developed by the Politecnico di Milano, with the support of FAIR – Future Artificial Intelligence Research, which co-funded its development through Spoke 4.
An experience that challenges prejudice
Not For Her goes beyond a traditional technological exhibit, inviting participants to experience firsthand, in real time, a situation of discrimination during a job interview. Through the integration of generative AI, speech recognition, and multi-agent systems, each interaction becomes unique, personalized, and deeply immersive.
The project unfolds through two complementary moments: a multimedia triptych that encourages collective reflection on gender discrimination, and an interactive installation that challenges deeply rooted assumptions. Every element is designed to stimulate critical thinking, creativity, and individual awareness. Artificial intelligence is not presented as a dogma or a threat, but as a tool to be consciously adopted in support of change.
The role of FAIR
FAIR supported the project through Spoke 4, dedicated to Artificial Intelligence for Society.
“FAIR is proud to have supported Not For Her – comments Giuseppe De Pietro, President of the FAIR Foundation – a project that combines scientific rigor with social awareness, helping to reflect on bias and inequality through artificial intelligence. Through its activities, FAIR promotes a human-centered, transparent, and responsible AI, free from prejudice and at the service of society.”
Nicola Gatti, coordinator of Spoke 4 and curator of the installation, added:
“Not For Her demonstrates that mastering technology means guiding it toward the common good. Artificial intelligence is often perceived as an uncontrollable force, full of bias and potentially a source of new inequalities. But AI is a technology, and as such, it is neutral. Bias does not originate in the data itself, but depends on how it is collected and interpreted. With Not For Her, we wanted to overturn this narrative: AI can act as a critical mirror, capable of exposing our biases and making us more aware. This reminds us that technology is a tool—and it is up to us to decide how to shape and use it.”
A tangible experience to foster awareness
The installation was made possible thanks to the invitation of Donatella Sciuto, Rector of the Politecnico di Milano, and was developed through collaboration between academics, designers, and engineers from the university, including Ingrid Maria Paoletti, Matteo Ruta, Umberto Tolino, and Ilaria Bollati. The project represents a significant step toward what experts define as Agentic AI, a new generation of intelligent systems combining generative models, Retrieval-Augmented Generation, finite state machines, and autonomous agents.
The result is a technology that does not replace human beings, but supports them in a process of self-reflection, providing tools to understand, recognize, and address the biases that shape our societies.
Visit the Triennale to experience the potential of AI firsthand
The installation is open to visitors at Triennale Milano, Viale Alemagna 6, from Tuesday to Sunday, with continuous opening hours from 10:30 AM to 8:00 PM. It is an opportunity not only to explore the work developed by the Politecnico di Milano and supported by FAIR, but also to engage in a broader journey that encourages reflection on technology, artificial intelligence, and bias.
For more information about the installation and visiting details, please refer to the official Triennale website.