The future of AI is interdisciplinary | Waterloo News – The Iron Warrior

Canada Research Chair in Technology and Social Change is advancing artificial intelligence with more diverse human experiences
Dr. Lai-Tze Fan wants to make a more equitable AI. Critical approaches to advancing artificial intelligence are urgent, she says. “We tend to want AI to do more and more tasks for us and not be involved ourselves. But if that tendency continues, critical thinking in the process of generation is going to collapse in on itself and we won’t have a responsible role in the technology anymore. And that could make AI a dehumanizing tool.”
An Assistant Professor in Sociology and Legal Studies, Fan was recently announced as a Canada Research Chair (CRC) in Technology and Social Change. Her work is richly interdisciplinary, combining media studies, science and technology studies, interactive storytelling, critical design, and research-creation. Her CRC program examines technological design and bias in AI — “something that’s been on everybody’s mind these days.”
By investigating the design of sexist, racist, and classist AI voice assistant software, racist facial recognition systems, and exploitative AI hardware production, she is working to identify how AI produces human experiences that reinforce social inequalities. And she wants to change that.
Fan’s goal is to encourage and enhance equity, diversity, and inclusion in AI design and to improve technological literacy.
The speed of AI advances is a challenge for everyone, she says, as OpenAI forges ahead, pushing other companies to move faster and find more ways to integrate AI into our everyday lives. “They’re developing day-to-day, and the CEO of OpenAI himself is addressing issues of governance and regulation. But this has always been the tension between the tech industry and legislation and governance: technology develops so fast that regulators cannot keep up.”
Fan’s CRC research will be based in her Unseen-AI Lab (U&AI Lab) and engage interdisciplinary collaborators to develop approaches that prevent inequitable AI at the design and production stage. Her research plan includes three case studies on software, hardware, and big data – all focused through an equity lens.
In the first, Fan is looking at how inequities and stereotypes found in human labour are transferred to our experiences interacting with an AI assistant. She gives an example: “What would it look like to have a feminist Siri where, if you used abusive language towards her, she just shuts off?” Working with colleagues, Fan aims to write more equitable software scripts and, in this way, design civility and fairness into the user experience.
For the hardware study, Fan will develop VR experiences that expose people to the environmental consequences of the ever-growing demand for more powerful computers and ever-newer personal devices in our daily lives.
The third study examines the inherent racism of the big data used for facial recognition technology — systems trained primarily on databases of white faces. Given that these databases simply don’t have enough diverse faces, Fan wants to rebuild and expand the data. “It’s ambitious, but why not build one?” To ensure the biodata collection and management is equitable, this work will involve collaborators with ethics expertise, she adds.
Now based in Waterloo’s Department of Sociology and Legal Studies, Fan’s academic career has traversed English literature to media studies to technology studies and research-creation. By her postdoctoral studies, she found “it didn’t make sense to be disciplinary anymore.” Today she is an experienced practitioner of digital installations, digital storytelling, creative coding, and game design. To date she’s had 23 solo and collaborative research-creation projects, including collaborations with MIT, Georgia Tech, and Waterloo’s Institute for Quantum Computing.
“This work has to be interdisciplinary,” says Fan who is inviting researchers and students from a range of STEM, humanities and social science disciplines to collaborate in the U&AI Lab. She welcomes interested faculty, staff, and students to write her directly, even if they haven’t had a chance to meet her yet. “Computer science and engineering students will be able to help us examine the risks and benefits of AI technologies that they’re currently learning to design and produce. Students from the social sciences and humanities trained in critical theories of gender, race, and class, as well as qualitative and quantitative methods, can tackle real-world evolving issues in AI industry and policy.”
Combining this breadth of expertise and experience in her CRC research, Fan will contribute to AI design in research and industry, improve technological literacy, and, most important, strengthen equity, diversity, and inclusion in human-AI experiences.
Photo credit: Eivind Senneset.