Inclusive research Archives - Ascend Magazine /ascend/category/inclusive-research/ Fri, 30 Jan 2026 16:43:45 +0000
Mitigating bias /ascend/article/mitigating-bias/ Wed, 28 Jan 2026 14:26:29 +0000

The post Mitigating bias appeared first on Ascend Magazine.

As artificial intelligence (AI) advances – particularly large language models (LLMs) which are increasingly integrated into social, governmental and economic systems – discriminatory stereotypes and biases persist. These prejudices reflect and reinforce historical and systemic inequalities embedded in massive datasets that models like OpenAI’s Generative Pre-trained Transformer (GPT) and Google’s Gemini learn from.

York University researchers from across faculties are joining forces to develop frameworks to identify and mitigate biases in LLMs rooted in colonialism, racism and ableism.

Health informatics Professor Christo El Morr’s work spans a range of topics, from achieving accessible and inclusive AI to modelling and building bilingual and accessible knowledge infrastructures, and creating frameworks to address AI bias.

“Currently, AI operates as a tool of corporate and state control, reinforcing systems of exclusion and marginalization under the guise of progress,” says El Morr, who co-edited Beyond Tech Fixes: Towards an AI Future Where Disability Justice Thrives, published in October 2025. The book challenges the prevailing assumption that AI can be “fixed” by improving datasets, adding ethical guidelines or refining bias-detection algorithms.

Christo El Morr

Internationally, El Morr and his Faculty of Health collaborator Professor Vijay Mago recently convened philosophers, social scientists and AI researchers at a symposium in India to advance global research collaborations with support from York’s Global Research Excellence Seed Fund.

El Morr is involved in multiple equity-focused and LLM-related studies, partnering with colleagues at York, including long-time collaborator and Critical Disability Studies Professor Rachel da Silveira Gorman in the Faculty of Graduate Studies, and other Canadian and international universities.

As a principal investigator on several studies funded by the Social Sciences and Humanities Research Council (SSHRC), El Morr collaborates with Gorman on advancing AI and disability advocacy, accessibility for persons with disabilities, and AI and equity.

“Across projects, we centre data sovereignty, community governance and decolonial design. This means long-term partnerships, ensuring consent over data use, and sharing power over how models are trained, evaluated and deployed,” says El Morr.

His most recent SSHRC-funded research project, Equity Artificial Intelligence: towards a framework to address AI bias, with Gorman, Faculty of Health Assistant Professor Elham Dolatabadi and Lassonde School of Engineering Assistant Professor Laleh Seyyed-Kalantari, explores how AI can be reimagined through frameworks of equity, justice and liberation.

Seyyed-Kalantari also leads the study, Design of Benchmarks for Fairness and Bias Evaluation and De-Biasing of Natural Language Model to Incorporate User Diversity, focusing on addressing fairness issues in LLMs, which often favour majority groups due to biased training data.

Laleh Seyyed-Kalantari

Supported by a Connected Minds seed grant, the project is a collaboration with the Vector Institute and aims to design domain-specific testing benchmarks that assess and score fairness across diverse dimensions such as race, gender, religion and social status.

“By focusing on linguistic bias, particularly in the context of sentiment analysis, my work aims to mitigate stereotypes and ensure more inclusive LLMs that better support marginalized groups, including Indigenous Peoples, racialized communities and those with disabilities,” says Seyyed-Kalantari, who leads York’s Responsible AI Lab and is co-director of the new Mitigating Dialect Bias solutions network, which received $700,000 in Canadian Institute for Advanced Research funding.

She sees cultural bias arising from misinterpretation of dialects as a major concern in LLMs. “For example, African American Vernacular English often uses grammar, vocabulary and expressions that are not part of Standard American English. AI may interpret such words and phrases as ‘toxic’ and harmful. This is because LLMs are trained on data that favour dominant dialects.”

This is an issue that affects dialects around the world, something Seyyed-Kalantari plans to address next.
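One way to probe this kind of dialect bias is a paired-sentence comparison: score matched paraphrases in two dialects and measure the gap. The sketch below is a simplified illustration of that benchmark idea, not the Responsible AI Lab's actual methodology – the `score_toxicity` function is a toy keyword stand-in for a real classifier, and the word list and sentence pairs are invented for demonstration.

```python
# Sketch of a paired-dialect fairness probe. The scorer is a hypothetical
# keyword stand-in for a real toxicity classifier; the benchmark idea is
# the paired comparison, not the scorer itself.

def score_toxicity(text: str) -> float:
    """Toy stand-in: flags words a naive model might mislabel as toxic."""
    flagged = {"ain't", "finna", "tryna"}
    words = text.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

def dialect_gap(pairs: list[tuple[str, str]]) -> float:
    """Mean score gap between dialect and standard-dialect paraphrases.

    A positive gap means the dialect variant is scored as more toxic
    even though both sentences carry the same meaning.
    """
    gaps = [score_toxicity(dialect) - score_toxicity(standard)
            for dialect, standard in pairs]
    return sum(gaps) / len(gaps)

pairs = [
    ("he ain't done nothing wrong", "he did not do anything wrong"),
    ("we finna head out", "we are about to leave"),
]
print(f"mean gap: {dialect_gap(pairs):+.3f}")  # positive → dialect penalized
```

A real benchmark would replace the toy scorer with an actual model and aggregate gaps across many dimensions (race, gender, religion, social status), but the paired-comparison structure is the core of the measurement.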

Ethical AI /ascend/article/ethical-ai/ Wed, 28 Jan 2026 14:26:17 +0000

The post Ethical AI appeared first on Ascend Magazine.

There is a well-thumbed novel on Pina D’Agostino’s bookshelf that keeps her up at night, but also propels her forward.

It is Nobel Prize-winning writer Kazuo Ishiguro’s futuristic fiction Klara and the Sun, in which an anxious generation of parents buy humanoid robots to act as “artificial friends” for their lonely kids. The more privileged parents also leverage genetic enhancement technology to make their children smarter.

“That is my nightmare, and I want to make sure we don’t get to that world,” says D’Agostino, who is the scientific director of Connected Minds, the massive York University-led, next-generation research project that seeks to understand and predict the opportunities and risks to society associated with advancing technology.

“That is why Connected Minds and its work is important. We’re trying to get ahead of the development aspects of technology and advance socially responsible technologies to leave the world a better place – one where we are connected to each other in a human way, not disconnected.”

D’Agostino says Connected Minds is more than a research program; it is a movement toward a future where technology and human well-being evolve together in a more socially beneficial way. Canada has provided $105.7 million in funding toward the $318-million transformational project, created a new national Ministry of Artificial Intelligence and Data Innovation, and put in place a voluntary code of conduct for technological advances – steps that lead D’Agostino to view the country as a world leader in this type of responsible AI. She expects legislation will come next.

A key player in Connected Minds since before its launch, D’Agostino has a background that makes her a natural fit to lead it. She is a professor at Osgoode Hall Law School and the Tier 1 York Research Chair in Intellectual Property, Artificial Intelligence and Emerging Technologies. She is also founder and director of the IP Innovation Clinic, the country’s largest intellectual property clinic, helping York faculty and researchers as well as the broader community.

"My focus is on how the law and interdisciplinary collaboration can help ensure that no one is left behind."

In the spring of 2025, D’Agostino was appointed as York’s associate vice-president of research, and in October as the Chair of the Board of the Ontario Centre for Innovation.

Connected Minds is already making an impact, and D’Agostino takes particular pride in its interdisciplinary nature. The program brings together universities and industries, hospitals and policy-makers, artists and Indigenous communities, and is engaging more than 50 community partners and research collaborators over seven years to create a responsible and inclusive approach to technological development.

“When we cross over and we work together, great things happen. There is an array of different projects that we are working on, confronting how technologies interplay with human behaviour and ensuring that the goals and outcomes are going to really change the world in a positive way,” says D’Agostino, recognized by Canadian Lawyer magazine as one of the Top 25 Most Influential Lawyers in 2022.

“My DNA has always been one rooted in, I would say, social justice and fairness, and an evidence-based approach to everything I do. I’ve always been fascinated by technology, and how throughout history new forms bring promise, but also challenge different communities. My focus is on how the law and interdisciplinary collaboration can help ensure that no one is left behind.

“We need to get ahead of these challenges and have appropriate governance frameworks in place to ensure we are strong as a society.”

While she’s got her eye on those big picture issues, D’Agostino is also concerned about the impact of technology on her own four children, including her triplets.

“Kids and the next generation, that’s what I come back to,” she says. “Being a mother of four little ones, I see a new generation and a new way of seeing technology. They’re early adopters of it. For better or for worse, the kids are inheriting the world we’re leaving behind for them.”

Connected Minds: one year later /ascend/article/connected-minds-one-year-later/ Wed, 31 Jul 2024 00:42:59 +0000

The post Connected Minds: one year later appeared first on Ascend Magazine.

Since Connected Minds: Neural and Machine Systems for a Healthy, Just Society launched in spring 2023, the $318.4-million project has already achieved several milestones, pushing the project – and York University – forward as a leader in socially responsible emerging technology.

It’s been over a year since President and Vice-Chancellor Rhonda Lenton and Vice-President Research and Innovation Amir Asif announced that the project had received $105.7 million from the Canada First Research Excellence Fund (CFREF), the “largest single federal grant ever awarded to York.”

The Connected Minds leadership team, from left: Gunnar Blohm, vice director for Queen's, Doug Crawford, founding scientific director, Pina D'Agostino, director, and Sean Hillier, associate director

The cutting-edge program aims to bring together experts across eight York Faculties and three Queen’s Faculties to examine the ways in which technology is transforming society – dubbed the “techno-social collective” – and will work to balance both the potential risks and benefits for humanity. Some of the program’s proposed projects include explorations into a more inclusive metaverse, virtual reality and community organizing, neurotechnologies for healthy aging, Indigenous data sovereignty and how human brain function changes when people interact with artificial intelligence (AI) versus each other.

Since the funding announcements in early 2023, Connected Minds – the biggest grant in the University’s history – has been busy.

“As founding scientific director, it’s incredibly gratifying to see the progress we have made this first year, thanks to the very hard work of our leadership team, dedicated staff and the support of our board of directors,” says Doug Crawford, who is also a Distinguished Research Professor and Canada Research Chair in visuomotor neuroscience.

In addition to giving out seed grants and PhD awards over the past 12 months, Connected Minds has expanded its roster of experts by onboarding 14 research-enhanced hires across York University and institutional partner Queen’s University.

The new additions are part of the program’s efforts to attract and retain the best talent, as well as a fulfillment of its commitment to add 35 strategic faculty hires, research Chairs or equivalent levels of support to its interdisciplinary research ecosystem. The new Connected Minds members will benefit from support that includes $100,000 in startup research funding, salary top-up and/or teaching release, and a research allowance of $25,000 per year.

Connected Minds’ progress was also commended by the Tri-agency Institutional Programs Secretariat – which administers the Canada First Research Excellence Fund – during a site visit showcasing the various research units affiliated with the program and the progress it has made.

Connected Minds Director Pina D'Agostino

To further demonstrate the program’s – and York University’s – leadership in socially responsible technology, Connected Minds has also been organizing events, like the Introductory Meeting on Law and Neuroscience in Canada, which united experts from Canada and the United States for in-depth discussions on socially responsible research at the intersection of law and neuroscience at the renowned Munk School of Global Affairs in Toronto.

Connected Minds also hosted an event marking the culmination of its inaugural year: the Connected Minds Annual Research Retreat in February 2024. The retreat united members across diverse disciplines – including arts, science, health and law – to collectively shape the future of socially responsible technology. The goal was to provide networking opportunities for members to get to know one another and form the teams that will apply for grants and pursue the program’s long-term goals. It did so through information sessions, active participation in shaping Connected Minds’ Equity, Diversity and Inclusion (EDI) action plan, and big-idea talks delivered by the research-enhanced hires.

York President & Vice-Chancellor Rhonda Lenton explores Biskaabiiyaang, an Indigenous metaverse created by assistant professor Maya Chacaby, a Connected Minds researcher.

The retreat also marked another notable milestone: a transition in leadership. Crawford will be succeeded by Professor Pina D’Agostino, founder and former director of IP Osgoode and co-director of the Centre for Artificial Intelligence & Society, whose expertise is frequently sought by government bodies to address the evolving intersection of AI and the law. Now, that expertise will be applied to leading Connected Minds into what promises to be another year of accomplishments.

“I am thrilled to be taking the program to the next level by building on the strong foundation we now have and engaging with all of our incredible partners and communities to work towards our goals of a healthy and just society,” says D’Agostino, looking ahead to how Connected Minds will continue to thrive and make contributions to interdisciplinary research.

York University welcomes transformative investments in next phase to create a new School of Medicine /ascend/article/investments-school-of-medicine/ Wed, 31 Jul 2024 00:39:20 +0000

The post York University welcomes transformative investments in next phase to create a new School of Medicine appeared first on Ascend Magazine.

York University has long been a leader in health education and research, and recent announcements by the Government of Ontario have enabled the University to further this vision with a new School of Medicine. In April 2024, at the Cortellucci Vaughan Hospital, Premier Doug Ford announced that York University’s new School of Medicine will open its doors to its first cohort of future doctors in 2028.

“[This] announcement is part of our plan to connect more Ontario families to more convenient care, including primary care,” said Premier Doug Ford. “As the first medical school in Canada focused primarily on training family doctors, this new school will make an enormous impact in the lives of people in York Region and across Ontario.”

The premier also announced that York University’s School of Medicine has the province’s support for 80 undergraduate spots and 102 postgraduate spots when the doors open in 2028, going up to 240 undergraduate seats and 293 postgraduate seats on an annual basis once operating at full capacity. By focusing on training primary care doctors, York University’s training model will devote approximately 70 per cent of the new postgraduate training seats to primary care.

The new commitments build on $9 million in start-up funding announced in March 2024 as part of the Ontario Budget 2024: Building a Better Ontario. Taken together, the announcements significantly boost the development of the School of Medicine at York University and accelerate the pace and path to new medical education in the fastest-growing region in Ontario.

York University's School of Medicine Announcement

“These new investments to support increased physician education in Ontario come at a critical time and mark an important milestone in York’s trajectory as an internationally recognized leader in higher education. Amidst growing demand for family doctors and other primary care general specialists, I want to thank Premier Ford and his government for being responsive to this pressing need, for their vision and clear commitment to York’s School of Medicine – and a healthier future for Ontarians,” said Rhonda Lenton, president and vice-chancellor.

York University’s School of Medicine will prepare the next generation of talented frontline primary care doctors who represent the diversity of the communities in which they live, with plans to launch a unique bridging program to ensure no qualified future doctor goes without access to medical education.

The City of Vaughan has been an early supporter and valued partner throughout the planning and development of this critical education infrastructure project, and has agreed to transfer land to the University to build the School of Medicine within the Vaughan Healthcare Centre Precinct beside Mackenzie Health’s Cortellucci Vaughan Hospital.

Vaughan Mayor Steven Del Duca affirmed the city’s ongoing commitment: “I was thrilled to join Premier Ford to officially announce the approval and funding of York University’s School of Medicine, located in the Vaughan Healthcare Centre Precinct (VHCP). When ready, this new School of Medicine will help address the much-needed doctor shortage in our community.”

“Mackenzie Health is excited to be a lead partner on this journey with York University in the creation of a School of Medicine, located next to Cortellucci Vaughan Hospital and just a few kilometers from Mackenzie Richmond Hill Hospital. There is an urgent need for additional primary care providers in our community and for our health care system. Building capacity will ensure patients receive the comprehensive care they need and deserve, especially as they age and begin to experience more complex health challenges. We are grateful for the provincial government’s continued health care investments, and we look forward to working with York University to train the family physicians of the future who will care for the growing and aging community in western York Region and beyond,” said Mackenzie Health President and CEO Altaf Stationwala.

“Early support from York’s strong network of community health providers, hospitals, and municipalities with whom we have been working has been instrumental throughout the School of Medicine planning phase. I am grateful for our many highly skilled and thoughtful partners for their leadership and insights, and we look forward to continuing to grow and expand our work with them,” said Lenton. 

De-escalating robocops? /ascend/article/de-escalating-robocops/ Wed, 31 Jul 2024 00:38:29 +0000

The post De-escalating robocops? appeared first on Ascend Magazine.

Picture this: a 911 operator in your city receives a call from a person in mental distress and needs to send help.

They could dispatch the police or an integrated unit of both police and mental health professionals. But instead, the operator sends a robot.

This scenario may sound like science fiction, but it’s the kind of futuristic thinking that has researchers at York University considering all angles when it comes to artificial intelligence (AI) and crisis response.

Building more empathetic bots through interdisciplinary research

In a paper published in Applied Sciences earlier this year, psychology PhD candidate Kathryn Pierce and her co-authors explore the potential role robots could play in crisis de-escalation, as well as the capabilities engineers would need to program them to be effective.

The visionary paper is part of a larger project at the Lassonde School of Engineering that involves early-stage research to design and test robots to assist in security and police tasks. The York engineers asked the psychology researchers to bring a social-scientific lens to this forward-thinking work on humanizing machines.

“De-escalation is not a well-researched topic and very little literature exists about what de-escalation really looks like moment by moment,” says Pierce, who is supervised by Dr. Debra Pepler, a renowned psychologist and Distinguished Research Professor in the Faculty of Health. “This makes it difficult to determine what kinds of behavioural changes are necessary in both responders and the person in crisis to lead to a more positive outcome.”

No hard and fast rules for de-escalation, for both humans and robots

With limited academic understanding of what really happens in human-to-human interactions during a crisis response, let alone robot-to-human, training a robot to calm a person down is an incredibly tall task.

Despite the challenge, Pierce and her co-authors were able to develop a preliminary model outlining the functions a robot should theoretically be able to perform for effective de-escalation. These functions are made up of verbal and non-verbal communication strategies that engineers would need to be mindful of when building a robot for such a task.

Kathryn Pierce, Psychology PhD Candidate
Kathryn Pierce, Psychology PhD Candidate

Some of these strategies include a robot’s gaze – the way a machine and human look at one another – the speed at which it approaches (slow and predictable), and the sound and tone of its voice (empathetic and warm).
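Cues like these could, in principle, be expressed as tunable behaviour parameters. The sketch below is purely illustrative – the field names, value ranges and speed cap are assumptions for demonstration, not taken from the published model:

```python
# Hypothetical parameterization of the non-verbal de-escalation cues
# described above (gaze, approach speed, voice tone). Names, ranges and
# the clamping rule are illustrative assumptions, not the paper's model.

from dataclasses import dataclass

MAX_CALM_APPROACH_MPS = 0.5  # assumed cap for a "slow and predictable" approach

@dataclass
class DeescalationProfile:
    gaze_contact_ratio: float   # 0.0 = gaze averted, 1.0 = constant eye contact
    approach_speed_mps: float   # metres per second toward the person
    voice_warmth: float         # 0.0 = flat/robotic, 1.0 = warm/empathetic

    def clamped(self) -> "DeescalationProfile":
        """Return a copy with all cues kept inside calm, predictable bounds."""
        return DeescalationProfile(
            gaze_contact_ratio=min(max(self.gaze_contact_ratio, 0.0), 1.0),
            approach_speed_mps=min(self.approach_speed_mps, MAX_CALM_APPROACH_MPS),
            voice_warmth=min(max(self.voice_warmth, 0.0), 1.0),
        )

profile = DeescalationProfile(gaze_contact_ratio=0.6,
                              approach_speed_mps=1.2,  # too fast for a crisis
                              voice_warmth=0.9).clamped()
print(profile.approach_speed_mps)  # capped to 0.5
```

Of course, a static profile like this is exactly the kind of fixed parameterization the researchers caution is insufficient on its own, since real de-escalation adapts moment by moment.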

But, as the researchers point out, ultimately, robots cannot be “programmed in a fixed, algorithmic, rule-based manner” because there are no fixed rules for how people calm each other.

“Even if there were algorithms governing human-to-human de-escalation, whether those would translate into an effective robot-to-human de-escalation is an empirical question,” they write.

It is also difficult to determine whether people will react to robots emulating human behaviour the same way they would if it was an actual person.

Advances in AI could add new layer of complication to the future of crisis response

In recent years, the use and discussion of non-police crisis response services have garnered growing attention in various cities across North America, and elsewhere in the world.

Advocates for replacing traditional law enforcement with social workers, nurses or mental health workers – or at least the integration of these professionals with police units – argue that this leads to better outcomes.

Research published earlier this year showed that police responding to people in mental distress use less force if accompanied by a health-care provider. Another study found that community responses were more effective for crime prevention and cost savings.

Introducing robots into the mix would add to the complexity of crisis response services design and reforms. And it could lead to a whole host of issues for engineers, social scientists and governments to grapple with in the future.

The here and now

For the time being, Pierce and her co-authors see a machine’s greatest potential in video recording. Robots would accompany human responders on calls to film the interaction. The footage could then be reviewed for responders to reflect on what went well and what to improve upon.

Researchers could also use this data to train robots to de-escalate situations more like their human counterparts.

Another use the researchers theorize for AI surveillance would be robots trained to identify individuals in public who are exhibiting warning signs of agitation, allowing police or mental health professionals to intervene before a crisis point is ever reached.

While a world in which a 911 operator dispatches an autonomous robot to a crisis call may still be hard to conceive, Pierce and her co-authors do see a more immediate, realistic line of inquiry for this emerging area of research.

“I think what’s most practical would be to have engineers direct their focus on how robots can ultimately assist in de-escalation, rather than aiming for them to act independently,” says Pierce. “It’s a testament to the power and sophistication of the human mind that our emotions are hard to replicate. What our paper ultimately shows, or reaffirms, is that modern machines are still no match for human intricacies.”

The paper was co-authored by Pierce and Pepler, along with Michael Jenkin, a professor of electrical engineering and computer science in the Lassonde School of Engineering, and Stephanie Craig, an assistant professor of psychology at the University of Guelph.

The work was funded by the Canadian Innovation for Defence Excellence & Security Innovation Networks.

Voice-activated sexism: exploring consequences of gendered technology /ascend/article/voice-activated-sexism/ Tue, 30 Jul 2024 18:41:31 +0000

The post Voice-activated sexism: exploring consequences of gendered technology appeared first on Ascend Magazine.

“Voice-activated personal assistants (VAPAs) use women’s voices as a default setting, and this gendered technology significantly influences the treatment of women tech experts by male audiences online,” says Stephen J. Neville, who conducted the work alongside Alex Borkowski, both of whom are in the Joint Graduate Program in Communication & Culture at York and Toronto Metropolitan University. 

Unboxing is a popular video genre on YouTube and features people unwrapping and reviewing the latest high-tech gadget or product, like smart speakers. These videos often also offer a walk-through or demonstration of such a device.

“Today’s consumers learn about new tech products online before buying them, and unboxing videos are seen as providing a trusted third-party review,” says Borkowski. “We were curious to learn more about the resonance between VAPAs and women tech experts.”   

Neville and Borkowski watched over 200 of the most popular smart speaker unboxing videos on YouTube, the majority of which featured men, studying their contents, structure and aesthetics. Videos of women doing the unboxing made up only 10.9 per cent of their initial sample and garnered far fewer views.  

Analyzing over 4,000 comments on videos made by women revealed a troubling but rather unsurprising finding: the women’s intelligence was often insulted, or they were sexually objectified.  

The pair of researchers argue some of these comments treat the women as if they are broken machines – a concept developed in previous media studies research – issuing them commands as one would a smart speaker: stop talking (or shut up), go mute or turn off.

“Sexism and misogyny are pervasive online and offline, and it extends to YouTube, which creates a challenging environment for female content creators,” says Borkowski. “Our research shows the domestication of smart speakers has had a spillover effect in the media consumption of these unboxing videos and women tech experts.”  

A substantial portion of the pair’s research focused on analyzing each woman YouTuber’s presentation or performance style, and the ways in which she engaged with the product.

Based on this analysis, Neville and Borkowski observed the female content creators showed technical prowess and a solid understanding of smart speakers overall, but one aspect of their performances contradicted this display of expertise.   

In some of the unboxing videos, when the VAPA is turned on, the women’s reactions were over the top, with some acting overwhelmingly shocked or audibly gasping.   

The pair see this exaggerated behaviour as indicative of the way women are forced to navigate society at large, being expected to conform to traditional femininity.  

“Our findings suggest that some of these women can at times act ditzy to undercut their own authority and expertise with new technology,” says Neville. “This behaviour functions almost like a pre-emptive defence to the negative reaction they anticipate receiving from the audience.”  

According to Borkowski, the idea of a technologically savvy woman is threatening to some, so these women have learned to adapt their behaviour in an attempt to minimize the level of vitriol or hate they receive online.  

“It’s a burden male tech experts never contend with,” she says.  

Despite these negative conditions facing women online, there are grounds for optimism. Neville and Borkowski see potential for the concept of women as broken machines to be co-opted to promote equity and social justice. 

“Albeit broken, women tech experts viewed as machines provides them with a platform and channel to shape the way their audiences see and use technology,” says Neville. “They can also block trolls and disable comments as a way to resist online misogyny.”  

“The popularity of these unboxing videos provides an opportunity for female content creators to discuss bigger issues with technology beyond the functionality or practicality of one product, including concerns about privacy, surveillance and control,” says Borkowski.  

The research was published as a book chapter earlier this year in the Routledge Handbook on Media and Technology Domestication.
