
AI and Gender in the Judicial System

Posted By: Paola Maite Diaz-Morales
Posted On: 2025-10-15T22:22:25Z



As artificial intelligence (AI) rapidly advances and integrates into nearly every professional field, understanding who has access to these tools and who does not has become a central question of justice and equity. UNESCO’s 2024 Global Judges’ Initiative survey reveals that 44% of judicial operators have used AI tools for work-related activities. Yet for all the valuable baseline data the survey provides, it contains a notable gap: it offers no gendered analysis of AI adoption in judicial systems. Indeed, there is very little research at all on the intersection of gender and AI use within the judiciary. Here I present the work that has been done in this and adjacent fields and draw on it to situate the International Association of Women Judges’ (IAWJ) role in filling this knowledge gap and ensuring that the AI revolution in justice systems advances, rather than undermines, gender equity.


UNESCO’s survey of 563 judicial operators across 96 countries provides essential baseline data. While 93% of respondents reported familiarity with AI, only 41% had used AI chatbots, and just 11% used AI systems daily. More concerning is the absence of AI usage guidelines: only 9% of operators’ organizations have issued rules or training. And although the survey collected gender data from respondents, it did not analyze the findings through a gendered lens, leaving unanswered whether women judges face different barriers, have equal access, are included in policy development, or hold different attitudes toward AI.


Though research on gender and AI implementation in the judiciary is limited, studies in related fields offer important insights. A 2024 collaboration between Linklaters LLP, The Next 100 Years, and She Breaks the Law surveyed 90 women in legal professions and found patterns that point to a concerning gender gap: while 77% of respondents consider AI extremely significant for the profession’s future, 37% reported that their organizations have not embraced AI, and 43% said they had observed bias in AI and legal tech, including biased tools, reports of biased outcomes, and qualitative impacts.


The study also identifies structural barriers: women who are out of the office for extended periods due to caregiving, who occupy positions with more “organizational dusting” responsibilities, or who work for leaders unsupportive of AI adoption have less time to develop AI skills and are therefore at a disadvantage.


Further, evidence from broader studies of the gender gap in AI shows a widespread pattern. Global research from Harvard Business School, Berkeley, and Stanford analyzing studies from various countries found that women are less likely than men to use ChatGPT and other generative AI tools, with the researchers observing that “the gender gap in adoption is nearly universal.” Several factors outlined in this study and others contribute to the gap. First, there are technology literacy gaps: a 2025 systematic review of Asian higher education found that women were less familiar with AI technologies and used AI tools less frequently than men, with lower AI literacy among women resulting in lower adoption rates. Second, the review highlighted the exclusion of women from policymaking on AI adoption.


The relationship between trust and adoption is more nuanced than the structural factors above. Harvard’s research shows that women are not inherently more distrustful of AI than men. However, the Linklaters study found that 43% of women observed bias in AI tools, and research in Hong Kong noted that despite high adoption of tools like ChatGPT, women are more concerned about AI misuse of their data, a concern that disproportionately depresses female adoption. These differences suggest that women’s hesitancy may reflect not a generalized mistrust of systems but a response to observed bias.



It is also important to note that while our primary focus is ensuring equal access to AI literacy and use, we cannot ignore the bias embedded within AI systems; the two issues must be addressed simultaneously. Across the board, AI systems show patterns of discrimination. According to the Linklaters study, women lawyers report needing to instruct AI to write “as a man” to achieve an appropriately professional tone, AI defaults to portraying men in leadership positions, and algorithmic decision-making in hiring has produced discriminatory outcomes for women. For women judges, these biases are particularly concerning in case management and risk assessment tools, which can encode gender stereotypes and affect outcomes in family law, domestic violence, and custody cases.


AI is already reshaping the judiciary; the question now is whether this transformation will be inclusive and equitable. The IAWJ is uniquely positioned to address these gaps, and the preceding research findings show why. Consider the “mistrust factor” debate: Harvard’s research shows no gender difference in AI trust, while other studies suggest heightened concern among women. This conflict reveals fundamental gaps in our understanding of women judges’ specific experiences, gaps that persist not only because the data does not exist, but also because findings from adjacent fields offer no consensus on which to build initiatives. A comprehensive IAWJ survey can clarify where to focus efforts. If trust is not a barrier for our members, we avoid misdirecting resources; if, conversely, women judges do perceive higher risks that other studies have missed, we prevent those concerns from falling through the cracks. Further, there is clear demand for guidance on responsible AI use in judicial contexts, but such guidelines will be incomplete and potentially inequitable unless they are informed by gender-responsive research.



Much more can be said about how existing findings on gender, AI, and their intersection in localized implementations apply to the judiciary. Still, the essential task is to produce data that centers women judges’ experiences and to ensure that tools meant to support justice do not perpetuate the very inequalities the judiciary seeks to address. The IAWJ’s collaboration with institutions like UNESCO to conduct gender-focused research on judicial AI adoption is thus an essential step toward this goal.


Works Cited: 

Harvey, Marc. “No Woman Left Behind: Closing the AI Gender Gap in Law.” Linklaters, 3 Dec. 2024, www.linklaters.com/en/knowledge/publications/alerts-newsletters-and-guides/2024/december/03/gender-in-ai-report. 

Kalim, Usama, et al. “Barriers to AI adoption for women in higher education: A systematic review of the Asian context.” Smart Learning Environments, vol. 12, no. 1, 5 June 2025, https://doi.org/10.1186/s40561-025-00390-5. 

Lawyers Hub and UNESCO. Findings Report on Gender Perspectives and Artificial Intelligence Governance in Africa’s Judiciary. Lawyers Hub, 2024, https://www.lawyershub.org/Digital%20Resources/Reports/UNESCO%20Findings%20Report%20on%20Gender%20Perspectives%20and%20AI%20Governance%20in%20Africas%20Judiciary.pdf.

Otis, Nicholas G., et al. Global Evidence on Gender Gaps and Generative AI, 14 Oct. 2024, https://doi.org/10.31219/osf.io/h6a7c. 

NEXT IAS Current Affairs Team. “Gender Gap in the Higher Judiciary.” Current Affairs - NEXT IAS, 10 Mar. 2025, www.nextias.com/ca/current-affairs/10-03-2025/gender-gap-in-higher-judiciary.