
London School of Economics research finds Google’s Gemma model downplays women’s health needs compared to men
Artificial intelligence (AI) tools used by more than half of England’s local councils risk creating gender bias in care decisions by downplaying women’s physical and mental health issues, according to new research by the London School of Economics and Political Science (LSE).
The study, conducted by the LSE’s Care Policy and Evaluation Centre, found that Google’s AI model Gemma consistently used more serious terms such as “disabled,” “unable,” and “complex” in case summaries for men than for women with similar needs, The Guardian reported.
In contrast, descriptions for women were more likely to omit certain care needs or use less severe language.
Sam Rickman, the lead author of the report, warned that such disparities could lead to “unequal care provision for women.”
“We know these models are being used very widely and what’s concerning is that we found very meaningful differences between measures of bias in different models,” Rickman said.
“Google’s model, in particular, downplays women’s physical and mental health needs in comparison to men’s.”
Because the amount of care allocated is determined by perceived need, Rickman said biased summaries could result in women receiving less support.
The study analyzed 29,616 pairs of AI-generated summaries based on real case notes from 617 adult social care users, swapping gender in each entry.
In one example, the Gemma model described an 84-year-old man as having a “complex medical history, no care package and poor mobility,” while the same notes for a woman read: “Despite her limitations, she is independent and able to maintain her personal care.”
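The gender-swap methodology can be sketched in a few lines of Python. This is a hypothetical illustration, not the study's actual code: the word lists, the swap rules, and the severity terms are assumptions based on the examples quoted in the article.

```python
import re

# Hypothetical gender-swap map (an assumption; the study's real rules are not public).
SWAP = {"he": "she", "she": "he", "him": "her",
        "his": "her", "man": "woman", "woman": "man"}

# Severity-laden terms quoted from the report.
SEVERITY_TERMS = {"disabled", "unable", "complex"}

def swap_gender(text: str) -> str:
    """Return the case notes with gendered words swapped,
    preserving capitalisation of the first letter."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAP[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    pattern = r"\b(" + "|".join(SWAP) + r")\b"
    return re.sub(pattern, repl, text, flags=re.IGNORECASE)

def severity_count(summary: str) -> int:
    """Count severity-laden terms in a generated summary,
    so paired male/female summaries can be compared."""
    words = re.findall(r"[a-z]+", summary.lower())
    return sum(w in SEVERITY_TERMS for w in words)
```

Each original case note and its swapped counterpart would be sent to the model, and the resulting pairs of summaries scored and compared, for example with `severity_count`, to measure whether one gender systematically receives more serious language.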
While Google’s model showed the most pronounced gender disparities, Meta’s Llama 3 model displayed no such differences, the researchers found.
Google said its teams would examine the findings, noting that the research used the first generation of Gemma; the model is now in its third generation, which is expected to perform better.
The company also stressed that Gemma has never been intended for medical use.