
Built for all?: Artificial intelligence and the LGBTQ+ community

November 18, 2025

Artificial intelligence technology is becoming increasingly integrated into daily life, with global surveys showing 55% of people view AI as more beneficial than harmful. However, LGBTQ+ community members face disproportionate risks from AI systems due to biased training data that perpetuates harmful stereotypes and reductive representations of queer identities. Beyond digital harms, AI-powered automatic gender recognition systems are being deployed for surveillance purposes, particularly in countries with anti-LGBTQ+ governments like Hungary, where authorities monitor Pride events under the guise of public safety.

Who is affected

  • LGBTQ+ community members (lesbian, gay, bisexual, transgender, queer, and other gender and sexual minorities)
  • Queer artists and activists
  • LGBTQ+ event attendees at Pride gatherings
  • Ordinary LGBTQ+ citizens, not only activists or public figures
  • Hungarian LGBTQ+ individuals specifically targeted by government surveillance

What action is being taken

  • Governments antagonistic to LGBTQ+ communities are adopting automatic gender recognition (AGR) systems
  • Hungarian Prime Minister Viktor Orbán has sanctioned AI-enabled biometric monitoring at local Pride events
  • European Union institutions are reviewing Hungary's surveillance policy
  • UNESCO has conducted research assessing biases in large language models

Why it matters

  • AI tools that shape daily experiences were not designed with all people in mind, creating disproportionate harm for LGBTQ+ individuals. Biased training data means the technology perpetuates damaging stereotypes that are difficult to eliminate, while surveillance applications threaten individual privacy and enable governments to intimidate activists mobilizing for change. Together, these issues show that a technology broadly beneficial to the general population can still adversely impact vulnerable communities, undermining their safety, dignity, and right to accurate representation.

What's next

  • Leaders across sectors must forge partnerships between AI developers and LGBTQ+ community stakeholders
  • Constructive collaboration should ensure training data more accurately reflects the lived realities of queer people
  • Robust safeguards must be implemented to prevent AI misuse for surveillance against the community
  • Systems with gender detection capabilities must be strictly prohibited
  • LGBTQ+ individuals' input should be solicited at all stages of the tool development lifecycle

Read full article from source: Global Voices