
‘Smart’ (or Machiavellian?) surveillance: The power of terminology

September 3, 2025

The article critically examines how surveillance technologies, often marketed as "smart" solutions, are being deployed worldwide by governments and law enforcement agencies, with mixed impacts. While facial recognition has helped identify child abusers and locate missing Indigenous people in some cases, the same technology has shown significant racial and gender biases, disproportionately misclassifying darker-skinned women and perpetuating existing social inequalities. The piece details how algorithmic systems have been weaponized against vulnerable populations, from asylum seekers at the US-Mexico border to Palestinians in the West Bank, highlighting how these technologies reflect and amplify the biases of their human creators and operators.

Who is affected

  • Asylum seekers at the US-Mexico border who face discrimination through the CBP One app's facial recognition system
  • People with darker skin tones, particularly darker-skinned women, who experience error rates of up to 34.7% in facial recognition systems
  • Black Americans who are disproportionately represented in mugshot databases and targeted by predictive policing
  • Lower-income families in the Netherlands who were wrongfully penalized by tax authorities' algorithms
  • Domestic abuse survivors in Spain who were mislabeled as "low risk" by algorithms and subsequently harmed
  • Palestinians in the West Bank who are extensively profiled and monitored through Israel's "Wolf Pack" surveillance system
  • Children and families, including over a thousand Dutch children wrongfully placed in foster care due to algorithmic errors

What action is being taken

  • US law enforcement is using facial recognition to identify child abusers and locate missing Indigenous people
  • The CBP One app is requiring asylum seekers to register with facial recognition systems at the US-Mexico border
  • ICE is employing facial recognition technologies to monitor people awaiting deportation hearings
  • Israeli military is operating the "Wolf Pack" surveillance system to profile Palestinians in the West Bank
  • Israel is utilizing a large language model to surveil Palestinians and collect extensive personal data
  • Spain is implementing algorithmic risk assessment for domestic abuse cases to allocate support resources
  • Researchers like Joy Buolamwini at MIT are conducting studies to document bias in facial recognition systems

Why it matters

  • Facial recognition technology perpetuates existing social inequalities and racial biases, with error rates of up to 34.7% for darker-skinned women compared to just 0.8% for lighter-skinned men
  • Surveillance technologies marketed as "smart" and "efficient" are being used as tools of control and discrimination against already marginalized populations
  • Algorithmic systems deployed in high-stakes situations have led to serious consequences including wrongful family separations, barriers to asylum, and domestic violence victims being murdered after being mislabeled as "low risk"
  • These technologies extend and amplify existing power structures and biases rather than correcting them, as evidenced by their deployment in predictive policing and border control
  • The centralization of personal data enables unprecedented levels of monitoring and control, described by one Israeli intelligence source as having "more tools to know what every person in the West Bank is doing"

What's next

  • No explicit next steps stated in the article

Read full article from source: Global Voices