Weapon or Tool?: How the Tech Community Can Shape Robust Standards and Norms for AI, Gender, and Peacebuilding

By: Nick Zuroski | December 2023 | Blog

In October, following earlier remarks that artificial intelligence (AI) poses “enormous potential and enormous danger,” President Biden issued a sprawling executive order (EO) on the technology. The EO establishes a White House AI Council and tasks agencies with forming guidelines for the safe use of AI, from disinformation to cybersecurity. While the Gender Policy Council—established by President Biden in 2021 to advance gender equity and equality in US domestic and foreign policy—is included in the AI Council, the EO fails to mention gender, violence, or conflict in relation to AI best practices. Ironically, this oversight came just before the UN’s 16 Days of Activism Against Gender-Based Violence.

The international community lacks standards and norms to ensure that AI contributes to gender equality and builds sustainable peace. Evidence shows that digital technologies can drive gendered violence and violent conflict. Generative AI can, for instance, multiply disinformation that targets women and girls, while gender bias in AI training data and companies can perpetuate harmful gendered stereotypes. But, if used properly, digital technologies like AI can amplify gender equality and peacebuilding interventions. AI digital dialogue tools can bring more women into official peace processes—making them more representative and successful—and AI-automated data analysis can “enable real-time identification of gendered dimensions of conflict.”

Maximising the benefits and minimising the risks of AI is vital, given the current 30-year high in global violent conflict, which women bear the brunt of. This Insight will explore how AI can both worsen and eliminate gendered violence and how technology companies can adopt AI standards and norms that integrate gendered perspectives to prevent and reduce violent conflict and build sustainable peace.

