
Algorithmic Justice Training for Civil Legal Services Providers

  • 3:00 PM - 4:00 PM
  • Eastern Time (US & Canada)
  • By: Practising Law Institute
Topics:
  • Technology

Low-income people are increasingly ensnared in automated decision-making systems that act as gatekeepers to life necessities such as housing, employment, health care, education, and public benefits. They are also persistently surveilled and monitored in their schools, workplaces, neighborhoods, and even their homes. The data-centric technologies at the heart of this gatekeeping and surveillance lack transparency, and as a result, many affected people may not even be aware of this digital layer of harm. Further, many judges and other legal system actors are overly deferential to computer-generated outcomes due to misunderstandings about their reliability, neutrality, and core functions. In light of these dynamics, a competent legal services lawyer needs to understand how these technological systems impact clients and to effectuate legal strategies to combat the resulting harms.

This Briefing will introduce legal services providers to the algorithmic technologies contributing to their clients’ injuries and highlight potential legal strategies and remedies. Participants will leave this presentation with an understanding of algorithmic systems, including machine learning and AI; an ability to issue-spot how data-centric technologies are affecting their clients; and an overview of legal strategies for countering algorithmic harms.

This session is appropriate for civil legal services attorneys across the country.

Topics include:

  • Introduction to algorithms, machine learning, and AI within the context of surveillance capitalism (15 minutes)
  • Overview of algorithmic impacts on low-income populations (10 minutes)
  • Overview of barriers to algorithmic justice, including lack of transparency, automation bias, and the existing federal and state data privacy regime (10 minutes)
  • Discussion of strategies for bringing transparency to algorithmic systems (10 minutes)
  • Discussion of strategies for litigating algorithmic harms (10 minutes)
  • Suggestions for additional learning and available resources (5 minutes)

Who Should Attend:
Legal aid and legal services advocates, nonprofit lawyers, pro bono attorneys and allied professionals representing low-income people in civil legal systems.

  • CLE Credit: 1 Total Credit (1 Professional Practice)