Efficiency Reimagined: The AI Edge 

Behind the Scenes: LLMs, Bias, and User Implications

This session of The AI Edge is led by Zora Rush, a sociotechnical AI linguist who designs and writes test environments for leading AI-enabled products. She supports a leading corporation in the industry, renowned for pioneering some of the world's most influential artificial intelligence innovations. In her role, she develops resources that help engineers understand and classify harmful content, and she creates best practices for evaluating content harms in AI products. She will give a behind-the-scenes look at her work on bias mitigation for a major generative AI application and explain how this work relates to her previous experience as a linguist, an educator, and an assessment developer.

 

Subscribers receive access to the live, applied webinar, unlimited access to recordings, and membership in our closed discussion community, where you can share what you learn with fellow researchers!

Our emphasis on applied use ensures you can immediately implement what you learn. In each live training, we walk you through the material step by step, with plenty of time for Q&A and support.

Topics covered in this webinar:

  • Overview of LLMs and generative AI
  • Bias in LLMs and the products they power
    • What bias looks like in AI
    • Why LLMs are biased
    • How responsible corporations are working to mitigate bias in LLMs
    • Persistent challenges with bias in LLMs
    • Strategies for evaluating information for bias
    • Implications for the industry and education
  • Resource handout:
    • Helpful links for learning more about generative AI
    • Questions to ask when considering your needs for LLM-enabled products
    • Questions to ask when buying an LLM-enabled product
  • Q&A Session

Sign up below for the session that works best for your schedule! Enrollment in each session is limited to allow plenty of time for discussion and questions as we go.

Registration fee: $29

 

Wednesday, November 1

12:00 noon ET/9:00 am PT

Enroll in this session