• Move Fast, but without Bias: Ethical AI Development in a Start-up Culture (A)

    This case set is part of the Giving Voice to Values (GVV) curriculum. To see other material in the GVV curriculum, please visit http://store.darden.virginia.edu/giving-voice-to-values. Taylor is a senior product manager at Catalise, a start-up that develops artificial intelligence (AI) diagnostic technology for mental health disorders. When Taylor is promoted to this position, her first project is to manage the launch of Catalisten, AI-based software that diagnoses major depressive disorder by analyzing patients' speech patterns. Company leaders expect Catalisten to be the company's new blockbuster product and expedite its launch to ensure a competitive advantage. As Taylor onboards onto the Catalisten team, she learns that the product, which is nearly complete, misdiagnoses female patients at significantly higher rates than male patients. When Taylor raises the concern to a team member, she is pushed to "bury" the algorithm's discrepancy to protect the product launch timeline. In this A case, Taylor's challenge is to convince Catalise's chief product officer to delay Catalisten's launch to address the AI software's gender bias. This case set is intended for use at the MBA level in courses in business innovation, entrepreneurship, engineering, technology product development, tech ethics, and leadership and ethics. It could be taught to advanced undergraduates who have developed a foundation in GVV, ethical decision-making, or computer science.
  • Move Fast, but without Bias: Ethical AI Development in a Start-up Culture (B)

    This case set is part of the Giving Voice to Values (GVV) curriculum. To see other material in the GVV curriculum, please visit http://store.darden.virginia.edu/giving-voice-to-values. Taylor is a senior product manager at Catalise, a start-up that develops artificial intelligence (AI) diagnostic technology for mental health disorders. When Taylor is promoted to this position, her first project is to manage the launch of Catalisten, AI-based software that diagnoses major depressive disorder by analyzing patients' speech patterns. Company leaders expect Catalisten to be the company's new blockbuster product and expedite its launch to ensure a competitive advantage. As Taylor onboards onto the Catalisten team, she learns that the product, which is nearly complete, misdiagnoses female patients at significantly higher rates than male patients. When Taylor raises the concern to a team member, she is pushed to "bury" the algorithm's discrepancy to protect the product launch timeline. In the A case, Taylor's challenge is to convince Catalise's chief product officer to delay Catalisten's launch to address the AI software's gender bias. In this B case, students follow Taylor as she approaches her chief product officer about delaying Catalisten's launch and addressing the product's bias. This case set is intended for use at the MBA level in courses in business innovation, entrepreneurship, engineering, technology product development, tech ethics, and leadership and ethics. It could be taught to advanced undergraduates who have developed a foundation in GVV, ethical decision-making, or computer science.
  • Programming a "Fairer" System: Assessing Bias in Enterprise AI Products (A)

    This case is part of the Giving Voice to Values (GVV) curriculum. To see other material in the GVV curriculum, please visit http://store.darden.virginia.edu/giving-voice-to-values. In this case, Timothy Brennan is the founder and CEO of technology company Northpointe, Inc. (Northpointe), and the creator of its flagship software program, COMPAS, an artificially intelligent software tool for US court systems that predicts a defendant's likelihood of reoffending and informs bail, parole, and probation sentencing decisions. Brennan originally created COMPAS to standardize decision-making within the criminal justice system and to reduce the likelihood of human error or bias affecting court rulings. However, years after COMPAS's public release and widespread adoption within US court systems, an investigative journalism report claims that COMPAS is more likely to mislabel Black defendants as higher risk, and White defendants as lower risk, of recidivism. To complicate the matter, any coding adjustments that Northpointe might make to uncover or address the bias-causing programming could reduce the software's performance or reveal sensitive operational information to competitors. In this A case, Brennan's challenge is to organize a response to investigate bias within the COMPAS software while still protecting the complexity and intellectual property of the product. In the B case, students read a synopsis of Brennan's actual response and review its implications for Northpointe and the US criminal justice system. They are encouraged to consider how Brennan could have responded more creatively and constructively.
  • Programming a "Fairer" System: Assessing Bias in Enterprise AI Products (B)

    This case is part of the Giving Voice to Values (GVV) curriculum. To see other material in the GVV curriculum, please visit http://store.darden.virginia.edu/giving-voice-to-values. In this case, Timothy Brennan is the founder and CEO of technology company Northpointe, Inc. (Northpointe), and the creator of its flagship software program, COMPAS, an artificially intelligent software tool for US court systems that predicts a defendant's likelihood of reoffending and informs bail, parole, and probation sentencing decisions. Brennan originally created COMPAS to standardize decision-making within the criminal justice system and to reduce the likelihood of human error or bias affecting court rulings. However, years after COMPAS's public release and widespread adoption within US court systems, an investigative journalism report claims that COMPAS is more likely to mislabel Black defendants as higher risk, and White defendants as lower risk, of recidivism. To complicate the matter, any coding adjustments that Northpointe might make to uncover or address the bias-causing programming could reduce the software's performance or reveal sensitive operational information to competitors. In the A case, Brennan's challenge is to organize a response to investigate bias within the COMPAS software while still protecting the complexity and intellectual property of the product. In this B case, students read a synopsis of Brennan's actual response and review its implications for Northpointe and the US criminal justice system. They are encouraged to consider how Brennan could have responded more creatively and constructively.