Eugenia Kuyda launched Replika AI in 2017 as an empathetic digital companion to combat loneliness and provide emotional support. The platform surged in popularity during the COVID-19 pandemic, offering non-judgmental support to isolated users. By 2023, Replika boasted 10 million users, with 40% engaging in romantic partnerships with their Replikas. Kuyda's strategy of prioritizing users' emotional well-being over engagement metrics was successful: 85% of users reported improved mood after interacting with the app. However, as users ventured into more intimate conversations with the AI, Kuyda faced a pivotal decision: Should she restrict these exchanges or embrace them as the platform's natural evolution? The choice was complicated by the need to safeguard Replika's brand and prevent misuse while satisfying its diverse user base. The case explores the ethical boundaries of AI-human relationships and the evolving role of AI in addressing loneliness. It is suitable for courses in technology, entrepreneurship, AI, and ethics.
In March 2024, Anthropic, a leading AI safety and research company, made headlines with the launch of Claude 3, its most advanced AI model. The launch marked Anthropic's bold entry into the multimodal GenAI domain, showcasing capabilities that extended to both image and text analysis. Co-founded by former OpenAI employees, Anthropic aimed to be at the forefront of generative AI innovation. The broader AI landscape had seen technologies like ChatGPT transition from niche applications to mainstream tools, sparking global discussions about their potential impact. Established as a Public Benefit Corporation, Anthropic prioritized public good alongside financial returns. The company emphasized aligning technological progress with human values, driven by concerns over AI's potential for harm without robust safety mechanisms. Anthropic's cautious strategy, including delaying the release of an earlier version of Claude to ensure appropriate safety protocols, contrasted with that of competitors such as OpenAI, whose release of ChatGPT had triggered an AI arms race. As a company with aggressive growth targets and a 75x revenue multiple, Anthropic had to balance its foundational safety mission against the demands of commercial success. OpenAI's experience with the replacement of its board had demonstrated the importance of governance and the risks of misaligned values within a company. Did Anthropic's corporate structure effectively guard against profit-driven incentives that could compromise safety? As AI models became more powerful, what tools should Anthropic develop and share to prevent harm?
This is the first of a three-case series that explores the challenges faced by Uniswap, a key player in the decentralized finance (DeFi) sector. The case traces Uniswap's rapid growth under founder Hayden Adams from a simple idea inspired by a Reddit post to one of the leading decentralized exchanges in Web3, with trading volumes peaking at $6 billion in the summer of 2020. The case zeroes in on a critical moment when Uniswap is threatened by a "vampire attack" from SushiSwap, a rival that forks Uniswap's open-source code to launch a competing service, threatening to drain its liquidity. Adams faces the dilemma of how to respond to the rise of SushiSwap. The case delves into the competitive dynamics of the DeFi ecosystem, the challenges of open-source software, and strategic responses to market disruptions.
Target Malaria, a non-profit research consortium, is exploring the application of CRISPR-Cas9 gene editing technology to combat malaria in Sub-Saharan Africa. Its approach uses gene drives, a revolutionary tool, to suppress the population of malaria-carrying mosquitoes. Although gene drives are 5-10 years from being tested in the wild, Target Malaria has pursued a strategy of staged implementation along a thoughtful, highly regulated, and lengthy pathway. The case describes the complexity and technical intricacies of gene drive technology, the stakeholder and community engagement process, the ecological and ethical risks of releasing genetically modified organisms into the wild, and the regulatory structure. Since a gene drive has the potential to alter not just a single organism but an entire species, the case raises critical questions central to the deployment of transformative technologies in public health: How can the global community govern technologies whose effects transcend national borders? What are the potential long-term ecological impacts, and how can they be mitigated? How should the risks and benefits of a technology like gene drives be balanced, given that malaria kills hundreds of thousands of people, mostly children, every year?
In November 2023, the board of OpenAI, one of the most successful companies in the history of technology, decided to fire Sam Altman, its charismatic and influential CEO. The decision shocked the corporate world and left people wondering why OpenAI had designed a governance structure that made such a decision possible. In the previous year, the company had introduced ChatGPT, the fastest-growing app in history, and achieved a valuation of almost $90 billion. Altman had become the public face of AI and was instrumental in making that remarkable progress possible. Over five chaotic days, the company went through three CEO changes, saw 90 percent of its employees come close to moving to Microsoft, and watched five of the six original board members resign and be replaced by two new members. The extraordinary, high-stakes saga brought together a combustible mixture of idealism, capitalism, and power, set against competing worldviews of the promise and peril of the AI revolution. The case traces the history of modern AI, OpenAI's groundbreaking developments, its struggle with the dual forces of commercial success and ethical responsibility, and, finally, its dramatic leadership upheaval. By highlighting the challenges of balancing the advance of AI technology with protecting humanity's interests, the case offers a comprehensive exploration for educators in leadership, strategy, technology, ethics, and governance.
In a world where attention is a scarce commodity, this case explores the meteoric rise of TikTok, an app that transformed from a niche platform for teens into the most visited domain by 2021, surpassing even Google. Its algorithm was a sophisticated mechanism for capturing and holding user attention: "When you gaze into TikTok, TikTok gazes into you." TikTok elevated the design of algorithmic systems to a new level and outperformed U.S. tech giants in user engagement. The app faced a host of challenges, including a complicated relationship with the Chinese government, legislative scrutiny in the U.S., and global concerns over data security, but grew unabated. How did TikTok design and operate a system that determined each person's preferences across culturally different countries? Is TikTok a threat to democracy in countries like India and the U.S.? Were the potential harms caused by TikTok any different from those caused by other social media companies? The case provides educators with an opportunity to discuss the dynamics of global digital platforms, data security, cultural impact, and the geopolitical implications of technological advancements. It also covers the design of the TikTok algorithm and the importance of process and organization in making the algorithm effective.
Verve Therapeutics, a public biotech company based in Boston, created a novel approach to addressing cardiovascular disease (CVD), a leading cause of death globally. The company's approach was a single-shot treatment to permanently lower cholesterol, thus reducing the risk of heart attacks. Built on decades of post-doctoral and lab research led by CEO Sekar (Sek) Kathiresan, a trained cardiologist and academic, Verve used gene editing, akin to a molecular surgical procedure, with curative intent. Not only had the medicine reached human trials in record time, but Verve had also incorporated new innovations that could allow the technology to be used more widely. The company built a solid syndicate of investors and raised a total of $860 million. Unlike other gene editing or gene therapy companies that focused on rare diseases affecting small populations, Verve's approach was the first example of a gene editing treatment that could potentially benefit millions of people. Verve's lead investor was interested in creating Verve 2.0 and applying the company's expertise to cure a range of rare metabolic diseases. Should Sek continue to build out the core product aimed at treating heart disease, or should he apply the technology to other, adjacent diseases? Would doing so be a distraction from Verve's core mission?
From the early days of the internet, Taiwan had a vibrant community of civic hackers and open-source programmers who engaged with social issues. Audrey Tang was one of them. She spearheaded the 2014 Sunflower Student Movement in Taiwan, in which protestors peacefully occupied the Parliament to demand greater transparency around a proposed trade deal with mainland China. As Taiwan's Digital Minister, Tang had been at the forefront of Taiwan's development as a digital democracy that leveraged information technology and citizen participation. Tang referred to democracy as a "social technology." She believed that democracy, like any technology, could be improved by people, and she experimented with ideas to make democracy work better. She held that openness and transparency created mutual trust between the public and the government and allowed for collective action. In Taiwan, hackers were seen as partners to the government. Digital technology was used to solicit ideas, build consensus, assess public sentiment, and address both domestic and international misinformation. In early 2020, Tang faces a potentially growing COVID-19 pandemic and an explosion of misinformation. How can Tang control both the pandemic and the infodemic while retaining the government's principles of cooperation and transparency?
This is part of a three-case series that follows Dulcie Madden's journey as a founder over five years. Case (A) is about managing growth and cash flow; Case (B) is about the exit decision and conditions on a sale; Case (C) shows Madden dealing with adversity and the choice of "when to quit?"
This is part of a three-case series that follows Dulcie Madden's journey as a founder over five years. Case (A) is about managing growth and cash flow; Case (B) is about the exit decision and conditions on a sale; Case (C) shows Madden dealing with adversity and the choice of "when to quit?" Rest Devices was an early-stage company started by a team from MIT. Dulcie Madden decided to drop out of her MBA program at MIT to join Rest Devices as CEO and use its innovative sensor technology to create a line of smart clothing for babies. A partnership with Babies "R" Us became a strong catalyst for growth but left Rest Devices increasingly constrained by the additional capital required to deliver the promised number of units. As Madden tried to balance the need for capital with the business pressures of delivering a product for a nationwide rollout, she was faced with a series of strategic choices involving an unsolicited acquisition offer, a lawsuit, a failed financing, and tensions within the team.
Crafting a Founders' Agreement is an important component of startup infrastructure, as it documents the complex set of decisions that form a company's roots. Its four key elements are roles and responsibilities; rights (decision rights, rewards, position on the board); commitments (time invested, IP, capital contributions); and contingencies (vesting, stock restrictions, etc.). The note shows how to initiate a conversation with a co-founder and provides a checklist of questions that can serve as a guide for drafting the Founders' Agreement together. An agreement is a framework, not a set of rules carved in stone, and should be created with flexibility and conditionality. The process of creating and revising the agreement can bring clarity and commitment to the founding team and enable them to anticipate conflicts before they hurt the business. A sample agreement is included.
The case examines the focus of an early-stage company and how venture capital can distort a founder's view. It encompasses issues such as financing, founders' definitions of success and failure, defining and pivoting a business model, the organizational impact of a pivot, and the role of VCs and boards in shaping company strategy. In 2008, Jason Jacobs, a fitness and technology enthusiast, created RunKeeper, an iPhone app that tracked a runner's distance, speed, calories, and route. In its initial years, RunKeeper was a fast-growing, profitable company that did not need to spend the $1.5 million it had raised in its Seed and Series A rounds. As RunKeeper gained momentum, Jacobs crafted a grand health vision (the Health Graph) that would increase his chances of securing VC funding. Heralded as the "Facebook of Fitness," RunKeeper set out to become the one-stop location for all of a consumer's important health information. Although the company raised $10 million, the next few years were turbulent. RunKeeper became allergic to revenue, ramped up its burn, and tried to pursue both the running app and the Health Graph, but did neither well. At the end of the case, the company is almost out of cash, and Jacobs has exhausted his prospects for raising external capital. He needs to return to his current investors to keep the company afloat. Jacobs' instinct suggests that refocusing the company on its core product and runner base would be the best way forward. However, the last round was raised on the promise of a big health vision. Jacobs wonders whether his current investors would fund a smaller vision and how onerous the terms would be. Would they push Jacobs to pursue a sale in an over-crowded health app market? Or would they decide that he was not the right person to lead the company?