Subject Categories
Harvard
- General Management
- Marketing
- Entrepreneurship
- International Business
- Accounting
- Finance
- Operations Management
- Strategy
- Human Resource Management
- Social Enterprise
- Business Ethics
- Organizational Behavior
- Information Technology
- Negotiation
- Business & Government Relations
- Service Management
- Sales
- Economics
- Teaching & the Case Method
Latest Cases
- A practical guide to SEC financial reporting and disclosures for successful regulatory crowdfunding
- Quality shareholders versus transient investors: The alarming case of product recalls
- The Health Equity Accelerator at Boston Medical Center
- Monosha Biotech: Growth Challenges of a Social Enterprise Brand
- Assessing the Value of Unifying and De-duplicating Customer Data, Spreadsheet Supplement
- Building an AI First Snack Company: A Hands-on Generative AI Exercise, Data Supplement
- Building an AI First Snack Company: A Hands-on Generative AI Exercise
- Board Director Dilemmas: The Tradeoffs of Board Selection
- Barbie: Reviving a Cultural Icon at Mattel (Abridged)
- Happiness Capital: A Hundred-Year-Old Family Business's Quest to Create Happiness
Generative AI Value Chain
Content Outline
Generative AI refers to a type of artificial intelligence (AI) that can create new content (e.g., text, images, or audio) in response to a prompt from a user. ChatGPT, Bard, and Claude are examples of text-generating AIs, while DALL-E, Midjourney, and Stable Diffusion are examples of the image-generating variety. During training, a generative AI learns the underlying structure of the desired output by absorbing a mass of relevant data, such as all of the books in the public domain or petabytes of text scraped from across the internet. Once trained, generative AIs work by producing outputs that recreate, with calculated variation, the underlying patterns learned in training. As of 2023, all of these types of generative AI were created through a similar process.

At the core of any generative AI system is the model, a mathematical representation of patterns that forms the basis of 'knowledge' for the system. The structure of the model is determined by its architecture, the theoretical organization of parameters in the artificial neural network that the system uses to generate its outputs. To learn, the model relies on a mountain of training data, a collection of examples relevant to the task the model is being trained to perform. During an initial pre-training process, the model learns to adjust its parameter weights (whose arrangement is fixed by the architecture), improving its prediction quality over many iterations; the model is then further refined through a fine-tuning process. Training an AI system requires specialized hardware, such as GPUs in data centers, which consumes enormous amounts of electricity to handle heavy, massively parallel computational loads.
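The training process described above can be made concrete with a small sketch. The code below is not from the case; it is a minimal, illustrative example (assuming PyTorch, a toy character-level architecture, and toy stand-in corpora) of how pre-training adjusts a model's parameter weights over many iterations and how a shorter fine-tuning pass then refines the same model on task-specific data.

```python
# Minimal sketch of pre-training followed by fine-tuning.
# All names, sizes, and corpora here are illustrative assumptions, not the case's content.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Training data": toy corpora standing in for internet-scale text (assumption).
pretrain_text = "the quick brown fox jumps over the lazy dog " * 50
finetune_text = "the customer is always right " * 50

vocab = sorted(set(pretrain_text + finetune_text))
stoi = {ch: i for i, ch in enumerate(vocab)}

def encode(text):
    return torch.tensor([stoi[ch] for ch in text], dtype=torch.long)

class TinyLM(nn.Module):
    """Architecture: embedding -> GRU -> linear head over the vocabulary."""
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)

def train(model, data, steps, lr, seq_len=32):
    """Adjust parameter weights over many iterations to improve next-token prediction."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
        x = data[i:i + seq_len].unsqueeze(0)          # input sequence
        y = data[i + 1:i + seq_len + 1].unsqueeze(0)  # next-token targets
        logits = model(x)
        loss = loss_fn(logits.view(-1, logits.size(-1)), y.view(-1))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return loss.item()

model = TinyLM(len(vocab))
print("pre-training loss:", train(model, encode(pretrain_text), steps=500, lr=1e-2))
print("fine-tuning loss: ", train(model, encode(finetune_text), steps=100, lr=1e-3))
```

In this sketch the same gradient-descent loop serves both phases; only the data, the number of iterations, and the learning rate change, which mirrors the abstract's distinction between large-scale pre-training and a smaller, targeted fine-tuning pass.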