Subject Categories
Harvard
- General Management
- Marketing
- Entrepreneurship
- International Business
- Accounting
- Finance
- Operations Management
- Strategy
- Human Resource Management
- Social Enterprise
- Business Ethics
- Organizational Behavior
- Information Technology
- Negotiation
- Business & Government Relations
- Service Management
- Sales
- Economics
- Teaching & the Case Method
Latest Cases
- A practical guide to SEC financial reporting and disclosures for successful regulatory crowdfunding
- Quality shareholders versus transient investors: The alarming case of product recalls
- The Health Equity Accelerator at Boston Medical Center
- Monosha Biotech: Growth Challenges of a Social Enterprise Brand
- Assessing the Value of Unifying and De-duplicating Customer Data, Spreadsheet Supplement
- Building an AI First Snack Company: A Hands-on Generative AI Exercise, Data Supplement
- Building an AI First Snack Company: A Hands-on Generative AI Exercise
- Board Director Dilemmas: The Tradeoffs of Board Selection
- Barbie: Reviving a Cultural Icon at Mattel (Abridged)
- Happiness Capital: A Hundred-Year-Old Family Business's Quest to Create Happiness
Google Cloud Platform: BigQuery Explainable AI
Abstract
Machine-learning (ML) models have become a common decision-support tool across a multitude of industries. As these models have grown in predictive power, many have also grown in complexity. The pursuit of more accurate predictions has diminished the interpretability of many models, leaving users with little understanding of a model's behavior and little basis for trusting its predictions. The field of explainable artificial intelligence (XAI) seeks to encourage the development of interpretable models. Google Cloud Platform offers two Explainable AI functions in BigQuery ML that let users examine the attribution of model features, which aids in verifying model behavior and recognizing bias. One function provides a global perspective on the features used to train the model, while the other examines in more detail the local feature attributions associated with individual predictions. This note offers an overview of Explainable AI in BigQuery ML, using as an example a (fictional) realtor's linear regression model that predicts a home's latest sale price from predictor variables such as the total tax assessment from the year of the last sale, the square footage of the house, the number of bedrooms, the number of bathrooms, and whether the condition of the home is below average. After the linear model is trained, its feature attributions can be studied in BigQuery from both a global and a local perspective.
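As a minimal sketch of how the two BigQuery ML functions described above are typically invoked, the following SQL trains a linear regression model like the one in the note and then queries its feature attributions. The dataset, table, and column names (realtor_demo.home_sales, realtor_demo.new_listings, last_sale_price, and so on) are hypothetical placeholders and not taken from the note itself; ML.GLOBAL_EXPLAIN and ML.EXPLAIN_PREDICT are the BigQuery ML Explainable AI functions the note refers to.

```sql
-- Train a linear regression model on historical home sales.
-- enable_global_explain = TRUE is required so that model-level
-- attributions can later be queried with ML.GLOBAL_EXPLAIN.
CREATE OR REPLACE MODEL `realtor_demo.home_price_model`
OPTIONS (
  model_type = 'LINEAR_REG',
  input_label_cols = ['last_sale_price'],
  enable_global_explain = TRUE
) AS
SELECT
  total_tax_assessment,
  square_footage,
  num_bedrooms,
  num_bathrooms,
  below_average_condition,
  last_sale_price
FROM `realtor_demo.home_sales`;

-- Global perspective: aggregate feature attributions for the trained model.
SELECT *
FROM ML.GLOBAL_EXPLAIN(MODEL `realtor_demo.home_price_model`);

-- Local perspective: per-row feature attributions for new predictions,
-- returning the top contributing features for each input row.
SELECT *
FROM ML.EXPLAIN_PREDICT(
  MODEL `realtor_demo.home_price_model`,
  (SELECT * FROM `realtor_demo.new_listings`),
  STRUCT(5 AS top_k_features));
```

In this sketch, the global query summarizes which features mattered most across the training data, while ML.EXPLAIN_PREDICT reports, for each new listing, how much each feature pushed that individual prediction away from the model's baseline value.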