In October 2018, Christie's auction house sold an artwork called Portrait of Edmond Belamy, produced by an artificial intelligence (AI) model built to generate new works after being trained on thousands of paintings. Nearly four years later, Jason Allen submitted a piece to the "digital art/digitally manipulated photography" division of an art competition associated with the 2022 Colorado State Fair. The submission relied heavily on a generative AI program called Midjourney. Unlike the system behind Portrait of Edmond Belamy, which required collecting works of art and training an AI model, Midjourney and other modern text-to-image generators simply accept a text description (or prompt) of an image and produce an image reflective of that description. These examples point toward the capabilities of AI image generators but also raise several questions: How do these image generators even work? How did the system used to create Portrait of Edmond Belamy evolve into text-to-image AI art generators like Midjourney? And given the rise of these tools, what are some of the artistic, societal, and legal implications of AI art generators?

Regardless of one's perspective on these questions, it is hard to argue that art-generating AI models will not find their way into our lives to some degree. By understanding more about the technology behind image-generating tools and the different perspectives on them, managers can be better prepared to navigate these tools' uses and effects, including their risks. This preparation will prove helpful as innovative ideas and technological advancements are introduced to more industries, sparking more conversations between decision-makers with a range of perspectives. And while AI art and the questions it raises are interesting, the money is in the tools: rather than programs like AI art generators designed to process certain inputs and produce certain outputs, business managers across industries will look toward the broader tools for which generative AI art has laid a foundation.
The Internet of Things (IoT) can be generally described as a network of objects that collect data about their surroundings and transfer it across the internet to various systems. While more information can improve a business's data-driven decision-making, should all businesses invest in IoT? This technical note explores IoT technologies, discusses how they are being used in industry, weighs some advantages and disadvantages, offers examples and applications, and investigates how IoT might fit into a firm's digital strategy.
A digital twin can be described as a digital model of a real object. But what makes it different from a prototype? How are digital twins being used in business settings outside manufacturing? How do they relate to machine learning and augmented reality? When digital twins meet their fullest potential, they enable businesses to optimize operations by managing their assets more efficiently, generating higher productivity, and reducing costs. This technical note discusses some potential benefits and drawbacks of digital twins; offers examples in aviation, car manufacturing, and agriculture; and helps managers identify where implementing the technology may be useful.
Logistic regression is a modeling technique often used to predict a binary variable: a variable coded as 1 if an event of interest occurs (e.g., a borrower defaults on a loan) and coded as 0 otherwise. This note details how logistic regression applies the logistic function to generate a probability forecast for a binary event. It also includes an example of how to fit a logistic regression model to loan default data using StatTools (an Excel add-in). The StatTools output is then used to predict a loan's default as a function of the borrower's credit score.
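The note's model is fit in Excel with StatTools, but the core idea is easy to sketch in a few lines of code. The coefficients below are hypothetical placeholders, not values from the note; a fitted model would supply its own intercept and slope.

```python
import math

def default_probability(credit_score, intercept=8.0, slope=-0.02):
    """Logistic-regression probability forecast for a binary event.

    The intercept and slope here are hypothetical; in practice they
    come from fitting the model (e.g., with StatTools) to loan data.
    """
    z = intercept + slope * credit_score   # linear predictor
    return 1.0 / (1.0 + math.exp(-z))      # logistic function maps z to (0, 1)
```

With a negative slope on credit score, the sketch behaves as the note describes: a borrower with a lower score receives a higher predicted probability of default, and every forecast stays between 0 and 1.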
As companies increasingly rely on data and analytics to make decisions and improve their operations, it becomes clear that data quality can be a major issue. The insights gleaned from data analytics are useful only if the data is of high quality. This note summarizes some common issues that can result in low-quality data and explores the crucial task of data engineering. This note is taught at Darden in the second-year elective, "Digital Operations." It would also be suitable in a course or module covering data analytics.
This technical note provides a general overview of cloud computing and how businesses are using it to achieve their digital goals. Specifically, this note outlines the different service models that cloud computing supports, including Infrastructure-as-a-Service, Platform-as-a-Service, and Software-as-a-Service. The note also details some of the advantages and disadvantages of cloud computing, and it reviews the cloud computing industry and some of its top players.
This technical note details virtual and augmented reality and how these immersive technologies can be used in business. Specifically, this note highlights several business applications of both virtual and augmented reality, describing each technology's advantages and disadvantages. Various companies have uncovered new and interesting applications using these technologies, ranging from assisting distribution-center workers to improving learning and development. This note is taught at Darden in the second-year course, "Digital Operations." It would also be suitable in a course focused on technology strategy covering virtual and augmented reality or a module on the metaverse.
This technical note offers an overview of artificial intelligence (AI) and some ways in which machine learning is leveraged to build this type of intelligence. The note also explores two schools of thought about AI's potential, discusses the possible roles for AI in business, highlights opportunities for collaboration between employees and AI, and considers one approach to responsible AI implementation. This note is taught at Darden in the second-year Digital Operations course. It would also be suitable in a module covering AI.
This technical note examines two common methods for estimating customer demand using historical data observations that are constrained by availability. First, this note explains how to leverage the averaging method for unconstraining the demand of airline ticket bookings. Next, the expectation-maximization algorithm is introduced using historical product sales data. Both examples rely on fictitious data, and an accompanying Excel workbook provides example calculations for both methods highlighted in this technical note.
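The note's calculations live in the accompanying Excel workbook, but one simple version of the averaging method can be sketched directly. The idea: observations recorded while a flight was sold out understate true demand, so each constrained observation is replaced by the average of the unconstrained observations whenever that average is larger. The data below is fictional and the exact variant here is an assumption; the note may use a different refinement.

```python
def unconstrain_by_averaging(bookings, constrained):
    """One simple form of the averaging method for unconstraining demand.

    `bookings` and `constrained` are parallel lists: the observed bookings
    and a flag marking observations censored by availability (sold out).
    Each constrained value is replaced by the mean of the unconstrained
    observations, if that mean exceeds what was observed.
    """
    open_obs = [b for b, c in zip(bookings, constrained) if not c]
    avg_open = sum(open_obs) / len(open_obs)
    return [max(b, avg_open) if c else b
            for b, c in zip(bookings, constrained)]
```

For example, with fictional bookings of 100 and 120 on open flights and two sold-out observations of 80, the open-flight average (110) replaces both censored values.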
Google Cloud Platform offers BigQuery ML, a popular cloud computing resource for developing machine-learning models. This note provides information about creating, evaluating, and deploying models with BigQuery ML.
Machine-learning (ML) models have become a common tool used across a multitude of industries to help people make decisions. As these models have increased in predictive power, many have also grown in complexity. The pursuit of more accurate predictions has diminished the interpretability of many models, leaving users with little understanding of a model's behavior or trust in its predictions. The field of eXplainable artificial intelligence (XAI) seeks to encourage the development of interpretable models. Google Cloud Platform offers two Explainable AI functions in BigQuery ML that allow users to examine the attribution of model features, which aids in model behavior verification and bias recognition. One of the functions available in BigQuery provides a global perspective on the features used to train the model, while the second function examines local feature attribution associated with individual predictions in more detail. This note offers an overview of Explainable AI in BigQuery ML, using as an example a (fictional) realtor's linear regression model that predicts a home's latest sale price based on predictor variables such as the total tax assessment from the year of the last sale, the square footage of the house, the number of bedrooms, the number of bathrooms, and whether the condition of the home is below average. After training the linear model, the feature attribution can be studied from a global and local perspective in BigQuery.
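For a linear model, the global/local distinction the note draws can be illustrated without BigQuery at all. A common convention (and an assumption here, not a claim about BigQuery ML's exact method) is to score a feature's local attribution for one prediction as coefficient × (feature value − baseline value), and its global importance as the mean absolute local attribution across many rows. The coefficients, baselines, and home records below are all fictional.

```python
def local_attributions(coefs, row, baseline):
    """Per-prediction (local) attribution for a linear model:
    each feature contributes coef * (value - baseline value)."""
    return {f: coefs[f] * (row[f] - baseline[f]) for f in coefs}

def global_attributions(coefs, rows, baseline):
    """Dataset-level (global) importance: mean absolute local
    attribution for each feature across all rows."""
    return {f: sum(abs(coefs[f] * (r[f] - baseline[f])) for r in rows) / len(rows)
            for f in coefs}
```

With a fictional price model where an extra square foot adds $100 and an extra bedroom adds $5,000, a 2,000-square-foot home scored against a 1,500-square-foot baseline gets a local square-footage attribution of +$50,000, even if its global importance across the whole dataset looks different.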
Google Cloud Platform (GCP) offers BigQuery, a popular cloud computing resource for writing Structured Query Language (SQL) queries. This note provides step-by-step information about accessing BigQuery and using it to upload and explore data. It also includes a quick-start guide.
Tableau is powerful visualization software that supports the data analysis process. The program allows users to create custom visualizations from datasets of all sizes and types using a simple drag-and-drop interface. This note introduces new users to Tableau by providing guidance on connecting to data, exploring the interface, and creating a number of common visualizations.
Google Cloud Platform (GCP) offers Cloud Storage, a popular cloud computing resource for data storage. This note provides information about uploading data to Cloud Storage, creating BigQuery tables from data found in Cloud Storage, previewing files in Cloud Storage, and sharing Cloud Storage buckets.
This exercise explores customer transaction data generated from a business owner's website and illustrates the added benefit of basic data analytics practices used to uncover business insights. Students discover the business's purchase trends when they answer the provided questions by writing SQL queries. The questions guide students to determine which product offerings the business should promote and which customer segments to target. Additionally, the exercise discusses the common relational database design that is often associated with transactional data and its metadata.
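The queries students write are of the "which products and segments drive revenue?" variety. A minimal sketch, using an in-memory SQLite table with an invented schema and invented transactions (the exercise's actual dataset and column names may differ):

```python
import sqlite3

# Hypothetical stand-in for the case's website transaction data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (product TEXT, segment TEXT, amount REAL)")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", [
    ("mug", "retail", 12.0), ("mug", "retail", 12.0),
    ("poster", "wholesale", 30.0), ("mug", "wholesale", 12.0),
])

# The kind of query the exercise asks for: revenue by product, highest first.
rows = conn.execute("""
    SELECT product, SUM(amount) AS revenue
    FROM transactions
    GROUP BY product
    ORDER BY revenue DESC
""").fetchall()
```

Grouping by `segment` instead of `product` answers the companion question of which customer segments to target; in a real relational design, product and customer attributes would typically live in separate tables joined to the transactions table by keys.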