This is an MIT Sloan Management Review article. The way we think about strategy is woefully incomplete, the authors contend. The traditional focus on the positioning of products (or services) underplays much of what makes a company truly competitive. Not only does it give short shrift to what a company knows, it ignores entirely the fact that in today's dynamic economy, organizations must continually reinvent who they are and what they do, in ways large and small. One important means of doing so is through innovation. An effective strategy, then, comprises three key components: product/market, knowledge and innovation positions. But even a company that masters each of these three positions independently remains at risk; only when all three are aligned and mutually reinforcing can a strategy succeed. In adopting the notion of alignment, organizations need to view each position -- product/market, knowledge and innovation -- as an aspect of the organization's overall strategy. Creating an integrated strategy thus requires focusing not on each position separately, but on all three simultaneously. The authors introduce the notion of competing based not only on what an organization makes or the service it provides, but on what it knows and how it innovates. Each aspect represents a competitive position that must be evaluated relative to the capabilities of the organization and to others battling for the same space in the marketplace. And each component must not only be aligned with the other two, but also adjusted as circumstances warrant. When done correctly, organizations -- such as Buckman Laboratories, which is profiled here -- thrive. When done badly, a company can suffer, perhaps fatally, as the history of Polaroid shows.
By using standard components, Compaq's supercomputer line, known as the AlphaServer SC, attempted to avoid some of the risk associated with supercomputer development. However, more than five years after unveiling its first supercomputer, known as Turbozilla, Compaq's High Performance Technical Computing (HPTC) Group had yet to make a profit, despite rapidly growing demand. The bidding process for large government contracts typically resulted in discounts of up to 70 per cent, leaving little, if any, margin. More important, some Compaq managers had become anxious that HPTC systems were consuming significant company resources: Alpha servers previously allocated to high-margin commercial customers were increasingly being diverted to zero-margin supercomputers. The case highlights, stage by stage, the process of developing a new product development organization. In doing so, it demonstrates that an important capability and competitive advantage for effective knowledge transfer in global new product development is the ability to construct a network of relationships that are built over time, that lead to the creation of social capital, and that are strong and extensive enough to inhibit duplication by others. Supplement Compaq High Performance Computing (B), product 9B03M042, is available.
Modine Manufacturing operates primarily in a single product category: the manufacture and sale of heat transfer equipment. A major customer announced the cancellation of an agreement with Modine to develop a key engine component, which automotive companies needed in order to make their engines comply with new emissions guidelines set to take effect in several years. The expectation that the government would relax those guidelines was believed to have prompted the cancellation. Would other customers do the same? Considerable resources had been spent on the project, with the result that projects that would normally be handled by the new product development area were being developed outside that unit. The vice-president of technical services must analyse the company's new product strategy to determine its effectiveness in developing new products and decide what model the company should use for product development.
This note discusses the role of supercomputing in nuclear weapons research, from the first UNIVAC system installed in 1953 to the planned installation of a 100 TeraOPS system at Lawrence Livermore National Laboratory in 2004. Topics include military and civilian uses for supercomputers, the evolution toward parallel processing, the growing importance of open source software such as Linux, and emergent scientific uses such as genetic sequencing and AIDS research. Traditionally, supercomputers were employed to simulate complex processes that unfolded over a specific period of time; a nuclear explosion, for example, had a beginning (detonation) and an end. Increasingly, however, applications shifted toward analyzing immense databases containing a seemingly endless number of variables. Despite significant advances in technology, even the most powerful computers available still cannot reliably predict the weather a week ahead in any significant detail.