Creating a Culture of Ideas
Nicholas Negroponte says expertise is overrated. To build a nation of innovators, we should focus on youth, diversity, and collaboration.
Innovation is inefficient. More often than not, it is undisciplined, contrarian, and iconoclastic; and it nourishes itself with confusion and contradiction. In short, being innovative flies in the face of what almost all parents want for their children, most CEOs want for their companies, and heads of state want for their countries. And innovative people are a pain in the ass.
Yet without innovation we are doomed, by boredom and monotony, to decline. So what makes innovation happen, and just where do new ideas come from? The basic answers (providing a good educational system, encouraging different viewpoints, and fostering collaboration) may not be surprising. Moreover, the ability to fulfill these criteria has served the United States well. But some things, the nature of higher education among them, will have to change in order to ensure a perpetual source of new ideas.
One of the basics of a good system of innovation is diversity. In some ways, the stronger the culture (national, institutional, generational, or other), the less likely it is to harbor innovative thinking. Common and deep-seated beliefs, widespread norms, and behavior and performance standards are enemies of new ideas. Any society that prides itself on being harmonious and homogeneous is very unlikely to catalyze idiosyncratic thinking. Suppression of innovation need not be overt. It can be simply a matter of people’s walking around in tacit agreement and full comfort with the status quo.
A very heterogeneous culture, by contrast, breeds innovation by virtue of its people, who look at everything from different viewpoints. America, the so-called melting pot, is seen by many as having no culture (with either a capital C or a lowercase c). In rankings of students in industrial countries, U.S. high school students come across as average, at best, in reading, mathematics, and science. And unfortunately, the nation is unrivaled in gun-related crimes among young people. Yet, looking back over the past century, the United States has accounted for about a third of all Nobel prizes and has produced an unmatched outpouring of innovations, from factory automation to the integrated circuit and gene splicing, that are the backbone of worldwide economic growth.
I see two reasons for this. One is that we do not stigmatize those who have tried and been unsuccessful. In fact, many venture capitalists are more, not less, likely to invest in somebody who has failed with an earlier startup than in someone who is launching his or her first company. The real disappointment is when people do not learn from their mistakes.
The other reason is that we are uniquely willing to listen to our young. In many cultures, age carries too much weight: experience is rewarded over imagination, and respect shades into deference. In some cultures, people are given jobs on the basis of age, creating a stagnant environment that stifles the young. Remember the saying “Children should be seen and not heard”? Well, look at the economic growth created by such “children” as Bill Gates and Michael Dell, to name just two.
That’s the good news. But when it comes to nurturing our youth, we have to do better. I am especially concerned about early education, which can (and usually does) have a profoundly negative effect on creativity. In the race to understand what children learn, we are far too enthusiastic about celebrating their successes. What is more fascinating is what children get wrong. Even the concept of “wrong” deserves some attention: the wind is not made by leaves flapping, as some children guess, but that theory is inventive enough that it should not be dismissed out of hand. In fact, disassembling erroneous concepts is one of the best ways to find new ideas. The process is akin to debugging a computer program and has almost nothing to do with drill and practice (which is once again becoming a cornerstone of schooling).
Our biggest challenge in stimulating a creative culture is finding ways to encourage multiple points of view. Many engineering deadlocks have been broken by people who are not engineers at all, simply because perspective is more important than IQ. The irony is that perspective will not get kids into college, nor does it help them thrive there. Academia rewards depth. Expertise is bred by experts who work with their own kind. Departments and labs focus on fields and subfields, now and then adding or subtracting a domain. Graduate degrees, not to mention tenure, depend upon tunneling into truths and illuminating ideas in narrow areas.
The antidote to such canalization and compartmentalization is to be interdisciplinary, a term that is at once utterly banal and, in advanced studies, a name for an almost impossible goal. Interdisciplinary labs and projects emerged in the 1960s to address big problems spanning the frontiers of the physical and social sciences, engineering, and the arts. The idea was to unite complementary bodies of knowledge to address issues that transcended any one skill set. Fine. Only recently, however, have people realized that interdisciplinary approaches can bring enormous value to some very small problems and that interdisciplinary environments stimulate creativity in their own right. By maximizing the differences in backgrounds, cultures, ages, and the like, we increase the likelihood that the results will not be what we had imagined.
Two additional ingredients are needed to cultivate new ideas, and both have to do with maximizing serendipity. First, we need to encourage risk. This is particularly hard in midcareer, and it often flies in the face of peer review and the mechanisms of corporate advancement, simply because risk, on its own, can look pretty stupid. People who look around corners are exposed to failure and ridicule, and thus they must find buoyancy, or support, within their own environment. If they don’t, counterintuitive ideas will remain just that.
The second ingredient is encouragement of openness and idea sharing, another banality that is nearly impossible to achieve. At the digital bubble’s peak, being open about ideas was particularly hard for computer scientists, because people saw riches coming from not sharing them. Students would withhold ideas until after graduation. As one person held his or her cards close, another followed suit, and as a result many research labs declined in value and effectiveness. In this regard, thank God the bubble has burst.
Not so many years ago, Bell Labs conducted so much research it could easily house some very high-risk programs, including the so-called blue-sky thinking that led to information theory and the discovery of the cosmic microwave background radiation. But the world benefited, and sometimes AT&T did too.
Now, Bell Labs is a shadow of its former self, subdivided several times through AT&T’s 1984 divestiture and the company’s subsequent 1996 split into Lucent, NCR, and the parent firm. And it is not alone. As the economy sags and companies trim their expenses, some of the first cuts fall on high-risk or open-ended research programs. Even if the research budget does not drop, projects tend to become more developmental than truly innovative. If the trend continues, we will eventually suffer a deficit of new ideas. Already, fewer and fewer big corporations are focusing on new ideas, and the formation of startups has slowed almost to a standstill.
More than ever before, in the new “new economy,” research and innovation will need to be housed in places with parallel agendas and multiple means of support. Universities, suitably reinvented to be interdisciplinary, fit this profile because their other “product line,” besides research, is people. When research and learning are combined, far greater risks can be taken, and the generation of ideas can afford to be less efficient. Right now, only a handful of U.S. universities constitute such “research universities.” More will have to become so, and universities worldwide will have to follow.
Industry can outsource basic research, just as it does many other operations. That means innovation has to become a precompetitive phenomenon, something Japan understood in the early 1980s, when its Ministry of International Trade and Industry (now the Ministry of Economy, Trade, and Industry) funded Japanese companies’ collaboration on robotics, artificial intelligence, and semiconductor manufacturing. While this approach does not always work, it can be far more effective than most companies assume. Costs are shared, different viewpoints are nourished, and innovation stands a chance of survival in even the worst economic times.
The ability to make big leaps of thought is a common denominator among the originators of breakthrough ideas. Usually this ability resides in people with very wide backgrounds, multidisciplinary minds, and a broad spectrum of experiences. Family influences, role models, travel, and living in diverse settings are obvious contributors, as are educational systems and the way cultures value youth and perspective. As a society, we can shape some of these; some we can’t. A key to ensuring a stream of big ideas is to accept these messy truths about their origin and to keep rewarding innovation and celebrating emerging technologies.