
Excerpt from Creating a Learning Culture: Strategy, Practice, and Technology. Marcia Conner and James G. Clawson (editors). Cambridge University Press; September 2004.

Chapter 1

Leading and Learning with Nobody in Charge

Harlan Cleveland (1918–2008), political scientist and public executive, was president emeritus of the World Academy of Art and Science. He served as a United Nations relief administrator in Italy and China, a Marshall Plan executive, a magazine editor and publisher, assistant secretary of state, and U.S. ambassador to NATO. In academia he was twice a dean and once a university president (the University of Hawaii). He wrote hundreds of magazine and journal articles and was author or coauthor of 12 books on executive leadership and international affairs. His last book was Nobody in Charge: Essays on the Future of Leadership (Jossey-Bass, 2002). He earned his bachelor’s degree from Princeton University and was a Rhodes Scholar.

If we raise our periscopes for a 360-degree look around, we see that the pyramids and hierarchies of years past are rapidly being replaced with networks and uncentralized systems.

In these systems, larger numbers of people than ever take initiative, make policy, collaborate to point their organizations’ ways forward, and work together to release human ingenuity and maximize human choice. These people’s actions are not, for the most part, the result of being told what to do. They are the consequence, not of command and control, but of consultation, of relationships that are intermixed, interwoven, and interactive.

This is the state of affairs that led me to describe the most advanced form of human organization as a nobody-in-charge system. That phrase, which became a book title, was not wholly tongue-in-cheek; it was a way of describing the style of leadership that was already a strong trend as we moved into the twenty-first century.

In the last quarter of the twentieth century, this trend was driven by the sudden convergence of ever faster, more retentive computers with rapidly spreading, increasingly wider-band telecommunications—a dynamic complexity that gets more dynamic and more complex with each passing year.

It is clear by now, as only a few futurists were forecasting in the 1970s, that information is the world’s dominant resource, taking the role that has been played successively in history by such physical resources as labor, stone, bronze, minerals, metals, and energy.

But information—refined by rational thinking into knowledge, converted by both intuition and reasoning into wisdom—is fundamentally different from all its predecessors. Consider these five propositions. Information is not necessarily depletive: it expands as it’s used. It is readily transportable, at close to the speed of light—or, by telepathy and prayer, even faster than that. It leaks so easily that it is much harder to hide and to hoard than tangible resources ever were; it cannot be owned (only its delivery service can). The spread of information, converted into knowledge, empowers the many by eroding the influence that once empowered the few who were “in the know.” And giving or selling information is not an exchange transaction; it’s a sharing transaction. 

These deceptively simple propositions, as they sink in around the world and down the generations, require new kinds of learning in every intellectual discipline and the rethinking of every inherited tradition. The same is true, with special emphasis, of future leadership in organizations. Organizations are essentially products of the mind and spirit, expressions of what is thought, imagined, and believed about relationships among people, and thus a rich source of relearning experiences.

A Personal Note

I came to my own relearning experience from a lifetime as a public executive—in the federal government, magazine publishing, university administration, U.S. diplomacy, and international organizations—and, over the same span of time, as a political scientist trying to capture and record what I was learning from experience about organizations and leadership.

I had often observed, for example, that large organizations needed to be loosely structured in order to work at all. The bosses of totalitarian governance, whether fascist or communist, never came to terms with this axiom; their rigidities seemed to lead to their downfall. 

Coming of age in the U.S. government, I often felt like a kind of entrepreneur in the bureaucratic jungle. As I studied its fancies and foibles from inside and outside for half a century, I came to realize that pyramids are not the natural form of organization—as cultures long submissive to monarchs or emperors had evidently come to believe. 

Late in the eighteenth century, the leaders of the 13 American colonies not only declared their independence in human-rights language that reads pretty well in the twenty-first century. They also drafted a Constitution that departed dramatically from the pyramids of power and oppression the colonists had learned to despise. Indeed, they created the basis for a nobody-in-charge society—quite literally a unique experiment in uncentralized governance.

The separation of powers, with its checks and balances, was designed to deny any part of our federal government the chance to make too much yardage at the expense of the other parts—and of the people it was supposed to serve. The federal system itself was designed to create a continuous tussle between the states and the central government. The tussle was intended to be permanent; no part of the system was ever supposed to win.

Looking back on this in 2000, I realized that this way of thinking might well have global implications in the new century. It is not just the durability of their extraordinary invention that testifies to the founders’ wisdom. It is clear from the record they left that they—at least, the deepest thinkers among them, James Madison and Thomas Jefferson—knew just how unprecedented was the system they were proposing to build. The people were really supposed to be sovereign. Jefferson still believed this even after his eight years of trying, as President from 1801 to 1809, to be their “servant leader.”

“I know of no safe depository of the ultimate powers of the society but the people themselves,” Thomas Jefferson wrote to a friend in 1820, “and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion.”

What’s truly astonishing is that now, at the beginning of this new century, the practical prospect for a workable world seems to lie in reinventing their nobody-in-charge concept for global application.

The real-life management of peace worldwide seems bound to require a Madisonian world of bargains and accommodations among national and functional factions, a world in which people are able to agree on what to do next without feeling the need (or being dragooned by some global government) to agree on religious creeds, economic canons, or political credos. A practical pluralism, not a unitary universalism, is the likely destiny of the human race.

The Twilight of Hierarchy

In the century to which we just said goodbye, we learned again and again that complex social systems work badly if they are too centralized. Styles of leadership are undergoing a seismic shift, nearly everywhere, from top-down vertical relationships toward horizontal, consensual, collaborative modes of bringing people together to make something different happen.

The complexities of modern life, and the interconnectedness of everything to everything else, mean that in our communities, our nations, and our world, nobody can possibly know enough to be in general charge of anything important or interesting. This state of affairs is becoming more apparent with each passing year. It may be one reason why, more and more, the “followers”—especially university students and educated adults—seem so often to come forth with policy judgments while their established “leaders” are still making up their minds. 

That isn’t the way it was when physical resources were dominant. When the few had access to key resources and the many did not, there never seemed to be enough to go around. This made possible—perhaps even necessary—the development of hierarchies of five kinds: hierarchies of power based on control (of new weapons, of new transport vehicles, of trade routes, of markets, of communications, and even of knowledge, back when secrets could be secure); hierarchies of influence based on secrecy; hierarchies of class based on ownership; hierarchies of privilege based on early access to particular pieces of land or currently valuable resources; and hierarchies of politics based on geography.

Each of these five bases for hierarchy and discrimination began crumbling in the waning years of the twentieth century—because the old means of control were of dwindling efficacy, secrets were harder and harder to keep (as the CIA and the White House relearned every few weeks), and ownership, early arrival, and geography were of declining importance in accessing, remembering, analyzing, and using the knowledge and wisdom that are the truly valuable legal tender of our time.

Drift Toward Uncentralization

The drift toward uncentralized systems began to take hold of our destiny in the second half of the twentieth century. Just below the surface in every kind of organization, something important was happening, something very different from the vertical practice—recommendations up, orders down—of both public administration and business management. The “bright future for complexity,” foretold in a 1927 New Yorker piece by E. B. White, had come to pass, prodded and speeded by the modern miracles of information technology. 

This was not a temporary aberration from some centralized norm. It was happening because information had recently become the world’s dominant resource. With every generation of information technology—that is, every two or three years—our future becomes more uncentralized. This has to be good news for individual creativity and invention, for personal freedom, for human choice. 

The twilight of hierarchy means that we need new kinds of leaders. The new era requires leaders who are nonstop learners and will eagerly share what they learn. It requires leaders who learn, early and often, how to fuse chaos and order in uncentralized systems.

The century just past thus gave rise to a dichotomy between how organizations are described and how they actually work. Many organizations, for instance, still look like pyramids from a distance; but both their internal processes and their external relations feature much less order-giving and much more consultation and consensus. The sheer complexity of what has to get done—by governments and corporations, and by their myriad contractors, subcontractors, and nonprofit critics and cheerleaders—means that huge numbers of people exercise independent judgment and consult with each other and with outsiders; they don’t just do as they’re told.

Naturally, the search has been on for alternatives to centralization as an organizing concept. The first and seemingly obvious candidate was decentralization. It turned out, however, that most of the central administrators who opted to decentralize found, to their satisfaction, that this was a new way to preserve hierarchy. If things were becoming so complicated that grandpa could no longer understand it all, he could still subdivide and parcel out the work to be done—while hanging onto central control with more and more creative accounting systems. 

Decentralization thus became an aspect, indeed a subhead, of centralization. The real opposite of centralization is of course uncentralization. Decentralizing is arranged from the top, by delegation of authority. Uncentralization features, indeed encourages, imaginative initiative and entrepreneurship from all members of an organization, whatever their hierarchical rank. Mao Tse-tung played with this idea for a time; he called it “many flowers blooming.” He then pulled back when it became clear that if China’s government really permitted the free exercise of opinion and initiative, the Communist Party’s central control would be the first casualty.

Despite the trend toward looser, less hierarchical organizational systems, for most twentieth-century people the image of good organization was still a pyramid. In corporations, organization charts were drawn following Max Weber’s model of bureaucracy. Nonprofit agencies usually did likewise; they assumed that organizations making a profit must be doing something right. 

In government, the pyramid’s top tier was typically staffed by political executives, with serried ranks of civil servants—servants expected to be civil to politicians—arranged in the lower tiers. 

Organized religion had likewise developed hierarchical trappings—that’s what “organized” was taken to mean. Holy men (and in some denominations, grudgingly, holy women) were in the pulpit; affluent laypersons served as middle managers; parishioners in the pews were expected to be religious but not self-organized. Labor unions, despite their more egalitarian vocabulary, often had the look and feel of pyramids. And so did many social service agencies—though few went so far as the Salvation Army did in using military titles and uniforms. 

The marriage of computers and telecommunications multiplied the speed of financial speculation, business transactions, military operations, political dissidence, and humanitarian activity, and extended their reach to global range. The widening access to information about what’s happening, about who is doing what to whom and when and where, brought into financial markets and business decisions and military strategy and political protest and even humanitarian relief a host of kibitzers, lobbyists, and second-guessers who knew so much—or could readily discover it on the Internet—that they had to be taken into account.

There are still, to be sure, distinctions between organizations where the style of management is looser and more collegial and others where recommendations mostly go up and orders mostly come down. But by the end of the twentieth century, all kinds of organizations—from military platoons to urban hospitals—were moving away from vertical administration toward more consultative styles of operation.

The Nature of Uncentralization

Uncentralized systems feature personal initiative, voluntary cooperation, joint ventures, committee work, and networking. Their workways are reinforced by the rapid progress of information technology and its impact on everything from preschool education to the understanding of our universe. 

Very large uncentralized systems, many of them global in scale, based on massive information outputs and widespread feedback, were developed in the twentieth century. Global information systems unimaginable before the marriage of computers and telecommunications—currency and commodity markets, epidemic controls, automatic banking, worldwide credit cards, airline and hotel reservation systems, global navigation guidance, and the World Weather Watch come readily to mind—already seem normal, almost routine.

It is no accident of history that each of these systems grew from the propensity of ambitious leaders to think hard about how the spread of knowledge could enable more and more people to solve problems by organizing information in imaginative ways—in other words, leading by learning.

In all these cases, there are commonly agreed standards, plus a great deal of uncentralized discretion. The same is true, even more true, of the international foreign exchange market and the Internet, now the world’s two most pervasive nobody-in-charge systems. Their common standards so far are mostly technical. Ethical standards for global human behavior await the social inventors of the twenty-first century.

It’s in the nature of uncentralized organization that every participant must be continuously in learning mode. It’s also natural that those who learn the most, and learn most rapidly, emerge as leaders. And part of what they learn is the necessity to teach their colleagues (regardless of rank) about what they’re doing together and how and, especially, why.

What is less certain, and most important as complexity increases, is how we develop our capacity to educate a growing proportion of our population to direct toward human needs and purposes our extraordinary talent for scientific discovery, our unexampled capacity to convert scientific insights into useful technologies, our bent toward doing what’s never been done before. We’ll need a rapidly growing cadre of get-it-all-together professionals, educated in integrative thinking.

Aptitudes and Attitudes

The spread of knowledge greatly influences the way people in modern organizations work together—working with rather than for each other. 

The executive leaders of the future will, I think, be marked by a set of attitudes and aptitudes that seem to be necessary for the leadership of equals, the key to the administration of complexity. They will be more reflective practitioners than the executives of the past. They will be low-key people, with soft voices and high boiling points. They will show a talent for consensus and a tolerance for ambiguity. They will have a penchant for unwarranted optimism. And they will find private joy in complexity and change. 

The work of executives often consists of meeting a series of unforeseeable obstacles on the road to an objective that can be clearly specified only when it has nearly been met. They try to imagine the unforeseen by posing contingencies and asking themselves how their organization systems would adjust if these chances arose. Of course, the planned-for contingency never happens; something else happens instead. The planning therefore does not produce a usable plan but something more precious: people better trained to analyze the unpredicted and to winnow out for the decision makers (who are almost always plural) the choices that would be too costly to fudge or postpone. 

This sort of system requires the participating experts and staff assistants to understand what it is like to be an executive leader, how it feels to frame a decision that will stick. But it also demands that the decision makers themselves participate in the staff work, try to understand the expert testimony, measure the options and filter the imagined consequences of each through their best computers, which are their own brains. 

Even in collective research or policy making by committee, the breakthrough ideas often turn out to be the product of one person’s advance brooding, reading, consulting, and learning—of someone’s sudden inspiration that assembles in a usable pattern the random data and partial reasoning of others. Anyone who has worked with organized systems has to be impressed with the capacity of the human brain to cope with complexity. Viewed as a sensitive computer not limited to quantified bits, the brain is able to take in a wide range of observations, weigh them according to their multiple relevance, store them in a memory of fantastic dimensions, retrieve them with high speed and reasonable accuracy, organize them into options, come up with a practical course of action, and transmit instructions to other parts of the body in a fraction of a second. 

An organization system is by definition too ramified for any one executive’s mind to encompass. But leaders can focus on the relations among its parts and its people, and they can concentrate their executive energy on the parts that don’t fit together and on the relationships that are not working well.

In the information-rich environment of the twenty-first century, the leader must therefore be reflective, not just by training but also by temperament. The leader who isn’t learning all the time, personally plowing through the analysis and trying to figure out what it means, is not making decisions but merely presiding while others decide. My experience has taught me that the obligation to think hard, fast, and free is the one executive function that can neither be avoided nor delegated to others. 

Your personality, your winning smile, your sexiness, or your attractive voice may seem persuasive leadership assets. But it’s by thinking and imagining that you can decide where you want to go, and persuade others to come along.

Mutual Adjustment

If all organizations are—slowly or rapidly—becoming nobody-in-charge systems, how will anything get done? How will we get everybody in on the act and still get some action? 

We will do it, I think, by minimizing, and clearly defining, what everybody must agree on—common norms and standards—and in all other matters maximizing each participant’s opportunity and incentive to use his or her common sense, imagination, and persuasive skills to advance the organization’s common purpose. This requires learning all the time.

It also requires, of course, that those who are going to pursue an organization’s purpose together be openly consulted not only about how they will pursue it but also about the purpose itself. Wisdom about uncentralized systems thus starts with a simple observation: most of what each of us does from day to day does not happen because someone told us to do it but because we know it needs to be done.

When you walk along a city street, you don’t collide with other pedestrians; you, and they, instinctively avoid bumping into each other. To generalize: any human system that works is working because nearly all of the people involved in it cooperate to make sure it works. 

Political scientist Charles Lindblom called this “mutual adjustment”: in a generally understood environment of moral rules, norms, conventions, and mores, very large numbers of people can watch each other, then modify their own behavior just enough to accommodate the differing purposes of others, but not so much that the mutual adjusters lose sight of where they themselves want to go. 

Imagine a large clump of people on either side of a busy downtown intersection, waiting for the traffic light to change before crossing the street. There is macro discipline here. The convention of the red light means the same thing—stop—to all the participants in this complexity, even though there is no physical barrier to violating the norm. Then the light turns green. It would be theoretically possible, with the help of a sizable staff of computer analysts, to chart in a central micro plan the passageway for each pedestrian to enable him or her to get to the other sidewalk without colliding with any other pedestrian. But not even the most totalitarian systems have tried to plan in such detail. 

What works is mutual adjustment: somehow those two knots of people march toward each other, and there are no collisions. Each person adjusts to the others, yet all reach their objective—a positive-sum game if there ever was one.

What enables mutual adjustment to work is the wide availability of opportunities to learn from relevant information—so each mutual adjuster can figure out what others might do under varied conditions and give forth useful signals about his or her own behavior.
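
The logic is simple enough to simulate. As a purely illustrative sketch (in Python; the crossing width, avoidance radius, step size, and keep-to-your-own-side habit are all assumptions of mine, not anything Cleveland specifies), consider two crowds that share exactly one macro norm, the green light, plus a local habit of edging aside:

    import random

    WIDTH = 20.0   # distance between the two curbs
    RADIUS = 1.5   # how close counts as "about to collide"
    STEP = 0.5     # forward progress per tick

    class Walker:
        def __init__(self, y, direction):
            self.x = 0.0 if direction > 0 else WIDTH   # start on a curb
            self.y = y                                 # sidewise position
            self.direction = direction                 # +1 east, -1 west

        def step(self, others):
            # Local rule of thumb: if someone heading the other way is
            # close, edge aside (each crowd shifts to its own side by
            # shared habit, not on anyone's orders), then keep walking.
            for other in others:
                if (other.direction != self.direction
                        and abs(other.x - self.x) < RADIUS
                        and abs(other.y - self.y) < RADIUS):
                    self.y += 0.6 * self.direction
            self.x += self.direction * STEP

    def crossing(n=30, ticks=60):
        # Two crowds face each other; the light stays green for the
        # whole run, and no central plan assigns anyone a passageway.
        walkers = [Walker(y=random.uniform(0.0, 10.0), direction=d)
                   for d in (+1, -1) for _ in range(n)]
        collisions = 0
        for _ in range(ticks):
            for w in walkers:
                w.step([o for o in walkers if o is not w])
            # count head-on near misses that local adjustment missed
            for i, a in enumerate(walkers):
                for b in walkers[i + 1:]:
                    if (a.direction != b.direction
                            and abs(a.x - b.x) < 0.2
                            and abs(a.y - b.y) < 0.2):
                        collisions += 1
        return collisions

    if __name__ == "__main__":
        print("collisions:", crossing())   # 0 in practice

No one computes anyone's path, yet the count comes out zero in this toy: one common standard plus local adjustment does all the coordinating, Lindblom's mutual adjustment in miniature.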

Perhaps the best current example of mutual adjustment at work is the Internet—at least on a good day. People all over the world are exchanging information, images, music, and voice messages, with so little regulation that their “commerce” is often noncommercial—in effect, a multilateral barter system. Most of their transactions are not essentially exchanges but sharing arrangements. Where there are rules of behavior, they are increasingly arrived at by consensus among the participants, or at least ratified in action by those who will be guided by them. 

That doesn’t mean the rule-abiding citizens are serfs, doing some lord’s bidding; there’s no lord around. If the rules work, it’s because nearly all those who need to abide by them are motivated to comply because the rules make sense to them.

The Value of Integrative Thinking

Civilization is rooted in compromise—between democracy and authority, between a free-market economy and a caring society, between openness and secrecy, between vertical and horizontal relationships, between active and passive citizenship. The required solvent for civilization is respect for differences. The art is to be different together.

Civilization will be built by cooperation and compassion, in a social climate where people of different groups can deal with each other in ways that respect their cultural differences. “Wholeness incorporating diversity” was philosopher John Gardner’s succinct formulation. The legend on U.S. currency is even shorter, perhaps because it’s in Latin: E pluribus unum (“from many, one”). Helping the many think of themselves as one, selling wholeness that can incorporate diversity, will be a central challenge for many different kinds of leaders in the twenty-first century.

When nobody can be in general charge, and some self-selected subset of everybody is partly in charge, the notion of educating for leadership morphs into educating for citizenship. In the upside-down pyramid, where the people ultimately do make the policy, leadership is continuous dialogue—not acts but interactions between those who lead and those who follow, the leaders and followers often being different mixes of citizens, depending on what is up for decision.

Learning is thus the drivewheel of organizational transformation in the informatized society. With information now the world’s dominant resource, the quality of life in our communities and our leadership in the world depend on how many of us (and which of us) get educated for the new knowledge environment—and how demanding, relevant, continuous, broad, and wise (not merely knowledgeable) that learning is. Integrative learning—learning how to get it all together—has to be the essence of education for leadership.

We are born with naturally integrative minds. I suspect that a newborn baby knows from the start, by instinct, that everything is related to everything else. Before the child is exposed to formal education, its curiosity is all-embracing. The child hasn’t yet been told about the parts and so is interested in the whole.

Children and young students are not shocked to learn that everything is related to everything else, that their destiny is somehow mixed up with the fate of the other six billion people (so far) with whom they share a vulnerable planet. It’s only later in life, after they have been taught about the world in vertical slices of knowledge, by different experts in separate buildings in unrelated courses of study, that they lose track of how it all fits together.

That’s why children ask more “why?” questions than anybody. It’s quite possible for even young children to learn to think in systems. They live with interdependence every day—in families and homerooms and the local public park, which is a very complex ecological system. The ambience of mutual dependence, the ambiguities of personal relations, and the conflicting ambitions of groups are the stuff of socialization from our earliest years.

If they’re encouraged to practice integrative thinking from their earliest years, the children who become leaders can tackle with less diffidence the Cheshire Cat’s first question: “Where do you want to get to?”

Everyone seems to know that “out there in the real world,” all the problems are interdisciplinary and all the solutions are interdepartmental, interprofessional, interdependent, and international. 

But the more we learn, ironically, the less tied together is our learning. It’s not situation-as-a-whole thinking; it’s the separation of the specialized kinds of knowledge that (like racial prejudice) must be “carefully taught.”

Jasmina Wellinghoff, a Twin Cities scientist and writer, wrote about her daughter:

When my six-year-old learned that we heat the house with forced air, she immediately wanted to know who is forcing the air, where natural gas comes from, and how it got stuck underground. After I did my best to explain all this, came the next question: ‘If we didn’t have natural gas, would we die in the winter?’ There you have it. Geology, engineering, physics, and biology, all together in a hierarchy of concepts and facts. She ended up studying the structure of the earth’s crust, combustion, hydraulics, and the classification of living beings—all in different years and quarters, neatly separated, tested, and graded.

Our institutions—including schools and colleges—start with a heavy bias against breadth. For a while it was a useful bias: the secret of the scientific revolution’s success was not breadth but specialized depth. Chopping up the study of physical reality into vertically sliced puzzles, each to be deciphered separately by different experts using different analytical chains of reasoning (“disciplines”), made possible the modern division and specialization of labor.

But one thing led to another, as E. B. White thought it would (“Have you ever considered,” he wrote in the 1920s, “how complicated things can get, what with one thing always leading to another?”). The resulting complexity now makes it imperative that these differing analytical systems be cross-related in interdisciplinary thinking and coordinated action. Those who would lead must therefore learn to think integratively.

A New Core Curriculum?

The trouble is that schools and colleges, and especially graduate schools, are geared more to categorizing and analyzing the patches of knowledge than to stitching them together—even though the people who learn how to do that stitching will be the leaders of the next generation. What should we be helping them learn, for this purpose, during the years they are full-time learners?

Most of us who are now parents or grandparents were not exposed early and often to a tangle of cultures, currencies, conflicts, and communities. To our schoolchildren from now on, learning about these complexities should be routine. But that will require a new emphasis on integrative thinking in our schools, our higher education, our popular culture, and at that ultimate educational institution, the family dinner table.

What we need now is a theory of general education that is clearly relevant to life and work in a context whose dominant resource is information—a rapidly changing scene in which uncertainty is the main planning factor. 

Perhaps, in the alternating current of general and job-oriented education, it is time for a new synthesis, a new “core curriculum”—something very different from Columbia’s World Civilization, Syracuse’s Responsible Citizenship, or Chicago’s Great Books, yet still a central idea about what every educated person should know, and have, and try to be. 

Such a curriculum is not going to have much to do with learning facts. It is said that each half hour produces enough new knowledge to fill a 24-volume edition of the Encyclopedia Britannica. But even if that much data could now be put on a single optical disk, that still would make it accessible only to those who already know what they are looking for. Besides, most of the facts children now learn in school are unlikely to be true for as long as they can remember them. The last time I took physics, in the 1930s, I was told the atom couldn’t be split. That information has not served me well in the nuclear era.

What budding leaders need above all are rechargeable batteries of general theory with which they can creatively process the shifting “facts” they encounter in a lifetime of experience. If we think hard about the requirements of the new knowledge environment and consult the instincts and perceptions of our own future-oriented students, I believe we could construct a new core curriculum from such elements as these:

Education in integrative brainwork: the capacity to synthesize for the solution of real-world problems the analytical methods and insights of conventional academic disciplines. (Exposure to basic science and mathematics, to elementary systems analysis, and to what a computer can and cannot do is part, but only a part, of this education.) 

Education about social goals, public purposes, the costs and benefits of openness, and the ethics of citizenship to enable each prospective leader to answer for himself or herself two questions: “Apart from the fact that I am expected to do this, is this what I would expect myself to do?” and “Does the validity of this action depend on its secrecy?” 

A capacity for self-analysis: the achievement of some fluency in answering the question, “Who am I?” through the study of ethnic heritage, religion and philosophy, art and literature.

Some practice in real-world negotiation, the psychology of consultation, and the nature of leadership in the knowledge environment.

A global perspective and an attitude of personal responsibility for the general outcome—passports to citizenship in an interdependent world.

Uncentralized Leadership — A Checklist

How to conceive, plan, organize, and lead human institutions in ways that best release human ingenuity and maximize human choice is one of the great conundrums of the century ahead. Long-ago philosophy and recent history provide useful hints for leaders—the people who bring other people together in organizations to make something different happen. Here, by way of summary, are some hints from my own experience.

No individual can be truly “in general charge” of anything interesting or important. That means everyone involved is partly in charge. How big a part each participant plays will depend on how responsible he or she feels for the general outcome of the collective effort, and what he or she is willing to do about it.

Broader is better. The more people affected by a decision feel they were consulted about it, the more likely it is that the decision will stick.

Looser is better. The fewer and narrower are the rules that everyone must follow, the more room there is for individual discretion and initiative, small-group insights and innovations, regional adaptations, functional variations. Flexibility and informality are good for workers’ morale, constituency support, investor enthusiasm, customer satisfaction.

Planning is not “architecture,” it’s more like fluid drive. Real-life planning is improvisation on a general sense of direction, announced by a few perhaps, but only after genuine consultation with the many who will need to improvise on it.

Information is for sharing, not hoarding. Planning staffs, systems analysis units, and others whose full-time assignment is to think shouldn’t report only in secret to some boss. Their relevant knowledge has to be shared, sooner rather than later, with all those who might be able to use it to advance the organization’s purpose. (Some years ago Japanese auto companies—advised by a genius engineer from Michigan—started sharing much more information on productivity with workers on the assembly lines. Small groups of workers on the factory floor, reacting to that information, were able to think up countless little changes that increased speed, cut costs, improved quality, and enhanced productivity. Quite suddenly, Japanese autos became globally supercompetitive.)

Uncentralized Systems — A Checklist

It may also be helpful to sum up—and thus oversimplify—the rationale for the uncentralized systems that seem likely to be more and more characteristic of the post-post-modern era now ahead of us.

In order for any complex activity to run in an uncentralized manner, there have to be some rules of the game (like standards).

These rules need to be adopted through a participatory or representative process so that nearly all the “followers” will feel they have been part of the “leadership.”

Until the rules become shared doctrine, there needs to be some interim authority—the policeman at a new urban intersection, the foreman in an industrial process, the guru in an ashram, a parent in a family—to remind everybody about the rules.

In time, the rules become internalized standards of behavior—and the resulting community doesn’t need anybody to be “in charge.” Procedural reminders can be mostly automated.

The rules are then learned in families and schools, by adult training and experience, and by informal (but effective) peer pressure.

In every well-functioning market, most of those involved in the myriad transactions are able to buy when they want to buy and sell when they want to sell, precisely because no one is in charge, telling them what to do. The discipline is instead provided by wide and instant knowledge of the prevailing price of whatever is sought or offered. Modern information technologies have made this knowledge spread possible on a global scale.
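
That discipline can be sketched in a few lines of code. The model below is purely illustrative (the linear demand and supply curves and the small adjustment rate are stock textbook assumptions, not anything from this chapter); the point is only that the price settles although nobody sets it:

    def demand(price):
        # Buyers, each acting alone, want less as the price rises.
        return max(0.0, 100.0 - 2.0 * price)

    def supply(price):
        # Sellers, each acting alone, offer more as the price rises.
        return max(0.0, 3.0 * price - 20.0)

    def discover_price(price=1.0, rate=0.05, ticks=200):
        # Nobody dictates the quote. Each round, the widely known
        # price drifts toward whatever clears the market: up when
        # buyers outnumber sellers, down when sellers outnumber buyers.
        for _ in range(ticks):
            excess = demand(price) - supply(price)
            price += rate * excess
        return price

    if __name__ == "__main__":
        print(f"settled price: {discover_price():.2f}")   # settles at 24.00

The curves cross where 100 - 2p = 3p - 20, that is, at p = 24, and the loop drifts there from any reasonable starting quote; the only "discipline" in the model is that every trader can see the current price.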

The uncentralized way of thinking and working naturally becomes more complicated as civilization moves from the small homogeneous village to large multicultural societies, and beyond that to the governance of communities in cyberspace. But there is evidently a path from the need for standards through the practice of consensus and the constituting of interim authorities (whose mandate is to work themselves out of their interim jobs), to patterns of naturally cooperative human behavior.

It’s a path that may become universally valid for organized human effort, however complex. Once upon a time, it seems to have required centuries and even millennia for human societies to find their way along a path without precedent. But everything else is speeded up these days. Maybe, once we can trace the path, our capacity to build uncentralized organizations will also be greatly accelerated—if we keep learning.

In any event, the motivation of men and women in organizations to keep learning—and their willingness to try what’s never been done before—will be the priceless ingredient of progress in the uncentralized systems of the twenty-first century.

© Creating a Learning Culture: Strategy, Practice, and Technology. Marcia Conner and James G. Clawson, editors. (Cambridge, UK: Cambridge University Press, July 2004). Reprinted here with permission.