Cult of the Expert
Democracy was not a product of the Enlightenment; democratic and participatory principles are much older than that, and probably date back to hunter-gatherer social habits. What the Enlightenment did produce is the rational method, whose great success is due to the greatly increased span of control it makes possible. The rationalist revolution in a nutshell: big, simple ideas used to organize and control large, complex, dynamic environments. Rational structures allow you to design and organize large, complex entities like factories and societies. This application of rational structure celebrates its greatest success in its contribution to the industrial revolution. Industrial production, and the enormous social changes it has produced, is the great showcase for the merits of rational structure's dominance in our society. So we like our rational structures. They were an appropriate answer to a world dominated by irrational prejudice and abuse of power by the leading institution of the day, the Catholic Church. They have brought us enormous leaps in standards of living and a better-educated population, which in turn led to a more democratic society. What could possibly be wrong here?
Like many successful systems, our rational structures possess inherent properties that can backfire. The very success of a system is often caused by the same aspects that eventually cause its demise. Rational structures divide things up. By pigeonholing reality into small, manageable chunks, the structure achieves its ability to control and stabilize large systems. Control, stability, and efficiency are core values; they provide the raison d'être for any organization based on principles of rationality and structure. To ensure its own sustainability, a rationally structured organization uses strict rules: everything outside and passing through the system has to be categorized rigidly and processed consistently and predictably.
What happens if such a system or structure runs into something it has not encountered before? It cannot recognize the new object as such. The system's integrity demands that the new fact be categorized according to existing categories, thus getting the stamp of recognition that enables the system to digest it. Of course, in the process the real meaning of the new fact is lost; its newness and uniqueness are erased. This is one of the reasons that truly rationally structured organizations cannot learn.
Another reason has to do with the role of knowledge within rationally structured organizations. These organizations are typically structured as a hierarchy. The relative strength of positions within the hierarchy is determined by how much knowledge relevant to other levels and positions can safely be controlled. The knowledge you have, or have access to, becomes one of your most important instruments for fulfilling your agenda. Controlling knowledge means power, hence the profoundly rationalistic adage "Knowledge is Power". But this is a relative kind of power, meaningful only within the structure. Sharing or withholding knowledge are ways to consolidate positions of power. One might just as easily restate the principle as "Secrecy is Power", for that implicit meaning is more compatible with actual practice. This leads to extreme forms of knowledge fragmentation within organizations. The degree to which this is reality can easily be observed all around us. Whether we look at universities, government institutions, or large companies, knowledge is fragmented, and each fragment, covering some subset of facts, becomes the domain of an 'expert'; so we naturally associate 'knowledge' with 'expertise'. Experts are all about knowledge, but only knowledge limited to their domain of expertise. The implication is straightforward and profound: any issue involving relevant facts across different domains of expertise becomes invisible to the rational structure. The facts required to see and understand such issues are present within the organization, within the structure, but they are not connected to each other, making it impossible to notice what is really going on. And so we have come to live in a fact-based reality without understanding much of what goes on around us, let alone being in a position to think about effective strategies for dealing with it.
The Expert, addicted as he is to his knowledge domain, tends to dig deeper into his own area of expertise as a kind of defense against pressure from outside. The reaction is understandable, logical even, but it only exacerbates the problem.
It is very difficult to change this, for a different and much more systemic reason. Rational systems tend to become self-justifying. The core of our rationality lies in our logical assumptions, and one important assumption of the kind of rationality that dominates our society is that it is right. Rationality, and science, which is simply applied rationality, champion the use of method. The merit and justification of an outcome are determined by the method that produced it. Scientists literally define their profession this way: science is the investigation of nature following a certain, well-defined method. Following protocol is therefore a guarantee of quality. The rules of investigation are valid by definition. Validity is no longer related to outcome, only to method.
Now what happens if such a method-driven mentality encounters an anomaly: some fact that does not seem compatible with the set of existing assumptions and conclusions? Step 1 is for the system to criticize the method with which the anomaly was found, which is consistent with everyday scientific practice. Whenever a scientist produces a remarkable and unexpected result, her methods are criticized first (personal attacks may follow later). The reason for the system to make such attacks is, again, logical. Since the rational method is valid by definition, it cannot deal with exceptions. If a certain result, a certain fact, is not produced or reproducible by the rules of the system, the system itself collapses. The anomalous fact contradicts other facts that are produced or predicted by the system. Logicians call this Ex Falso Sequitur Quodlibet: from a contradiction, anything follows. It means that when a system becomes inconsistent, supporting both a fact and its denial, we no longer have anything to rely upon. One contradiction means the entire system is faulty. In other words, the stricter the logic of your system, the more vulnerable the whole enterprise becomes. So science, and rational structures in general, have good (logical) reasons to avoid and deny the existence of those nasty facts that contradict the rules. In the meantime the world changes faster and faster, in more and more unexpected ways, while our institutions remain inherently blind.
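The logicians' principle can be made concrete in a proof assistant. As a minimal illustrative sketch in Lean 4 (added here for concreteness, not part of the original argument): once a system proves both a proposition and its negation, any proposition whatsoever becomes provable.

```lean
-- Ex falso sequitur quodlibet: from a contradiction, anything follows.
-- Given a proof of P and a proof of ¬P, we can derive any Q at all.
theorem anything_follows (P Q : Prop) (hp : P) (hnp : ¬P) : Q :=
  absurd hp hnp  -- absurd : a → ¬a → b
```

This is exactly why a single anomaly is fatal to a strictly logical system: once a fact and its denial are both derivable, every statement is provable, and the system can no longer distinguish anything.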
What are we to do?