Tuesday, March 1, 2016

Head Games: Modernism and Post-Modernism

Zeng Fanzhi: "Van Gogh III" (2017)

An oblique style so muddles most Post-modernist discourse that its opponents rarely know precisely what they are opposing, other than stylistics, and even writers who fancy themselves post-modernists rarely evidence much understanding of underlying issues. Essentially, Post-modernism represents a conscious abandonment of Modernism, which in the context of European philosophy refers to the search for a self-sufficient and perfectly coherent system of human knowledge. Modernism traces its roots to René Descartes, who proposed developing a complete system of knowledge based on the methodical inspection and correlation of mental and physical experience. This approach effectively divorced metaphysics from theology and thus secularized the question ‘How do we know what we know?’ for subsequent philosophers. Two basic approaches to this question developed in the following centuries. Empiricists, following John Locke, insisted that we derive mental constructs of reality from sensory experience; Idealists, following Immanuel Kant, insisted that we interpret sensory experience based on innate mental constructs of reality.
These contrasting approaches paralleled similarly contrasting approaches to the same question as posed by philosophers since the time of Plato and Aristotle. Plato had held that concepts are innate in all conscious beings but that they have a reality independent of any consciousness. According to this view, which became known as Realism in reference to the presumed independent reality of concepts, cognition is a matter of ordering perception according to innate mental templates. This was the view championed by Augustine of Hippo, who adapted it to Christian theology by positing our creation in the Image of God as the source of innate concepts. In opposition to Plato, Aristotle had argued that we build up mental constructs of reality based on experience of the physical world. This view, which dominated discussions of metaphysics in the Middle Ages, became known as Thomism after Thomas Aquinas, who adapted the views of Aristotle to Christian theology just as Augustine had adapted the views of Plato. William of Ockham later elaborated an extreme interpretation of Aristotle, which became known as Nominalism in reference to the provisional nature of the terms (i.e. categories) by which we perceive reality. Ockham argued that concepts are not innate—or, if innate, then unreliable—and that only the intervention of a transcendent intelligence can rescue people from perceptual error. Unlike the churchmen, who viewed reason as subservient to faith, Descartes asserted the primacy of reason as the basis of philosophy as well as faith. He thereby sidelined the whole of Medieval European philosophy as a misguided detour from the speculative traditions of Antiquity.
Later discussions of these issues conflated elements of Platonic and Aristotelian positions in ways that defy easy correlation with traditional positions. Phenomenologists (Georg Wilhelm Friedrich Hegel, Martin Heidegger, Jean-Paul Sartre) argued that mental constructs of reality are intuited from experience (physical, emotional, and social) and validated by an act of will, whether personal (as in Existentialism) or collective (as in Marxism). Analytic Philosophers (Bertrand Russell, Richard Rorty) argued that mental constructs of reality inferred from physical experience are validated by universal principles such as mathematics that are innate in human consciousness. Generally speaking, Phenomenologists claimed a moral intuition of reality whereas Analytic Philosophers claimed logic-based objectivity. With regard to ethics, both Phenomenologists and Analytic Philosophers promoted a personal commitment to the welfare of others as the only transcendental truth, personal immortality and the meaning of life being construed as functions of collective continuity. The critical weakness common to both approaches was the difficulty in establishing universal truth given the inherently individuated nature of validation. Phenomenologists approached this problem by insisting, quasi-mystically, that intuition is trans-subjective. Analytic Philosophers approached the problem by insisting (mistakenly as it turns out) that rigorously logical processes with a fixed set of axioms cannot yield contradictory results. The dark corollary to both traditions was the Nihilism of Friedrich Nietzsche, who proclaimed that systems of meaning grounded in the self ultimately require the imposition of a super-self as axiomatic referent.
The great champion of Analytic Philosophy was Bertrand Russell, who together with Alfred North Whitehead authored the Principia Mathematica, a logical deduction of the entire framework of arithmetic and algebra from a few propositions. This was to be the first step in the reduction of all philosophical questions to a consolidated system of impregnable logic that dismissed any remaining questions (e.g. ‘Why are we here?’) as irrelevant. Ironically, the program unraveled following the efforts of Kurt Gödel to resolve an issue peripheral to the Principia Mathematica. Gödel employed a system of meta-mathematical mapping to reduce the paradox ‘Is the statement [This statement is not demonstrable] demonstrable?’ to a simple either/or proposition with only one possible answer. What Gödel inadvertently discovered was that no set of basic propositions can account for all the demonstrable possibilities of a complex system; moreover, no system can justify its foundational propositions from within the system—the validity of any system is always exterior to the system.
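The self-referential sentence at the heart of Gödel's argument can be sketched in modern logical shorthand (the notation below is a later convention, not Gödel's own or this essay's):

```latex
% Sketch of the Goedel sentence, in modern notation.
% Let Prov(x) abbreviate "the formula whose Goedel number is x is
% demonstrable within the system". Via meta-mathematical mapping,
% Goedel constructs a sentence G that asserts its own non-demonstrability:
\[
  G \;\leftrightarrow\; \neg\,\mathrm{Prov}\bigl(\ulcorner G \urcorner\bigr)
\]
% If the system demonstrates G, it demonstrates a sentence declaring
% itself non-demonstrable, a contradiction; if it demonstrates \neg G,
% it certifies as demonstrable a sentence it cannot demonstrate.
% A consistent system must therefore leave G undecided, even though,
% judged from outside the system, G is true.
```

Either horn of the either/or thus forces the conclusion that the truth of G lies beyond the system's own resources, which is the sense in which the validity of any system is exterior to the system.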
Following this impasse, philosophers approached the task of developing a comprehensive system of philosophy by concentrating on some pervasive aspect—e.g. presence, power, language. Language became an especially attractive object of inquiry because it provides the means of elaborating, communicating, and preserving thoughts. Ferdinand de Saussure inaugurated the current line of inquiry into this field by developing a theory of linguistics based on difference, an approach which derives from Hegel’s thesis that identity is relational. Though impressive, this system suffered from a naïve objectivity that ignored the importance of personal and cultural subjectivity. Ironically, Charles Sanders Peirce had earlier proposed a theory of language that appreciated the subjectivity of language, but this very appreciation (based on Ockham’s tripartite distinction between object, symbol, and subject) stifled further development of a comprehensive theory of language. The most original contributor to language theory was Ludwig Wittgenstein, who observed that utterances have ‘accepted’ but not ‘true’ meanings because language is primarily a function of use, a kind of game with rules and strategies that change with time and distribution as the influence of individual participants grows and fades. To take a simple example, the word ‘nice’ (from ne scire) originally meant ‘ignorant’ but it has come, over the course of many uses, to mean ‘pleasant.’ What should it mean? Why? Any answer to these questions, whether dogmatic or libertarian, raises the issue of manipulation of meaning. The consequences of Wittgenstein’s thesis are profound for any general theory of knowledge at a time when population growth, widespread higher education, improved communications technology, and the ripple effect of theoretical advances have led to an explosion of information that must be vetted, categorized, correlated, archived, and distributed—all on the basis of unwittingly biased understandings.
Among the Phenomenologists who turned their attention to language, Claude Lévi-Strauss stands out as the last great Modernist. By adapting Saussure’s insight into the nature of signs to the anthropology of pre-literate societies, Lévi-Strauss convincingly demonstrated the underlying symbolic reasoning linking apparently random associations. He observed, for example, that the fearful association of storms with rabbits, harelips, and twins in the Andes is rooted in a fear of the unnatural splitting of entities. In this case, twins are considered imbued with the principle of splitting and therefore capable of provoking storms, which are viewed as cloud-splitting; harelips are quasi-twins, as are the rabbits for which they are named, and therefore also a threat. This approach, which became known as Structuralism, imploded following the attempt by Algirdas Greimas to develop a formal theory of symbolic communication. The problem was the recognition by Greimas that the opposite of any given symbol is a function of context. For example, the opposite of marriage can be construed as (among other possibilities) rape, homosexuality, celibacy, or widowhood depending upon the aspect under consideration.
The question of why some oppositions seem more obvious than others led to Post-structuralism, exemplified by the work of Michel Foucault, who contended that the conventions we use to describe reality effectively limit our ability to discern reality. Foucault further argued that these conventions are the artifacts of social power, which he tended to view as something exercised from above—a thesis that effectively challenged the validity of all social and historical relations. The implications of Foucault’s challenge, which recalls that of Wittgenstein, can hardly be overstated at a time when information has become such big business that consumers rightly wonder whether it is carefully tailored to the advantage of the provider at the expense of the receiver. Jean-François Lyotard labels this the ‘Post-Modern Condition,’ and he glowingly observes that, under the circumstances described above, localized appeals to legitimacy based on dialog easily outflank generalized appeals based on established principles because dialog is intrinsically participatory, and this democratic nature disarms the distrust that immediately greets dogmatic statements. The very idea of Truth thus becomes the nemesis of the Post-Modernist for whom there exist only inherently oppressive conventions.
The next step was Deconstruction, a critique of metaphysics itself as an oppressive discourse based on the suppression of otherness. This project is most closely associated with Jacques Derrida, who won a large following among literary critics for his nimble rhetoric and his use of porous and multivalent language, but the roots of this project can be traced to Heidegger’s challenge to deconstruct European metaphysics by examining the suppression of tensions and inconsistencies that betray its artificiality. The critical approach was elaborated by Emmanuel Levinas, who developed a theory of alterity (‘otherness’) that adapted Saussure’s theory of meaning-as-difference to identify a trace of the Other within every construction of identity, including the Self. This approach achieved truly disruptive force in Jacques Lacan’s reinterpretation of Self as a function of violence that manifests itself in the simultaneous separation from and cannibalization of external authority. Derrida insinuated these critiques of meaning into every imaginable field of philosophical discourse, playfully inverting the teleological impulse of Hegelian Phenomenology by reveling in the myriad threads of meaning interwoven into every word.