Monday, March 9, 2015

The Birth of Science: A Primer on Intellectual History (Part 3)

Empiricism, Sensationalism, Rationalism, & Positivism
As we described in Part 2, science is a distinct type of philosophy. The Renaissance and Enlightenment philosophers who established what we today think of as science are customarily classified by their stance on a few basic philosophical perspectives: empiricism, sensationalism, rationalism, and positivism.

Empiricism and sensationalism both refer to the belief that all knowledge enters the mind through the senses. The two philosophies are very similar; empiricism was most popular with the British philosophers, and sensationalism with the French.

Empiricists and sensationalists both rejected rationalism, the position established by Descartes, which holds that thinking and the processes of the mind should be the route to knowledge. The distinction between empiricism and rationalism can be blurry. I often like to think of rationalism as knowledge that comes through thinking, whereas empirical knowledge comes through the senses. For example, a thought experiment, or speculating about something, is rationalism, whereas measuring something and categorizing it is empiricism.

Positivism is a concept that was very popular in early science, and it increased in popularity into the 20th century. Positivism places importance on publicly observable events. This ideology was very common among empiricists and sensationalists, who felt that science should focus on measurable and observable experience. Positivism is a strict scientific attitude that holds that the goal of science is to establish scientific laws and statements. In the 20th century the ideas of positivism would be challenged by what is sometimes called postpositivism. An extreme form of positivism, in which the only valid or useful form of knowledge is believed to come from science, is referred to as scientism.

A momentous year for intellectual history, and for the philosophy of science, is 1781. In this year, Immanuel Kant published a work entitled Critique of Pure Reason. In this tome of critical philosophy, Kant presented a model of individual thinking that synthesized empiricism and rationalism. Kant described how the empirical and the rational are both active in experience; in other words, we do not passively record the world, as empiricists would have it, but actively participate in the making of the world through perception.

Kant's synthesis resulted in a divisive chasm in philosophy, sometimes called the Kantian split. We can see philosophy taking two different directions in the 19th century that followed, both claiming lineage back to Kant. One side of this split is called analytic philosophy and is popular in the English-speaking world. It prefers logic, mathematics, and empiricism through controlled experimentation. Continental philosophy was most popular in the German, French, and Italian cultures, and is skeptical of many of analytic philosophy's claims. The lineage of continental philosophy can be traced from Kant, to G.W.F. Hegel, to Karl Marx. Continental philosophers include figures such as Friedrich Nietzsche, the existential philosophers, and Martin Heidegger. Science as we know it today has grown out of analytic philosophy, whereas science criticism grows out of continental philosophy.

It is important to consider the implications of the Kantian split for epistemology and the philosophy of science. On one side, that of the analytic philosophers, we have the belief that logic, mathematical models, and method will lead to laws of nature. On the other we find a critique of this, and an emphasis on the cultural, social, political, biological, and economic pressures on human knowledge. This might best be illustrated through what is known as the science wars.

As a conclusion to this primer on the history and philosophy of science, I will introduce three philosophers of science and their takes on what science is. These three thinkers might serve as an introduction to the science wars: Karl Popper, Thomas Kuhn, and Paul Feyerabend.

20th Century Science Wars
Karl Popper argued that science starts with a problem. He said that the scientific method follows three steps: problem, conjectures, refutations. The emphasis here is that science is a method used to arrive at a solution, and that, in time, all theories are found to be false and replaced by improved theories. This is the common impression of what science is among laypeople. However, it is important to point out that how science is actually done, and what people believe about it, is much different from what Popper describes.

Popper proposed that science must limit itself to falsifiability; that is, an idea (hypothesis) must be testable for incorrectness. The hypothesis must make risky predictions that could turn out to be wrong. For Popper, scientists should be trying to prove their ideas wrong, rather than trying to find evidence for them. This is where current practice by scientists runs counter to Popper's system. Many scientists today look for evidence to support their hypotheses, rather than to disprove them.

Thomas Kuhn published a book in 1962 that revolutionized how we think about science. In The Structure of Scientific Revolutions, Kuhn suggests that science is not a method, but rather a social phenomenon. He claims that science is a product of social, economic, and political pressures that dictate what is studied and how it is studied. Kuhn argued that scientific thought progresses within prevailing viewpoints of how science should be done, and what it should study, called paradigms. He contended that the history of science is a series of shifting paradigms, based more on who held the positions of power in the scientific community (journal editors, professors, research funding) than on the scientific findings themselves. Kuhn said that these paradigm shifts occur not because of a significant finding, but rather because old ways of thinking are retired when the people who hold them retire. Kuhn's view of science emphasizes science as a social phenomenon.

Paul Feyerabend was an anarchist thinker. In his 1975 text Against Method, he argues that the only true way to scientific discovery is an "anything goes" approach to thinking. Feyerabend says that the methods and rules that scientists follow actually discourage and inhibit new discoveries. He points out that all of the major scientific discoveries of the 19th and 20th centuries were made by people who rejected the methods and systems that their peers followed. Feyerabend's works challenge the assumptions of Enlightenment thinking and seem poised to make a significant impact on contemporary scientific thought.


Please direct all comments, corrections, and questions to Matthew Giobbi.

Sunday, March 8, 2015

The Birth of Science: A Primer on Intellectual History (Part 2)



Vitruvian Man, Leonardo da Vinci
The Late Renaissance
Another critical element of the Renaissance was the mechanical movable-type printing press, which was introduced in Germany around 1450. The printing press afforded thinkers like Martin Luther, Desiderius Erasmus, and, later, Niccolò Machiavelli an amplified voice. With the printing press, Renaissance thinkers shared their thoughts in mass production, something that served to increase the dissemination of ideas and the rate of change in society.

The Renaissance is typically dated from 1450 to 1600. These dates are, as is true of all historical periodization, general rather than exact; the labels are applied later by scholars writing their narratives of history. There are a few hallmark characteristics of the Renaissance mood, what we often call the Zeitgeist. These qualities include individualism, personal religion, an intense interest in the past, and anti-Aristotelianism (Hergenhahn). It is interesting to note that the reintroduction of Aristotle had helped usher in the Renaissance through Catholic church doctrine (the Aristotelian church philosophers are called Scholastics), yet Aristotle was later attacked by Renaissance humanists. This rejection of Aristotle had more to do with a rejection of Catholic Scholasticism than with Aristotle's work itself. The influential Renaissance theorists include Francesco Petrarch, Giovanni Pico della Mirandola, Erasmus, Luther, and Michel de Montaigne.

The burning of Giordano Bruno in 1600
In the later years of the Renaissance, a few key figures in the foundation of science emerged. Each of these figures contributed something unique to the foundation of what we call science. Nicolaus Copernicus published On the Revolutions of the Heavenly Spheres in 1543. This book is important because it essentially changed the intellectual worldview from a geocentric (earth-centered) narrative to a heliocentric (sun-centered) system of the universe. This change took some time, and it had significant cultural repercussions. Copernicus managed to escape the Catholic wrath that his book touched off, largely because he died the year it was published. Others who embraced Copernicus' heliocentric theory did not fare so well.

Giordano Bruno was burned at the stake in 1600 for heretical views against the Christian church. Another figure who offended the Catholic church was Galileo Galilei, who defended the Copernican system and, in 1609, built his own telescope to study the heavens. Although he escaped the fate of Bruno, the Catholic church did place him under house arrest until his death in 1642. He continued to write.

I like to think of Galileo and Leonardo da Vinci as the two figures who embodied the spirit of Renaissance thinking, as well as the foundations of modern science. Both pursued knowledge in diverse ways, from art to experimentation. The word science comes from the Latin scientia, which means knowledge. These two thinkers were scientists in the broadest sense, not bound by contemporary divisions of academic thought.

The question on the minds of the late Renaissance thinkers was: what is the best method for thinking? René Descartes published his Discourse on the Method of Rightly Conducting One's Reason and of Seeking Truth in the Sciences in 1637. Descartes' work focused on establishing a system of thinking that would lead us to sound conclusions about nature. I like to think of Descartes as one of four theorists who laid the foundation for modern thought. We typically discuss modernity as beginning in 1600 and ending somewhere around World War I and World War II. The other three philosophers who contributed to the groundwork of modernism are Isaac Newton, Francis Bacon, and John Locke.

Modernity
The modern period (from 1600 to around 1905) is characterized by the attitudes of modernism. Modernism is an attitude that bases knowledge on systematic thinking, mathematics, logic, and objective experience. René Descartes contributed mathematics and deductive logic to this attitude. Francis Bacon worked extensively on induction and experimentation. Isaac Newton added mathematics and the idea of universal laws, and John Locke emphasized empiricism and universal laws. Central to the attitude of modernism is the idea that mathematics, logic, and a scientific method can answer all the questions that humans face. The idea of objectivity, or the existence of an objective reality separate from human "subjective" experience, dominates this scientific worldview. We call this tradition the Enlightenment (the Age of Reason). We often refer to the period that begins with Copernicus and melds into the Enlightenment as the Scientific Revolution. It describes the flourishing of mathematics, chemistry, physics, biology, and empirical philosophy.

Albert Einstein
The modernist attitude reached its zenith in what we commonly call the classical period (mostly the 18th century) and lasted into the 19th century. In the philosophy of science we refer to the scientific worldview of this time as the old view of science. The philosopher of science Hilary Putnam describes the old view of science as based on the idea that scientists collect and accumulate facts and build those facts into a "treasure chest" of accumulated knowledge. This old view, in which inductive logic (collecting observable evidence), scientific method, and the gathering of facts verified by experiment lead to truth, was replaced in the 20th century by the new view of science, which most sciences use today. The new view of science is marked by the attitude that there is a human contribution to the phenomenon of reality (not merely an objective reality), that there is not one "method" of science (each science does science differently), and that multiple "true" descriptions of reality exist simultaneously. The transition from the old view to the new view of science is mostly due to what we call the Einsteinian revolution, which took place in 1905, when Albert Einstein published his Annus Mirabilis papers, including the theory of special relativity. However, some important events occurred long before 1905 that led to this change in the way we think about and do science.

Saturday, March 7, 2015

The Birth of Science: A Primer on Intellectual History (Part 1: From Antiquity to the Renaissance)

For over a decade I have been teaching a course on the history and systems of psychology at Rutgers University at Newark. The class, which serves as a capstone course for undergraduate psychology students, surveys an intellectual history from antiquity through the 21st century. It is my goal in this course to help students to understand and appreciate the political, philosophical, cultural, and historical influences on psychology through the ages.

We begin the course with a survey of intellectual history. As I am a believer in presenting a reading of history, rather than the history, I ask my students, as I now ask you, to appreciate that this sketch of intellectual history is one that I have arrived at, and is not the only reading available. As I have researched over the years, my understanding of the story has evolved. I have no doubt that the story I tell now will be different from the story I tell ten years from now. One thing that we know from thinking about intellectual history is that we must speak in the plural, of histories, rather than of history.

Giobbi's Timeline of Intellectual History
Antiquity (to 600 B.C.E.)
The earliest appearance of human questioning and answering came in the form of narrative stories. We call this myth, taken from the Greek mythos, which means "speech, thought, story... anything deriving from the mouth". These narratives center around animism, anthropomorphism, and magic.

The term myth is commonly thought of in a narrower sense, meaning something that is invented and not necessarily true. The sense of the word in the context of intellectual history is simply narrative explanation. There are narratives that are no longer practiced, but are enjoyed for their wisdom and entertainment, such as the early Greek Olympian and Dionysiac-Orphic narratives. There are also narratives of antiquity that continue to be practiced, such as the Judeo-Christian-Islamic narratives.

The term animism refers to the practice of viewing the world as something that is living and active, rather than inanimate. For example, the poetic idea of angry skies or happy clouds is animism. A more precise term, anthropomorphism, is used to describe nature as having human attributes. This can be seen in the human motivations, feelings, and actions of the Greek Olympian gods.

Any ritual or act that is done to influence nature or a God is referred to as magic. Magic includes any type of ritualistic behavior, such as a rain dance, or ritualistic thought, such as prayer. The essence of magic is the idea that ritual can influence occurrences. We see this tradition alive and well today in what we call religious and spiritual belief. The important aspect to keep in mind is that myth serves to predict, control, and understand the natural world (Humphrey). 

The two main forms of narrative that existed in the ancient Greek world were the Olympian religion and the Dionysiac-Orphic religion. It is common to characterize the former as the religion described in the Homeric poems. The ideal life was one lived for glory through noble deeds and ended at death. The Olympian gods appear to mirror the characteristics of the Greek nobility, which comprised most of the religion's followers.

The poorer ancient Greeks, the peasants, laborers, and slaves, tended to believe in the Dionysiac-Orphic religion. This religion, based on Dionysus, incorporated wine, sexual frenzy, and the transmigration of the soul: the belief that the soul is trapped in a body as punishment for a sin committed in the heavens. The belief that the soul escapes earthly existence at death would later influence Judeo-Christian belief.

In the East, the religions of India (Hinduism, Buddhism, Jainism, and Sikhism), as well as the East Asian traditions of Taoism, Shinto, and Confucianism, all emerged from 1500 B.C.E. onward. These belief systems are treated as a religion by some practitioners and as a life philosophy by others. This is an important distinction to be aware of. We find that Eastern and Western thought synthesize in many thinkers from 600 B.C.E. to the present.

Western Philosophy
It is commonly accepted that the first Western philosopher was Thales (ca. 625-547 B.C.E.). What made Thales different from other thinkers is that he rejected supernatural explanations (such as gods and spirits) and looked to the physical world to answer the fundamental question: what is the world made of? Thales had traveled in the East, and it is believed that his thinking was influenced by Eastern thought. The primary question that Thales proposed, and the question that would dominate philosophy (the love of wisdom) until Socrates, was: what is the fundamental substance of which the world is made? This primary element was called the physis, meaning the nature of stuff. Thales concluded that the fundamental physis was water. Thales is said to have once fallen into a well while deep in thought.

Early philosophers proposed various answers to the question of what the fundamental physis is. Anaximander proposed that the basic physis was the apeiron (the boundless, a wide-open abyss), which is almost postmodern in its vagueness. It certainly conjures up contemporary work in theoretical physics. Heraclitus proposed that the physis is fire, and pointed out that everything is in a state of becoming, rather than being.

The thinkers who came before Socrates are typically called the Pre-Socratics. The reason is that with Socrates came a distinct shift in philosophy's focus. Unlike the earlier philosophers, Socrates was interested in the question, what does one mean by...? What does one mean by "beauty"? What does one mean by "justice"? Socrates is said to have lived by the dictum "know thyself". Because Socrates never wrote anything down, the Socrates that we know comes from the writings of his student Plato, who featured Socrates as a character in a series of dialogues. This is the reason that some scholars refer to the earlier Greek philosophers as the "Pre-Platonics" rather than the "Pre-Socratics".

Along with the earliest philosophers were a group of thinkers who are customarily called the Sophists. These thinkers challenged the idea that one could arrive at an ultimate, universal truth, and instead proposed that truths exist within contexts. The Sophists were frequent targets of Socrates and Plato. The contemporary manifestation of the Sophists is postmodernism.

Raphael's The School of Athens
In Raphael's painting The School of Athens, we find two central figures, one pointing up and the other pointing down. These two figures are Plato and Aristotle. Aristotle, who is pointing down, was a student of Plato's. Raphael depicted Aristotle pointing to the earth because his philosophy was based on finding truth through the natural world. Plato is depicted pointing upwards because his philosophy was based on truth being metaphysical (beyond the physical). Both of these philosophers were looking to establish ultimate, universal truth, and each proposed that it was to be found in a different place.

With the rise of the Roman Empire we find a shift towards life philosophy, or a philosophy for the good life. This idea was not new; Plato and Aristotle had both discussed it. For the philosophers of this period, however, how to live was at the center of philosophy. Pyrrho of Elis formed a school called Skepticism. Antisthenes proposed Cynicism. Epicurus taught that the good life was found through simple living. Many of the other philosophers were influenced by Plato's teachings, and we call them Neoplatonists (new Platonists). These Neoplatonists influenced early Christianity a great deal; much of the Christian theology of this time can be traced to Plato's thinking. By the time the Roman Empire fell in 476 C.E., the writings of Aristotle had been lost to the Western world. Aristotle's works were alive and well in Arabia, Syria, Egypt, Persia, Sicily, and Spain. From about 410 C.E. until around 1000 C.E., Europe experienced what is described as a "dark" period. Just how dark these Dark Ages were is debated. What we do know is that at the very same time, the Islamic world was the cultural center of the world. Much of our modern mathematics and science is based on Middle Eastern thought from this period.

In the High Middle Ages (the 1000s through 1300) and the Late Middle Ages (the 1300s through the 1500s) there was a 200-year struggle between the Roman Catholic Church and Islam for control of the Holy Lands. During this time, returning crusaders and trade merchants reintroduced Aristotle from the Islamic world back into Europe. This reintroduction of Aristotle is said to have been one of the major catalysts for the European Renaissance, the cultural rebirth of the 15th century.

Friday, March 6, 2015

Four Uncanny Moments in Cinema



This blog originally appeared on April 21, 2012.


Recently a friend and I got on the subject of childhood movies and the uncanny. Sigmund Freud took up his own thinking on the uncanny in an essay from 1919 entitled The Uncanny. It is from this essay that most psychologists are familiar with Das Unheimliche. Freud makes a distinction between the heimliche (concealed) and the unheimliche (unconcealed). Freud described the phenomenon of the uncanny as a projection of the repressed id onto the figure that brings forth the discomforting experience. Here are my top 4 examples of the uncanny from familiar films.

4. Mary Poppins
There is something uncanny about the entire Mary Poppins story. This scene stands out for me as a moment of the uncanny.

3. Chitty Chitty Bang Bang
I am not alone in sensing the uncanny in Chitty Chitty Bang Bang. The "Child Catcher" is a particularly uncanny moment. 

2. The Wizard of Oz  
What is it about the Wizard of Oz that is so familiar, yet so strange?

1. La Dolce Vita
My number one moment of the uncanny is the finale from La Dolce Vita. Fellini is a master of resonating the unconscious.

Thursday, March 5, 2015

The Evolution of Erich Fromm


This blog originally appeared on December 12, 2011. 
"Fromm had an unparalleled ability to write for the public; the ability to express sensitive, complicated, and often paradoxical thoughts in a graspable way, while maintaining an intelligent conversation. Fromm was a man interested in actively incorporating his ideas and making them accessible to the man on the street."
Erich Fromm was a central figure of the American counterculture from World War II through the heart of the Cold War era. Beginning with his first English title Escape From Freedom (1941), through his final writings dealing with existential humanism, On Being Human, Erich Fromm created a unique convergence of psychoanalysis, Marxism, humanism, and Buddhism. Not holding dogmatically to any one of these life philosophies, he instead mined each for wisdom that could help in coping with the issues of the late 20th century. Influences on Fromm’s thinking include the Talmud and the Torah, the teachings of Christ and the Buddha, Master Eckhart, and Goethe. His style of thinking was not singular, but rather, a plurality of convergences that resulted in a voice that helped to organize the voices of four decades of the conscientious.

What distinguished Fromm from other thinkers of his time was his rejection of dogmatism in any form. This free-floating pluralism resulted in a voice truly independent of any school of thought. Most notable might be Fromm’s split from the Frankfurt School of Critical Theory. Through his investigation and rejection of certain core Freudian concepts, Fromm found himself at odds with some of the Frankfurt School tradition. However, he found this to be a liberating experience, one in which he could retain much of what he found valuable in the Critical Theory tradition while not being chained to it ideologically.

The most notable shift in Fromm’s thinking came in his 1960 text Zen Buddhism and Psychoanalysis. As a thinker who would not moor himself to any one central piling, Fromm explored key concepts in the Eastern traditions. Not unlike his German predecessors Hegel, Schopenhauer, and Heidegger, Erich Fromm found that Eastern thinking not only enhanced, challenged, and expressed many of the ideas of the Western tradition, but also offered a new way of thinking about the issues that we face. Whereas popular figures such as Alan Watts would edify Zen and Tao, Erich Fromm integrated the Eastern ideas with the Western philosophical tradition. Fromm studied and practiced the life philosophy of Buddhism; however, unlike others, he never made it the life philosophy. Today the meeting of Buddhism and psychoanalysis has become a tradition of its own. This area of thought first found its voice through Erich Fromm.

The 1930s through the 1960s found Fromm doing most of his American writing. This was a time when academic psychology, as well as pop psychology, was entranced with American behaviorism. For most academic psychologists, behaviorism was the arrival of psychology as a pure, lawful science. For those psychologists and other thinkers outside of experimental psychology, behaviorism was yet another manifestation of the Newtonian fantasy. Fromm was not only critical of a dogmatically experimental psychology; he considered it to be a dangerous ideology. Fromm was informed of the dangers of a purely experimental or scientific worldview through the writings of Martin Heidegger. In The Sane Society Fromm takes on experimental psychology with Heideggerian sensitivities.

This discomfort with academic psychology continued when the cognitive movement began in the 1960s. Fromm became increasingly critical of models that reduced human beings to machines (in this instance, computers). Fromm was not alone in this critique of behaviorism and, later, cognitive psychology. Humanistic psychology was the “third force” that reacted not only against experimental but also against psychodynamic psychology. But Fromm was less interested in promoting any one school of thought than in integrating these schools. He was clearly critical of the movements in American academic psychology, but he was equally critical of Freudian psychoanalysis. Although Fromm considered his work to be humanist (he goes as far as to consider Marx a great humanist), he is not the typical humanist of the period. Fromm’s writings and theories are far too developed and theoretical to be considered next to the typical, feel-good representatives of the humanistic movement in psychology.

Fromm formed a convergence of philosophy, economics, theology, psychology, sociology, and political science. His theories and writings are difficult to place in any one academic department and truly challenge the tendency to organize thinkers by subject matter. In Fromm’s texts we find that being human is a social conglomeration of the philosophical, the political, the emotional, and the spiritual. This, of course, reflects the soil in which he first broke through. Frankfurt School thinkers like Marcuse and Adorno had laid out the interdisciplinary approach; the blending of Freud and Marx was necessarily an interdisciplinary project. Fromm continued this project by reinvesting in man as a spiritual being.

Philosophically, Fromm dwells in that group of thinkers who come after the Kantian split. Clearly an existentialist, Fromm is informed not only by Kant but also by Heidegger, Hegel, Husserl, Schopenhauer, and Nietzsche. He finds camaraderie with Spinoza, Master Eckhart, Leibniz, and Pascal, and it is not uncharacteristic for him to draw on ancient Greek thinking. He does not, however, fetishize and romanticize ancient Greece; instead he saves this honor for pre-Enlightenment Europe.

Politically and economically, Fromm was a Marxist. However, his radical, humanist reading of Marx set him apart from his cohorts. Although he shared this position with the Frankfurt School thinkers, Fromm took Marxist humanism to a new level. In his 1961 text Marx’s Concept of Man, Fromm presents and discusses Marx's early concepts of alienation and private property. Through Fromm’s pen we find these ideas made practical for the twentieth century in the critical issues of freedom and of a self based on having.

Sociologically we find Fromm in the company of the great social theorists; the ideas of Durkheim, de Tocqueville, and Arendt resonate with the Frommian spirit. Psychologically, Fromm is a psychoanalyst. His rejection of Freud’s privileging of sexual drives is monumental and intelligent. His 1935 paper The Social Determination of Psychoanalytic Therapy alienated him from both orthodox psychoanalysis and the Frankfurt School’s harbinger, Max Horkheimer. Although Freud had focused on culture, society, and civilization in his later writings, he still held culture to be the sublimation of sexual drives. Fromm did not entirely reject this; he did, however, show that culture had become a greater influence on the human being than biological drives. For orthodox Freudians this was heresy, but for the new wave of thinkers such as Karen Horney, Erik Erikson, and even Wilhelm Reich, Fromm was the pioneer of social psychoanalysis.

Erich Fromm was concerned not only with society and man as independent subjects, but with the Gestalt of the social person. Man and culture would not be parsed from one another, as is customary in social psychology and sociology. Although he did not play out the conversation of man and society, and the subject/object split, to its end, as did, say, Jacques Lacan and the French thinkers of the 20th century, he did introduce a widespread readership to the possibility of that kind of thinking. We can think of Fromm as someone who was completely aware of what was behind the curtain, but realized that pulling the curtain down too quickly would accomplish little. As a psychoanalyst, Fromm understood that nature resists sudden changes, and that to affect culture as a whole, new ideas were best presented in subtle chippings rather than mammoth blows. In this way Fromm was much more effective at introducing the layperson to the ideas of Heidegger, Marx, and Adorno than have been cultural icons such as Jacques Derrida, Michel Foucault, and Jean Baudrillard. Fromm had an unparalleled ability to write for the public; the ability to express sensitive, complicated, and often paradoxical thoughts in a graspable way, while maintaining an intelligent conversation. Fromm was a man interested in actively incorporating his ideas and making them accessible to the man on the street.

The issues that occupied Fromm’s thinking manifested from the pre-Nazi, modern world of political fascism through the post-Vietnam War, postmodern world of culture marketing. His writings deal with individual freedom from the age of political fascism through the age of technology. Many of his concerns continue to be the concerns of today, and where much of his thinking was premonitory, most of it has become more relevant than when it was written.

Overshadowing the issues of Nazi fascism, the American Civil Rights Movement, cultural colonialism, the wars in Korea and Vietnam, and corporate fascism was the impending threat of atomic annihilation. The atomic question took center stage for much of Fromm’s life and became the most urgent issue to be addressed. But behind this external threat of a nuclear apocalypse was another issue: the problem of technology. Fromm was as concerned with ideology, technology, politics, and capitalism as he was with the atomic bomb. For Fromm, President Eisenhower’s warning of a military-industrial complex, a corporate incentive to go to war, was as threatening to mankind as the bomb.

At the foundation of Fromm’s concerns, however, was a person’s relationship with herself. Based on the human need for a sense of self, Fromm described a modern, social personality that was alienated from an authentic life and enmeshed in an ideology of consumerism. Fromm’s best-known book, To Have or to Be?, is an exploration of the trend of basing one’s sense of self on what one has rather than on what one does. This is the Fromm that dealt with ideology and the complex intersection of politics, economy, culture, and psychology in what is called personality.

Erich Fromm is a name that has not been forgotten, but has perhaps become overlooked, in 21st-century thought. Fromm’s accessible, clear, and concise writing made him readable by nonprofessional thinkers. His ideas were comparable to those expressed by his Frankfurt School colleagues but did not assume or require a graduate degree to read. For this reason, a generation of revolutionaries came to embrace Fromm’s texts, while academic and public intellectuals have bypassed him for the more obscure writings of Herbert Marcuse, Theodor Adorno, and Max Horkheimer. Even the German, French, and American writers of poststructuralism hold much in common with Fromm’s writing, though not with his clear and understandable style. For this reason, Fromm has been neglected by the academy and forgotten by the aging generation of 1960s radicals.

Erich Fromm’s thoughts and teachings are increasingly relevant to the issues of today. We will find that many of the old issues remain, in addition to new manifestations of old problems. Much of his thinking, based on three thousand years of intellectual history, is timeless and reflects the core issues of human existence. What is unique about Fromm is not only how he presents his thoughts, but also how he organizes and constructs them.

Wednesday, March 4, 2015

Rethinking Reductionism With Google Maps

This blog originally appeared on March 28, 2013

"The romantic spontaneity and courage are gone,
the vision is materialistic and depressing.
Ideals appear as inert by-products of physiology;
what is higher is explained by what is lower
and treated forever as a case of 'nothing but'
-nothing but something else of quite an inferior sort.
You get, in short, a materialistic universe,
in which only the tough-minded find
themselves congenially at home."

 -William James

(The Present Dilemma in Philosophy)


Matthew Giobbi, 2012.
A sea change has occurred in how we understand the structure of knowing in cognitive neuroscience. Today, researchers, writers, and professors of psychology are holding discussions in a way that is much more in line with the attitude of William James's radical empiricism.

James instructed the emerging science of psychology to embrace a cross-paradigmatic (in today's terms, an interdisciplinary) attitude of investigation. It has been a long time coming for psychology. James, greatly in spirit with his friend C.S. Peirce, was attempting to point the science of psychology in the direction that the other sciences of the 20th and 21st centuries would take: a trajectory towards semiotics. Much of what Peirce outlined in his works on semiotics, a system of thought that has been the central influence on contemporary science, was unpacked for psychology in James's radical empiricism and pragmatism. Today, it seems that we are closer than ever to the third culture that C.P. Snow had called for in 1959: a truly radical empiricism.

Despite this shift in how we approach knowing, there are two philosophical attitudes that seem to prevail amongst students entering the university study of the social sciences. It is for these students that I present this essay. It is not a suggestion to reject, but rather an invitation to expand, how we think about knowing through the social sciences. These two attitudes are strikingly present in conversation with most of my first-year students. Both share a common origin in early, classical concepts of the philosophy of science, as well as an almost taken-for-granted (captivation-in-an-acceptedness) place in the Enlightenment rules for thinking. In addition, these philosophies are closely related to two fallacies of thought, a consideration that is the topic of this undertaking. The two concepts that I speak of are reductionism and mechanism.

William James
In his extraordinarily insightful text on the philosophy of science, Worldviews, Richard DeWitt explores the evolution of scientific knowledge systems since the early Greek thinkers. Just as Professor Hilary Putnam describes in an interview with Bryan Magee, DeWitt outlines some central attitudes that have been dismissed within some sciences and privileged within others. Whether this is the result of an internalist attitude within a specific field of study (only learning the history and philosophy of a science from within that science), or of the absence of the philosophy of science in most university science departments, the question of what science is has a different answer depending on the discipline in which it is asked. This is especially true for the social sciences. The main distinction between physics and the social sciences has been the adoption of Peirce's philosophy in the former, and a forgetting of it (through James's pragmatism) in psychology. This is the context of the problem, but let's turn to the two specific concepts of interest in this discussion: reductionism and mechanism.

The idea of reductionism is woven into the fabric of our sense of reality. Although it seems obvious that bigger is made up of smaller (subatomic particles, atoms, cells, organs, etc.), an accompanying sensibility, that smaller is the cause of bigger, is not necessarily true. Reductionism, then, is the idea that larger features are caused by smaller features. Examples include the idea that an area of the brain causes a certain behavior or temperament, or that a particular emotion is merely the result of certain neurotransmitters. This attitude of reductionism commits what is referred to as a causal fallacy: specifically, the idea that smaller causes bigger. It is an attractive, almost commonsense point of view. However, critical analysis shows us that smaller might be correlated with bigger, but smaller is not necessarily the cause of bigger. As we all learn in the first year of research methods, "correlation is not causation".

C.S. Peirce
Let's consider an example. In a popular Introduction to Psychology text by David Myers, the author correctly points out that brain scans of virtuoso violinists reveal a specific development in the motor strip of the right frontal cortex. This area of the brain is associated with the left hand and fingers, which are predominant in violin playing. The right hand is mostly used for grasping the bow, rather than fingering notes, which accounts for the difference in neural concentration and activity between the left and right motor cortex. Keep in mind that this is true due to the lateralization of brain function; the left side of the body is associated with the right side of the brain. Note the choice of the word associated rather than caused. Even in the use of the most basic words one can infer causation rather than correlation. The point Myers makes is that the violinist's brain has concentrated neural tissue and activity through years of practice of the instrument, and this in turn correlates with greater finger dexterity while playing. We do not have a clear causal relationship here, but rather a correlation. In this example, we cannot say that the brain is causing the violin playing, any more than we can say that brain chemicals are causing an emotion. The idea that the smaller causes the larger is a fallacy with a history rooted in the 16th- and 17th-century Scientific Revolution, a tradition that fed into the Enlightenment.
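To make the correlation-versus-causation point concrete, here is a minimal sketch in Python, with invented numbers rather than Myers's data: a third variable, hours of practice, drives both "neural density" and "finger dexterity," so the two measures end up strongly correlated even though neither causes the other.

import numpy as np

# Hypothetical simulation: practice_hours is the common cause of both measures.
rng = np.random.default_rng(0)
n = 500
practice_hours = rng.uniform(100, 10_000, size=n)

# Both downstream measures are driven by practice, plus independent noise.
neural_density = 0.8 * practice_hours + rng.normal(0, 500, size=n)
dexterity = 0.5 * practice_hours + rng.normal(0, 500, size=n)

# The two measures correlate strongly (r close to 1) even though, by construction,
# neither one causes the other; both merely track the same underlying cause.
r = np.corrcoef(neural_density, dexterity)[0, 1]
print(f"correlation between neural density and dexterity: r = {r:.2f}")

Zooming in on either variable would not reveal the cause; the strong correlation comes from the shared history of practice, which is exactly the point of the violinist example above.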

At the time, physics was largely developed through Newtonian, or what is now called classical, mechanics. The idea was that all the structures of the natural universe (from planets to the brain) were merely mechanized, clockwork structures governed by universal laws, just waiting to be "discovered". The way to discover this mechanized, lawful natural order was through reduction: dissection, magnification, and peeling away to the ultimate substance. This ultimate stuff, it was thought, would be arrived at through careful observation and measurement. Newtonian notions of science were abandoned in the early 20th century, in particular with the Einsteinian revolution and the quantum physics that followed, which established that stuff at the subatomic level does not follow the same laws as matter at larger scales. In other words, Newtonian science does not work at the subatomic level. Today, physicists speak less in terms of classical mechanics and more in the ideas of theoretical physics: chaos theory, string theory, and subatomic physics.

New models of science, which physics embraced in the early Twentieth Century, were largely based on the influential thinking of C.S. Peirce. Without Peirce's work on semiotics there would be no theoretical physics. Whereas most of the sciences moved away from the "old view" of science, much of the social sciences did not. Despite the fact that the founder of American psychology, William James, called for a scientific psychology greatly influenced by Peirce, the more simplistic system of behaviorism completely overshadowed James in the early Twentieth Century. Radical empiricism and pragmatism were not alone in this; the Gestalt tradition was also drowned out by the behaviorist paradigm, not to return until cognitive psychology emerged in the 1960s. The way in which scientific psychology has been done since the Nineteenth Century has largely been based on antiquated notions of a Newtonian science. Today, as predicted by thinkers including Thomas Kuhn and Paul Feyerabend, scientific psychology has rethought what "science" is and how it is done.

A rethinking of the fallacious assumption that reductionism implies cause and effect can be illustrated using a familiar model: Google Maps. With Google Maps we have a function that is similar to that of the microscope when looking at a tissue sample: magnification. By "zooming out" (the - function) we can take a distant view of the object from afar. As we increase our magnification ("zoom in" with the + function), we are able to approach the street level of a specific neighborhood. We are tempted, when magnifying a piece of tissue, to understand the cells as building the tissue. We are also tempted to understand neurotransmitters (or brain areas) as the "cause" of a simultaneous emotion, behavior, or thought process. However, we would never claim that somehow a street in Newark causes the universe. We do not view the magnification of maps in the same way that we view the magnification of neural tissue or the brain. The question is: why do we assume causation through reductionism, and can we expand our approach and understanding of science, in a radically empirical way, through the Google Maps metaphor?