One of the many things I find exasperating about our culture is the so-called conflict between science and religion. I think the conflict is only between one restrictive, narrow-minded vision of science and one equally (and similarly) restrictive, narrow-minded vision of religion. I consider the partisans on both sides of this divide equally fundamentalist, and I would very much like to see more of us who hold more moderate positions gain attention in our public discourse (i.e., the media). But as it is now, only the extremists get air time.
I’ve done a little chipping away in this blog at the fixed positions on both sides of this polarized debate, and while I’m not interested in a frontal assault on either position (I was raised literally in the middle of a Civil War battlefield, so I know the futility of that tactic, even if I hadn’t read Sun Tzu), I want to step up the opposition to the hijacking of our intellectual life by extremists.
So here’s the first barrage:
One of the great crises of spirituality in the Western world was precipitated by the publication in 1859 of Charles Darwin’s book “On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life.” The spiritual crisis arose because this work seemed to imply that a literal interpretation of Judeo-Christian scripture was erroneous, which gave rise to a widespread belief that “Darwin has disproven the Bible.” As a result, some people abandoned their Christian faith and others hardened theirs. (This was, in fact, what gave birth to the fundamentalist movement, which originated among a group of Baptist ministers who decided that the best answer to the challenge of science to the scriptures was to declare the scriptures right and science wrong.)
Interestingly, the word “evolution” doesn’t appear anywhere in the first edition of Darwin’s book. In fact, the only place in it where any form of the word “evolve” can be found is at the end, the final word of the final sentence of the book:
“It is interesting to contemplate an entangled bank, clothed with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth, and to reflect that these elaborately constructed forms, so different from each other, and dependent on each other in so complex a manner, have all been produced by laws acting around us. These laws, taken in the largest sense, being Growth with Reproduction; Inheritance which is almost implied by reproduction; Variability from the indirect and direct action of the external conditions of life, and from use and disuse; a Ratio of Increase so high as to lead to a Struggle for Life, and as a consequence to Natural Selection, entailing Divergence of Character and the Extinction of less-improved forms. Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.”
In the sixth edition, published in 1872, “evolution” is much more prominent, mainly in describing Darwin’s supporters and his responses to his critics. For example: “It is admitted by most evolutionists that mammals are descended from a marsupial form; and if so, the mammary glands will have been at first developed within the marsupial sack.”
In short, in the 13 years since the publication of the first edition of “Origin of Species,” Darwin had shifted from making observations of nature and drawing conclusions from them to defending his theories against the onslaughts of his many critics – mainly the religious establishment – and aligning himself with partisans who supported him.
Given that there were so many who believed that “Darwin has disproved the Bible” and more generally that “Science has disproved God,” it’s interesting that Darwin made only one substantive change to that final paragraph reproduced above. Here it is again, as it appears in the sixth edition, with the one small change being the addition of the phrase “by the Creator”:
“It is interesting to contemplate a tangled bank, clothed with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth, and to reflect that these elaborately constructed forms, so different from each other, and dependent upon each other in so complex a manner, have all been produced by laws acting around us. These laws, taken in the largest sense, being Growth with reproduction; Inheritance which is almost implied by reproduction; Variability from the indirect and direct action of the conditions of life, and from use and disuse; a Ratio of Increase so high as to lead to a Struggle for Life, and as a consequence to Natural Selection, entailing Divergence of Character and the Extinction of less improved forms. Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed by the Creator into a few forms or into one; and that, whilst this planet has gone circling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being evolved.”
Friday, November 14, 2008
Thursday, November 13, 2008
Spin Cycle
The big economic news this morning was the report that Germany’s gross domestic product declined in the third quarter, marking the second quarter in a row that Europe’s biggest economy has posted a decline in output. That two-quarter decline of course prompted finance reporters to proclaim that Germany now has met “the technical definition of a recession,” which they said is “two or more quarters of decline in GDP.”
Um, no. In the U.S. at least, the organization that more or less officially declares when recessions begin and end, the National Bureau of Economic Research, defines a recession “technically” like this:
“A recession is a significant decline in economic activity spread across the economy, lasting more than a few months, normally visible in real GDP, real income, employment, industrial production, and wholesale-retail sales. A recession begins just after the economy reaches a peak of activity and ends as the economy reaches its trough.”
In practical terms, most recessions do include two or more consecutive quarters of declining GDP, but a recession can start or end during a quarter that shows an overall increase. As the NBER puts it:
“Most of the recessions identified by our procedures do consist of two or more quarters of declining real GDP, but not all of them. The most recent recession in our chronology was in 2001. According to data as of July 2008, the 2001 recession involved declines in the first and third quarters of 2001 but not in two consecutive quarters. Our procedure differs from the two-quarter rule in a number of ways. First, we consider the depth as well as the duration of the decline in economic activity. Recall that our definition includes the phrase, "a significant decline in economic activity." Second, we use a broader array of indicators than just real GDP. One reason for this is that the GDP data are subject to considerable revision. Third, we use monthly indicators to arrive at a monthly chronology.”
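The reporters’ two-quarter rule, unlike the NBER’s judgment-based procedure, is mechanical enough to state in a few lines of code. Here’s a minimal sketch, using made-up quarterly GDP figures (the numbers are illustrative only, not actual data for 2001 or Germany):

```python
def two_quarter_rule(gdp):
    """Return True if the series contains two or more consecutive
    quarter-over-quarter declines (the reporters' 'technical' definition)."""
    declines = [later < earlier for earlier, later in zip(gdp, gdp[1:])]
    run = 0
    for down in declines:
        run = run + 1 if down else 0
        if run >= 2:
            return True
    return False

# Hypothetical GDP levels shaped like the NBER's 2001 example:
# declines in the first and third quarters, but never in two
# consecutive quarters -- so the two-quarter rule misses it.
gdp_2001_like = [100.0, 99.5, 99.8, 99.2, 99.6]
print(two_quarter_rule(gdp_2001_like))   # False

# Two straight declines, like the German report described above.
gdp_two_down = [100.0, 99.8, 99.5]
print(two_quarter_rule(gdp_two_down))    # True
```

The point of the exercise is the first case: a real recession can slip through the mechanical rule entirely, which is exactly the NBER’s objection to it.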
So far, the NBER hasn’t declared that the U.S. economy is in a recession, but they’re often a little slow to make such determinations; it wasn’t until July 2003 that they decided the 2001 recession had ended in November 2001.
Interestingly, the NBER body that makes these determinations is called the Business Cycle Dating Committee. Until recently, a casual observer of the talk coming out of Wall Street and Washington might have gotten the idea that the “business cycle” was a thing of the past: there would be no more downturns, just an endless vista of rising prosperity until the end of time, because our god-like regulators and CEOs would exercise their miraculous powers to make it so.
These of course would be the same regulators and CEOs who are now running hysterically around seeking taxpayer handouts and warning of the imminent collapse of our entire economic system if they don’t get them.
Cycles play a large role in the natural order and are often related to the kind of yin-yang/creation-destruction/attraction-repulsion systems found widely in philosophy and physical science. Given the fundamental dualism of financial and economic dynamics (buy or sell), it would be pretty surprising if we didn’t see some evidence of cyclicality in the economy and markets.
Well, no surprise here:
The chart shows the daily closing price of the Dow Jones Industrial Average from 1897 to the present, on log scale (click to enlarge). The gridlines are set to a time interval of three and a quarter years, and as you can see if you look closely, many of the most significant lows in the index occur pretty close to those lines.
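Gridlines spaced 3.25 years apart are easy to generate programmatically. A sketch follows; note that anchoring the cycle at the series’ 1897 start is my assumption for illustration, since the chart itself doesn’t state where the first gridline falls:

```python
from datetime import date, timedelta

CYCLE_YEARS = 3.25

def cycle_dates(start, end_year):
    """Yield dates spaced ~3.25 years apart, starting from `start`.
    365.25 * 3.25 is roughly 1187 days; date arithmetic keeps whole days."""
    step = timedelta(days=365.25 * CYCLE_YEARS)
    current = start
    while current.year <= end_year:
        yield current
        current += step

# Hypothetical anchor at the start of the charted series (my assumption).
for gridline in cycle_dates(date(1897, 1, 1), 1910):
    print(gridline)
```

With that anchor, the early gridlines land in 1897, 1900, 1903, 1906 and 1910; checking how closely major Dow lows cluster around such dates is the visual test the chart invites.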
That cycle is even clearer in this next one:
What this chart shows is the 52-week average of the daily percentage price change in the Dow over the same period shown in the first chart. The cyclicality in this time series is pretty obvious, though it clearly isn’t precise enough to be useful in making investment forecasts.
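The smoothed series in that second chart is simple to reproduce. A minimal sketch, using a small run of fabricated closing prices rather than the actual Dow history, and approximating 52 trading weeks as a 260-session window (52 weeks × 5 sessions, my assumption about how the chart was built):

```python
def pct_changes(prices):
    """Daily percentage change between consecutive closing prices."""
    return [100.0 * (b - a) / a for a, b in zip(prices, prices[1:])]

def rolling_mean(series, window):
    """Trailing moving average over `window` observations."""
    out = []
    total = 0.0
    for i, value in enumerate(series):
        total += value
        if i >= window:
            total -= series[i - window]   # drop the value leaving the window
        if i >= window - 1:
            out.append(total / window)
    return out

# Illustrative data: 300 fake closes rising by one point a day.
prices = [1000.0 + i for i in range(300)]
smoothed = rolling_mean(pct_changes(prices), 260)
print(len(smoothed))   # 299 daily changes yield 40 full 260-day windows
```

Run over a century of real daily closes instead of this toy series, the same calculation produces the slow oscillation the chart displays.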
Possibly the most interesting thing about this chart is the fact that the Oct. 27 low in the Dow at 8175 came just seven days shy of the exact predicted date for the 3.25-year cycle low. The index and the average rate of change both rebounded from that point, possibly signaling the start of an upswing that could last a year or more. However, for that interpretation to retain any validity, the index will have to stay above the 8175 level; it’s succeeding so far, but just barely.
Tuesday, November 11, 2008
Reasonably Irrational
I’ve been trying for some time to heap scorn on one of the central tenets of orthodox economics, namely the concept of the “rational investor.” Anyone who follows the markets can see quite clearly that investors behave irrationally at times, or have we forgotten the dot-com bubble? But the theory remains firmly in place, not because economists are stupid or because they’re deliberately trying to mislead people, but because the whole structure of mainstream economic theory would collapse without it.
Put simply, economists believe that economies and markets function efficiently because people naturally choose the courses of action that are most likely to give them the greatest benefit. In this obviously naive belief, economists are clinging to the ideas of those theorists of the so-called Age of Reason, such as John Locke and Adam Smith, who laid the foundations of our modern political and economic systems. For Locke and Smith and their like-minded contemporaries, “reason” alone is sufficient to guide all human life and unlock all the mysteries of existence, while “unreason” is all bad and a great impediment to our progress as individuals and as a society.
In particular, the thinkers of the 18th-century “Enlightenment” – many of them Deists, including a number of the founding fathers of the United States – identified “unreason” with traditional and “emotional” forms of religion. After all, they were keenly aware of the violent upheavals of the 16th and 17th centuries, when partisans on both sides of the Reformation engaged in repeated and vicious wars to promote or defend their theological positions.
These cutting-edge 18th-century opinions still hold sway with a large number of contemporary thinkers. Richard Dawkins, for example, in “The God Delusion,” voices the opinion that religious belief persists in our time mainly because of bad parenting (i.e., parents teaching their children religion), and if only we could rid ourselves of this irrational belief in the supernatural, the world would quickly enjoy unprecedented peace and harmony.
The main problem with this whole line of thought is that it takes into account only a small part of the human psyche while denying and devaluing the rest.
This was already the response of the Romantic movement, which followed close on the heels of the Enlightenment and celebrated the emotions and fantasies that had been swept out of the tidy Neoclassical worldview of Locke and Smith. The Romantics restored “irrationality” to a place of value and usefulness, perhaps even giving it too high an estimation; these swings of the pendulum do tend to carry to extremes.
It’s a bit ironic that the rationalists of the Age of Reason looked to ancient philosophy for support for their arguments, because the ancients actually had a much more balanced view of human psychology. In particular, Plato and his followers clearly delineated the psyche into an irrational and a rational part, and though they did argue that the rational soul should rule the individual psyche, they contended that the psyche as a whole should aim to serve a higher, super-rational level of being. (To be technical, this “higher level” is called nous in Greek and is translated generally as “spirit” or “intellect,” depending on the inclinations of the translator; neither term really works very well, in my opinion.)
There are many, I’m sure, who will find it absurd to accord any value to irrationality. But consider: Are our sense-perceptions rational? Of course not; they simply report the facts of our environment to our emotions and our thinking. What about instincts? No, but they're pretty useful in keeping us from starving to death and so on.
What about emotions? Well, as Carl Jung pointed out, there is in fact a kind of emotional logic, which is why he defined "feeling" as a "rational function": We can rate and rank and judge things according to how they make us feel, good or bad, better or worse. And that kind of evaluation seems pretty important to our well-being. But in our modern worldview, dominated by the belief that “rationality” consists entirely of verbal or numerical logic, it doesn’t make the cut.
And let’s not forget the importance of irrationality in creativity, in making breakthroughs. Logical analysis just breaks things down or connects one existing thing to another; it doesn’t produce anything new.
However, ignoring or denying the existence or importance of these things doesn’t make them go away; instead, it simply sweeps them under the mental rug, into the unconscious – something else a lot of contemporary thinkers like to pretend is nonexistent. And from their lurking-place in our mental shadow, they can feed on our basic appetites and drives, and grow large and powerful enough to dominate us now and then, causing all sorts of embarrassing problems and bloody conflicts.
In addition, there’s a tendency toward the thoroughly unproven and frankly rather smug belief that “we” – that is, the intellectual inheritors of the Western (specifically, the Northwestern European) worldview – are the only really rational people, while “they” – all those mostly darker people in the rest of the world – are irrational (“medieval,” “emotionally volatile,” “politically immature,” etc. etc.) and therefore in need of our benevolent (of course) guidance (or the firm hand of a dictator chosen by us).
It scarcely needs to be said, but I’ll state that I don’t think “we” are as rational as some of us like to believe, nor are “they” as irrational. And in any case, I think we need to practice irrationality to some extent. You might say that the problem isn’t that we’re irrational, it’s that we just aren’t very good at it.
Labels: Adam Smith, Age of Reason, colonialism, Deism, economics, irrational, Jung, Plato, rationalism, Western worldview
Sunday, November 9, 2008
Identity Theft
I'm finding it more and more difficult to persuade myself that I have anything to say here that's sufficiently different from what a gazillion other blogs are saying to make it worth anyone's time to read this one. We like to think of ourselves as unique individuals and to believe our thoughts, feelings and experiences in general are very different from everyone else's, but when you live in a nation of 300-plus million people, even if only 1/10th of 1 percent of your fellow citizens have the same thought, that's still 300,000 of you with the same idea.
Moreover, how many choices do we really have? Religion, for example: There's a sort of menu of options ranging from highly traditional, fundamentalist sectarianism to outright materialistic atheism. The same sort of thing is true with politics. Sports: Pick a game, then choose a team to support. So identity-formation becomes like a meal in a Chinese restaurant: Pick one item from column A, one from column B, etc. And one person is a Baptist Republican Redskins fan who drives a Ford and likes Toby Keith, another is a Unitarian Democrat Yankees fan who drives a Toyota and listens to Tori Amos. I don't know how many such combinations are possible, but the number probably isn't very large, and some of the differences are pretty insignificant.
Worse still, we live in a world in which not only physical products are mass-produced but also ideas, attitudes, styles and dreams. The products are marketed as a way of expressing who we are, and we buy them, and we also buy the premise that what we own, what we wear, what we drive expresses who we are. And maybe it does, and maybe it's not completely absurd to go to the mall to buy some individuality, but it certainly seems likely that we're limiting our possibilities that way.
If you've ever watched an older relative in the twilight of life, you've seen them seemingly fade away as they lose their grip on the attitudes, opinions, obsessions, addictions, preferences and finally the memories by which they defined themselves. What's left then, if that kind of self-definition is all they have? But who or what is it that made those choices in the first place?
Oh sure, I know, no two of us have the same fingerprints, except monozygotic twins. So yes, we're all unique in that sense - which means we're all the same, doesn't it?