Chapter 17 – The Theistic Bottom Line
The three large principles summed up in the previous
chapter are sufficient. Having established them, we can conclude that a
higher power or consciousness exists in our material universe. Or rather, as
was promised in the introduction, we can conclude that belief in God
is a rational choice for an informed modern human being to make.
And that is the point. Belief in God is a choice. It
is simply a more rational choice than its alternatives.
It is also worth reiterating three other points here:
first, we must have a moral program in our heads to function at all; second,
the one we’ve inherited from the past is dangerously out of date; and third,
whatever new one we devise will have to be turned from a cerebral code into
a personal one. A moral code must be felt and lived as personal, or else it isn’t
really a moral code at all. It will not guide us when a moral crisis comes.
This final chapter gives a more informal
explanation and interpretation of the pieces of argument assembled so far, and then adds some other, better-known pieces whose significance in this discussion will
be explained as we go along. It will also try to answer some of the most
likely reactions to the ideas in this book.
My promise was that by the end of this book, we
would be able to assemble a strong case for theism – that is, belief in God. We’re
almost there. We shall begin this last chapter by revisiting, in a more
personal way, a vexing problem in Philosophy mentioned in Chapter 4, a problem
that is three hundred years old. The solution to this problem drives home our
first main point on the final stretch of the thinking process that leads to
theism.
Many scientists claim that their branch of human
knowledge, unlike all the ones that came before the rise of Science, has no
basic assumptions at its foundation and that it is instead built from
the ground up merely on observing reality, forming theories, designing research, doing
the research, checking the results against one’s theories, and then doing
more hypothesizing, more research, and so on. Under this view, Science has no need
of foundational assumptions in the way that, say, Philosophy or Euclidean Geometry
does. Science is founded only on hard facts, they claim. But in this claim, as
has been pointed out by thinkers like Nicholas Maxwell, those scientists are
wrong.1
Over the last four centuries, the scientific way of
thinking, Bacon’s “new instrument,” has made possible the amazing progress in
human knowledge and technology that today we associate with Science. But in the
meantime, at least for philosophers, that way of thinking has come in for some
tougher analysis.
[Image: cover of an early copy of Novum Organum (credit: John P. McCaskey, via Wikimedia Commons)]
The heart of the matter, then, is the inductive
method normally associated with Science. Scientists come
upon a phenomenon they cannot explain with any of their current models, devise
a new theory that tries to explain the phenomenon, test the theory by doing
experiments in the material world, and keep going back and forth from theory to
experiment, adjusting and refining. This way of gaining knowledge is called the
scientific method, and it has led us to many powerful new insights and technologies. It really was an
amazing breakthrough when Francis Bacon – whether we credit him with originating
it or merely with expressing what many in his era were already thinking – saw and
explained what he called his “new instrument” (novum organum).
But as David Hume famously argued, the logic this
method is built on is not perfect. Any natural law that we try to state as a
way of describing our observations of reality is a gamble, one that may seem to
summarize and bring order to entire files of experiences, but a gamble
nonetheless. A natural law is a scientist’s claim about what he thinks is going
to happen in specific future circumstances. But every natural law proposed
takes for granted a deep first assumption about the real world: that
events in the future will continue to follow the
patterns we have been able to spot in the flows of events in the past. We
simply can’t ever know whether this assumption is true, and we can’t justify it
by pointing to induction’s past successes, since that defense uses induction to
vouch for induction. At any time, we may come upon new data that stymie our best theories.
[Image: Albert Michelson (credit: Bunzil, via Wikimedia Commons)]
[Image: Edward Morley (credit: Wikimedia Commons)]
Clearly, Science is open to making mistakes. For
scientists themselves, a shocking example came from Physics itself.
Newton’s model of how gravity and acceleration work was excellent, but
it wasn’t telling the full story of what goes on in the universe. After two
centuries of taking Newton’s equations as their gospel, physicists were stunned
by the experiment done by Albert Michelson and Edward Morley in 1887. It failed
to find the “ether wind” that the physics of the day predicted; in essence, it
showed that the accepted Newtonian picture was not adequate to explain all of what was
really going on. Einstein’s thinking on these new data is what led him to the Theory
of Relativity. But first came Michelson and Morley’s experiment, which showed
that the scientific method, and Newton, were not infallible.
Newton was not proved totally wrong, but
his laws were shown to be mere approximations, accurate only for modest masses
moving at speeds far below that of light. As the masses and speeds involved grow
larger, Newton’s laws become less reliable for predicting what is going to happen next.
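To make “approximation” concrete, here is a standard worked figure (my illustration, not the book’s). In Special Relativity, the size of the correction to Newton’s mechanics at speed v is governed by the Lorentz factor

\[ \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \]

where c is the speed of light. For a jet airliner at roughly v = 300 m/s, v²/c² is about 10⁻¹², so γ differs from 1 by about 5 × 10⁻¹³, and Newton’s equations are accurate to better than a part in a trillion. At v = 0.9c, by contrast, γ ≈ 2.29, and the Newtonian predictions fail badly.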
Nevertheless, it was a scientist, Einstein, doing science,
who found the limitations of the laws and models specified by an earlier
scientist. Newton was not amended by a clergyman or a carpenter or a reading from
an ancient holy text. Thus, from a personal standpoint, I have always
believed, I still believe, and I’m confident I always will believe that the
universe is consistent, that it runs by laws that will be the same in 2525 as
they are now, even though we don’t understand all of them very well yet.
Relativity theory describes how the stars moved in
the year 1,000,000 BC just as accurately as it describes the stars’
movements now. In that era, living things reproduced and changed by the process
we call evolution just as reliably as
living things do now. I believe that, for living things, genetic variation and
natural selection are constants.
But I can’t prove beyond any doubt that the
universe runs by one consistent set of laws; I can only choose to take a
Bayesian kind of gamble on the foundational belief that this is so. I prefer
this belief as a starting point over any alternative beliefs that portray the
universe as being under laws that are capricious and unpredictable. Science has
had so many successes that, even if I can’t be certain that its findings and
theories are infallible, I choose to heed what scientists have to say. That
choice, for me, is just a smart Bayesian gamble, preferable to any of the
superstitious alternatives. Or as Robert Frost put it: “Two roads diverged in a
wood, and I – / I took the one less traveled by, / And that has made all the difference.”
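For readers who want to see the arithmetic behind such a gamble, here is a minimal sketch, mine rather than the book’s, with illustrative numbers chosen purely for the example. It applies Bayes’ rule to the hypothesis that nature runs by consistent laws, updating the belief after each successful prediction:

```python
# A minimal sketch (illustrative numbers, not from the book) of the
# "Bayesian gamble": start with a prior belief that nature runs by
# consistent laws, then update it each time a prediction succeeds.

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One step of Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# H: "the universe runs by one consistent set of laws."
belief = 0.5                    # start undecided
p_success_if_lawful = 0.99      # a lawful universe makes predictions succeed
p_success_if_capricious = 0.5   # a capricious one makes success a coin flip

for n in range(1, 11):          # ten successful predictions in a row
    belief = update(belief, p_success_if_lawful, p_success_if_capricious)
    print(f"after success {n:2d}: P(consistent laws) = {belief:.4f}")
```

Each confirmed prediction multiplies the odds in favor of a lawful universe by roughly two (0.99/0.5), so after ten successes the posterior is already about 0.999. The gamble is never a proof, but it quickly becomes a very good bet.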
[Image: two roads diverged in a wood (credit: Jakec, via Wikimedia Commons)]
There is even evidence that tribes of the past knew
of the inductive method and gained knowledge by it.2,3,4 As the millennia
passed, they turned more and more to their gods only when they couldn’t
figure out on their own how some natural process worked. One of the big effects
of Science has been to steadily dispel superstitions as better insights into
the workings of physical reality are acquired. In fact, most people today, at
least in the West, concede almost automatically that superstitions need to be
dispelled. Plagues aren’t caused by evil spirits or God’s punishment, and they
don’t go away if we burn incense or chant for days at a time. But if we control
rats, we can control bubonic plague. If we selectively breed our livestock,
then our chickens, cows, sheep, and pigs keep giving more eggs, milk, wool, and
pork. In short, humans and human societies keep gradually becoming more
rational – that is, more scientific in their outlooks and lifestyles – because
survival demands it.
My model of cultural evolution also showed me why
some superstitious beliefs hang on for generations before they are dispelled.
But in the end, as old thinkers are replaced by more enlightened ones, the
method of human learning, whether it is individual or tribal, is an inductive
one. We get ideas about the material world, and we test them. We sometimes test
worldviews or moral systems over generations, and what we learn is absorbed by
the whole tribe rather than cognized by any one individual. But our
knowledge keeps growing, as it must if we are to survive. We are the only
concept-driven species that we have encountered so far. The knowledge-accumulating,
social way of surviving is the human way. Our genetically acquired assets
(speed, strength, etc.) are trivial by comparison. We live by accumulating better and better knowledge and passing it on to our kids, or we die.