Chapter 16 – The Theistic Bottom Line
The three large principles summed up in the previous chapter are enough. Having established them, we can conclude that a higher power or consciousness exists in our material universe. Or rather, as was promised in the introduction, we can conclude that belief in God is a rational choice for a thinking, informed, modern human being to make.
And that is the point. Belief in God is a
choice. It is simply a more rational choice than its alternatives.
It is also worth reiterating three other
points here: first, we must have a moral program in our heads to function at
all; second, the one we’ve inherited from the past is dangerously out of date; and third, whatever new one we devise will have to be turned from a cerebral code into a personal one. A moral code must be felt and lived as
personal or else it isn’t really a moral code at all.
This
final chapter gives a more informal explanation and interpretation of the
pieces assembled so far, and adds some other, better-known pieces whose significance
in this discussion will be explained as we go along. It will also try to
answer some of the most likely reactions to the ideas in this book. My promise
was that by the end of this book, we would be able to assemble a strong case
for theism—that is, belief in God. We’re almost there. We shall begin this last
chapter by revisiting, in a more personal way, a vexing problem in philosophy
mentioned in Chapter 4, a problem that is three hundred years old. The
solution to this problem drives home our first main point on the final stretch
of the thinking process that leads to theism.
[Image: Christopher Lloyd in “Back to the Future”, a popular image of a scientist]
Many scientists claim that their branch of human knowledge, unlike all of the ones that came before the rise of science, has no basic assumptions at its foundation. Instead, they say, it is built from the ground up on merely observing reality, hypothesizing, designing and doing research, checking the results against one’s hypothesis, and then doing more hypothesizing, more research, and so on. Under this view, science has no need of foundational assumptions in the way that, say, philosophy or Euclidean geometry do. Science is founded only on hard fact, they claim. But in this claim, as thinkers like Nicholas Maxwell have pointed out, those scientists are wrong.1
Over
the last four centuries, the scientific way of thinking, Bacon’s “new
instrument,” has made possible the amazing progress in human knowledge and
technology that today we associate with science. But in the meantime, at least
for philosophers, it has come in for some tougher analysis.
[Image: title page of an early copy of Novum Organum]
The heart of the matter, then, is the inductive method normally associated with science. Scientists come upon a phenomenon they cannot explain with any of their current models, devise a new theory that tries to explain it, test the theory with experiments in the material world, and keep going back and forth from theory to experiment, adjusting and refining. This way of gaining knowledge is called the scientific method, and it has led us to many powerful new insights and technologies. It really was an amazing breakthrough when Francis Bacon, whether we credit him with originating it or merely with expressing what many in his era were already thinking, saw and explained what he called his “new instrument” (novum organum).
But as David Hume famously argued, the logic this method is built on is not perfect. Any natural law that we try to state as a way of describing our observations of reality is a gamble, one that may seem to summarize and bring order to whole files of experience, but a gamble nonetheless. A natural law is a scientist’s claim about what he thinks is going to happen in specific future circumstances. But every natural law proposed takes for granted a deep first assumption about the real world: that events in the future will continue to follow the patterns we have been able to spot in the flow of events in the past. We simply can’t ever know whether this assumption is true. At any time, we may come upon new data that stymie our best theories.
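Put a bit more formally (my own gloss, not Hume’s wording): no finite run of confirming observations logically entails the universal law it suggests,

$$ P(a_1),\; P(a_2),\; \dots,\; P(a_n) \;\not\vdash\; \forall x\, P(x). $$

However many instances have been observed to have property $P$, the claim that every instance has $P$ outruns the evidence; the very next observation may be the counterexample.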
[Images: Albert Michelson and Edward Morley]
Clearly, science is still open to making mistakes. For scientists themselves, a shocking example was the “mistake” in Newtonian physics. Newton’s models of how gravity and acceleration work were excellent, but they weren’t telling the whole story of what goes on in the world. After two centuries of taking Newton’s models and equations as gospel, physicists were stunned by the experiment done by Albert Michelson and Edward Morley in 1887. In essence, it showed that Newton’s laws were not adequate to explain all of what was really going on. Einstein’s thinking on these new data is what led him to the theory of relativity. But first came Michelson and Morley’s experiment, which showed that the scientific method, and Newton, were not infallible.
Newton was not proved totally wrong, but his laws were shown to be approximations, accurate only for smaller masses and slower speeds. As the masses and speeds involved become larger, Newton’s laws become less and less useful for predicting what is going to happen next.
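One standard way to see the approximation (a textbook result, not something this chapter derives): the relativistic factor

$$ \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \approx 1 + \frac{1}{2}\frac{v^2}{c^2} \qquad \text{when } v \ll c, $$

so the relativistic kinetic energy $(\gamma - 1)mc^2$ collapses to Newton’s familiar $\frac{1}{2}mv^2$ at everyday speeds. The correction terms matter only as $v$ approaches $c$, which is exactly the sense in which Newton’s laws are accurate in a limited regime rather than wrong outright.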
Nevertheless,
it was a scientist, Einstein, doing science who found the limitations of the
laws and models specified by an earlier scientist. Newton was not amended by a
clergyman or a reading from an ancient holy text. Thus, from the personal
standpoint, I have always believed, I still believe, and I’m confident I always
will believe that the universe is consistent, that it runs by laws that will be
the same in 2525 as they are now, even though we don’t understand all of them
very well yet. Relativity theory describes how the stars moved in the year 1,000,000 BC just as accurately as it describes their movements now. In
that era, living things reproduced and changed by the process that we call
evolution just as reliably as living things do now. I believe that, for living
things, genetic variation and natural selection are constants.
But
I can’t prove beyond any doubt that the universe runs by one consistent set of
laws; I can only choose to take a Bayesian kind of gamble on the foundational
belief that this is so. I prefer this belief as a starting point over any
alternative beliefs that portray the universe as being run under laws that are
capricious and unpredictable. Science has had so many successes that, even though its findings and theories are not infallible, I choose to heed what scientists have to say. That choice, for me, is just a smart Bayesian
gamble, preferable to any of the superstitious alternatives. Or as Robert Frost
said:
Two roads diverged in a wood, and I,
I took the one less travelled by,
And that has made all the difference.
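To make that “Bayesian gamble” concrete, here is a minimal sketch (my own illustration; the prior and likelihood numbers are made up, not measured): start undecided between a lawful universe and a capricious one, then update with Bayes’ rule each time a scientific prediction comes true.

```python
# A toy Bayesian update: two rival hypotheses about the universe,
# revised in light of a run of successful scientific predictions.
# All numbers below are invented for illustration.

prior_lawful = 0.5      # P(universe runs on consistent laws)
prior_capricious = 0.5  # P(universe is capricious and unpredictable)

# Chance that any one scientific prediction succeeds under each hypothesis.
p_success_if_lawful = 0.99
p_success_if_capricious = 0.5  # in a capricious world, success is a coin flip

successes = 20  # a run of confirmed predictions

# Bayes' rule: posterior is proportional to prior times likelihood.
post_lawful = prior_lawful * p_success_if_lawful ** successes
post_capricious = prior_capricious * p_success_if_capricious ** successes
total = post_lawful + post_capricious

print(f"P(lawful | {successes} successes) = {post_lawful / total:.6f}")
# prints about 0.999999 -- the gamble on a lawful universe strengthens
# with every success, though it never reaches certainty.
```

That last comment is the whole point: the posterior climbs toward 1 but never proves the lawful hypothesis, which is why the choice remains a gamble rather than a certainty.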
There
is even evidence that the superstitious tribes of the past knew of the
inductive method and gained knowledge by it.2,3,4 They turned to
their gods only when they couldn’t figure out how some natural process worked.
One of the big effects of science has been to gradually dispel superstitions as
better insights into the workings of physical reality are acquired. In fact,
most people today, at least in the West, concede almost automatically that
superstitions do need to be dispelled. Plagues aren’t caused by evil spirits or
God’s punishment, and they don’t go away if we burn an incense stick or chant
for days at a time. But if we control the rats, we can control bubonic plague. If
we selectively breed our livestock, then our chickens, cows, sheep, and pigs
keep giving more eggs, milk, wool, and pork. In short, all humans and all human societies keep
gradually becoming more rational because survival demands it.
What are your thoughts now? Comment and I will reply. I promise.