A former climate skeptic speaks

By Paul Braterman

Scientists are supposed to change their minds when the balance of evidence changes. In my experience, this doesn’t always happen, but one very respected scientist who has changed his mind, not once but twice, and very publicly, is Prof Richard Muller of the Berkeley Earth Surface Temperature project, UC Berkeley, and Lawrence Berkeley National Laboratory.

Three years ago, he was among the few remaining respected scientists to reject the IPCC (Intergovernmental Panel on Climate Change) analysis of current climate, and unconvinced that significant climate change was happening at all, let alone that it might be driven by human activity. Not surprising, then, that a consortium of those with an interest in denial funded his BEST (Berkeley Earth Surface Temperature) project, to carry out a completely independent, assumption-free, analysis of the data. They got more than they bargained for.

First, BEST concluded, in findings published last year, that warming is indeed taking place, as asserted by the overwhelming majority of the climate science community. Now, even more significantly, BEST has taken the position, in a paper submitted to The Third Santa Fe Conference on Global and Regional Climate Change, that CO2 is the most significant driver, that the IPCC estimate of the effect of CO2 (3 °C of warming for each doubling of CO2 concentration) is accurate, and that the amount of warming from the 1950s to the 2000s (0.87 ± 0.05 °C) is if anything slightly more than the IPCC estimate.

Moreover, the BEST publications analyse and dismantle all the standard objections to this work. Yes, there are effects due to volcanoes. I don’t think anyone denies this. The 1991 Mount Pinatubo eruption led to summers bad enough to force up the price of potatoes, but earlier events, such as the eruptions of Laki in 1783, Tambora in 1815, and Cosiguina in 1835, were much more significant. We know this from accounts at the time, and can quantitatively estimate the intensity of the eruptions from the amount of sulphate in ice cores. The late 18th and early 19th century data are good enough to provide a scaling factor (0.15 °C of cooling per gigatonne of emitted sulphate), showing that, by contrast, the overall effect of volcanoes in the 20th century has been insignificant.

No, there is no significant effect attributable to the Sun. No, there are no artefacts due to the number and location of climate stations, although this has not stopped Watts Up With That?, a Koch-funded enterprise, from raising this yet again in response to Muller. There is some variability connected with oscillations in the Atlantic, which may be responsible for the 0.17 °C variation from the simplest model.

This model, which attributes all change to a linear volcanic effect and a logarithmic CO2 effect, is remarkably successful. In the light of Muller’s work there is no excuse for invoking alleged scientific uncertainty to delay urgent consideration of the effects of further increasing CO2 concentrations, and the appropriate policy responses.
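The simplest model described above is easy to sketch numerically. Using only the figures quoted in the text (3 °C per doubling of CO2, and 0.15 °C of cooling per gigatonne of emitted sulphate), a minimal illustration might look like the following; the function and parameter names, and the sample inputs, are my own, not BEST’s:

```python
import math

# Figures quoted in the text (illustrative; not BEST's actual fitted values)
CLIMATE_SENSITIVITY = 3.0   # °C of warming per doubling of CO2
VOLCANIC_FACTOR = 0.15      # °C of cooling per gigatonne of emitted sulphate

def temperature_anomaly(co2_ppm, baseline_ppm=280.0, sulphate_gt=0.0):
    """Simplest model: a logarithmic CO2 term plus a linear volcanic term."""
    co2_term = CLIMATE_SENSITIVITY * math.log2(co2_ppm / baseline_ppm)
    volcanic_term = VOLCANIC_FACTOR * sulphate_gt
    return co2_term - volcanic_term

# Doubling CO2 with no volcanic activity gives the full 3 °C sensitivity:
print(round(temperature_anomaly(560.0), 2))                  # 3.0
# A large eruption temporarily offsets a small part of the warming:
print(round(temperature_anomaly(560.0, sulphate_gt=2.0), 2))  # 2.7
```

The logarithmic form is why each successive doubling, not each successive tonne, of CO2 adds the same increment of warming.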

There are some particular words of caution. We don’t understand why the difference between day and night time temperatures decreased from 1900 to 1987, but then started rising again. We can’t be certain that present temperatures exceed those of the Mediaeval Warm Period, so beloved of climate “sceptics”, although it seems clear that if things continue as they are, the issue will be beyond all doubt. Warming has not led to more hurricanes, and the heat wave afflicting the United States this summer is local rather than global.

Muller’s approach includes the effects to date of one important feedback, the positive feedback due to the fact that ocean warming leads to increased concentrations of water vapour, the most significant of all greenhouse gases. However, of necessity, it neglects effects that have not yet kicked in, most of which can only add to our concerns. A tiny minority of climate scientists still maintain that increased cloud cover will moderate the effects of warming, but the evidence (see Science 2009, 325, 376, for a discussion) now shows that the opposite is true. Melting of sea and land ice will speed up warming by reducing the Earth’s albedo, the fraction of incident sunlight that is reflected straight back into space, since exposed ground, crops, and open ocean absorb more energy than snow and ice. There is the prospect of release of methane from thawing tundra, and of increased release of carbon dioxide from soils, as bacterial activity increases with warming. The only negative feedback in prospect is the greater reflectivity of deserts, as compared with cultivated land, but that is the last thing we should be looking forward to.

Thus Muller’s approach offers us the lowest credible estimate of what is in store for us. Despite which, opposition parties in Australia and the US, including one US presidential candidate, and a vocal faction within the United Kingdom’s governing Conservative Party, continue, and may be expected to continue indefinitely, in their denial that any real problem has been shown to exist.

I find this frightening.

Paul Braterman also now has his own blog, Stars and Stalagmites.

Don’t say “Darwin” when you mean “evolution” (part 1)

By Paul Braterman

Part I: Dalton and Darwin

Don’t say “Darwin” when you mean “evolution”. Don’t say “theory of evolution” when you mean the established historical facts of change over time and common descent. And above all, don’t say “Darwin’s theory of evolution” except in the historical context of the evolution of ideas. If you do, you are guilty of scientific, logical, historical, and pedagogical errors, and playing into the hands of our Creationist opponents.

Dalton is to the modern atomic theory, and the modern atomic theory is to chemistry, as Darwin (not to forget Wallace) is to evolution, and as evolution is to biology. But we don’t call our present perspective on atoms “Dalton’s theory”, and indeed, unless we are speaking historically, it sounds odd even to talk about “atomic theory” when we discuss atoms. So why should we refer to “Darwin’s theory”, and indeed why should we talk about the “theory of evolution” when we really mean the fact that evolution has taken place? I argue here that we shouldn’t, and that, given the ongoing opposition to the central facts of biology, it is actively damaging to do so.

John Dalton produced his “atomic theory” in the early 19th century. He arrived at it by way of a theory of gas pressure that we now know to be totally erroneous. Wielding Occam’s Razor rather too energetically, he assumed that the simplest compound between two elements contained just one atom of each, so that water would have the formula HO. He rejected what now seems to us perhaps the most striking validation of his theory, Gay-Lussac’s observation that gases combine according to simple ratios, because it pointed towards what later became known as Avogadro’s Hypothesis, which in turn required some gaseous molecules to be divisible,[1] and when it came to gaseous elements Dalton had not grasped the distinction between atoms (the fundamental particles of chemical composition) and molecules (the fundamental particles of gas pressure). It was half a century before his theory was generally accepted, and even then some remained sceptical, on the grounds that no one had ever observed the effects of individual atoms or molecules, until in 1905 Einstein pointed out that that was exactly what people were doing when they looked at Brownian motion. These days, however, individual atoms are routinely observed by the methods of high resolution transmission electron microscopy, and scanning tunnelling microscopy, both of which depend on concepts far beyond any available to Dalton.

Charles Darwin produced his theory of the mutability of species as the result of natural selection (he did not himself use the term “evolution”) in the mid-19th century. Central to the theory is the existence of sufficient heritable variation to explain the diversity of life, and a major stumbling block is the prospect that favourable variations will disappear through dilution.[2] He appealed to the experience of animal breeders, but as a solution to the problem of dilution this is grossly unfair, since breeders can and do deliberately select rare variants to breed between. He lamented the absence of fossil evidence, in terms still quoted by creationists despite the tons (literally) of such evidence that have been unearthed in the intervening 150 years. He was unaware of the digital nature of inheritance, as established by his contemporary, Gregor Mendel, but not widely known until that work was rediscovered (or more accurately, perhaps, reinterpreted)[3] at the beginning of the 20th century. He fully realised that evolution required many millions of years, and had no good answer when Lord Kelvin, his “ogre”, used thermodynamic arguments to show (correctly) that the then known sources of energy could only have kept the Sun shining for a mere 20 million years or so. He had no inkling of the nature of the genetic material, and could not have conceived of the methods of molecular biology that now allow us, using much the same kind of evidence that the courts use to establish paternity, to compare related species and to chart their divergence in exquisite detail. Least of all did he have any notion of the source of the variations on which evolution depends, or of how the supply of variants is constantly replenished by mutation, a process that we can now observe at the level of an individual’s DNA.

The first half of the 20th century saw the formation of what became known as “the neo-Darwinian synthesis”, bringing together by the 1940s the concept of selection and the methods of population genetics. (The expression “neo-Darwinian” should now properly be restricted to the evolutionary thought of that time, although Creationists persist in applying it to current biology, for reasons discussed in part II.) The second half saw an explosion in our understanding of inheritance, based on laboratory studies, while the final decades saw breakthroughs in our understanding of human evolution, with the discovery of the fossilised remains of over a dozen species of our early relatives in eastern and southern Africa. By the end of the 20th century, evolution denial could fairly be compared with Holocaust denial. Given what we have learnt from molecular biology in the present century, it could now more fairly be compared with denying that Hitler ever invaded Poland in the first place.

So why does the name of Darwin still provoke controversy? Why do people still speak of “the theory of evolution” when, as often as not, they are referring not to theory but to established historical fact? Why does it matter, and how should we respond? These topics will be the subject of my next posting, “Naming and Framing”.

(You will find more on Dalton and his times, and on Kelvin and the age of the earth, in my book, From Stars to Stalagmites, and the arguments in these two posts are developed at greater length in an article that I wrote with Britt Holbrook, “Putting Darwin in his Place: the Need to Watch our Language”.)

Footnotes

[1] Two volumes of hydrogen combine with one volume of oxygen to make two volumes of steam. If, as required by Avogadro’s Hypothesis, equal volumes of gas contain equal numbers of molecules, then each molecule of oxygen must contain (at least) two atoms, as shown in the way we now write this equation: 2H₂ + O₂ → 2H₂O
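The volume arithmetic in this footnote can be checked mechanically: under Avogadro’s Hypothesis, volumes are proportional to molecule counts, so the atoms needed by the product volumes must be supplied by the reactant volumes. A small sketch (the function is my own illustration, not anything from Dalton or Avogadro):

```python
def atoms_per_molecule(product_volumes, atoms_per_product, reactant_volumes):
    """Minimum atoms of an element per reactant molecule, given measured
    volume ratios and Avogadro's equal-volumes/equal-numbers hypothesis."""
    return product_volumes * atoms_per_product / reactant_volumes

# Oxygen: 2 volumes of steam (1 O atom each) come from 1 volume of oxygen gas
print(atoms_per_molecule(2, 1, 1))  # 2.0 -> each oxygen molecule is at least O2
# Hydrogen: 2 volumes of steam (2 H atoms each) come from 2 volumes of hydrogen
print(atoms_per_molecule(2, 2, 2))  # 2.0 -> each hydrogen molecule is at least H2
```

It was precisely this divisibility of gaseous elements into two-atom molecules that Dalton could not accept.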

[2] Imagine a favourable red variant in a population of white flowers. Under the blending theory of inheritance then current, its first generation offspring will be deep pink, the second generation somewhat paler, and so on until the descendants are indistinguishable from the general population.
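The dilution argument in this footnote is just a geometric halving, easy to sketch (the function name and framing are mine):

```python
def blended_redness(generations):
    """Fraction of the original red trait remaining under blending
    inheritance, if each generation is crossed back into the white
    population."""
    return 0.5 ** generations

# The favourable variant fades geometrically towards the background:
for g in range(1, 5):
    print(g, blended_redness(g))  # 1 0.5 / 2 0.25 / 3 0.125 / 4 0.0625
```

Mendel’s digital (particulate) inheritance removes the difficulty: the red allele is passed on whole or not at all, so it is never diluted away.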

[3] See Genesis: The Evolution of Biology, Jan Sapp, OUP, 2003, pp. 117–122.

Discovery Institute barking mad over Australopithecus sediba’s diet

By Paul Braterman

I don’t normally bother with the Creationist newssheet, Evolution News and Views, but the recent article there by David Klinghoffer goes beyond what I am willing to suffer in silence. Klinghoffer himself, of course, is a senior fellow of the Discovery Institute, and the author of How Would God Vote? Why the Bible Commands You to Be a Conservative. I do not know his academic credentials; I seem to remember that he is a lawyer, but neither his biography on the Discovery Institute website, nor his Wikipedia entry (which follows that biography rather closely), gives any details, and I hope that some readers can tell us more about this.

I also think it worth noting that Klinghoffer’s article has nothing to do with Intelligent Design, misguided though that may be. Like so much Discovery Institute material, it is an attack on the well-established facts of common ancestry. In other words, what is being advocated is, in the strictest and narrowest sense of the word, creationism. And not even creationism as a philosophical or religious position, but as an interpretation of the facts of biology, in a manner that has been intellectually unsustainable since around 1830.

Anyway, to business: in my own recent posting here I describe why, when announcing their finding in 2010, the discoverers of Au. sediba chose, on reflection, to include it in the genus Australopithecus rather than in the genus Homo. That 2010 account does, however, give a long list of ways in which Au. sediba is closer than other australopithecines to modern humans, and the title I chose for my piece (An Almost Human Tragedy) reflects this. I also describe the most recent, rather surprising, finding: that the diet of Au. sediba ignored available grasses, in favour of woodland products such as tree bark.

Now here is what Klinghoffer has to say about this same finding:

Another Human “Ancestor” Bites the Dust Bark

…Sure enough, the cooling trend [concerning the importance of Au. sediba] is now plainly in evidence, with Nature reporting that the creatures had a very notable characteristic in common with chimps, not humans, that had not previously been recognized: their diet, highlighted by tree bark and wood. This was found thanks to an analysis of tooth enamel and dental tartar and microwear. The NY Times lets its readers down softly:

Dr. Berger was an author of the new journal report. Few other paleoanthropologists agree with Dr. Berger’s contention that the new species is the most plausible known ancestor of archaic and modern humans. [Emphasis added by Klinghoffer]. Dr. [Amanda G.] Henry’s group said that studies of additional fossils from the Malapa caves “will provide a better understanding of the dietary ecology of Au. sediba.”

Actually, the New York Times account amplifies an earlier one, which said

The discoverer of the fossils, Lee Berger of the University of Witwatersrand in Johannesburg, says the new species, known as Australopithecus sediba, is the most plausible known ancestor of archaic and modern humans. Several other paleoanthropologists, while disagreeing with that interpretation, say the fossils are of great importance anyway, because they elucidate the mix-and-match process by which human evolution was shaped.

And the original paper in Science actually said, in the Abstract,

Combined craniodental and postcranial evidence demonstrates that this new species shares more derived features with early Homo than any other australopith species and thus might help reveal the ancestor of that genus

…and, in the body of the paper (p 203, column 3),

We can conclude that combined craniodental and postcranial evidence demonstrates that this new species shares more derived features with early Homo than does any other known australopith species, and thus represents a candidate ancestor for the genus, or a sister group to a close ancestor that persisted for some time after the first appearance of Homo [my added emphasis].

The situation is exactly as I described it, with no great claim to originality, in my earlier account:

The problem is no longer one of finding a missing link, but one of tracing an individual branch (the one that led to us) through a densely forking bush. It is always notoriously difficult to distinguish closely related species, because of individual differences. Even when we can, we have no way of being sure which extinct species lie on our direct ancestral line; it is difficult to tell the difference between our great-grandfather and our great-great-uncle, or between one great-great-uncle and another.

In short, then, on the basis of newspaper accounts and apparently without having read the original literature, Klinghoffer gleefully demotes Au. sediba from a position that most workers in the field had never even claimed for it, in the belief that the evolutionary account is thereby in some way undermined. Actually, the boot is on the other foot; the loser is the religious doctrine of separate creation. For if the 20 or so known distinct australopithecine and other early hominin species are not related by common descent, and were therefore doomed to extinction without progeny, why were they ever created in the first place?

New Book by Paul Braterman, Out Now!

By Les Ogilvie

This week sees the release of “From Stars to Stalagmites”, the first popular science book by Emeritus Professor of Chemistry and occasional 21st Floor contributor Paul S. Braterman. In the book, Professor Braterman seeks to explain the history of chemistry - not just as a reflection on chemistry as a science, but also as a description of the chemical history of the universe. Along the way he will tackle subjects ranging from the formation of stars, through the chemistry of warfare, to the uncertainty of the quantum world. As anyone who has read Paul’s articles here at the 21st Floor will no doubt realise, this promises to be an excellent, informative, knowledgeable and highly readable work.

We will provide a review of the book as soon as possible, but in the meantime here is some of the praise the book has already received:

“This highly readable book does an excellent job of explaining scientific concepts in plain language, and brilliantly connects social history with scientific history and concepts. Strongly recommended for readers of all backgrounds.”

Oscar Liu
Senior Principal Scientist, Merck

“It’s a terrific read and the idea of intertwining the facts of chemistry with the history and personalities of the scientists who discovered it works brilliantly.”

John Wiltshire
Systems Engineer
Nelson Gold Medallist for Creativity

“Your writings are a wonderful compilation of chemistry, history, and human endeavors. The chapter on Haber was superb! … This text is something that every chemist should read!”

Prof Diana Mason
Regional Director, Associated Chemistry Teachers of Texas
University of North Texas

“Getting to know of atoms and molecules and their motions was not easy — Braterman pulls us into the story of the people who got us that hard-won knowledge. A superb combination of history and scientific explanation!”

Roald Hoffmann
Nobel Laureate Chemist and Writer

“Using an historical approach, From Stars to Stalagmites teaches about science in an engaging and fun manner that should appeal to interested lay readers and professionals alike.”

Richard Hirsh
Professor of History of Technology
Virginia Tech

The book is available now from the publishers, World Scientific, and can also be ordered from Amazon.

Good without God? The roots of morality

By Paul Braterman

Firstly, God doesn’t help any. The proof is known as the Euthyphro dilemma, after the character who in Plato’s dialogue of that name rashly tried to tell Socrates what it is to be pious. In present-day terms, does God want what is good because it is good, or is it good simply because God wants it? Most of us (including Socrates and Euthyphro) would reject the second alternative, because it would make goodness depend on God’s arbitrary decision. But that means that goodness is defined independently of what God wants, even if (for a certain kind of believer) God always wants what is good, and we are no further forward. Some believers invoke the brotherhood of man, as a consequence of the fatherhood of God. Humanists may find such people natural allies, while claiming to have reached the same conclusion more quickly, by cutting out the middleman. Finally, there are those who believe, like the philosopher Alvin Plantinga, that our intellectual and moral faculties were divinely implanted by some supernatural process. To those who find this a satisfactory explanation, I have nothing useful to say.

If our moral sense is not a miraculous implant, it must be the product of our evolution. Regarding this, we have two very different schools of thought, which I shall call the hardheaded and softhearted (or, if you want to be cruel, hardhearted and softheaded), rather like the psychologists’ dichotomy of tough- versus tender-minded.

The hardheaded view is that the natural condition of humanity is a brutal and lawless selfishness, developed in response to a hostile and indifferent Nature, which must be kept in check by a recently evolved veneer of civilisation. A decade before the publication of The Origin of Species, the poet Tennyson had written of “Nature, red in tooth and claw”. Herbert Spencer, a believer in individual competition and a minimal role for the State, referred to “survival of the fittest” when challenged by “the aggregate of external forces” (he had been a railway engineer before he developed his own theory of evolution). T. H. Huxley, remembered today as a leading advocate of Darwin’s ideas, actually held views on the nature of morality different from Darwin’s, regarding it as something imposed by society on our natural individual selfishness. Similar views had been expressed much earlier by Hobbes, for whom human life in a state of nature would have been “solitary, poor, nasty, brutish and short”.

The softhearted view, which I find far more attractive (though this has nothing to do with whether or not it is correct), is that our capacity for morality is a hardwired product of our evolution. This is what Darwin thought, and in The Descent of Man he speaks of our moral sense as the noblest and most evolved part of our nature, arising from the combination of what he called the “instinct of sympathy” with our use of reason. This view also has deep roots. Hobbes himself spoke of relieving his own distress by giving money to a beggar, and Adam Smith, in The Theory of Moral Sentiments, regards concern for others as an essential part of our humanity. Kropotkin, a biologist as well as a revolutionary, gave such sympathies, even in animals, a central role in biology, in his 1902 work, Mutual Aid: A Factor of Evolution. Very recently, E. O. Wilson, in The Social Conquest of Earth, has argued that social instincts explain the remarkable evolutionary success both of insects and of humans.

Before considering the merits of these two viewpoints, there is one misunderstanding that we need to clear out of the way. We have all heard of the “selfish gene”, but the selfish gene is not a gene for selfishness. We can see this even at the molecular level. All of us have, within our own DNA, sequences derived from viruses that infected some remote ancestor. Such sequences are passed down from generation to generation and even species to species, and can persist, recognisable but mutating, for millions or tens of millions of years. We can even construct an evolutionary tree (Fig. 1) based on the acquisition of these “endogenous retroviruses”, ERVs, and it should surprise no one (except a creationist) that this agrees with the trees based on anatomy and the fossil record, or on molecular phylogeny.

Fig. 1. Retroviral insertions in DNA (courtesy John Wiltshire, after Theobald)

Now consider the eventual fate of an ERV. Given enough time, it will mutate beyond recognition, or perhaps disappear altogether. But if it should, in the course of this random mutating, chance to adopt a form that is of some value to the organism, then it will itself become subject to the normal processes of natural selection and refinement of function. In extreme cases, it may even become essential to the organism, as is the case with an ERV that now directs the construction of the placenta in Carnivora. When your cat has kittens, she has this ERV to thank, as well as the tom next door. The parasitic intruder has become an essential member of the household.

So is the ERV driven by some sense of sympathy for the host, or a sense of fairness or duty that makes it determined to pay for its board and lodging? Of course not. Nothing is involved beyond simple natural selection. What works, works; all living things are in competition for resources; and at times, the best way to compete is to cooperate. To put it rather differently, the struggle for existence always takes place within an environment; for a piece of DNA, the environment is the individual organism; and for simians such as ourselves, the environment is the social group, outside of which any individual would be hard put to survive.

But when we try to map the evolution of human morality, we run into an immediate problem. We know of some 20 distinct species more or less intermediate between our own species, and our last common ancestor with chimpanzees. The trouble is that what a skull or a toe bone might tell us about its owner’s behaviour is strictly limited. We have fossils of molars, but not fossils of morals. We can infer what food our ancestors ate, but not how generously they shared it.

So we have to use a range of more indirect approaches, such as comparison with our closest animal relatives, careful analysis of actual human behaviour, and consideration of what life must have been like, and what strategies would have favoured survival, when we lived as groups of hunter gatherers, or indeed for gregarious species in general.

This approach suggests a number of different precursors for human morality, many of them recognised by Darwin himself in The Descent of Man. Firstly and most obviously, we have kin selection. Obviously, if we do not succeed in caring for our offspring, our line will go extinct. Slightly less obviously, each of our siblings shares as much of our DNA as our own children. When the biologist J. B. S. Haldane was asked whether he would lay down his life for his brother, he said no, but he would lay it down for two brothers, or for eight cousins. Next we have reciprocal altruism. You scratch my back, and I’ll scratch yours, and if your back doesn’t need scratching today I will bear your helpfulness in mind tomorrow. Thus returning favours is simply a matter of enlightened self-interest. Very intelligent animals, like us, will generalise beyond the individual occasion, so that we can build up reputations if we seem generous and trustworthy, and the easiest way to seem generous and trustworthy is to be generous and trustworthy.
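Haldane’s quip is simple relatedness arithmetic: a full sibling shares on average half your genes, and a first cousin one eighth, so the genetic break-even for self-sacrifice comes at two brothers or eight cousins. A minimal sketch (the dictionary and function names are my own):

```python
from fractions import Fraction

# Average coefficients of relatedness for some relatives
RELATEDNESS = {
    "child": Fraction(1, 2),
    "sibling": Fraction(1, 2),
    "cousin": Fraction(1, 8),
}

def break_even_count(relative):
    """Number of relatives whose rescue exactly offsets, in copies of
    shared genes, the loss of your own life (equal costs and benefits)."""
    return 1 / RELATEDNESS[relative]

print(break_even_count("sibling"))  # 2 -> lay it down for more than two brothers
print(break_even_count("cousin"))   # 8 -> ... or more than eight cousins
```

The same arithmetic underlies Hamilton’s rule, which weighs the benefit to a relative by the degree of relatedness before comparing it with the cost to the altruist.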

Then there are matters involving the group as a whole. If you run away when the group is closing in on the mammoth, you won’t get invited to future hunting parties. Everyone in the group has an interest in avoiding destructive conflicts, but also in ensuring that other members pull their weight. This would explain the sense of fairness, the urge to punish, and, since punishing itself is difficult and dangerous, the urge to punish those who don’t want to do their fair share of punishing.

We could consider these things in terms of the survival and reproductive chances of individuals, or of the welfare of the group as a whole. Indeed, the quickest way to start a fight among biologists is to ask about the relative importance of individual and group selection. I am not well-informed enough to contribute greatly here, beyond pointing out that humans, like ants, practise genocide. This can alter the genetic composition of a region’s population within a single generation, as claimed (mendaciously, I trust) in the Book of Joshua, and as seems to have happened in central Asia during the Mongol invasions.

If we are looking for hardwired elements of morality, we should consider the behaviour of children, and of animals. We know that babies can imitate expressions within an hour of birth. Humans at age one show concern at another’s distress, and consoling behaviour such as cuddling, and, by age two, have a clearly developed sense of fairness. So do chimpanzees, and even monkeys, along with cooperation, reciprocity, and peace-making, as Frans de Waal shows in a not-to-be-missed 15-minute TED lecture. There is plenty here, without invoking the supernatural, for our evolving moral sense to build on, and the softhearted, it would seem, have the better of the argument.

But what is good?

Evolution, as we have seen, can explain many desirable moral traits, such as sympathy, a sense of fairness, and forgiveness. But it can equally well explain nepotism, xenophobia, and vengefulness. Nepotism is just kin selection in action. Xenophobia is preference for our own group over other groups, and vengefulness will quickly show those around us that we are not to be messed with. What about the double-edged virtues of loyalty, obedience, physical courage, and patriotism? We recoil with horror from the physical courage of the suicide bomber, and can no longer understand the devotion to country that led millions in Britain and continental Europe to volunteer for the trenches in 1914. As for obedience, Darwin gave this a central place in his account of morality, but we, after the horrors of the past century, immediately ask: obedience to what, and loyalty to whom?

And when it comes to religious faith, or even to political ideology, we find what seem to be unbridgeable gaps. We can all understand the value of shared beliefs in establishing a group identity, but we still have the comical spectacle of those who insist on membership while denouncing the group’s religious underpinnings.[1] Most readers of this column will regard beliefs without evidence as unwarranted, and think it virtuous to discard them. Yet there are others who claim that accepting this or that set of beliefs without evidence is the very basis of virtue, enough to justify the difference between salvation and eternal torment. Or, perhaps more acceptably to some readers, there are the self-styled Objectivists, such as Ayn Rand, who regard selfishness as both rational and moral, and sympathy as a form of weakness. J. K. Galbraith ridiculed such positions as “one of man’s oldest exercises in moral philosophy; that is, the search for a superior moral justification for selfishness”, but I do not see how to refute it. I accept Hume’s distinction between factual and moral judgements, between “is” and “ought”, and as a result, much to my discomfort, I find myself with no way of claiming that my personal morality has any more validity than my aesthetic preferences. I do not like this conclusion, and would be grateful to any reader who can rescue me from it.

This piece developed from a panel discussion with Keith Gilmour (RMPE teacher, Glasgow Brights, Centre for Unintelligent Design) and Simon Barrow (theologian, Ekklesia) at the Edinburgh International Science Festival. My first general interest book, From Stars to Stalagmites, will be published by World Scientific in June: http://www.worldscibooks.com/popsci/7953.html



[1] For example, atheists, such as Jerry Coyne or myself, who insist that they are “cultural Jews”.