Friday, September 11, 2009

Once More with Feeling

There is a new hominid find being touted in the archaeology airwaves of late, this time a 1.8 million-year-old fossil from Georgia (theirs, not ours). Quoting Steve Connor in The Independent:

“The skulls, jawbones and fragments of limb bones [of this fossil] suggest that our ancient human ancestors migrated out of Africa far earlier than previously thought and spent a long evolutionary interlude in Eurasia – before moving back into Africa to complete the story of man.”

Oh, poppycock!

Once again, it has to be pointed out that just because a creature was a tool-using, fire-controlling primate doesn’t mean it’s our ancestor. Plain and simple. It certainly doesn’t mean that our ancestors “spent a long evolutionary interlude in Eurasia – before moving back into Africa to complete the story of man.” Even should it eventually be proved that said fossil is in our direct line, it doesn’t mean that whatever creature it was didn’t live in Africa at the same time, as well. Just because we haven’t found a similar fossil in Africa doesn’t mean that the creature didn’t live there. It only means we haven’t found such a fossil there, as yet.

Ergo Ergaster

But while we’re on the subject of Eurasian holidays for lost primates, can we ask a couple more questions?

What happened to all those guys who left Africa to live all over the Old World: the heidelbergensis, Java guy, Peking guy, floresiensis, not to mention Neanderthal? There’s much discussion about the fate of the Neanderthals vis-à-vis modern humans, but virtually nothing about H. erectus and his alter egos: Java, Peking, ergaster, habilis, et al. It’s little wonder the Chinese claim that erectus/ergaster/Peking guy evolved locally into modern humans along with all the other erecti around the world. After all, what did happen to them, if they didn’t evolve into modern humans?

Still and all, while there’s little wonder about the claim, there’s little to substantiate it, as well. Furthermore, it’s hard to see how all the members of a widely dispersed species can evolve concurrently. I’m going with the theory (seemingly supported by the evidence) that modern humans only appeared once and then quickly took over the entire world.

So then, what did happen to the pre- or non-human primates that spread over the Old World? We know that they disappeared, but when? And how and why? Even though we only search for answers to those questions regarding the Neanderthals, it seems as reasonable a question for the other species, as well. Certainly it’s being asked vis-à-vis the little people from Flores.

Current thinking (admittedly, this changes almost daily) is that the Flores Hobbits did not evolve from erectus, but shared a common ancestor with them. Interestingly, the claim is still out there that modern humans evolved from erectus. How that affects our relationship with the Hobbits is beyond me, but it certainly doesn’t address what happened to either erectus or Hobbit.

What I’m trying to understand is how a tool-using, fire-controlling animal, such as erectus, could simply disappear. Are we to believe that erectus died out naturally in most of its territory before modern humans arrived on the scene? It just seems so unlikely. Why do we think that primate line died out? Or was it still in place when modern humans poured out of Africa? Why would it have died out before humans got to it? What would have killed it off? If it didn’t evolve into modern humans—because that could only happen in one isolated place—did it simply die out before the arrival of modern humans? If so, why?

It appears to me that this Georgian find only adds to the number of bipedal primates that spread around the globe. We were only the most recent, but it’s beginning to look like we’re the last.

Thursday, September 3, 2009

Christians Aren't Perfect
Just Forgiven

[Bumper sticker, late 20th century.]

A position oft expressed goes: if there is no god, there is no meaning or direction to life, and consequently one can behave however one wishes. It’s said so often and so matter-of-factly that in most discussions about morality and its origins the position is virtually a given. It is, one can safely say, the official American political opinion. There may be more enlightened countries around the globe, but if you want to get elected in the United States, you’d better adhere to the principle that morality is directed from above.

Which means, of course, that our country is run by arrogant fools and liars, but that’s another story.

The problems, though, go beyond electioneering. The curse of monotheism has been to create a race of zombies willing to do anything the power structure asks of it, including killing people who are in its way, for one reason or another. It’s not just a theoretical discussion we have here. The pervasive claws of monotheism scratch at the tiniest corners of our society. They leave no mouse unscathed.

The great bulk of our prison population, for example, is not a mass of murderers and mayhem, but of people there for cultural differences, not crimes. We criminalize many things in our society which in and of themselves are not crimes; consequently, the overwhelming majority of people in prison in this country are there for drug offenses, and drug choice is strictly policed by religious content. They are not arrested for the effect the drug has upon them or society, but merely for its illegality. One can safely say that all drug offenders in this country are there because of the illegal nature of their product, and not because of anything their product caused people to do. Almost all crime related to drug use is caused by the illegal nature of the drug—and this goes for any illegal drug—and not its pharmacological action. In and of itself, drug use is rarely a criminal issue and almost as rarely a societal one, other than one man’s meat being another’s poison. But as soon as one starts declaring, simply because one can, that another man’s meat is illegal, all hell breaks loose.

You’ll note that prohibition of alcohol—known in the Muslim world as the “Christian diversion”—lasted only a few years, while the prohibition of other drugs continues in this country to this day. At the rate we’re going, we’ll humanize our laws only slightly before Singapore. (Note that Brazil, Argentina, and Mexico have decriminalized personal drug use across the board, following the European examples of Portugal, Holland, Switzerland, and so on.) But one thing you can be sure of about Americans: we might have the last band in the parade, but it sure as hell will be the loudest.

But I digress.

Nonetheless, the claim of religious origins for morality is so ingrained in our culture that even humanists worry sometimes that there might be a “god” gene in there somewhere that necessitates religion; and they often have trouble knowing where everyday, drugstore morality could come from. As if it were a great mystery. How do we know how to be good, unless someone tells us how? Besides our moms and dads, of course.

Well, OK. Bears and wolves and lions and tigers have no religion, right? They can’t possibly “know God” in any meaningful sense. There’s nothing stopping the top-dog lion, as it were, from killing all his rivals, right?

Except that there is. For the most part, lions and tigers and wolves and bears don’t kill their rivals. Certainly nothing at the rate that humans do. When it comes to killing their own kind, we are the masters.

So, what’s stopping them? The religious response would be that God has programmed the animals to behave as they do. It’s all part of the great design. The humanist response is deceptively similar and simple: animal behavior is innate and was worked out through evolution. In either scenario, the animal has no choice. But here is where the problem gets sticky. The assumption is made that humans are fundamentally different from all other animals; and that because we intellectually realize that we can make choices, we assume that our behavior is governed by that ability and not by innate patterning. It’s that belief which dictates the academic stricture to not anthropomorphize, assuming that we are fundamentally isolated from all other species.

Needless to say, the assumptions don’t survive scrutiny.

The fundamental problem with a religious origin for morality is the question of what happened pre-religion. What happened when we were a “mere” animal like all the others? Were we naught but wanton killing machines (not that we aren’t now)? Did we only propagate by rape? How did we manage not to eat all our children, if we didn’t know right from wrong? Or did the desire to eat one’s children only come with the epiphany of right and wrong? I realize that these questions edge upon the absurd, but they point to the complications of equating religion with morality (or is that vice versa?).

On the other hand, you can be pretty sure that tool use predated religion (I don’t think chimps have religion), and it’s hard to know how tool (read “weapon”) use affected the balance of environmental forces. It’s hard to know at this remove how sudden, vast increases in weapon power affected killing rates, but it’s equally hard to imagine that it was negligible. Once it’s easy to kill a buffalo, it’s easy to kill a rival. Perhaps religion evolved as a counterweight to big, sharp rocks, something to curb our wanton tendencies. But my best guess is that religion was invented to fill the gap between practical information and wondering where the hell this all came from. Religion as a byproduct of self-reflection. Its use in society is much more complicated than that, but I believe that is its genesis.

Morality, on the other hand, existed prior to cognition. Morality is inherent in all animals. Probably all plants, too, for that matter, but I can’t vouch for that. But there’s no question each species is governed by its own set of rules as to what it can and can’t do. Mainly eat. And most of the time the menu excludes one’s own species (except for guppies and sometimes other competitors’ children). It only makes sense that each species has evolved with a strict code of cooperation which ensures the maximum survival of the species; to be otherwise would be inherently impossible. Deviation from those rules, one could argue, is impossible; at least until self-reflection surfaces. One can argue that evil only exists because we can think of it. Prior to thought, it was impossible for evil to exist; it is a strictly human construct.

But the rise of self-reflection didn’t eliminate the power of inherent intra-species rules, i.e. morality. Simply because one was suddenly capable of thinking that, “Gee, I could kill my nasty neighbor,” doesn’t mean that they would automatically do so. Surely, even from the very beginning of cognition there were deeply felt urges compelling one to specific behaviors. What one should and shouldn’t do existed long before anyone gave it any thought. All thinking did was give us the power to do what we shouldn’t.

And we’ve been arguing about it ever since. Needless to say, this entire argument is arcane to anyone who believes we were created in situ, as such, by God; and offhand I don’t know of any way of getting through to people like that. If you believe we were all plunked down here, fully formed, 6,000 years ago, there’s nothing I can say that will alter any argument we might have.

Yet even a belief in evolution doesn’t prevent some people from thinking that evolution itself is divinely inspired and that the recognition of morality was programmed to coincide with the recognition of self.

The answer to which is: well, yeah, maybe, but I wouldn’t bank on it quite yet; and at the very least it doesn’t answer but only postpones and confuses the issue. After all, if you don’t have a personal god, you don’t have much of a god at all. If all a god does is set the rules and the ball in motion with a Big Bang, what kind of god is that? If the god doesn’t care about life on earth because it’s such a minuscule part of the universal story, how is that a god? And if the god really does care about what you do here on earth, how believable is that?

Once one has accepted evolution and the tenets of observational science, the question of god’s existence becomes moot. But the question of whence morality comes is not moot and is open to all manner of interpretation. To begin with, it’s essentially tautological to say it’s inherent. The question then becomes: how do inherent moralities play out in the confusion of self-reflection? Certainly, religions step in early on as arbiters of what’s right and wrong, but they forever remain a gloss over our inherent natures. We know what’s right and wrong without anyone telling us. If you don’t believe me, ask any little kid. (After that, ask any teenage girl.)

As we’ve already noted, without self-reflection it’s impossible for a member of a species to act contrary to the species’ rules, as it were. The underlying compulsion, however you want to look at it, is to get along.

And that compulsion does not go away simply because we become self-reflective and capable of acting contrary to our compulsions. Our compulsion is to cooperate and get along, but the confusion of self-reflection, especially when poorly understood, allows us to act contrary to our best interests, sometimes with disastrous results. The compulsion to cooperate and get along drives all of our behavior from fundamentals, such as speech and mannerisms, to cultural overlays, such as style and religion; and it’s not hard to twist a desire to conform into a tool dividing us from them. Once you corner the market on good, you can commit all sorts of evil in its name.

Which brings us to the sanctity of religion.

Let me observe that there is no such thing as a religious war. No god has ever told anyone to go kill anyone else. It has never happened. All decisions to kill people are made by people for people reasons: i.e., control/power. The reasons may be couched in religion, and the combatants might think they’re going out there in defense of religion, but someone always knows it’s a bunch of hooey and that they’re doing it for themselves. The person pulling the strings always knows it’s poppycock.

So why does religion get a free pass? Why does it get sanctified, if all it does is turn people into meat puppets?

Because it does it so well.

It does it so well that lots of people can’t imagine life without it. In fact, they’d rather kill than go without it. In fact, they’re often willing to kill you, even if all you want is to go without it.

Maybe now it’s getting clearer why monotheism was so important. Someone had to take control of the incredible power of religion. To leave morality scattered in the hands of multiple gods was not good for war. Better to have only one. Much easier.

The bottom line, of course, has always been the Golden Rule. The Ten Commandments are rather useless, being primarily concerned with religious power, and totally hit-and-miss with their few practical suggestions. You shouldn’t commit murder, that’s for sure (that’s number 6), and you shouldn’t commit adultery, steal, lie, nor—God forbid—even covet your neighbor’s riding lawn mower (that’s number 10), but apparently, if you’re not married, it’s perfectly all right to rape your neighbor’s wife, so long as you don’t covet her. It’s a fine distinction.

In any event, it’s easy to see where one’s natural urges, when it comes to right and wrong, are more reliable than religious prescription. The Golden Rule is ten times safer than the Ten Commandments. Trust me. Yet it doesn’t even make the list. Why is it not on the list? Because it’s not good for manipulation. It’s hard to convince people that they should make war on another people because they so much want war brought upon themselves. It’s a hard sell. On the other hand, with the current list, all you have to have is someone worshipping another god (or none at all) and your first commandment is to do something about it. You get to think of what to do. What do you think you should do to someone who violates the first rule in the list of “the ten most important rules”? Remember, this is six places above murder. What should you do, if a whole nation thumbs its nose at your god? (Actually, God doesn’t leave the choice of what to do about those nose-thumbers to you. He says kill them. Read Deuteronomy, if you’d like more of the same.)

You can see how important it becomes to have that god and protect its sanctity above all else. The blind compulsion to follow is the most potent organizing tool a society has. It’s inherent.

The real question then becomes, if it’s inherent, how do some people escape it? From whence rationalism and the Enlightenment? Are not rationalism and enlightenment as much a product of self-reflection as evil? Or for that matter, good itself? How does one escape the compulsion to follow the crowd?

Beats me, but it’s the great divide in the human race. Forget about race, religion, sex, country of origin. The great human divide is whether or not you’re able to give yourself over to someone else’s direction. Are you able to let someone else make your ethical decisions for you? If you are, you’re simply following the ancient necessity to fit in with the species; it’s where you’re safe. How are some people willing and able to abandon that security and make those decisions for themselves? For that matter, how does one get to the position of making those decisions for other people? Certainly, the clues lie in self-reflection. Climbing the holy hierarchy, one eventually comes to the realization that moral decisions are made by people, not holy writ. If you’re honest, you’ll eventually get to the point where you realize that the voices in your head are the product of your own imagination, not the outside voice of God.

But it’s not an easy realization to have or live with. Every person who realizes that morality is both an individual responsibility and a species necessity has to make their own moral choices. They cannot rely on exterior authority. Guidance, yes, but authority, no. In the end, all moral decisions are personal. One can only make them for oneself.

Which is why that bumper sticker is so scary. Indeed, Christians can be absolved of their sins. They can have them washed away by the blood of Christ. Which means anything done in the name of God can be forgiven. War, torture, excommunication, burning at the stake; they’re all okay in the eyes of the Lord. And in the eyes of his believers. The people who run religions know that. They know they use their flock as cannon fodder, if not just milk cows. They know that if you believe that the majority of the people believe a particular brand of religion, there’s a good chance you’ll believe it too, unexamined.

The “unexamined” part is important. One constant of all religions is the requirement to believe in the absurd, because once you’ve accepted the impossible, nothing any longer is. Any realistic appraisal of any religion will immediately point up its absurdity, so it’s crucial that believers believe blindly. When the church has been able, its choice has been to kill people who examine their religion. No religion can withstand objective scrutiny, so it’s necessary to require believers to accept the absurd; to believe that God really is directing them when they speak in tongues. In any other instance, hearing voices in your head is a sign of insanity, but not if you claim the voice is that of God. That argument gets a special pass.

So, if you’re a Christian, you’re forgiven of your sins while the rest of us have to behave properly or suffer guilt. You, thank God, can avoid the suffering of guilt by simply believing you are forgiven for your sins. Sort of takes away the incentive not to commit them, doesn’t it? It’s nice to have a free pass.

In the end, it’s religions which encourage people to act barbarically, while non-believers are responsible for their own behavior. One understands that religious people behave morally by accident, not by conscious thought. The job of religion is not to make sure that people act properly, except in the sense of following its own special codes. The job of religion is to make sure people follow its authority. Monotheism in particular has little social value beyond population control. (You might, for example, think of the control aspects of charity versus ensuring a decent standard of living for all. Charity is so much more powerful than developing self-reliance.)

We will quit this diatribe here. It’s a lonely diatribe, anyway.

But let me leave you with the admonition to be responsible. Don’t hand your soul to anyone else. Only you can prevent forest fires.

Thursday, August 20, 2009

Copernicus Redux

Copernicus, we remember, got in trouble for suggesting that Earth might not be the center of the Universe. We snicker now at such provincialisms. Yet at the same time we warn against anthropomorphism, the trait of looking at anything through human eyes. We are continually reminded that other animals don’t act like us, although most often the reminder comes without the qualification “other”; usually it’s just “animals don’t act like us,” as if we were somehow separate and distinct from the rest of the kingdom.

A defining difference between us and “the animals,” has traditionally been tool use. Only people use tools. That has been a given. Ergo, if one finds evidence of ancient tool use, one has found evidence of early humans. Look at the archaeology of Britain, for example. They’re forever talking about early people in Britain up to 500,000 years ago on the strength of finding stone tools. If A, then B. If all stone tools are made by people, then, if one finds stone tools, one has found evidence of people. Can’t be any other way.

Unless, of course, the premise is wrong; and the more we look around, the more it’s becoming evident that tool use is an upper-primate—call us apes if you will—characteristic, not simply a human one.

It’s also axiomatic that, if two tool-using primates are found to coexist, there’s no guarantee that one of them developed out of the other; they could have, and probably did, come from a common ancestor further back. The Neanderthals were bad enough, but now we have the Flores Hobbits. The Australians are reporting that not only was the Hobbit not a human, but that it predated (at least on Flores) the traditional pre-human primate that everyone likes to claim as an early human: Homo ergaster (et al.). He’s the same guy as Peking Man and the folks leaving those early tools in Britain, if I have it right, whom both the Chinese and the English claim as early humans. The Neanderthals we could handle so long as H. ergaster was predecessor to us both; i.e., one tool-using species giving rise to two branches: the Neanderthals and us.

But those pesky Hobbits throw a bone into the machinery. If they were not an evolution of H. ergaster, from whom did they evolve? And if they didn’t evolve from H. ergaster, who’s to say we did? Or the Neanderthals?

But isn’t it interesting that all three species (us, ergaster, and floresiensis) managed to cross the forty or fifty miles of ocean necessary to reach Flores? Did two species arrive there by accident, with only us getting there on purpose because we knew how to navigate? Or did all three species have more in common than fire and tool use? What would it mean that at least three species of greater apes have learned how to sail? Or paddle?

Isn’t it a tad presumptuous to call all tool users “human”? And isn’t it equally presumptuous to think that any characteristic we think of as human is our prerogative exclusively? I’m not saying that the human family isn’t big enough to hold some pretty weird characters, but I don’t necessarily think that any chip off the old stone is a person. Just because we now know chimps use tools doesn’t make them any cuter to me. I’m still not ready to let them into the family. If they can figure out how to be butlers, fine, they can have a job; but don’t expect me to let them date my daughter.

Back to the drawing boards, folks. We’ve got some rethinking to do.

But chimps are not merely highly challenged people.

Wednesday, August 12, 2009

Seafood Mama

This just in from New Scientist: Seafood gave us the edge on the Neanderthals.

Well, duh.

Although the article doesn’t prove or even conclude that. What it establishes is that early (40,000 ya, in this case) humans did eat a lot of marine life, whether they were living on the coast or inland; and they probably ate more than Neanderthals did. Whether or not this “gives us an edge” is, I imagine, open to debate; although something certainly did. If it was seafood, pass the scampi.

Concentrations of iodine in the bones of the early humans examined established its origins in an aquatic diet. Human iodine dependency is a well-known fact, and the only explanation I know of for that dependency is a long-term adaptation of the human body to a seafood diet. Forty thousand years is probably not enough to create a dependency in the entire species, such as we currently experience. (Better eat that iodized salt.)

But while I’ve got you, I’d like to remind you that Neanderthals were not people. Peking Man was not a human. Homo robustus was not human. Homo erectus was, not only not human, but probably wasn’t in our line. In fact, none of them probably was. And even if they were in our line, that doesn’t make them human. Somewhere there was a one-celled animal whose descendants are alive in the form of you and me; that one-celled animal was not human.

As far as we currently understand, humans arose some 200,000 years ago. Before that time, there were no humans. Those other bipedal critters in our direct line? Whoever they were, they weren’t people. And we don’t know for sure that any of those other bipedals we’ve discovered were in our line; perhaps none was. We may know sometime, but to claim descent from any currently known fossil species is jumping the gun.

Thanks, and have a nice day.

Friday, July 24, 2009

Spear Chuckers
Tools and Bipedalism

That humans have an intimate connection with water is self-evident, beginning with where we live and the amount of water we consume on a daily basis. In this we are unique among surviving primates. Unquestionably, other primates make use of and enjoy water holes, but none has attached themselves to the water hole as have humans.

Humans are also the only current obligate bipedal apes on the planet, so it’s been hard not to assume a causal relationship between our water dependence and our bipedal behavior, though the exact mechanism and motivation have been foggy. What has seemed probable from the outset is that the shift from discretionary to obligate bipedality was dictated by increased access to food quantity or quality or both. It makes a certain amount of sense to think that an ape finding food foraging particularly good in the marshes, lagoons, and shallow waters would eventually adopt as permanent behavior the bipedality that serves it so well in that foraging environment.

The main problem with that scenario was imagining that almost all of our ancestral food foraging was done in the water, rather than foraging shared between aquatic and terrestrial resources, which seems much more likely, with, presumably, only occasional time spent in an aquatic environment and the rest spent on land. It’s hard to see quite why such an ape would take up walking on two legs on land simply because it was necessary while in the water. It’s one thing to imagine an ape harvesting aquatic resources, but it’s another to think that any individual, much less a whole species, would choose to essentially live in the water on a daily basis. Nonetheless, given no other options, it seemed that bipedality was likely connected to aquatic foraging.

The Alice-in-Wonderlandish musings about other potential scenarios, such as bipedality reducing one’s solar exposure or increasing one’s ability to see one’s enemies/food, have always been beyond the pale of refutation. They can’t be taken seriously. Furthermore, none of the mainline theories has yet had the courage to tackle the water dependency question. And with good reason. It complicates things no end.

But. I should make that bigger: BUT!

But there have been some recent developments (or, probably more correctly said, “recently released to the public” developments) concerning the study of chimps and bonobos that throw into question what it means to be human; most specifically the discoveries surrounding chimps making and using wooden spears in the hunting of bushbabies, a small primate cousin which they favor for dinner. The chimps sharpen carefully chosen sticks by gnawing on them. Like the Neanderthals, they don’t throw their spears but rather use them to poke around in bushbaby nests until they find one.

Now switch your attention to the chimp at a Swedish zoo who developed a rapid-fire throwing technique for pelting gawkers at his cage. When the zookeepers removed his rocks, he began to tear off chunks of plaster from the walls of his cage and use them. Furthermore, this wasn’t a spontaneous display by the chimp; instead he would spend the morning readying his ammunition stash in anticipation of opening. His was premeditated warfare.

It’s my understanding that bonobos share hunting predilections with the chimps, if not to the same degree of intensity. Nonetheless, it’s now evident that tool using, and even tool making, is not uncommon among apes, and appears to be the norm rather than the exception. Needless to say, we’ve never found evidence in the wild of discarded chimp weapons; we wouldn’t recognize them if we saw them. A stone looks awfully much like a stone, unless it’s in someone’s, or something’s, hand. It simply disappears in the archaeological record. As does a wooden spear. We can probably assume, for example, that proto-people used stones and wooden spears as weapons for millions of years before they hit on sharpening the stones. Which, consequently, makes a pack of australopithecines armed with such spears and stones a much more formidable foe than I’d previously considered. A big enough pack might make a pride of lions think twice about the food value of those skinny little twerps.

Which makes me wonder if being able to haul around spears and projectiles might be enough of a benefit for food gathering and safety that those apes who can do it all the time out-feed and out-breed those who can’t. Are we bipedal because we learned to use and carry tools? Prior to these recent discoveries, it had been assumed that bipedality arose prior to tool use, but that’s clearly not the case with our cousins, and there’s no reason to assume that it happened thus with us. Certainly, there had to be eons of using stones as tools before someone hit upon shaping the tool by flaking, which is where the archaeological record begins. When we see a crude, millions-of-years-old hand axe, what we’re seeing is an enormous technological advance on an age-old artifact: the stone. We are not seeing the beginning of a technology; we are seeing an increased sophistication. One could easily imagine a pre-shaped-stone age that lasted considerably longer than the “neo-stone age.”

Which essentially turns the standard viewpoint on its head. In the history of archaeology it has always been assumed that bipedality predated tool use. It was felt a serendipitous development that bipedality freed our hands for tool use; whereas the truth appears to be that our tool use accelerated our bipedality. In retrospect, it’s fairly astounding that no one, myself included, came to the logical conclusion that when we were witnessing the first crudely flaked hand tools, we were, inevitably, not seeing a brand-new technology, but rather a refinement of an extant, time-tested product. Surely people, or rather would-be people, used stone tools for millions of years before they thought of shaping them. It was one thing to sharpen wooden spears; it’s a whole other matter to sharpen stones, and a lot less self-evidently possible. It takes an intimate knowledge of stone, acquired, undoubtedly, through millennia of experience, to see the possibilities of shaping certain kinds of them. That I’ve never seen this self-evident part of the process of stone technology mentioned in any discussion of the beginnings of tool use implies that it hadn’t occurred to anyone until now. (Which, I guess, makes me feel a little better; at least I wasn’t alone.) Now, of course, with the realization that many apes are tool users, it’s observable that tool use precedes the manufacture of permanent versions. It is, unquestionably, one of the great insights of modern evolutionary archaeology, ranking right up there with Alister Hardy’s cogent observation about human body fat.

If bipedalism is a byproduct of tool use, then it frees the development of bipedalism from environment. In other words, if bipedalism arose in response to tool use, it could have happened independent of the ape’s physical surroundings. It could and probably did arise in many different environmental niches, albeit probably all within the standard primate range. It means that it didn’t develop in response to aquatic foraging, nor to the disappearance of the forest, the two current standard models. Tool use, instead of defining humans, appears to be a common development among the apes; and if tool use is common, then the tendency towards bipedalism is probably common, as well; and given enough time and enough apes…

What this theory of bipedalism doesn’t do, though, is solve the problems of our aquatic connections. Even if tool use created our bipedality, it wouldn’t have changed our basic environmental niche, which is where our physiology was created. That humans are apes who inhabit the waterline is not a theory but an observation. Our water dependence may be unrelated to our bipedality, but it’s not unrelated to where we developed. And it’s still reasonable to think that the smartest ape would choose the best foraging/hunting ground for itself, which is always going to be at the water’s edge. There’s still no argument for our particular human development other than down at the waterside. We’ve undoubtedly been hunting whatever the neighborhood looked like for a long time, be it jungle, be it savanna, but we’ve always lived down by the water hole. There is simply no other call for how we came to be. Or if there is, it’s yet to show its head.

Gilligan’s Ideas

But I’m not done yet: a few remarks on Ian Gilligan’s theories.

Gilligan’s theories were recently expounded in Science Alert, an Australian-New Zealand online science news source. The main thrust of Gilligan’s work, and you should read the piece, is that the development of clothing radically affected human development, which is almost tautological, but deserves close attention. He is absolutely right in noting that, without clothing, we’d still be stuck in the jungle, more or less; something many people discount or undervalue.

But the middle of the article finds some words about the naked ape that are most interesting:

As Gilligan points out, Homo sapiens are thermally very vulnerable, having at some point lost the thick fur covering of other mammals. The idea that this might have occurred in response to heat doesn’t really hold up, as fur can also insulate animals in warmer environments. Gilligan’s guess is that human hair loss came about as a side-effect of a slowing of the expression of the genetic code in our species, meaning that we’re essentially juvenile mammals in physiological terms, if not in mental capacity.

“Slowing of the expression of the genetic code.” Now, what does that mean? How does that manifest itself? How is the code normally expressed, and how is its expression slowed down? This is important, because it’s essential for understanding the next thought: “meaning that we’re essentially juvenile mammals in physiological terms, if not in mental capacity.”

I don’t know how you read that, but I read it to say that early human development is retarded in some manner and that said retardation somehow affects hair growth. I’m not sure how “juvenile mammals” fit into the human equation, because it seems to me that juveniles of furry animals are every bit as furry as the adults; so I don’t know how this “slowing of the expression of the genetic code” is supposed to function, but at first glance it’s unexplained. Presumably there’s more to it.

But offhand, I’m not buying it. At least not until further clarification arrives; the obfuscating terminology isn’t helping.

Other than that, he’s dead on about the clothes.

Except maybe the dates.

Wednesday, June 24, 2009

Chimps at the Water Hole

Last night’s (June 23, 2009) NOVA, “Ape Genius,” opened with a group of chimps having a “pool party” (their words) in the wilds of Africa. There they were, dropping from branches into the water, splashing everywhere, behaving like, well, kids at the beach having a grand old time.

You won’t, of course, find much about that in print, because it hasn’t reached the general knowledge base yet, but it’s there in living color. Chimps like to play in the water. At least some of them do.

NOVA went on to show chimps making and sharpening spears to hunt bush babies, a small arboreal primate, as well as understanding and following fairly complicated English instructions. The gist of the show was that we share more traits than we’d like to admit with the other apes.

It began by explaining that the human bias towards thinking that many characteristics, such as tool making, are uniquely human comes from a feeling that people have been blessed by the hand of God, so to speak, making them a separate beast from the other apes; which, the narrator promulgated, was not the truth. Easy for him to say, yet the show continued to speak of the differences between humans and apes rather than between humans and the other apes: a small but significant distinction. For even the most humble of observers it’s difficult to say “other apes,” because couched within that phrase is the admission that we, too, are one of them. (Something, frankly, which bothers me when I see us all driving cars down the road: should apes be doing this, I wonder?)

The show questioned why people ended up the way we did versus how the other apes ended up, but it didn’t look at the broader implications of these discoveries towards evolution in general; it only looked at the relationships between humans and chimps and bonobos. Certainly, chimps and bonobos inhabit different strata of the forest and eat probably slightly different, if similar, diets; and it’s well known that the two species have quite different social structures and behaviors. “Ape Genius” tried to make a case for humans developing differently than other apes because of certain social traits which we have, such as being able to squelch our emotions, rather than seeing those social traits as part of a larger complex of behaviors. Significantly, they didn’t discuss the implications of chimp behavior at the water hole, which is frivolous and interactive, nor did they make any connection between bipedalism and human behavior. It could well be that the forces which inclined us to be bipedal were the same forces that inclined us to be socially cooperative. And it’s plausible that both those traits were picked up at the water hole. (If I were you, I’d keep watching those chimps at the water hole. If they start liking frogs and tubers, who knows how far it could go. Couple million years, they might stand up and sing.)

NOVA also failed to stress the ubiquity of tool-making beyond humans, chimps, and bonobos; but if we have three extant primates making tools, the implications for primates of the past are enlightening. For one thing, it means that tool-making is not a uniquely human characteristic; so any tool-making fossil from the past is not necessarily in our line any more than is your local bonobo.

But watching those spear-chucking chimps made me rethink early hunting strategies and what sort of weapons would be effective in the open country; and I’m starting to think that a pack of australopithecines armed with sharpened spears might be a formidable foe. They might not be fast of foot out there in the open, but they were clever and, probably, cooperative. This hunting-with-the-dogs thing is starting to make a lot of sense. One thing us “higher” apes are good at is learning from observation. I can see our ancestors watching how pack carnivores work and imitating their cooperative methods. I can see how, when they found themselves hunting the same prey, the primates would start mimicking and running along with the dogs, as it were. I can picture the dogs looking to each other and asking, “Who let the people out?” And I can see, after a successful hunt, right from the get-go, the tiny little A-piths tossing the dogs their share: A) it avoids a nasty fight; and B) it ensures cooperation next time they find themselves hunting together.

We might have been tough guys out there on the veld. Provided, of course, we had enough to drink.

Saturday, June 13, 2009

Mr. Wrangham’s Excellent Conjecture

Richard Wrangham is about to publish a book, Catching Fire: How Cooking Made Us Human, where (according to New York Times author Claudia Dreifus) he suggests that cooking was an essential ingredient in our rise above our fellow primates, largely by reducing the amount of time spent chewing our food. He compared our primal diet to that of chimps, whose food, he says in an interview with Ms. Dreifus, they’d have to “masticate for a full hour.” By cooking, one could cram the same food down one’s gullet in a few minutes, leaving one, presumably, more time to work on one’s sonnets and Pythagorean theorems.

Done.

I haven’t read Wrangham’s manuscript, but I have read the NY Times interview plus another article on his proposal and am prepared to make a few observations.

The first of which is that it’s an excellent observation and undoubtedly one with a lot of merit. Unquestionably, cooking took hold for some reason. Whether or not cooked food is “more nutritious” and “healthier” than uncooked food is certainly open to debate and shouldn’t be accepted prima facie (although cooking definitely makes palatable some foods which could otherwise be deadly), but that cooking makes food quicker to eat is undoubtedly true, although that may not have played as great a role in its adoption as flavor. If Wrangham displays any major fault with his theory, it’s that he’s too much in love with it, a common problem with theories; I suffer from it myself.

Another fault is comparing us to chimps too keenly. Unquestionably, we’re closely related to them and their cousins the bonobos, but we’re likewise a lot different. No one would confuse the two of us walking down the street. And while we may share a distant ancestor, we long ago took different paths; occupied different parts of the forest; and, evidently, ate a different diet; something Wrangham has yet to fully grasp, even after trying to mimic the chimp diet when he was living in Tanzania in 1972. It’s true, we’re both omnivores, but that doesn’t mean we eat the same things; and trying to figure out what a chimp eats won’t necessarily lead you to what proto-people ate two or three million years ago. Especially if you don’t know where those proto-people lived, an issue about which Wrangham is confused. (We won’t get into Wrangham’s being primarily a vegetarian, which casts serious doubt on his understanding of both nutrition and evolution. I should mention here that I had a wife once who tried to emulate the diet of her goats, under reasoning not too different from Wrangham’s, but was dissuaded after the first mouthful of barbed grass heads stuck in her throat for hours.)

Wrangham bases much of his argument on assumptions about human fire use from 1.8 million years ago, a somewhat earlier date for fire use than is generally agreed upon, but not so early as to be improbable. What becomes questionable is what uses fire was put to at that early time and how often it was available. The only sources I’ve found estimating when people were first capable of “making” fire peg that date between 9,000 and 15,000 years ago, which seems impossibly recent to me. It’s hard for me to believe lightning could be a reliable source of fire, particularly for people traversing the northern reaches at the edge of glaciers, but it’s certainly likely that people were able to control fire long before they could create it. Still, it’s hard to imagine that accidental fire would ever be common or ubiquitous. One can only imagine that for eons people clustered around fires when they had one, but that for most people most of the time there was no artificial warmth; and when someone first decided to try cooking something other than a hunk of meat cannot be determined. In any event, people were upright creatures millions of years before they captured fire, and they were, apparently, already on a superior road to the chimps by the time they stood up for good, long before they started cooking anything. Cooking was a great technological leap and a great time-saver, but it probably didn’t affect our basic nutritional intake, at least at first. Certainly, as some products lend themselves more conveniently to cooking, they tend to become emphasized, but I can only presume that cooking didn’t initially affect food choice. The bottom line, though, is that we have and had our own dietary preferences, which are distinguished from what other apes eat and most likely always have been. It’s likely that those early dietary differentials are part of what propelled us on the path we found ourselves; not to mention that those dietary preferences probably caused us to become upright, as well.

Wrangham drops other bon mots into his conversation which are clearly either inaccurate or unknown. When he states that “the australopithecines, the predecessors of our prehuman ancestors, lived in savannahs with dry uplands,” he’s making both errors and assumptions. When the australopithecines first descended from the branches, there were no savannas where they were. True, savannas showed up by the time fire was captured, but they certainly weren’t where any prehumans lived. The australopithecines weren’t able to change their habitat just because their world was drying out around them. Fortunately for them, they lived in a micro-environment along river banks and around marshes and swamps that may have been reduced in size, but never disappeared. When the savannas appeared they could forage, scavenge, and hunt them without actually living in them; provided, of course, that they’d developed weapons and a way to carry water.

Wrangham, likewise, makes several assumptions about how having fires changed our socialization, such as fires providing a stabilizing hearth around which people would then cluster. “This was clearly a very different system from wandering around chimpanzee-style, sleeping wherever you wanted, always able to leave a group if there was any kind of social conflict,” he claims, ignoring that most animals, I’ll bet chimps included, have regular bedding spots (not to mention birthing spots) that aren’t chosen so devil-may-carefully as Wrangham would suggest. Offhand, despite the myth of wandering animals and people, everything has a home territory; every bird has its branch. Even albatrosses.

And while it may be a minor point, there’s no guarantee that australopithecines were our ancestors, any more than the species h. habilis was, which he also claims as a “distant ancestor.” If an ancestor is a person in one’s direct line, then there is no assurance that either australopithecines or habilis were our ancestors. We may all have shared another, as yet undiscovered, ancestor. Someday we might have tests that can determine the relationship of those old fossils to ourselves, but in the meantime we’re going to have to go on the assumption that those old species were relatives, but not necessarily ancestors. We can safely assume that the majority of fossil primate species, bipedal ones included, have yet to be discovered and may never be. Just because a species that shares many of our traits was common at an earlier time doesn’t mean that it is in our direct line.

One also has to wonder about Wrangham’s general life experiences when he makes a statement like, “They [h. habilis] certainly made hammers from stones, which they may have used to tenderize [meat]. We know that sparks fly when you hammer stone. It’s reasonable to imagine that our ancestors ate food warmed by the fires they ignited when they prepared their meat.”

No it’s not. It’s not reasonable at all. It’s a prime example of being in love with one’s own theory; once you’re in love, anything is possible. Even if it’s not. Nobody whacking away at a wet piece of meat with a rock is going to send off sparks that are going to catch that meat on fire. Or anything else lying nearby, either. Ain’t gonna happen. If it happened once in the entire history of humanity, I’d be amazed, but to depend on it as a way to get one’s cooking fire going? The danger of stretching one’s argument like that is that it casts doubt on the rest of one’s propositions.

Such as trying to emulate chimps’ eating patterns. He wanted to eat just like a chimp but “in the end… never did the full experiment.” He did allow, though, that “there were times when I went off without eating in the mornings and tried living off whatever I found. It left me extremely hungry.” He relays this as if it were a valid contribution to the discussion; as if it were a valid experiment from which he got a significant result: he was hungry. Aside from the aforementioned problem that we don’t share a diet with chimps, here was an untrained, naive city boy trying to live off what he could find to eat, despite not knowing beans about what’s edible or not in the landscape. And then he has the balls to imply that his hunger was akin to what all “prehumans” would have experienced.

Not likely. And remind me not to go hunting with this guy.

A couple other suppositions surrounding fire test the credulity of the most humble among us:

One was that early cooks would place carcasses in front of advancing wildfires in order to have them cooked as the fire passed over; which surely explains the high hazardous-duty pay that habilis cooks earned. Not to mention an attrition rate higher than kamikaze pilots’. This supposition stands even though burned bones can be analyzed to see whether it was a cooking fire or a wildfire that charred them, as cooking fires reach much higher temperatures than wildfires. In other words, not only would such a method of cooking be absurdly dangerous, it wouldn’t provide a superior, or necessarily even adequate, result. Trust me, our ancestors survived to become us because they were clever, not foolhardy.

Another amazing suggestion is that people lost their fur/hair in front of the fire, the hair being no longer necessary to keep one warm. It immediately makes me picture the cowboys coming in from the range after riding herd, gathering round the campfire, taking off their clothes… Sure, we’ve all seen Brokeback Mountain. We know all about campfires and long, lonely nights. Heck, every campfire I’ve ever been at, everyone has taken off their clothes. Your fires, too, I imagine. All that notwithstanding, wouldn’t we expect people who’d lost their hair from sitting in front of a fire to have gotten bald chests and hairy backs? Why hair on the top of one’s head, for God’s sake? How about to keep from getting sunburned?

That’s what’s meant by the dangers of falling in love with one’s own theory. The seduction of explaining everything is too great.

It’s too bad, because the origins of cooking are obscure, yet obviously crucial to human development. The relationship between cooking and farming has yet to be explored in any detail. Did we cook anything other than meat prior to the adoption of farming? Was cooking a catalyst for farming? It’s an intriguing question which Wrangham is right to approach. Hopefully, the next person to look at the subject will be more grounded.