Saturday, November 14, 2009

Pop Quiz

What percentage of extinct species, of any genus, has been discovered in fossilized form?

That is, of course, unknowable. So, make a guess. Two percent? Five? Ten? Twenty?

Twenty seems a bit high. Even 10% seems a bit high. Wanna go with 10 for the sake of argument? How about we give you the benefit of the doubt and say 20.

Take that figure and move back a little. Move back 50,000 years. That sounds like a long time, but geologically it’s an eye-blink. Even “evolutionarily” it’s a small step. Fifty thousand years simply don’t go very far. But from 50,000 years ago we have found fossil remains of three other tool-using, fire-managing, bipedal primates besides ourselves who were alive at the same time. Including us, that makes four. Assuming we’ve found only 20% of such critters from that time, that means there were probably another 16 or so species of tool-using primates out there fifty millennia ago.
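(For anyone who wants the back-of-envelope arithmetic spelled out, here’s a minimal sketch in Python. The four known species and the 20% recovery rate are the assumptions from the paragraph above, not established figures.)

```python
# Back-of-envelope estimate: if only a fraction of extinct species
# ever turn up as fossils, the known count implies a larger true total.
known_species = 4      # us, Neanderthals, Hobbits, erectus (per the post)
recovery_rate = 0.20   # assumed: we've found 20% of such species as fossils

estimated_total = known_species / recovery_rate  # 4 / 0.2 = 20
undiscovered = estimated_total - known_species   # 20 - 4 = 16

print(f"estimated total tool-users: {estimated_total:.0f}")
print(f"likely still undiscovered: {undiscovered:.0f}")
```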

Which also might mean that there would be 16 other tool-users out there foraging around right now, if it weren’t for us. But it also brings up the question of how long those 16 species survived. What happened to them? Why should an advanced primate go extinct? And do we remember them?

Think, for a minute, about the reports swirling around the Floresian Hobbits. It’s reported that islanders from Flores retain stories in their culture of when the “little people” lived in the forest. Those stories are being held up by some as proof of how recently H. floresiensis survived.

Oh? My culture has similar stories of little guys living in the woods. We would leave food out for them, and it would be gone in the morning. We called them “nisser” (pl.). But we went further; we also had stories of giants, who we called “trolls.” They’d live under the bridge and swipe your goats. Brave men had to go out and slay those monsters.

Are we drawing any conclusions, yet?

Let’s think once more about those Hobbits, the ones from Flores, not Tolkien. Let’s assume for a minute that the scientists are right; that this is not a dwarfed version of either us or erectus; that this is an independent species that arrived there independently. Now, surely they didn’t leave Africa and travel all the way to Flores without leaving anyone behind. In fact, there’s good reason to believe that, if they were on Flores, they were everywhere, or at the least widespread. Like the Neanderthal or erectus were widespread. Not to mention the other, as yet undiscovered “hominins” or “-inids” or whatever you’d like to call them. You can be sure they will be discovered. Not all of them, but more than we have now. The stock of upright tool-users has yet to be exhausted. (And talk to me once again about leprechauns.)

II.

Holy writ has always had it that humans are unique in the animal kingdom. In fact, in Judeo-Christian religions—the only ones I’m very familiar with—humans are no longer quite animals, but something above and beyond animal kind. We, instead, are the likeness of God on earth. We were given dominion over everything else.

Which, truth to tell, is hard to argue with. Dominion we have. We now have dominion and no leprechauns. It was a trade off.

(Was that yeti ‘nother leprechaun?)

The idea that we are unique is not only holy writ, it has been accepted academic dogma. It’s been argued that the probability of intelligent life elsewhere in the universe is small because the odds of it happening once, even here, are so small that it, conceivably, could be a rare phenomenon in a universe that otherwise might be teeming with life. It has been thought that the forces that tipped us into consciousness are so rare that it is unlikely to happen often again, if at all.

This, it turns out, is homo-centric thinking akin to thinking the earth is the center of the universe. It turns out that, instead of being unique, we are, or were, but one of a flock of tool-using bipeds. We may have been either the most intelligent or the most ruthless of the bipeds, but for whatever reason, we were the only one to survive. But for a while we were commonplace. Yet academia has still been reluctant to understand the implications of our loss of uniqueness. So long as we are a unique species, the only one of its kind, the only one to walk upright, use tools, talk, have social rites, cook food, play the piano, etc., any other extinct creature that exhibited similar traits must be in our bloodline. This has been dogma to the extent that all tool-using bipeds are referred to as “humans” and are called our “ancestors.” Even though we know that, when modern humans first appeared on the stage, there were many groups of bipeds from which we could have evolved. We know we didn’t evolve from the Neanderthals. We know we didn’t evolve from the Hobbits. We tend to think we evolved from erectus, mainly because they were common; but just because they were common doesn’t mean they were ancestral to us; we may have shared a common ancestor. What they definitely were is predecessors to us. Science would be safer using the term “predecessor” rather than “ancestor,” but that would weaken our claim to uniqueness.

The question is, can you call all bipeds “human,” even if they aren’t directly in our line? And how far back can you call direct members of our line “human”? We go all the way back to a one-celled amoeba, remember; we can’t call everything in that line “human.” The current assumption seems to be that all bipedal apes are, or were, descended from one species of bipedal ape, although I don’t know that there’s anything in the fossil record that says bipedalism only occurred once among the apes. Again, I think it’s our desire for uniqueness that makes us interpret the data in favor of that argument, to the very point of using language to tie all the species together. If we call all bipeds “human,” then they’re all our relatives; they’re all our ancestors. Even though they aren’t.

Which makes me feel it would clarify science a lot if it would restrict the use of the term “human” to mean modern humans only, making humans a species unto themselves. It answers the question, “If the Neanderthals were a species, what species are we?” to which the current answer is “modern humans.” They get an identity, we get a blah.

If you can’t make babies with them—and there’s no evidence we could make babies with Neanderthals or Hobbits, much less erectus—they should have, I argue, a different name. I don’t care if that hairy little, barrel-chested critter over there buries his dead and makes costume jewelry, he’s not going out with my daughter.

It might then free us from the compulsion to call Ardi and Lucy and all the other fossils who came before us our “ancestors”; because, when we call them our “ancestors,” it’s hard not to place them in our direct line; that’s what the language implies. Unfortunately for science, that case has yet to be made. There’s no reason at all to think that either Ardi or Lucy is ancestral to us; the most we know is that they preceded us. (Shared morphology is not enough; we shared immense morphology with the Neanderthals, yet they were not ancestral to us.)

III.

The lesson, I hope, is humility. The earth is not the center of the universe. If we were made in the image of God, what about those other guys? Prototypes? Near-misses? Seconds?

To be human is, indeed, unique. Just as being a lion is unique. But just as a lion is not unusual (it is, after all, a cat), people aren’t unusual, either; they’re simply part of a larger group of clever bipeds, which for want of any other term we call by their genus name “Homo”; creating the linguistic situation where all people are homos, but not all homos are people (a squares and rectangles kind of thing). Now, while this uncomfortable linguistic truth may cause certain difficulties in the larger world, it is, nonetheless, I believe, correct.

And in its way it’s comforting to know that we are not unusual. Unique maybe, but if it wasn’t going to be us, it was going to be someone else. It was destiny, not accident. Clever bipeds were once all the rage. Too bad we’re the only one left.

Still and all, it doesn’t hurt to leave a little milk out at night.

Thursday, October 1, 2009

And furthermore

The National Geographic article goes on to suggest that bipedality arose from the need for males to carry food to the females, for whatever reason; an old theory that’s been batted around for a while.

Pardon me, but that’s impossible. It violates one of the laws of evolution, which is that morphological changes are only made for food. Cosmetics for sex, morphology for food. Got it? Amoebas, birds, people, bacteria, grizzly bears, beans, and sequoias, we’re all the same: food for shape; sex for color. I’m not going to change my shape just to get you food.

It also edges on another evolutionary law: evolution only goes forward; it only moves in a positive direction. I.e. evolution is always towards something, never away from anything. The savannah theory directly violates that rule. One changes evolutionary direction because a new food source is appearing, not because an old one is disappearing.

(I feel the necessity to repeat these rules here because I don’t recall seeing them elsewhere, basic as they may be.)

Another evolutionary law often mentioned here: No species can evolve quickly enough to avoid environmental collapse. This is a corollary of the above law. Another reason why the savannah theory couldn’t be.

Up a lazy river

Oh, sorry, sorry, sorry for beating this dead horse.

Today’s (10/1/09) BBC News reports, in a piece titled “Fossil finds extend human story” concerning the recently released analysis of a 17-year-old find named Ardipithecus ramidus, that, “Even if it is not on the direct line to us, it offers new insights into how we evolved from the common ancestor we share with chimps.”

It goes on—blah, blah—about A. ramidus’s tree-climbing, walking, and running capabilities; age, 4.4 million years; location, Ethiopia; and such; and then concludes with the “Duh!” moment:

“What is surprising about the discovery is that the remains were found in a forested area. It had been thought that early human evolution was prompted by the disappearance of trees - encouraging our ancestors to walk on the ground.

“’These creatures were living and dying in a woodland habitat, not an open savannah,’ said Professor White.”

(National Geographic went even further in its news release: “If White and his team are right that Ardi [as the fossil’s known] walked upright as well as climbed trees, the environmental evidence would seem to strike the death knell for the ‘savanna hypothesis’—a long-standing notion that our ancestors first stood up in response to their move onto an open grassland environment.”)

“It’s been thought” by whom?

No Ape Shit reader would ever be surprised by a discovery such as that. We’ve been saying for decades that we grew up on the river banks and swamp lands. Being bipedal has nothing at all to do with the disappearance of any forests. Never has, never will. It’s been a blind alley since it was first conceived. It was a dumb theory to begin with that has diverted understanding of human evolution for nearly a hundred years. It’ll take the field another hundred years to fully throw away that silly belief; and, trust me, no one will ever give a nod towards A. Hardy. By God, they’re going to have to back their way into reality rather than credit anyone from the outside with having any insight into the issue.

Regardless, couple this report with the most recent analysis that the Floresian Hobbits, despite fire and tool use, weren’t even human, and I find comfort in that. It means that becoming human was not a unique and improbable outcome of a single evolutionary path, but rather the natural extension of primate evolution (it’s also recently been discovered that New World monkeys have had increased brain size since their split with the Old World monkeys). Being not unique on this planet increases the likelihood of finding us elsewhere.

You can never fly too high.

Friday, September 11, 2009

Once More with Feeling

There is a new hominid find being touted in the archaeology airwaves of late, this time a 1.8-million-year-old fossil from Georgia (theirs, not ours). Quoting Steve Connor in The Independent:

“The skulls, jawbones and fragments of limb bones [of this fossil] suggest that our ancient human ancestors migrated out of Africa far earlier than previously thought and spent a long evolutionary interlude in Eurasia – before moving back into Africa to complete the story of man.”

Oh, poppycock!

Once again, it has to be pointed out that just because a creature was a tool-using, fire-controlling primate doesn’t mean it’s our ancestor. Plain and simple. It certainly doesn’t mean that our ancestors “spent a long evolutionary interlude in Eurasia – before moving back into Africa to complete the story of man.” Even should it eventually be proved that said fossil is in our direct line, it doesn’t mean that whatever creature it was didn’t live in Africa at the same time, as well. Just because we haven’t found a similar fossil in Africa doesn’t mean that the creature didn’t live there. It only means we haven’t found such a fossil there, as yet.

Ergo Ergaster

But while we’re on the subject of Eurasian holidays for lost primates, can we ask a couple more questions?

What happened to all those guys who left Africa to live all over the Old World: the heidelbergensis, Java guy, Peking guy, floresiensis, not to mention Neanderthal? There’s much discussion about the fate of the Neanderthals vis-à-vis modern humans, but virtually nothing about H. erectus and his alter egos: Java, Peking, ergaster, habilis, et al. It’s little wonder the Chinese claim that erectus/ergaster/Peking guy evolved locally into modern humans along with all the other erecti around the world. After all, what did happen to them, if they didn’t evolve into modern humans?

Still and all, while there’s little wonder about the claim, there’s little to substantiate it, as well. Furthermore, it’s hard to see how all the members of a widely dispersed species can evolve concurrently. I’m going with the theory (seemingly supported by the evidence) that modern humans only appeared once and then quickly took over the entire world.

So then, what did happen to the pre- or non-human primates that spread over the Old World? We know that they disappeared, but when? And how and why? Even though we only search for answers to those questions regarding the Neanderthals, it seems as reasonable a question for the other species, as well. Certainly it’s being asked vis-à-vis the little people from Flores.

Current thinking (admittedly, this changes almost daily) is that the Flores Hobbits were not evolved from erectus, but shared a common ancestor with them. Interestingly, the claim is still out there that modern humans evolved from erectus. How that affects our relationship with the Hobbits is beyond me, but it certainly doesn’t address what happened to either erectus or Hobbit.

What I’m trying to understand is how a tool-using, fire-controlling animal, such as erectus, could simply disappear. Are we to believe that erectus died out naturally in most of its territory before modern humans arrived on the scene? It just seems so unlikely. Why do we think that primate line died out? Or was it still in place when modern humans poured out of Africa? Why would it have died out before humans got to it? What would have killed it off? If it didn’t evolve into modern humans—because that could only happen in one isolated place—did it simply die out before the arrival of modern humans? If so, why?

It appears to me that this Georgian find only adds to the number of bipedal primates that spread around the globe. We were only the most recent, but it’s beginning to look like we’re the last.

Thursday, September 3, 2009

Christians Aren't Perfect
Just Forgiven

[Bumper sticker, late 20th century.]

A position oft expressed goes: if there is no god, there is no meaning or direction to life, and consequently one can behave however one wishes. It’s said so often and so matter-of-factly that, in most discussions about morality and its origins, that position is virtually a given. It is, one can safely say, the official American political opinion. There may be more enlightened countries around the globe, but if you want to get elected in the United States, you’d better adhere to the principle that morality is directed from above.

Which means, of course, that our country is run by arrogant fools and liars, but that’s another story.

The problems, though, go beyond electioneering. The curse of monotheism has been to create a race of zombies willing to do anything the power structure asks of it, including killing people who are in its way, for one reason or another. It’s not just a theoretical discussion we have here. The pervasive claws of monotheism scratch at the tiniest corners of our society. They leave no mouse unscathed. The great bulk of our prison population, for example, is not a mass of murderers and mayhem, but of people there for cultural differences, not crimes. We criminalize many things in our society which in and of themselves are not crimes; consequently, the overwhelming majority of people in prison in this country are there for drug offenses, and drug choice is strictly controlled by religious content. They are not arrested for the effect the drug has upon them or society, but merely for its illegality. One can safely say that all drug offenders in this country are there because of the illegal nature of their product, and not because of anything their product caused people to do. Almost all crime related to drug use is caused by the illegal nature of the drug—and this goes for any illegal drug—and not its pharmacological action. In and of itself, drug use is rarely a criminal issue and almost as rarely a societal one, other than one man’s meat being another’s poison. But as soon as one starts declaring, simply because they can, that another man’s meat is illegal, all hell breaks loose.

You’ll note that prohibition of alcohol—known in the Muslim world as the “Christian diversion”—lasted only a few years, while the prohibition of other drugs continues in this country to this day. At the rate we’re going, we’ll humanize our laws only slightly before Singapore. (You’ll note that Brazil, Argentina, and Mexico have decriminalized personal drug use across the board, following the European examples of Portugal, Holland, Switzerland, and on.) But one thing you can be sure of about Americans, we might have the last band in the parade, but it sure as hell will be the loudest.

But I digress.

Nonetheless, the claim of religious origins for morality is so ingrained in our culture that even humanists worry sometimes that there might be a “god” gene in there somewhere that necessitates religion; and they often have trouble knowing where everyday, drugstore morality could come from. As if it were a great mystery. How do we know how to be good, unless someone tells us how? Besides our moms and dads, of course.

Well, OK. Bears and wolves and lions and tigers have no religion, right? They can’t possibly “know God” in any meaningful sense. There’s nothing stopping the top-dog lion, as it were, from killing all his rivals, right?

Except that there is. For the most part, lions and tigers and wolves and bears don’t kill their rivals. Certainly nothing at the rate that humans do. When it comes to killing their own kind, we are the masters.

So, what’s stopping them? The religious response would be that God has programmed the animals to behave as they do. It’s all part of the great design. The humanist response is deceptively similar and simple: animal behavior is innate and was worked out through evolution. In either scenario, the animal has no choice. But here is where the problem gets sticky. The assumption is made that humans are fundamentally different from all other animals; and that because we intellectually realize that we can make choices, we assume that our behavior is governed by that ability and not by innate patterning. It’s that belief which dictates the academic stricture to not anthropomorphize, assuming that we are fundamentally isolated from all other species.

Needless to say, the assumptions don’t survive scrutiny.

The fundamental problem with a religious origin for morality is the question of what happened pre-religion. What happened when we were a “mere” animal like all the others? Were we naught but wanton killing machines (not that we aren’t now)? Did we only propagate by rape? How did we manage to not eat all our children, if we didn’t know right from wrong? Or did the desire to eat one’s children only come with the epiphany of right and wrong? I realize that these questions edge upon the absurd, but they point to the complications of equating religion with morality (or is that vice-versa?).

On the other hand, you can be pretty sure that tool use predated religion (I don’t think chimps have religion), and it’s hard to know how tool (read “weapon”) use affected the balance of environmental forces. It’s hard to know at this remove how sudden, vast increases in weapon power affected killing rates, but it’s equally hard to imagine that it was negligible. Once it’s easy to kill a buffalo, it’s easy to kill a rival. Perhaps religion evolved as a counterweight to big, sharp rocks, something to curb our wanton tendencies. But my best guess is that religion was invented to fill the gap between practical information and wondering where the hell this all came from. Religion as a byproduct of self-reflection. Its use in society is much more complicated than that, but I believe that is its genesis.

Morality, on the other hand, existed prior to cognition. Morality is inherent in all animals. Probably all plants, too, for that matter, but I can’t vouch for that. But there’s no question each species is governed by its own set of rules as to what it can and can’t do. Mainly eat. And most of the time it excludes its own species (except for guppies and sometimes other competitors’ children). It only makes sense that each species has evolved with a strict code of cooperation which ensures the species’ maximum survival; to be otherwise would be inherently impossible. Deviation from those rules, one could argue, is impossible; at least until self-reflection surfaces. One can argue that evil only exists because we can think of it. Prior to thought, it was impossible for evil to exist; it is a strictly human construct.

But the rise of self-reflection didn’t eliminate the power of inherent intra-species rules, i.e. morality. Simply because one was suddenly capable of thinking, “Gee, I could kill my nasty neighbor,” doesn’t mean that one would automatically do so. Surely, even from the very beginning of cognition there were deeply felt urges compelling one to specific behaviors. What one should and shouldn’t do existed long before anyone gave it any thought. All thinking did was give us the power to do what we shouldn’t.

And we’ve been arguing about it ever since. Needless to say, this entire argument is arcane to anyone who believes we were created in situ, as such, by God; and offhand I don’t know of any way of getting through to people like that. If you believe we were all plunked down here, fully formed, 6,000 years ago, there’s nothing I can say that will alter any argument we might have.

Yet even a belief in evolution doesn’t prevent some people from thinking that evolution itself is divinely inspired and that the recognition of morality was programmed to coincide with the recognition of self.

The answer to which is: well, yeah, maybe, but I wouldn’t bank on it quite yet; and at the very least it doesn’t answer but only postpones and confuses the issue. After all, if you don’t have a personal god, you don’t have much of a god at all. If all a god does is set the rules and the ball in motion with a Big Bang, what kind of god is that? If the god doesn’t care about life on earth because it’s such a minuscule part of the universal story, how is that a god? And if the god really does care about what you do here on earth, how believable is that?

Once one has accepted evolution and the tenets of observational science, the question of god’s existence becomes moot. But the question of whence morality comes is not moot and is open to all manner of interpretation. To begin with, it’s essentially tautological to say it’s inherent. The question then becomes, how do inherent moralities play out in the confusion of self-reflection? Certainly, religions step in early on as arbiters of what’s right and wrong, but they forever remain a gloss over our inherent natures. We know what’s right and wrong without anyone telling us. If you don’t believe me, ask any little kid. (After that, ask any teenage girl.)

As we’ve already noted, without self-reflection it’s impossible for a member of a species to act contrary to the species’ rules, as it were. The underlying compulsion, however you want to look at it, is to get along.

And that compulsion does not go away simply because we become self-reflective and capable of acting contrary to our compulsions. Our compulsion is to cooperate and get along, but the confusion of self-reflection, especially when poorly understood, allows us to act contrary to our best interests, sometimes with disastrous results. The compulsion to cooperate and get along drives all of our behavior from fundamentals, such as speech and mannerisms, to cultural overlays, such as style and religion; and it’s not hard to twist a desire to conform into a tool dividing us from them. Once you corner the market on good, you can commit all sorts of evil in its name.

Which brings us to the sanctity of religion.

Let me observe that there is no such thing as a religious war. No god has ever told anyone to go kill anyone else. It has never happened. All decisions to kill people are made by people for people reasons: i.e. control/power. The reasons may be couched in religion, and the combatants might think they’re going out there for the defense of religion, but someone always knows it’s a bunch of hooey and that they’re doing it for them. The person pulling the strings always knows it’s poppycock.

So why does religion get a free pass? Why does it get sanctified, if all it does is turn people into meat puppets?

Because it does it so well.

It does it so well that lots of people can’t imagine life without it. In fact, they’d rather kill than go without it. In fact, they’re often willing to kill you, even if all you want is to go without it.

Maybe now it’s getting clearer why monotheism was so important. Someone had to take control of the incredible power of religion. To leave morality scattered in the hands of multiple gods was not good for war. Better to have only one. Much easier.

The bottom line, of course, has always been the Golden Rule. The Ten Commandments are rather useless, being primarily concerned with religious power, and totally hit-and-miss with their few practical suggestions. You shouldn’t commit murder, that’s for sure (that’s number 6), and you shouldn’t commit adultery, steal, lie, nor—God forbid—even covet your neighbor’s riding lawn mower (that’s number 10), but apparently, if you’re not married, it’s perfectly all right to rape your neighbor’s wife, so long as you don’t covet her. It’s a fine distinction.

In any event, it’s easy to see where one’s natural urges, when it comes to right and wrong, are more reliable than religious prescription. The Golden Rule is ten times safer than the Ten Commandments. Trust me. Yet it doesn’t even make the list. Why is it not on the list? Because it’s not good for manipulation. It’s hard to convince people that they should make war on another people because they so much want war brought upon themselves. It’s a hard sell. On the other hand, with the current list, all you have to have is someone worshipping another god (or none at all) and your first commandment is to do something about it. You get to think of what to do. What do you think you should do to someone who violates the first rule in the list of “the ten most important rules”? Remember, this is five places above murder. What should you do, if a whole nation thumbs their nose at your god? (Actually, God doesn’t leave the choice of what to do about those nose-thumbers to you. He says kill them. Read Deuteronomy, if you’d like more of the same.)

You can see how important it becomes to have that god and protect its sanctity above all else. The blind compulsion to follow is the most potent organizing tool a society has. It’s inherent.

The real question then becomes, if it’s inherent, how do some people escape it? From whence rationalism and the Enlightenment? Are not rationalism and enlightenment as much a product of self-reflection as evil? Or for that matter, good itself? How does one escape the compulsion to follow the crowd?

Beats me, but it’s the great divide in the human race. Forget about race, religion, sex, country of origin. The great human divide is whether or not you’re able to give yourself over to someone else’s direction. Are you able to let someone else make your ethical decisions for you? If you are, you’re simply following the ancient necessity to fit in with “the species”; it’s where you’re safe. How are some people willing and able to abandon that security and make those decisions for themselves? For that matter, how does one get to the position of making those decisions for other people? Certainly, the clues lie in self-reflection. Eventually, climbing the holy hierarchy, one comes to the realization that moral decisions are made by people, not holy writ. If you’re honest, you’ll eventually get to the point where you realize that the voices in your head are the product of your own imagination, not the outside voice of God.

But it’s not an easy realization to have or live with. Every person who realizes that morality is both an individual responsibility and a species necessity, has to make their own moral choices. They cannot rely on exterior authority. Guidance, yes, but authority, no. In the end, all moral decisions are personal. One can only make them for oneself.

Which is why that bumper sticker is so scary. Indeed, Christians can be absolved of their sins. They can have them washed away by the blood of Christ. Which means anything done in the name of God can be forgiven. War, torture, excommunication, burning at the stake; they’re all okay in the eyes of the Lord. And in the eyes of his believers. The people who run religions know that. They know they use their flock as cannon fodder, if not just milk cows. They know that if you believe that the majority of the people believe a particular brand of religion, there’s a good chance you’ll believe it too, unexamined.

The “unexamined” part is important. One constant of all religions is the requirement to believe in the absurd, because once you’ve accepted the impossible, nothing any longer is. Any realistic appraisal of any religion will immediately point up its absurdity, so it’s crucial that believers believe blindly. The choice, when the church has been able to enforce it, has been to kill people who examine their religion. No religion can withstand objective scrutiny, so it’s necessary to require believers to accept the absurd; to believe that God really is directing them when they speak in tongues. In any other instance, having voices in your head is a sign of insanity, but not if you claim that voice is that of God. That argument gets a special pass.

So, if you’re a Christian, you’re forgiven of your sins while the rest of us have to behave properly or suffer guilt. You, thank God, can avoid the suffering of guilt by simply believing you are forgiven for your sins. Sort of takes away the incentive not to commit them, doesn’t it? It’s nice to have a free pass.

In the end, it’s religions which encourage people to act barbarically, while non-believers are responsible for their own behavior. One understands that religious people behave morally by accident, not by conscious thought. The job of religion is not to make sure that people act properly, except in the sense of following its own special codes. The job of religion is to make sure people follow its authority. Monotheism in particular has little social value beyond population control. (You might, for example, think of the control aspects of charity versus ensuring a decent standard of living for all. Charity is so much more powerful than developing self-reliance.)

We will quit this diatribe here. It’s a lonely diatribe, anyway.

But let me leave you with the admonition to be responsible. Don’t hand your soul to anyone else. Only you can prevent forest fires.

Thursday, August 20, 2009

Copernicus Redux

Copernicus, we remember, got in trouble for suggesting that Earth might not be the center of the Universe. We snicker now at such provincialisms. Yet at the same time we warn against anthropomorphism, the trait of looking at anything through human eyes. We are continually reminded that other animals don’t act like us, although most often the reminder comes without the qualification “other”; usually it’s just “animals don’t act like us,” as if we were somehow separate and distinct from the rest of the kingdom.

A defining difference between us and “the animals,” has traditionally been tool use. Only people use tools. That has been a given. Ergo, if one finds evidence of ancient tool use, one has found evidence of early humans. Look at the archaeology of Britain, for example. They’re forever talking about early people in Britain up to 500,000 years ago on the strength of finding stone tools. If A, then B. If all stone tools are made by people, then, if one finds stone tools, one has found evidence of people. Can’t be any other way.

Unless, of course, the premise is wrong; and the more we look around, the more it’s becoming evident that tool use is an upper-primate—call us apes if you will—characteristic, not simply a human one.

It’s also axiomatic that, if two tool-using primates are found to coexist, there’s no guarantee that one of them developed out of the other one; they could have, and probably did, come from a common ancestor further back. The Neanderthals were bad enough, but now we have the Flores Hobbits. The Australians are reporting that not only was the Hobbit not a human, but that it predated (at least on Flores) the traditional pre-human primate that everyone likes to claim as an early human: Homo ergaster (et al.). He’s the same guy as Peking Man and the folks leaving those early tools in Britain, if I have it right, whom both the Chinese and the English claim as early humans. The Neanderthals we could handle so long as H. ergaster was predecessor to them both; i.e. one tool-using species giving rise to two branches: the Neanderthals and us.

But those pesky Hobbits throw a bone into the machinery. If they were not an evolution of H. ergaster, from whom did they evolve? And if they didn’t evolve from H. ergaster, who’s to say we did? Or the Neanderthals?

But isn’t it interesting that all three species (us, ergaster, and floresiensis) managed to cross the forty or fifty miles of ocean necessary to reach Flores? Did two species arrive there by accident, with only us getting there on purpose because we knew how to navigate? Or did all three species have more in common than fire and tool use? What would it mean that at least three species of greater apes have learned how to sail? Or paddle?

Isn’t it a tad presumptuous to call all tool users “human”? And isn’t it equally presumptuous to think that any characteristic we think of as human is our prerogative exclusively? I’m not saying that the human family isn’t big enough to hold some pretty weird characters, but I don’t necessarily think that any chip off the old stone is a person. Just because we now know chimps use tools doesn’t make them any cuter to me. I’m still not ready to let them into the family. If they can figure out how to be butlers, fine, they can have a job; but don’t expect me to let them date my daughter.

Back to the drawing boards, folks. We’ve got some rethinking to do.

But chimps are not merely highly challenged people.

Wednesday, August 12, 2009

Seafood Mama

This just in from The New Scientist: Seafood gave us the edge on the Neanderthals.

Well, duh.

Although the article doesn’t prove or even conclude that. What it establishes is that early (40,000 ya, in this case) humans did eat a lot of marine life, whether they were living on the coast or inland; and they probably ate more than Neanderthals did. Whether or not this “gives us an edge” is, I imagine, open to debate; although something certainly did. If it was seafood, pass the scampi.

Concentrations of iodine in the bones of the early humans examined established its origin in an aquatic diet. Human iodine dependency is a well-known fact, and the only explanation I know of for that dependency is a long-term adaptation of the human body to a seafood diet. Forty thousand years is probably not enough to create a dependency in the entire species, such as we currently experience. (Better eat that iodized salt.)

But while I’ve got you, I’d like to remind you that Neanderthals were not people. Peking Man was not a human. Homo robustus was not human. Homo erectus was not only not human, but probably wasn’t in our line. In fact, none of them probably was. And even if they were in our line, that doesn’t make them human. Somewhere there was a one-celled animal whose descendants are alive in the form of you and me; that one-celled animal was not human.

As far as we currently understand, humans arose some 200,000 years ago. Before that time, there were no humans. Those other bipedal critters in our direct line? Whoever they were, they weren’t people. And we don’t know for sure that any of those other bipedals we’ve discovered were in our line; perhaps none were. We may know sometime, but to claim descent from any currently known fossil species is jumping the gun.

Thanks, and have a nice day.

Friday, July 24, 2009

Spear Chuckers
Tools and Bipedalism

That humans have an intimate connection with water is self-evident, beginning with where we live and the amount of water we consume on a daily basis. In this we are unique among surviving primates. Unquestionably, other primates make use of and enjoy water holes, but none has attached itself to the water hole as humans have.

Humans are also the only current obligate bipedal apes on the planet, so it’s been hard not to assume a causal relationship between our water dependence and our bipedal behavior, though the exact mechanism and motivation have been foggy. What has seemed probable from the outset is that the shift from discretionary to obligate bipedality was dictated by increased access to food quantity or quality or both. It makes a certain amount of sense to think that an ape finding food foraging particularly good in the marshes, lagoons, and shallow waters would eventually adopt as permanent behavior the bipedality that serves it so well in that foraging environment. The main problem with that scenario was imagining that almost all of our ancestral food foraging was done in the water; a shared foraging between aquatic and terrestrial resources seems much more likely, with, presumably, only occasional time spent in an aquatic environment and the rest spent on land. It’s hard to see quite why such an ape would take up walking on two legs on land simply because it was necessary while in the water. It’s one thing to imagine an ape harvesting aquatic resources, but it’s another to think that any ape, much less a whole species, would choose to essentially live in the water on a daily basis. Nonetheless, given no other options, it seemed that bipedality was likely connected to aquatic foraging. The Alice in Wonderlandish musings about other potential scenarios, such as bipedality reducing one’s solar exposure or increasing one’s ability to see one’s enemies/food, have always been beyond the pale of refutation. They can’t be taken seriously. Furthermore, none of the mainline theories has had the courage to tackle the water dependency question, yet. And with good reason. It complicates things no end.

But. I should make that bigger: BUT!

But there have been some recent developments (or, probably more correctly said, “recently released to the public” developments) concerning the study of chimps and bonobos that throw into question what it means to be human; most specifically the discoveries surrounding chimps making and using wooden spears in the hunting of bushbabies, a small primate cousin which they favor for dinner. The chimps sharpen carefully chosen sticks by gnawing on them. Like the Neanderthals, they don’t throw their spears but rather use them to poke around in bushbaby nests until they find one.

Now switch your attention to the zoo chimp (in Sweden, if I have it right) who developed a rapid-fire throwing technique for pelting gawkers at his cage. When the zookeepers removed his rocks, he began tearing off chunks of plaster from the walls of his cage and using them instead. Furthermore, this wasn’t a spontaneous display by the chimp; instead he would spend the morning readying his ammunition stash in anticipation of opening. His was premeditated warfare.

It’s my understanding that bonobos share hunting predilections with the chimps, if not to the same degree of intensity. Nonetheless, it’s now evident that tool using, and even tool making, is not uncommon among apes, and appears the norm, rather than the exception. Needless to say, we’ve never found evidence in the wild of discarded chimp weapons; we wouldn’t recognize them if we saw them. A stone looks awfully much like a stone, unless it’s in someone’s, or something’s, hand. It simply disappears in the archaeological record. As does a wooden spear. We can probably assume, for example, that proto-people used stones and wooden spears as weapons for millions of years before they hit on sharpening the stones. Which, consequently, makes a pack of australopithecines armed with such spears and stones a much more formidable foe than I’d previously considered. A big enough pack might make a pride of lions think twice about the food value of those skinny little twerps.

Which makes me wonder if being able to haul around spears and projectiles might be enough of a benefit for food gathering and safety that those apes who can do it all the time can out-feed and out-breed those who can’t. Are we bipedal because we learned to use and carry tools? Prior to these recent discoveries, it had been assumed that bipedality arose prior to tool use, but that’s clearly not the case with our cousins, and there’s no reason to assume that it happened thus with us. Certainly, there had to be eons of using stones as tools before someone hit upon shaping the tool by flaking, which is where the archaeological record begins. When we see a crude hand axe millions of years old, what we’re seeing is an enormous technological advance on an age-old artifact: the stone. We are not seeing the beginning of a technology; we are seeing an increased sophistication. One could easily imagine a pre-shaped stone age that lasted considerably longer than the “neo-stone age.”

Which essentially turns the standard viewpoint on its head. In the history of archaeology it has always been assumed that bipedality predated tool use. It was felt to be a serendipitous development that bipedality freed our hands for tool use; whereas the truth appears to be that our tool use accelerated our bipedality. In retrospect, it’s fairly astounding that no one, myself included, came to the logical conclusion that when we were witnessing the first crudely flaked hand tools, we were, inevitably, not seeing a brand new technology, but rather a refinement on an extant, time-tested product. Surely people, or rather would-be people, used stone tools for millions of years before they thought of shaping them. It was one thing to sharpen wooden spears; it’s a whole other matter to sharpen stones, and a lot less self-evidently possible. It takes an intimate knowledge of stone, acquired, undoubtedly, through millennia of experience, to see the possibilities of shaping certain kinds of them. That I’ve never seen this self-evident part of the process of stone technology mentioned in any discussion of the beginnings of tool use implies that it hadn’t occurred to anyone until now. (Which, I guess, makes me feel a little better; at least I wasn’t alone.) Now, of course, with the realization that many apes are tool users, it’s observable that tool use precedes the manufacture of permanent versions. It is, unquestionably, one of the great insights of modern evolutionary archaeology, ranking right up there with Alister Hardy’s cogent observation about human body fat.

If bipedalism is a byproduct of tool use, then it frees the development of bipedalism from environment. In other words, if bipedalism arose in response to tool use, it could have happened independent of the ape’s physical surroundings. It could and probably did arise in many different environmental niches, albeit probably all within the standard primate range. It means that it didn’t develop in response to aquatic foraging, nor to the disappearance of the forest, the two current standard models. Tool use, instead of defining humans, appears to be a common development among the apes; and if tool use is common, then the tendency towards bipedalism is probably common, as well; and given enough time and enough apes…

What this theory of bipedalism doesn’t do, though, is solve the problems of our aquatic connections. Even if tool use created our bipedality, it wouldn’t have changed our basic environmental niche, which is where our physiology was created. That humans are apes who inhabit the waterline is not a theory but an observation. Our water dependence may be unrelated to our bipedality, but it’s not unrelated to where we developed. And it’s still reasonable to think that the smartest ape would choose the best foraging/hunting ground for itself, which is always going to be at the water’s edge. There’s still no argument for our particular human development other than down at the waterside. We’ve undoubtedly been hunting whatever the neighborhood looked like for a long time, be it jungle, be it savanna, but we’ve always lived down by the water hole. There is simply no other call for how we came to be. Or if there is, it’s yet to show its head.

Gilligan’s Ideas

But I’m not done yet: a few remarks on Ian Gilligan’s theories.

Gilligan’s theories were recently expounded in Science Alert, an Australian-New Zealand online science news source. The main thrust of Gilligan’s work, and you should read the piece, is that the development of clothing radically affected human development, which is almost tautological, but deserves close attention. He is absolutely right in noting that, without clothing, we’d still be stuck in the jungle, more or less; something many people discount or undervalue.

But the middle of the article finds some words about the naked ape that are most interesting:

As Gilligan points out, Homo sapiens are thermally very vulnerable, having at some point lost the thick fur covering of other mammals. The idea that this might have occurred in response to heat doesn’t really hold up, as fur can also insulate animals in warmer environments. Gilligan’s guess is that human hair loss came about as a side-effect of a slowing of the expression of the genetic code in our species, meaning that we’re essentially juvenile mammals in physiological terms, if not in mental capacity.

“Slowing of the expression of the genetic code.” Now, what does that mean? How does that manifest itself? How is the code normally expressed, and how is its expression slowed down? This is important, because it’s essential for understanding the next thought: “meaning that we’re essentially juvenile mammals in physiological terms, if not in mental capacity.”

I don’t know how you read that, but I read it to say that early human development is retarded in some manner and that said retardation somehow affects hair growth. I’m not sure how “juvenile mammals” fit into the human equation, because it seems to me that juveniles of furry animals are every bit as furry as the adults; so I don’t know how this “slowing of the expression of the genetic code” is supposed to function, but at first glance it’s unexplained. Presumably there’s more to it.

But offhand, I’m not buying it. At least until further clarification, but the obfuscating terminology is not helping.

Other than that, he’s dead on about the clothes.

Except maybe the dates.

Wednesday, June 24, 2009

Chimps at the Water Hole

Last night’s (June 23, 2009) NOVA, “Ape Genius,” opened with a group of chimps having a “pool party” (their words) in the wilds of Africa. There they were, dropping from branches into the water, splashing everywhere, behaving like, well, kids at the beach having a grand old time.

You won’t, of course, find much about that in print, because it hasn’t reached the general knowledge base yet, but it’s there in living color. Chimps like to play in the water. At least some of them do.

NOVA went on to show chimps making and sharpening spears to hunt bush babies, a small arboreal primate, as well as understanding and following fairly complicated English instructions. The gist of the show was that we share more traits than we’d like to admit with the other apes.

It began by explaining that the human bias toward thinking many characteristics, such as tool making, are uniquely human comes from a feeling that people have been blessed by the hand of God, so to speak, making them a separate beast from the other apes; which, the narrator promulgated, was not the truth. Easy for him to say, yet the show continued to speak of the differences between humans and apes rather than between humans and the other apes: a small but significant distinction. For even the most humble of observers it’s difficult to say “other apes,” because couched within that phrase is the admission that we, too, are one of them. (Something, frankly, which bothers me when I see us all driving cars down the road: should apes be doing this, I wonder?)

The show questioned why people ended up the way we did versus how the other apes ended up, but it didn’t look at the broader implications of these discoveries towards evolution in general; it only looked at the relationships between humans and chimps and bonobos. Certainly, chimps and bonobos inhabit different strata of the forest and eat probably slightly different, if similar, diets; and it’s well known that the two species have quite different social structures and behaviors. “Ape Genius” tried to make a case for humans developing differently than other apes because of certain social traits which we have, such as being able to squelch our emotions, rather than seeing those social traits as part of a larger complex of behaviors. Significantly, they didn’t discuss the implications of chimp behavior at the water hole, which is frivolous and interactive, nor did they make any connection between bipedalism and human behavior. It could well be that the forces which inclined us to be bipedal were the same forces that inclined us to be socially cooperative. And it’s plausible that both those traits were picked up at the water hole. (If I were you, I’d keep watching those chimps at the water hole. If they start liking frogs and tubers, who knows how far it could go. Couple million years, they might stand up and sing.)

NOVA also failed to stress the ubiquity of tool-making beyond humans, chimps, and bonobos; but if we have three extant primates making tools, the implications for primates of the past are enlightening. For one thing, it means that tool-making is not a uniquely human characteristic; so that any tool-making fossil from the past is not necessarily in our line any more than is your local bonobo.

But watching those spear-chucking chimps made me rethink early hunting strategies and what sort of weapons would be effective in the open country; and I’m starting to think that a pack of australopithecines armed with sharpened spears might be a formidable foe. They might not be fast of foot out there in the open, but they were clever and, probably, cooperative. This hunting-with-the-dogs thing is starting to make a lot of sense. One thing us “higher” apes are good at is learning from observation. I can see our ancestors watching how pack carnivores work and imitating their cooperative methods. I can see how, when they found themselves hunting the same prey, the primates would start mimicking and running along with the dogs, as it were. I can picture the dogs looking to each other and asking, “Who let the people out?” And I can see, after a successful hunt, right from the get-go, the tiny little A-piths tossing the dogs their share: A) it avoids a nasty fight; and B) it ensures cooperation next time they find themselves hunting together.

We might have been tough guys out there on the veld. Provided, of course, we had enough to drink.

Saturday, June 13, 2009

Mr. Wrangham’s Excellent Conjecture

Richard Wrangham is about to publish a book, Catching Fire: How Cooking Made Us Human, where (according to New York Times author Claudia Dreifus) he suggests that cooking was an essential ingredient in our rise above our fellow primates, largely by reducing the amount of time spent chewing our food. He compared our primal diet to that of chimps, whose food, he says in an interview with Ms Dreifus, they have to “masticate for a full hour.” By cooking, one could cram the same food down one’s gullet in a few minutes, leaving one, presumably, more time to work on one’s sonnets and Pythagorean theorems.

Done.

I haven’t read Wrangham’s manuscript, but I have read the NY Times interview plus another article on his proposal and am prepared to make a few observations.

The first of which is that it’s an excellent observation and undoubtedly one with a lot of merit. Unquestionably, cooking took hold for some reason. Whether or not cooked food is “more nutritious” and “healthier” than uncooked food is certainly open to debate and shouldn’t be accepted prima facie (although cooking definitely makes palatable some foods which could otherwise be deadly), but that cooking makes food quicker to eat is undoubtedly true, although that may not have played as great a role in its adoption as flavor. If Wrangham displays any major fault with his theory, it’s that he’s too much in love with it, a common problem with theories; I suffer it myself.

Another fault is comparing us to chimps too keenly. Unquestionably, we’re closely related to them and their cousins the bonobos, but we’re likewise a lot different. No one would confuse the two of us walking down the street. And while we may share a distant ancestor, we long ago took different paths; occupied different parts of the forest; and, evidently, ate a different diet; something Wrangham has yet to fully grasp, even after trying to mimic the chimp diet when he was living in Tanzania in 1972. It’s true, we’re both omnivores, but that doesn’t mean we eat the same things; and trying to figure out what a chimp eats won’t necessarily lead you to what proto-people ate two or three million years ago. Especially if you don’t know where those proto-people lived, an issue about which Wrangham is confused. (We won’t get into Wrangham’s being primarily a vegetarian, which casts serious doubt on his understanding of both nutrition and evolution. I should mention here that I had a wife once who tried to emulate the diet of her goats, under reasoning not too different from Wrangham’s, but was dissuaded after the first mouthful of barbed grass heads stuck in her throat for hours.)

Wrangham bases much of his argument on assumptions about human fire use from 1.8 million years ago, a somewhat earlier date for fire use than is generally agreed upon, but not so early as to be improbable. What becomes questionable is to what uses fire was put at that early time and how often it was available. The only sources I’ve found estimating when people were first capable of “making” fire peg that date between 9,000 and 15,000 years ago, which seems impossibly recent to me. It’s hard for me to believe lightning could be a reliable source of fire, particularly for people traversing the northern reaches at the edge of glaciers, but it’s certainly likely that people were able to control fire long before they could create it. Still, it’s hard to imagine that accidental fire would ever be common or ubiquitous. One can only imagine that for eons people clustered around fires when they had one, but that for most people most of the time there was no artificial warmth; and when someone decided to try cooking something other than a hunk of meat is not determined. In any event, people were upright creatures millions of years before they captured fire, and they were, apparently, already on a superior road to the chimps by the time they stood up for good, long before they started cooking anything. Cooking was a great technological leap and a great time-saver, but it probably didn’t affect our basic nutritional intake, at least at first. Certainly, as some products lend themselves more conveniently to cooking, they tend to become emphasized, but I can only presume that cooking didn’t initially affect food choice. The bottom line, though, is that we have and had our own dietary preferences, distinguished from what other apes eat, and most likely always have had. It’s likely that those early dietary differentials are part of what propelled us down the path we found ourselves on; not to mention that those dietary preferences probably caused us to become upright, as well.

Wrangham drops other bons mots into his conversation that are clearly either inaccurate or unknowable. When he states that "the australopithecines, the predecessors of our prehuman ancestors, lived in savannahs with dry uplands," he's making both errors and assumptions. When the australopithecines first descended from the branches, there were no savannas where they lived. True, savannas had appeared by the time fire was captured, but they certainly weren't where any prehumans lived. The australopithecines couldn't change their habitat just because their world was drying out around them. Fortunately for them, they lived in a micro-environment along riverbanks and around marshes and swamps that may have shrunk in size but never disappeared. When the savannas appeared, they could forage, scavenge, and hunt in them without actually living in them; provided, of course, that they'd developed weapons and a way to carry water.

Wrangham, likewise, makes several assumptions about how having fires changed our socialization, such as fires providing a stabilizing hearth around which people would then cluster. "This was clearly a very different system from wandering around chimpanzee-style, sleeping wherever you wanted, always able to leave a group if there was any kind of social conflict," he claims, ignoring that most animals, chimps included I'll bet, have regular bedding spots (not to mention birthing spots) that aren't chosen so devil-may-carefully as Wrangham suggests. Despite the myth of wandering animals and people, everything has a home territory; every bird has its branch. Even albatrosses.

And while it may be a minor point, there's no guarantee that the australopithecines were our ancestors any more than H. habilis was, a species he also claims as a "distant ancestor." If an ancestor is someone in one's direct line, then there is no assurance that either the australopithecines or habilis were our ancestors. We may all have shared another, as yet undiscovered, ancestor. Someday we might have tests that can determine the relationship of those old fossils to ourselves, but in the meantime we're going to have to go on the assumption that those old species were relatives, but not necessarily ancestors. We can safely assume that the majority of fossil primate species, bipedal ones included, have yet to be discovered and may never be. Just because a species that shares many of our traits was common at an earlier time doesn't mean it is in our direct line.

One also has to wonder about Wrangham's general life experience when he makes a statement like, "They [H. habilis] certainly made hammers from stones, which they may have used to tenderize [meat]. We know that sparks fly when you hammer stone. It's reasonable to imagine that our ancestors ate food warmed by the fires they ignited when they prepared their meat."

No it’s not. It’s not reasonable at all. It’s a prime example of being in love with ones own theory; once you’re in love, anything is possible. Even if it’s not. Nobody whacking away at a wet piece of meat with a rock is going to send off sparks that are going to catch that meat on fire. Or anything else laying nearby, either. Ain’t gonna happen. If it happened once in the entire history of humanity, I’d be amazed, but to depend on it as a way to get ones cooking fire going? The danger of stretching ones argument like that is that it casts doubt on the rest of ones propositions.

Such as trying to emulate chimps' eating patterns. He wanted to eat just like a chimp but "in the end… never did the full experiment." He did allow, though, that "there were times when I went off without eating in the mornings and tried living off whatever I found. It left me extremely hungry." He relays this as if it were a valid contribution to the discussion, a valid experiment from which he got a significant result: he was hungry. Aside from the aforementioned problem that we don't share a diet with chimps, here was an untrained, naive city boy trying to live off what he could find to eat, despite not knowing beans about what's edible in the landscape. And then he has the balls to imply that his hunger was akin to what all "prehumans" would have experienced.

Not likely. And remind me not to go hunting with this guy.

A couple of other suppositions surrounding fire would strain the credulity of even the most charitable among us:

One was that early cooks would place carcasses in front of advancing wildfires in order to have them cooked as the fire passed over; which surely explains the high hazardous-duty pay that habilis cooks earned. Not to mention an attrition rate higher than the kamikaze pilots'. This, even though burned bones can be analyzed to determine whether a cooking fire or a wildfire charred them, because cooking fires reach much higher temperatures than wildfires. In other words, not only would such a method of cooking be absurdly dangerous, it wouldn't provide a superior, or necessarily even adequate, result. Trust me, our ancestors survived to become us because they were clever, not foolhardy.

Another amazing suggestion is that people lost their fur/hair in front of the fire, the hair no longer being necessary to keep them warm. It immediately makes me picture cowboys coming in from the range after riding herd, gathering round the campfire, taking off their clothes… Sure, we've all seen Brokeback Mountain. We know all about campfires and long, lonely nights. Heck, at every campfire I've ever been at, everyone has taken off their clothes. At your fires, too, I imagine. All that notwithstanding, wouldn't we expect people who'd lost their hair from sitting in front of a fire to have bald chests and hairy backs? Why keep hair on the tops of our heads, for God's sake? How about to keep from getting sunburned?

That’s what’s meant about the dangers of falling in love with ones own theory. The seduction to explain everything is too great.

It’s too bad, because the origins of cooking are obscure, yet obviously crucial to human development. The relationship between cooking and farming has yet to be explored in any detail. Did we cook anything other than meat prior to the adoption of farming? Was cooking a catalyst for farming? It’s an intriguing question which Wrangham is right to approach. Hopefully, the next person to look at the subject will be more grounded.

Sunday, June 7, 2009

Follow the Bottle Gourd

The Web is aptly named; the strands are infinite and one can get lost in them. Things appear and disappear, move, or vanish forever. I recently ran across a Dec. 14, 2005 article from Science Daily, "Ancient Humans Brought Bottle Gourds To The Americas From Asia." Thank God, because it's the only work I've seen on the subject since Charles B. Heiser's seminal 1979 tome, "The Gourd Book." Heiser's book doesn't touch on the subject of human evolution (at least I don't remember it doing so), focusing instead on the use of gourds in human history. Undoubtedly, the most fun part of the book is the stuff on gourds as penis display extensions, but the statement that caught my eye was that gourds were most likely the first domesticated plant, predating any foodstuffs.

Really?

I was already convinced of the aquatic side of human evolution, but a finding like this, if true, was a powerful piece of evidence in its favor. Now, a quarter of a century later, I find a confirmation of Heiser's statement. The Science Daily article asserts that "This lightweight 'container crop' would have been particularly useful to human societies before the advent of pottery and settled village life, and was apparently domesticated thousands of years before any plant was domesticated for food purposes." (Emphasis mine.)

Bingo!

But let me back up a second. This is, after all, post No. 1 for Ape Shit. An introduction is in order.

Simply put, I advocate a demographic approach to the question of human evolution. I believe that a study of the demographics of known human populations can allow us to reconstruct the primal habitat in which we evolved. To my knowledge, this approach is unique and hasn’t been posited elsewhere. There are too many ramifications and elements of the theory to cover it thoroughly in one post, so it will all have to come out in the wash, so to speak.

The guiding principle of the demographic theory is Occam's Razor. The demographic theory is vastly simpler than anything else proposed, which allows it the luxury of not having to explain away aberrations. The various complexities proposed by the savanna school and its offspring, such as standing up to catch the cooling breezes or to carry food to one's mate, verge on the absurd and would generate ridicule were they not proposed by stalwarts of the academic community. That they continue to be put forth makes one shake one's head in amazement.

The only way the academic community has avoided capitulating to some variant of the AAT (the Aquatic Ape Theory) is by refusing to engage in discussion with AAT supporters. If you don't debate them, you can't lose the debate. Duh!

Because there’s no way anyone can stand in front of an audience and claim that our ancestors’ response to the savanna was go out and behave in ways that no other animal in the history of the world ever behaved and in ways that would surely have gotten them killed, had they done so. You can only pull stuff like that for so long before some fresh-faced kid yells, “Hey, that penguin’s got no clothes.”

But I promised…

Back to the gourds.

Why gourds? Why domesticate gourds thousands of years before foodstuffs? In fact, some people question why we started farming at all. Apparently, if the record of gourds is correct, we knew how to farm thousands of years before we chose to farm anything to eat. Why raise gourds and not food?

The demographic theory says: everyone lives by the water hole. We do now; we always have. It is part of the human condition to have access to water wherever we are and at all times (more or less); to lack that access is to risk death. This is not necessarily true of other animals, and certainly not true of any other primate. We are unique in that regard. The water hole (the bar, the coffee klatch) is still, along with the hearth, the center of human socialization. Chimps don't go hang out down at the water hole, but we always do. We consume far more water than any comparable animal or other primate and likewise excrete a uniquely dilute urine stream. The demographic theory assumes that this human condition is primordial. Nothing has changed since the beginning. We have always lived at the water hole.

Which means that for millions of years (well, ever since time began, really) we were stuck at the water hole, as much as a kingfisher is. We couldn't leave; we needed too much water, too much of the time, to stray from a freshwater source for any length of time.

Which was, truth to tell, not as bad as it sounds. For one thing, all the best food is down at the water’s edge, if not in the water itself. It seems, for instance, that we’ve always been fond of turtles and frogs and clams, as well as roots and nuts and berries; and they’re a lot easier to catch than squirrels or rabbits, not to mention monkeys.

Furthermore, being slow of foot, not good climbers, and plodding swimmers, we were safest down where the brush was thickest and the trees easiest to climb; or, if need be, we could jump in the water to avoid some predators. We weren’t great swimmers, but we were better than lions.

Good food, safe environment. Where do you think the smartest monkey on the block would set up camp? Don't forget, for millions of years we were bipedal, small, slow, and completely unarmed, but clever. Not to mention that for those early millions of years there weren't many savannas anyway; and if we were caught out in the open, we were known as "lunch." Given that food and safety were down on the bank and that we were a prime entrée out in the open, little wonder we stayed down by the water.

Or so the theory goes.

Living down by the water meant that food was rarely our great concern. We’ve always lived in the lush part of the environment. Which is why we could know about farming for thousands of years without being pressured into adopting it: we didn’t have to. We always lived among (relative) abundance.

But for good or ill, we were stuck there, as well, until…

Until we figured out how to carry water with us.

Bingo! Gourds!

Once we had gourds, we could escape the banks. We could travel. We could stay overnight somewhere. We could survive. Food we could find anywhere, but water was, and still is, precious. It makes perfect sense that the bottle gourd was the first domesticated plant, if everyone was stuck at the fountain. If we had been living on the savanna, of course, raising bottle gourds wouldn't have made much sense at all. Why drag around something you don't need? And surely, if you're a savanna resident, you don't need a water bottle. You don't see lions or gazelles with water bottles, do you? But if you didn't really live on the savanna, but rather lived down by the water hole and only hunted the savanna once you'd acquired weapons, then a water-carrying device would have been a huge technological advance. You can see how people would start growing bottle gourds right away; the hell with rice and wheat and corn. You want to get up and get out of there, not sit down and grind grain for the rest of your life.

You can, by the way, think of the bottle gourd as the gateway plant, much as marijuana was the gateway plant to the modern organic farming movement. It's not well known outside the organic farming community, but many of its members began their farming careers back in the 1960s and '70s as pot farmers, and only after they'd spent the effort learning how to grow quality marijuana did they turn their attention to the larder. Likewise, evidently, people grew bottle gourds for thousands of years before finally deciding that, if they were going to all that effort, they might as well try growing something else at the same time. Indeed, it's been known for some time that people were knowledgeable about food farming long before they took it up, which has generated the nagging question: why? What took people so long? If they'd known how to grow crops for thousands of years yet didn't, what made them change their minds?

There is another side to the Science Daily article, though, one that expands considerably on Heiser's hypothesis. The gourd was brought to America from Asia, the researchers contend, "some 10,000 years ago." Bruce Smith, co-author of the research paper, says that these early immigrants "did not arrive here empty-handed; they brought a domesticated plant and dogs with them." He doesn't say "plants," plural. Just "a domesticated plant." The review article doesn't cover cultivation requirements for the bottle gourd, but a University of Florida Web site does, adding that the bottle gourd "is the only crop known to have been cultivated in pre-Columbian times in both the Old and New World." (A conclusion that is open to debate.) What I was interested in, though, was what kind of climate was required for growing bottle gourds, and the same site says they're grown "from warm parts of the temperate zone throughout the dry and wet tropics." The importance of that information is that it implies the gourd didn't slowly travel up the Siberian and down the Alaskan coasts as people pushed into the Americas, but rather that it was most likely transported from one temperate zone to another, an ocean away, in one fell swoop. Quite what all that implies boggles the mind. Essentially, what they're saying is that 10,000 years ago someone deliberately schlepped some bottle gourd seeds from China to California (or thereabouts) for the purpose of planting them. Hmm? At the very least, it means that by 10,000 BP the immigrants to the Americas were not just hunters and gatherers but were already farmers. That's a pretty big "at the very least."

Furthermore, you can bet the farm that the 10,000-year-old date is by no means a record of the earliest bottle gourd cultivation. It's only the earliest date we currently have for its cultivation in the Americas. God only knows how much earlier it was first cultivated in Africa before spreading to Asia and only then on to the Americas.

Well, anyway, that's how the demographic theory sees it. Do you have any better guesses? The savannistas won't touch the issue with a ten-foot pole. Why do you think the bottle gourd was so important?