Wednesday, December 26, 2012

Savanna Confessions

Red Colobus Monkey
So maybe I was wrong. Sort of.

It’s no secret that I’m not a fan of the savannista hypothesis, despite its considerable popularity within the profession, primarily because it doesn’t account for demographics. It flies in the face of our current demographics. I begrudgingly accepted that early hominins harvested the grasslands, but I was skeptical of the disappearing forest as catalyst for their abandoning the protection of the trees. It appeared crystal clear to me that something had to draw the people down from the trees; they couldn’t have been forced out; evolution doesn’t work that way. It was also fairly evident that that something was meat, and that the protection was weaponry. Quite why meat suddenly became important for an isolated group of simians, I hadn’t worked out.

Nonetheless, most people in the profession still insisted that the disappearance of the forest and the simultaneous emergence of hominins couldn’t have been coincidental. Furthermore, hominins were beginning to harvest the grasslands; they were certainly eating small ungulates and vertebrates, as well as turtles. Given our demographics, it is unquestionable that at some point humans moved to the waterside, but that’s another matter; first they had to get out of the trees.

The problem with leaving the trees, of course, is that it’s exceedingly difficult to change one’s eco-niche. As a rule, when an eco-system disappears, all components of the system, plants and animals alike, disappear as well. That humans didn’t disappear when the forest disappeared says that, self-evidently, their food source didn’t disappear along with the forest; which in turn implies that their food source wasn’t entirely dependent on the forest.

Stepping back a minute and looking at the pre-split simian, back when chimpanzees and hominins were a single species, and considering that today’s chimps are regular hunters who fashion their own spears and considering that humans are currently hunters, it’s most likely that the pre-split simian was also a hunter who fashioned his/her own weapons/tools. And that simian was already spending considerable time on the ground. That being the case, when the simian species split into two groups—a necessary condition for species development—possibly by the growth of intervening grasslands, it could well be that one of the two groups was caught in an island forest that was slowly disappearing under it. For many, if not most, of the inhabitants of that eco-system, it meant their disappearance, as well.

These chumans (chimps becoming humans), though, had a diet which included meat. Their main meat, if current chimps are any indication, was probably monkeys of one sort or another; although chimps are known to eat as many as thirty-five different vertebrate species, including small antelope. One can imagine that, as the forest disappeared, not only did the plant matter on which the chumans depended disappear, so did the monkeys. Fortunately, though, small ungulates didn’t disappear. In fact, with the spread of the grasslands, many types of antelope would spread as well. It might have been fairly easy for the chumans to switch to an antelope-dominated rather than a monkey-dominated diet. And, if the plant material which the chumans were accustomed to eating began to get scarce, it could well be that those chumans who were better at and relied more on meat for their diet out-paced those still dependent on forest products. In such circumstances, a species could change the ratio of its diet fairly quickly without changing its basic dietary needs. And by such a process, chumans, by now hominins, could have adapted fairly quickly to a disappearing forest. Concomitantly, it also meant that the newly minted hominins would have to spend all their time on the ground if they were going to be successful antelope hunters.

Okay, I’ll give you that. That’s a likely scenario which covers the savannista basics. It gets people out on the prairies. It does, perforce, get them out on the prairies as hunters. I’m sure the defensive value of spears wasn’t lost on these early ancestors of ours, either. What this undoubtedly starts is the human habit of tracking down animals over long distances, killing them, and transporting the meat back to the main group/tribe/family. Given that one’s hands are occupied with either weapons or kill, it behooved one to stand up as straight and as efficiently as possible; and those that did so easily were rewarded with successful hunts. Natural selection would select for upright bipedalism, as it did.

In defense of my original argument: evolution is governed by opportunity, not necessity, and that still holds true here. The hominins weren’t forced out of the dwindling forest; they were enticed out by a new abundance of game, a shift from monkey to antelope. If there is a significant problem with this scenario, it’s that our shift towards obligate bipedalism occurred some three million years before the thinning of the forest, but who’s counting? And we shouldn’t forget that the hominins had no idea the forest was thinning; they were simply out hunting like they’d always done.

But this hunting scenario has other requirements. I would argue that, by adopting hunting as a regular part of our alimentary scheme, we were forced to adopt other regular practices of predatory animals, among them the establishing of temporary dens, lairs, or nests where the young are raised until old enough to participate in the hunt or other harvest. I would argue that, from the time we left the trees, we made those lairs down by the water (if for no other reason than we were reluctant to give up the trees until the last minute and that the last trees in the grassland cleaved close to the waterways), and we haven’t moved from there since. Sure, we covered immense territories in our hunting forays, but when we came back home, it was always down by the stream, on the edge of the swamp. That would account for our current demographics.

So, how does that hypothesis fit? Have I left anybody out?

Friday, December 21, 2012

Scavenging the Web

I ran across the following while hunting for hominin scavenging hypotheses on the Web. I have no idea who Blumenschine and Cavallo are or what book this quote is taken from, nor do I want to represent it as a view endorsed by most scavenger-hypothesis proponents; but I thought it typical of the fantasy world inhabited by advocates of the scavenger hypothesis:

“The earliest hominids probably scavenged and took small prey with their hands, as chimpanzees and baboons do. Only their next step was unique: they began to use tools to butcher large carcasses that nonhuman primates cannot exploit. The difficulty of this leap (to the use of tools to butcher) belies the charge that scavenging offers no challenge that might select for human qualities. . . Scavenging is not at all easy for a slow, small, dull-toothed primate. To locate scavengeable carcasses before others did, we had to learn how to interpret the diverse cues to the presence of a carcass in riparian woodlands. They include the labored, low-level, early-morning, beeline flight of a single vulture toward a kill; vultures perched in mid-canopy rather than at the crown of a tree, where they nest; appendages of a concealed leopard or of its kill dangling from a branch; and tufts of ungulate hair or fresh claw marks at the base of a leopard’s favorite feeding tree. At night, the loud ‘laughing’ of hyenas at a fresh kill, the panicked braying of a zebra being attacked, the grunting of a frightened wildebeest—all serve notice of where to find an abandoned carcass when morning comes.” (pp. 94–95, Blumenschine and Cavallo, 1992)

Starting off with a bang: “The earliest hominids probably scavenged and took small prey with their hands, as chimpanzees and baboons do.” This is the foundation of their argument: equating early hominin behavior with that of chimpanzees and baboons. Fair enough. Except that in Stanford’s paper which I cited last post, “The Predatory Behavior and Ecology of Wild Chimpanzees,” the author states, “wild chimpanzees (particularly the males who do most of the hunting) show little interest in dead animals.” Okay, so much for the scavenging chimps. Unfortunately, of course, since B&C start by presenting this as fact, all the rest of their conclusions follow from this erroneous statement. Did they just not know what chimps eat, or did they ignore it and make up chimp behavior to suit their theory? Because that’s, effectively, what happened.

They continued with the following assertion: “The difficulty of this leap (to the use of tools to butcher) belies the charge that scavenging offers no challenge that might select for human qualities.” Since they haven’t shown that hominins were scavenging, the statement makes no sense. It’s a non sequitur. That people used tools to butcher is no proof that they scavenged the animals they butchered, much less that scavenging was selected for. They don’t appear to realize that, if scavenging had been selected for in human evolution at such a late date, we’d still be scavengers.

The authors did have a flash of reason when they observed, “Scavenging is not at all easy for a slow, small, dull-toothed primate.” Forget about unarmed. So, why would we do it? It’s doubtful we could scavenge enough food to compensate for the energy expended to get it. Scavenging is easy for hyenas and vultures; they’d clean everything up long before slow us got to the carcass, even if we wanted to eat the meat.

The authors finished with a fanciful flurry of heightened imagination; for a minute they thought they were writers:

“To locate scavengeable carcasses before others did, we had to learn how to interpret the diverse cues to the presence of a carcass in riparian woodlands. They include the labored, low-level, early-morning, beeline flight of a single vulture toward a kill; vultures perched in mid-canopy rather than at the crown of a tree, where they nest; appendages of a concealed leopard or of its kill dangling from a branch; and tufts of ungulate hair or fresh claw marks at the base of a leopard’s favorite feeding tree. At night, the loud ‘laughing’ of hyenas at a fresh kill, the panicked braying of a zebra being attacked, the grunting of a frightened wildebeest—all serve notice of where to find an abandoned carcass when morning comes.”

Pay attention to the phrase “riparian woodland”: “woodland down by the river,” another cogent, if unintentional, observation by the authors. Other than that, they didn’t appear to think much beyond their thesaurus. Or they haven’t spent much time looking at vultures. I’d challenge them to spend a day on foot out in, say, the John Day country of eastern Oregon and try to find one carcass with enough meat on it for them to get a reasonable bite or two. Hell, just find any old carcass, meat on the bones or not. Roadkill doesn’t count; stay to the backcountry. Go ahead, follow all the vultures you want. And in the early morning light, try to find where the coyotes were last night. Did you hear the scream of a mountain lion? Track it down and see if it killed anything.

The authors are, alas, delusional and woefully inexperienced. We are exceptionally poorly equipped to be scavengers. We are not so poorly equipped to be hunters; we’re good at throwing things. The odds of us finding game to kill are much greater than the odds of finding a leftover carcass. If you’ve ever been deer hunting, try to remember the number of dead deer you’ve seen (that weren’t shot) compared to the number of live ones you’ve seen. In my experience, live deer outnumber found carcasses by well over a hundred to one. Our blunted sense of smell alone would rule us out for scavenging.

Methinks a major stumbling block to understanding stone tools is that paleo-anthropologists tend to think of the first stone tools as butchering tools: hand-axes and scrapers; that’s why they place so much weight on early flaked tools, which were, largely, just that. They ignore the virtual certainty that the first stone tools were nothing more than the stones themselves, unprocessed. And not just for cracking open marrow bones or turtle shells, but for tossing at prey. There’s a reason why we developed a throwing shoulder early after descending; and there’s a reason why ball games are so universally important. Kill a seagull by pitching a baseball at him? You bet. A trick like that can stand between you and starvation, or at least a good meal. I imagine we came down from the trees as primarily spear-chuckers, but I’ll bet we added rocks to our arsenal early on.

No matter how common it is, it’s always a bit unsettling to find academic authors making such basic mistakes. Hopefully, some of their colleagues raised the same flags I did.

There’s always hope, right?

Thursday, December 20, 2012

Dining With the Chimps

Somehow I got caught up in a debate about scavengers. It began with critiquing a recent article suggesting that early hominins ate grasses (see last post). In response to a Facebook post on the matter, a friend replied that she thought early people were scavengers. She led me to a paper by Craig Stanford, an anthropologist from USC, “The Predatory Behavior and Ecology of Wild Chimpanzees” (no date), that had some provocative information in it. If you thought chimps were meat eaters before, this paper makes it shockingly clear how much meat they eat:

“At Gombe, we now know that chimpanzees may kill and eat more than 150 small and medium sized animals such as monkeys, wild pigs and small antelopes each year.…

“The amount of meat eaten, even though it composed a small percentage of the chimpanzee diet, is substantial. I estimate that in some years, the 45 chimpanzees of the main study community at Gombe kill and consume more than 1500 pounds of prey animals of all species.”

I was under the impression that they primarily ate bush babies (galagos), but Stanford asserts,

“Although chimpanzees have been recorded to eat more than 35 types of vertebrate animals (Uehara 1997), the most important vertebrate prey species in their diet is the red colobus monkey.”

Thirty-five types of vertebrates? Once again, I am astounded. These guys are bigger hunters than I thought. I do know that they count leopards among their prey. Yup, leopards. Their prey.

Killing by chimps is anything but incidental. Stanford observes:

“Jane Goodall has noted that the Gombe chimpanzees tend to go on ‘hunting crazes,’ during which they would hunt almost daily and kill large numbers of monkeys and other prey (Goodall 1986).”
He went on to assert,
“Chimpanzees may be among the most important predators on certain prey species in the African ecosystems where they live.”

Nonetheless, he still maintains a distance from meat-eating humans. He opines, “Since neither humans or chimpanzees are truly carnivorous—most traditional human societies eat a diet made up mostly of plant foods—we are considered omnivores”; a claim which is disputed somewhat. There are people who classify humans among the carnivores, but it’s indisputable that we’ve always had a large herbivorous component to our diet.

The notion that “most traditional human societies eat a diet made up mostly of plant foods” is somewhat disingenuous. It depends largely on what one means by “traditional society,” a term he does not define. There are, in the main, two kinds of societies one might call “traditional.” There are the traditional elements of all modern societies, all of which are post-agrarian-revolution and, hence, much more dependent on plant matter than a traditional hunter-gatherer society. And there are the pre-agricultural societies that survived into the modern age; they did so because they lived on marginal lands and don’t necessarily represent the norm for pre-agrarian societies. Either way, it’s hard to imagine claiming that the Inuit, for example, “eat a diet made up mostly of plant foods.” So, while it’s true that we’re omnivores, it’s by no means established that pre-agrarian societies ate primarily plant food. It’s those kinds of unwarranted assumptions that cloud one’s interpretation of data. This one makes it more likely that we had a vegetarian past, something the PC (politically correct) world would dearly love to be true. The brutal truth is, we were certainly at least as significant predators as chimps, and we certainly became more so once we were upright.

The author asked the question:

“Many researchers now believe that the carcasses of large mammals were an important source of meat for early hominids once they had stone tools to use for removing the flesh from the carcass (Bunn and Kroll 1986). But the evidence for stone tool use dates to only 2.5 million years ago. For the 3 or so million years of human evolution prior to that time, did our ancestors eat meat?”

Classic blunder. It’s not “evidence for stone tool use,” it’s evidence for flaked-stone tool use. All the difference in the world. By not mentioning the inevitable millions of years of un-shaped tool use that must have preceded it, he leaves the impression that we didn’t start using stone tools until we miraculously learned how to shape them. If we hadn’t already been using stone tools before we learned to shape them, how did we ever come up with the idea? It makes the question, “For the 3 or so million years of human evolution prior to that time, did our ancestors eat meat?” sound as though he didn’t read what he’d just written. Does he think we would have given up eating meat for those intervening years? Surely, he doesn’t. It’s an unfortunate slip of logic.

And it undercuts the idea that we could ever have been scavengers. I know of no instance where a species turned from predator to scavenger, not to mention back again. That would be stunning, unprecedented. In the end, scavenging turned out to be one funny idea.

Tuesday, December 18, 2012

News Flash: Scientists Survive on Poppycock

Dec. 14, 2012

“Scientists 'Surprised' to Discover Very Early Ancestors Survived On Tropical Plants, New Study Suggests”

“Researchers involved in a new study led by Oxford University have found that between three million and 3.5 million years ago, the diet of our very early ancestors in central Africa is likely to have consisted mainly of tropical grasses and sedges. The findings are published in the early online edition of Proceedings of the National Academy of Sciences.”

Frankly, I was skeptical. I just couldn’t imagine early humans grazing on grasses. Nor could I imagine that our ancestors would have given up meat so quickly after learning how to be upright and hunt with weapons. I could see vegetable matter sharing a place in their diet, but grasses seemed unlikely.

The study authors, I guess, also thought that eating grasses directly was unlikely, for the article continued:

“The authors argue that it is unlikely that the hominins would have eaten the leaves of the tropical grasses as they would have been too abrasive and tough to break down and digest. Instead, they suggest that these early hominins may have relied on the roots, corms and bulbs at the base of the plant.”

That mollified me some but I was still skeptical. Which corms and roots and bulbs are we talking about? From grasses and sedges? Really?

Then comes the clinker:

“Professor Lee-Thorp said: ‘Based on our carbon isotope data, we can't exclude the possibility that the hominins' diets may have included animals that in turn ate the tropical grasses. But as neither humans nor other primates have diets rich in animal food, and of course the hominins are not equipped as carnivores are with sharp teeth, we can assume that they ate the tropical grasses and the sedges directly.’”

Did they really say that? One has to presume they did. Humans don’t have a diet rich in animal food? What supermarket do the authors shop in?

Needless to say, the authors are basing their argument on the commonly held, if probably erroneous, assumption that because “hominins are not equipped as carnivores are with sharp teeth,” they couldn’t bring down game. What the hell were they doing with the rocks and spears they were carrying around? The more reasonable assumption is that, indeed, our early ancestors were eating game, not grasses.

Think for a minute, Professor Lee-Thorp: what’s the likelihood of people surviving out on the treeless plains eating only roots and corms, with no means of protection, no ability to run away, and nothing to climb? Zilch. The only reason humans could survive out on the plains was that they were top predators. They were killers.

You’re misinterpreting your data, Professor.

Monday, December 17, 2012

Hobbit Forming

The pilfered picture above shows an attempted recreation of the features of a Hobbit from the isle of Flores. One look and anyone can tell it’s not real. That doesn’t look anything like Bilbo Baggins.

Monday, December 10, 2012

It’s Faith that Counts

It’s a matter of faith that people are categorically different from other animals. Not that we have spectacularly extended the skills given to us, but that we have acquired entirely new skills unavailable to the rest of the kingdom. This is not exclusively a religious faith; the social sciences have been very strict in admonishing people not to be anthropomorphic, not to extend human characteristics to other things, be they living or dead; an admonition which may be self-evident for inanimate objects but is much more debatable among living things.

Free-will is a case in point. Only humans are clearly marked with the burden of free-will, or so we’re told. It is something thought to have developed during the last evolutionary spurt that humans have had (if we’re not going through one, now). Unfortunately, upon closer inspection, it turns out to be a chimera.

The very concept of “free-will” implies—demands, really—at least two kinds of will, free and otherwise. The problem lies in disentangling the two. The closer one looks, the more the two kinds of will appear to be the same thing: the ability to choose. If one has an option to choose, all will is free. If one doesn’t have options, case closed. What the argument for free-will seems to be saying is that only humans are actually given choices; the rest of the kingdom has to follow innate instructions. We, on the other hand, are free, implying that we have no innate instructions, no internal behavior algorithms. In our evolution we have broken the bonds of rigid rules that one would otherwise be doomed to blindly follow because of one’s programming. We, for some reason, have developed the possibility of endless choice; we’re free to do whatever we want.

But are we? And what does it mean, free-will? Isn’t the very act of wanting something a determining force and not one of free-will? One can, perhaps, ignore one’s desires, but one can hardly be said to create one’s own desires. And can one ignore a desire unless there’s a stronger desire present?

This is where we have to step back a little bit and look at what people and the other animals really are. We have to take a look at consciousness. A leg of the free-will table suggests that, while all living things might have an awareness of their surroundings, only humans are self-aware, are self-conscious. Since no one has yet devised a way to test this hypothesis, it remains a hypothesis, but it’s firmly planted in the body politic. Quite how this self-consciousness would be different from the garden variety of consciousness is left to one’s imagination. A concomitant of that is the belief that only humans are conscious of their own mortality, another untested hypothesis. And, unfortunately, there’s no real reason to suspect that either of those assumptions is true other than pride: we want to think we’re different, that we’re special, that—well—we’re made in the image of god.

Fat chance.

From a scientific standpoint, it’s more accurate to say we’ve been created in the image of bacteria, of which we’re primarily composed and from which our origins, apparently, come. The technical difficulty with that observation is that it’s hard to tease out our uniqueness. Maybe we don’t have any.

There are a million obstacles to understanding everything about evolution. The minimum requirements are eating and reproducing. Obviously, sex was a huge step forward in creating complex creatures, but most of the rules of life were figured out before then. At the very least, one has to have enough consciousness to pick out what’s edible.

Not surprisingly, given the small amount of material to work with and the random, scattered nature of mutations, it took billions of years (literally) for the single-celled original creatures to figure out how to clump together to make multi-celled organisms. They couldn’t do it until they’d evolved methods of communication; they had to make agreements on how to cooperate. As far as we can tell, those methods involve chemical signals and electrical currents. Communication between cells—and by extension, ever-expanding clumps of cells—is chemical and electrical. And as far as we know, no other methods have been employed. And that’s true no matter how big or complex those clumps of cells become, even if they become human. In other words, all the internal communication in our bodies is done on chemical-electric pathways. All of it, including brain activity, especially brain activity. There is no meta-communications pathway; we all use the same simple methods employed by the original cells. All thought, all memory, is built up of those two elements.

Now, when cells get together to form a larger creature—us, say—do they have little garden parties to set out the rules of communication and responsibility? No, the rules are built into their DNA which packages the basic instructions. A copy of the DNA goes with each cell and directs how it is to perform. It’s all quite rigid and magical at the same time. So, when this clumped-being stumbles upon something edible, the cells don’t have a conference to decide whether or not to eat it. How to deal with something edible is built into its DNA, so when given a choice to eat or not eat something edible, it’s not really a choice at all. It has to eat it. What good would debate do the creature? None, so debate isn’t selected for in evolution.

Which brings us to “natural selection”; it helps to understand how it works. For one thing, it didn’t begin until after the invention of sex. It’s commonly thought that the selecting being talked about is the selecting that two members of a species do when they’re looking for mates, the theory being that the healthier, more successful members of a species will mate and thereby spread their genes to the rest of the population. Seems logical but, unfortunately, is not the case. The selecting isn’t done by individuals; it’s done across the entire species, and it’s very simple: if people carrying gene X outlive and out-produce their neighbors, gene X will eventually spread to the entire population. It’s immaterial who those individuals are or how competent they are overall; what’s important is the success of the gene packages they leave behind.
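That gene-spread logic can be sketched with a toy replicator model. This is purely illustrative; the 1% starting frequency, the 5% reproductive advantage, and the generation count are arbitrary assumptions, not data from any study:

```python
def gene_spread(start_freq=0.01, advantage=0.05, generations=200):
    """Track the population frequency of gene X when its carriers
    out-produce non-carriers by a fixed reproductive advantage."""
    freq = start_freq
    history = [freq]
    for _ in range(generations):
        # Carriers contribute offspring in proportion to (1 + advantage),
        # non-carriers in proportion to 1; renormalize for the new frequency.
        carrier_share = freq * (1 + advantage)
        freq = carrier_share / (carrier_share + (1 - freq))
        history.append(freq)
    return history

history = gene_spread()
# A gene starting in 1% of the population ends up in the overwhelming
# majority, regardless of which individual carriers lived or died.
```

Note that nothing in the model cares who the individual carriers are; only the differential success of the gene package matters, which is exactly the point above.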

Which leads us to an understanding of who’s running the show. It’s not us. It’s the successful gene packages. It’s the DNA. Effectively, the individual members of any species are only the hosts by which the DNA reproduces itself. The DNA is immortal, we are transient. Evolution is done at the DNA level, we have nothing to do with it. And it’s random. Those things are true of all species.

Meanwhile, the goals and operational methods remain the same: eat/divide; chemistry/electricity. And they never change; they’re a constant through all living things.

But as the organisms become more complicated, the controlling algorithms become more complicated; new features are periodically added. Empathy, for example. Bacteria probably don’t have much use for empathy, but by the time you get to big animals, empathy starts to become useful to the survival of a species; hence, it gets selected for by the DNA. Ditto love, which is very similar to empathy. Love is empathy with attachment. Many species, evidently, use love to protect and foster themselves. Humans, of course, are reluctant to attribute love to other animals, but it’s hard to see how other animals’ bonding mechanisms could be significantly different from ours. It’s more likely that we’ve employed an existing model than that we invented the wheel all over again.

It’s the same with any other human characteristic that we don’t think of as DNA controlled: inquisitiveness, say. Obviously, inquisitive people were, in the long run, more successful than those willing to live with whatever showed up. Artists? Somehow as a society we valued them enough that we made sure they were well fed and prospered and hence DNA selected for art. (Where would theater and fiction be without empathy? Do theater and fiction foster empathy and that’s why they get selected?) Those skills, the skills of artist, tinkerer, explorer, weren’t ones people invented out of whole cloth; like everything else, they are an evolutionary adaptation to life. They are, you could say, accidental. Useful, but accidental. They got selected for because of their utility, but that they arose was the result of random mutation.

What’s important here is that no one chose the skills, these attributes. No one chose to be empathetic or in love. Likewise, no one chose to be an artist or a tinkerer, they were selected for by DNA, not the individual. No one chose to be born. No one chose when to take their first breath. No one chose when to utter their first word. And in the end, no one chose to become President of the United States. All those things happened because of evolution and the iron-clad laws of the Universe.

What confuses people, in my estimation, is a poor understanding of consciousness and self; we tend to conflate the two. We fail to see that consciousness is an organizational tool that evolved to allow cell clumps to operate in real time. In other words, the clumps had to find ways to operate as a unit rather than as individual cells, and one of these adaptations was to create a consciousness that unites the entire cluster. The consciousness, of course, wasn’t the cluster but was the tool by which the cluster functioned. Consciousness is a large component of food gathering. It enabled the organism to make rapid decisions when faced with options, which one constantly is. In order for consciousness to function in real time, it has to think, unequivocally, that it is in control, that it is making the decisions. If it truly had to think about the decisions it was making and how it was making those decisions, it would be paralyzed. It would starve. Hence it’s designed to function in real time as a single entity, though it’s nothing of the sort; it’s an agreement between billions of cells and DNA. The conscious self is, in truth, an illusion that enables the individual to function.

But that’s all it is, a tool; it doesn’t actually do anything. It’s merely a data-input device. It controls nothing, neither your movements nor your thinking. One may think that their consciousness is telling them to run when they see the lion coming towards them; but, in truth, it’s the internal algorithms which are telling the person to run; but they’re doing it in real time, so it looks like the consciousness is actually doing the thinking; but no, all thinking is done at the subconscious level. (Got that? All those “buts”?)

So free-will? Where in this mess of evolution would free-will come from, and how would it be different from the existing choice model? What does free-will really mean? Does it mean that no algorithm was applied to make a decision? Or does it mean that a new algorithm was applied? And how was that algorithm created? Does free-will imply that people can create new algorithms out of nothing? And why would they make a new one rather than use another unless they had a preference, which would itself override free-will?

This would be all well and good if the algorithms were accessible to the consciousness, but they’re not. There is no free-will in setting up the algorithms of love, empathy, lust, hunger, fear, pride, etc. Each of us is only what our combination of mutable instructions produces. None of us creates or directs those instructions. None of us creates our own algorithms, because the algorithms don’t belong to us; they belong to the DNA. Consciously, we do nothing. Consciousness is only a monitoring device; it doesn’t have directive capacity. That’s done internally and subliminally; we don’t have access to it.

Therefore, to state that we have free-will ends up being meaningless. Our consciousness, it turns out, has no will of its own, and all will is directed by internal algorithms. Who would have imagined? Not the average person, because they cling stubbornly to their self-identity. No one wants to be just R2-D2. No one wants to be HAL.

Yet there it is: how is free-will different from any other kind of will? And, even if you thought decisions were really a conscious act, would it be possible to perform any act that you hadn’t thought about in the context of yourself, using no criteria to make the decision? Because, if you applied criteria, then the criteria ruled the decision; and if you overruled the decision to express your free-will, then another set of criteria will have been brought into play, no? In the end, either decisions are governed by algorithms (criteria, circumstances, etc.) or they’re random. Then the question becomes: is it really random, or does it just look that way? But, seeing as thinking is subliminal, the debate is moot.

Mostly, though, it’s moot because of logical necessity. There is no will outside physical reality. Like the soul, there’s no mysterious force out there separate from the physical universe. Thinking, in the end, is the old communication by chemistry and electricity; there’s no room for any other force. There’s nothing like “will” floating around the Universe.

The trail goes:

Apparently, all intracellular communication is done by chemical-electrical means. This is true for all living things, animal or vegetable. At the molecular level, all elements in a living body strictly follow the laws of chemistry and physics. As far as we know, all thought is generated out of the same chemical-electric matrix; meaning, inevitably, that anything that could be considered “will” comes out of it, as well. Any choice made has to be generated out of that matrix, following strictly the laws of chemistry and physics. There is no provision in physics for a fifth force, that of free-will, to be generated.

Free-will is always posited as something, if not unique to humans, then restricted to a very few animals. Presumably, the other animals have predetermined wills. But forming a new type of desire implies a new force, a deus ex machina; and it’s hard to see how there could be two independent types of will.

One wonders, for example, whether all choices are either free-willed or innate. Could a person have within them both free and non-free will? The reaction to duck, for example, would seem mostly innate; although one can stand one’s ground and get clobbered. Exactly when did free-will arise in the evolution of animals? Why was it selected for? How does free-will improve one’s survival rate? All those problems pile up until it becomes clear that there are not two types of will, two strategies for making choices in the world. It becomes clear that human neurophysiology functions the same way all other animals’ neurophysiology operates.

I recently had a long debate about this via Facebook with my son. He is of the free-will school and argued for it vigorously. He’s very bright and articulate, but he could never see the mystical nature of free-will, its magical appearance on the stage with people. He cleaved to the belief that somehow people had grown either a second will or an entirely new one, though the mechanics, the reasoning, and the functioning of it he was never able to explain. In the end, he exclaimed that I was arguing against free-will as a way to “excuse humans for their sins”; to which I replied, “But of course.” Sins are an invention of religion as much as free-will is; they aren’t real either. I don’t think my son considers himself at all religious, even though he holds to those religious beliefs of sin and free-will. It’s hard to escape the reach of Christianity.

Free-will fails because it has no definition; no one knows what it is.

The farming addendum:

A classic example of misunderstanding the nature of natural selection occurs with the spread of farming, something which has bothered researchers for a long time. As Jared Diamond ably demonstrated, farming was decidedly bad for the individual; so the question has long been posed: why was it adopted if it was so maladaptive?

Because, of course, it wasn’t maladaptive. It was only maladaptive for the individual, and not across the board, at that. The rich did quite well, thank you. More importantly, the species prospered even if the individuals did not. Farming most likely didn’t spread because people saw its advantages and adopted it; farming spread because it grew so many people and organized them so tightly that they could whomp the shit out of the hunter-gatherers. Think of the Europeans coming to America. Think of the Americans moving west. Both were done, essentially, behind armies. The difference between the Pilgrims and the Huns is who wrote the history.

Why the first group adopted farming will probably be forever debated; but once it got going, it was a juggernaut, and soon the people who were predisposed to a settled agricultural life swamped those who were not. The hunter-gatherers got selected away by the species.

The lesson to take away is that evolution happens to the species, not the individual.

Sunday, December 2, 2012


“Analysis of skeletal remains found in an island cave in Favignana, Italy, has revealed that modern humans first settled in Sicily around the time of the last Ice Age and despite living on islands, ate little seafood.”

This from 2/12/12; “Study Reveals Origins and Food Habits of First Sicilians”

An interesting, anomalous finding. The dates we’re talking about are “19,000-26,500 years ago when sea levels were low enough to expose a land bridge between the island and the Italian peninsula,” according to Dr. Marcello Mannino of the Max Planck Institute for Evolutionary Anthropology, lead author.

It’s hard to know how to place this study in the context of such early findings of human habitation on Crete, dating back more than 100,000 years. Why such a radical difference for two Mediterranean islands, especially when Crete is so much farther from the mainland? All people getting to Crete had to do so by boat/raft of some sort. If they could get to Crete, why not Favignana?

The abstract says: “This dietary change was similar in scale to that at sites on mainland Sicily and in the rest of the Mediterranean.” I’m not sure what they mean by “scale.” This was said in the context of a “slight increase in marine food consumption from the late Pleistocene to the early Holocene.” I don’t know if the authors are saying that the rest of the Mediterranean peoples ate a meat-heavy diet, as well, or that the shift in diets was slight in the rest of the Mediterranean, too.

Is this true of the folks on Crete, too? What did they eat? And where did they sail from? There are so many parts of the story yet to be filled in. Has anyone done a dietary comparison for all fossil remains? What do we think those piths and erecti were eating?

Who were these Sicilians? Where did they come from?

Friday, November 30, 2012

Spear-Fishing Orangutans

This is old hat, 2008, but I hadn’t run across it before: the spear-fishing orangutan. He’d observed people fishing and decided to give it a whack. The report said, “Although the method required too much skill for him to master, he was later able to improvise by using the pole to catch fish already trapped in the locals’ fishing lines.”

What I find remarkable, as much as the orangutan’s ability to figure out a complex problem, is the fact that it was a fish-eater. Orangutans eat fish? Who knew? Did they learn it from people? Gee, what could a couple million years of fish-eating do to the orangutans? How about if we threw in a few bush babies?

Process that along with the image of the orangutan mother filmed taking her baby into the canoe for a paddle on the pond. She undoubtedly learned that from people, too. How much can an orangutan learn? I’m impressed.

Why I Skipped Grad School

The fellow was giving an informative and amusing TED talk about animals and was discussing a kind of flightless New Zealand parrot when he made the common observation that New Zealand had numerous flightless birds due to lack of predators. It appears the islands have never been connected to the other land masses so that whatever lives there either flew or hitched a ride; and, apparently, no predators did, only birds.

That’s when my convoluted thinking stepped in. Hmm, I thought, strange that no bird there had decided to become predatory. Step two: stranger still that no hawk or eagle had taken to flying over with the rest of the birds; didn’t they have them in Australia?

Well, yes, they do have them in Australia, it turns out. Only then did it occur to me to check out New Zealand where I should have begun the search, and, yes, New Zealand does have hawks and owls. Always has had.

One can only presume that it was a more effective means of escape to hide than to fly for many New Zealand birds, but predators they have and had.

Tuesday, November 27, 2012

Child of Grace

Original grace, not original sin.

What a twisted concept to believe that children are born into sin and that someone 2000 years ago died on their behalf. That’s a hell of a burden to start someone out with, don’t you think? Where did that bizarre notion originate? I find it much more likely that children are born pure and are shaped by their environment. If they turn out not so good, who ya gonna call? And what are ya gonna do about it? Seems fruitless to blame the kid. Either he/she started out with less than a full basket of eggs or some got cracked along the way.


A species divided against itself is a contradiction of nature; it cannot last. The idea that we are any less than one species is, not only false, but surely temporary.

The human species is remarkable, not for its warfare or destructive capabilities, but for its capacity for cooperation. Aside from ants, termites, and bees, not many animals are capable of organizing thousands of their members for a single task. The world as we see it today is a marvel of human cooperation, voluntary or otherwise.

Which is only to be expected. Any species is, essentially, a DNA stream broken into millions of components. Components, yes, but still integral with the whole and not separate from it. Individual species units are more like detached toes than self-sustaining entities. It is the ultimate goal of any species to further its entire self, not just parts of it. It expects losses, which is part of the reason it divides into so many units, but it doesn’t encourage them. In the main, species devise ways of cooperation and social interaction between members; to do otherwise would be inviting disease into the organism. Species do seem to foster competition between members, but not usually to the death, and usually more to keep the species vigorous rather than complacent in the quest for food and staying alive. Competition, apparently, is a way to test the viability of mutations. In the end, the only thing that wins is successful breeding over generations. It’s not who wins the lion’s share that’s important; it’s who remains to have any shares after the dust has settled. The survivors are the winners, no matter how they got there.

The recognition of “oneness” appears to be spreading across the globe. My kids and my kids’ kids are okay with the world. They aren’t worried about being taken over by anyone, other than by too many people; and I try to convince them that that’s not a problem. (Okay, that’s not entirely true; they are concerned about mega-corporations.) It’s not that they’re oblivious to color, but rather that they can’t see a reason to rank sunsets versus sunrises. We’re a family where you can be anything you want except a Republican.

And we’re not unusual, much less unique. Granted, that may not be true if you’re living in Dallas or Indianapolis, but it’s certainly true in this part of the world; and if the polls are correct, it—this wanton permissiveness—seems to be creeping into the young everywhere.

I’m thinking maybe overpopulation had something to do with it. Certainly, moving into cities did. Rural divisions break down in the city. Once you hit Portland, we don’t care if you’re from Montana or Iowa. We really don’t. And soon, no one does. It’s a world-wide phenomenon.


It would seem that overpopulation rarely annihilates a species. More often, it decimates it instead. (Okay, worse than decimate, because, technically, that’s only ten percent.) The effects of overpopulation, especially in face of a food-source collapse, can be catastrophic. Usually, though, enough members survive to form a recovery population that can grow with the rebounding food-source. Evolution is a kaleidoscope of change punctuated by periods where the scope is emptied of stones and new ones entered.

That being said, the human race will never knowingly erase itself. By accident, no problem, but willingly, no. The trick is avoiding the accidents.

The good news is that, all things considered, we’ve done a pretty good job of avoiding them so far. Seven billion of us, right? Couldn’t have gotten that big by killing everyone in sight, now could we? However you cut that cake, we’ve been a magnificently successful species. Us and tomatoes and Norway rats.

To a large extent, being a highly social species has been our success. If we weren’t inclined towards getting along with our neighbors, we wouldn’t have figured out how to get so populous. We not only had to figure out how to produce more food, we had to figure out how to distribute it.


Empathy and love.

I was set to thinking when I heard the biologist suppose that the love felt by geese for one another was the same as the love people feel for each other; why invent the same emotion twice? Wasn’t love just an evolved bonding mechanism that increased the success rate of one’s offspring? We tend to think that we’ve invented emotions with our minds, rather than their being primitive control mechanisms that function as survival aids. I’d vote for the latter.

And what is empathy other than love for our fellow humans? Empathy is more than just feeling another’s pain; it’s caring about that feeling. The torturer is well aware of the damage he or she is inflicting; they don’t care. Empathy goes beyond knowledge; it goes to accepting some of the suffering unto one’s self. It is recognizing the essential oneness of all humanity, that the overriding importance is our unity as a species. It’s not hard to imagine that the more empathy a tribe recognizes, the better off its members are.

It is hard to see empathy as anything other than a more generalized form of love; it, too, is a primitive emotion operating below the radar of awareness.

Love, indeed, makes the world go round. We’d never have made it this far without it and we’ll need a lot more of it to convince everyone to stop killing each other for whatever reason. Once again, it’s time to thank God for evolution. Without evolution, we’d still be boiling in the primordial soup.


Natural selection is what’s causing the world population paradigm shift. It makes no difference what sets up the conditions—moving to the cities, cell phones, whatever—DNA doesn’t care. DNA only cares about what succeeds. Those families more successful in bringing their offspring to fruition are more likely to perpetuate themselves. For millions of years, survival was dependent on producing as many offspring as possible. Large families tended to surpass small families, especially in farming communities. Hence, the human population exploded.

Now, though, it is most often more expensive to have a large family than a small one. Offspring in smaller families, where the resources don’t have to be spread so thinly, tend to do better than offspring from larger families. Large families are now a burden to the parents, not an aid; hence we’ve automatically stopped having them, irrespective of what government thinks or wishes. Nothing like a ghetto to inspire birth control. Those same economic realities now play out in rural areas as well as urban, so rural family size is also falling.

A word of caution: capitalism is a large Ponzi scheme; it depends on an ever-growing base. Surely you’ve noted that all economic indicators measure market growth, not accomplishment. The market report never tells you what a company did that day, only how well its stock performed.

Have you heard of the man who makes Wensleydale cheese? He readily sells all the cheese he makes; he doesn’t begin to cover the demand. He’s constantly pressured to increase his capacity, but he steadfastly refuses. His argument is that he makes enough money and has all the work he can handle, so why should he do more? If you want more cheese, go make your own.

Now, what sort of capitalist is that? None at all. He’s happy to be a great cheese-maker making all the money he can comfortably spend. For some reason the god of reason hasn’t touched him and he’s content with his life. He doesn’t want more. It’s unfortunate, but true.

As a consequence, you won’t find his stock on the exchange. His stock is his skill.

What I’m suggesting is that it might be possible to have a society built around creating great things and doing great things but not necessarily on the principles of getting bigger and huger. A society where people can expect their just rewards but not the rewards of their neighbors. I’m also suggesting that, if we reach a population maximum for this planet and our population begins to decline, we might need a different paradigm than capitalism, no matter how successful it is at amassing resources at this time. I’m just saying…

Nor am I saying that I have a replacement. My only advice would be to talk about it before rushing into anything. There doesn’t seem to be a good meta-answer on the horizon; we’ll have to be content with working on the little things, like a universal minimum wage.

Oops, there I go again.

Wednesday, November 21, 2012

Reaganomics and God

The train started rolling when I was thinking about Reagan. I was wondering how one of the worst presidents of all time could be so revered. Still. Here he was, a senile blob to begin with, and he’s still treated as some sort of god.

Ah ha! The bells went off: “Some sort of god.” Exactly. It’s the believer syndrome: ignore reality, go with faith. Trying to talk sense to a Reaganite makes as much sense as trying to talk a believer out of god. As they’ll carefully point out to you, it’s not a question of who Reagan really was, it’s how he’s perceived. Just as it’s not a question of whether or not god exists, it’s faith that counts. (“Keep the faith, baby; I’ve got no use for it.”)

That pushed me into thinking (again) of the tenacious hold believers have on, especially, the American body politic. What keeps fueling that?

More bells!

Money. If you have a product to sell, you don’t want the person who needs your product; you want a customer who believes in your product. You need a customer who will buy your product regardless of what it really does. You want to sell the perception of your product, not the function. I’m an Apple person; you’ll never squeeze me into a PC. You want your customer to have faith in your product.

The last thing you want is a skeptic, a doubting Thomas. What you strive for, therefore, is as large a body of true believers as you can find. It makes no difference what they believe in; what you want is not what they believe in but their capacity to believe. If they believe in one thing, they’re much more likely to accept something else on faith, too. Especially if the first belief system endorses the second.

Functionally, this leads anyone with a product to sell (capitalism, anyone?) to co-opt whoever leads the local belief system, which in America’s case means sucking up to the priests of the children of Abraham, Christians in particular. Fortunately, those who have a product to sell have the same goals as the leaders of the faith: money, power, young girls or boys. The usual stuff. It solves the problem of getting into bed together if you’re all in the same bed to start with. Also fortunately, both the leaders of the faith and those with a product to sell know that the belief is a sham, which makes it doubly easy for them to work together: no one has to disguise their true intent.

If you’d like, you can think of belief systems—religions—as marketing schemes. Hey, they work.


Which brings me to shopping. People like to buy things. People like to go shopping. What for is less important than the act of shopping. People are perfectly happy to go shopping with nothing particular in mind; they’re simply going shopping. Often as not, they’ll end up buying something they don’t need, just to justify the act of shopping. In fact, most of what they/we buy is unnecessary.

Why do we continually need the new object? Is the new that much better than the old?

I would posit that the act of buying is the reaffirmation of one’s faith; it’s an expression of kinship with one’s neighbors. It’s an expression of the herd instinct. One wants to be like everyone else; and, if everyone else is shopping, then shopping you’ll go. If everyone is going to church, then you’ll go to church. If everyone at your church is buying product X, then you’ll buy product X, too. If I’m selling product X, I want to get endorsed by that church and as many churches as I can. God says buy me. I want to tap into that herd instinct. I want my product to be above reproach. I want my product endorsed by God.

Welcome to the American political scheme.

I wouldn’t offhand say that American politics and religion walk hand-in-hand; I’d say it’s more a combination of hand-in-pocketbook and hand-in-pants. This is where the “different strokes” come in. Did the earth move for you, too?

Monday, November 19, 2012

Fuel On the Fire

Science on

“Sorry, vegans: Eating meat and cooking food made us human: High caloric intake enabled brains of our prehuman ancestors to grow dramatically”

By Christopher Wanjek

“The new studies demonstrate, respectively, that it would have been biologically implausible for humans to evolve such a large brain on a raw, vegan diet and that meat-eating was a crucial element of human evolution at least 1 million years before the dawn of humankind.”

The last part of this statement reflects the paleo-anemia reported upon earlier. The first part, the implausibility, is from a study at the University of Rio de Janeiro. Wanjek combined the two results for an appraisal of the findings.

I find it interesting that these reports consistently come out, yet there is so much resistance to the hypothesis that people stood up to carry weapons (and, subsequently, food).

The article also reports:  “Humans have exceptionally large, neuron-rich brains for our body size, while gorillas — three times more massive than humans — have smaller brains and three times fewer neurons. Why?

“The answer, it seems, is the gorillas' raw, vegan diet (devoid of animal protein), which requires hours upon hours of eating only plants to provide enough calories to support their mass.”

I know, I know; everyone wants to be a gentle vegetarian like, say, Hitler, but the truth is, it’s ruthless meat eaters like the Dalai Lama who have led the species out of the jungle.

The good news is, we’re pretty sure, now, that the first restaurant was a barbecue hut.

Where are the cars?

This is Khovd, Mongolia, in the Altai Mountains, courtesy of Google Earth. Those little mushrooms down there? They’re yurts.

Saturday, November 17, 2012

Sail on

"Ancient Mariners: Did Neanderthals Sail to Mediterranean?"
Yahoo News 11/17/12

“For instance, stone artifacts on the southern Ionian Islands hint at human sites there as early as 110,000 years ago. Investigators have also recovered quartz hand-axes, three-sided picks and stone cleavers from Crete that may date back about 170,000 years ago. The distance of Crete, about 100 miles (160 kilometers) from the mainland, would have made such a sea voyage no small feat.

“The exceedingly old age of these artifacts suggests the seafarers who made them might not even have been modern humans, who originated between 100,000 and 200,000 years ago. Instead, they might have been Neanderthals or perhaps even Homo erectus.”

I’m throwing this in as a reminder of my hypothesis—of which you need no reminder—that we descended from the trees to the water’s edge some millions of years ago, where we still live. I regularly toss in the most recent archeological finds, which—surprise, surprise—never contradict that theory; au contraire, they always line up in support. As if they had a choice.

Needless to say, living by the water doesn’t mean making boats, but it sure makes making them easier. I would imagine that rafts came shortly after the knot. I would suspect that they came into being long before modern humans. Evidence from Flores and Australia supports the findings from Crete: that people were mariners prior to modern humans.

I’m just saying…

Thursday, November 15, 2012

Push 'Em Back, Way Back

Popular Archeology
Vol. 8 September 2012
“Stone-Tipped Spears Used Much Earlier Than Thought, Say Researchers”

“A University of Toronto-led team of anthropologists has found evidence that human ancestors used stone-tipped weapons for hunting 500,000 years ago - 200,000 years earlier than previously thought.”

For those of you counting, that’s a 40% increase in time span. The points they are showing are sophisticated, thin, well-made ones. This is not likely a new technology. Bear in mind that, given the reality of the bell curve, fossil finds are most likely to come from near the top of the curve, not an end. That holds true no matter how far back you find the paleo-remains. It means, for example, that one can hypothesize an incipient date for such hafted weapons (which is what they’re talking about) no less than 200,000 years previous to this new date of 500,000 BCE, and most likely much earlier. If the incipient date were pushed back to 1,000,000 years BCE, it would not be surprising.
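The sampling logic here can be sketched with a toy simulation. The numbers below are purely hypothetical, assuming for illustration that a technology was in use from 1,000,000 BP to 200,000 BP; the point is only that the oldest of a handful of dated finds almost always falls well short of the true starting date, which is why first-appearance dates keep getting pushed back as more sites turn up.

```python
import random

random.seed(42)

# Hypothetical usage span, in years before present (not from the article).
TRUE_START = 1_000_000  # true incipient date of the technology
TRUE_END = 200_000      # when the technology fell out of use

def oldest_find(n_finds):
    """Scatter n dated finds uniformly across the usage span; return the oldest."""
    finds = [random.uniform(TRUE_END, TRUE_START) for _ in range(n_finds)]
    return max(finds)

# Average the "oldest known find" over many simulated excavation histories.
trials = [oldest_find(5) for _ in range(2000)]
average_oldest = sum(trials) / len(trials)

print(f"True incipient date:        {TRUE_START:,} BP")
print(f"Average oldest of 5 finds:  {average_oldest:,.0f} BP")
```

With only five finds, the expected oldest lands around five-sixths of the way back through the span, roughly 870,000 BP in this toy setup, still short of the true 1,000,000 BP start. More digging moves the estimate back toward the truth but, on average, never past it.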

Friday, November 9, 2012

Without a Paddle

Obsessions. I’ve been watching too many documentaries. A lot of religious stuff; my obsession. It’s what you get for being a preacher. How does one get to be a preacher, you might ask? It’s a calling. You don’t get to be anything; you’re what you’re called to be.

Who does the calling? That’s open to conjecture.

Let me define religion: a myth-based system through which one interprets one’s existence and experiences. Religions do many different things, but that’s the core of what they are; everything hinges on that, the myth system. This is true of all religions; it’s what defines them. It’s not to be confused with the religion invoked when asked, “What do you believe?” That’s another use of the word, related but not identical. If one practices a religion, i.e., believes in that religion, it means one uses the precepts of that religion to make sense of one’s experiences.

My observation is that the world divides over the definition of “religion.” The fight is over whether or not religions are myth systems. If one is a believer, one has to hold that at least one religion might not be a myth system: theirs.

Yet all religions, by their nature, are myth systems; they cannot not be. What is not understood here is that the origins of a myth system have no bearing on the functioning of the system. There can be no true or accurate beginning of a myth system, even if its history is well documented. The origins have nothing to do with the effect of a myth or how it functions as an organizational apparatus. Hence, the search for the “historical Jesus” is a Sisyphean battle, one that can never be won. It’s an entire industry built around searching for a chimera. (And they take it and themselves so seriously. On the other hand, it’s always fun to see the Emperor naked.)

What I find fascinating is that the trapped have no idea they are trapped. It’s the Stockholm syndrome writ large, one comes to agree with one’s captors. Think of the slaves. No, not the Christian slaves, the black ones, the guys from Africa. They were more than happy to adopt the faith of their oppressors. Glory hallelujah! Have to this day. Once you’re inside a faith, you can’t see out. Which is only reasonable because, if you could see out, you might lose your faith, and then where would you be? Yes, and without a paddle.

What if one’s faith is in a vague, holy spirit that permeates the Universe with, with, with… whatever? What if? So…, what if? How about, “Don’t you believe in something bigger than you? Bigger than us?”

Are those beliefs, or are they a reason to wear flowers in one’s hair and burn incense? Maybe take a sweat with some Native Americans; boy, do they have a myth system. There is another abiding quest: to make the vision of god so sophisticated that it can’t be disputed. The ages have been filled with ever more sophisticated reasoning as to what god could be like; only those doing the more and more clever reasoning, evidently, don’t understand that the increasing sophistication of a myth doesn’t make it less of a myth. Myths are like pregnancies: they either are or they aren’t. There’s no such thing as a three-month myth or a third-term myth; a myth is a myth; they’re sort of clean that way.

What have I learned from those documentaries? That we’re never going to dent a believer. They either have to have their own catharsis or die of old age or from falling off the edge of the Earth. It could happen, you know.

Thursday, November 8, 2012

Guilty As Charged

The attacker of U.S. Representative Gabrielle Giffords and killer of six, Jared Loughner, was sentenced to life imprisonment today. The judge commented that this was only the finalization of the legal proceedings, the emotions would, perhaps, never heal. Juxtapose this alongside the Anders Breivik verdict in Norway where Breivik killed seventy-seven people. He was found guilty, as well. For reasons I find inexplicable, it was important for the Norwegians to find him guilty rather than insane.

These cases are extreme, but they highlight the problems of dealing with crime, beginning, of course, with the very definition of crime. Suffice it to say that crime is not a fixed commodity, unassailable and invariable from society to society, but rather a consensual process of a people or a polity. Crime in a totalitarian state is very different from crime in a tribal society. What one does about crime differs widely depending on where one places the origins of crime and what one wants to accomplish when dealing with crime and criminals.

For the sake of argument, let’s define “crime” as an infraction of the Basic Rule, aka the Golden Rule: someone does something to someone that that person didn’t want done to them. Simple enough, if difficult to translate into practicality. A crime is an injury to another person (which, by extension, covers the environment). Despite the complexities of the law, we all have a basic understanding of right and wrong within the context of our culture. We are born with an innate sense of right and wrong, which is shaped by our environment.

I’m not here to argue, today, about what constitutes a crime, though; I’m more concerned with how we react to crime as a culture, as a people. The first thing to be decided is what one (or we) wants to accomplish when confronting crime and its perpetrators. The simplest desire, and the route we have chosen for the most part, is revenge, punishment. Crime is seen as an individual decision, and the idea is that one should suffer commensurate with one’s offense. Or more or less, depending on one’s station in life. It’s most clearly expressed in the eye-for-an-eye, tooth-for-a-tooth philosophy. It’s easy to understand and satisfies that longing for revenge, for getting even. And since one can’t get even with one’s boss or the policeman or the bureaucrat, one can at least send some bastard to the hole for life. Take that, you scoundrel!

If you’ve been wronged, wanting pay-back is easy to fathom.

But one might have other goals. If the crime didn’t happen to you, getting even might not be so important. Revenge might not be as important as protection. “I’m sorry that it happened to you, but I don’t want it to happen to me. In fact, I’m more concerned with it not happening to me than I am with your getting revenge.”

That’s a whole different ball game, focusing on protecting one’s self against crime, rather than on punishing the criminal. If one looks at crime as a problem to be solved, not a behavior to be punished, one will approach it entirely differently.

To be sure, the reigning paradigm throughout history has been that people should be punished, not only for their transgressions, but to scare the shit out of them so they won’t do it again. Setting aside whether or not people should be punished for their sins, the question remains: is scaring the shit out of people effective? One could ask that on a larger scale: since crime has historically gone down through the centuries, does that mean people are more frightened of punishment now, or did crime go down for other reasons? It doesn’t seem that the punishments of today are particularly worse than the punishments of, say, a thousand years ago; and, if what I read of how they did things in those days is true, then things today, water-boarding excepted, are better than the rack or burning at the stake. Besides that, I think they’ve done more than enough studies to understand that crime and punishment aren’t as interdependent as one would think. Other than that punishment creates crime; this they know. If one wants to create a criminal class, America has written the textbook on how to do it. Punishment is not much of a deterrent, but it’s an effective generator of crime.

This is where the dilemma of dealing with criminals comes in: how much crime is one willing to suffer to be able to extract revenge? How important to one is the ability to extract revenge? This is not a trivial question, because every degree of punishment that society metes out is returned ten-fold. Vengeance is very expensive. Punishment has an enormous trickle-down effect.

Ultimately, how one deals with crime revolves around the arcane and oft misunderstood subject of free will: whether one subscribes to it or not. In simple form, it goes like this: if there is free will, punishment is morally acceptable, even if not effective; it satisfies moral indignation. On the other hand, if there’s no free will, punishment will never be successful in the long run nor effective in the short. Furthermore, that view finds punishment immoral. (How can one be punished for something over which one has no control?)

The question of free will hinges on how one understands the thinking process to take place. One school claims that all thinking is done on a subconscious level not currently available to human understanding. The other school, well, the other school doesn’t know how thinking actually works, but they’re quite sure they do it.

(Sam Harris has an interesting exercise he likes to do with audiences to explain the subconscious nature of thought. He asks everyone to imagine some famous person in their mind. Got one? Good. Then he asks, why that person? Why did you think of that person, rather than someone else? How did you decide which image to draw up from your memory bank?)

The trap people fall into is conflating words with thinking. Because much of thinking is translated into words, we tend to think that the arrival of words in our consciousness is the result of conscious choice, as if we looked at all the synonyms and chose the one closest to our needs, rather than simply grasping the first word to come out of the air. If we had to actually choose the words that make up our sentences by selecting them one-by-one from known vocabulary lists, we’d never be able to talk and thinking would take forever. Thinking, like talking, is done without conscious intervention. When one sits around scratching one’s head waiting for an answer, that’s exactly what one is doing, waiting for an answer; and the answer, when it comes, simply pops into one’s head without warning. Where was the thinking? Deep, deep down where only elves can see it. Not us.

And should you wonder, I’d argue that all creatures think, words or not. I’d also argue that none of us knows how we do it.

Breivik and Loughner, are they guilty? They certainly did what they were accused of. Were they insane? That’s as sticky a wicket as any other, what’s “insane”? Their actions certainly weren’t helpful and they definitely contradicted the Basic Rule and they probably need careful monitoring for the rest of their lives, but I wouldn’t spend a lot of time arguing about definitions. I’d get right on to the practical questions of how to house, feed, and make use of them for the rest of their born days. But vengeance? I’d give that a pass. Too expensive and doesn’t work. And, you know, it doesn’t even feel that good.

Tuesday, November 6, 2012

God of the Gaps

Professor Winston; I didn’t get the rest of his name. The show was a BBC production, The Story of God: The God of the Gaps. Prof. Winston was examining the question of god from the viewpoint that he (God) had been relegated to the gaps in scientific knowledge; a curious argument for Winston to make, as he is an avowed believer in God. Although perhaps not so strange: he also presented himself as a firm Darwinist and follower of orthodox science; he was no starry-eyed dreamer. Winston has a tooth-brush mustache à la Groucho Marx and shares his Judaism. Other than that, he seemed less of a wit and more kind.

Winston’s prime argument, a common one, is that people have two sides—spiritual and rational—and that they serve different human needs. All the while, mind you, demonstrating how the concept of god has withered through the centuries to be left only with this vague spiritualism of which he speaks. All fine and good, but he never got around to defining what this spiritual need is. He finished by pointing out that scientists don’t know what came before the Big Bang and religious folk don’t know what God is like, so they are both founded on uncertainty.

Well, yeah, sort of. No.

For one thing, in the end he compared the two views anyway; he didn’t leave them to their separate fields. He tacitly acknowledged that religion does concern itself with the knowable, not just the imaginary; and that where the two collide, religion disappears. Quite why he wants to hang onto the thread of religion is not examined.

Along the way he visited a statistician who, in theory, could give a statistical basis as to whether or not God exists; but, it turns out, could only give a statistical basis for what one thought about God, not about the likelihood of God itself. This fellow had an algorithm into which he inserted variables for how likely one thought various proofs for a god were to be correct. Winston, for example, thought that the existence of love counted in favor of the likelihood of there being a god. Winston had a couple of assumptions like that; so, according to his beliefs, there was a 96% chance of there being a god. How that was anything other than confirming what he had already told us (that he believed in God) was not explained; but it seemed to have been inserted into the film as positive evidence for the likelihood of there being a god. In the category of “Well, we don’t have any stronger evidence, so we’ll go with the belief algorithm.” Winston believes strongly that there is a god, and he’s a reasonable and pleasant enough fellow, so there must be something to it, right? But is that scientific?

If nothing else, it’s a superficial understanding of love. Thinking that love is anything other than an evolved emotion belies either wishful or limited thinking. Winston didn’t explain how attachments formed by other animals are categorically different from human attachments; he simply assumed so. Isn’t that a form of reverse anthropomorphism, thinking that humans are categorically different from other animals? Is it logical to think that our increased brain capacity has led to new types of emotions? Is it our intellectual capacity that governs what moves us emotionally, such that animals without our capacity don’t have our emotions, that our emotions evolved out of our intellect? Is it reasonable to base one’s belief in a god on a single emotion?

Winston never examined why he chose love as evidence for God any more than, say, anger or hunger or frustration. Nor did he explain where love came into the chain of existence and why. Is love here to prove that there’s a god, or is there a practical side to love?

I don’t know; to me it just seems silly. All that time and effort put into making a documentary where in the end a guy shrugs his shoulders and says, “I think there’s a god because I want there to be a god.” I just don’t find that a convincing argument, no matter how nice a guy is that makes it.

Sunday, November 4, 2012

I'll Bet Pascal's Wrong

Pascal’s Wager

Wikipedia: “It posits that there's more to be gained from wagering on the existence of God than from atheism, and that a rational person should live as though God exists, even though the truth of the matter cannot actually be known.”

I’ve heard this put forth many times, most recently on a BBC documentary on the “god of the gaps.” What I’ve never heard anyone put forth is the logical fallacy; instead, it’s held up as an example of the logic of believing in a god.

What Wikipedia fails to mention is that there’s an unstated assumption here: that not believing in a god would have negative consequences. That’s why Pascal thought it best to believe in a god: he thought that any god (at least his god) would be really pissed off if you didn’t believe in him (it?) and would make you suffer accordingly. But there’s no reason to assume that, if there’s a god, it would care at all what people thought of it. Pascal’s Wager only works if the god is a Christian god, but that’s never brought up.
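You can watch the hidden assumption do all the work if you write the wager out as a toy expected-value table. A minimal sketch, with invented payoff numbers; the only thing that matters is what happens when you swap Pascal’s vengeful god for an indifferent one:

```python
# Toy expected-value table for Pascal's Wager. All payoffs are made up
# for illustration; the point is the hidden assumption in the table.

def expected_value(payoffs, p_god):
    """Expected payoff of a choice, given the probability that a god exists."""
    return p_god * payoffs["god"] + (1 - p_god) * payoffs["no_god"]

# Pascal's framing: infinite reward for belief, infinite penalty for
# disbelief, and a small cost of piety if there's no god after all.
pascal = {
    "believe":    {"god": float("inf"),  "no_god": -1},
    "disbelieve": {"god": float("-inf"), "no_god": 0},
}

# An indifferent god: doesn't care what you believe. Same structure,
# but the reward/penalty entries are zero.
indifferent = {
    "believe":    {"god": 0, "no_god": -1},
    "disbelieve": {"god": 0, "no_god": 0},
}

for name, table in [("Pascal's god", pascal), ("indifferent god", indifferent)]:
    ev_believe = expected_value(table["believe"], p_god=0.01)
    ev_disbelieve = expected_value(table["disbelieve"], p_god=0.01)
    best = "believe" if ev_believe > ev_disbelieve else "disbelieve"
    print(f"{name}: best choice = {best}")
```

Under Pascal’s payoffs, belief wins no matter how small you make `p_god`; under the indifferent god, disbelief wins because piety has a cost and no payoff. The wager is decided entirely by the payoff table you assume, not by any reasoning about whether a god exists.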

At the very least, if anyone runs Pascal’s Wager by you, you can cross them off your list for thinking things through. Pascal Schmascal.

Friday, November 2, 2012

Once More With Feeling

Hear me out; I may be onto something.

Maybe it’s all a matter of semantics, of word choice.

The problem is in the phrase “natural selection”; it implies choice. In people’s minds it implies that the selecting is being done by the individual members of the species, and that it’s that selecting which affects the direction of evolution; i.e., successful females get together with successful males (of whatever species), and they breed successful offspring, which steers the direction of their evolution.

No. There are two ways to make selections: one is by choice; the other is by attrition. Natural selection is of the second variety: those who are left standing at the end are those who are chosen. We think of natural selection as the first kind, but in truth, it’s the second.

Natural selection happens at the species, not the individual, level. Natural selection doesn’t care how successful any individual is; it only cares whether or not the individual’s genes give rise to successful generations. It only cares what the individual contributes to the gene pool, not its individual success. All those pecking orders and strict animal hierarchies have nothing to do with mating the best specimens; they’re simply ways of A) ensuring the shuffling of genes, and B) maintaining social order within the species.

It is not in a species’ interest to keep only its best specimens alive; its best interest is to keep as many of its members alive as possible.

Ergo, the only natural selection being done is a matter of success: either a mutation has it or it doesn’t. And really, that’s no selection at all; it’s dumb luck.

Think of it this way. Igor the cowherd was on the bottom rung of the ladder. He wasn’t the brightest of candles, was prone to disease, and spent a lot of his life alone with his cows in the mountains. This was a very, very long time ago, before they’d brought those cows to Europe, when cattle were still hanging around the Caucasus. Igor also had a penchant for barley beer and didn’t look much to the future.

Igor hooked up with Priscilla, who was, perhaps, a little slow on the uptake and had trouble walking in a straight line, but she was happy enough with the lone cowherd. Their life would not be remarkable; they brought three children into the world before Igor was kicked in the head by a cow and killed. Alas. The only thing unusual was a small mutation in Igor’s constitution. Everyone at the time was lactose intolerant. That’s the usual human condition; it takes living with cows and drinking their milk for millennia before chance provides a lactose-tolerance mutation. Wouldn’t you know it, that mutation happened to Igor.

It didn’t, of course, help Igor—he was kicked to death at a young age—but two of his three children inherited the gene. It didn’t make a big difference in their lives, either, though they did tolerate dairy products better than their kin. It wasn’t a huge change, but, considering their livelihood, it would eventually have a big impact on their culture. Those children who inherited the new gene did a little better than their neighbors. It was a useful gene and it slowly got passed around to most everyone in the tribe. Those who had it had a better survival rate than those who didn’t.

The king’s family? They never got the gene and were wiped out some years later in a coup. Alas.

And that’s how natural selection works. No one ever selected Igor’s family to be the hope of the future, other than chance. No one even noticed that it was Igor’s descendants who changed their people forever. Who would have imagined? No one, and no one did.
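Igor’s story can be run as a toy simulation. A sketch only, with invented numbers (population size, survival odds, the size of the mutation’s bonus are all assumptions): one carrier of a slightly beneficial gene, and each generation the only “selecting” is who happens to survive. No one chooses anything; most of the time the lucky mutation dies with Igor’s line anyway, and once in a while it takes over the tribe.

```python
import random

# Selection by attrition: one carrier of a mutation that bumps his
# line's survival odds a little. All numbers invented for illustration.

def simulate(pop_size=200, generations=200,
             base_survival=0.5, carrier_bonus=0.05, seed=None):
    """Return the final frequency of carriers (0.0 = lost, 1.0 = fixed)."""
    rng = random.Random(seed)
    carriers = 1  # Igor
    for _ in range(generations):
        # Each individual survives the generation by chance alone;
        # carriers get a small edge.
        surv_c = sum(rng.random() < base_survival + carrier_bonus
                     for _ in range(carriers))
        surv_n = sum(rng.random() < base_survival
                     for _ in range(pop_size - carriers))
        if surv_c + surv_n == 0:
            return 0.0  # everybody died; so much for the tribe
        # Survivors breed back up to pop_size, in proportion.
        carriers = round(pop_size * surv_c / (surv_c + surv_n))
        if carriers in (0, pop_size):
            break  # allele lost, or fixed in the whole tribe
    return carriers / pop_size

runs = [simulate(seed=s) for s in range(100)]
lost = sum(f == 0.0 for f in runs)
fixed = sum(f == 1.0 for f in runs)
print(f"lost in {lost}/100 runs, fixed in {fixed}/100 runs")
```

Even with a genuine survival advantage, the allele is usually lost in the first few generations, because its lone carrier gets kicked in the head, so to speak, before it can spread. Attrition plus dumb luck, nothing more.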

Friday, October 26, 2012

Is That You, Grandma?

Popular Archaeology, Tue, Oct 23, 2012
“Was Grandmothering a Key to Human Evolution?”
“Kristen Hawkes, a distinguished professor of anthropology at the University of Utah and senior author of the new study published Oct. 24 by the British journal Proceedings of the Royal Society B.”

[Start here. Yo, you.]

It’s a sweet, if slightly myopic, study.

Let me get my disclaimers out of the way. I’m a grandparent. I live with a grandmother. I’m all in favor of grandmothers. I think grandmothers rule the world.

That being said, let’s give a big shout-out for grandfathers.

Kristen Hawkes, it appears, would like to give the credit to grandmothers; and I’m more than willing to give them all the credit they deserve, but I suspect both sexes had a role in Ms Hawkes’s findings.

PA said: “Hawkes, University of Utah anthropologist James O'Connell and UCLA anthropologist Nicholas Blurton Jones formally proposed the grandmother hypothesis in 1997, and it has been debated ever since. One major criticism was that it lacked a mathematical underpinning – something the new study sought to provide.” They summarized Hawkes’s proposal as “a famous theory that humans evolved longer adult lifespans than apes because grandmothers helped feed their grandchildren.”

Which begs the question: What were the grandfathers doing? Whittling spears?

The authors came to their conclusions “when they lived with Tanzania's Hadza hunter-gatherer people and watched older women spend their days collecting tubers and other foods for their grandchildren. Except for humans, all other primates and mammals collect their own food after weaning.”
They never do say what the grandfathers were doing. (Let’s see, the Hadza are hunter-gatherers. The authors observed the women gathering. Do you suppose the men could have been out hunting?)

Hawkes is on to something, but she doesn’t know quite what. In this study they didn’t do research on observable realities; instead they worked on creating computer simulations that would verify their assumptions. To no one’s surprise, they succeeded. They mathematically described a possibility and then tweaked it until it worked, demonstrating that the possibility could have happened. I have no doubt they’re right.

But what they’re attributing to grandmothers, I’d attribute to structural changes in human society. It wasn’t simply that grandmothers were feeding weaned children and extending their dependence on others for sustenance. The answer lies in what Hawkes herself said: “The [apes] that began to exploit resources little kids couldn't handle, opened this window for grandmothering and eventually evolved into humans."

The operative words here are “little kids couldn't handle.” What was it about the food these humans were eating that, unlike all other primates, the kids needed adults to gather it for them? Hawkes et al. don’t broach that question. Primates aren’t unusual in being able to self-feed immediately after weaning; most mammals are similarly equipped. As a rule, once you can reach the fruit, grass, leaves, and nuts by yourself, you’re on your own. Why we should be different, apparently, was of no concern to the authors.

The exceptions are predators like lions and tigers, and bears, oh my. And us. But if you’re reading Ms Hawkes, you’d never know that. As far as Ms Hawkes is concerned, food stops at tubers and roots; although why weaned children would need adults to provide them with said tubers and roots, when they are perfectly capable of getting them themselves, is never explained. Because the food in question, the food that the “little kids couldn't handle,” was meat. Everything else they could manage just fine, thank you. Meat, though, they had to be ready to go out after. Just like those Hadza grandfathers who are ignored and nowhere to be found: they’re out hunting what the kids can’t handle. “Leave the little ones home with the women; we’ll go on a walk-about.”

Yes, yes, grandmothers are important in human society and have been forever; but no more so than the rest of the fabric. Grandmothers only became important when we had settled down into predators’ camps, when we had the luxury of community. It wasn’t just the grandmothers who raised the little children; it was the whole village. Grandmothers became important when we had the luxury of having them at all. Grandmothers became important when we started bringing food home and sharing it rather than every person for themselves. If grandma had to climb that tree for dinner, tough titties. Grandmothers (and grandfathers) are a luxury of a meat-eating society. If it wasn’t for that, we’d never have come together around the campfire; we’d still be out grubbing for tubers and roots. It was no vegetarian what captured fire.

I repeat: The driving force behind humans becoming humans was meat. Pure and simple, meat. Nothing but meat. No tubers, no roots, no sweet berries, no saturated nuts. Nope. Meat. Betcha hamburger.

Sunday, October 14, 2012

Jessica's Revenge

"Our focus has changed from the search for Jessica to a mission of justice for Jessica," Westminster Police Chief Lee Birk said Friday.

The body of ten-year-old Jessica Ridgeway was found yesterday (10/13/12) after a week-long search in the Denver area. The Chief concluded, “We recognize there is a predator at large in our community.”

Colorado seems to have had more than its fair share of horrendous events lately, of which this is just the latest example. I have no doubt that it’s nothing more than a statistical aberration; nonetheless, I’m sure it gives Denverites pause for thought.

Aside from sympathy for the poor parents to whom the unthinkable has happened and for all the people affected by similar tragedies, I worry about the phrase “mission of justice.” I shudder at what that means. Justice for whom? Jessica is dead, there will be no justice for her; justice is for the living. For the parents. For you and me.

Indubitably, we need protection from predators, but can we ever extract justice from them? How would we tell justice from revenge? What form would justice take if not punishment? And what, precisely, would we be punishing them for? Their crime?

I would be more of a fan of free will if I understood it, if someone could explain to me exactly how it works. Do you understand how thinking works? I sure don’t. I cram all the facts and the equations into my head, ask a question, and stare at the wall until something pops into my head. I have no idea where that something comes from or how it manifests itself in my mind’s eye, but it does. Pop! It’s there. I wish I could watch all the little gears whirling around until they come to an answer, but I can’t. I can’t even hear them over my tinnitus. Synapses firing? Poof! I have no idea. They’re soundless, invisible to me. Often as not I’m astounded at what they come up with—that’s not me!—but there they are, cockroaches of the mind creeping under the door. Too late; I blurted it out already. The thought had passed my lips before I’d even thought it. How fair is that?

I don’t understand how the pre-cellular consciousness of a strand of DNA or RNA or whatever could be any more or less conscious than you and I are, give or take a neuron or a feedback-loop or two. The principles, I would imagine, would have to be the same: input, process, output. Data is input and processed and operational directions are sent to the terminal. Consciousness is the input and output devices; processing is done internally. The actual processing can’t be observed by the terminal; there’s no need for its being able to do so. The processing is so fast and transmitted to the terminal at such speed that the terminal has the illusion of doing the processing. If the terminal had to think about how it was processing data, it would crash to a halt. Thinking has to be subconscious to be fast enough. If you truly had to think about what you were going to say, you couldn’t hold a conversation. But you’d never know it. You have to go on thinking your consciousness thought those words up all by itself. Pshaw! Smoke and mirrors. It’s all smoke and mirrors.

What you think is thinking is your processor processing the data it has at hand, which includes memory and current conditions. The processor will tell you whatever it thinks you need to hear. It decides that; you don’t. But for efficiency’s sake, it works best if the terminal thinks it’s doing the thinking. That’s you and me; we think that. So do howler monkeys.

What I’m trying to get at here is what is guilt and what is sin and what is responsibility and what is practical. That’s a lot of baloney to stuff into one casing.

What I’m thinking is that, if all thinking is done at an unconscious level, who’s responsible for it? Does it make any sense to have blame? Who are we going to blame for a lousy processor? Either the processor was lousy to begin with, or it had lousy input, right? Remember, this processor, even if it’s inside us, is basically just a machine mulling over the input, filing things away, making comparisons, evaluating, right? It doesn’t really think, either; it merely processes data at an incredible speed, fast enough to make our neurons zing and our mouths talk.

It’s certainly true that punishment is a whole new stream of data, and it might be enough to change some of the processor’s algorithms, but it’s equally liable to have unintended and undesirable consequences. The algorithms might not be changed to society’s benefit.  It may well be that the cost of justice/revenge is higher than the cost of rehabilitation. That’s when we have to make what is essentially an aesthetic decision: which do we want more, justice or safety?

Necessarily, this is complicated by the fact that most of the solutions to bad processors or bad data require deep changes at the societal level, changes which are not about to appear anytime soon. Poverty, for example, is bad data. Totalitarianism is bad data. Unconscious myth is bad data. Pollution is bad data. Capitalism is bad data. War is bad data. Violence is bad data. Hate is bad data. Racism is bad data. Bigotry is bad data. Misogyny is bad data.  Shall I go on? Until everyone is freed from bad data, how can we not expect processors to run amok?

But doesn’t it make more sense to try and fix the processors rather than smash them or feed them even worse data so that when we release them, they are more of a menace than when they went in for repairs? Of course, we can always enslave them or kill them, if we think it’s hopeless, yes?

Well, then, if we can’t do that, surely fixing them is the better option. That means getting rid of the blame, getting rid of the sin. Then getting rid of the problem.

See what I mean? That’s never going to happen. Not in my lifetime, and not in yours, either. Not so long as we’re trying to find justice instead of cures. What would make me happier is not justice for Jessica, but knowing that she didn’t die in vain.

Saturday, October 13, 2012

Uncle Ken

Seattle Times Oct. 12, 2012

Kennewick Man bones not from Columbia Valley, scientist tells tribes

The beauty of it is that it made no difference to the tribes. They continued blithely on as if KM was still their baby. In their view, belief trumps reality; which, truth be known, is a hallmark of belief. What follows are a few excerpts from the article and my intemperate responses.

“In a historic first meeting of two very different worlds, Columbia Plateau tribal leaders met privately Tuesday with scientist Doug Owsley, who led the court battle to study Kennewick Man.”

“Minthorn [Armand Minthorn of the Umatilla Board of Trustees] said reburial still needs to happen, and that the law should be changed to give tribes better control of sacred remains.”

Oh sure, we should just let you decide what’s sacred and what isn’t, and then hand over whatever you say is sacred to you. Capital idea. I’m sure you’ll be impartial; you’ve shown yourselves as such already. Not to mention having a good grasp of paleoanthropology.
“Ruth Jim, a member of the Yakama Tribal Council, where she is head of the tribe's cultural committee, said it is frustrating that Kennewick Man is still out of the ground. ‘I don't disagree that the scientists want to do their job, but there should be a time limit. The only concern we have as tribal leaders is he needs to return to Mother Earth,’ she said.”

“…the skeleton, which has largely been inaccessible but for two instances, in which a team of about 15 scientists could study it for a total of about two weeks.”

Absolutely, Ruth, two weeks is more than enough time for a thorough study. More than enough. I’m glad you brought that up. And we’re glad that as tribal members you’re concerned about the burial of this person who wasn’t a member of your tribe. Quite thoughtful of you. Unless, of course, this person came from a tribe that liked to donate their bodies to science; you never know, do you? Tell me once more, why does he need to return to Mother Earth?

“Vivian Harrison, NAGPRA coordinator for the Yakama, said it was disturbing to look at the slides Owsley showed, with the bones presented on a platform to be scrutinized from every angle. ‘Really, to me, it's sad. This is a human being and his journey has been interrupted by leaving the ground.’”

Look, Vivian, if you’re squeamish, leave the room. Don’t buy your meat from the slaughter-house. But there’s nothing inherently disturbing about bones. Old bones, new bones, human bones, dog bones. If you’re disturbed, okay, but don’t make us guilty because you’re of a delicate constitution; please get out of the way.

Sure, he’s human; but he’s delighted to be out here telling his story. He says it was stuffy underground and he’s glad for a second chance. He says he’s got a story worth telling. He says he doesn’t care if he ever goes in the ground again. Trust me; the White Buffalo told me this.

“Jaqueline Cook, repatriation specialist for the Confederated Tribes of the Colville Reservation, said scientists' finding that the skeleton had been purposefully buried was significant.

“‘It says a lot that somebody took care of him,’ Cook said. ‘To me that says community. And that he is part of the land. And our land.’”

Yo, with you all the way. Someone took care of him. I’d say that says community. And no doubt he’s been a part of our land for 8,500 years. For that we honor him. We owe him a lot. I’m also sure that when you say “our land” you mean all of us, not just the Indians. Surely, we’re all of this Earth together. But I don’t think one has to be buried to be part of the land, do you? I feel I’m part of this land, but I’m not quite dead yet. Close, maybe, but still wiggling my fingers. He gets to be the person that came out of the ground to tell his story. Listen.

“The day's presentation was ‘subtly traumatic,’ said Johnny Buck, one of Rex Buck's sons and a member of the steering committee of the Native Youth Leadership Alliance. ‘We have medicine people that took care of bodies. But we never did look so long at them.’”

Was that an argument or simply a statement? What do your customs have to do with these bones? We don’t hang around and stare at dead bodies for months on end, either, as a rule; but there’s nothing saying that sometimes it’s not a good idea. You’re talking as if the bones were yours. You’re trying to make an argument for bones that aren’t yours by describing what you would do with your bones. No one’s saying what you should do with your bones; they’re saying these aren’t your bones. They don’t become your bones by you saying so. Get over it. Why are you still arguing?

“…the Ancient One.”

Aka Kennewick Man. The problem with “Ancient One” is that there are a lot of those guys, one has to distinguish among them somehow. He’s not yours; you don’t get to name him.

We got that straight?


Try and get this straight, too: we’re all one. Go back 8,500 years and everyone is everyone’s ancestor. Kennewick Man is not more ancestral to the Umatillas or the Yakamas than he is to the Hottentots or the Swedes. You guys don’t have special standing. None. I personally know that most of you weren’t born here until after I was. There’s a song about this place: “This land is your land; this land is my land…”
The message is: stop being whiny-butts. Stop thinking you’re special Americans. Get with the program. There’s a big jumble of people out here and you’re just one more jumble. You’re about as special as the Armenians. They could use a reservation, too.

Me? I’ve got reservations about the whole thing.