Wednesday, December 26, 2012

Savanna Confessions

Red Colobus Monkey
So maybe I was wrong. Sort of.

It’s no secret that I’m not a fan of the savannista hypothesis, despite its considerable popularity within the profession, primarily because it doesn’t account for demographics; it flies in the face of our current demographics. I begrudgingly accepted that early hominins harvested the grasslands, but I was skeptical of the disappearing forest as the catalyst for their abandoning the protection of the trees. It appeared crystal clear to me that something had to draw the people down from the trees; they couldn’t have been forced out; evolution doesn’t work that way. It was also fairly evident that that something was meat, and that the protection was weaponry. Quite why meat suddenly became important for an isolated group of simians, I hadn’t worked out.

Nonetheless, most people in the profession still insisted that the disappearance of the forest and the simultaneous emergence of hominins couldn’t have been coincidental. Furthermore, hominins were beginning to harvest the grasslands; they were certainly eating small ungulates and vertebrates, as well as turtles. Given our demographics, it is unquestionable that at some point humans moved to the waterside, but that’s another matter; first they have to get out of the trees.

The problem with leaving the trees, of course, is that it’s exceedingly difficult to change one’s eco-niche. As a rule, when an eco-system disappears, all components of the system, plants and animals alike, disappear with it. That humans didn’t disappear when the forest disappeared says that, self-evidently, their food source didn’t disappear along with the forest; which in turn implies that their food source wasn’t entirely dependent on the forest.

Stepping back a minute and looking at the pre-split simian, back when chimpanzees and hominins were a single species, and considering that today’s chimps are regular hunters who fashion their own spears, and that humans are currently hunters, it’s most likely that the pre-split simian was also a hunter who fashioned his/her own weapons/tools. And that simian was already spending considerable time on the ground. That being the case, when the simian species split into two groups—a necessary condition for species development—possibly by the growth of intervening grasslands, it could well be that one of the two groups was caught in an island forest that was slowly disappearing out from under it. For many, if not most, of the inhabitants of that eco-system, it meant their disappearance, as well.

These chumans (chimps becoming humans), though, had a diet which included meat. Their main meat, if current chimps are any indication, was probably monkeys of one sort or another; although chimps are known to eat as many as thirty-five different vertebrate species, which include small antelope. One can imagine that, as the forest disappeared, not only did the plant matter on which the chumans depended disappear, so did the monkeys. Fortunately, though, small ungulates didn’t disappear. In fact, with the spread of the grasslands, many types of antelope spread as well. It might have been fairly easy for the chumans to switch to an antelope-dominated rather than a monkey-dominated diet. And, if the plant material which the chumans were accustomed to eating began to get scarce, it could well be that those chumans who were better at and relied more on meat for their diet out-paced those still dependent on forest products. In such circumstances, a species could change the ratio of its diet fairly quickly without changing its basic dietary needs. And by such a process, chumans, by now hominins, could have adapted fairly quickly to a disappearing forest. Concomitantly, it also meant that the newly minted hominins would have to spend all their time on the ground if they were going to be successful antelope hunters.

Okay, I’ll give you that. That’s a likely scenario which covers the savannista basics. It gets people out on the prairies. It does, perforce, get them out on the prairies as hunters. I’m sure the defensive value of spears wasn’t lost on these early ancestors of ours, either. What this undoubtedly starts is the human habit of tracking down animals over long distances, killing them, and transporting the meat back to the main group/tribe/family. Given that one’s hands are occupied with either weapons or kill, it behooved one to stand up as straight and as efficiently as possible; and those that did so easily were rewarded with successful hunts. Natural selection will select for upright bipedalism, as it did.

In defense of my original argument: evolution is governed by opportunity, not necessity, and that still holds true. The hominins weren’t forced out of the dwindling forest; they were enticed out by a new abundance of game, a shift from monkey to antelope. If there is a significant problem with this scenario, it’s that our shift towards obligate bipedalism occurred some three million years before the thinning of the forest, but who’s counting? And we shouldn’t forget that the hominins had no idea the forest was thinning; they were simply out hunting like they’d always done.

But this hunting scenario has other requirements. I would argue that, by adopting hunting as a regular part of our alimentary scheme, we were forced to adopt other regular practices of predatory animals, among them the establishing of temporary dens, lairs, or nests where the young are raised until old enough to participate in the hunt or other harvest. I would argue that, from the time we left the trees, we made those lairs down by the water (if for no other reason than we were reluctant to give up the trees until the last minute and that the last trees in the grassland cleaved close to the waterways), and we haven’t moved from there since. Sure, we covered immense territories in our hunting forays, but when we came back home, it was always down by the stream, on the edge of the swamp. That would account for our current demographics.

So, how does that hypothesis fit? Have I left anybody out?

Friday, December 21, 2012

Scavenging the Web

I ran across the following while hunting for hominin scavenging hypotheses on the Web. I have no idea who Blumenschine and Cavallo are or what book this quote is taken from, nor do I want to represent it as a view endorsed by most scavenger-hypothesis proponents; but I thought it typical of the fantasy world inhabited by advocates of the scavenger hypothesis:

“The earliest hominids probably scavenged and took small prey with their hands, as chimpanzees and baboons do. Only their next step was unique: they began to use tools to butcher large carcasses that nonhuman primates cannot exploit. The difficulty of this leap (to the use of tools to butcher) belies the charge that scavenging offers no challenge that might select for human qualities. . . Scavenging is not at all easy for a slow, small, dull-toothed primate. To locate scavengeable carcasses before others did, we had to learn how to interpret the diverse cues to the presence of a carcass in riparian woodlands. They include the labored, low-level, early-morning, beeline flight of a single vulture toward a kill; vultures perched in mid-canopy rather than at the crown of a tree, where they nest; appendages of a concealed leopard or of its kill dangling from a branch; and tufts of ungulate hair or fresh claw marks at the base of a leopard’s favorite feeding tree. At night, the loud 'laughing' of hyenas at a fresh kill, the panicked braying of a zebra being attacked, the grunting of a frightened wildebeest—all serve notice of where to find an abandoned carcass when morning comes." (pp. 94-95, Blumenschine and Cavallo, 1992)

Starting off with a bang: “The earliest hominids probably scavenged and took small prey with their hands, as chimpanzees and baboons do.” This is the foundation of their argument, equating early hominin behavior with that of chimpanzees and baboons. Fair enough. Except that in Stanford’s paper which I cited last post, “The Predatory Behavior and Ecology of Wild Chimpanzees,” the author states, “wild chimpanzees (particularly the males who do most of the hunting) show little interest in dead animals.” Okay, so much for the scavenging chimps. Unfortunately, of course, since B&C start by presenting this as fact, all the rest of their conclusions follow from this erroneous statement. Did they just not know what chimps eat, or did they ignore it and make up chimp behavior to suit their theory? Because that’s, effectively, what happened.

They continued with the following assertion: “The difficulty of this leap (to the use of tools to butcher) belies the charge that scavenging offers no challenge that might select for human qualities.” Since they haven’t proved that hominins were scavenging, the statement makes no sense. It’s a non sequitur. Simply because people used tools to butcher is no proof that they scavenged the animals they butchered, much less that it was selected for. They don’t appear to realize that, if scavenging were selected for in human evolution at such a late date, we’d still be scavengers.

The authors did have a flash of reason when they observed, “Scavenging is not at all easy for a slow, small, dull-toothed primate.” Forget about unarmed. So, why would we do it? It’s doubtful we could scavenge enough food to compensate for the energy expended to get it. Scavenging is easy for hyenas and vultures; they’d clean everything up long before slow us got to the carcass, even if we wanted to eat the meat.

The authors finished with a fanciful flurry of heightened imagination; for a minute they thought they were writers:

“To locate scavengeable carcasses before others did, we had to learn how to interpret the diverse cues to the presence of a carcass in riparian woodlands. They include the labored, low-level, early-morning, beeline flight of a single vulture toward a kill; vultures perched in mid-canopy rather than at the crown of a tree, where they nest; appendages of a concealed leopard or of its kill dangling from a branch; and tufts of ungulate hair or fresh claw marks at the base of a leopard’s favorite feeding tree. At night, the loud 'laughing' of hyenas at a fresh kill, the panicked braying of a zebra being attacked, the grunting of a frightened wildebeest—all serve notice of where to find an abandoned carcass when morning comes.”

Pay attention to the phrase “riparian woodland”: “woodland down by the river,” another cogent, if unintentional, observation by the authors. Other than that, they didn’t appear to think much beyond their thesaurus. Or they haven’t spent much time looking at vultures. I’d challenge them to spend a day on foot out in, say, the John Day country of eastern Oregon and try to find one carcass with enough meat on it for them to get a reasonable bite or two. Hell, just find any old carcass, meat on the bones or not. Roadkill doesn’t count; stay to the backcountry. Go ahead, follow all the vultures you want. And in the early morning light, try to find where the coyotes were last night. Did you hear the roar of a mountain lion? Track it down and see if it killed anything.

The authors are, alas, delusional and woefully inexperienced. We are exceptionally poorly equipped to be scavengers. We are not so poorly equipped to be hunters; we’re good at throwing things. The odds of us finding game to kill are much greater than the odds of finding a leftover carcass. If you’ve ever been deer hunting, try to remember the number of dead deer you’ve seen (that weren’t shot) compared to the number of live ones you’ve seen. In my experience, the ratio is lopsided, easily more than a hundred live deer for every dead one. Our blunted sense of smell alone would rule us out for scavenging.

Methinks a major stumbling block to understanding stone tools is that paleo-anthropologists tend to think of the first stone tools as butchering tools: hand-axes and scrapers; that’s why they place so much weight on early flaked tools, which were, largely, just that. They ignore the virtual certainty that the first stone tools were nothing more than the stones themselves, unprocessed. And not just for cracking open marrow bones or turtle shells, but for tossing at prey. There’s a reason why we developed a throwing shoulder early after descending; and there’s a reason why ball games are so universally important. Kill a seagull by pitching a baseball at him? You bet. A trick like that can stand between you and starvation, or at least a good meal. I imagine we came down from the trees as primarily spear-chuckers, but I’ll bet we added rocks to our arsenal early on.

No matter how common it is, it’s always a bit unsettling to find academic authors making such basic mistakes. Hopefully, some of their colleagues raised the same flags I did.

There’s always hope, right?

Thursday, December 20, 2012

Dining With the Chimps

Somehow I got caught up in a debate about scavengers. It began with critiquing a recent article suggesting that early hominins ate grasses (see last post). In response to a Facebook post on the matter, a friend replied that she thought early people were scavengers. She led me to a paper by Craig Stanford, an anthropologist from USC, “The Predatory Behavior and Ecology of Wild Chimpanzees” (no date), that had some provocative information in it. If you thought chimps were meat eaters before, this paper makes it shockingly clear how much meat they eat:

“At Gombe, we now know that chimpanzees may kill and eat more than 150 small and medium sized animals such as monkeys, wild pigs and small antelopes each year.…

“The amount of meat eaten, even though it composed a small percentage of the chimpanzee diet, is substantial. I estimate that in some years, the 45 chimpanzees of the main study community at Gombe kill and consume more than 1500 pounds of prey animals of all species.”

I was under the impression that they primarily ate bush babies (Galagos), but Stanford asserts,

“Although chimpanzees have been recorded to eat more than 35 types of vertebrate animals (Uehara 1997), the most important vertebrate prey species in their diet is the red colobus monkey.”

Thirty-five types of vertebrates? Once again, I am astounded. These guys are bigger hunters than I thought. I do know that they count leopards among their prey. Yup, leopards. Their prey.

Killing by chimps is anything but incidental. Stanford observes:

“Jane Goodall has noted that the Gombe chimpanzees tend to go on ‘hunting crazes,’ during which they would hunt almost daily and kill large numbers of monkeys and other prey (Goodall 1986).”

He went on to assert,

“Chimpanzees may be among the most important predators on certain prey species in the African ecosystems where they live.”

Nonetheless, he still maintains a distance from meat-eating humans. He opines, “Since neither humans or chimpanzees are truly carnivorous —most traditional human societies eat a diet made up mostly of plant foods —we are considered omnivores”; a claim which is disputed somewhat. There are people who classify humans among the carnivores, but it’s indisputable that we’ve always had a large herbivorous component to our diet. The notion that “most traditional human societies eat a diet made up mostly of plant foods” is somewhat disingenuous. That depends largely on what one means by “traditional society,” a term he does not define. There are, in the main, two kinds of societies one might call “traditional.” There are those traditional elements of all modern societies, all of which are post-agrarian revolution and, hence, much more dependent on plant matter than a traditional hunter-gatherer society. And there are those pre-agricultural societies that survived into the modern age; they did so because they lived on marginal lands and don’t necessarily represent the norm for pre-agrarian societies. Nonetheless, it’s hard to imagine claiming the Inuit, for example, “eat a diet made up mostly of plant foods.” So, while it’s true that we’re omnivores, it’s by no means established that pre-agrarian societies ate, primarily, plant food. It’s those kinds of unwarranted assumptions that cloud one’s interpretation of data. The assumption makes it more likely that we had a vegetarian past, something the PC (politically correct) world would dearly love to be true. The brutal truth is, we were certainly as significant predators as chimps, and we certainly became more so once we were upright.

The author asked the question:

“Many researchers now believe that the carcasses of large mammals were an important source of meat for early hominids once they had stone tools to use for removing the flesh from the carcass (Bunn and Kroll 1986). But the evidence for stone tool use dates to only 2.5 million years ago. For the 3 or so million years of human evolution prior to that time, did our ancestors eat meat?”

Classic blunder. It’s not “evidence for stone tool use,” it’s evidence for flaked-stone tool use. All the difference in the world. By not mentioning the inevitable millions of years of un-shaped tool use that we must have had, he leaves the impression that we didn’t start using stone tools until we miraculously learned how to shape them. If we hadn’t already been using stone tools before we learned to shape them, how did we ever come up with the idea? It makes the question, “For the 3 or so million years of human evolution prior to that time, did our ancestors eat meat?” sound as though he didn’t read what he’d just written. Does he think we would have given up eating meat for those intervening years? Surely, he doesn’t. It’s an unfortunate slip of logic.

And it undercuts the idea that we could ever have been scavengers. I know of no instance where a species turned from predator to scavenger, not to mention back again. That would be stunning, unprecedented. In the end, scavenging turned out to be one funny idea.

Tuesday, December 18, 2012

News Flash: Scientists Survive on Poppycock

Dec. 14, 2012

“Scientists 'Surprised' to Discover Very Early Ancestors Survived On Tropical Plants, New Study Suggests”

“Researchers involved in a new study led by Oxford University have found that between three million and 3.5 million years ago, the diet of our very early ancestors in central Africa is likely to have consisted mainly of tropical grasses and sedges. The findings are published in the early online edition of Proceedings of the National Academy of Sciences.”

Frankly, I was skeptical. I just couldn’t imagine early humans grazing on grasses. Nor could I imagine that our ancestors would have given up meat so quickly after learning how to be upright and hunt with weapons. I could see them sharing their diet with vegetable matter, but grasses seemed unlikely.

The study authors, I guess, also thought that eating grasses directly was unlikely, for the article continued:

“The authors argue that it is unlikely that the hominins would have eaten the leaves of the tropical grasses as they would have been too abrasive and tough to break down and digest. Instead, they suggest that these early hominins may have relied on the roots, corms and bulbs at the base of the plant.”

That mollified me some but I was still skeptical. Which corms and roots and bulbs are we talking about? From grasses and sedges? Really?

Then comes the clinker:

“Professor Lee-Thorp said: ‘Based on our carbon isotope data, we can't exclude the possibility that the hominins' diets may have included animals that in turn ate the tropical grasses. But as neither humans nor other primates have diets rich in animal food, and of course the hominins are not equipped as carnivores are with sharp teeth, we can assume that they ate the tropical grasses and the sedges directly.’”

Did they really say that? One has to presume they did. Humans don’t have a diet rich in animal food? What supermarket do the authors shop in?

Needless to say, the authors are basing their argument on the commonly held, if probably erroneous, assumption that because “hominins are not equipped as carnivores are with sharp teeth,” they couldn’t bring down game. What the hell were they doing with the rocks and spears they were carrying around? The more reasonable assumption is that, indeed, our early ancestors were eating game, not grasses.

Think for a minute, Professor Lee-Thorp: what’s the likelihood of people surviving out on the treeless plains eating only roots and corms, with no means of protection, no ability to run away, and nothing to climb? Zilch. The only reason humans could survive out on the plains was that they were top predators. They were killers.

You’re misinterpreting your data, Professor.

Monday, December 17, 2012

Hobbit Forming

The pilfered picture above shows an attempted reconstruction of the features of a Hobbit from the isle of Flores. One look and anyone can tell it’s not real. That doesn’t look anything like Bilbo Baggins.

Monday, December 10, 2012

It’s Faith that Counts

It’s a matter of faith that people are categorically different from other animals. Not that we have spectacularly extended the skills given to us, but that we have acquired entirely new skills unavailable to the rest of the kingdom. This is not exclusively a religious faith; the social sciences have been very strict in admonishing people not to be anthropomorphic, not to extend human characteristics to other things, be they living or dead, which may be tautological for inanimate objects but is much more debatable among living things.

Free-will is a case in point. Only humans are clearly marked with the burden of free-will, or so we’re told. It is something thought to have developed during the last evolutionary spurt that humans have had (if we’re not going through one, now). Unfortunately, upon closer inspection, it turns out to be a chimera.

The very concept of “free-will” implies—demands, really—at least two kinds of will, free and otherwise. The problem lies in disentangling the two. The closer one looks, the more the two kinds of will appear to be the same thing: the ability to choose. If one has an option to choose, all will is free. If one doesn’t have options, case closed. What the argument for free-will seems to be saying is that only humans are actually given choices; the rest of the kingdom has to follow innate instructions. We, on the other hand, are free, implying that we have no innate instructions, no internal behavior algorithms. In our evolution we have broken the bonds of rigid rules that one would be otherwise doomed to blindly follow because of one’s programming. We, for some reason, have developed the possibility of endless choice; we’re free to do whatever we want.

But are we? And what does it mean, free-will? Isn’t the very act of wanting something a determining force and not one of free-will? One can, perhaps, ignore one’s desires, but one can hardly be said to create one’s own desires. And can one ignore a desire unless there’s a stronger desire present?

This is where we have to step back a little bit and look at what people and the other animals really are. We have to take a look at consciousness. A leg of the free-will table suggests that, while all living things might have an awareness of their surroundings, only humans are self-aware, are self-conscious. Since no one has yet devised a way to test this hypothesis, it remains a hypothesis, but it’s firmly planted in the body politic. Quite how this self-consciousness would be different from the garden variety of consciousness is left to one’s imagination. A concomitant of that is the belief that only humans are conscious of their own mortality, another untested hypothesis. And, unfortunately, there’s no real reason to suspect that either of those assumptions is true other than pride: we want to think we’re different, that we’re special, that—well—we’re made in the image of god.

Fat chance.

From a scientific standpoint, it’s more accurate to say we’ve been created in the image of bacteria, of which we’re primarily composed and from which our origins, apparently, come. The technical difficulty with that observation is that it’s hard to tease out our uniqueness. Maybe we don’t have any.

There are a million obstacles to understanding everything about evolution. The minimum requirements are eating and reproducing. Obviously, sex was a huge step forward in creating complex creatures, but most of the rules of life were figured out before then. At the very least, one has to have enough consciousness to pick out what’s edible.

Not surprisingly, given the small amount of material to work with and the random, scattered nature of mutations, it took billions of years (literally) for the single-celled original creatures to figure out how to clump together to make multi-celled organisms. They couldn’t do it until they’d evolved methods of communication; they had to make agreements on how to cooperate. As far as we can tell, those methods involve chemical behavior and electrical currents. Communication between cells—and by extension, ever-expanding clumps of cells—is chemical and electrical. And as far as we know, no other methods have been employed. And that’s true no matter how big or complex those clumps of cells become, even if they become human. In other words, all the internal communication in our bodies is done on chemical-electric pathways. All of it, including brain activity, especially brain activity. There is no meta-communications pathway; we all use the same simple methods employed by the original cells. All thought, all memory, is built up of those two elements.

Now, when cells get together to form a larger creature—us, say—do they have little garden parties to set out the rules of communication and responsibility? No, the rules are built into their DNA which packages the basic instructions. A copy of the DNA goes with each cell and directs how it is to perform. It’s all quite rigid and magical at the same time. So, when this clumped-being stumbles upon something edible, the cells don’t have a conference to decide whether or not to eat it. How to deal with something edible is built into its DNA, so when given a choice to eat or not eat something edible, it’s not really a choice at all. It has to eat it. What good would debate do the creature? None, so debate isn’t selected for in evolution.

Which brings us to “natural selection”; it helps to understand how it works. For one thing, it didn’t begin until after the invention of sex. It’s commonly thought that the selecting being talked about is the selecting that two members of a species do when they’re looking for mates. The theory being that the healthier, more successful members of a species will mate and thereby spread their genes to the rest of the population. Seems logical, but, unfortunately, is not the case. The selecting being done is not that done by the individuals but rather that done by the entire species; and the selecting is very simple: if people carrying gene X outlive and out-produce their neighbors, gene X will eventually spread to the entire population. It’s immaterial who those individuals are or their overall competence; what’s important is the success of the gene packages they leave behind.
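That spread of gene X is easy to sketch in code. The following is a minimal toy simulation, not a model of any real population: all the numbers (population size, a 5% reproductive advantage, a starting frequency of 1%) are illustrative assumptions, and the function names are my own invention. It shows the only kind of "selecting" the paragraph above describes: carriers of the gene leave slightly more descendants, and the gene's frequency drifts upward across generations until it saturates the population or is lost by chance.

```python
import random

def simulate(pop_size=1000, advantage=0.05, start_freq=0.01,
             generations=500, seed=1):
    """Track the frequency of 'gene X' across generations when carriers
    leave, on average, (1 + advantage) times as many offspring."""
    random.seed(seed)
    freq = start_freq
    history = [freq]
    for _ in range(generations):
        # Weight carriers by their reproductive advantage; the next
        # generation is drawn in proportion to those weights.
        weight_x = freq * (1 + advantage)
        weight_other = 1 - freq
        expected = weight_x / (weight_x + weight_other)
        # Finite population, so sample each birth (this is genetic drift:
        # a rare beneficial gene can still be lost by bad luck).
        carriers = sum(1 for _ in range(pop_size)
                       if random.random() < expected)
        freq = carriers / pop_size
        history.append(freq)
        if freq in (0.0, 1.0):  # gene fixed or extinct
            break
    return history

hist = simulate()
print(f"start: {hist[0]:.2%}, after {len(hist) - 1} generations: {hist[-1]:.2%}")
```

Note that nothing in the loop cares who the individual carriers are; the only bookkeeping is the frequency of the gene itself, which is exactly the point of the paragraph above.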

Which leads us to an understanding of who’s running the show. It’s not us. It’s the successful gene packages. It’s the DNA. Effectively, the individual members of any species are only the hosts by which the DNA reproduces itself. The DNA is immortal, we are transient. Evolution is done at the DNA level, we have nothing to do with it. And it’s random. Those things are true of all species.

Meanwhile, the goals and operational methods remain the same: eat/divide; chemistry/electricity. And they never change; they’re a constant through all living things.

But as the organisms become more complicated, the controlling algorithms become more complicated; new features are periodically added. Empathy, for example. Bacteria probably don’t have much use for empathy, but by the time you get to big animals, empathy starts to become useful to the survival of a species; hence, it gets selected for by the DNA. Ditto love, which is very similar to empathy. Love is empathy with attachment. Many species, evidently, use love to protect and foster themselves. Humans, of course, are reluctant to attribute love to other animals, but it’s hard to see that other animals’ bonding mechanisms could be significantly different from ours. It’s more likely that we’ve employed an existing model rather than having invented the wheel all over again.

It’s the same with any other human characteristic that we don’t think of as DNA controlled: inquisitiveness, say. Obviously, inquisitive people were, in the long run, more successful than those willing to live with whatever showed up. Artists? Somehow as a society we valued them enough that we made sure they were well fed and prospered and hence DNA selected for art. (Where would theater and fiction be without empathy? Do theater and fiction foster empathy and that’s why they get selected?) Those skills, the skills of artist, tinkerer, explorer, weren’t ones people invented out of whole cloth; like everything else, they are an evolutionary adaptation to life. They are, you could say, accidental. Useful, but accidental. They got selected for because of their utility, but that they arose was the result of random mutation.

What’s important here is that no one chose the skills, these attributes. No one chose to be empathetic or in love. Likewise, no one chose to be an artist or a tinkerer, they were selected for by DNA, not the individual. No one chose to be born. No one chose when to take their first breath. No one chose when to utter their first word. And in the end, no one chose to become President of the United States. All those things happened because of evolution and the iron-clad laws of the Universe.

What confuses people, in my estimation, is a poor understanding of consciousness and self; we tend to conflate the two. We fail to see that consciousness is an organizational tool that evolved to allow cell clumps to operate in real time. In other words, the clumps had to find ways to operate as a unit rather than as individual cells, and one of these adaptations was to create a consciousness that unites the entire cluster. The consciousness, of course, wasn’t the cluster but was the tool by which the cluster functioned. Consciousness is a large component of food gathering. It enabled the organism to make rapid decisions when faced with options, which one constantly is. In order for consciousness to function in real time, it has to think, unequivocally, that it is in control, that it is making the decisions. If it truly had to think about the decisions it was making and how it was making those decisions, it would be paralyzed. It would starve. Hence it’s designed to function in real time as a single entity, though it’s nothing of the sort; it’s an agreement between billions of cells and DNA. The conscious self is, in truth, an illusion that enables the individual to function.

But that’s all it is, a tool; it doesn’t actually do anything. It’s merely a data-input device. It controls nothing, neither your movements nor your thinking. One may think that their consciousness is telling them to run when they see the lion coming towards them; but, in truth, it’s the internal algorithms which are telling the person to run; but they’re doing it in real time, so it looks like the consciousness is actually doing the thinking; but no, all thinking is done at the subconscious level. (Got that? All those “buts”?)

So free-will? Where in this mess of evolution would free-will come from, and how would it be different from the existing choice model? What does free-will really mean? Does it mean that no algorithm was applied to make a decision? Or does it mean that a new algorithm was applied? And how was that algorithm created? Does free-will imply that people can create new algorithms out of nothing? And why would they make a new one versus using another if they didn’t have a preference, which would override free-will?

This would be all well and good if the algorithms were accessible to the consciousness, but they’re not. There is no free-will in setting up the algorithms of love, empathy, lust, hunger, fear, pride, etc. Each of us is only what our combination of mutable instructions produces. None of us creates or directs those instructions. None of us creates our own algorithms, because the algorithms don’t belong to us; they belong to the DNA. Consciously, we do nothing. Consciousness is only a monitoring device; it doesn’t have directive capacity. That’s handled internally and subliminally; we don’t have access to it.

Therefore, to state that we have free-will ends up being meaningless. Our consciousness, it turns out, has no will of its own, and all will is directed by internal algorithms. Who would have imagined? Not the average person, because they cling stubbornly to their self-identity. No one wants to be just R2-D2. No one wants to be HAL.

Yet there it is: how is free-will different from any other kind of will? And, even if you thought decisions were really conscious acts, would it be possible to perform any act that you hadn’t thought about in the context of yourself, using no criteria at all to make the decision? Because, if you applied criteria, then the criteria ruled the decision; and if you overruled the decision to express your free-will, then another set of criteria will have been brought into play, no? In the end, either decisions are governed by algorithms (criteria, circumstances, etc.) or they’re random. Then the question becomes: is it really random, or does it just look that way? But, seeing as thinking is subliminal, the debate is moot.
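The either-algorithms-or-random dichotomy can be put in a toy sketch. This is my own illustration, nothing to do with actual neurophysiology; the function names are invented for the example:

```python
import random

# Toy model of the argument: every "decision" is either the output of
# criteria (an algorithm) or a random pick. There is no third option
# for a criteria-free yet non-random "free" chooser.
def decide(options, criteria=None):
    if criteria:
        # Criteria rule the decision: the outcome is fully determined.
        return max(options, key=criteria)
    # No criteria at all: the only thing left is randomness.
    return random.choice(options)

# With criteria supplied, the choice follows from them every time:
print(decide(["run", "freeze"], criteria=lambda o: o == "run"))  # prints "run"
```

The point of the sketch is that "overruling" a decision just means handing `decide` a different `criteria` function; it never escapes the structure.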

Mostly, though, it’s moot because of logical necessity. There is no will outside physical reality. Like the soul, there is no mysterious force out there separate from the physical universe. Thinking, in the end, is the same old communication by chemistry and electricity; there’s no room for any other force. There’s nothing like “will” floating around the Universe.

The trail goes:

Apparently, all intracellular communication is done by chemical-electrical means. This is true, apparently, for all living things, animal or vegetable. At the molecular level, all elements in a living body strictly follow the laws of chemistry and physics. As far as we know, all thought is generated out of the same chemical-electric matrix; meaning, inevitably, that anything that could be considered “will” comes out of it as well. Any choice made has to be generated out of that matrix, strictly following the laws of chemistry and physics. There is no provision in physics for a fifth force, that of free-will, to be generated.

Free-will is always posited as something, if not unique to humans, then restricted to a very few animals. Presumably, the other animals have predetermined wills. But forming a new type of desire implies a new force, a deus ex machina, even though it’s hard to see how there can be two independent types of wills.

One wonders, for example, whether all choices are either free-willed or innate. Could a person have within them both free and non-free will? The reaction to duck, for example, would seem mostly innate, although one can, of course, stand one’s ground and get clobbered. Exactly when did free-will arise in the evolution of animals? Why was it selected for? How does free-will improve one’s survival rate? All those problems pile up until it becomes clear that there are not two types of will, two strategies for making choices in the world. It becomes clear that human neurophysiology functions the same way all other animals’ neurophysiology operates.

I recently had a long debate about this via Facebook with my son. He is of the free-will school and argued for it vigorously. He’s very bright and articulate, but he could never see the mystical nature of free-will, its magical appearance on the stage with people. He cleaved to the belief that somehow people had grown either a second will or an entirely new one, although he was never able to explain its mechanics, its reasoning, or its functioning. In the end, he exclaimed that I was arguing against free-will as a way to “excuse humans for their sins”; to which I replied, “But of course.” Sins are as much an invention of religion as free-will; they aren’t real either. I don’t think my son considers himself at all religious, even though he holds to those religious beliefs of sin and free-will. It’s hard to escape the reach of Christianity.

Free-will fails because it has no definition; no one knows what it is.

The farming addendum:

A classic example of misunderstanding the nature of natural selection occurs with the spread of farming, something which has bothered researchers for a long time. As Jared Diamond ably demonstrated, farming was decidedly bad for the individual; so the question has long been posed: why was it adopted if it was so maladaptive?

Because, of course, it wasn’t maladaptive. It was only maladaptive for the individual, and not across the board, at that. The rich did quite well, thank you. More importantly, the species prospered even if the individuals did not. Farming most likely didn’t spread because people saw its advantages and adopted it; farming spread because it grew so many people and organized them so tightly that they could whomp the shit out of the hunter-gatherers. Think of the Europeans coming to America. Think of the Americans moving west. Both were done, essentially, behind armies. The difference between the Pilgrims and the Huns is who wrote the history.

Why the first group adopted farming will probably be forever debated; but once it got going, it was a juggernaut, and soon the people who were predisposed to a settled agricultural life swamped those who were not. The hunter-gatherers got selected away by the species.

The lesson to take away is that evolution happens to the species, not the individual.

Sunday, December 2, 2012


“Analysis of skeletal remains found in an island cave in Favignana, Italy, has revealed that modern humans first settled in Sicily around the time of the last Ice Age and despite living on islands, ate little seafood.”

This from 2/12/12: “Study Reveals Origins and Food Habits of First Sicilians.”

An interesting, anomalous finding. The dates we’re talking about are “19,000-26,500 years ago when sea levels were low enough to expose a land bridge between the island and the Italian peninsula,” according to Dr. Marcello Mannino of the Max Planck Institute for Evolutionary Anthropology, lead author.

It’s hard to know how to place this study in the context of such early findings of human habitation on Crete, dating back more than 100,000 years. Why such a radical difference for two Mediterranean islands, especially when Crete is so much farther from the mainland? Anyone getting to Crete had to do so by boat or raft of some sort. If they could get to Crete, why not Favignana?

The abstract says: “This dietary change was similar in scale to that at sites on mainland Sicily and in the rest of the Mediterranean.” I’m not sure what they mean by “scale.” This was said in the context of a “slight increase in marine food consumption from the late Pleistocene to the early Holocene.” I don’t know if the authors are saying that the rest of the Mediterranean peoples ate a meat-heavy diet, as well, or that the shift in diets was slight in the rest of the Mediterranean, too.

Is this true of the folks on Crete, too? What did they eat? And where did they sail from? There are so many parts of the story yet to be filled in. Has anyone done a dietary comparison for all fossil remains? What do we think those piths and erecti were eating?

Who were these Sicilians? Where did they come from?