On the Existence of Vampires


Obviously, vampires don't exist. And by vampires, I don't mean goth kids with white makeup and fitted fangs who drink blood for fun -- they do exist. I won't deny that. I mean a creature that survives on blood, is immortal, and can turn others into vampires by biting them. These vampires are just a myth, popularized by literature and film over the past couple hundred years.

Even though we know they don't exist, our friends at the Committee for Skeptical Inquiry have posted an article, in honor of Halloween, that takes the folklore on ghosts, vampires, and zombies seriously and logically debunks the existence of each.

One particularly interesting argument the article makes is that vampires mathematically can't exist: each feeding would create a new vampire and reduce the human population by one, and this process would multiply the vampires until no humans remained:
Let us assume that a vampire need feed only once a month. . . Now, two things happen when a vampire feeds. The human population decreases by one and the vampire population increases by one. Let us suppose that the first vampire appeared in 1600 c.e. . . . For January 1, 1600, we will accept that the global population was 536,870,911. . . .

On February 1, 1600, one human will have died and a new vampire will have been born. This gives two vampires and 536,870,911 − 1 humans. The next month, there are two vampires feeding, thus two humans die and two new vampires are born. This gives four vampires and 536,870,911 − 3 humans. Now on April 1, 1600, there are four vampires feeding and thus we have four human deaths and four new vampires being born. This gives us eight vampires and 536,870,911 − 7 humans. . .

This sort of progression is known in mathematics as a geometric progression—more specifically, it is a geometric progression with ratio two, since we multiply by two at each step. A geometric progression increases at a tremendous rate, a fact that will become clear shortly. Now, all but one of these vampires were once human, so that the human population is its original population minus the number of vampires excluding the original one. So after n months have passed, there are 536,870,911 − 2^n + 1 humans. The vampire population increases geometrically and the human population decreases geometrically. . . . We conclude that if the first vampire appeared on January 1, 1600, humanity would have been wiped out by June of 1602, two and a half years later. . .

We conclude that vampires cannot exist, since their existence would contradict the existence of human beings. Incidentally, the logical proof that we just presented is of a type known as reductio ad absurdum, that is, “reduction to the absurd.” Another philosophical principle related to our argument is the truism given the elaborate title, the anthropic principle. This states that if something is necessary for human existence then it must be true since we do exist. In the present case, the nonexistence of vampires is necessary for human existence. Apparently, whoever devised the vampire legend had failed his college algebra and philosophy courses.
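The arithmetic itself holds up. Here is a minimal sketch in Python (mine, not the article's) that simply replays the quoted assumptions: one feeding per vampire per month, every victim rising as a new vampire, and a starting population of 536,870,911, which is 2^29 − 1. Humanity hits zero after exactly 29 months, i.e., June 1602:

```python
# Replay of the article's doubling argument: one feeding per vampire
# per month, and every feeding converts the victim into a vampire.
humans = 2**29 - 1   # 536,870,911, the population the article assumes for Jan 1, 1600
vampires = 1
months = 0           # months elapsed since January 1, 1600

while humans > 0:
    humans -= vampires   # each vampire drains (and converts) one human
    vampires *= 2        # every victim rises as a new vampire
    months += 1

print(months)  # 29, i.e., June 1602, matching the article's conclusion
```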

This is certainly mathematically correct, but the logic is faulty. I posit that vampires could exist (mathematically, at least) because it would be relatively easy for them to slow or stop the conversion of humans into vampires without starving themselves: human blood farming.

Call me crazy, but if I were a vampire, I would rather have a person around who could give me blood continually without becoming a vampire himself and thereby losing the ability to feed me. I mean, if a cow became human every time you drank milk from the teat, you would probably start pumping the milk without making lip-to-teat contact. Right??
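To put a toy model behind the blood-farming idea: a minimal sketch (again mine; the conversion_rate and birth_rate knobs are invented for illustration, not taken from the article) shows that the geometric collapse depends entirely on every feeding producing a new vampire. Set the conversion rate to zero, so feedings are farmed donations rather than bites that turn, and the donor pool never shrinks:

```python
def simulate(months, conversion_rate, birth_rate=0.0):
    """Return the month humanity dies out, or None if it survives."""
    humans, vampires = 2**29 - 1, 1.0
    for month in range(1, months + 1):
        converted = vampires * conversion_rate      # feedings that turn the victim
        humans += humans * birth_rate - converted   # births in, converted donors out
        vampires += converted
        if humans <= 0:
            return month
    return None

print(simulate(600, conversion_rate=1.0))  # 29: the article's June 1602 collapse
print(simulate(600, conversion_rate=0.0))  # None: farmed donors, humanity persists
```

The particular numbers don't matter; the point is that the reductio only goes through if turning the victim is mandatory.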

Of course, this is all ridiculous mental masturbation, and if you've made it this far, I commend you, and I apologize. Congratulations and Happy Halloween!