So: All humans descend from a common ancestor. Or so many people believe. Literalist believers in the Old Testament would say Adam and Eve are common ancestors of us all. Believers in evolution go further and believe all life on Earth descends from a common ancestor. I’m in the latter group.

When did the most recent common ancestor (MRCA) of all humans live? Simulations suggest it was more recently than you might expect: 2000 to 5000 years ago. (It’s been argued that all persons of European descent share a common ancestor only about 1000 years ago, and genetic analysis [paper, FAQ] seems to support that.) Furthermore, going only a few thousand more years back, you arrive at a time when literally every person then alive fell into one of two categories: Those who are ancestors of *no one* alive today, and those who are ancestors of *everyone* alive today. I don’t know if there’s a term for that state, so I’ll make one up: Universal common ancestry (UCA). Understand these are probabilistic claims; it would certainly be *possible* that 20,000 years ago there were people whose living descendants are some but not all of us. But the probability of such a thing is so small it is almost indistinguishable from zero: Something that improbable really cannot be expected to happen even once in the history of the universe. And even if it did, another few thousand years back a non-UCA state would be even more unlikely than that. The probability of UCA back then, and any time previous to that, is absolute certainty for all practical purposes.
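To get a feel for why these are near-certainties rather than mere likelihoods, here's a toy simulation in the spirit of Chang's random-mating model (a deliberate oversimplification: no geography, constant population size, and all parameter choices here are mine, not from the papers cited above). Each individual draws two parents uniformly at random from the previous generation, and we count generations back until someone is an ancestor of the entire present-day population.

```python
import random

def generations_until_common_ancestor(pop=200, seed=1):
    """Generations back until some individual is an ancestor of the
    whole present-day population, in a toy random-mating model."""
    rng = random.Random(seed)
    # descendants[i]: which present-day individuals descend from
    # individual i of the generation currently being examined.
    descendants = [{i} for i in range(pop)]
    gen = 0
    while not any(len(d) == pop for d in descendants):
        gen += 1
        parents = [set() for _ in range(pop)]
        for d in descendants:
            # each child hands its present-day descendants to two
            # randomly chosen parents in the generation before
            for p in rng.sample(range(pop), 2):
                parents[p] |= d
        descendants = parents
    return gen

print(generations_until_common_ancestor())  # typically around log2(pop)
```

In this model the MRCA shows up only about log2(pop) generations back, and universal common ancestry follows not long after that, which is why the real-world estimates are so startlingly recent.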

Of course this doesn’t just apply to humans: We can talk about the MRCA of all chimpanzees alive today, or all hominins (humans and chimpanzees), or all ruby-throated hummingbirds, or whatever.

Now, here’s the weird thing: Long ago, 5 million years or more, amongst members of the species Nakalipithecus or something like it, there lived an individual (call them Nico) who had (at least) two offspring, Chris and Harley. They were Nakalipithecus (or whatever) too, of course, and they mated with other Nakalipithecus and had Nakalipithecus children, who had more Nakalipithecus children, who had … well, you get it. But here’s the thing: *All of Chris’s living descendants are chimpanzees. All of Harley’s are human.*

That might surprise you. And think how surprised Chris and Harley would be.

But it’s true, and it’s provable (in, again, a certain-for-all-practical-purposes sense).

Nico, you see, was the MRCA of all living hominins. Nico has all living humans and all living chimps as descendants, and since Nico is the *most recent* common ancestor, no one in any subsequent generation can make the same claim.

Now of course much more recent individuals are common ancestors of all living humans (and no chimps), or of all living chimps (and no humans). How about ancestors of all living humans and *some* living chimps? Or vice versa? But no. We have UCA for humans anytime before several thousand years ago, and I’m sure chimp UCA must be even more recent than that. More than a few thousand years ago, let alone a few million, anyone who was an ancestor of *some* living chimps was an ancestor of *all* living chimps. And likewise for humans, so any individual with *any* living humans and *any* living chimps among their descendants would have to be a common ancestor of *all* hominins. And Nico was the MRCA, so anyone born after Nico must have no living descendants, or humans but no chimps, or chimps but no humans.

We know Nico must have had two or more offspring. If they had only one, then that one would have been a common ancestor of all hominins and Nico would not be the MRCA. One of them was Chris, and the other was Harley… and they were born after Nico (duh). The argument above applies to *anyone* born after Nico, and that includes Chris and Harley. One had to be an ancestor of all living chimps (but no living humans) — that’s Chris — and the other the ancestor of all living humans (but no living chimps) — that’s Harley.

A scenario in which that makes sense is one like this. Seven million years ago, a group of Nakalipithecus woke up one morning, checked Facebook, went to work, or went out to play, or whatever, not realizing that fifty miles away there was a great huge lake held back by a natural dam which had been slowly eroding. That day the dam broke. There was an enormous flood, many of the Nakalipithecus died, and the ones that survived found themselves on two sides of a new, permanent body of water. And the topography was such that none of them were able to cross the water or go around it, so the survivors on one side never again encountered the survivors on the other.

All right, this is not a particularly likely scenario, but it’s a thought experiment, okay?

Nico, alas, died in the flood, but not before saving the life of Chris. Harley, meanwhile, was swept off and ended up on the other side of the water. Within a few thousand years all the Nakalipithecus on one side were descendants of Chris (but not Harley) and all on the other side were descendants of Harley (but not Chris). The two groups evolved apart and eventually gave rise to chimps on one side, humans on the other.

Granted, speciation doesn’t usually work like that. Much of the time geographical separation is a factor in speciation, but not necessarily all the time, and in any case the geographic separation usually isn’t instantaneous. Mountains rise slowly, rivers change course over centuries, and so on.

But it doesn’t matter whether the two groups are sundered in hours or eons. Or even if they are geographically sundered at all. The point is, for whatever reason, two groups evolve away from one another. At the start they are one species and they interbreed frequently, but the frequency of that decreases more and more. And it may decrease over a very long time period, *but*: There’s a single, momentary last time. A last time a member of one of the groups breeds with a member of the other group. And not long before that, a last time one individual breeds with a member of one group while their sibling breeds with a member of the other, and they have fertile offspring leading to descendants alive today. Their parent is our common ancestor, and the common ancestor of all chimps too. Their children, though, may be your ancestors, or may have been Washoe’s, but not both.


See that? As increases, approaches integer values. Odd, huh? Why does it do that?

Despite what should have been a dead giveaway hint, I didn’t figure out the approach revealed in this post. Embarrassing. But having failed on the insight front, what can I do on the obvious generalization front?
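A concrete stand-in example of the phenomenon (my choice of numbers, not necessarily the post's): take x = 2 + √3. Then x**n + (2 − √3)**n is exactly an integer for every n, because the odd powers of √3 in the two binomial expansions cancel; and since |2 − √3| ≈ 0.268 is less than 1, that second term shrinks to zero, so x**n approaches integers:

```python
from math import sqrt

x = 2 + sqrt(3)   # conjugate 2 - sqrt(3) ~ 0.268 has magnitude < 1
for n in range(1, 9):
    p = x ** n
    # show the power and its distance to the nearest integer
    print(n, round(p, 6), round(abs(p - round(p)), 6))
```

The distance to the nearest integer shrinks by a factor of about 0.268 per step, which is just |2 − √3| doing its vanishing act.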

Let’s think about quantities of the form

where , , , , and are nonzero integers; is in lowest terms and , , and . For now let’s also restrict to primes.

To investigate that we’ll consider

The complex quantities lie on the unit circle in the complex plane and are the vertices of an -gon. Using the binomial expansion, the sum is

or

Now, for the terms where is a multiple of , is equal to 1 and the sum over equals .

Otherwise, we’re summing over the points on the unit circle:

which is the sum of a geometric series so

For instance, when , the sum is . When , it’s .

All right then. This means we keep only the terms where is a multiple of :

which is an integer. Call it . Then

or

So for large , approaches an integer if and only if the magnitudes of all the quantities have magnitude .

For instance: Let , . Then

Now in the case ,

The magnitude of for . In fact it’s zero for , but then is an integer anyway; point is, or works: and both approach integers (the former much more quickly than the latter).

How about ? Then

By symmetry the magnitudes of the two complex numbers are the same, so what we need is

or

So there are no integer values of that give convergence to an integer for . It seems evident the same is true for all .


This shows as horizontal bars the range of “fifths” (or generators) for which *n* generators give an adequate major third. “Adequate” here means within 25 cents, and *n* is the number on the vertical axis; negative *n* means fifths downwards, positive *n* is fifths upwards. The black line is at 702 cents, a just perfect fifth. The bar colors just help distinguish different values of *n* (and make it a little less boring than if everything were blue).

So you can see that for just fifths, four upward fifths or eight downward fifths will work. The range where four upward fifths will work is larger, but still only 12.5 cents wide. The just fifths line is very close to the upper edge of the band, and once you go beyond it in the positive direction, there’s a small gap before you get to the range where nine upward fifths give a major third. In the other direction, ten downward fifths work, but only after a somewhat larger gap where nothing smaller than eleven fifths will do it.

Around 800 cents you see lots of bars. I wrote before about how near 720 cents the number of fifths needed for a major third is very large, because 720 cents generates a 5-equal scale with no adequate thirds. On the other hand 800 cents generates a 3-equal scale with an adequate third (400 cents), so *n* = -10, -7, -4, -1, 2, 5, 8 (and maybe beyond) are all solutions there, and nearby. (Similarly, near 700 cents, you have *n* = -8 and 4.)


The diatonic scale most western music’s been based on for the past couple millennia comes from ancient Greece; the Greeks developed it by tuning their tetrachords using seven consecutive notes on a circle of fifths — F, C, G, D, A, E, B, for instance. Using a just perfect fifth of 701.96 cents (Pythagorean tuning), the A (four fifths up from the F) is 2807.82 cents above F, or, dropping it down two octaves, 407.82 cents. This is not a particularly great approximation to a just major third, 386.31 cents, but it’s fairly close. Close enough that when music written in octaves or fifths or fourths gave way to use of thirds, musicians developed other ways of tuning diatonic scales for better thirds rather than dumping the entire system. What we’ve ended up with is equal temperament, with 700.00 cent fifths, four of which make a 400.00 cent major third.

So for many centuries we’ve been using a scale that was based on octaves and fifths to make music that uses thirds, because it happens to contain thirds that are close enough to just. Now, it’s a little odd to be using probabilistic terms to talk about simple arithmetic — two and two isn’t likely to be four, it *is* four — but I think you know what I mean when I ask, how likely is that? How lucky did we get?

It’s a hard question to quantify. But let’s generously say a major third is adequate if it’s within 25 cents of just. That’s a 50 cent range you want four fifths to fall into, so a single fifth has to be within a range a quarter of that size, 12.5 cents. If you didn’t know beforehand the size of a just perfect fifth, but knew it was somewhere between 650 and 750 cents, you might guess the odds of four fifths making a major third would at most be 12.5/100, or one in eight. Though worse, maybe, because the 12.5 cent range where it works might not be entirely contained within 650 to 750. In fact it might not overlap that range at all. (Though in actuality the range is from 690.33 to 702.83 cents.)
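That 690.33-to-702.83 window is straightforward to check: four fifths, dropped down two octaves, must land within 25 cents of the just major third. In code (constants from the text):

```python
JUST_THIRD = 386.31   # cents
TOLERANCE = 25.0      # "adequate" means within this many cents of just
OCTAVE = 1200.0

# Four fifths minus two octaves must fall in JUST_THIRD +/- TOLERANCE,
# so a single fifth must fall in a window one quarter that size:
lo = (JUST_THIRD - TOLERANCE + 2 * OCTAVE) / 4
hi = (JUST_THIRD + TOLERANCE + 2 * OCTAVE) / 4
print(f"{lo:.2f} to {hi:.2f} cents")  # → 690.33 to 702.83 cents
```

The window is 50/4 = 12.5 cents wide, as claimed.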

On the other hand, maybe the 25 cent range where two “fifths” make a major third does overlap, or the 16.7 cent range where three “fifths” will work does. So the odds of four *or fewer* fifths making an adequate major third might be a little better. Still seems small though.

Oh… but you also want to consider four or fewer downward fifths, or equivalently, four or fewer upward fourths. That improves the odds.

So let’s do a little simulation. Pick a number in the range, say, from 650.0 to 750.0 and see how many fifths, up or down, it takes to make an adequate third. Repeat, and get the distribution. Then ask about things like the average number of fifths needed.

There’s a difficulty here, though: Sometimes the answers get very large. Think about 720.00 cents. The notes that “fifth” generates are 720.0, 240.0, 960.0, 480.0, 0.0, 720.0… and it just repeats those five notes over and over; 720.00 generates a 5-equal scale. None of those notes is an adequate third, so you can run forever looking for it.

Of course you have pretty much zero chance of picking 720.00 at random, but if you pick, say, 719.932771, you’ll have to add a lot of fifths before hitting an adequate third. (1099 of them, looks like.) You’ll get occasional large numbers, then, and they’ll have a big impact on the mean value. The answer you get will fluctuate a lot depending on which large numbers you end up with.

This is why medians were invented.

So I wrote a little Python script to do this. If you take the range of possible “fifths” as 650 to 750 cents, then there’s about a 22% chance four or fewer fifths, up or down, will produce an adequate third. The median number of fifths required to make an adequate third: 11.
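The script itself isn't included here, but a sketch of the kind of thing it does might look like this (my reconstruction; the search cap, sample count, and other details are my guesses, so the exact figures it produces can differ somewhat from those quoted):

```python
import random
from statistics import median

JUST_THIRD, TOL = 386.31, 25.0  # cents; "adequate" = within TOL of just

def fifths_needed(g, cap=2000):
    """Smallest number of fifths of size g cents, stacked up or down
    and octave-reduced, landing within TOL cents of a just major third.
    Returns cap if none is found (e.g. g = 720, the 5-equal generator)."""
    for n in range(1, cap):
        for note in ((n * g) % 1200.0, (-n * g) % 1200.0):
            if abs(note - JUST_THIRD) <= TOL:
                return n
    return cap

rng = random.Random(0)
counts = [fifths_needed(rng.uniform(650.0, 750.0)) for _ in range(10000)]
print("median fifths needed:", median(counts))
print("P(4 or fewer):", sum(c <= 4 for c in counts) / len(counts))
```

Run with these choices it reproduces the qualitative picture: most random "fifths" need far more than four steps to reach an adequate third, and a few (near 720 cents) never get there at all.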

I think it’s safe to say if you needed 11 perfect fifths to make an adequate major third, the system upon which western music developed would have been entirely different. Different how, I have no idea, but different. Needing only four fifths was a “lucky” break… not win-the-lottery lucky, but definitely beating-the-odds lucky.


If you run them in B37c/S23, the pre-loaf stabilizes quickly, but the pi takes a while — and some space. It needs 110 generations to settle down.

If you run them in B37e/S23, though, the pre-loaf just becomes a loaf immediately and the pi stabilizes much more quickly, in only 23 generations, and without spreading out so much.

So based on that, maybe it’s plausible B37c/S23 could be explosive while B37e/S23 isn’t. And then that might help account for why B37/S23 is explosive while B36/S23 and B38/S23 aren’t.

But this is still rather hand-wavy. Plausible is one thing, proved is something else entirely. First, it’s not just what happens to these objects that matters, but also how likely they are to arise in the first place. No matter how long and over what area a pi evolves, it won’t cause anything if it never crops up in the first place.

Second, in addition to pre-loafs and pis, you also need to consider larger sub-patterns with pre-loaves and pis embedded in them. How likely are they and what do they do under the two rules?

Anyway, proof or no proof, B37e/S23 really isn’t explosive. That means — unlike B37/S23 — it can be explored with a soup search. But apgmera, the C++ version 3 of apgsearch, can’t handle non-totalistic rules. The Python version 1.0 can, or rather a hacked version of it can. Slowly! I’ve been running it and it’s done 2 or 3 soups per second, about 3 orders of magnitude slower than apgmera can run B3/S23. ~~And for some reason it seemed not to be sending results to Catagolue.~~*

But anyway, 11c/52 diagonal puffers cropped up several times, laying trails of blocks and loaves:

This puffer evolves from this 7-cell object:

And if there’s a suitably placed ship nearby, it becomes an 11c/52 diagonal spaceship. This has cropped up in several soups.

*Edit:* My B37e/S23 hauls *are* on Catagolue, but under the enharmonic name B37-c/S23.


Fairly typical. I’ve seen some soups take several thousand generations to stabilize in B38/S23, and I’ve seen a few — *very* few — stabilize in B37/S23. But most soups stabilize in 1000 generations or so in B36/S23 and B38/S23… and almost all soups explode in B37/S23.

Does that make any sense to you? Explain it to me, then.

I guess you can wave your hands and say “well, if you have few births you don’t get explosive behavior, and if you have many births in some small region you get momentary overpopulation which then crashes in the next generation, but somewhere in the middle there’s a point where you have enough births to keep growth going but not enough crashes to stop it, and it explodes”. But is that the best we can do at understanding this?

*Edit:* In fact, that lame explanation seems even lamer when you consider this: The non-totalistic rule B37c/S23 (meaning birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor in the corner of the neighborhood) is explosive, but B37e/S23 (birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor on the edge of the neighborhood) isn’t.


But beyond a few observations like that, it’s hard to predict. At least for me.

Consider the rule B34/S456, for a semi-random example. Start with a 32 by 32 soup at 50% density:

Then let it run for 1000 generations. It expands to a blob 208 by 208 in size, population 21,132:

But change the B34/S456 rule to B3/S456 or B4/S456 — removing one number or the other from the birth rule — and either way, the same initial configuration dies.

How about removing one number from the survival rule? B34/S56 collapses to a little period 2 oscillator:

And so does B34/S45. But B34/S46 dies.

Now try adding numbers to the birth rule. B345/S456 for instance:

At first it expands, but only until it makes this near-square shape. Its boundaries no longer move, but oscillate with period 2. The interior is mostly static but there are a couple of regions that oscillate with period 3, so the whole thing is a p6 oscillator.

With B346/S456, though, we’re back to infinitely growing irregular blob behavior. Population 53,457. How about B347/S456?

Likewise, but slower growing; population only 15,686. Want to guess at B348/S456?

Blob again, but population only 9672. Not too surprising. Except that both B347/S456 and B348/S456 grow *slower* than B34/S456, even though both differ only in that there are *more* ways to produce a birth!

Finally let’s add numbers to the survival rule. B34/S4567 first:

Population 196,655. If you see faint squares on that blob it’s an artifact of the image reduction. Click on the image to see it at full size. It’s interesting: there are chaotic regions interspersed with domains where the live cells are arranged regularly. Most of the interior cells are stable or oscillate with period 2, and the only real evolution occurs at the boundary of the blob. That’s true of the B34/S4568 blob too:

But here the growth rate is much smaller — the population’s only 23,299 — and the interior of the blob’s pretty random with only very small regular domains.

So twelve rules: B34/S456, B4/S456, B3/S456, B34/S56, B34/S46, B34/S45, B345/S456, B346/S456, B347/S456, B348/S456, B34/S4567, B34/S4568. All are similar in that the last eleven differ from the first only by the addition or removal of a single number. Yet three die, two collapse to a small oscillator, one shows bounded growth, and the other six show unbounded growth — not only at different rates but with different behavior: unstable interior versus stable or oscillating interior; random patterns versus random mixed with regular. There’s the seeming paradox that adding a digit to the rule can produce something that grows more slowly, or can even stop it from growing.
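All of these rules are outer-totalistic, so they're easy to play with in a few lines of Python. Here's a minimal unbounded-grid stepper (my own illustration, not the tooling used for the experiments above) that accepts any B/S rule as sets of neighbor counts:

```python
from collections import Counter
import random

def step(live, birth, survive):
    """One generation of an outer-totalistic Life-like rule on an
    unbounded grid; `live` is a set of (x, y) cells."""
    # Count live neighbors of every cell adjacent to a live cell.
    neigh = Counter((x + dx, y + dy)
                    for x, y in live
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
    return {cell for cell, n in neigh.items()
            if n in (survive if cell in live else birth)}

# A 32-by-32 soup at 50% density, as in the experiments above:
rng = random.Random(0)
cells = {(x, y) for x in range(32) for y in range(32) if rng.random() < 0.5}
for _ in range(100):
    cells = step(cells, birth={3, 4}, survive={4, 5, 6})  # B34/S456
print(len(cells))  # population after 100 generations
```

Swap in any of the twelve rules (e.g. `birth={3, 4, 8}` for B348/S456) to see the divergent behaviors for yourself; pure Python is slow, but fine at this scale.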

Add or subtract a second digit and try guessing what’ll happen. I don’t guess correctly very often.
