Fifths get lucky (part 2)

I’ve been learning a little about pyplot, and I’ve drawn a diagram:

figure_1

This shows as horizontal bars the range of “fifths” (or generators) for which n generators give an adequate major third. “Adequate” here means within 25 cents, and n is the number on the vertical axis; negative n means fifths downwards, positive n is fifths upwards. The black line is at 702 cents, a just perfect fifth. The bar colors just help distinguish different values of n (and make it a little less boring than if everything were blue).
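
If you’re curious, the diagram takes only a dozen or so lines of pyplot. This isn’t the exact script I used, but a sketch of the idea: for each n, scan a fine grid of generators, find where n of them (reduced by octaves) land within 25 cents of a just major third, and draw each such run as a horizontal bar.

    import numpy as np
    import matplotlib.pyplot as plt

    JUST_THIRD, TOL = 386.31, 25.0
    fifths = np.linspace(650, 750, 20001)        # candidate generators, in cents

    fig, ax = plt.subplots()
    for n in range(-12, 13):
        if n == 0:
            continue
        adequate = np.abs((n * fifths) % 1200.0 - JUST_THIRD) <= TOL
        # turn each contiguous run of adequate generators into a bar at height n
        d = np.diff(np.concatenate(([0], adequate.astype(np.int8), [0])))
        starts, ends = np.flatnonzero(d == 1), np.flatnonzero(d == -1)
        spans = [(fifths[s], fifths[e - 1] - fifths[s]) for s, e in zip(starts, ends)]
        if spans:
            ax.broken_barh(spans, (n - 0.4, 0.8), facecolors=f"C{n % 10}")
    ax.axvline(701.96, color="black")            # just perfect fifth
    ax.set_xlabel("generator (cents)")
    ax.set_ylabel("number of fifths (negative = downward)")
    plt.show()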

So you can see that for just fifths, four upward fifths or eight downward fifths will work. The range where four upward fifths will work is larger, but still only 12.5 cents wide. The just fifths line is very close to the upper edge of the band, and once you go beyond it in the positive direction, there’s a small gap before you get to the range where nine upward fifths give a major third. In the other direction, ten downward fifths work, but only after a somewhat larger gap where nothing smaller than eleven fifths will do it.

Around 800 cents you see lots of bars. I wrote before about how near 720 cents the number of fifths needed for a major third is very large, because 720 cents generates a 5-equal scale with no adequate thirds. On the other hand 800 cents generates a 3-equal scale with an adequate third (400 cents), so n = -10, -7, -4, -1, 2, 5, 8 (and maybe beyond) are all solutions there, and nearby. (Similarly, near 700 cents, you have n = -8 and 4.)
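
The step of three, by the way, is because three 800-cent fifths add up to 2400 cents, exactly two octaves, so adding three more fifths to any solution gives another solution. A quick check:

    for n in (-10, -7, -4, -1, 2, 5, 8):
        print(n, (n * 800) % 1200)   # each prints 400, an adequate third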


perfect_fifth_on_c

Fifths get lucky

I’ve been thinking more about musical tuning than CAs lately. For instance, four perfect fifths make (more or less) a major third. How crazy is that?

The diatonic scale most western music’s been based on for the past couple millennia comes from ancient Greece; they developed it by tuning their tetrachords using seven consecutive notes on a circle of fifths — F, C, G, D, A, E, B, for instance. Using a just perfect fifth of 701.96 cents (Pythagorean tuning), the A (four fifths up from the F) is 2807.82 cents above F, or, dropping it down two octaves, 407.82 cents. This is not a particularly great approximation to a just major third, 386.31 cents, but it’s fairly close. Close enough that when music written in octaves or fifths or fourths gave way to use of thirds, musicians developed other ways of tuning diatonic scales for better thirds rather than dumping the entire system. What we’ve ended up with is equal temperament, with 700.00 cent fifths, four of which make a 400.00 cent major third.
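
If you want to check numbers like these yourself: a frequency ratio r is 1200·log2(r) cents, the just perfect fifth is the ratio 3/2, and the just major third is 5/4. A few lines of Python do it:

    from math import log2

    def cents(ratio):
        return 1200 * log2(ratio)

    fifth = cents(3 / 2)                      # ~701.96, the just perfect fifth
    third = cents(5 / 4)                      # ~386.31, the just major third
    pythagorean_third = (4 * fifth) % 1200    # four fifths up, dropped down two octaves
    print(fifth, third, pythagorean_third)    # the last is ~407.82, about 21.5 cents sharp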

So for many centuries we’ve been using a scale that was based on octaves and fifths to make music that uses thirds, because it happens to contain thirds that are close enough to just. Now, it’s a little odd to be using probabilistic terms to talk about simple arithmetic — two and two isn’t likely to be four, it is four — but I think you know what I mean when I ask, how likely is that? How lucky did we get?

It’s a hard question to quantify. But let’s generously say a major third is adequate if it’s within 25 cents of just. That’s a 50 cent range you want four fifths to fall into, so a single fifth has to be within a range a quarter of that size, 12.5 cents. If you didn’t know beforehand the size of a just perfect fifth, but knew it was somewhere between 650 and 750 cents, you might guess the odds of four fifths making a major third would at most be 12.5/100, or one in eight. Though worse, maybe, because the 12.5 cent range where it works might not be entirely contained within 650 to 750. In fact it might not overlap that range at all. (Though in actuality the range is from 690.33 to 702.83 cents.)
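
That last range is easy to work out by hand: it’s just the 50-cent window around the just third, shifted up two octaves and divided by four.

    just_third, tolerance = 386.31, 25.0
    low = (just_third - tolerance + 2400) / 4    # ~690.33 cents
    high = (just_third + tolerance + 2400) / 4   # ~702.83 cents
    print(low, high)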

On the other hand, maybe the 25 cent range where two “fifths” make a major third does overlap, or the 16.7 cent range where three “fifths” will work does. So the odds of four or fewer fifths making an adequate major third might be a little better. Still seems small though.

Oh… but you also want to consider four or fewer downward fifths, or equivalently, four or fewer upward fourths. That improves the odds.

So let’s do a little simulation. Pick a number in the range, say, from 650.0 to 750.0 and see how many fifths, up or down, it takes to make an adequate third. Repeat, and get the distribution. Then ask about things like the average number of fifths needed.

There’s a difficulty here, though: Sometimes the answers get very large. Think about 720.00 cents. The notes that “fifth” generates are 720.0, 240.0, 960.0, 480.0, 0.0, 720.0… and it just repeats those five notes over and over; 720.00 generates a 5-equal scale. None of those notes is an adequate third, so you can run forever looking for it.

Of course you have pretty much zero chance of picking 720.00 at random, but if you pick, say, 719.932771, you’ll have to add a lot of fifths before hitting an adequate third. (1099 of them, looks like.) You’ll get occasional large numbers, then, and they’ll have a big impact on the mean value. The answer you get will fluctuate a lot depending on which large numbers you end up with.

This is why medians were invented.

So I wrote a little Python script to do this. If you take the range of possible “fifths” as 650 to 750 cents, then there’s about a 22% chance four or fewer fifths, up or down, will produce an adequate third. The median number of fifths required to make an adequate third: 11.
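
I won’t reproduce the whole script, but it amounts to something like this (a sketch, not the exact code, so the numbers will wobble a little from run to run):

    import random

    JUST_THIRD = 386.31    # just major third, in cents
    TOLERANCE = 25.0       # "adequate" means within 25 cents

    def fifths_needed(fifth, max_n=100_000):
        """Smallest number of fifths, stacked up or down and reduced mod the octave,
        that lands within TOLERANCE of a just major third."""
        for n in range(1, max_n + 1):
            for signed in (n, -n):
                if abs((signed * fifth) % 1200.0 - JUST_THIRD) <= TOLERANCE:
                    return n
        return max_n   # effectively "never" (e.g. a generator of exactly 720.0 cents)

    samples = sorted(fifths_needed(random.uniform(650.0, 750.0)) for _ in range(20_000))
    print("four or fewer:", sum(n <= 4 for n in samples) / len(samples))
    print("median:", samples[len(samples) // 2])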

I think it’s safe to say if you needed 11 perfect fifths to make an adequate major third, the system upon which western music developed would have been entirely different. Different how, I have no idea, but different. Needing only four fifths was a “lucky” break… not win-the-lottery lucky, but definitely beating-the-odds lucky.

screen-shot-2016-09-07-at-11-44-59-pm

Changing the rules (part 3)

Take a look at a pre-loaf and a pi:

screen-shot-2016-09-07-at-11-37-22-pm

If you run them in B37c/S23, the pre-loaf stabilizes quickly, but the pi takes a while — and some space. It needs 110 generations to settle down.

If you run them in B37e/S23, though, the pre-loaf just becomes a loaf immediately and the pi stabilizes much more quickly, in only 23 generations, and without spreading out so much.

So based on that, maybe it’s plausible B37c/S23 could be explosive while B37e/S23 isn’t. And then that might help account for why B37/S23 is explosive while B36/S23 and B38/S23 aren’t.

But this is still rather hand-wavy. Plausible is one thing, proved is something else entirely. First, it’s not just what happens to these objects that matters, but also how likely they are to arise in the first place. No matter how long and over what area a pi evolves, it won’t cause anything if it never actually crops up.

Second, in addition to pre-loaves and pis, you also need to consider larger sub-patterns with pre-loaves and pis embedded in them. How likely are they and what do they do under the two rules?

Anyway, proof or no proof, B37e/S23 really isn’t explosive. That means — unlike B37/S23 — it can be explored with a soup search. But apgmera, the C++ version 3 of apgsearch, can’t handle non-totalistic rules. The Python version 1.0 can, or rather a hacked version of it can. Slowly! I’ve been running it and it’s done 2 or 3 soups per second, about 3 orders of magnitude slower than apgmera can run B3/S23. And for some reason it seemed not to be sending results to Catagolue.*

But anyway, 11c/52 diagonal puffers cropped up several times, laying trails of blocks and loaves:

screen-shot-2016-09-07-at-11-49-25-pm

This puffer evolves from this 7-cell object:

screen-shot-2016-09-07-at-11-54-50-pm

And if there’s a suitably placed ship nearby, it becomes an 11c/52 diagonal spaceship. This has cropped up in several soups.

screen-shot-2016-09-07-at-11-44-59-pm

*Edit: My B37e/S23 hauls are on Catagolue, but under the enharmonic name B37-c/S23.

Changing the rules (part 2a)

That lame explanation seems even more lame when you consider this: The non-totalistic rule B37c/S23 (meaning birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor in the corner of the neighborhood) is explosive, but B37e/S23 (birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor on the edge of the neighborhood) isn’t.
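
If the notation is confusing, here’s a tiny sketch of just the birth test, with the eight neighbors split into the four corner (diagonal) cells and the four edge (orthogonal) cells. (The function and its argument layout are mine, for illustration only, not from any particular CA package.)

    def born(corner_neighbors, edge_neighbors, variant):
        """corner_neighbors, edge_neighbors: four 0/1 values each.
        variant: "c" for B37c, "e" for B37e."""
        live = sum(corner_neighbors) + sum(edge_neighbors)
        if live == 3:
            return True                  # the B3 part, common to both rules
        if live == 7:
            # exactly one neighbor is dead; is it a corner cell or an edge cell?
            dead_is_corner = sum(corner_neighbors) == 3
            return dead_is_corner if variant == "c" else not dead_is_corner
        return False

    # seven live neighbors, with the single dead one on an edge:
    print(born([1, 1, 1, 1], [1, 1, 1, 0], "c"))   # False
    print(born([1, 1, 1, 1], [1, 1, 1, 0], "e"))   # True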

Screen Shot 2016-09-04 at 10.11.08 AM

Changing the rules (part 2)

Here’s an even more perplexing (to me, at least) instance of different CA behavior under similar-but-different rules. Consider this 32 x 32 soup:

Screen Shot 2016-09-04 at 10.14.02 AM

B36/S23 is a Life-like rule sometimes called HighLife. Many objects behave the same way as in Life; in particular, blocks, loaves, boats, and beehives are still lifes; blinkers are p2 oscillators; gliders are c/4 diagonal spaceships. So after 378 generations in B36/S23 when that soup looks like this, it’s stabilized:

Screen Shot 2016-09-04 at 10.10.26 AM

B38/S23 has no nickname I know of. Under that rule, the same soup stabilizes in 483 generations:

Screen Shot 2016-09-04 at 10.10.52 AM

And in B37/S23… here’s what it evolves to after 10,000 generations:

Screen Shot 2016-09-04 at 10.11.08 AM

Population 17,298 and growing, presumably forever.

Fairly typical. I’ve seen some soups take several thousand generations to stabilize in B38/S23, and I’ve seen a few — very few — stabilize in B37/S23. But most soups stabilize in 1000 generations or so in B36/S23 and B38/S23… and almost all soups explode in B37/S23.

Does that make any sense to you? Explain it to me, then.

I guess you can wave your hands and say “well, if you have few births you don’t get explosive behavior, and if you have many births in some small region you get momentary overpopulation which then crashes in the next generation, but somewhere in the middle there’s a point where you have not too few births but not enough crashes and it explodes”. But is that the best we can do at understanding this?

Edit: In fact, that lame explanation seems even more lame when you consider this: The non-totalistic rule B37c/S23 (meaning birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor in the corner of the neighborhood) is explosive, but B37e/S23 (birth occurs if there are 3 live neighbors, or if there are 7 live neighbors with the dead neighbor on the edge of the neighborhood) isn’t.

b345s456

Changing the rules

It surprises me how hard it can be to guess what kind of behavior a given CA rule will produce. There are some things that are fairly obvious. For instance, under a rule that doesn’t include births with fewer than 4 live neighbors, no pattern will ever expand past its bounding box. (Any empty cell outside the bounding box will have no more than 3 live neighbors, so no births will occur there.)

But beyond a few observations like that, it’s hard to predict. At least for me.

Consider the rule B34/S456, for a semi-random example. Start with a 32 by 32 soup at 50% density:

gen0

Then let it run for 1000 generations. It expands to a blob 208 by 208 in size, population 21,132:

b34s456

But change the B34/S456 rule to B3/S456 or B4/S456 — removing one number or the other from the birth rule — and either way, the same initial configuration dies.
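
These experiments are easy to replicate; a totalistic Life-like rule is only a few lines of numpy. Here’s a rough sketch (nothing optimized), with the rule given as its sets of birth and survival counts; swap in different sets to try the other rules below.

    import numpy as np

    def step(grid, births, survivals):
        """One generation of a totalistic Life-like rule; B34/S456 is
        births={3, 4}, survivals={4, 5, 6}."""
        g = np.pad(grid, 1)   # ring of dead cells, so births just outside the pattern count
        neighbors = sum(np.roll(np.roll(g, dy, 0), dx, 1)
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
        new = np.where(g == 1,
                       np.isin(neighbors, list(survivals)),
                       np.isin(neighbors, list(births)))
        live = np.argwhere(new)
        if live.size == 0:
            return np.zeros((1, 1), dtype=np.uint8)
        (r0, c0), (r1, c1) = live.min(0), live.max(0)
        return new[r0:r1 + 1, c0:c1 + 1].astype(np.uint8)   # crop to the bounding box

    # a 32 x 32 soup at 50% density, run for 1000 generations under B34/S456
    rng = np.random.default_rng(0)
    grid = (rng.random((32, 32)) < 0.5).astype(np.uint8)
    for _ in range(1000):
        grid = step(grid, births={3, 4}, survivals={4, 5, 6})
    print("bounding box:", grid.shape, "population:", int(grid.sum()))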

How about removing one number from the survival rule? B34/S56 collapses to a little period 2 oscillator:

b34s56

And so does B34/S45. But B34/S46 dies.

Now try adding numbers to the birth rule. B345/S456 for instance:

b345s456

At first it expands, but only until it makes this near-square shape. Its boundaries no longer move, but oscillate with period 2. The interior is mostly static but there are a couple of regions that oscillate with period 3, so the whole thing is a p6 oscillator.

Here’s B346/S456:

b346s456

So we’re back to infinitely growing irregular blob behavior. Population 53,457. How about B347/S456?

b347s456

Likewise, but slower growing; population only 15,686. Want to guess at B348/S456?

b348s456

Blob again, but population only 9672. Not too surprising. Except that both B347/S456 and B348/S456 grow slower than B34/S456, even though both differ only in that there are more ways to produce a birth!

Finally let’s add numbers to the survival rule. B34/S4567 first:

b34s4567

Population 196,655. If you see faint squares on that blob it’s an artifact of the image reduction. Click on the image to see it at full size. It’s interesting: there are chaotic regions interspersed with domains where the live cells are arranged regularly. Most of the interior cells are stable or oscillate with period 2, and the only real evolution occurs at the boundary of the blob. That’s true of the B34/S4568 blob too:

b34s4568

but here the growth rate is much smaller — the population’s only 23,299 — and the interior of the blob’s pretty random with only very small regular domains.

So twelve rules: B34/S456, B4/S456, B3/S456, B34/S56, B34/S46, B34/S45, B345/S456, B346/S456, B347/S456, B348/S456, B34/S4567, B34/S4568. All are similar in that the last eleven differ from the first only by the addition or removal of a single number. Three die, two collapse to a small oscillator, one shows bounded growth, and the other six show unbounded growth, not only at different rates but with different behavior: unstable interior versus stable or oscillating interior; random patterns versus random mixed with regular. There’s the seeming paradox that adding a digit to the rule can produce something that grows more slowly, or can even stop it from growing.

Add or subtract a second digit and try guessing what’ll happen. I don’t guess correctly very often.