
<< In 1965, Solovay constructed Solovay's model, which shows that it is consistent with standard set theory, excluding uncountable choice, that all subsets of the reals are measurable. >>

Now every subset of the reals is measurable, and has the Baire property. Thus: a lot fewer concepts to worry about! And no more Banach-Tarski paradox.

Do we lose anything, really?

The idea that the only objects that exist are the constructible ones is, I think, a central idea in the worldview of computer scientists, who are, after all, the current bearers of the intuitionistic flame, besides being central figures behind ideas like "cognition is computation" (and so is the universe). To me, it's a direct consequence of thinking of the world (including ourselves) as machines; the alternative borders on mysticism (a.k.a. metaphysics).

Coming back to mathematics, why don't I see any computer scientists working on Choice-free mathematics? And why don't I see mathematical librarians cataloging which bits of mathematics are still kosher, according to constructivists? I'd love to see a choice-free graduate mathematics curriculum, but I'd be happy to just see a book on Probability.

### group theory

May. 30th, 2010 05:40 pm

My *de jure* supervisor at UBC recently proved the Strengthened Hanna Neumann conjecture. (He is completely separate from my actual research here, which is roughly in Biostatistics.)

This stuff sounds pretty neat, so I just spent about an hour reading up on group theory on Wikipedia, but as one might imagine, it got pretty tangential.


### proofs

I'm proposing an easy way of formalizing mathematics, by translating mathematical statements into FOL formulas. Please comment or contribute on the wiki.

Proof:

Constructively, let f(c) = reverse(c), where c is the fingering on the guitar's four lowest strings (low E, A, D, G) and f(c) is the fingering on the mandolin's four courses (G, D, A, E).

For example,

E = (0,2,2,1,0,0):

```
---|---|---|---
---|---|---|---
---o---|---|--- G# 3
---|---o---|--- E 1
---|---o---|--- B 5
---|---|---|--- E 1
```

becomes E = (1,2,2,0):

```
===|===|===|=== E 1
===|===o===|=== B 5
===|===o===|=== E 1
===o===|===|=== G# 3
```

Let's start with the low E-string:

If c[1] is the finger placement on the low-E string, then f(c)[4] plays the same note (under octave equivalence), since the mandolin's 4th string is an E.

Likewise, for all i, c[i] plays the same note on the guitar (under octave equivalence) as f(c)[5-i] plays in the mandolin. This is because while the guitar is starting from the low-E and going up in fourths, the mandolin is starting from the high-E and coming down in fifths. Since fourths and fifths are octave-complements of each other, that means that the difference between the guitar-note and mandolin-note will remain the same after each iteration (under octave equivalence). Since the difference was 0 to start with, then it remains 0. Do note that the absolute difference becomes one octave smaller at each step.
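Since the argument is about pitch classes, it can be machine-checked. Here is a quick sketch of my own in Python (the note numbering with C = 0 and the low-to-high tuple convention are my assumptions, not part of the original post):

```python
# Pitch classes (semitones mod 12, C = 0) of the open strings, low to high.
GUITAR = [4, 9, 2, 7]     # E, A, D, G: the guitar's four lowest strings
MANDOLIN = [7, 2, 9, 4]   # G, D, A, E: the mandolin's four courses

def f(c):
    """The proposed isomorphism: reverse the fingering tuple."""
    return tuple(reversed(c))

def same_notes(c):
    """Check that guitar string i and mandolin string 5-i (1-indexed)
    sound the same pitch class under f."""
    m = f(c)
    return all((GUITAR[i] + c[i]) % 12 == (MANDOLIN[3 - i] + m[3 - i]) % 12
               for i in range(4))

assert f((0, 2, 2, 1)) == (1, 2, 2, 0)   # E major, as in the diagram above
assert same_notes((0, 2, 2, 1))
```

Note that every fingering passes the check, because MANDOLIN is exactly GUITAR reversed; that reversal of tunings is the whole content of the proof.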

(It kinda bothers me that properly formalizing the math is so much work. I think mathematical language needs to be redesigned by good software engineers.)

I think one important reason to call this an "isomorphism" instead of "bijection" is that there are relations between chords that are preserved. For example, if you know that to go from an E to an E7, you let go of one finger...

E7 = (0,2,0,1,0,0):

```
---|---|---|---
---|---|---|---
---o---|---|--- G# 3
---|---|---|--- D 7
---|---o---|--- B 5
---|---|---|--- E 1
```

you can do the same on the mandolin.

E7 = (1,0,2,0):

```
===|===|===|=== E 1
===|===o===|=== B 5
===|===|===|=== D 7
===o===|===|=== G# 3
```

In fact, you have lots of relations between chords like this one, that are all preserved under this simple isomorphism. Another obvious one is transposition (i.e. move the bar in bar chords, NB: all chords are bar chords).
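Transposition-preservation can be checked in the same style (again a sketch of my own; `transpose` just adds k frets to every string, i.e. moves the bar):

```python
def f(c):
    """Reverse the fingering tuple (the guitar-to-mandolin map)."""
    return tuple(reversed(c))

def transpose(c, k):
    """Move the bar up k frets: add k to every string's fret."""
    return tuple(fret + k for fret in c)

E = (0, 2, 2, 1)
# Translating then transposing equals transposing then translating:
for k in range(13):
    assert f(transpose(E, k)) == transpose(f(E), k)
```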

Trying this for all basic guitar chords (i.e. non-bar): C, D, E, G, A...

**a problem appears:** If you try to translate A into the mandolin, it won't sound right. This is because there is no 3rd degree present in the lower 4 strings. One possible fix is to play an F# on the D string, which corresponds to a non-standard A chord on the guitar.

Similarly, when translating a D from the guitar, you should play the low-E string: i.e. you should play D/F#.

Credit to Thorne who, playing mandolin for the first time, said that the mandolin was like a guitar backwards. I thought he was playing a left-handed mandolin, but then I realized what he meant.

### category theory & cybernetics

Oct. 31st, 2005 07:18 pm

My housemate is going to teach a series titled "Baby Category Theory" to the logic students. I intend to go, but I'm a bit afraid that the math will be too fascinating, causing me to become a mathematician and never spend another day of my life as a productive human being.

Cybernetics has an image problem, unfortunately. Its name is frequently abused by the likes of spamferences and crackpots. I hope that respectable scientists don't dismiss its ideas, many of which are common sense.

When teaching us about the common ion effect (about the solubility of pairs of salts), my high school chemistry teacher used to say "equilibria retaliate" (I used to think that he was speaking Latin, but this was just his way of remembering Le Chatelier's Principle). But this is reminiscent of the principle of diminishing returns from economics (how far can we push the analogy?). Are we applying chemistry to economics or vice-versa? Neither! That's why we need a more general framework... both of these results are special cases of more or less "universal" structures. This may not be saying much, but it gives me something to think with: when I see an analogous situation, I will predict that adding twice as much of the stuff will give less than twice the return.

What about homeostasis? You see it in economics as well as biology. (keyword for later reference: qualitative reasoning)

Did anyone see Art De Vany - Our Body is Not Communist, arguing that the human body is kept living through an invisible hand? I think he would say that cancer is a market failure, caused by irrational agents.

Why I am no longer a mathematician:

- Tired of working hard just to be clever. Life is short. The real world is more interesting.
- Phenomenology and introspection drove me towards cogsci.
- It's more productive to do meta work: computers will eventually do math much more cheaply than I can. (See Zeilberger.)

----

Here's something of an academic autobiography of my time at Bucknell. It says nothing about my ideas, or what I read. I tell the story of how undergraduate curricula shaped my choice of majors:

**Mathematics**

The last time I did serious mathematical research was my junior year of college... and even that was very much empirically-aided: it was about counting the number of roots of polynomials over finite fields... my discoveries were made with the aid of a C++ compiler.

Since then, I have proven things about cute games (Nim, thanks to **agnosticessence**), toy theorems (prove that number_of_divisors_of(n) is even except when n is a perfect square), and created neat correspondences (e.g. if you represent natural numbers as multisets of their prime factors, GCD is intersection and LCM is union), but nothing that could count as serious mathematics.
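The multiset correspondence is easy to demonstrate (a sketch of my own, using Python's `collections.Counter` as the multiset):

```python
from collections import Counter

def factorize(n):
    """The multiset of prime factors of n, e.g. 12 -> {2: 2, 3: 1}."""
    factors = Counter()
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] += 1
            n //= d
        d += 1
    if n > 1:
        factors[n] += 1
    return factors

def unfactorize(factors):
    """Multiply the multiset back into a number."""
    result = 1
    for p, k in factors.items():
        result *= p ** k
    return result

# GCD is multiset intersection (&, elementwise min);
# LCM is multiset union (|, elementwise max).
assert unfactorize(factorize(12) & factorize(18)) == 6    # gcd(12, 18)
assert unfactorize(factorize(12) | factorize(18)) == 36   # lcm(12, 18)
```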

Already my senior year, in topology class, I no longer saw the point of doing pure math. The only way I could interpret infinite products of topological spaces was as a game of symbols: it had no real meaning to me.

Not only was I starting to get a formalistic view of mathematics, but I was increasingly bothered by the normal approach to mathematics, the standard mathematical language and the paper medium. This was made much worse by the fact that I had grown intolerant of confusing notation/language and informal proofs. Thankfully, I didn't stay in mathematics. Advanced mathematics requires a lot of effort and things are not always beautiful. The real world has many more interesting things to understand. During this time, I considered going for a PhD in Applied Math, but became disappointed with that idea too. It was still too much like other math.

By my senior year, mathematics was no longer fun. It still wasn't "hard", but I had no motivation left. I had become enthusiastic about statistical modelling... even if I got labelled a Bayesian by our frequentist statistics department (I think it was meant as a compliment). And it was my interest in AI, by far, that dominated my intellect.

**Physics**

The reason I had liked mathematics before that was that it had been, for me, easy and fun. Its formal structures were much more satisfying and easier for me to understand than the things people did in physics, my original major. My physics teachers never seemed to explain things clearly, and never gave me good logical reasons for why they were doing what they were doing. It was often unclear which model and assumptions were being used, and even after pressing them, I still had foundational questions that went unanswered. Quantum Mechanics class was extremely frustrating: granted, "nobody understands quantum mechanics", but the theory still has its reasons for being what it is; they didn't give us a chance to make sense of the experimental results that motivated it, or convince me that the theory was the best we could do.

Although I started out with bad grades in physics, they were steadily improving. Still, my professors saw promise in me and wanted me to stay. Despite liking and doing well in my last class, on Thermodynamics & Statistical Mechanics, I decided that I was going to focus on math: I was just too different from the physicists, and talking to them took too much effort. Now I want Patrick Suppes to be my next physics teacher. Among the physicists, I was definitely a philosopher.

**Computer Science**

I had to overcome my initial prejudice against CS. I only started it because of my father's argument that it would be a good idea if I wanted to make money. As a freshman, I had thought that it was just going to be about programming techniques and similar boring-sounding things. The sort of person who did CS at my school was not far from the "typical management major": financially ambitious, if not particularly mathematically talented. When I joined the group, I learned that there were exceptions: there were also "computer geeks" as well as the former type. I was never a "computer geek". A programming geek, yes, for a long time... but one who couldn't get Linux installed, and who would call a technician to troubleshoot my network. Among them, I was solidly seen as a math geek. It bothered me that their AI class assumed no knowledge of basic probability or basic logic, and that the computer graphics class couldn't do a simple linear projection.

But I really liked ProgLan. Also, designing algorithms was fun. Algorithmic reductions even more. And I learned some useful programming techniques.

**Philosophy**

I've always been a philosopher. But I did not like the prospect of reading shelffuls of philosophy books, learning the ins and outs of useless arguments (for instance, about metaphysics), and rereading & struggling to understand what exactly writers mean. Philosophy is great for breaking people out of their epistemological vices: questioning their prejudices, intuitions, etc., but some things are just overanalyzed. I think this is because they talk past each other. Case in point: the Monty Hall problem. Why are they still writing papers about it?? I think that philosophers should benefit the most from computational aids to reasoning, argumentation maps and such. At least, they already know logic.

**Psychology**

It was fascinating. But it wasn't rigorous enough for me. If they had offered cognitive science, I probably would have taken lots of it.

**Economics & Linguistics**

I also flirted with economics, although never for credit. It was interesting, but they were too slow on the math. Like CS, only worse. I also took a class in linguistics (the only one offered!), but as I wasn't about to start doing NLP, it remained a mere curiosity.

If it were possible, I would like to redo it right now in Lisp. I have implemented a refuter for general Pi_1 statements; a verifier could never succeed, since it would have to check infinitely many cases, yet I still firmly believe some proofs of Pi_1 statements. That means that I must implicitly accept axioms about infinity, much as I'd like not to.
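What a refuter can and can't do is worth spelling out. Below is a minimal sketch (mine, in Python rather than Lisp): it searches for a counterexample to a Pi_1 statement ("for all n, P(n)"), so it halts exactly when the statement is false; a verifier would have to confirm infinitely many cases, which is why no finite computation can replace a proof.

```python
from itertools import count

def refute(predicate, bound=None):
    """Search for a counterexample to 'for all n, predicate(n)'.
    Return the first n with predicate(n) false, or None if no
    counterexample is found below `bound` (unbounded if bound is None)."""
    ns = range(bound) if bound is not None else count()
    for n in ns:
        if not predicate(n):
            return n
    return None

# A false Pi_1 statement is refuted in finite time:
assert refute(lambda n: n < 100) == 100
# For a true one, a bounded search merely fails to refute; it proves nothing:
assert refute(lambda n: n >= 0, bound=10_000) is None
```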

Here's a nice theorem:

```
;;; All numbers with an odd number of divisors are perfect squares
;;; (stated in the contrapositive):
(forall #'(lambda (n)
            (implies (not (perfect-square-p n))
                     (evenp (n-divisors n)))))
```

And here's a nice proof:

(Lemma 1) Forall n in N, n is a perfect square IFF its prime factorization has an even power at all the primes.

(Lemma 2) Forall n in N, n-divisors(n) = PRODUCT (power(p_i,n) + 1) for each prime p_i in the factorization of n

Suppose n is not a perfect square. (1)

Then there is a prime p in the prime factorization of n whose power is odd. (by 1, L1) (2)

Since power(p,n) is odd, power(p,n) + 1 is even. (3)

Then n-divisors(n) is even, since one factor of the product in L2 is even. (by 3, L2) (4)
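The theorem (though not the lemmas' reliance on infinity) can at least be checked empirically; here is a small Python sketch of my own:

```python
def n_divisors(n):
    """Count the divisors of n by brute force."""
    return sum(1 for d in range(1, n + 1) if n % d == 0)

def perfect_square_p(n):
    r = int(n ** 0.5)
    return r * r == n

# Odd number of divisors iff perfect square, for every n up to 1000:
for n in range(1, 1001):
    assert (n_divisors(n) % 2 == 1) == perfect_square_p(n)
```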

But which axioms am I using in this proof? Are they too hard to dig out of the lemmas? My justification for the lemmas is visual intuition. Somewhere in proving them in any formal system, you would need an axiom about infinity, right?

In case you want the rest of the Lisp code:

**( Read more... )**