[personal profile] gusl
One of the few places in which statistics gets applied to math (rather than the other way around) is in the experimental mathematics community.

This year, the Workshop on Discovery and Experimentation in Number Theory is split between two venues simultaneously, dividing the attendees into two physical communities.

The workshop takes place at two sites: The Fields Institute in Toronto and The IRMACS Centre in Vancouver. While participants are welcome at either site, we anticipate that most participants will choose the closer or easier site. We will share (by technology of a sophisticated nature) the core of the lectures, though we will have some individual sessions at each site, primarily because of the three-hour time difference. The lectures that are not shared will be video-streamed and archived.

This is obviously somewhat of an experimental format, though the primary focus is, of course, the content. Technical assistance will be available at each site, so as to minimize glitches.


Doron Zeilberger apparently won't be there. But Lenore Blum will. She co-authored a book about computation over the reals called "Complexity and Real Computation" (classical computation is defined in binary terms, i.e. over Z2). There's a blurb about "Transfer Principles for Complexity Theory":
We can ask: Suppose we can show P = NP over the complex numbers (using all the mathematics that is natural here). Then can we conclude that P = NP over another field such as the algebraic numbers or even over Z2? (Answer: Yes and essentially yes.)


This reminds me of Françoise Chaitin-Chatelin's work on infinite-precision numerical computation. (Coincidentally or not, both are married to famous mathematicians.)

However, the two are completely different metaphors: in infinite-precision numerical computation, the real number is (I think) the limit of the output of a Turing machine over time. In Blum's work, the real numbers are the symbols that you read and write to the tape.
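To make the first metaphor concrete, here is a rough sketch (in Python, purely illustrative, not from either book): a real number given as the limit of a machine's outputs over time, i.e. a procedure that, fed a precision n, returns a rational within 2^-n of sqrt(2). In the Blum picture, by contrast, that same real would sit in one cell of the machine as a single atomic symbol.

    from fractions import Fraction

    def sqrt2(n):
        """Rational within 2**-n of sqrt(2), found by bisection on [1, 2]."""
        lo, hi = Fraction(1), Fraction(2)
        while hi - lo > Fraction(1, 2**n):
            mid = (lo + hi) / 2
            if mid * mid < 2:
                lo = mid
            else:
                hi = mid
        return lo

    # the "real number" is the limit of these outputs as n grows
    print(float(sqrt2(20)))  # 1.41421..., accurate to about 2**-20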

(no subject)

Date: 2009-09-18 04:14 am (UTC)
From: [identity profile] bhudson.livejournal.com
I should read that book someday. It's getting dusty.

(no subject)

Date: 2009-09-18 04:17 am (UTC)
From: [identity profile] bhudson.livejournal.com
By asking the secretary to order it for me.

(no subject)

Date: 2009-09-18 04:20 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
do you expect it to be relevant to your research? how?

(no subject)

Date: 2009-09-18 04:25 am (UTC)
From: [identity profile] bhudson.livejournal.com
If I want to claim optimality, I have to know the lower bound. Much of my work is for the real RAM, though I tend to be embarrassed about using completely fictional machine models.

(no subject)

Date: 2009-09-18 04:38 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
RAM = Random Access Machine. (Googling for "real RAM" produces RealPlayer, if you feel lucky)

This is probably an obvious question, but: how do real RAMs connect to questions in computational geometry?

(no subject)

Date: 2009-09-18 05:00 am (UTC)
From: [identity profile] bhudson.livejournal.com
Indeed.

Geometry is basically the art of calculating functions of coordinates to get combinatorial structures. It's the calculating functions bit that bites. For many problems, if we can calculate just the *sign* of the function, the rest all follows. But even then that's nontrivial.
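(A minimal illustration of such a sign computation, sketched in Python and not part of the original comment: the orientation predicate takes six coordinates and returns only a left/right/collinear answer, i.e. a piece of combinatorial information.)

    def orientation(ax, ay, bx, by, cx, cy):
        """Sign of the cross product (b - a) x (c - a):
        +1 for a left turn, -1 for a right turn, 0 if a, b, c are collinear."""
        det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        return (det > 0) - (det < 0)

    print(orientation(0, 0, 1, 0, 0, 1))  # +1: left turn
    print(orientation(0, 0, 1, 0, 2, 0))  #  0: collinear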

If you assume the coordinates are reals, and that you can take two numbers and add, multiply, or divide them in constant time, many things are relatively easy. If you assume the coordinates are integers or floating-point, then errors build up. For example, you can solve an LP in polynomial time on a real RAM. We don't know how to do this in polynomial time on a word RAM: if you do exact arithmetic, the size of the numbers grows exponentially with the number of multiplications you perform -- each multiplication can double the number of bits you need to represent the result.
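(The bit-growth point is easy to see with exact arithmetic. A tiny illustrative sketch using Python's arbitrary-precision integers: repeated squaring roughly doubles the bit length at every step, so k multiplications can leave numbers exponentially larger than the input.)

    x = 3
    for k in range(1, 11):
        x = x * x  # one exact multiplication (repeated squaring)
        print(f"after {k} multiplications: {x.bit_length()} bits")
    # bit lengths roughly double each step: 4, 7, 13, 26, 51, 102, ...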

(no subject)

Date: 2009-09-18 06:18 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
I should say: I know nothing about RAMs. I once took a class from Peter van Emde Boas, but it didn't cover RAMs. Lots of tilings, though.

Recursion theorists like to talk about "computability oracles": "if we had a machine that could decide the halting problem, then..."

I guess this stuff is about "complexity oracles": "if we had a machine that could multiply in constant time, then..."

Somehow the latter type of question strikes me as more interesting (I imagine the complexity zoo to be more diverse than the computability zoo), because it lets you focus on higher-level aspects of the problems/algorithms (it's a kind of idealization).

(no subject)

Date: 2009-09-18 06:45 am (UTC)
From: [identity profile] bhudson.livejournal.com
The RAM is the first model of computation you learn in undergrad, although they tend to gloss over the fact that it's a model of computation. The Turing machine is the first model where you realize you're just completely bullshitting, but what they teach you in intro to algorithms is also a crazy machine model. It just happens to be sufficiently useful to bother engineers with.

RAM: You assume infinite memory, and some type of value that can be stored. The machine has unit-size pointers. You can read and write to a location in unit time. You can do any one binary arithmetic operation on the values in unit time.

Word RAM: the value type is w-bit integers (words), and the model usually says that pointers are w bits also (ergo, you'd better have w at least log n) -- you can convert between values and pointers.

Real RAM: the value type is reals; you often also get a word RAM to play with on the side, or maybe an integer RAM (with unbounded-length integers).

The word RAM model is pretty close to a physical machine, although typically not all binary operations exist in hardware (a common example is "count the number of bits in this word"). The main deviation from reality is that it ignores memory hierarchy effects.
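(For what it's worth, the bit-counting operation can be emulated as below; this is a trivial illustrative sketch, and modern x86 CPUs expose the operation directly as a POPCNT instruction, while Python 3.10+ has int.bit_count().)

    def popcount(x):
        """Number of set bits in the word x."""
        return bin(x).count("1")

    print(popcount(0b10110110))  # 5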

If you squint and think of the floating point unit in your computer as operating in the reals rather than in floating-point, then the real RAM model kind of vaguely looks like a physical machine. It's close enough that we geometers manage to sleep at night despite the lies that we tell.


You have to be careful about converting between words and reals if you care about lower bounds: you can encode crazy-complicated things as arithmetic expressions on reals. I periodically have to look up the precise details so that I can stay correct when I make brash claims of optimality, but said details ooze out of my brain pretty quickly. On the other hand, some things are easier in word RAM: for example, it takes O(n log n) to sort n reals, but only linear time to sort n w-bit words assuming w is O(log n). Which is to say: the real RAM is more powerful, but its adversary is also more powerful.

(no subject)

Date: 2009-09-18 07:03 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
<< you can encode crazy-complicated things as arithmetic expressions on reals >>

this sounds like the sorta thing where I'd try to use symbolic algebra programs. I guess somehow real RAMs help you construct lower-bound proofs?


<< it takes O(n log n) to sort n reals, but only linear time to sort n w-bit words assuming w is O(log n). >>

keyword: radix sort.
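(For completeness, a bare-bones LSD radix sort, sketched for illustration and not taken from the thread: it makes w/b stable bucketing passes of O(n + 2^b) work each, which comes out linear in n when w = O(log n) and the digit size b is about log n.)

    def radix_sort(words, w, digit_bits=8):
        """Sort non-negative w-bit integers by stable bucketing, least significant digit first."""
        mask = (1 << digit_bits) - 1
        for shift in range(0, w, digit_bits):
            buckets = [[] for _ in range(1 << digit_bits)]
            for x in words:
                buckets[(x >> shift) & mask].append(x)
            words = [x for bucket in buckets for x in bucket]
        return words

    print(radix_sort([170, 45, 75, 90, 2, 802, 24, 66], w=16))
    # [2, 24, 45, 66, 75, 90, 170, 802]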

(no subject)

Date: 2009-09-18 03:18 pm (UTC)
From: [identity profile] bhudson.livejournal.com
Actually, real RAMs hurt me -- they're more powerful than I expect, so my expectation of a lower bound fails.

(no subject)

Date: 2009-09-18 11:08 pm (UTC)
From: [identity profile] gustavolacerda.livejournal.com
<< Geometry is basically the art of calculating functions of coordinates to get combinatorial structures. >>

I never heard that before. Can you give an example?
I'm thinking of packing problems, kissing problems, ...


Btw, what's the technique for solving LPs with multiplications? (name?)

(no subject)

Date: 2009-09-19 11:58 am (UTC)
From: [identity profile] bhudson.livejournal.com
Convex hull, nearest neighbours; and more prosaic things like determining the maximum or minimum number of crossings in an arrangement of line segments. The input is geometric, the output is a combinatorial structure (a graph, a pointer to one of the points, or a number).

Every LP solver I'm aware of uses arithmetic operations; it's hard to imagine not doing so given that what you're asked to optimize is the dot product of two vectors. The interior point method is poly time if you can do real (in fact, rational) arithmetic in constant time. It's not if you only get to do arithmetic on w-bit words, because the rationals you operate on end up using a lot of bits.

(no subject)

Date: 2009-09-18 06:20 am (UTC)
From: [identity profile] psifenix.livejournal.com
There's also a thematic semester in Montréal in the Spring, which I will probably attend at least part of.

(no subject)

Date: 2009-09-18 06:39 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
how often do you write/run code? can you spend months of research without doing it?

(no subject)

Date: 2009-09-18 06:45 am (UTC)
From: [identity profile] psifenix.livejournal.com
I haven't started doing serious research yet, but I expect to be writing at least some code (I already have fairly ambitious plans to code software for weight-one modular forms, etc.). Since most arithmetical phenomena (the number of points on a curve over Z/pZ, for instance) seem to be "random for large primes" but with possibly counterintuitive distributions, it is somewhat difficult to make conjectures about their behavior without doing some fairly long-winded calculations.
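(As a concrete, entirely hypothetical sketch of that kind of calculation, mine rather than psifenix's code: brute-force counting of points on y^2 = x^3 + x + 1 over Z/pZ, where the deviation of the count from p bounces around within roughly 2*sqrt(p) as the prime varies.)

    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n**0.5) + 1))

    def affine_points(a, b, p):
        """Number of pairs (x, y) in (Z/pZ)^2 with y^2 = x^3 + a*x + b."""
        roots = [0] * p
        for y in range(p):
            roots[y * y % p] += 1  # how many square roots each residue has
        return sum(roots[(x**3 + a * x + b) % p] for x in range(p))

    for p in (q for q in range(5, 60) if is_prime(q)):
        n = affine_points(1, 1, p)
        print(p, n, n - p)  # the "error term" n - p stays within about 2*sqrt(p)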

(no subject)

Date: 2009-09-18 07:07 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
do you consider this typical for number theorists, or the mathematicians that you know?

(no subject)

Date: 2009-09-18 07:21 am (UTC)
From: [identity profile] psifenix.livejournal.com
I would say for NTists there are a lot of us who code and a lot of us who don't. It's more common in NT than in say, logic or topology, or even other areas of algebra (in pure ring theory I don't even think they really /can/ compute much yet... the algebraic geometers and commutative algebraists are just starting to, really), though it is less common in NT than in analysis/PDE/probability, where they've been computing for years and years.
