[personal profile] gusl
I think so. It's almost required of believers in strong AI.

This afternoon, I had a colleague tell me that humans are superior to computers because we are not Turing machines, so we aren't subject to the halting problem. I told him his arguments sounded a lot like Penrose's fallacious anti-strong-AI arguments.
We argued for about 10 minutes and then I gave up on convincing him.

Crispin Cowan on the metaphysical implications of humans being Turing machines.

Boyer and Moore have a machine prove the unsolvability of the Halting Problem

hmm

Date: 2003-06-25 11:17 pm (UTC)
From: [identity profile] patrissimo.livejournal.com
I don't think we're Turing machines, because TMs have infinite memory. I think we're just finite automata.

Now, if you count pieces of paper and RAM chips as being acceptable parts of humans (we did create them, after all), that boosts our memory a lot. Then I'd put humans in the category of "Deterministic Automata with so much available memory that they look like Turing Machines except on really large, complex problems".

Re: hmm

Date: 2003-06-26 08:44 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
Hm... I think there was a linguistic confusion here: by "deterministic automata", I understood you to mean either a DFA or an NFA, but you obviously didn't mean those, since they have no memory. If we are thinking of the same thing, I'm guessing you probably meant "Turing machine with finite tape", tweaking the definition of TM if necessary.


There is a hierarchy:

Deterministic/Non-deterministic Finite Automata (DFAs/NFAs): decide regular languages, i.e. whether a string matches a regular expression.
Pushdown Automata (PDAs): decide context-free languages.
Turing Machines (TMs): decide computable ("decidable") problems.

So DFAs/NFAs have no memory by definition, just states. They can't decide whether a string fits 0^m 1^m.

PDAs have a partially-accessible memory in the form of an unlimited stack. They can decide whether a string fits 0^m 1^m, but can't decide whether it fits 0^m 1^m 0^m (?).

TMs have an infinite tape.
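To make the difference concrete, here is a minimal sketch (Python, my own illustrative names) of a recognizer for 0^m 1^m. The counter plays the role of the PDA's stack; its value can grow without bound, which is exactly the resource a finite automaton lacks.

```python
def fits_0m1m(s: str) -> bool:
    """Decide whether s has the form 0^m 1^m (m >= 0).

    The counter stands in for a pushdown stack: "push" on '0', "pop" on '1'.
    Its value is unbounded, which is why no fixed number of DFA states can
    do this job for arbitrarily large m.
    """
    count = 0
    seen_one = False
    for ch in s:
        if ch == "0":
            if seen_one:        # a 0 after a 1 breaks the pattern
                return False
            count += 1
        elif ch == "1":
            seen_one = True
            count -= 1
            if count < 0:       # more 1s than 0s
                return False
        else:
            return False        # only 0s and 1s allowed
    return count == 0

assert fits_0m1m("") and fits_0m1m("000111")
assert not fits_0m1m("0011100") and not fits_0m1m("10")
```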

Re: hmm

Date: 2003-06-26 05:33 pm (UTC)
From: [identity profile] patrissimo.livejournal.com
There was no miscommunication about "deterministic automata". A DFA does indeed have memory, because it knows which of its states it is in. After all, what is memory if not a large set of possible "states" to be in? A DFA has log_2(N) bits of memory, where N is the number of states.

You could transform a computer program into a DFA as long as it only used a finite amount of memory known in advance, by letting each state in the DFA represent a complete description of the CPU registers, the memory space of the program, the current location of the program counter (PC), etc. The entire transition table (mammoth though it would be) could be precomputed and the DFA generated.
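As a toy illustration of that construction (a sketch with assumed details, not from the original comment): take a program whose entire memory is a 3-bit counter. Each of its 8 possible memory contents becomes one DFA state, and the whole transition table can be written out in advance.

```python
# Sketch: a program whose entire memory is a 3-bit counter, "compiled" into
# an explicit DFA. Each DFA state is one complete snapshot of that memory.
BITS = 3
STATES = range(2 ** BITS)           # 8 states = all possible memory contents
ALPHABET = ("inc", "reset")         # the program's two possible inputs

# Precompute the entire (here merely small) transition table.
transition = {}
for state in STATES:
    transition[(state, "inc")] = (state + 1) % (2 ** BITS)   # increment, wraps on overflow
    transition[(state, "reset")] = 0                          # clear the counter

def run(inputs, start=0):
    """Drive the DFA over a sequence of input symbols; return the final state."""
    state = start
    for symbol in inputs:
        state = transition[(state, symbol)]
    return state

assert run(["inc"] * 3) == 3
assert run(["inc"] * 9) == 1        # nine increments wrap the 3-bit counter around
```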

A DFA can't decide if a string fits 0^m 1^m because that requires log_2(m) bits of memory. If m can be arbitrarily large, then we just make m bigger than the number of states in the DFA, and the DFA can't keep track of the count of incoming 0's. Being forced to cycle in the state graph can be thought of as "running out of memory".
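Here is a small sketch of that pigeonhole argument (illustrative code of my own): read 0s into any DFA until it revisits a state; the two colliding prefixes 0^j and 0^i then force the DFA to give the same answer on 0^j 1^j and 0^i 1^j, so it is wrong on at least one of them.

```python
def colliding_zero_prefixes(delta, start):
    """For any DFA over {'0', '1'} (delta maps (state, symbol) -> state),
    find j != i such that the prefixes 0^j and 0^i land in the same state.
    From that state on, the DFA must give the same verdict on 0^j 1^j and
    0^i 1^j, so it misclassifies at least one of them: it cannot decide
    0^m 1^m."""
    seen = {}                        # state -> number of 0s read when first seen
    state, i = start, 0
    while state not in seen:         # terminates: only finitely many states
        seen[state] = i
        state = delta[(state, "0")]
        i += 1
    return seen[state], i            # (j, i) with j < i

# Example: a 2-state DFA that only tracks the parity of the 0s read so far.
parity = {(0, "0"): 1, (1, "0"): 0, (0, "1"): 0, (1, "1"): 1}
print(colliding_zero_prefixes(parity, start=0))   # -> (0, 2)
```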

A Turing Machine with finite tape (or a PA with a finite stack) is just a DFA. It only has finite memory, and just like the computer program, we can "simulate" it with a DFA, each of whose states represents the complete state of the tape and the position of the head.
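A back-of-the-envelope version of that simulation (my own numbers, only to show the scale): a TM with |Q| control states, tape alphabet Γ and n tape cells has at most |Q| · n · |Γ|^n distinct configurations, and each configuration becomes one state of the simulating DFA.

```python
# Sketch: how many DFA states it takes to simulate a finite-tape TM.  Each
# DFA state is one full configuration: control state, head position, and the
# entire tape contents.
def dfa_states_needed(control_states: int, tape_symbols: int, tape_cells: int) -> int:
    return control_states * tape_cells * tape_symbols ** tape_cells

# A tiny TM: 10 control states, 2 tape symbols, a 64-cell tape.
print(dfa_states_needed(10, 2, 64))   # 10 * 64 * 2**64, roughly 1.2e22 states
```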

Since humans have only a finite amount of memory, how can we be Turing Machines? A TM can solve problems that take more memory than there are particles in the universe. We can't. We're just large DFAs...

Re: hmm

Date: 2003-06-26 11:05 pm (UTC)
From: [identity profile] gustavolacerda.livejournal.com
I agree with you now. Thanks for refreshing my knowledge and adding to my understanding.
It was counterintuitive to me that we can't decide 0^m 1^m.

A Turing Machine with finite tape (or a PA with a finite stack) is just a DFA. It only has finite memory, and just like the computer program, we can "simulate" it with a DFA, each of whose states represents the complete state of the tape and the position of the head.

That's an insight I should have had back when I studied Theory of Computation, but for some reason I didn't (I don't think the prof showed it either). I guess I was just sick of the whole thing.
