[personal profile] gusl
One important dimension along which researchers vary is whether they are head-in-the-sky or feet-on-the-ground types.

I have always leaned towards being a head-in-the-sky type¹: this is evident from my attraction to Philosophy of Science, logic, and semantics, and from all the time I've spent dreaming about things like the semantic web and similarly far-fetched ideas. Head-in-the-sky work tends to be hard to evaluate.

Feet-on-the-ground types are concerned with real performance, and tend to not be satisfied unless they have a working system. Crackpots don't find a home among them.

Somewhere in between, there are research programs that involve rigorous but unrealistic modeling (e.g. a few logicians I met in Amsterdam), whose work is (rightly) ignored by empirical scientists (psychologists, economists, etc.²).

My experience has been that I work best on projects with feet-on-the-ground goals, with feet-on-the-ground advisors, because they prevent me from wasting time with unrealizable ideas.

In Amsterdam, my Master's thesis advisor was very much a head-in-the-sky type. My thesis had the grandiose goal of implementing a model of what Thomas Kuhn called "normal science", but it unfortunately (and, in hindsight, predictably) amounted to nothing, because it was based on an unworkable idea. The only lesson learned was that the analogy between parse-trees and logical-derivation-trees seems to be fruitless.

All my research since then has been with concrete machine learning systems. I still have a lot to learn, but since making this switch, my progress (in terms of making a real contribution to science) has been satisfactory.




¹ Ironically, many of my head-in-the-sky ideas have been motivated by my skepticism (a feet-on-the-ground trait).

² I hear that many theoretical economists are guilty of the same sin.

(no subject)

Date: 2007-10-01 11:21 pm (UTC)
From: [identity profile] en-ki.livejournal.com
Of course, some are both. If you wish a balanced educational experience, it is best to select someone as tall as possible.

(no subject)

Date: 2007-10-01 11:44 pm (UTC)
From: [identity profile] fancybred.livejournal.com
"The only lesson learned was that the analogy between parse-trees and logical-derivation-trees seems to be fruitless."


I'm curious what you mean here. Were you looking at something along the lines of type-logical grammar, or something different? I forget whether we've talked about this before.

(no subject)

Date: 2007-10-02 04:09 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
My advisor's idea is that natural-language parse-trees are analogous to everything. He used the same methodology for music-parsing, and likes to advertise the fact that it's used in computer vision. In my thesis, we went a step too far, in trying to apply this hammer to talk about derivations in physics and engineering.

Type-logical grammar sounds interesting. How does it compare with Montague grammar?

(no subject)

Date: 2007-10-02 04:20 am (UTC)
From: [identity profile] gustavolacerda.livejournal.com
You can read the thesis (with abstract and everything) here. Nowadays, I consider it garbage.

(no subject)

Date: 2007-10-02 05:24 am (UTC)
From: [identity profile] fancybred.livejournal.com
"Type-logical grammar sounds interesting. How does it compare with Montague grammar?"

As I understand it, TLG is the modern-day descendant of categorial grammar (about natural language syntax) and Montague grammar (about natural language semantics). Actually it was van Benthem who introduced them to each other (remember the "Curry-Howard-van Benthem correspondence"?). The idea behind categorial grammar is to model the syntax of a language as purely logical derivations, i.e. parse trees are just ordinary proofs in ("substructural") logic. Van Benthem noticed you could apply a simple homomorphism to these parse-trees-as-proofs to extract their semantic content as lambda-calculus terms. This allowed a much tighter connection between syntax and semantics than was traditionally the case, for example opening up the way to much better treatments of quantification than Montague's "Proper Treatment".

[Disclaimer: I am not a linguist and am speaking with no authority. I just think TLG is cool, and it sort of jibes with the way I think about programming languages. I'm not sure whether this relates to your thesis, except that yes, natural-language parse-trees are analogous to everything!]
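For concreteness, here is a minimal Haskell sketch of that parse-trees-as-proofs idea. It is my own illustration, not code from the thesis or from any actual TLG system: categories are built from two directional function constructors, forward and backward application are the only proof rules, and each rule application simultaneously builds a lambda-style semantic term, which is the homomorphism in miniature.

-- A toy Lambek-style grammar: parse trees are proofs over directional
-- function categories, and each proof step also builds a semantic term.
-- All names (Cat, Term, Sign, the John/loves/Mary lexicon) are
-- illustrative assumptions, not taken from any particular TLG system.

-- Syntactic categories.
data Cat
  = NP                -- noun phrase
  | S                 -- sentence
  | Cat :/ Cat        -- A :/ B looks for a B to its right, yielding an A
  | Cat :\ Cat        -- B :\ A looks for a B to its left, yielding an A
  deriving (Eq, Show)

-- Semantic terms: the lambda-calculus side of the homomorphism.
data Term = Const String | App Term Term
  deriving Show

-- A signed expression: a category paired with its meaning.
data Sign = Sign { cat :: Cat, sem :: Term } deriving Show

-- Forward application: (A :/ B) followed by B yields A; meaning f x.
fwd :: Sign -> Sign -> Maybe Sign
fwd (Sign (a :/ b) f) (Sign b' x) | b == b' = Just (Sign a (App f x))
fwd _ _ = Nothing

-- Backward application: B followed by (B :\ A) yields A; meaning f x.
bwd :: Sign -> Sign -> Maybe Sign
bwd (Sign b' x) (Sign (b :\ a) f) | b == b' = Just (Sign a (App f x))
bwd _ _ = Nothing

-- Toy lexicon.
john, mary, loves :: Sign
john  = Sign NP (Const "john")
mary  = Sign NP (Const "mary")
loves = Sign ((NP :\ S) :/ NP) (Const "loves")

-- "John loves Mary": the proof is two applications, and its image under
-- the homomorphism is the term  App (App loves mary) john.
main :: IO ()
main = print (fwd loves mary >>= bwd john)

Running main prints the S-typed sign whose semantics is App (App (Const "loves") (Const "mary")) (Const "john"), i.e. the verb applied first to its object and then to its subject, the usual curried argument order in categorial treatments.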
