self-control and meta-rationality
Jan. 6th, 2007 01:43 pm

Ideal theories of rationality assume perfect self-control. Thus, long-term plans obtained by optimizing utility over all possible plans are only good to the extent that I have control over my future self. Can you trust your future self to do the right thing? Or are you like a werewolf, prone to bouts of temporary insanity?
People with compulsions or tics (and, to a lesser extent, drug addicts) know that there are shades of grey between the voluntary and the involuntary. Some call these shades "semi-voluntary": you *can* resist a compulsion, but only temporarily, and only by enduring a great deal of psychological pain. In effect, your mind coerces you to do what you don't want to do.
People who want to quit smoking sometimes set up a system under which they must pay a large fine for each cigarette they smoke. Is this rational? Obviously, if they could simply quit cold turkey and trust their future self, they wouldn't bother with such a system.
If I know that I may not be rational in the future (or that my future goals may differ from my present ones), then the realistically optimal plan, that is, optimal given how much self-control I actually have, may be one that would be suboptimal for an ideal agent.
Formalizing this in game-theoretic terms, your "future self" is a different player, whose goals sometimes coincide with yours and sometimes don't, and who may or may not be more short-sighted than you. Not trusting your future self implies that the rational strategy is to play it safe: a maximin strategy, choosing the plan whose worst-case outcome is best.
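To make the maximin idea concrete, here is a minimal sketch in Python. The plan names and payoff numbers are purely illustrative assumptions of mine, not anything derived from a real model: the present self picks a plan, the future self either cooperates or relapses, and we rank plans by their worst case.

```python
# A minimal maximin sketch. All plan names and payoffs below are
# hypothetical, chosen only to illustrate the structure of the game.
# Rows: plans the present self can commit to.
# Columns: possible behaviors of the future self.
# Entry: payoff to the present self.
payoffs = {
    "quit cold turkey":      {"cooperates": 10, "relapses": -5},
    "quit with fine system": {"cooperates": 8,  "relapses": 2},
    "keep smoking":          {"cooperates": 0,  "relapses": 0},
}

def maximin(payoffs):
    # For each plan, assume the worst-case future self;
    # then pick the plan whose worst case is best.
    return max(payoffs, key=lambda plan: min(payoffs[plan].values()))

print(maximin(payoffs))  # -> "quit with fine system"
```

Under these made-up numbers, quitting cold turkey has the best outcome if the future self cooperates, but the fine system wins under maximin because its worst case (a relapse that at least costs the future self money) is better than the worst case of trusting yourself unconditionally.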