Highly recommended: Michael Huemer - Why People Are Irrational about Politics (even though, at first sight, I disagree with his moral objectivism)
My highlights:

serious problem
The problem of political irrationality is the greatest social problem humanity faces. It is a greater problem than crime, drug addiction, or even world poverty, because it is a problem that prevents us from solving other problems. Before we can solve the problem of poverty, we must first have correct beliefs about poverty, about what causes it, what reduces it, and what the side effects of alternative policies are. If our beliefs about those things are being guided by the social group we want to fit into, the self-image we want to maintain, the desire to avoid admitting to having been wrong in the past, and so on, then it would be pure accident if enough of us were to actually form correct beliefs to solve the problem.
...
rational irrationality
The best explanation lies in the theory of Rational Irrationality: individuals derive psychological rewards from holding certain political beliefs, and since each individual suffers almost none of the harm caused by his own false political beliefs, it often makes sense (it gives him what he wants) to adopt those beliefs regardless of whether they are true or well-supported.
...
the self-deception mechanism
The theory defended in the last two sections assumes that people have control over their beliefs; it explains people’s beliefs in the same manner in which we often explain people’s actions (by appeal to their desires). But many philosophers think that we can’t control our beliefs — at least not directly. To show this, they often give examples of obviously false propositions, and then ask if you can believe them — for instance, can you, if you want to, believe that you are presently on the planet Venus?

Perhaps we cannot believe obviously false propositions at will. Still, we can exercise substantial control over our political beliefs. A “mechanism of belief fixation” is a way that we can get ourselves to believe the things we want to believe. Let’s look at some of these mechanisms.
...


Maybe we should create prizes to incentivize people to produce unbiased knowledge about politically charged questions. Of course, we would need an objective methodology. But even plain "informal" research can do a lot of good, if publicized: it seems that >95% of the funding that goes to researching political issues is inherently biased.

It would be interesting to look at different people's argumentative styles: I imagine that more rational people will want to go back to epistemic foundations, whereas others will keep offering subjective beliefs as evidence, or try to appeal to your emotions.


---

I have an idea for a system for collaborative knowledge construction by skeptics, meant to avoid bias.

Think of a Wikipedia-like resource, with claims written in a formal language. Each claim will have its "judges" (which include "claimers" and "disclaimers", or "believers" and "skeptics"), and each judge will "sign" his judgement of the claim with (1) his justification, (2) a degree of probability for the claim, and (3) how strongly he backs his judgement, i.e. how much reputation he is willing to wager on this question.
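To make this concrete, here is a minimal sketch of what such a signed judgement record could look like, in Python. All the names (`Judgment`, `stake`, the example values) are hypothetical, not from any existing system:

```python
from dataclasses import dataclass

@dataclass
class Judgment:
    claim_id: str        # identifier of the claim in the formal language
    judge_id: str        # the judge signing this judgement
    justification: str   # why the judge believes (or disbelieves) the claim
    probability: float   # degree of belief in the claim, in [0, 1]
    stake: float         # how much reputation the judge wagers on it

    def __post_init__(self):
        if not 0.0 <= self.probability <= 1.0:
            raise ValueError("probability must be in [0, 1]")
        if self.stake < 0.0:
            raise ValueError("stake must be non-negative")

# Example: a skeptic signs a low-probability judgement with a small stake.
j = Judgment(claim_id="claim-42", judge_id="alice",
             justification="inference from other believed claims",
             probability=0.2, stake=0.05)
```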

Features include:
* justification tags, of several types:
     * trust (claimer trusts the source of this judgment, perhaps due to authority position or proximity in the trust network)
     * personal-observation
     * hard-data-or-statistics (claimer trusts source of data)
     * inference from other believed claims (this is recursive)
* belief aggregation / reasoning with expert opinions (a toy sketch appears after this list).
* given a reader's profile (commitments, epistemic preferences, trust towards each author on each topic) and the information available to him, we could try to predict his opinion on a given issue. We could also run experiments presenting facts in a biased way, selecting facts that support only one side of an issue, in order to infer a reader's profile. We could also identify objective thinkers this way.

* Judges would create reputations for different kinds of claims. These reputations would be implemented as a "network of trust".
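For the aggregation feature, one simple candidate is a linear opinion pool weighted by the reader's trust in each judge. This is just one scheme among many, and the names here are hypothetical:

```python
# Toy trust-weighted belief aggregation: pool the judges' probabilities,
# weighting each by the reader's trust in that judge.

def aggregate(judgments, trust):
    """judgments: list of (judge_id, probability); trust: judge_id -> weight."""
    total = sum(trust.get(j, 0.0) for j, _ in judgments)
    if total == 0.0:
        return None  # no trusted judge has weighed in on this claim
    return sum(trust.get(j, 0.0) * p for j, p in judgments) / total

# Example: a reader who trusts alice twice as much as bob.
print(aggregate([("alice", 0.3), ("bob", 0.9)],
                {"alice": 2.0, "bob": 1.0}))  # -> 0.5
```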

Forming (or changing) a belief about a claim should propagate to (1) beliefs about logically related claims (belief revision), and (2) the trust/reputation of the judges who declared an opinion on it; a toy version of (2) is sketched below. (We could represent this with probabilistic relational models (PRMs). While this could easily become intractable, Koller shows that good approximations can be tractable.)
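As a toy version of (2): once a claim settles at some consensus probability, each judge's reputation could move in proportion to accuracy and stake. The Brier-style squared-error penalty here is my assumption, not anything from the PRM literature; a real system would presumably use a proper scoring rule or full probabilistic inference:

```python
# Sketch of a reputation update after a claim reaches consensus.
# Judges who staked reputation on a judgement close to the consensus
# gain; those far from it lose, in proportion to their stake.

def update_reputations(judgments, consensus, reputation, rate=0.1):
    """judgments: list of (judge_id, probability, stake);
    consensus: settled probability for the claim;
    reputation: judge_id -> current score (mutated in place)."""
    for judge, p, stake in judgments:
        error = (p - consensus) ** 2     # Brier-style squared error
        reward = stake * (0.25 - error)  # positive if better than chance
        reputation[judge] = reputation.get(judge, 1.0) + rate * reward

rep = {"alice": 1.0, "bob": 1.0}
update_reputations([("alice", 0.3, 0.5), ("bob", 0.9, 0.5)], 0.2, rep)
print(rep)  # alice gains (close to consensus), bob loses
```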

the role of epistemic foundations
People need to trust each other. It's a simple question of division of labour (the solution to rational irrationality). But this means that wrong or biased information can spread. Also, beliefs sometimes rely on old assumptions that have since been proven wrong (but the corresponding belief revision hasn't been done). This is why beliefs need to be tied back to epistemic foundations, again and again. The optimal rate of foundational updates may be a matter of personal preference.


---

Here's a scarier link about techniques of persuasion, manipulation, hypnosis, etc.: Persuasion and Brainwashing Techniques Being Used On The Public Today. Its author characterizes the US Marines and revivalist churches as "brainwashing cults".

---

finally, via Google Ads:
The Theseus Learning System. Maybe I can make some money this way: selling software for critical-thinking education / idea refinement / writing. But my real interest is to create systems to enlighten real debates.

Let me be almost original and invent the phrase "epistemic hygiene".