confusion about information theory
Sep. 10th, 2006 09:58 pm

Much of my motivation to try to understand "semantic information theory" comes from the shortcomings of standard information theory in distinguishing signal from noise.
How Could Information Equal Entropy?
If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction: the entropy of a system increases as the system becomes more disordered, so under this confusion information would correspond to disorder.
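The way out of the contradiction, on Schneider's account, is that information is the decrease in the receiver's uncertainty, not the uncertainty itself. Here is a minimal Python sketch of that distinction, using a made-up source of four equally likely messages (my example, not Schneider's):

import math

def entropy(probs):
    # Shannon entropy H = sum of p * log2(1/p) over the outcomes, in bits.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Receiver's uncertainty before a symbol arrives: four equally likely messages.
h_before = entropy([0.25, 0.25, 0.25, 0.25])  # 2.0 bits

# After the symbol arrives, only one message remains consistent with it.
h_after = entropy([1.0])  # 0.0 bits

# Information received = decrease in uncertainty, not the uncertainty itself.
print(h_before - h_after)  # 2.0 bits

On this reading, the 2 bits live in the before/after difference: a highly disordered source has high uncertainty, but it conveys no information until a received symbol reduces that uncertainty.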
Also, "Information Is Not Uncertainty" shows related errors in published papers.
If I get the time, I should read Dretske's "On Semantic Information" and Floridi's "Is Information Meaningful Data?" (2005).
Different uses of the term 'information'
Date: 2006-09-11 09:48 am (UTC)

http://terrystewart.ca/papers/2003-Information.pdf
:)
Terry (who still hasn't gotten an account yet, but he will, soon, honest!)