[personal profile] gusl
Much of my motivation to try to understand "semantic information theory" comes from shortcomings in distinguishing signal from noise.

How Could Information Equal Entropy?
If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction, since entropy of a system increases as the system becomes more disordered. So information corresponds to disorder according to this confusion.
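To make the distinction concrete, here is a minimal sketch (the function name is mine, not from any of the cited papers) of Shannon entropy as a measure of uncertainty. In Shannon's framework, the information gained from an observation is the *reduction* in this uncertainty, not the entropy itself, which is exactly the conflation at issue.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average uncertainty of a distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A certain outcome has zero entropy: there is no uncertainty,
# so observing it yields no information.
print(shannon_entropy([1.0]))        # 0.0

# A biased coin sits in between: less uncertain than a fair one.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

On this view, a high-entropy (disordered) source is one about which we are very *uncertain* before observing it, and whose messages therefore carry more information once observed; equating information with the disorder itself skips that "before vs. after observation" step.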


Also, "Information Is Not Uncertainty" shows related errors in published papers.

If I get the time, I should read Dretske's "On Semantic Information" and Floridi's "Is Information Meaningful Data?" (2005).

Different uses of the term 'information'

Date: 2006-09-11 09:48 am (UTC)
From: (Anonymous)
If you want a quick summary of these issues, here's a short paper I put together highlighting and integrating the different uses of the term 'information'.

http://terrystewart.ca/papers/2003-Information.pdf

:)
Terry (who still hasn't gotten an account yet, but he will, soon, honest!)
