gusl: (Default)
I think it is well known that news outlets copy stories from one another (presumably with enough changes to avoid seeming too blatant*, though the typical lack of citations would be enough to get one hanged for plagiarism in any academic field). This process can be described as a DAG in which the nodes are stories, the arrows represent copying, and only some nodes are observed (printed). If you abstract over the rewordings, a diagram showing the merges and splits might not look that different from a version-control graph.
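As a rough sketch of that DAG picture: stories are nodes, each node lists the stories it was copied from, and "tracing the history" of an item amounts to walking upstream to the roots. The story names and the little copy graph below are entirely invented for illustration.

```python
# Hypothetical copy graph: each story maps to the stories it drew on.
# Roots (empty parent lists) are the original sources; "blog_C" is a
# merge node that blends two downstream accounts, like a VCS merge.
copied_from = {
    "wire_report": [],                     # original source
    "paper_A": ["wire_report"],
    "paper_B": ["wire_report"],
    "blog_C": ["paper_A", "paper_B"],      # merge of two accounts
}

def sources(story, graph):
    """Return the set of root stories (nodes with no parents) upstream of `story`."""
    parents = graph.get(story, [])
    if not parents:
        return {story}
    roots = set()
    for parent in parents:
        roots |= sources(parent, graph)
    return roots
```

So `sources("blog_C", copied_from)` collapses the whole copying history back to the one wire report it ultimately derives from, which is the kind of "cancel out the intermediaries" query the post is asking about.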

Query: What aspects of these networks are public knowledge, and to what extent can we trace the history of an individual news item?

Have people developed tools to help this science of "news forensics"? My hope is that by figuring out who the original sources are and what they actually said, we can cancel out the effect of the replicator dynamics (what you see is what sells), and thus get more objective information.
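One crude way such "news forensics" could start, if the edges aren't public knowledge: compare story texts pairwise and propose a candidate copying edge from an earlier story to a later near-duplicate. The snippets, IDs, and the 0.8 threshold below are all invented; this only catches light rewording, not the fact-free rephrasings the footnote worries about.

```python
import difflib

# Toy corpus of (timestamp, story_id, text); contents are invented.
stories = [
    ("09:00", "src", "Officials confirmed the bridge closed after an inspection."),
    ("11:30", "copyA", "Officials confirmed the bridge was closed after an inspection."),
    ("12:00", "unrelated", "The local team won its third straight game on Sunday."),
]

def candidate_edges(stories, threshold=0.8):
    """Propose (earlier, later) copying edges between near-duplicate texts.

    Uses difflib's similarity ratio (2*matches / total length), so only
    lightly reworded copies clear the threshold.
    """
    edges = []
    for i, (_, id1, text1) in enumerate(stories):
        for _, id2, text2 in stories[i + 1:]:
            ratio = difflib.SequenceMatcher(None, text1, text2).ratio()
            if ratio >= threshold:
                edges.append((id1, id2))
    return edges
```

Here `candidate_edges(stories)` links `src` to `copyA` and leaves the unrelated story alone. Real tools would need something sturdier than character-level diffing, but the shape of the problem (recover hidden edges from observed nodes) is the same.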



* - If this were a single (rather than a developing) piece of news, and if everyone were honest, one would expect the non-blatant copying to make each retelling more and more awkwardly phrased. In reality, I suspect they rephrase things without regard for the facts (and often towards sensationalism).

February 2020
