A new reading group is gaining momentum – if your research is in any aspect of science and/or statistics, you’ll find like-minded folk here. Information from Lucas Wilkins, doctoral researcher in Biology and Environmental Science.
The next Statistics and Science Reading Group is this Friday (9th March) in the JMS tea room at 11:00.
All are welcome!
This week we will be reading Claude Shannon’s “A Mathematical Theory of Communication”. This paper is the basis for much of modern-day information theory. You can find a copy here:
A link is also available through our Mendeley group (to get a “find at” link you need to use the website version).
The paper is rather long, so I suggest that we focus on a few parts of it rather than try to read the whole thing (though there is nothing stopping anyone from doing so). Here’s a list of the parts of the paper that I have judged to be the most important (parentheses indicate moderately important parts):
- Part I: I, II, (III), VI, VII
- (Part II)
- Part III: XVIII, XIX, XX
- Part IV: XXIV, XXV, XXVI
- (Part V)
Hopefully this is a good selection – it covers the three main concepts (channel capacity, entropy, and the Shannon–Hartley theorem) but skips some of the formal background and examples.
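For anyone who wants a concrete handle on two of these concepts before the meeting, here is a minimal Python sketch (my own illustration, not taken from the paper) of Shannon entropy and the capacity of a binary symmetric channel, C = 1 − H(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover
    probability p, in bits per channel use: C = 1 - H(p)."""
    return 1 - entropy([p, 1 - p])

print(entropy([0.5, 0.5]))  # a fair coin carries 1.0 bits per flip
print(bsc_capacity(0.11))   # roughly 0.5 bits per use
```

A noiseless channel (p = 0) has capacity 1 bit per use, while a channel that flips each bit with probability 0.5 has capacity 0 – exactly the behaviour Part I of the paper formalises.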
To give you some idea of the group’s focus, the following is a list of topics we deemed relevant:
- ‘Standard’ Probability: Hypothesis testing, Measure theory.
- Bayesian Probability: Subjective probabilities, Credence and credibility, Bayesian Inference, Rationality and Agency.
- Information Theory: Communication, Information measures, Information geometry, Risk.
- Causality: Causal networks, Causal inference, Experimental design.
- Statistical Mechanics past and present: Thermodynamics, Population genetics, Neural ensembles, Phase transitions, Renormalisation groups.
- History and philosophy of statistics: Controversies, Seminal papers, Logic, Principles of statistical theory.
This group is not, in general, about t-tests.