The “sampling from an infinite population” metaphor beloved by statisticians of all types is a disaster for reproducible science. To explain why, I’ll show what sampling from a finite population has going for it that’s not there in the “infinite population” case. (more…)

April 19, 2014

#### •

Comments (0)

Here I derive a simple formula for probability distributions, general enough for Statistical Mechanics and Classical Statistics, in which the roles of, and relationship between, Information Entropy and Boltzmann’s Entropy are as clear as possible.

What follows is for the one-dimensional case but generalizes trivially to several. (more…)
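As a hedged sketch of the standard maximum-entropy calculation (not necessarily the derivation in the full post): maximizing the information entropy subject to a fixed mean energy yields the Boltzmann form p_i ∝ exp(−βE_i). The energy levels and target mean below are made up for illustration.

```python
import numpy as np

E = np.arange(10, dtype=float)   # hypothetical energy levels 0..9
target = 2.0                     # desired mean energy (illustrative)

def mean_energy(b):
    """Mean energy under the Boltzmann distribution with parameter b."""
    w = np.exp(-b * E)
    return (E * w).sum() / w.sum()

# mean_energy(b) decreases monotonically in b, so solve by bisection.
lo, hi = 0.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target:
        lo = mid
    else:
        hi = mid
b = 0.5 * (lo + hi)

p = np.exp(-b * E)
p /= p.sum()                     # the maximum-entropy distribution
print(b, (E * p).sum())          # recovered mean energy ≈ 2.0
```

The Lagrange multiplier β is what ties the two entropies together: the same exponential form appears whether the constraint is read physically (mean energy) or informationally (a moment constraint).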

April 16, 2014

#### •

Comments (9)

The exponential family of distributions is important in Statistics because all the common distributions are of this type, and because, by the Pitman-Koopman theorem, they are exactly the distributions which have useful sufficient statistics. By an amazing coincidence, applying the usual IID+MLE procedures to them is equivalent to working a completely different problem, with very different assumptions, derived from a different philosophy.

The following mathematics is not new or controversial. It’s at least 70–80 years old, if not older, but it’s not part of the standard statistics curriculum, so it’s worth pointing out to a wider audience. First, a review of the classical approach for the one-parameter case. (more…)
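A quick numerical illustration of the classical side (my example, not the post’s): for a one-parameter exponential family, the MLE works by equating the sample mean of the sufficient statistic to its expectation. For the exponential distribution p(x|λ) = λe^(−λx), the sufficient statistic is T(x) = x, and setting the derivative of the log-likelihood to zero gives λ̂ = 1/mean(x).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)  # true rate = 1/scale = 0.5

# The MLE matches E[T(X)] = 1/lambda to the sample mean of T(x) = x,
# so lambda_hat = 1 / mean(x).
l_hat = 1.0 / x.mean()
print(l_hat)  # should be close to 0.5
```

The same "match the sufficient statistic's expectation" structure is exactly what a maximum-entropy calculation with a constraint on E[T(X)] produces, which is the coincidence the post is about.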

April 15, 2014

#### •

Comments (0)

The previous post claimed it’s reasonable to expect frequencies in binary experiments to be near .5 simply because that’s what most possible outcomes lead to.

Reasonable or not, however, there’s no guarantee it’ll happen. If 1% of the possibilities yield outcomes far from .5, that’s still an enormous number of exceptions in absolute terms, so there’s plenty of room to be wrong.

So if we could be wrong, then why do the calculation in the first place? (more…)
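Both halves of this point can be checked by direct counting. For sequences of n binary trials, the number with k ones is C(n, k), so the fraction of all 2^n sequences whose frequency lies within ε of .5 is an exact sum of binomial coefficients; the tiny remaining fraction is still an astronomically large absolute count. A small sketch (n and ε are arbitrary choices):

```python
from math import comb

n, eps = 1000, 0.05
lo, hi = int(n * (0.5 - eps)), int(n * (0.5 + eps))

# Count the binary sequences whose frequency of 1s is within eps of 0.5.
inside = sum(comb(n, k) for k in range(lo, hi + 1))
total = 2 ** n

frac = inside / total
print(f"fraction within {eps} of 0.5: {frac:.4f}")
print(f"exceptional sequences, in absolute terms: {total - inside:.3e}")
```

The fraction near .5 is overwhelming, yet the exceptions number far more than 10^290 sequences, which is the "plenty of room to be wrong" in the text.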

April 12, 2014

#### •

Comments (2)

Many view the propensity theory of probabilities as incompatible with Bayesian probabilities. Nothing could be further from the truth; propensity represents an elementary special case of the Bayesian definition.

To see this, I’ll apply those Bayesian principles to predicting the outcomes of trials of a binary experiment. I’ll use the notation from two posts ago, where Bayesian probabilities were defined. (more…)
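A standard sketch of Bayesian prediction for a binary experiment (this is textbook material, not the post’s specific notation): with a uniform prior on the propensity p, after observing k successes in n trials, the predictive probability of success on the next trial is (k+1)/(n+2), Laplace’s rule of succession. The check below does the integral numerically.

```python
import numpy as np

k, n = 7, 10                              # illustrative data: 7 successes in 10 trials
p = np.linspace(0.0, 1.0, 200_001)
dp = p[1] - p[0]

posterior = p**k * (1 - p)**(n - k)       # unnormalized Beta(k+1, n-k+1) posterior
posterior /= posterior.sum() * dp         # normalize

predictive = (p * posterior).sum() * dp   # P(next = success) = E[p | data]
print(predictive, (k + 1) / (n + 2))      # both ≈ 0.6667
```

The predictive probability is just the posterior expectation of the propensity, which is the sense in which a fixed propensity is an elementary special case of the Bayesian machinery.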

April 9, 2014

#### •

Comments (67)

The last post, together with Christian Hennig’s comment, naturally reminded me of Jaynes and his view of frequencies. After a discussion similar to my previous post’s, but approached in a different way and in more depth, Jaynes states (PTLOS p. 292): (more…)

April 6, 2014

#### •

Comments (33)

This post defines the goal of all statistical efforts, shows how that goal is met, and defines probabilities. It is intuitive and simple, but for that reason will be difficult for statisticians to follow. To ease the pain, the reader should ditch everything they know about probabilities and begin anew. (more…)

April 3, 2014

#### •

Comments (4)