#epistemology #rationality
# [[Epistemic status]]
#shower-thought
# Related
- [[Multinomial logistic regression]]
- [[Multinomial distribution]]
- [[Readwise/Articles/towardsdatascience.com - The Intuition Behind Shannon’s Entropy]]
- [[Philosophy/Epistemology/Kolmogorov randomness]]
- [[Probability theory]]
- [[Physic/Complexity]]
- [[Readwise/Articles/en.wikipedia.org - Uncertainty Principle - Wikipedia]]
# Maximum entropy
> What then is that precious something contained in our food which keeps us from death? That is easily answered. Every process, event, happening – call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. **Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death**. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see.
> ~ [[Schrodinger]], *What is Life?*
**Entropy** measures the information content of an outcome of a random variable $X$: a less probable outcome conveys more information than a more probable one. Averaged over all outcomes, entropy is therefore a _measure of uncertainty_. When the goal is to find a distribution that is as ignorant as possible (one that encodes no knowledge beyond what is given), entropy should consequently be maximal.
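In symbols (the standard Shannon definition, stated here for reference): for a discrete $X$ taking values $x_i$ with probabilities $p_i$,
$$
H(X) = -\sum_i p_i \log p_i = \mathbb{E}\!\left[\log \frac{1}{p(X)}\right],
$$
where the surprisal $\log(1/p_i)$ grows as $p_i$ shrinks. Over $n$ outcomes, $H$ is largest for the uniform distribution $p_i = 1/n$, giving $H = \log n$, the "maximally ignorant" case.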
The maximum entropy principle is a means of deriving probability distributions: among all distributions that satisfy the given constraints (e.g. a known mean or support), choose the one whose entropy is maximal.
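A rough sketch of the principle in code (my own illustration, not from the sources cited here; the function name `maxent_given_mean` is hypothetical): numerically maximizing entropy on $\{0, \dots, n-1\}$ subject to normalization and a fixed mean recovers the exponential/Gibbs form $p_i \propto e^{-\lambda i}$, which is the textbook analytical answer for this constraint set.
```python
import numpy as np
from scipy.optimize import minimize

def maxent_given_mean(n, mu):
    """Maximum-entropy distribution on {0, ..., n-1} with mean mu,
    found by direct numerical optimization."""
    xs = np.arange(n)

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)    # avoid log(0)
        return np.sum(p * np.log(p))  # negative Shannon entropy

    constraints = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},  # sum p_i = 1
        {"type": "eq", "fun": lambda p: xs @ p - mu},    # sum i * p_i = mu
    ]
    p0 = np.full(n, 1.0 / n)  # start from the uniform distribution
    res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * n,
                   constraints=constraints, method="SLSQP")
    return res.x

p = maxent_given_mean(n=6, mu=1.5)
print(np.round(p, 4))
# log p_i should be affine in i, i.e. p_i ∝ exp(-λ i):
print(np.round(np.diff(np.log(p)), 4))  # ≈ constant
```
On the real line, the same machinery with fixed mean and variance yields the Gaussian, which is one reason it keeps appearing as a default modelling choice.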
>The **principle of maximum entropy** is often used to obtain [[prior probability distributions]] for [[Bayesian inference]]. Jaynes was a strong advocate of this approach, claiming the **maximum entropy** distribution represented the **least informative distribution**.
i.e. [[Occam razor]] / [[Simplicity]] / [[Kolmogorov complexity]]?
# External links
https://bayes.wustl.edu/etj/articles/theory.1.pdf
https://bayes.wustl.edu/etj/articles/theory.2.pdf