In probability we often see statements like $P(X = 100)$ or $E[X]$, where $X$ is a random variable.
A probability space consists of:

- A set of outcomes, AKA "things that can happen", AKA the "sample space", usually denoted $\Omega$.
- A set of events, AKA "things whose probability we can measure", AKA the "event space", usually denoted $\mathcal{F}$. This is a $\sigma$-algebra on $\Omega$, although this condition is only really important if the sample space is uncountable. If $\Omega$ is countable, we can just use $2^\Omega$ (the power set) as our event space.
- A function that assigns probabilities to events, AKA a probability measure, denoted $P$. That is, $P : \mathcal{F} \to [0, 1]$ with some special conditions (measure properties):
  - $P(\emptyset) = 0$.
  - For any event $A \in \mathcal{F}$, $P(A^c) = 1 - P(A)$.
  - For a countable family $\mathcal{A}$ of pairwise-disjoint events, $$P\left(\bigcup_{A \in \mathcal{A}} A\right) = \sum_{A \in \mathcal{A}} P(A)$$
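To make the definition concrete, here's a minimal sketch in Python for a single fair die (the name `prob` is my own; because the sample space is finite, the power set works as the event space and we never need an explicit $\sigma$-algebra):

```python
from fractions import Fraction

# Sample space for a single fair die: a finite set of outcomes.
omega = {1, 2, 3, 4, 5, 6}

# Probability measure on events (subsets of omega). Uniform, since the die is fair.
def prob(event):
    assert event <= omega, "events must be subsets of the sample space"
    return Fraction(len(event), len(omega))

# The measure properties from the definition above hold:
assert prob(set()) == 0                                       # P(empty set) = 0
assert prob(omega - {1}) == 1 - prob({1})                     # P(A^c) = 1 - P(A)
assert prob({1, 2} | {5, 6}) == prob({1, 2}) + prob({5, 6})   # additivity for disjoint events
```

Using `Fraction` keeps the probabilities exact, so the measure properties can be checked with plain equality.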
Now the elements of the sample space are just bare outcomes: on their own, they don't carry any meaning.
In many cases, what we really want to know are things like: "What's the probability of winning $100 in a game in which
I win $100 for rolling snake eyes (double 1s) and lose $10 for rolling anything else?".
Or, more realistically, "If I play this game a lot, will I win in the long run?".
These statements involve outcomes ("rolling snake eyes"), but the outcomes have some additional meaning attached ("I win $100").
To see how this works, let's look at the dice game. The sample space ("things that can happen") is the set of all possible pairs of dice throws, which we might represent as $\{1, \dots, 6\} \times \{1, \dots, 6\}$.
For reasons we'll go into elsewhere, we can basically forget about the event space $\mathcal{F}$ here: the sample space is finite, so we can just use $2^\Omega$.
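For the dice game, the sample space can be enumerated directly; a quick sketch in Python:

```python
from itertools import product

# All ordered pairs of die faces: (1, 1), (1, 2), ..., (6, 6).
omega = set(product(range(1, 7), repeat=2))

assert len(omega) == 36
assert (1, 1) in omega  # snake eyes is one of the 36 outcomes
```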
Now we're ready to "attach" meaning to the outcomes via a random variable, which we'll call $W$ (for "winnings").
In our case, we can define $W$ by cases:

$$W(\omega) = \begin{cases} 100 & \text{if } \omega = (1, 1) \\ -10 & \text{otherwise} \end{cases}$$
It might help to think of $W$ as a lookup table:
| outcome | meaning |
|---------|---------|
| (1, 1)  | 100     |
| (1, 2)  | -10     |
| (1, 3)  | -10     |
| ...     | ...     |
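The lookup-table view translates directly into code; a sketch in Python (the `else` branch plays the role of the "anything else" row):

```python
# W maps each outcome to its "meaning": the amount won or lost.
def W(outcome):
    return 100 if outcome == (1, 1) else -10

assert W((1, 1)) == 100
assert W((3, 4)) == -10
```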
Now to discover the "meaning" of an outcome we just apply $W$ to it: for example, $W((1, 2)) = -10$.
We're now in a position to answer some of our original questions.
Expressions like "$W = 100$" are shorthand for sets of outcomes: $\{\omega \in \Omega : W(\omega) = 100\}$. Because these sets are events (elements of our event space $2^\Omega$), they have well-defined probabilities. So the probability of me winning 100 dollars is just $$P(W = 100) = P(\{(1, 1)\}) = \frac{1}{36},$$ assuming fair dice.
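Putting the pieces together, we can compute both the probability of winning and the long-run average in Python (a sketch assuming fair dice, so every outcome has probability 1/36):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))

def W(outcome):
    return 100 if outcome == (1, 1) else -10

# P(W = 100): measure of the event {omega : W(omega) = 100}.
p_win = Fraction(sum(1 for o in omega if W(o) == 100), len(omega))
assert p_win == Fraction(1, 36)

# E[W]: the probability-weighted average of W over all outcomes.
expected = Fraction(sum(W(o) for o in omega), len(omega))
assert expected == Fraction(-250, 36)  # about -6.94 per game
```

The negative expectation answers the "will I win in the long run?" question from above: on average, this game loses about $6.94 per play.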
Footnotes
- This turns out to be a minor detail, but note that $E[W]$ implies a probability measure $P$. That is, $W$ doesn't contain information about probabilities: remember, it's just a table of "tags" for outcomes. It might make sense to use the same $W$ with different probability measures (maybe a fair die and a weighted die), but in that case $E[W]$ will depend on which measure we use. This will always be clear from the context, but I think it's worth noting, since the expression $E[W]$ might give the impression that $W$ "knows" something about probabilities. ↩