I love the subject of probability. In almost every book I’ve read on probability, the explanation of expectation has been unintuitive and demotivating – with a few exceptions, one of which was written by my professor, Prof. Norm Matloff, and is available here.
Consider that I am buying 10 oranges, which sum up to a weight of 1 lb. Now the weight of one orange, on average, is $\frac{1}{10}$ lb. This concept is so simple that there is nothing to explain here. We all know what an average value means.
Expectation is nothing but an average value. Consider the case of throwing a die $n$ times. Let us assume we have a variable that marks the value we get for every throw. Let $X_i$ denote the value we get in the $i^{th}$ throw. So after $n$ throws I get a sum of values: $X_1 + X_2 + \dots + X_n$.
Let us denote the sum by a variable $S$; then: $S = X_1 + X_2 + \dots + X_n$.
Now what is the average value that we get for each throw? That would be $\frac{S}{n}$.
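To see this average in action, here is a minimal Python sketch (not from the original post; the variable names `n`, `throws`, `S` are my own) that simulates $n$ throws of a fair die and computes $\frac{S}{n}$:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

n = 100_000                                        # number of throws
throws = [random.randint(1, 6) for _ in range(n)]  # X_1, ..., X_n
S = sum(throws)                                    # S = X_1 + X_2 + ... + X_n
average = S / n                                    # average value per throw

print(average)  # close to 3.5 for a fair die
```

With this many throws the printed average lands very close to 3.5, which is exactly the value we are about to derive.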
As a side note, these variables, the $X_i$’s, are called random variables. Why are they called random variables? Because they take their values randomly. I cannot say $X_1$ will always be 1. All I can say is that $X_i$ will be an integer between 1 and 6.
Let us try rewriting our equation for the sum in a fancy way. Let $K_j$ be the number of throws, out of our $n$, in which we got the value $j$. Then the throws that showed $j$ together contribute $j \cdot K_j$ to the sum.
If we take this sum over the six possible values, we get the exact sum $S$: $S = \sum_{j=1}^{6} j \cdot K_j$.
With this new equation, if we rewrite our average:

$\frac{S}{n} = \sum_{j=1}^{6} j \cdot \frac{K_j}{n}$
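A short sketch can verify this regrouping numerically (the `Counter`-based counts and the variable names below are my own choices, standing in for the $K_j$’s):

```python
import random
from collections import Counter

random.seed(0)
n = 60_000
throws = [random.randint(1, 6) for _ in range(n)]
S = sum(throws)

counts = Counter(throws)  # counts[j] plays the role of K_j

# Regrouping the throws by face value gives exactly the same sum S:
S_regrouped = sum(j * counts[j] for j in range(1, 7))
assert S_regrouped == S

# The average written with the fractions K_j / n:
average = sum(j * counts[j] / n for j in range(1, 7))
```

Adding the throws one by one or grouping them by face value is the same arithmetic, just organized differently.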
But we already know, intuitively, that for a fair die each face turns up in about one-sixth of a large number of throws:

$\frac{K_1}{n} \approx \frac{1}{6}, \quad \frac{K_2}{n} \approx \frac{1}{6}$

And so on, up to $\frac{K_6}{n} \approx \frac{1}{6}$.

Then our average becomes:

$\frac{S}{n} \approx 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + \dots + 6 \cdot \frac{1}{6}$
We can write this concisely as:

$\frac{S}{n} \approx \sum_{x} x \cdot P(X = x)$

where we wrote a generic single random variable $X$ instead of differentiating each of them with $i$’s, and $P(X = x) = \frac{1}{6}$ is the probability of getting the value $x$.
We can call this average by a fancy name too: expectation. The mathematical notation for it is $E(X)$.
This gives $E(X) = 3.5$ in the case of rolling a die. Here the sum is taken over all possible values that $X$ can take.
Let us denote that as a set, named $A$, which in our case will be $\{1, 2, 3, 4, 5, 6\}$.
Using this we can write our generic equation for expectation (as found in many textbooks):

$E(X) = \sum_{x \in A} x \cdot P(X = x)$
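The textbook formula translates almost word for word into code. In this sketch (the function name `expectation` and the dict-based representation are my own choices, not from the post), the dictionary maps each value $x$ in $A$ to its probability $P(X = x)$:

```python
def expectation(pmf):
    """E(X) = sum over x in A of x * P(X = x); pmf maps x -> P(X = x)."""
    return sum(x * p for x, p in pmf.items())

fair_die = {x: 1 / 6 for x in range(1, 7)}  # A = {1, 2, 3, 4, 5, 6}
print(expectation(fair_die))                # 3.5, up to floating point
```

The same function works for any finite random variable, fair or not – only the probabilities in the dictionary change.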
And finally, the word expectation, as my professor explains, tells you only one thing about that value: “you should never expect the expected value”.