Examples of expectations of discrete random variables#
Let’s revisit some of the distributions we encountered in Lecture 9 and calculate their
expectations.
We will do it both analytically and using scipy.stats.
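For reference, the snippets below assume the usual aliases: st for scipy.stats, np for numpy, and plt for matplotlib.pyplot (the plotting helpers make_full_width_fig and save_for_book come from the book's own setup code). A minimal set of imports if you run the cells on their own:

import numpy as np
import scipy.stats as st
import matplotlib.pyplot as plt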
Example: Expectation of a Bernoulli random variable#
Take a Bernoulli random variable:

\[
X \sim \operatorname{Bernoulli}(\theta).
\]

Then:

\[
\mathbf{E}[X] = \sum_x x\,p(x) = 0 \cdot (1 - \theta) + 1 \cdot \theta = \theta.
\]
And here is how we can do it using scipy.stats:
# Probability of success
theta = 0.7
# Create the Bernoulli random variable
X = st.bernoulli(theta)
Now that we have made the random variable, we can get its expectation with X.expect():
print(f'E[X] = {X.expect():1.2f}')
E[X] = 0.70
Let’s visualize the PMF and the expectation on the same plot:
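A sketch of such a plot, following the same pattern as the Binomial figure code later in this section (make_full_width_fig is the book's figure helper; the legend placement is just a choice):

fig, ax = make_full_width_fig()
xs = np.arange(2)  # a Bernoulli variable only takes the values 0 and 1
ax.vlines(xs, 0, X.pmf(xs), label='PMF of $X$')
ax.plot(X.expect(), 0, 'ro', label=r'$\mathbf{E}[X]$')
ax.set_xlabel('$x$')
ax.set_ylabel('$p(x)$')
ax.set_title(f'Bernoulli$(\\theta={theta:1.2f})$')
plt.legend(loc='upper left')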
Example: Expectation of a Categorical random variable#
Take a Categorical random variable taking the values \(0, 1, 2, 3\):

\[
X \sim \operatorname{Categorical}(0.1, 0.3, 0.4, 0.2).
\]

The expectation is:

\[
\mathbf{E}[X] = \sum_{k=0}^{3} k\,p_k = 0 \cdot 0.1 + 1 \cdot 0.3 + 2 \cdot 0.4 + 3 \cdot 0.2 = 1.7.
\]
Here is how we can find it with Python:
import numpy as np
# The values X can take
xs = np.arange(4)
print('X values: ', xs)
# The probability for each value
ps = np.array([0.1, 0.3, 0.4, 0.2])
print('X probabilities: ', ps)
# And the expectation in a single line
E_X = np.sum(xs * ps)
print(f'E[X] = {E_X:1.2f}')
X values: [0 1 2 3]
X probabilities: [0.1 0.3 0.4 0.2]
E[X] = 1.70
Alternatively, we could use scipy.stats:
X = st.rv_discrete(name='X', values=(xs, ps))
print(f'E[X] = {X.expect():1.2f}')
E[X] = 1.70
And a visualization:
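A sketch of the plot, in the same style as the Binomial figure below (xs and the rv_discrete object X are reused from the cell above):

fig, ax = make_full_width_fig()
ax.vlines(xs, 0, X.pmf(xs), label='PMF of $X$')
ax.plot(X.expect(), 0, 'ro', label=r'$\mathbf{E}[X]$')
ax.set_xlabel('$x$')
ax.set_ylabel('$p(x)$')
ax.set_title('Categorical$(0.1, 0.3, 0.4, 0.2)$')
plt.legend(loc='upper left')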
Example: Expectation of a Binomial random variable#
Take a Binomial random variable:

\[
X \sim \operatorname{Binomial}(n, \theta).
\]

The expectation is:

\[
\mathbf{E}[X] = n\theta.
\]

This makes sense. Remember that \(X\) is the number of successes in a binary experiment that is repeated \(n\) times, with each trial succeeding with probability \(\theta\). Since \(X\) is the sum of \(n\) Bernoulli(\(\theta\)) variables, one per trial, its expectation is \(n\) times the expectation of a single trial, i.e., \(n\theta\).
Here is how we can get it with scipy.stats:
# Number of trials
n = 5
# Probability of success in each trial
theta = 0.6
# Create the Binomial random variable
X = st.binom(n, theta)
print(f'E[X] = {X.expect():1.2f}')
print(f'Compare to n * theta = {n * theta:1.2f}')
E[X] = 3.00
Compare to n * theta = 3.00
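If you want an extra check, you can also estimate the expectation by Monte Carlo, i.e., by averaging many samples of \(X\); the sample size below is an arbitrary choice:

# Draw many samples from X and average them
samples = X.rvs(size=100000)
print(f'Monte Carlo estimate of E[X] = {samples.mean():1.2f}')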
Just like before, let’s visualize the PMF and the expectation:
fig, ax = make_full_width_fig()
xs = np.arange(n+1)
ax.vlines(xs, 0, X.pmf(xs), label='PMF of $X$')
ax.plot(X.expect(), 0, 'ro', label=r'$\mathbf{E}[X]$')
ax.set_xlabel('$x$')
ax.set_ylabel('$p(x)$')
ax.set_title(f'Binomial$(n={n:d}, \\theta={theta:1.2f})$')
plt.legend(loc='upper left')
save_for_book(fig, 'ch11.fig3')
Questions#
Rerun the case of the Binomial with \(n=50\). Does the shape of the PMF you get look familiar?
Example: Expectation of a Poisson random variable#
Take a Poisson random variable:

\[
X \sim \operatorname{Poisson}(\lambda).
\]

The expectation is:

\[
\mathbf{E}[X] = \lambda.
\]
Let’s also do it in scipy.stats:
# Rate parameter
lam = 2.0
# Create the Poisson random variable
X = st.poisson(lam)
print(f'E[X] = {X.expect():1.2f}')
E[X] = 2.00
And let’s visualize the PMF and the expectation together:
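A sketch of the plot, in the same style as before (the Poisson has unbounded support, so we only plot values up to an arbitrary cutoff where the PMF has become negligible):

fig, ax = make_full_width_fig()
xs = np.arange(11)  # the PMF is negligible beyond x = 10 when lambda = 2
ax.vlines(xs, 0, X.pmf(xs), label='PMF of $X$')
ax.plot(X.expect(), 0, 'ro', label=r'$\mathbf{E}[X]$')
ax.set_xlabel('$x$')
ax.set_ylabel('$p(x)$')
ax.set_title(f'Poisson$(\\lambda={lam:1.2f})$')
plt.legend(loc='upper right')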
Question#
Rerun the case for the Poisson with a rate parameter \(\lambda = 50\). Does the shape look familiar?