Expectation with Examples in Statistics


In statistics, the concept of expectation (also known as expected value) is a fundamental measure used to quantify the average outcome of a random variable over a large number of trials. It provides a way to summarize the long-term behavior or central tendency of a random process or distribution. Let's delve deeper into this concept with examples and numerical illustrations.

Definition and Notation:

The expectation of a random variable $X$, denoted as $E(X)$ or $\mathbb{E}[X]$, is defined as:

$$E(X) = \sum_{x} x \cdot P(X = x) \quad \text{for discrete random variables}$$

$$E(X) = \int_{-\infty}^{\infty} x \cdot f(x) \, dx \quad \text{for continuous random variables}$$

where:

  • $X$ is the random variable.
  • $P(X = x)$ is the probability mass function (PMF) for discrete $X$.
  • $f(x)$ is the probability density function (PDF) for continuous $X$.
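
The discrete formula above can be sketched directly in code as a probability-weighted sum. This is a minimal illustration; the helper name `expectation` and the biased-coin example at the end are my own, not from the text:

```python
def expectation(outcomes, probabilities):
    """Compute E(X) = sum over x of x * P(X = x) for a discrete random variable."""
    # Sanity check: a valid PMF must sum to 1 (up to floating-point error).
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(x * p for x, p in zip(outcomes, probabilities))

# Hypothetical example: a biased coin paying 1 with probability 0.3, else 0.
print(expectation([0, 1], [0.7, 0.3]))  # 0.3
```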

Examples:

Example 1: Fair Six-Sided Die

Consider a fair six-sided die. Let $X$ be the outcome of a single roll.

  • Random Variable $X$: Possible outcomes are $\{1, 2, 3, 4, 5, 6\}$.
  • Probability Mass Function (PMF): $P(X = x) = \frac{1}{6}$ for $x = 1, 2, 3, 4, 5, 6$.

To find $E(X)$:

$$E(X) = \sum_{x=1}^{6} x \cdot P(X = x) = \sum_{x=1}^{6} x \cdot \frac{1}{6} = \frac{1}{6} \cdot (1 + 2 + 3 + 4 + 5 + 6) = \frac{21}{6} = 3.5$$

So, the expected value of $X$, the outcome of a single roll of a fair six-sided die, is $3.5$.
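
Since expectation describes the average over a large number of trials, a quick Monte Carlo simulation can confirm this. The sketch below simulates many fair-die rolls and checks that the sample mean is near $3.5$ (the sample size and seed are arbitrary choices for illustration):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
n = 100_000
# Simulate n independent rolls of a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(rolls) / n
print(sample_mean)  # close to 3.5 by the law of large numbers
```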

Example 2: Continuous Uniform Distribution

Let $X$ follow a continuous uniform distribution on the interval $[a, b]$.

  • Random Variable $X$: $X \sim \text{Uniform}(a, b)$.
  • Probability Density Function (PDF): $f(x) = \frac{1}{b-a}$ for $x \in [a, b]$.

To find $E(X)$:

$$E(X) = \int_{a}^{b} x \cdot \frac{1}{b-a} \, dx = \frac{1}{b-a} \cdot \left[ \frac{x^2}{2} \right]_{a}^{b} = \frac{1}{b-a} \cdot \frac{b^2 - a^2}{2} = \frac{a + b}{2}$$

Thus, the expected value of $X$ for a continuous uniform distribution $\text{Uniform}(a, b)$ is $\frac{a + b}{2}$.
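
The integral above can also be checked numerically. This sketch approximates $\int_a^b x \cdot \frac{1}{b-a}\,dx$ with a simple midpoint rule and compares it with the closed form $\frac{a+b}{2}$; the function name, interval, and step count are illustrative choices:

```python
def uniform_expectation_numeric(a, b, steps=10_000):
    """Approximate E(X) for X ~ Uniform(a, b) by integrating x * f(x) over [a, b]."""
    width = (b - a) / steps
    density = 1.0 / (b - a)  # constant PDF f(x) = 1 / (b - a)
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * width  # midpoint of each sub-interval
        total += x * density * width
    return total

print(uniform_expectation_numeric(2.0, 10.0))  # ≈ (2 + 10) / 2 = 6.0
```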

Importance of Expectation:

  • Meaning: The expectation represents the average value of $X$ over a large number of trials or observations.
  • Utility: It serves as a measure of central tendency and helps in decision-making and risk assessment.
  • Applications: Used in finance (expected returns), physics (expected energy levels), and various fields of engineering and sciences.

Conclusion:

Expectation is a powerful concept in statistics, providing a succinct summary of the behavior of random variables. Whether in discrete or continuous contexts, its calculation offers insights into the average outcome of random phenomena, making it a cornerstone of statistical analysis and probability theory.
