Expected Value
The expected value is a weighted average of all possible values of a random variable, with the probabilities serving as weights.
Learning Objectives

Identify the applications of the expected value and its relationship to the law of large numbers.

Define and compute the expected value.
Key Points
 The expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
 The intuitive explanation of the expected value above is a consequence of the law of large numbers: the expected value, when it exists, is almost surely the limit of the sample mean as the sample size grows to infinity.
 From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
Terms

weighted average
an arithmetic mean of values biased according to agreed weightings

random variable
a quantity whose value is random and to which a probability distribution is assigned, such as the possible outcome of a roll of a die

integral
the limit of the sums computed in a process in which the domain of a function is divided into small subsets and a possibly nominal value of the function on each subset is multiplied by the measure of that subset, all these products then being summed
Full Text
In probability theory, the expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained. More formally, the expected value is a weighted average of all possible values. In other words, each possible value the random variable can assume is multiplied by its assigned weight, and the resulting products are then added together to find the expected value.
The weights used in computing this average are the probabilities in the case of a discrete random variable (that is, a random variable that can take on only a finite or countably infinite number of values, such as the total from a roll of a pair of dice), or the values of the probability density function in the case of a continuous random variable (that is, a random variable that can assume a continuum of values, such as the height of a person).
From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure. Since probability can never be negative (although it can be zero), one can intuitively understand this as the area under the curve of x·f(x), where each value x of the random variable is multiplied by its probability density f(x). Thus, for a continuous random variable the expected value is the limit of the weighted sum, i.e. the integral.
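As a sketch of this idea (not part of the original text), the integral E[X] = ∫ x·f(x) dx can be approximated numerically. The density, interval, and midpoint-rule approach below are illustrative choices: X is taken to be uniform on [0, 2], so f(x) = 1/2 and the true expected value is 1.

```python
def expected_value_continuous(f, a, b, n=100_000):
    """Approximate the integral of x * f(x) over [a, b] with a midpoint Riemann sum."""
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx  # midpoint of the i-th subinterval
        total += x * f(x) * dx
    return total

# Illustrative density: X uniform on [0, 2], so f(x) = 1/2 everywhere on [0, 2].
uniform_density = lambda x: 0.5

print(expected_value_continuous(uniform_density, 0.0, 2.0))  # ≈ 1.0
```

Any density could be substituted for `uniform_density`; as n grows, the weighted sum converges to the integral, mirroring the limit described above.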
Simple Example
Suppose we have a random variable X, which represents the number of girls in a family of three children. Without too much effort, you can compute the following probabilities:
Probability of Number of Girls
Assuming each child is equally likely to be a boy or a girl, the probabilities of the number of girls in a family of three children are:

Number of girls (X):   0     1     2     3
Probability P(X):      1/8   3/8   3/8   1/8
The expected value of X, E[X], is computed as:
E[X] = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 12/8 = 1.5
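Those probabilities, and the weighted average they produce, can be checked in a few lines of Python (a sketch; the dictionary layout is just one convenient representation):

```python
# X = number of girls among three children, assuming each child is a girl
# with probability 1/2, giving the probabilities 1/8, 3/8, 3/8, 1/8.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# Weighted average: multiply each value by its probability, then sum.
expected_x = sum(x * p for x, p in pmf.items())
print(expected_x)  # 1.5
```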
This calculation can be easily generalized to more complicated situations. Suppose that a rich uncle plans to give you $1,000, plus a bonus of $500 for each girl in your family. The formula for the bonus is:
Y = 1,000 + 500X
What is your expected bonus?
E[Y] = 1,000(1/8) + 1,500(3/8) + 2,000(3/8) + 2,500(1/8) = 1,750
We could have calculated the same value by taking the expected number of girls and plugging it into the equation:
E[1,000 + 500X] = 1,000 + 500E[X]
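The agreement between the direct computation and this shortcut (the linearity of expectation) can be verified numerically; the sketch below reuses the probabilities from the example:

```python
# Verify E[1,000 + 500X] = 1,000 + 500·E[X] for X = number of girls.
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

direct = sum((1000 + 500 * x) * p for x, p in pmf.items())
shortcut = 1000 + 500 * sum(x * p for x, p in pmf.items())

print(direct, shortcut)  # both are 1750.0
```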
Expected Value and the Law of Large Numbers
The intuitive explanation of the expected value above is a consequence of the law of large numbers: the expected value, when it exists, is almost surely the limit of the sample mean as the sample size grows to infinity. More informally, it can be interpreted as the long-run average of the results of many independent repetitions of an experiment (e.g., a die roll). The value may not be "expected" in the ordinary sense: the expected value itself may be unlikely or even impossible (such as having 2.5 children), just as the sample mean may be.
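This convergence can be sketched by simulation (the fair-die model and sample sizes below are illustrative choices):

```python
import random

# As the sample size n grows, the sample mean of fair six-sided die rolls
# approaches the expected value (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5.
random.seed(0)
for n in (100, 10_000, 1_000_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)
```

Note that 3.5 is itself an impossible outcome of a single roll, which is exactly the point made above.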
Uses and Applications
To empirically estimate the expected value of a random variable, one repeatedly measures observations of the variable and computes the arithmetic mean of the results. If the expected value exists, this procedure estimates the true expected value in an unbiased manner and has the property of minimizing the sum of the squares of the residuals (the sum of the squared differences between the observations and the estimate). The law of large numbers demonstrates (under fairly mild conditions) that, as the size of the sample gets larger, the variance of this estimate gets smaller.
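The residual-minimizing property can be illustrated with a small sketch: the arithmetic mean of a sample yields a smaller sum of squared residuals than any other guess (the data values below are arbitrary):

```python
# Sketch: the sample mean minimizes the sum of squared residuals.
data = [2, 3, 5, 7, 11]
mean = sum(data) / len(data)  # 5.6

def sum_sq_residuals(guess):
    """Sum of squared differences between the observations and a guess."""
    return sum((x - guess) ** 2 for x in data)

print(sum_sq_residuals(mean))        # the minimum
print(sum_sq_residuals(mean + 1.0))  # any other guess does worse
```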
This property is often exploited in a wide variety of applications, including general problems of statistical estimation and machine learning, to estimate (probabilistic) quantities of interest via Monte Carlo methods.
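A minimal Monte Carlo sketch, assuming U uniform on [0, 1] and g(u) = u² (both illustrative choices): averaging g over many random draws approximates E[g(U)], whose true value here is the integral of u² from 0 to 1, i.e. 1/3.

```python
import random

# Monte Carlo estimate of E[g(U)] for U ~ Uniform(0, 1), g(u) = u**2.
random.seed(42)
n = 200_000
estimate = sum(random.random() ** 2 for _ in range(n)) / n
print(estimate)  # close to 1/3
```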
The expected value plays important roles in a variety of contexts. In regression analysis, one desires a formula in terms of observed data that will give a "good" estimate of the parameter giving the effect of some explanatory variable upon a dependent variable. The formula will give different estimates using different samples of data, so the estimate it gives is itself a random variable. A formula is typically considered good in this context if it is an unbiased estimator—that is, if the expected value of the estimate (the average value it would give over an arbitrarily large number of separate samples) can be shown to equal the true value of the desired parameter.
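Unbiasedness can likewise be sketched by simulation: each individual sample gives a noisy estimate, but averaged over many separate samples, the sample mean centers on the true expected value (3.5 for a fair die; the sample size and repetition count below are arbitrary):

```python
import random

# Sketch of unbiasedness: the average of many sample means equals the
# true expected value, even though each sample mean is noisy.
random.seed(1)

def sample_mean(size):
    """Mean of one sample of fair six-sided die rolls."""
    return sum(random.randint(1, 6) for _ in range(size)) / size

estimates = [sample_mean(30) for _ in range(5_000)]
print(sum(estimates) / len(estimates))  # close to 3.5
```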
In decision theory, and in particular in choice under uncertainty, an agent is described as making an optimal choice in the context of incomplete information. For risk-neutral agents, the choice involves using the expected values of uncertain quantities, while for risk-averse agents it involves maximizing the expected value of some objective function such as a von Neumann–Morgenstern utility function.
Sources
Boundless vets and curates high-quality, openly licensed content from around the Internet. This particular resource used the following sources:
Source: Boundless. “Expected Value.” Boundless Statistics. Boundless, 21 Jul. 2015. Retrieved 30 Aug. 2015 from https://www.boundless.com/statistics/textbooks/boundlessstatisticstextbook/continuousrandomvariables10/expectedvalueandstandarderror42/expectedvalue2012651/