Examples of discrete random variables in the following topics:

 Provide examples of discrete random variables
Probability distributions for discrete random variables can be displayed as a formula, in a table, or in a graph.
 A discrete random variable X has a countable number of possible values.
 This histogram displays the probability of each of the three values the discrete random variable can take on.
 The probability mass function serves the same purpose as the probability histogram: it displays the specific probability of each value of the discrete random variable.
 This table shows the values the discrete random variable can take on and their corresponding probabilities.
 probability distribution (noun) a function of a discrete random variable yielding the probability that the variable will have a given value
 probability mass function (noun) a function that gives the relative probability that a discrete random variable is exactly equal to some value
 discrete random variable (noun) obtained by counting values for which there are no in-between values, such as the integers 0, 1, 2, ….
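As a minimal sketch of displaying such a distribution as a table, consider the hypothetical random variable X = the number of heads in two fair coin flips (the variable and its probabilities are an illustrative assumption, not from the source):

```python
# Hypothetical discrete random variable: X = number of heads in two
# fair coin flips. Its probability distribution, shown as a table.
from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

for x, p in pmf.items():
    print(f"x = {x}   P(X = x) = {p}")

# A valid probability distribution must sum to 1.
assert sum(pmf.values()) == 1
```

The same dictionary could equally be rendered as a histogram or written as a formula; the table form simply lists each value with its probability.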

 Contrast discrete and continuous variables
A random variable X, and its distribution, can be discrete or continuous.
 A discrete random variable has a countable number of possible values.
 The probability of each value of a discrete random variable is between 0 and 1, and the sum of all the probabilities is equal to 1.
 Discrete random variables can take on either a finite or at most a countably infinite set of discrete values (for example, the integers).
 Examples of discrete random variables include the values obtained from rolling a die and the grades received on a test out of 100.
 discrete random variable (noun) obtained by counting values for which there are no in-between values, such as the integers 0, 1, 2, ….
 continuous random variable (noun) obtained from data that can take infinitely many values
 random variable (noun) a quantity whose value is random and to which a probability distribution is assigned, such as the possible outcome of a roll of a die
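The two defining properties above (each probability between 0 and 1, and all probabilities summing to 1) can be checked directly. A minimal sketch, using a fair six-sided die as the discrete random variable:

```python
# X = outcome of one roll of a fair six-sided die: a discrete random
# variable taking a finite set of values {1, ..., 6}.
pmf = {face: 1 / 6 for face in range(1, 7)}

# Property 1: each probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in pmf.values())

# Property 2: the probabilities sum to 1 (up to floating-point error).
assert abs(sum(pmf.values()) - 1) < 1e-12

print("valid discrete probability distribution")
```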

 Continuous random variables have many applications.
 The field of reliability depends on a variety of continuous random variables.
 NOTE: The values of discrete and continuous random variables can be ambiguous.
 For example, if X is equal to the exact distance you drive to work, then X is a continuous randomariable; but if X is equal to the number of miles (to the nearest mile) you drive to work, then X is a discrete random variable.
 How the random variable is defined is very important.
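The distinction above can be sketched in a few lines; the simulated commute distance is a hypothetical stand-in for real data:

```python
# The same measurement yields a continuous or a discrete random variable
# depending on how X is defined.
import random

random.seed(0)                      # for a reproducible illustration
distance = random.uniform(0, 30)    # exact distance: continuous
miles = round(distance)             # to the nearest mile: discrete

print(f"exact distance: {distance:.3f} miles (continuous)")
print(f"nearest mile:   {miles} (discrete)")
```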

 Determine the expected value of a discrete random variable
The expected value of a random variable is the weighted average of all possible values that this random variable can take on.
 A discrete random variable X has a countable number of possible values.
 The probability distribution of a discrete random variable X lists the values and their probabilities, such that x_{i} has a probability of p_{i}.
 In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable.
 discrete random variable (noun) obtained by counting values for which there are no in-between values, such as the integers 0, 1, 2, ….
 expected value (noun) of a discrete random variable, the sum of the probability of each possible outcome of the experiment multiplied by the value itself
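The weighted-average definition above translates directly into code. A minimal sketch, using a fair six-sided die (the `expected_value` helper is an illustrative name, not from the source):

```python
# Expected value of a discrete random variable: the sum of each possible
# value multiplied by its probability.
def expected_value(pmf):
    """Weighted average of values, with probabilities as weights."""
    return sum(x * p for x, p in pmf.items())

# X = outcome of a fair six-sided die.
die = {face: 1 / 6 for face in range(1, 7)}
print(expected_value(die))  # 3.5
```

Note that 3.5 is not a value the die can show; the expected value is a long-run average, not a possible outcome.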

 These two examples illustrate two different types of probability problems involving discrete random variables.
 Recall that discrete data are data that you can count.
 A random variable describes the outcomes of a statistical experiment in words.
 The values of a random variable can vary with each repetition of an experiment.
 In this chapter, you will study probability problems involving discrete random variables.

 Upper case letters like X or Y denote a random variable.
 Lower case letters like x or y denote the value of a random variable.
 If X is a random variable, then X is described in words, and x is given as a number.
 Because you can count the possible values that X can take on and the outcomes are random (the x values 0, 1, 2, 3), X is a discrete random variable.

 Practice finding the expected value of a random variable
Solve for the standard error of a sum
Expected value and standard error can provide useful information about the data recorded in an experiment.
 In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable, or values of a probability density function in the case of a continuous random variable.
 The expected value of a discrete random variable can be calculated by summing each possible value multiplied by its probability: $E\left[ X \right] = x_{1}p_{1} + x_{2}p_{2} + \dots + x_{k}p_{k}$.
 continuous random variable (noun) obtained from data that can take infinitely many values
 discrete random variable (noun) obtained by counting values for which there are no in-between values, such as the integers 0, 1, 2, ….
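For the sum of n independent draws from the same distribution, the expected value of the sum is n times the expected value of one draw, and the standard error of the sum is the square root of n times the standard deviation of one draw. A minimal sketch, assuming the draws are rolls of a fair six-sided die:

```python
# Expected value and standard error of the sum of n independent draws,
# using SE(sum) = sqrt(n) * SD(one draw).
from math import sqrt

die = {face: 1 / 6 for face in range(1, 7)}

ev = sum(x * p for x, p in die.items())               # E[X] = 3.5
var = sum((x - ev) ** 2 * p for x, p in die.items())  # Var[X] = 35/12
sd = sqrt(var)

n = 25
print(n * ev)        # expected value of the sum of 25 rolls: 87.5
print(sqrt(n) * sd)  # standard error of the sum: about 8.54
```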

 Contrast hypergeometric distribution and binomial distribution
Derive the hypergeometric probability distribution from a hypergeometric experiment
A hypergeometric random variable is a discrete random variable characterized by a fixed number of draws made without replacement, so the probability of success changes from draw to draw.
 As random selections are made from the population, each subsequent draw decreases the population, causing the probability of success to change with each draw.
 The hypergeometric distribution is a discrete probability distribution that describes the probability of k successes in n draws without replacement from a finite population of size N containing exactly K successes.
 A random variable follows the hypergeometric distribution if its probability mass function is given by:
$P(X = k) = \dfrac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}}$
where:
N is the population size,
K is the number of success states in the population,
n is the number of draws,
k is the number of successes, and
$\binom{a}{b}$ is a binomial coefficient.
 binomial distribution (noun) the discrete probability distribution of the number of successes in a sequence of n independent yes/no experiments, each of which yields success with probability p
 hypergeometric distribution (noun) a discrete probability distribution that describes the number of successes in a sequence of n draws from a finite population without replacement
 Bernoulli Trial (noun) an experiment whose outcome is random and can be either of two possible outcomes, "success" or "failure"
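The contrast between the two distributions can be made concrete by computing both probabilities for the same scenario. A minimal sketch (the deck-of-cards scenario and function names are illustrative assumptions):

```python
# Hypergeometric PMF (draws without replacement) versus binomial PMF
# (independent draws with a fixed success probability).
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(X = k): k successes in n draws without replacement from a
    population of N containing K success states."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

def binom_pmf(k, n, p):
    """P(X = k): k successes in n independent trials, each with
    success probability p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Draw 5 cards from a 52-card deck containing 4 aces.
N, K, n = 52, 4, 5
print(hypergeom_pmf(1, N, K, n))  # P(exactly 1 ace), about 0.2995
print(binom_pmf(1, n, K / N))     # with replacement, about 0.2792
```

The two answers differ because drawing without replacement changes the success probability on each draw, which is exactly the distinction drawn above.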

 The expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
 The weights used in computing this average are the probabilities in the case of a discrete random variable (that is, a random variable that can only take on a finite number of values, such as a roll of a pair of dice), or the values of a probability density function in the case of a continuous random variable (that is, a random variable that can assume a theoretically infinite number of values, such as the height of a person).
 From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
 random variable (noun) a quantity whose value is random and to which a probability distribution is assigned, such as the possible outcome of a roll of a die
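The integral definition for the continuous case can be approximated numerically. A minimal sketch, assuming the uniform density on [0, 1] and a simple midpoint Riemann sum (both the helper name and the step count are illustrative choices):

```python
# For a continuous random variable with density f, E[X] is the integral
# of x * f(x). Approximated here with a midpoint Riemann sum.
def expected_value(f, a, b, steps=100_000):
    dx = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * dx  # midpoint of the i-th subinterval
        total += x * f(x) * dx
    return total

# Uniform density on [0, 1]: f(x) = 1, so E[X] should be 0.5.
uniform_density = lambda x: 1.0
print(expected_value(uniform_density, 0.0, 1.0))  # close to 0.5
```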
