discrete random variable
(noun) Definition of discrete random variable
a variable whose values are obtained by counting, so that there are no in-between values, such as the integers 0, 1, 2, ….
Source: Boundless Learning, CC BY-SA 3.0
Examples of discrete random variable in the following topics:

The Poisson Random Variable
 The Poisson Distribution and Its History: The Poisson distribution is a discrete probability distribution.
 The work focused on certain random variables N that count, among other things, the number of discrete occurrences (sometimes called "events" or “arrivals”) that take place during a time interval of given length.
 The Poisson random variable, then, is the number of successes that result from a Poisson experiment, and the probability distribution of a Poisson random variable is called a Poisson distribution.
 The Poisson random variable satisfies the following conditions: The number of successes in two disjoint time intervals is independent.
 Apart from disjoint time intervals, the Poisson random variable also applies to disjoint regions of space.
 The Poisson random variable is a discrete random variable that counts the number of times a certain event will occur in a specific interval.
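As an illustrative sketch (not part of the original text), the standard Poisson probability mass function $P(N=k)=\lambda^k e^{-\lambda}/k!$ can be computed directly; the rate value used below is an assumed example:

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) = lam**k * e**(-lam) / k! for a Poisson random variable
    with mean rate lam per interval (the standard PMF)."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# Example: probability of exactly 2 arrivals in an interval with mean rate 3
p2 = poisson_pmf(2, 3.0)
print(round(p2, 4))  # 0.224
```

Because the Poisson variable counts occurrences 0, 1, 2, …, the probabilities over all nonnegative integers sum to 1.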

The Hypergeometric Random Variable
 The hypergeometric distribution is a discrete probability distribution that describes the probability of k successes in n draws without replacement from a finite population of size N containing exactly K successes.
 As random selections are made from the population, each subsequent draw decreases the population, causing the probability of success to change with each draw.
 A random variable follows the hypergeometric distribution if its probability mass function is given by: $P(X=k)=\frac{\binom{K}{k}\binom{N-K}{n-k}}{\binom{N}{n}}$, where: N is the population size, K is the number of success states in the population, n is the number of draws, k is the number of successes, and $\binom{a}{b}$ is a binomial coefficient.
 A hypergeometric random variable is a discrete random variable characterized by a fixed number of trials in which the probability of success changes from draw to draw.
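A minimal sketch of the hypergeometric PMF using only the standard library; the urn numbers below are assumed example values, not from the text:

```python
import math

def hypergeom_pmf(k, N, K, n):
    """P(X = k): probability of exactly k successes in n draws
    without replacement from a population of size N containing
    K success states (binomial coefficients via math.comb)."""
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

# Assumed example: an urn of N=50 marbles, K=5 of them red;
# probability of drawing exactly k=1 red marble in n=10 draws
p = hypergeom_pmf(1, 50, 5, 10)
```

The probabilities over all feasible k (here 0 through 5) sum to 1, which is a quick sanity check on the formula.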

Expected Values of Discrete Random Variables
 Discrete Random Variable A discrete random variable X has a countable number of possible values.
 The probability distribution of a discrete random variable X lists the values and their probabilities, such that xi has a probability of pi.
 Expected Value Definition In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable.
 Then the expectation of this random variable X is defined as: $E[X] = x_1p_1 + x_2p_2 + \dots + x_kp_k$, which can also be written as: $E[X] = \sum_i x_ip_i$.
 The expected value of a random variable is the weighted average of all possible values that this random variable can take on.
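The weighted-average definition above translates directly into code; the values and probabilities here are illustrative assumptions, not from the text:

```python
# Expected value of a discrete random variable as a weighted average
# of its values, weighted by their probabilities (illustrative data).
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

# E[X] = sum of x_i * p_i over all possible values
ev = sum(x * p for x, p in zip(values, probs))  # 1*0.2 + 2*0.5 + 3*0.3
```

Note that the probabilities must sum to 1 for this to be a valid distribution.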

Probability Distributions for Discrete Random Variables
 A discrete random variable X has a countable number of possible values.
 The probability distribution of a discrete random variable X lists the values and their probabilities, where value x1 has probability p1, value x2 has probability p2, and so on.
 Examples of discrete random variables include: the number of eggs that a hen lays in a given day (it can't be 2.3); the number of people going to a given soccer match; the number of students who come to class on a given day; the number of people in line at McDonald's on a given day and time. A discrete probability distribution can be described by a table, by a formula, or by a graph.
 For example, suppose that X is a random variable that represents the number of people waiting in line at a fast-food restaurant and it happens to only take the values 2, 3, or 5 with probabilities 2/10, 3/10, and 5/10 respectively.
 The probability mass function has the same purpose as the probability histogram, and displays the specific probability for each value of the discrete random variable.
 Probability distributions for discrete random variables can be displayed as a formula, in a table, or in a graph.
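The fast-food line example above can be written as a table in code; using exact fractions (a sketch, with the probabilities taken from the text) makes the sum-to-one check exact:

```python
from fractions import Fraction

# X takes the values 2, 3, 5 with probabilities 2/10, 3/10, 5/10:
# the probability distribution as a value -> probability table.
pmf = {2: Fraction(2, 10), 3: Fraction(3, 10), 5: Fraction(5, 10)}

# A valid discrete distribution: all probabilities sum to 1
total = sum(pmf.values())
```

From this table one can read off any probability directly, e.g. P(X = 3) = 3/10.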

Expected Value and Standard Error
 Expected Value: In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable, or values of a probability density function in the case of a continuous random variable. The expected value may be intuitively understood by the law of large numbers: the expected value, when it exists, is almost surely the limit of the sample mean as sample size grows to infinity.
 The value may not be expected in the ordinary sense: the "expected value" itself may be unlikely or even impossible (such as having 2.5 children), as is also the case with the sample mean. The expected value of a random variable can be calculated by summing together all the possible values with their weights (probabilities): $E\left[X\right]= x_{1}p_{1}+x_{2}p_{2}+\dots+x_{k}p_{k}$.

Probability Histograms
 Suppose we want to create a probability histogram for the discrete random variable X that represents the number of heads in four tosses of a coin.
 We know the random variable X can take on the values of 0, 1, 2, 3, or 4.
 The probability of each of the random variables X is as follows: P(X=0)=1/16=0.0625; P(X=1)=4/16=0.25; P(X=2)=6/16=0.375; P(X=3)=4/16=0.25; P(X=4)=1/16=0.0625.
 The x-axis would be labeled with the possible values of the random variable X: in this case, 0, 1, 2, 3, and 4.
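The five probabilities listed above come from the binomial distribution with n = 4 tosses and success probability 1/2; a short sketch reproduces the histogram's heights:

```python
import math

# Number of heads in four tosses of a fair coin: X ~ Binomial(4, 1/2).
# Each probability is C(4, k) / 2**4, matching the values in the text.
pmf = {k: math.comb(4, k) / 2 ** 4 for k in range(5)}
print(pmf)  # {0: 0.0625, 1: 0.25, 2: 0.375, 3: 0.25, 4: 0.0625}
```

Plotting these values against 0 through 4 on the x-axis gives exactly the probability histogram described.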

Expected Value
 In probability theory, the expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
 The weights used in computing this average are the probabilities in the case of a discrete random variable (that is, a random variable that can only take on a finite or countably infinite number of values, such as a roll of a pair of dice), or the values of a probability density function in the case of a continuous random variable (that is, a random variable that can assume a theoretically infinite number of values, such as the height of a person).
 From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
 Thus, for a continuous random variable the expected value is the limit of the weighted sum, i.e. the integral.
 Simple Example Suppose we have a random variable X, which represents the number of girls in a family of three children.
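The three-children example can be worked out numerically under the common simplifying assumption (not stated in the text) that each child is independently a girl with probability 1/2, so X follows a Binomial(3, 1/2) distribution:

```python
import math

# Number of girls among three children, assuming each child is
# independently a girl with probability 1/2: X ~ Binomial(3, 1/2).
pmf = {k: math.comb(3, k) / 2 ** 3 for k in range(4)}

# E[X] = 0*(1/8) + 1*(3/8) + 2*(3/8) + 3*(1/8) = 1.5
ev = sum(k * p for k, p in pmf.items())
print(ev)  # 1.5
```

The result, 1.5 girls, is itself impossible for any single family, which illustrates the earlier point that the expected value need not be a possible outcome.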

The Correction Factor