Examples of discrete random variables in the following topics:

 Probability distributions for discrete random variables can be displayed as a formula, in a table, or in a graph.
 A discrete random variable X has a countable number of possible values.
 The probability mass function serves the same purpose as the probability histogram: it displays the probability of each value of the discrete random variable.
 This histogram displays the probabilities of each of the three values of the discrete random variable.
 This table shows the values the discrete random variable can take on and their corresponding probabilities.
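
 A distribution like the table described above can be sketched in code as a mapping from values to probabilities. The specific values and probabilities below are illustrative assumptions, not taken from the text:

```python
# Sketch of a discrete probability distribution displayed as a table.
# X and its probabilities here are hypothetical examples.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}  # value -> probability

# A valid probability distribution assigns each value a probability
# between 0 and 1, and the probabilities sum to 1.
assert all(0 <= p <= 1 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9

for x, p in pmf.items():
    print(f"P(X = {x}) = {p}")
```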

 The expected value of a random variable is the weighted average of all possible values that this random variable can take on.
 A discrete random variable X has a countable number of possible values.
 The probability distribution of a discrete random variable X lists the values and their probabilities, such that xi has a probability of pi.
 In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable.
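
 The weighted average described above can be sketched with a familiar example. The fair six-sided die here is an illustrative assumption:

```python
# Sketch: expected value of a discrete random variable as a weighted
# average of its possible values, with probabilities as the weights.
values = [1, 2, 3, 4, 5, 6]   # faces of a fair die (illustrative example)
probs = [1 / 6] * 6           # each face is equally likely

# E(X) = sum of (value * probability) over all possible values
expected_value = sum(x * p for x, p in zip(values, probs))
print(expected_value)  # the mean of a fair die roll, 21/6 = 3.5
```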

 The field of reliability depends on a variety of continuous random variables.
 This chapter gives an introduction to continuous random variables and the many continuous distributions.
 NOTE: The values of discrete and continuous random variables can be ambiguous.
 For example, if X is equal to the number of miles (to the nearest mile) you drive to work, then X is a discrete random variable.
 If X is the distance you drive to work, then you measure values of X and X is a continuous random variable.
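
 The distinction above can be sketched in a line or two. The measured distance used here is a hypothetical number:

```python
# Sketch: the same commute distance as a continuous measurement
# versus a discrete count of miles.
measured_distance = 12.37  # hypothetical measurement in miles; continuous

# Reported to the nearest mile, the value becomes countable, hence discrete.
miles_to_work = round(measured_distance)
print(miles_to_work)  # 12
```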

 A random variable X, and its distribution, can be discrete or continuous.
 Random variables can be classified as either discrete (that is, taking any of a specified list of exact values) or as continuous (taking any numerical value in an interval or collection of intervals).
 Discrete random variables can take on either a finite or at most a countably infinite set of discrete values (for example, the integers).
 Examples of discrete random variables include the values obtained from rolling a die and the grades received on a test out of 100.
 Selecting a random number between 0 and 1 is an example of a continuous random variable, because there are infinitely many possibilities.
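
 Both examples above can be sketched with Python's standard `random` module (the seed is fixed only so the sketch is reproducible):

```python
import random

random.seed(0)  # fixed seed so this sketch is reproducible

die_roll = random.randint(1, 6)  # discrete: one of six exact values
uniform_draw = random.random()   # continuous: any value in [0, 1)

# The die roll comes from a countable list of outcomes;
# the uniform draw can land anywhere in an interval.
assert die_roll in {1, 2, 3, 4, 5, 6}
assert 0.0 <= uniform_draw < 1.0
print(die_roll, uniform_draw)
```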

 These two examples illustrate two different types of probability problems involving discrete random variables.
 A random variable describes the outcomes of a statistical experiment in words.
 The values of a random variable can vary with each repetition of an experiment.
 In this chapter, you will study probability problems involving discrete probability distributions.

 Upper case letters like X or Y denote a random variable.
 Lower case letters like x or y denote the value of a random variable.
 If X is a random variable, then X is written in words.
 Because you can count the possible values that X can take on and the outcomes are random (the x values 0, 1, 2, 3), X is a discrete random variable.

 A hypergeometric random variable is a discrete random variable characterized by a fixed number of dependent trials in which the probability of success changes from draw to draw.
 The hypergeometric distribution is a discrete probability distribution that describes the probability of k successes in n draws without replacement from a finite population of size N containing exactly K successes.
 As random selections are made from the population, each subsequent draw decreases the population size, causing the probability of success to change with each draw.
 A random variable follows the hypergeometric distribution if its probability mass function is given by P(X = k) = C(K, k) · C(N − K, n − k) / C(N, n), where C(a, b) denotes the binomial coefficient, N is the population size, K is the number of success states in the population, n is the number of draws, and k is the number of observed successes.
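
 The hypergeometric probability mass function can be sketched directly with binomial coefficients. The card-deck numbers below are an illustrative assumption:

```python
from math import comb

def hypergeom_pmf(N, K, n, k):
    """P(X = k): probability of k successes in n draws without
    replacement from a population of size N containing K successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Illustrative example (an assumption, not from the text): draw 5 cards
# from a standard 52-card deck; X counts how many of the 13 hearts appear.
p = hypergeom_pmf(N=52, K=13, n=5, k=2)
print(round(p, 4))

# Sanity check: the probabilities over all possible k sum to 1.
total = sum(hypergeom_pmf(52, 13, 5, k) for k in range(6))
assert abs(total - 1.0) < 1e-9
```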

 The Poisson random variable is a discrete random variable that counts the number of times a certain event will occur in a specific interval.
 Assuming that the process that produces the event flow is essentially random, the Poisson distribution specifies how likely it is that the count will be 3, 5, 10, or any other number during one period of observation.
 The work focused on certain random variables N that count, among other things, the number of discrete occurrences (sometimes called "events" or "arrivals") that take place during a time interval of given length.
 The Poisson random variable, then, is the number of successes that result from a Poisson experiment, and the probability distribution of a Poisson random variable is called a Poisson distribution.
 Apart from disjoint time intervals, the Poisson random variable also applies to disjoint regions of space.
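
 The counting behavior described above can be sketched with the Poisson probability mass function P(N = k) = e^(−λ) λ^k / k!. The rate λ = 4 events per interval is a hypothetical choice:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson random variable with mean lam
    occurrences per interval of observation."""
    return exp(-lam) * lam**k / factorial(k)

# Hypothetical rate: on average 4 events per observation period.
# How likely is a count of 3, 5, or 10 during one period?
for k in (3, 5, 10):
    print(k, round(poisson_pmf(k, 4.0), 4))
```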

 In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable, or values of a probability density function in the case of a continuous random variable.
 The expected value of a random variable can be calculated by summing together all the possible values, each weighted by its probability: E(X) = x1·p1 + x2·p2 + ⋯ = Σ xi·pi.

 In probability theory, the expected value refers, intuitively, to the value of a random variable one would "expect" to find if one could repeat the random variable process an infinite number of times and take the average of the values obtained.
 The weights used in computing this average are the probabilities in the case of a discrete random variable (that is, a random variable that can only take on a finite number of values, such as a roll of a pair of dice), or the values of a probability density function in the case of a continuous random variable (that is, a random variable that can assume a theoretically infinite number of values, such as the height of a person).
 From a rigorous theoretical standpoint, the expected value of a continuous variable is the integral of the random variable with respect to its probability measure.
 Thus, for a continuous random variable the expected value is the limit of the weighted sum, i.e. the integral.
 Suppose we have a random variable X, which represents the number of girls in a family of three children.
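
 This three-children example can be sketched by enumerating the equally likely outcomes, under the usual textbook assumption that each child is independently a girl with probability 1/2:

```python
from itertools import product

# X = number of girls among three children. Assumption: each child is
# independently a girl ("G") or boy ("B") with probability 1/2.
outcomes = list(product("GB", repeat=3))  # 8 equally likely outcomes

# Build the probability distribution of X by counting girls per outcome.
pmf = {}
for outcome in outcomes:
    x = outcome.count("G")
    pmf[x] = pmf.get(x, 0) + 1 / len(outcomes)

print(pmf)  # {3: 0.125, 2: 0.375, 1: 0.375, 0: 0.125}

# Expected value: the weighted average of the possible values of X.
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 1.5
```

 The expected number of girls, 1.5, is not itself a possible value of X; the expected value is a long-run average, not an outcome one expects to observe in a single family.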