Examples of discrete random variables in the following topics:

 Probability distributions for discrete random variables can be displayed as a formula, in a table, or in a graph.
 A discrete random variable X has a countable number of possible values.
 The probability mass function serves the same purpose as the probability histogram: it displays the specific probability for each value of the discrete random variable.
 This histogram displays the probabilities of each of the three possible values of the discrete random variable.
 This table shows the values that the discrete random variable can take on and their corresponding probabilities.
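A minimal sketch of such a table in code, using a hypothetical three-value random variable (the values and probabilities below are made up for illustration):

```python
# Probability mass function for a hypothetical discrete random variable X
# taking the three values 0, 1, and 2 (numbers assumed for illustration).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # value -> probability

# A valid PMF: every probability lies in [0, 1] and they sum to 1.
assert all(0 <= p <= 1 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9

# P(X = 1) is read straight from the table.
print(pmf[1])  # → 0.5
```

The same dictionary could be plotted as a probability histogram, with one bar per value.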

 The expected value of a random variable is the weighted average of all possible values that this random variable can take on.
 A discrete random variable X has a countable number of possible values.
 The probability distribution of a discrete random variable X lists the values and their probabilities, such that xi has a probability of pi.
 In probability theory, the expected value (or expectation, mathematical expectation, EV, mean, or first moment) of a random variable is the weighted average of all possible values that this random variable can take on.
 The weights used in computing this average are probabilities in the case of a discrete random variable.
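This weighted average can be computed directly from a PMF. The sketch below uses a fair six-sided die as the standard example, with exact fractions to avoid floating-point rounding:

```python
from fractions import Fraction

def expected_value(pmf):
    """Weighted average: each value weighted by its probability."""
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: each face 1..6 has probability 1/6.
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(float(expected_value(die)))  # → 3.5
```

Note that 3.5 is not itself a possible roll; the expected value is a long-run average, not a value the variable must take.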

 A random variable X, and its distribution, can be discrete or continuous.
 Random variables can be classified as either discrete (that is, taking any of a specified list of exact values) or as continuous (taking any numerical value in an interval or collection of intervals).
 Discrete random variables can take on either a finite or at most a countably infinite set of discrete values (for example, the integers).
 Examples of discrete random variables include the values obtained from rolling a die and the grades received on a test out of 100.
 Selecting a random number between 0 and 1 is an example of a continuous random variable, because there are infinitely many possibilities.
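The two examples above can be contrasted in a short sketch using Python's standard `random` module (the seed is arbitrary, chosen only to make the run reproducible):

```python
import random

random.seed(0)  # arbitrary seed for reproducibility

# Discrete: a die roll takes one of six exact values.
roll = random.randint(1, 6)

# Continuous: a uniform draw on [0, 1) can be any real number in the interval.
u = random.random()

print(roll in {1, 2, 3, 4, 5, 6})  # → True
print(0.0 <= u < 1.0)              # → True
```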

 To define probability distributions for the simplest cases, one needs to distinguish between discrete and continuous random variables.
 In the discrete case, one can easily assign a probability to each possible value.
 In contrast, when a random variable takes values from a continuum, any single value has probability zero; probabilities are nonzero only for intervals.
 Intuitively, a continuous random variable is one that can take a continuous range of values, as opposed to a discrete distribution, where the set of possible values for the random variable is, at most, countable.
 If the distribution of X is continuous, then X is called a continuous random variable and, therefore, has a continuous probability distribution.
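For a concrete instance of the interval idea, consider a continuous uniform random variable on [0, 1], where the probability of an interval is simply its length. The helper name below is hypothetical:

```python
# P(a <= X <= b) for X ~ Uniform(0, 1): probability equals interval length.
# Any single point is an interval of length zero, so it has probability 0.
def p_interval(a, b):
    """Hypothetical helper: probability that X falls in [a, b]."""
    return max(0.0, min(b, 1.0) - max(a, 0.0))

print(round(p_interval(0.2, 0.5), 10))  # → 0.3
print(p_interval(0.4, 0.4))             # → 0.0  (a single point)
```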

 A stochastic process is a collection of random variables that is often used to represent the evolution of some random value over time.
 In probability theory, a stochastic process (sometimes called a random process) is a collection of random variables that is often used to represent the evolution of some random value, or system, over time.
 In the simple case of discrete time, a stochastic process amounts to a sequence of random variables known as a time series (for example, a Markov chain).
 Random variables are non-deterministic (single) quantities that have certain probability distributions.
 Random variables corresponding to various times (or points, in the case of random fields) may be completely different.
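A minimal sketch of a discrete-time stochastic process: a two-state Markov chain, where each time index yields a random variable whose distribution depends only on the current state. The state names and transition probabilities here are invented for illustration:

```python
import random

# Hypothetical two-state weather chain: state -> [(next_state, probability)].
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, steps, rng):
    """Generate the time series X_0, X_1, ..., X_steps: each step draws
    the next state using only the current state's transition probabilities."""
    state, path = start, [start]
    for _ in range(steps):
        states, probs = zip(*transitions[state])
        state = rng.choices(states, weights=probs)[0]
        path.append(state)
    return path

path = simulate("sunny", 5, random.Random(0))
print(len(path))                            # → 6
print(all(s in transitions for s in path))  # → True
```

Each position in `path` is a realization of a different random variable, and those variables need not share a distribution, which is the point of the sentence above.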