Random variables

• A random variable is a function from the outcomes of an experiment to the real numbers.
• A probability distribution function specifies the probabilities associated with the values of the random variable.
Example:
If the outcomes of an experiment with associated probabilities are:

```
outcome:        Apple   Banana   Cake   Doughnut   Eclair
P(outcome)        .3       .1     .4       .1        .1
```

and a random variable X the function:

```
outcome:        Apple   Banana   Cake   Doughnut   Eclair
X(outcome)         1        1     10        5        12
```

the resultant probability distribution function of X is

```
X          1     5    10    12
P(X)      .4    .1    .4    .1
```
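The distribution of X can be built mechanically from the outcome probabilities: sum the probabilities of all outcomes that X maps to the same value (here both Apple and Banana map to 1). A minimal sketch, using hypothetical dictionary names for the two tables above:

```python
# Probabilities of the outcomes, and the value X assigns to each outcome.
p_outcome = {"Apple": .3, "Banana": .1, "Cake": .4, "Doughnut": .1, "Eclair": .1}
x_value = {"Apple": 1, "Banana": 1, "Cake": 10, "Doughnut": 5, "Eclair": 12}

# P(X = v) is the total probability of all outcomes mapped to v.
p_x = {}
for outcome, p in p_outcome.items():
    v = x_value[outcome]
    p_x[v] = p_x.get(v, 0.0) + p

print({v: round(p, 4) for v, p in sorted(p_x.items())})
```

Note that Apple's .3 and Banana's .1 combine to give P(X = 1) = .4, matching the table.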

Often we shall be working with probability distributions without knowing the probability space or random variable which produced them.
Just as with a frequency distribution, for a probability distribution we can calculate the mean of a random variable X, which is called the expected value and denoted by

E[X] = Σ x(i) P(x(i)) = Σ x(i) p(i)
We can also calculate the variance, which is denoted by

V[X] = Σ (x(i) − E[X])² p(i)
In the above example, the mean is E[X] = 1 × .4 + 5 × .1 + 10 × .4 + 12 × .1 = 6.1
and the variance is V[X] = Σ (x(i) − E[X])² p(i) =
(1 − 6.1)² × .4 + (5 − 6.1)² × .1 + (10 − 6.1)² × .4 + (12 − 6.1)² × .1 = 20.09
As before, the standard deviation (denoted by a lower-case sigma, σ) is the square root of the variance; in this example σ = √20.09 ≈ 4.4822.
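The mean, variance, and standard deviation of the example above can be checked directly from the definitions; a short sketch:

```python
import math

# The distribution of X from the example.
xs = [1, 5, 10, 12]
ps = [.4, .1, .4, .1]

# E[X] = sum of x(i) * p(i).
mean = sum(x * p for x, p in zip(xs, ps))

# V[X] = sum of (x(i) - E[X])^2 * p(i).
var = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))

# Standard deviation is the square root of the variance.
sd = math.sqrt(var)

print(round(mean, 4), round(var, 4), round(sd, 4))
```

Running this reproduces E[X] = 6.1, V[X] = 20.09, and σ ≈ 4.4822.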

Two rules for means and variances of random variables which shall be useful are:

• If X and Y are two random variables, E[X+Y] = E[X] + E[Y] (the expected value of a sum is the sum of the expected values).
• If X and Y are two independent random variables, V[X+Y] = V[X] + V[Y] (for independent random variables, the variance of a sum is the sum of the variances).
For example, you are invited to verify that the expected number of pips when a single die is thrown is 3.5, so the expected total number of pips when two dice are thrown is 7.
Likewise, the variance of the number of pips for a single die is 2.9167, and the variance for two dice is 5.8333. Note that the standard deviations for one die (1.7078) and two dice (2.4152) differ by a factor of the square root of two.
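The dice figures above can be verified by enumerating outcomes; a sketch assuming fair six-sided dice:

```python
import math
from itertools import product

# One fair die: faces 1..6, each with probability 1/6.
faces = range(1, 7)
e1 = sum(f / 6 for f in faces)                 # E[X] = 3.5
v1 = sum((f - e1) ** 2 / 6 for f in faces)     # V[X] = 35/12 = 2.9167

# Two dice: the 36 equally likely ordered outcomes.
totals = [a + b for a, b in product(faces, repeat=2)]
e2 = sum(totals) / 36                          # E[X+Y] = 7
v2 = sum((t - e2) ** 2 for t in totals) / 36   # V[X+Y] = 35/6 = 5.8333

print(round(math.sqrt(v1), 4), round(math.sqrt(v2), 4))
```

Because the two dice are independent, v2 is exactly 2 × v1, which is why the two standard deviations differ by a factor of √2.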

Competencies: Calculate the expected value, variance, and standard deviation for the following probability distribution function:

```
X          1     3     9    12
P(X)      .3    .2    .4    .1
```

Reflection: How can a given experiment be associated with different probability distribution functions? How can different experiments be associated with the same probability distribution function?
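To check your work on the competency exercise, the definitions can be wrapped in a small reusable function; a sketch (the function name is illustrative):

```python
import math

def summarize(xs, ps):
    """Return (mean, variance, standard deviation) for a discrete distribution."""
    mean = sum(x * p for x, p in zip(xs, ps))               # E[X]
    var = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))  # V[X]
    return mean, var, math.sqrt(var)

# The distribution from the competency exercise.
mean, var, sd = summarize([1, 3, 9, 12], [.3, .2, .4, .1])
print(round(mean, 4), round(var, 4), round(sd, 4))
```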