We're now going to review the concepts of conditional expectation and conditional

variances. We'll see the conditional expectation

identity as well as the conditional variance identity, and we'll see an

example where we put them to work. This material is useful because we will

use it later in the course when we discuss credit derivatives.

Let x and y be two random variables. Then the conditional expectation identity

states that the expected value of x can be calculated as follows.

We first compute the expected value of x conditional on y and then we actually

compute the expected value of that quantity.

Likewise, the conditional variance identity states that we can compute the

variance of x as the summation of two quantities.

First of all, we can compute the expected value of x given y, and then compute the

variance of this quantity. And secondly, we

compute the variance of x given y and compute the

expected value of this quantity. Okay.
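In symbols, the two identities just described are:

```latex
\mathbb{E}[x] \;=\; \mathbb{E}\!\left[\,\mathbb{E}[x \mid y]\,\right],
\qquad
\operatorname{Var}(x) \;=\; \operatorname{Var}\!\left(\mathbb{E}[x \mid y]\right) \;+\; \mathbb{E}\!\left[\operatorname{Var}(x \mid y)\right].
```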

One thing I want to emphasize here is that the expected value of x given y, and the

variance of x given y, are both functions of y, and are therefore random variables

themselves. So for example, I actually could write

this as g of y, say, and the variance of x given y,

I could write as h of y. So g of y, and h of y, are random

variables. So in fact, I can write the expected value

of x, as being the expected value of g of y, okay.

And, I can write the variance, of x, as then being equal to the variance, of the

random variable g of y. Plus the expected value of the random

variable h of y. Okay, so those are the conditional expectation

and conditional variance identities, and they can be very useful in many

applications. So we'll see one application here.

So, we want to compute a random sum of random variables.

In particular, we're going to set w equal to x1 plus x2 and so on up to xn, where the

xi's are IID with mean mu x and variance sigma x squared.

But where n is also a random variable and this random variable is assumed to be

independent of the xi's. So the question that arises is the

following. What is the expected value of w?

Well I can compute the expected value of w using the conditional expectation

identity. In particular the expected value of w.

Well, over here I can write that as being equal to the expected value of the

expected value of w given n. And this quantity inside here is w.

Okay. So the expected value of w, given n, if

you think about it, the expected value of w, given n, is equal to the expected value

of this summation, i equals 1 to n of the xi's.

And because n is a constant given n, I should have an n here in the upper limit, okay.

This is equal to the sum from i equals 1 to n of the expected value of the xi's, and

each of these is equal to mu x.

And there are n terms, so that's n mu x. And that's where this comes from here.

Okay. So now the mu x is a constant.

We can take it outside the outer expectation over here, and we're left with

the expected value of N. So that's how we compute the expected

value of w. How about the variance of w.
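Before deriving it, it's worth checking the mean result numerically; the same simulation also estimates the variance of w, which we derive next. This is only a sketch, and the distributional choices (Poisson n, normal xi's) and parameter values are mine, not from the lecture:

```python
import math
import random

random.seed(0)
lam, mu_x, sigma_x = 4.0, 2.0, 3.0  # arbitrary illustrative values
trials = 200_000

def poisson(lam):
    # Knuth's method for sampling a Poisson(lam) random variable.
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

# Simulate W = X1 + ... + XN with N ~ Poisson(lam), Xi ~ Normal(mu_x, sigma_x^2).
samples = []
for _ in range(trials):
    n = poisson(lam)
    samples.append(sum(random.gauss(mu_x, sigma_x) for _ in range(n)))

mean_w = sum(samples) / trials
var_w = sum((w - mean_w) ** 2 for w in samples) / trials

# Conditional expectation identity: E[W] = mu_x * E[N] = 2 * 4 = 8
# Conditional variance identity:
#   Var(W) = mu_x^2 * Var(N) + sigma_x^2 * E[N] = 4*4 + 9*4 = 52  (Poisson N)
print(mean_w)  # close to 8.0
print(var_w)   # close to 52.0
```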

Well, we can compute the variance of w by using the conditional variance identity.

So this is the conditional variance identity here. We've already calculated the expected

value of w given n. It's equal to n times mu x, so that's what

this quantity is down here. The variance of w given n, well that's the

variance of this quantity given n. These are n IID random variables, and the

variance of the sum of n IID random variables is simply n times the variance of one of

them, which is sigma x squared. And so we get n times sigma x squared

here. So now, the variance of mu x times n, well

mu x is a constant, so it comes outside the variance, and it comes outside as a

square. And we're left with mu x squared times the

variance of n. And over here, sigma x squared is a

constant, and it comes outside the expectation, and we're left with the expected

value of n. So that's how we compute the variance of

w. So here's an example with chickens and

eggs. A hen lays n eggs, where n is Poisson with

parameter lambda. Each egg hatches and yields a chicken with

probability p, independently of the other eggs.

Let k be the number of chickens. So the first question I want to ask is,

what is the expected value of k given n? And of course one of the reasons I want to

ask this question is because I want to introduce indicator functions, which are

often very useful in probability, and in fact we'll use them later in the course.

We'll be using indicator functions later in the course to describe the event of

companies defaulting on their bonds. So we'll use them to compute the expected

number of defaults in a basket of bonds, for example.

So this is a good place to introduce these

indicator functions. So what we're going to do is we're going

to write the total number of chickens, k, as being the sum from I equals 1 to

capital N times one subscript hi where hi is the event that the ith egg hatches.

Okay. So in particular, consider one subscript hi.

This is the indicator function, and it takes on 2 possible values.

It takes on the value 1 if the ith egg hatches.

And it takes on the value 0 otherwise. So in fact that's an indicator function: in

general, it takes on two values, one and zero.

One if the event in question occurs, zero otherwise.
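As a small illustration (my own, not from the lecture), an indicator is easy to express in code: it's just a 0/1 value attached to an event, and averaging draws of it recovers the event's probability.

```python
import random

random.seed(1)
p = 0.8  # hatching probability -- an arbitrary value chosen for illustration

# The indicator 1_{H_i}: takes the value 1 if the i-th egg hatches, 0 otherwise.
def indicator_hatches(p):
    return 1 if random.random() < p else 0

# Averaging many draws of the indicator estimates E[1_{H_i}] = P(H_i) = p.
draws = [indicator_hatches(p) for _ in range(100_000)]
estimate = sum(draws) / len(draws)
print(estimate)  # close to 0.8
```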

In this particular example the event in question is hi which is the event that the

ith egg hatches. Okay so we've written k as the sum from i

equals 1 to n of these indicator functions.

It's also clear that the expected value of one of these indicator functions is easily

computed. In particular, it takes on the value 1

with probability p. It takes on the value 0 with probability 1

minus p, and so this is equal to p. So the expected value of the indicator

function 1 hi is equal to p. So now the expected value of k given n is

the expected value of this quantity which is k given n and n is a constant at this

point because we've conditioned on its value. So, we can just take the expectation

inside the summation and get this, but we know the expected value of 1hi is equal to

p. There are n of these terms, so we get np.

So therefore the expected value of k given n equals n times p.

Which of course is what you'd expect if you've got n eggs and each of them hatches

with probability p. You would expect the total number of

chickens to be n times p. Okay.
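Written out, the calculation just described is:

```latex
\mathbb{E}[k \mid n]
\;=\; \mathbb{E}\!\left[\sum_{i=1}^{n} 1_{H_i} \,\middle|\, n\right]
\;=\; \sum_{i=1}^{n} \mathbb{E}\!\left[1_{H_i}\right]
\;=\; np.
```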

So now, we can use the conditional expectation identity to compute the expected

value of k. The expected value of k is equal to the

expected value of the expected value of k, given n.

But that we have calculated there: it's np.

So now, the expected value of k is the expected value of np.

p is a constant. So it can come outside over here.

And the expected value of n? Well n is Poisson we're told and if we

recall, the expected value of a Poisson random variable is equal to lambda, and so

that's why we get the lambda down here. And so we see that the expected number of

chickens is equal to lambda times p.
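Putting the whole example together, here's a short simulation sketch confirming that the average number of chickens comes out near lambda times p. The numeric values of lambda and p are my own choices, not from the lecture:

```python
import math
import random

random.seed(2)
lam, p = 5.0, 0.3  # arbitrary illustrative values
trials = 100_000

def poisson(lam):
    # Knuth's method for sampling a Poisson(lam) random variable.
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

total_chickens = 0
for _ in range(trials):
    n = poisson(lam)  # number of eggs the hen lays
    # Each egg hatches independently with probability p.
    total_chickens += sum(1 for _ in range(n) if random.random() < p)

print(total_chickens / trials)  # close to lam * p = 1.5
```

(As an aside, k here is in fact itself Poisson with parameter lambda times p, by the thinning property of the Poisson distribution, though the lecture only needs its mean.)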