To proceed, we have to discuss the variance of a sum of two random variables. Previously, we found that the variance of a sum is not necessarily equal to the sum of the variances. However, it turns out that if the two random variables are independent, the variance of the sum does equal the sum of the variances. To prove this, we first have to prove an auxiliary lemma, and its proof also introduces the notion of the covariance of two random variables.

Let us find the variance of the sum of two random variables X and Y. We will use the definition of variance to do it:

Var(X + Y) = E[(X + Y − E[X + Y])²].

Now we can use the property that the expected value of a sum is equal to the sum of the expected values, E[X + Y] = E[X] + E[Y], and rearrange the terms inside the square:

Var(X + Y) = E[((X − E[X]) + (Y − E[Y]))²].

Expanding this square, we get

Var(X + Y) = E[(X − E[X])² + 2(X − E[X])(Y − E[Y]) + (Y − E[Y])²],

and since the expected value of the sum of these three random variables is equal to the sum of their expected values,

Var(X + Y) = E[(X − E[X])²] + 2 E[(X − E[X])(Y − E[Y])] + E[(Y − E[Y])²].

The first expected value here is Var(X), and the last one is Var(Y). But we also have the middle term, and this term, apart from the coefficient 2, is called the covariance of the random variables X and Y:

Cov(X, Y) = E[(X − E[X])(Y − E[Y])],

so that Var(X + Y) = Var(X) + 2 Cov(X, Y) + Var(Y). Covariance measures a certain kind of association between X and Y. It is strongly related to dependence between X and Y, as we will see in the next lemma: if two random variables are independent, then their covariance is equal to 0.

Let us prove it. First, consider a simple case: assume that E[X] = 0 and E[Y] = 0. In this case,

Cov(X, Y) = E[(X − 0)(Y − 0)] = E[XY].

Due to the independence of X and Y, the expected value of the product is equal to the product of the expected values: E[XY] = E[X] E[Y]. But both expected values are equal to 0, so their product is also 0. The crucial idea here is that this equality holds precisely because X and Y are independent. In the general case the formulas will be a little more complicated, but the idea is the same.

Now, consider the general case. Expanding the product (X − E[X])(Y − E[Y]) gives a sum of four random variables, so we can split the covariance into a sum of four expected values:

Cov(X, Y) = E[XY] − E[X E[Y]] − E[E[X] Y] + E[E[X] E[Y]].

The first term is the expected value of the product of X and Y; due to independence, it is equal to E[X] E[Y]. In the second term, X is a random variable but E[Y] is just a number, so we can move this number out of the expectation sign: E[X E[Y]] = E[Y] E[X], and we again get the product E[X] E[Y], this time with a minus sign. The same thing happens in the third term: we can move E[X] out of the expectation and again get the same product with a minus sign. Finally, in the last term, the product E[X] E[Y] is just a number. It is not a random variable, it is a constant, so the expected value of this constant is equal to the constant itself, and we get plus E[X] E[Y].

So we have four terms, each consisting of the product of E[X] and E[Y], with different signs:

Cov(X, Y) = E[X] E[Y] − E[X] E[Y] − E[X] E[Y] + E[X] E[Y] = 0.

The first two terms cancel each other, and the last two terms cancel each other as well, so overall this is equal to 0, as expected. This finishes the proof.

A direct corollary of this lemma: if X and Y are independent, then the covariance term is equal to 0, and in this case

Var(X + Y) = Var(X) + Var(Y).

Now, let us discuss some properties of covariance.
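As a quick sanity check of both the lemma and the corollary, here is a minimal simulation sketch in Python (assuming NumPy; the particular distributions, seed, and sample size are arbitrary illustrative choices, not part of the lecture). It estimates Cov(X, Y) for independent samples, compares Var(X + Y) with Var(X) + Var(Y), and then shows a dependent pair where the extra covariance term does not vanish.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # large sample, so estimates are close to the true values

# Independent X and Y: Cov(X, Y) should be close to 0.
x = rng.normal(loc=1.0, scale=2.0, size=n)   # Var(X) = 4
y = rng.uniform(low=0.0, high=3.0, size=n)   # Var(Y) = 9/12 = 0.75

# Sample covariance: mean of the product of deviations from the means.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
print(cov_xy)                   # ~ 0, since X and Y are independent
print(np.var(x + y))            # ~ 4.75
print(np.var(x) + np.var(y))    # ~ 4.75, matches because Cov(X, Y) ~ 0

# Dependent pair: take Y = X, so Cov(X, X) = Var(X) is far from 0.
print(np.var(x + x))            # ~ 16 = Var(X) + Var(X) + 2 Cov(X, X)
print(np.var(x) + np.var(x))    # ~ 8: the sum of variances alone is wrong here
```

The dependent case makes the corollary's hypothesis visible: Var(X + X) = Var(2X) = 4 Var(X), which exceeds Var(X) + Var(X) exactly by the term 2 Cov(X, X) from the derivation above.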