Covariance measures a certain association between two random variables. Let us discuss some properties of covariance. First, as we discussed just now, covariance can be used to determine the variance of a sum of two random variables. Second, if two random variables are independent, their covariance is equal to zero.

But is the converse true? That is, if the covariance is equal to zero, does it follow that X and Y are independent? Let us consider the following example. Take two random variables X and Y with the following joint distribution. X takes the values -1, 0 and 1; Y takes the values 1 and -2. The joint probabilities are: P(X = -1, Y = 1) = 1/3, P(X = 0, Y = 1) = 0, P(X = 1, Y = 1) = 1/3, and in the second row, P(X = -1, Y = -2) = 0, P(X = 0, Y = -2) = 1/3, P(X = 1, Y = -2) = 0.

From this we can find the marginal distributions. For X, every value it can take, it takes with probability 1/3, because the zeros do not contribute: each column of the table sums to 1/3. For Y, the row sums give 2/3 for Y = 1 and 1/3 for Y = -2. It is clear that X and Y are not independent of each other: for example, P(X = 0, Y = 1) = 0 is not the product of P(X = 0) = 1/3 and P(Y = 1) = 2/3.

But what can we say about the covariance between X and Y? First of all, we have to find the expected values of X and Y. What is the expected value of X? X takes the value -1 with probability 1/3, 0 with probability 1/3 and 1 with probability 1/3. This distribution is symmetric, so the expected value of X is equal to zero. A direct calculation shows that the expected value of Y also equals zero: E(Y) = 1 × 2/3 + (-2) × 1/3 = 0. Now, since both expected values are zero, to find the covariance between X and Y we just need to find the expected value of their product, and this expected value can be calculated in the following way.
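This example is easy to check numerically. Here is a minimal sketch in Python (the dictionary layout and variable names are my own), computing the marginal distributions and expected values directly from the joint table:

```python
from fractions import Fraction as F

# Joint distribution from the example: P(X = x, Y = y)
joint = {
    (-1, 1): F(1, 3), (0, 1): F(0), (1, 1): F(1, 3),
    (-1, -2): F(0), (0, -2): F(1, 3), (1, -2): F(0),
}

# Marginal distributions, obtained by summing over the other variable
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (-1, 0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (1, -2)}

print(px)  # each value of X has probability 1/3
print(py)  # P(Y = 1) = 2/3, P(Y = -2) = 1/3

# X and Y are not independent: P(X=0, Y=1) = 0, but P(X=0) * P(Y=1) = 2/9
print(joint[(0, 1)] == px[0] * py[1])  # False

# Expected values of X and Y
ex = sum(x * p for (x, _), p in joint.items())
ey = sum(y * p for (_, y), p in joint.items())
print(ex, ey)  # 0 0
```

Using exact fractions rather than floats keeps the zeros exact, so the non-independence check is unambiguous.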
The product XY is non-zero only at the cells with non-zero probability, namely (X = -1, Y = 1), (X = 0, Y = -2) and (X = 1, Y = 1). So this expected value equals (-1) × 1 × 1/3 + 0 × (-2) × 1/3 + 1 × 1 × 1/3, and it is clear that the result is zero. So we have two variables whose covariance is equal to zero, but which are not independent. We see that the converse is not true.

Now, let us discuss some more properties of covariance, some obvious ones. For example, Cov(X, Y) is equal to Cov(Y, X). This follows immediately from the definition, because the definition of covariance involves an expected value of a product, and of course, if we swap the two factors, the product does not change. Then, if we add a constant to X or to Y, the covariance does not change. Indeed, if we add a constant to X, the same constant is added to its expected value, and the two cancel each other. So the covariance does not change when we add a constant to one of the variables; the same holds for the second argument of the covariance as well.

Now, what about multiplication by a constant? Cov(cX, Y) is equal to c times Cov(X, Y). Again, this is a direct consequence of the definition. Let us prove it. This covariance is equal to the expected value of (cX − E(cX)) × (Y − E(Y)). As previously, we can move c out of the inner expected value, and then move it out of the outer expected value as well. So we have c times the expected value of (X − E(X)) × (Y − E(Y)), which is c × Cov(X, Y).

Now, let us consider a special case: the case when X and Y are related by a linear relation, meaning that Y = kX + b, where k and b are constants. In this case, the covariance between X and Y is Cov(X, kX + b).
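These calculations and rules can be verified numerically. A sketch in Python (using NumPy; the sample data and names are my own choices): it first recomputes E(XY) for the example, then checks symmetry, shift invariance and scaling on sample covariances, for which the same algebraic identities hold exactly (up to floating-point rounding):

```python
import numpy as np

# E(XY) for the example's table: only the non-zero-probability
# cells contribute to the sum
e_xy = (-1) * 1 * (1/3) + 0 * (-2) * (1/3) + 1 * 1 * (1/3)
print(e_xy)  # 0.0

# The three rules also hold for sample covariance
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 2 * x + rng.normal(size=1000)  # some dependent sample data

def cov(a, b):
    # sample covariance: mean of products of deviations from the means
    return np.mean((a - a.mean()) * (b - b.mean()))

print(np.isclose(cov(x, y), cov(y, x)))              # symmetry
print(np.isclose(cov(x + 3.0, y), cov(x, y)))        # adding a constant
print(np.isclose(cov(5.0 * x, y), 5.0 * cov(x, y)))  # multiplying by c
```

All three checks print True, mirroring the proofs given above.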
Now, we can use the rules that we discussed before. This covariance is equal to Cov(X, kX); here we use rule number 4, since adding the constant b does not change the covariance. Then we can move k out of the covariance using rule number 5 and get k × Cov(X, X). By the way, what is Cov(X, X)? We see from the definition that if Y is equal to X, then the two factors coincide, and we take the expected value of the square of X − E(X). So Cov(X, X) is equal to the variance of X, and we get k × Var(X).

In a sense, covariance measures a linear relationship between two variables. If the knowledge that one variable is large gives us some information that the other variable is probably also large, and vice versa, then they have non-zero covariance. But a somewhat better tool to measure this kind of linear relationship is correlation. We will discuss correlation later.
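The conclusion Cov(X, kX + b) = k × Var(X) can also be checked on data. A minimal sketch, where k, b and the sample parameters are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=1.0, scale=2.0, size=100_000)
k, b = 3.0, -5.0
y = k * x + b  # exact linear relation between X and Y

def cov(a, c):
    # sample covariance: mean of products of deviations from the means
    return np.mean((a - a.mean()) * (c - c.mean()))

var_x = cov(x, x)  # Cov(X, X) is just Var(X)
print(np.isclose(cov(x, y), k * var_x))  # True: Cov(X, kX + b) = k * Var(X)
```

Because the relation between the samples is exactly linear, the identity holds up to floating-point rounding, not just approximately in the statistical sense.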