If you ask about the correspondence between the existence of partial derivatives and the continuity of a function, recall from single-variable calculus that there is an implication. Whenever a function of one variable is differentiable, that is, f′ exists, then the function f(x) is continuous, or simply put, it belongs to the class of continuous functions. In the opposite direction we have no implication, because we know that the absolute value of x is continuous but not differentiable at zero. So what can we tell in the multivariable case: is there any implication? Actually, the answer is no in either direction. That means that if a function is continuous, we cannot always find partial derivatives, and the same absolute-value function serves as a counterexample. But what's more interesting is that, even if the partial derivatives exist, the function is not necessarily continuous. It seems kind of unusual, but we can easily find an example which shows that no such implication exists. For example, let's consider a function f(x, y) which takes only two values. It takes the value one in the case when the product of the two variables, x times y, equals zero, and it takes the value zero when the product is not zero. We can imagine the graph of this function in 3D: the value of the function is zero everywhere except at the points where xy = 0, and at any point which belongs to either the x-axis or the y-axis the value of the function is one. Clearly this function is not continuous at the origin, so we'll discuss just that one point. What about the partial derivatives: do they exist at the origin or not? In order to find the partial derivatives, we apply the definition. How do we find f_x at (0, 0)? We substitute the value zero for y, which means we look at the first case above, so the function equals one, and we find the limit as x tends to zero.
So the function equals one, we subtract one, and we divide by x; the value of the derivative equals zero. The same is true for the other partial derivative, f_y at (0, 0): this time we take the value zero for x, so the function equals one; once again we subtract one from one, and we get zero. So this derivative is also zero. This example tells us that although the function is not continuous at a point, both partial derivatives exist there. Now let's consider higher-order partial derivatives, and we proceed with the second order: second-order derivatives. As we understand, the first-order derivatives are themselves functions: ∂f/∂x is a function of x and y, and ∂f/∂y is also a function of x and y. So we can apply the definition of the partial derivative to them. That means, for example, if we choose ∂f/∂x as the first candidate for further differentiation, we get a second-order derivative with respect to x alone; the notation is ∂²f/∂x², so we use squares here and there. By definition, this is the derivative with respect to x of the first-order derivative ∂f/∂x. In the same fashion, we can find another second-order derivative if we choose ∂f/∂x but this time differentiate with respect to y. The order of differentiation is as follows: we differentiate the first-order derivative ∂f/∂x with respect to y, and in the denominator I keep that order, so the notation becomes ∂²f/∂y∂x. Now, if I work with the first-order derivative ∂f/∂y, I also have a derivative where we differentiate only with respect to y; this is how we get ∂²f/∂y², which by definition is double differentiation with respect to y. And we also get another derivative, similar to the earlier one but with the reverse order of differentiation: ∂²f/∂x∂y. These two derivatives, ∂²f/∂y∂x and ∂²f/∂x∂y, are called mixed, or cross, derivatives.
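Both claims about this example can be checked numerically. Here is a minimal sketch in Python (the function name f, the helper names, and the step sizes are my own illustrative choices, not from the lecture): the difference quotients for f_x and f_y at the origin are identically zero for every step size, while the values of f along the diagonal y = x reveal the jump at the origin.

```python
def f(x, y):
    # The example from the lecture: 1 on the coordinate axes (xy = 0), 0 elsewhere.
    return 1.0 if x * y == 0 else 0.0

def partial_x_at_origin(h):
    # Difference quotient for f_x(0, 0): along the x-axis y = 0, so f(h, 0) = 1.
    return (f(h, 0) - f(0, 0)) / h

def partial_y_at_origin(h):
    # Difference quotient for f_y(0, 0): along the y-axis x = 0, so f(0, h) = 1.
    return (f(0, h) - f(0, 0)) / h

for h in (0.1, 0.01, 0.001):
    # Both quotients are (1 - 1)/h = 0.0 for every h, so both limits are 0.
    print(partial_x_at_origin(h), partial_y_at_origin(h))  # prints 0.0 0.0

# Discontinuity at the origin: approach along the diagonal y = x.
for t in (0.1, 0.01, 0.001):
    print(f(t, t))  # stays 0.0, while f(0, 0) = 1.0
```

So both partial derivatives at the origin equal zero, even though f jumps from 0 to 1 there.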
Let's consider an example; for instance, we can try the same Cobb-Douglas function we used earlier, Q = L³K², and let us find the mixed, or cross, derivatives. If we find ∂²Q/∂K∂L, that means that first we differentiate with respect to L, getting ∂Q/∂L = 3L²K², and after that we differentiate with respect to K, getting ∂²Q/∂K∂L = 6L²K. If we reverse the order of differentiation and first differentiate with respect to K and later on with respect to L, we get ∂Q/∂K = 2L³K and then ∂²Q/∂L∂K = 6L²K, absolutely the same result. This gives us the idea that perhaps this is always true. It is not always true; there are counterexamples, which we will not discuss at the moment. But what's important is the theoretical question: when does the equality of the cross derivatives take place, and how can we guarantee that, regardless of the order of differentiation, we get the same second-order derivative?
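The equality of the cross derivatives for this Cobb-Douglas example can also be sketched numerically with central difference quotients (the function and helper names, the step size h, and the evaluation point (L, K) = (2, 3) are my own illustrative choices): differentiating first with respect to L and then K, or in the reverse order, both approximate 6L²K.

```python
def Q(L, K):
    # Cobb-Douglas example from the lecture: Q(L, K) = L^3 * K^2.
    return L**3 * K**2

def mixed_LK(L, K, h=1e-4):
    # First differentiate w.r.t. L, then w.r.t. K (central differences).
    dQ_dL = lambda L, K: (Q(L + h, K) - Q(L - h, K)) / (2 * h)
    return (dQ_dL(L, K + h) - dQ_dL(L, K - h)) / (2 * h)

def mixed_KL(L, K, h=1e-4):
    # Reverse order: first w.r.t. K, then w.r.t. L.
    dQ_dK = lambda L, K: (Q(L, K + h) - Q(L, K - h)) / (2 * h)
    return (dQ_dK(L + h, K) - dQ_dK(L - h, K)) / (2 * h)

L0, K0 = 2.0, 3.0
print(mixed_LK(L0, K0))  # ≈ 72, i.e. 6 * L0**2 * K0
print(mixed_KL(L0, K0))  # ≈ 72 as well: the order does not matter here
```

For this smooth polynomial both orders agree, matching the analytic result 6L²K.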