Welcome back to our lecture. In this lecture you will learn how to choose a model specification. Previously we considered a linear model, in which beta 2 measures the change of y when x changes by one unit. But economists are interested in more than linear models, and they usually work with logarithmic specifications. Today we consider three of them. The first is the log-linear model, where we take the natural logarithm of the dependent variable y. The second is the linear-log model, where we take the natural logarithm of the independent variable x. And the third is the log-log specification, where we take natural logarithms of both the dependent and the independent variable. These specifications are typically used when the effect of x on y is not constant, for example when the effect of x on y decreases as x increases.

Let's start with the linear model, y = beta 1 + beta 2 x. Here the coefficient beta 2 measures the change of y with respect to a one-unit change of x. To show this interpretation we calculate the derivative; it is easy to calculate here, dy/dx = beta 2, so we can see explicitly that beta 2 is a constant marginal effect of x on y. The interpretation is the following: changing x by one unit leads to a change of y by beta 2 units on average, other things being equal.

Now consider a second example, the log-linear model, ln(y) = beta 1 + beta 2 x, where we take the natural logarithm of the variable y on the left-hand side. To interpret this model we need to calculate the derivative again, and here the calculation is not so straightforward, so let's try to do it.
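The two interpretations above can be checked numerically. The sketch below uses made-up coefficient values (beta 1 = 1.0, beta 2 = 0.05, not from any data in the lecture) and compares the effect of a one-unit change of x in the linear and the log-linear specification:

```python
import math

# Hypothetical coefficients, chosen only for illustration.
beta1, beta2 = 1.0, 0.05

def y_linear(x):
    # Linear model: y = beta1 + beta2 * x
    return beta1 + beta2 * x

def y_loglinear(x):
    # Log-linear model: ln(y) = beta1 + beta2 * x  =>  y = exp(beta1 + beta2 * x)
    return math.exp(beta1 + beta2 * x)

# Linear model: a one-unit change in x changes y by exactly beta2 units.
abs_change = y_linear(11) - y_linear(10)
print(abs_change)       # 0.05, the constant marginal effect

# Log-linear model: a one-unit change in x changes y by roughly beta2 * 100 percent.
pct_change = (y_loglinear(11) - y_loglinear(10)) / y_loglinear(10)
print(pct_change)       # about 0.0513, close to beta2 = 0.05
```

The relative change in the log-linear case is exp(beta 2) - 1, which is approximately beta 2 only when beta 2 is small; that is why the interpretation is stated as an approximation.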
We take the derivative of y with respect to x, substituting for y the exponential of its natural logarithm, and then plug in the model for ln(y). The derivative turns out to be dy/dx = beta 2 times y, so beta 2 shows the relative change of y with respect to a unit change in x. The interpretation is the following: a change of x by one unit leads approximately to a beta 2 multiplied by 100 percent change in y on average, other things being equal.

In the next example, the linear-log model, y = beta 1 + beta 2 ln(x), we take the natural logarithm on the right-hand side of our regression model, and again we calculate the derivative to interpret the coefficient beta 2. The derivative is easy to calculate, dy/dx = beta 2 / x, and we can see directly that beta 2 relates the absolute change in y to the relative change in x. So the interpretation for the linear-log model is the following: a change of x by 1 percent leads approximately to a beta 2 divided by 100 change in y on average, everything else being equal.

The last specification is the log-log model, ln(y) = beta 1 + beta 2 ln(x), where we take natural logarithms on both the left-hand side and the right-hand side. The calculation of the derivative is similar to what we have done before, and beta 2 shows the relative change of y with respect to the relative change of x. This interpretation is very interesting for economists: it is the elasticity of y with respect to x. So the interpretation is the following: a 1 percent change in x leads approximately to a beta 2 percent change in y, other things being equal. Economists often use this logarithmic specification precisely because they want to estimate an elasticity.

So far we have considered the linear specification, which describes an additive effect. Additive effects are quite common in nature, but economists often deal with multiplicative effects.
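The linear-log and log-log interpretations can be illustrated the same way. Again the coefficients below are invented purely for the sketch (beta 1 = 2.0, beta 2 = 0.8); we increase x by 1 percent and look at the response of y in each specification:

```python
import math

# Hypothetical coefficients, chosen only to illustrate the interpretations.
beta1, beta2 = 2.0, 0.8

def y_linearlog(x):
    # Linear-log model: y = beta1 + beta2 * ln(x)
    return beta1 + beta2 * math.log(x)

def y_loglog(x):
    # Log-log model: ln(y) = beta1 + beta2 * ln(x)  =>  y = exp(beta1) * x**beta2
    return math.exp(beta1) * x ** beta2

x = 50.0

# Linear-log: a 1% increase in x changes y by about beta2 / 100 units.
dy = y_linearlog(x * 1.01) - y_linearlog(x)
print(dy)       # about 0.008 = beta2 / 100

# Log-log: a 1% increase in x changes y by about beta2 percent (the elasticity).
pct = (y_loglog(x * 1.01) - y_loglog(x)) / y_loglog(x)
print(pct)      # about 0.008, i.e. roughly a 0.8% change in y
```

Note that in the log-log case the relative response does not depend on the starting value of x, which is exactly what a constant elasticity means.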
Multiplicative effects arise, for example, in population models, in macroeconomic models, when we work with time series data, and so on. These models are nonlinear, and we cannot estimate them directly in that form. In this case we use logarithmic specifications, because taking logarithms makes it easy to linearize the model.

One example of such a model is the Cobb-Douglas production function, where the output y depends on labor and capital. We can linearize this model with the help of natural logarithms and obtain a linear model, which we can estimate by ordinary least squares. Here beta 1 shows the return to labor and beta 2 shows the return to capital, and they are elasticities. A common question is whether we have constant returns to scale, and this is the hypothesis most often tested here.

With such a range of specifications, we now have to be able to choose the best model among them. Previously, when we discussed the quality of a model, we used R squared or adjusted R squared. But it is important to remember that R squared can only be used to compare models with the same dependent variable. When the dependent variable differs between models, we have to be careful in our analysis: we cannot use adjusted R squared anymore, because the total sum of squares is then calculated on a different scale. To choose between logarithmic and linear specifications, we use the Box-Cox test. The Box-Cox test is based on the idea that we can test the value of a parameter lambda, and this parameter lambda helps us choose between the linear and the logarithmic specification. To carry out the Box-Cox test, we first transform the dependent variable, the independent variable, or both of them. After the transformation, we estimate the model and compute the residual sum of squares.
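The Cobb-Douglas linearization can be made concrete with a small sketch. The numbers below are invented: a true function y = A * L^b1 * K^b2 with A = 1.5, b1 = 0.7, b2 = 0.3 (so b1 + b2 = 1, constant returns to scale). Taking logs gives ln(y) = ln(A) + b1 ln(L) + b2 ln(K), which is linear in the parameters and would be estimated by OLS in practice; here, with noiseless data, three observations pin the parameters down exactly, so a 3x3 linear solve stands in for the regression:

```python
import math

# Invented "true" parameters; b1 + b2 = 1 means constant returns to scale.
A_true, b1_true, b2_true = 1.5, 0.7, 0.3

data = [(2.0, 3.0), (4.0, 1.0), (5.0, 6.0)]        # (labor L, capital K) pairs
rows, rhs = [], []
for L, K in data:
    y = A_true * L ** b1_true * K ** b2_true       # Cobb-Douglas output
    rows.append([1.0, math.log(L), math.log(K)])   # regressors of ln(y)
    rhs.append(math.log(y))

# Solve the 3x3 system ln(y) = lnA + b1*ln(L) + b2*ln(K) by Gauss-Jordan
# elimination (no pivoting needed for this particular data).
for i in range(3):
    piv = rows[i][i]
    rows[i] = [v / piv for v in rows[i]]
    rhs[i] /= piv
    for j in range(3):
        if j != i:
            f = rows[j][i]
            rows[j] = [a - f * b for a, b in zip(rows[j], rows[i])]
            rhs[j] -= f * rhs[i]

lnA, b1, b2 = rhs
print(math.exp(lnA), b1, b2)   # recovers 1.5, 0.7, 0.3
print(b1 + b2)                 # 1.0: constant returns to scale
```

With real, noisy data one would of course use OLS on many observations and then formally test the restriction b1 + b2 = 1, as mentioned above.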
After that, we test the hypotheses about lambda 1 and lambda 2, and this is called the Box-Cox test. Consider, for example, the special case where we have to choose between the linear model and the log-linear model. In this case we need to transform the y variable. A lambda equal to 1 corresponds to the linear model, and as lambda goes to 0 the transformation converges to the logarithm. So lambda equal to 1 tells us to use the linear model, and lambda equal to 0 tells us to use the logarithmic model.

In this video, we learned how to use logarithmic specifications in order to build a better model. In the next video, we are going to use dummy variables, which also give us a great range of model transformations.
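A short sketch of the transformation behind this special case may help. This is only the Box-Cox transformation itself, not the full test: the standard form is (y^lambda - 1) / lambda, which at lambda = 1 reduces to y - 1 (the linear specification up to a shift) and converges to ln(y) as lambda goes to 0:

```python
import math

def box_cox(y, lam):
    # Box-Cox transformation of a positive value y with parameter lam.
    # lam = 0 is defined by the limit, which equals ln(y).
    if lam == 0.0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

y = 5.0
print(box_cox(y, 1.0))      # 4.0: y - 1, the linear specification (shifted)
print(box_cox(y, 0.001))    # already very close to ln(5)
print(math.log(y))          # about 1.6094, the lam -> 0 limit
```

Testing whether lambda equals 1 or 0 in the transformed regression is what lets the Box-Cox procedure discriminate between the linear and the logarithmic specification.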