An old saying goes, if you have a hammer, everything looks like a nail. And that's exactly what you will get by the end of this week: that machine learning hammer. The hammer is neural networks combined with TensorFlow, a powerful library for machine learning open-sourced by Google in November 2015. Chances are you have heard about or used these tools before. But if not, then I think the best time and place to learn about them is here and now. So here's what you will learn this week. Let's start with our first lesson. First, you will get familiar in this lesson with our main tool, TensorFlow. We will start with a review of how TensorFlow implements what is called a computational graph. Then we will open our first TensorFlow Jupyter Notebook and see how these things work by looking at very simple operations. Next, you will see how simple linear regression can be implemented in TensorFlow and compare it with how it's done with other Python packages. After that, you will learn how neural nets are organized and how linear regression is nothing but a very special and simple neural network. You will also see why neural networks can be seen as a kind of machine learning hammer. Once you have learned about neural networks, you will learn about methods used for their training. Here, you will meet some of the main workhorses of modern machine learning, namely the Gradient Descent method and its relatives, such as Stochastic Gradient Descent. We will see them working both in theory and in practice, that is, in slides and in Jupyter Notebooks together. Here, we'll see how it works first in linear regression and then in neural network regression, implemented in TensorFlow. All this will be done using simple non-financial data, so that we do not get lost navigating several new topics simultaneously. All these topics will be presented in the first lesson.
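To make the Gradient Descent idea concrete before the notebooks, here is a minimal sketch of gradient descent for linear regression on synthetic non-financial data. It uses plain NumPy rather than TensorFlow, and the data, learning rate, and iteration count are hypothetical illustrations, not the course's actual notebook code.

```python
import numpy as np

# Hypothetical synthetic data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=100)

# Gradient descent on the mean squared error of y ≈ w*x + b
w, b = 0.0, 0.0
lr = 0.1                              # learning rate (illustrative choice)
for _ in range(500):
    err = w * x + b - y               # residuals for the current fit
    w -= lr * 2.0 * np.mean(err * x)  # gradient of MSE with respect to w
    b -= lr * 2.0 * np.mean(err)      # gradient of MSE with respect to b
```

After these updates, `w` and `b` should be close to the true slope 2 and intercept 1; Stochastic Gradient Descent follows the same recipe but estimates each gradient from a small random batch of points instead of the full dataset.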
Then, after we get such a powerful hammer, we will start to look for financial nails. The first nail you will see is a very classical financial problem, namely the problem of predicting earnings per share, or EPS for short. Once we have our tools, such as TensorFlow and other packages like scikit-learn and statsmodels, we will be able to try both linear and nonlinear neural network regression on this problem. After that, we will turn to classification methods. We will start by looking at how machine learning deals with probabilistic models. Then we will talk about maximum likelihood estimation and related methods such as the Maximum A Posteriori (MAP) method. We will also talk about the notion of relative entropy, also known as KL divergence, which is very important for understanding machine learning. Then you'll see how all this machinery applies to classification problems. You will learn about one of the most powerful approaches to classification, called logistic regression. You will also learn how it can be implemented in scikit-learn and TensorFlow. You will see how logistic regression is just another special case of our general hammer called neural nets. And finally, our hammer will find its second financial nail: the problem of predicting bank failures. As you will see, our hammer will be pretty good for this type of nail. So this is what you will master this week.
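The idea that logistic regression is a one-neuron neural network can be sketched in a few lines: a linear combination of inputs is passed through a sigmoid activation, and the weights are trained by gradient descent on the cross-entropy loss (the negative log-likelihood from maximum likelihood estimation). The toy data below is a hypothetical stand-in, not the course's EPS or bank-failure dataset, and plain NumPy is used instead of scikit-learn or TensorFlow.

```python
import numpy as np

# Hypothetical linearly separable toy data with two features
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # binary labels

def sigmoid(z):
    # The "neuron" activation: maps a score to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
b = 0.0
lr = 0.5                                   # learning rate (illustrative choice)
for _ in range(1000):
    p = sigmoid(X @ w + b)                 # predicted class probabilities
    grad_w = X.T @ (p - y) / len(y)        # gradient of cross-entropy loss w.r.t. w
    grad_b = np.mean(p - y)                # gradient w.r.t. the bias
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

Viewed this way, a full neural network simply stacks many such neurons in layers, which is why logistic regression drops out as the simplest special case of the general hammer.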