This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the “rise over run” formulation of a slope, before converting this to the formal definition of the gradient of a function. We then start to build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces and even put this into action using an interactive game. We take a look at how we can use calculus to build approximations to functions, as well as helping us to quantify how accurate we should expect those approximations to be. We also spend some time talking about where calculus comes up in the training of neural networks, before finally showing you how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look concepts up yourself when you get stuck. Hopefully, without going into too much detail, you’ll still come away with the confidence to dive into some more focused machine learning courses in future.

This course is part of the Mathematics for Machine Learning Specialization.

# Mathematics for Machine Learning: Multivariate Calculus

Offered by:

## About this Course



#### Shareable Certificate

#### 100% online

#### Course 2 of 3 in the Specialization

#### Flexible deadlines

#### Beginner Level

#### Approx. 22 hours to complete

#### English

## Syllabus - What you will learn in this course

**4 hours to complete**

## What is calculus?

Understanding calculus is central to understanding machine learning! You can think of calculus as simply a set of tools for analysing the relationship between functions and their inputs. Often, in machine learning, we are trying to find the inputs which enable a function to best match the data. We start this module from the basics, by recalling what a function is and where we might encounter one. Following this, we talk about how, when sketching a function on a graph, the slope describes the rate of change of the output with respect to an input. Using this visual intuition we next derive a robust mathematical definition of the derivative, which we then use to differentiate some interesting functions. Finally, by studying a few examples, we develop four handy time-saving rules that enable us to speed up differentiation in many common scenarios.
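The "rise over run" picture translates directly into code. A minimal sketch (my own example, not course material) that approximates a derivative with a small finite step:

```python
def numerical_derivative(f, x, h=1e-6):
    """Approximate f'(x) as rise over run with a small step h."""
    return (f(x + h) - f(x)) / h

def f(x):
    return x ** 2  # f'(x) = 2x, so the slope at x = 3 should be close to 6

slope = numerical_derivative(f, 3.0)
```

Shrinking `h` brings this ratio closer to the formal limit definition of the derivative that the module derives.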

**4 hours to complete**

**10 videos**

**4 readings**

**6 practice exercises**

**3 hours to complete**

## Multivariate calculus

Building on the foundations of the previous module, we now generalise our calculus tools to handle multivariable systems. This means we can take a function with multiple inputs and determine the influence of each of them separately. It would not be unusual for a machine learning method to require the analysis of a function with thousands of inputs, so we will also introduce the linear algebra structures necessary for storing the results of our multivariate calculus analysis in an orderly fashion.
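The idea of probing each input separately can be sketched in a few lines (illustrative code, not from the course): nudge one variable at a time and collect the resulting slopes into a gradient vector.

```python
def gradient(f, xs, h=1e-6):
    """Return the list of partial derivatives [df/dx_0, df/dx_1, ...] at xs."""
    grads = []
    for i in range(len(xs)):
        nudged = list(xs)
        nudged[i] += h          # nudge only the i-th input
        grads.append((f(nudged) - f(xs)) / h)
    return grads

def f(v):
    x, y = v
    return x ** 2 + 3 * y       # df/dx = 2x, df/dy = 3

g = gradient(f, [1.0, 2.0])     # expect roughly [2.0, 3.0]
```

The returned list is exactly the "orderly storage" the module describes: with thousands of inputs it becomes a gradient vector, and stacking such vectors for vector-valued functions gives the Jacobian.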

**3 hours to complete**

**9 videos**

**5 practice exercises**

**3 hours to complete**

## Multivariate chain rule and its applications

Having seen that multivariate calculus is really no more complicated than the univariate case, we now focus on applications of the chain rule. Neural networks are one of the most popular and successful conceptual structures in machine learning. They are built up from a connected web of neurons, inspired by the structure of biological brains. The behaviour of each neuron is influenced by a set of control parameters, each of which needs to be optimised to best fit the data. The multivariate chain rule can be used to calculate the influence of each parameter of the network, allowing them to be updated during training.
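As a toy illustration (the neuron, cost, and numbers here are my own, not course code), consider a single sigmoid neuron z = wx + b, a = σ(z), with squared-error cost C = (a − t)². The chain rule gives the influence of the weight as a product of three simple derivatives:

```python
import math

def sigma(z):
    """Logistic sigmoid activation."""
    return 1.0 / (1.0 + math.exp(-z))

def dC_dw(w, b, x, t):
    """Chain rule: dC/dw = dC/da * da/dz * dz/dw."""
    z = w * x + b
    a = sigma(z)
    dC_da = 2.0 * (a - t)               # derivative of the squared error
    da_dz = a * (1.0 - a)               # derivative of the sigmoid
    dz_dw = x                           # derivative of w*x + b w.r.t. w
    return dC_da * da_dz * dz_dw

grad_w = dC_dw(0.5, 0.1, 1.5, 1.0)      # negative: increasing w reduces the cost here
```

A training step would then nudge `w` against this gradient; backpropagation is this same product of local derivatives, chained through every layer of the network.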

**3 hours to complete**

**6 videos**

**3 practice exercises**

**3 hours to complete**

## Taylor series and linearisation

The Taylor series is a method for re-expressing functions as polynomial series. This approach is the rationale behind the use of simple linear approximations to complicated functions. In this module, we will derive the formal expression for the univariate Taylor series and discuss some important consequences of this result relevant to machine learning. Finally, we will discuss the multivariate case and see how the Jacobian and the Hessian come into play.
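As a quick sketch (my own example, not course material), truncating the Taylor series of e^x about 0, namely 1 + x + x²/2! + x³/3! + …, gives a polynomial approximation whose accuracy improves with each extra term:

```python
import math

def taylor_exp(x, n_terms):
    """Approximate e^x with the first n_terms of its Taylor series about 0."""
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

approx = taylor_exp(0.5, 5)   # truncated series
exact = math.exp(0.5)         # reference value
```

Keeping only the first two terms, 1 + x, is exactly the linearisation the module discusses; the size of the discarded terms quantifies how far the approximation can drift.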

**3 hours to complete**

**9 videos**

**5 practice exercises**

### Reviews

#### 4.7

##### TOP REVIEWS FROM MATHEMATICS FOR MACHINE LEARNING: MULTIVARIATE CALCULUS

Very Well Explained. Good content and great explanation of content. Complex topics are also covered in very easy way. Very Helpful for learning much more complex topics for Machine Learning in future.

Great course to develop some understanding and intuition about the basic concepts used in optimization. The last two weeks were a bit lower in quality than the rest, in my opinion, but still great.

Excellent course. I completed this course with no prior knowledge of multivariate calculus and was successful nonetheless. It was challenging and extremely interesting, informative, and well designed.

Very clear and concise course material. The inputs given during the videos and the subsequent practice quiz almost force the student to carry out extra/research studies which is ideal when learning.

I think some of the concepts only touched the surface and it was difficult to get a deep understanding. Perhaps the course could have provided some external links for those topics, where people could read further.

Excellent course! I studied multivariate calculus during engineering. I hardly understood the concepts at that time; this course helped me understand and visualize what is going on behind the formulas.

Superb quality. The way instructors teach is really innovative. The course is good in terms of the area it covers but lacks depth, but is a good starting point if you want to dwell more in detail.

I highly recommend this course. Every machine learning student should take it. Some concepts are so clearly explained that you will be able to perform better in your subsequent ML studies.

Just a great course for getting you ready to understand machine learning algorithms. The chapter on backpropagation is simply outstanding and the programming assignments are awesome!

As good as the first class in the Math for ML series. Instruction was interesting. Questions were not too confusing. Clearly a lot of time was spent producing this class. Thank you.

I wish, Linear Regression was taught with a little more clarity. Seemed like too many things were happening. Otherwise, a very good course. Really enjoyed the back-propagation week.

A wonderful course. I learnt a lot after struggling to finish it. Some foundation in calculus might be needed, since the lecturer goes through differentiation at tremendous speed.

### Offered by

#### Imperial College London

Imperial College London is a world top-ten university with an international reputation for excellence in science, engineering, medicine and business, located in the heart of London. Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.

## About the Mathematics for Machine Learning Specialization

## Frequently Asked Questions

When will I have access to the lectures and assignments?

Once you enroll in the course, you get access to all of the videos, quizzes, and programming assignments (if applicable) right away. Peer-review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic certificate will be added to your Accomplishments page, from which you can print it or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

Is financial aid available?

More questions? Visit the Learner Help Center.