
# Learner Reviews & Feedback for Imperial College London's Mathematics for Machine Learning: Multivariate Calculus

4.7
stars
3,088 ratings
518 reviews

## About the Course

This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the “rise over run” formulation of a slope, before converting this to the formal definition of the gradient of a function. We then start to build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces, and even put this into action using an interactive game. We take a look at how we can use calculus to build approximations to functions, as well as to quantify how accurate we should expect those approximations to be. We also spend some time talking about where calculus comes up in the training of neural networks, before finally showing you how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look concepts up yourself when you get stuck. Hopefully, without going into too much detail, you’ll still come away with the confidence to dive into some more focused machine learning courses in future.
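The “rise over run” formulation mentioned above leads directly to the limit definition of the gradient. As a minimal numerical sketch (the function and names here are my own illustration, not course material), the secant slope approaches the exact derivative as the “run” shrinks:

```python
# Approximate the gradient of f(x) = x**2 at x = 3 using "rise over run"
# with a shrinking run, and compare against the exact derivative 2*x.

def f(x):
    return x ** 2

def rise_over_run(f, x, dx):
    # Slope of the secant line through (x, f(x)) and (x + dx, f(x + dx)).
    return (f(x + dx) - f(x)) / dx

exact = 2 * 3  # d/dx x**2 = 2x, so the slope at x = 3 is 6
for dx in (1.0, 0.1, 0.001):
    print(dx, rise_over_run(f, 3, dx))  # approaches 6 as dx shrinks
```

Shrinking `dx` walks the secant slope (7.0, 6.1, 6.001, …) toward the tangent slope, which is exactly the transition from the informal picture to the formal gradient.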

## Top Reviews

##### DP

Nov 26, 2018

Great course to develop some understanding and intuition about the basic concepts used in optimization. The last 2 weeks were a bit lower in quality than the rest, in my opinion, but still great.

##### JT

Nov 13, 2018

Excellent course. I completed this course with no prior knowledge of multivariate calculus and was successful nonetheless. It was challenging and extremely interesting, informative, and well designed.


## Reviews 376–400 of 519 for Mathematics for Machine Learning: Multivariate Calculus

By Kuo P

Mar 15, 2018

excellent

By Rodrigo F

Sep 18, 2019

Amazing!

By Мусаллямов Д Н

May 31, 2019

Awesome!

By James A

Jan 14, 2019

Amazing!

By AMIT K A

Jul 27, 2018

Very good

By Wong Y W M

Feb 21, 2020

Thanks.

By Bálint - H F

Mar 20, 2019

Great !

By Shanxue J

May 23, 2018

Amazing

By Liang Y

Jun 21, 2019

Great!

By Shuvo D N

May 26, 2019

Great!

By Nitish K S

Jul 18, 2018

nice !

By Kailun C

Jan 25, 2020

Awesome!

By Nathan L

Mar 06, 2020

Good

By Zhao J

Sep 11, 2019

GOOD

By HARSH K D

Jun 26, 2018

good

By Omar D

May 05, 2020

Good

By Rinat T

Aug 01, 2018

The part about neural networks needs improvement (some more examples of simple networks, and an explanation of where the sigmoid function comes from). The exercises on partial derivatives should focus more on the various aspects of partial differentiation rather than on taking partial derivatives of complicated functions. I felt there was too much of the latter, which is not very efficient: the idea of partial differentiation is easy to master, but its applications are not always. Taking partial derivatives of sophisticated functions (be it for the sake of a Jacobian or Hessian calculation) turns into doing lots of algebra whose underlying idea has long been understood. So while some of the existing exercises on partial differentiation, the Jacobian, and the Hessian should be retained, about 50 percent of them should be replaced with exercises that are not heavy on algebra but instead demonstrate the different ways and/or applications in which partial differentiation is used. Otherwise, all good.

By Yaroslav K

Apr 08, 2020

1) It is delivered entirely in British English, with a number of words and phrases that are rarely used globally. 2) The pace of the course just wasn't suitable for me. If you don't have a strong math or engineering background, you will need to search for explanations elsewhere (Khan Academy is a great resource, etc.). Closer to the end of the course I stopped having a full understanding of what was going on and why. So I could calculate things, but I don't feel I will be able to do so in 1–2 weeks, because I didn't have the time or opportunity to reinforce the skills I gained. 3) I also don't understand why the instructors (especially David) don't visualize what they say the way Sal or Grant do, drawing on the board and on the plots and so on. Sometimes it feels like you are just listening to an audiobook about math.

I will take the Stanford ML course after this one and also review what I've learned here with Khan Academy.

By Matteo L

Apr 20, 2020

This course is a great refresher for someone who has already studied these topics previously. The topics were very well illustrated and the objective of getting a good intuition of the math is achieved in my opinion.

I thought the examples like the neural network and the sandpits were great. That being said, I'd have liked to go a little bit deeper on the subject of optimization.

In general, I do feel that it would have been nice to have more practice on the topics (e.g. linear approximation and its use were not covered very thoroughly in my opinion). Also, the notebook assignments are far too easy and therefore don't add enough to the learning experience.

By Ronny A

Jun 27, 2018

Course is pretty good. I like how well thought out the assignments are, and the use of visualizations, even in the assignments, to enrich intuitive understanding. There were a couple of instances where the content wasn't clear and I referenced Khan Academy to clarify things for myself. The reason I give this course 4 stars rather than 5 is that the teachers or TAs did not seem responsive. Specifically, another learner and I posted in the discussion forum that one of the slides appeared to have a typo in the Jacobian contour plot, and there was no official response.

By Fang Z

Jul 11, 2019

I really love Samuel's teaching style. He strives to make things understood by showing a lot of graphs, and I could easily follow him step by step. However, I couldn't follow David's teaching as well, perhaps because fewer explanations were given during the lectures.

In addition, I found that some quizzes involve a huge amount of calculation, and I really spent a lot of time verifying the answers.

Finally, I hope more detailed explanations could be given when I make mistakes in a quiz, so I could consolidate what I've learned so far.

Thanks,

Fang

By Hermes J D R P

Feb 28, 2020

The first 4 weeks of the course were amazing: great content, clear explanations, and fair, interactive assessment activities. However, the last 2 weeks weren't as good as the previous ones, which is why I don't give this course 5 stars. By and large, the first two courses of this specialization are the best resources available on the internet for learning the foundations of mathematics for Machine Learning. Instead of taking the last course, I recommend reading the related book written by Deisenroth.

By Wu X

Apr 21, 2020

This course teaches multivariate calculus and its applications. In particular, the Jacobian and Hessian matrices are introduced as matrix-valued first- and second-order derivatives, along with gradient descent optimization based on them. The structure of the course is a little loose, so it's not a good choice for those seeking systematically arranged learning materials, but it is still worth taking for the perspective and ideas it offers.
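For a concrete picture of the matrix-valued derivatives and the gradient-based optimization this review describes, here is a minimal sketch (my own illustration, not course material) comparing plain gradient descent with a Hessian-based Newton step on the quadratic f(x, y) = x² + 3y²:

```python
import numpy as np

# f(x, y) = x^2 + 3*y^2, with analytic gradient and (constant) Hessian.
def grad(p):
    x, y = p
    return np.array([2 * x, 6 * y])

H = np.array([[2.0, 0.0],
              [0.0, 6.0]])  # Hessian of f (constant for this quadratic)

# Plain gradient descent: repeatedly step against the gradient.
p = np.array([4.0, -2.0])
for _ in range(100):
    p = p - 0.1 * grad(p)

# Newton's method: the single step p - H^{-1} grad(p) lands exactly on
# the minimum of a quadratic function.
q = np.array([4.0, -2.0])
q = q - np.linalg.solve(H, grad(q))

print(p, q)  # both near the minimum at (0, 0)
```

The contrast shows why the Hessian matters: gradient descent needs many small steps and a tuned learning rate, while the second-order (curvature) information lets Newton's method solve this quadratic in one step.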

By Saras A

Jan 29, 2020

Good course. I wish it had more sections (say, 12 sections or weeks in total) and more steps toward a more thorough graphical understanding (and perhaps even a more mathematical/algebraic understanding, although overall that side is much easier for me).

From a Data Science or Machine Learning perspective, Week 6 (linear and non-linear regression with chi-squared methods, etc.) was the most interesting.
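The chi-squared regression mentioned here can be sketched in a few lines; this is my own minimal illustration (the data and `sigma` are made up, not from the course), fitting y = m·x + c by minimizing the chi-squared misfit:

```python
import numpy as np

# Fit y = m*x + c by minimizing chi2 = sum(((y - (m*x + c)) / sigma)**2).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.1, 4.9, 7.0])
sigma = 0.1  # assumed uniform measurement uncertainty

# With uniform sigma, minimizing chi-squared reduces to ordinary least
# squares on the design matrix [x, 1].
A = np.vstack([x, np.ones_like(x)]).T
(m, c), *_ = np.linalg.lstsq(A, y, rcond=None)

chi2 = np.sum(((y - (m * x + c)) / sigma) ** 2)
print(m, c, chi2)  # slope near 2, intercept near 1
```

Non-linear fits work the same way in principle, except the minimum of chi-squared must be found iteratively (e.g. by gradient descent) rather than in closed form.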

By Donna D C

Apr 25, 2020

Nice balance between rigor and developing intuition (again, as in the previous linear algebra course in this series). I would’ve liked some “homework” reading about backpropagation for training the simple neural network, to prepare for the future courses, as well as more references for additional reading on least-squares minimization techniques, to tie in more of the statistics underlying the methods. I love the stuff, thank you!!