This course is part of the Mathematics for Machine Learning Specialization.

Offered by:

Mathematics for Machine Learning Specialization

Imperial College London

About this Course

4.0

731 ratings

•

148 reviews

This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover some basic statistics of data sets, such as mean values and variances; we'll compute distances and angles between vectors using inner products; and we'll derive orthogonal projections of data onto lower-dimensional subspaces. Using all these tools, we'll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstruction.
At the end of this course, you'll be familiar with important mathematical concepts and you'll be able to implement PCA all by yourself. If you're struggling, you'll find a set of Jupyter notebooks that will allow you to explore properties of the techniques and walk you through what you need to do to get on track. If you are already an expert, this course may refresh some of your knowledge.
The lectures, examples and exercises require:
1. Some ability for abstract thinking
2. Good background in linear algebra (e.g., matrix and vector algebra, linear independence, basis)
3. Basic background in multivariate calculus (e.g., partial derivatives, basic optimization)
4. Basic knowledge of python programming and numpy
Disclaimer: This course is substantially more abstract and requires more programming than the other two courses of the specialization. However, this type of abstract thinking, algebraic manipulation and programming is necessary if you want to understand and develop machine learning algorithms.

Start right away and learn at your own pace.

Reset deadlines to fit your schedule.

Suggested: 4 weeks of study, 4-5 hours/week

Subtitles: English

Python Programming · Principal Component Analysis (PCA) · Projection Matrix · Mathematical Optimization

Week 1

Principal Component Analysis (PCA) is one of the most important dimensionality reduction algorithms in machine learning. In this course, we lay the mathematical foundations to derive and understand PCA from a geometric point of view. In this module, we learn how to summarize datasets (e.g., images) using basic statistics, such as the mean and the variance. We also look at properties of the mean and the variance when we shift or scale the original data set. We will provide mathematical intuition as well as the skills to derive the results. We will also implement our results in code (Jupyter notebooks), which will allow us to practice our mathematical understanding by computing averages of image data sets.

8 videos (Total 27 min), 6 readings, 4 quizzes

Welcome to module 1 (41s)

Mean of a dataset (4m)

Variance of one-dimensional datasets (4m)

Variance of higher-dimensional datasets (5m)

Effect on the mean (4m)

Effect on the (co)variance (3m)

See you next module! (27s)

About Imperial College & the team (5m)

How to be successful in this course (5m)

Grading policy (5m)

Additional readings & helpful references (5m)

Set up Jupyter notebook environment offline (10m)

Symmetric, positive definite matrices (10m)

Mean of datasets (15m)

Variance of 1D datasets (15m)

Covariance matrix of a two-dimensional dataset (15m)
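The effect of shifting and scaling a dataset on the mean and (co)variance, as covered in this module, can be sketched in numpy. The dataset and transformation matrix below are made-up toy values, not course material:

```python
import numpy as np

# Toy dataset: 5 samples with 2 features each (made-up values).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))

mean = X.mean(axis=0)                      # mean of the dataset
cov = np.cov(X, rowvar=False, bias=True)   # covariance with 1/N normalization

# Shifting every data point by a constant shifts the mean by the same amount
# but leaves the covariance unchanged.
shift = np.array([10.0, -3.0])
assert np.allclose((X + shift).mean(axis=0), mean + shift)
assert np.allclose(np.cov(X + shift, rowvar=False, bias=True), cov)

# Scaling each point x -> A @ x maps the mean to A @ mean and the
# covariance to A @ cov @ A.T.
A = np.array([[2.0, 0.0], [0.0, 0.5]])
Y = X @ A.T
assert np.allclose(Y.mean(axis=0), A @ mean)
assert np.allclose(np.cov(Y, rowvar=False, bias=True), A @ cov @ A.T)
```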

Week 2

Data can be interpreted as vectors. Vectors allow us to talk about geometric concepts, such as lengths, distances and angles, to characterise similarity between vectors. This will become important later in the course when we discuss PCA. In this module, we will introduce and practice the concept of an inner product. Inner products allow us to talk about geometric concepts in vector spaces. More specifically, we will start with the dot product (which we may still know from school) as a special case of an inner product, and then move toward a more general concept of an inner product, which plays an integral part in some areas of machine learning, such as kernel machines (this includes support vector machines and Gaussian processes). We have a lot of exercises in this module to practice and understand the concept of inner products.

8 videos (Total 36 min), 1 reading, 5 quizzes

Dot product (4m)

Inner product: definition (5m)

Inner product: length of vectors (7m)

Inner product: distances between vectors (3m)

Inner product: angles and orthogonality (5m)

Inner products of functions and random variables (optional) (7m)

Heading for the next module! (35s)

Basis vectors (20m)

Dot product (10m)

Properties of inner products (20m)

General inner products: lengths and distances (20m)

Angles between vectors using a non-standard inner product (20m)
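As a sketch of the ideas in this module, the lengths, distances, and angles induced by a general inner product <x, y>_A = xᵀAy (with A symmetric and positive definite) might look as follows; the vectors and the matrix A are illustrative choices, not values from the course:

```python
import numpy as np

# Inner product induced by a symmetric, positive definite matrix A:
# <x, y>_A = x^T A y.  A = I recovers the ordinary dot product.
def inner(x, y, A):
    return x @ A @ y

def length(x, A):
    return np.sqrt(inner(x, x, A))

def distance(x, y, A):
    return length(x - y, A)

def angle(x, y, A):
    cos = inner(x, y, A) / (length(x, A) * length(y, A))
    return np.arccos(np.clip(cos, -1.0, 1.0))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
I = np.eye(2)

# Under the standard dot product, x and y are orthogonal.
assert np.isclose(angle(x, y, I), np.pi / 2)

# Under a non-standard inner product the same pair need not be orthogonal:
A = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, positive definite
assert np.isclose(angle(x, y, A), np.pi / 3)  # 60 degrees
```

The same pair of vectors is orthogonal under one inner product and not under another, which is exactly why the choice of inner product matters for measuring similarity.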

Week 3

In this module, we will look at orthogonal projections of vectors, which live in a high-dimensional vector space, onto lower-dimensional subspaces. This will play an important role in the next module when we derive PCA. We will start off with a geometric motivation of what an orthogonal projection is and work our way through the corresponding derivation. We will end up with a single equation that allows us to project any vector onto a lower-dimensional subspace. However, we will also understand how this equation came about. As in the other modules, we will have both pen-and-paper practice and a small programming example with a Jupyter notebook.

6 videos (Total 25 min), 1 reading, 3 quizzes

Projection onto 1D subspaces (7m)

Example: projection onto 1D subspaces (3m)

Projections onto higher-dimensional subspaces (8m)

Example: projection onto a 2D subspace (3m)

This was module 3! (32s)

Full derivation of the projection (20m)

Projection onto a 1-dimensional subspace (25m)

Project 3D data onto a 2D subspace (40m)
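The "single equation" this module builds toward is the orthogonal projection onto the subspace spanned by the columns of a basis matrix B: π(x) = B(BᵀB)⁻¹Bᵀx. A small numpy sketch, using a made-up basis and vector:

```python
import numpy as np

# Basis of a 2D subspace of R^3, stored as the columns of B (toy values).
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

# Projection matrix: P = B (B^T B)^{-1} B^T.
P = B @ np.linalg.inv(B.T @ B) @ B.T

x = np.array([6.0, 0.0, 0.0])
p = P @ x  # orthogonal projection of x onto span(B)

# The residual x - p is orthogonal to every basis vector of the subspace ...
assert np.allclose(B.T @ (x - p), 0.0)
# ... and projecting twice changes nothing (P is idempotent).
assert np.allclose(P @ P, P)
```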

Week 4

We can think of dimensionality reduction as a way of compressing data with some loss, similar to jpg or mp3. Principal Component Analysis (PCA) is one of the most fundamental dimensionality reduction techniques used in machine learning. In this module, we use the results from the first three modules of this course and derive PCA from a geometric point of view. Within this course, this module is the most challenging one, and we will go through an explicit derivation of PCA plus some coding exercises that will make us proficient users of PCA.

10 videos (Total 52 min), 5 readings, 2 quizzes

Problem setting and PCA objective (7m)

Finding the coordinates of the projected data (5m)

Reformulation of the objective (10m)

Finding the basis vectors that span the principal subspace (7m)

Steps of PCA (4m)

PCA in high dimensions (5m)

Other interpretations of PCA (optional) (7m)

Summary of this module (42s)

This was the course on PCA (56s)

Vector spaces (20m)

Orthogonal complements (10m)

Multivariate chain rule (10m)

Lagrange multipliers (10m)

Did you like the course? Let us know! (10m)

Chain rule practice (20m)
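The steps of PCA this module derives (center the data, compute the covariance, eigendecompose it, and project onto the top-k eigenvectors) can be sketched as below. This is a minimal illustration of the technique, not the course's reference implementation; the dataset is made-up:

```python
import numpy as np

def pca_reconstruct(X, k):
    """Project X onto its top-k principal subspace and reconstruct."""
    mu = X.mean(axis=0)
    Xc = X - mu                            # 1. center the data
    S = Xc.T @ Xc / len(X)                 # 2. data covariance matrix
    vals, vecs = np.linalg.eigh(S)         # 3. eigendecomposition (ascending order)
    B = vecs[:, -k:]                       # 4. top-k eigenvectors span the subspace
    return Xc @ B @ B.T + mu               # 5. project and undo the centering

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))              # made-up 3D dataset
X_rec = pca_reconstruct(X, k=2)

# The average squared reconstruction error that PCA minimizes equals the sum
# of the discarded (smallest) eigenvalues of the covariance matrix.
err = np.mean(np.sum((X - X_rec) ** 2, axis=1))
```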

4.0

148 reviews

Start a new career after completing this course

Get a tangible career benefit from this course

Get a pay raise or promotion

By JS • Jul 17th 2018

This is one hell of an inspiring course that demystified the difficult concepts and math behind PCA. Excellent instructors in imparting this knowledge with easy-to-understand illustrations.

By JV • May 1st 2018

This course was definitely a bit more complex than the others in the specialisation, not so much in the assignments as in the core concepts handled. Overall, it was fun to do this course!

Imperial College London is a world top ten university with an international reputation for excellence in science, engineering, medicine and business, located in the heart of London. Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.
Imperial students benefit from a world-leading, inclusive educational experience, rooted in the College's world-leading research. Our online courses are designed to promote interactivity, learning and the development of core skills, through the use of cutting-edge digital technology.

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.
In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them.
The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.
The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require basic Python and numpy knowledge.
At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning.

When will I have access to the lectures and assignments?

Once you enroll in the course, you get access to all of the videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page, from which you can print it or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

Is financial aid available?

More questions? Visit the Learner Help Center.