This course is part of the Mathematics for Machine Learning Specialization.

Offered by:

Mathematics for Machine Learning Specialization

Imperial College London

About this Course


In this course on Linear Algebra we look at what linear algebra is and how it relates to vectors and matrices. Then we look through what vectors and matrices are and how to work with them, including the knotty problem of eigenvalues and eigenvectors, and how to use these to solve problems. Finally we look at how to use these to do fun things with datasets - like rotating images of faces and extracting eigenvectors to see how the PageRank algorithm works.
Since we're aiming at data-driven applications, we'll be implementing some of these ideas in code, not just on pencil and paper. Towards the end of the course, you'll write code blocks and encounter Jupyter notebooks in Python, but don't worry, these will be quite short, focussed on the concepts, and will guide you through if you've not coded before.
At the end of this course you will have an intuitive understanding of vectors and matrices that will help you bridge the gap into linear algebra problems, and you'll know how to apply these concepts to machine learning.

Start instantly and learn at your own schedule.

Flexible deadlines: reset deadlines in accordance with your schedule.

Suggested: 5 weeks of study, 2-5 hours/week

Subtitles: English

Eigenvalues and Eigenvectors, Basis (Linear Algebra), Transformation Matrix, Linear Algebra


Week 1

In this first module we look at how linear algebra is relevant to machine learning and data science. Then we'll wind up the module with an initial introduction to vectors. Throughout, we're focussing on developing your mathematical intuition, not on crunching through algebra or doing long pen-and-paper examples. For many of these operations, there are callable functions in Python that can do the adding up - the point is to appreciate what they do and how they work so that, when things go wrong or there are special cases, you can understand why and what to do.

5 videos (Total 28 min), 4 readings, 3 quizzes

Motivations for linear algebra (3m)

Getting a handle on vectors (9m)

Operations with vectors (11m)

Summary (1m)

About Imperial College & the team (5m)

How to be successful in this course (5m)

Grading policy (5m)

Additional readings & helpful references (10m)

Exploring parameter space (20m)

Solving some simultaneous equations (15m)

Doing some vector operations (14m)
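As a taste of the vector operations and simultaneous equations this module covers, here is a minimal NumPy sketch. The vectors and equation coefficients below are made-up values, purely for illustration; `np.linalg.solve` does the pen-and-paper work for us:

```python
import numpy as np

# Two example vectors (hypothetical values)
r = np.array([3.0, 4.0])
s = np.array([-1.0, 2.0])

# Basic vector operations: addition and scalar multiplication
print(r + s)   # [2. 6.]
print(2 * r)   # [6. 8.]

# Solving the simultaneous equations 3a + 2b = 8 and a - b = 1
A = np.array([[3.0, 2.0],
              [1.0, -1.0]])
b = np.array([8.0, 1.0])
print(np.linalg.solve(A, b))   # [2. 1.], i.e. a = 2, b = 1
```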

Week 2

In this module, we look at operations we can do with vectors - finding the modulus (size), the angle between vectors (via the dot or inner product), and projections of one vector onto another. We can then examine how the entries describing a vector depend on what vectors we use to define the axes - the basis. That will then let us determine whether a proposed set of basis vectors is what's called 'linearly independent.' This will complete our examination of vectors, allowing us to move on to matrices in module 3 and then start to solve linear algebra problems.

8 videos (Total 44 min), 4 quizzes

Modulus & inner product (10m)

Cosine & dot product (5m)

Projection (6m)

Changing basis (11m)

Basis, vector space, and linear independence (4m)

Applications of changing basis (3m)

Summary (1m)

Dot product of vectors (15m)

Changing basis (15m)

Linear dependency of a set of vectors (15m)

Vector operations assessment (15m)
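The modulus, dot product, cosine, and projection operations described above can be sketched in a few lines of NumPy. The vectors `r` and `s` here are hypothetical examples chosen so the numbers come out cleanly:

```python
import numpy as np

r = np.array([3.0, 4.0])
s = np.array([1.0, 0.0])

# Modulus (size) of a vector
mod_r = np.linalg.norm(r)   # 5.0

# Dot (inner) product, and the cosine of the angle between r and s
dot = r @ s                                                # 3.0
cos_theta = dot / (np.linalg.norm(r) * np.linalg.norm(s))  # 0.6

# Scalar and vector projection of r onto s
scalar_proj = dot / np.linalg.norm(s)   # 3.0
vector_proj = (dot / (s @ s)) * s       # [3. 0.]
print(mod_r, cos_theta, vector_proj)
```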

Week 3

Now that we've looked at vectors, we can turn to matrices. First we look at how to use matrices as tools for solving linear algebra problems, and as objects that transform vectors. Then we look at how to solve systems of linear equations using matrices, which takes us on to inverse matrices and determinants, and to thinking about what the determinant really is, intuitively speaking. Finally, we'll look at special matrices whose determinant is zero and which therefore aren't invertible - cases where algorithms that need to invert a matrix will fail.

8 videos (Total 57 min), 3 quizzes

How matrices transform space (5m)

Types of matrix transformation (8m)

Composition or combination of matrix transformations (8m)

Solving the apples and bananas problem: Gaussian elimination (8m)

Going from Gaussian elimination to finding the inverse matrix (8m)

Determinants and inverses (10m)

Summary (59s)

Using matrices to make transformations (12m)

Solving linear equations using the inverse matrix (16m)
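The themes of this module - matrices as transformations of space, solving A x = b, and determinants and inverses - can be sketched in NumPy. The matrices and right-hand sides below are made-up examples, and `np.linalg.solve` stands in for hand-worked Gaussian elimination:

```python
import numpy as np

# A matrix as a transformation of space: rotation by 90 degrees
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])
v = np.array([1.0, 0.0])
print(R @ v)   # [0. 1.] - the x-axis basis vector rotated onto the y-axis

# An "apples and bananas"-style system: 2x + 3y = 8, x + y = 3
A = np.array([[2.0, 3.0],
              [1.0, 1.0]])
b = np.array([8.0, 3.0])
print(np.linalg.solve(A, b))   # [1. 2.]

# Determinant and inverse; det(A) == 0 would mean A is not invertible
print(np.linalg.det(A))
print(np.linalg.inv(A))
```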

Week 4

In Module 4, we continue our discussion of matrices. First we think about how to code up matrix multiplication and matrix operations using the Einstein summation convention, a notation widely used in more advanced linear algebra courses. Then we look at how matrices can transform a description of a vector from one basis (set of axes) to another. This will allow us, for example, to figure out how to apply a reflection to an image and manipulate images. We'll also look at how to construct a convenient basis vector set for doing such transformations. Then we'll write some code to do these transformations and apply this work computationally.

6 videos (Total 53 min), 4 quizzes

Matrices changing basis (11m)

Doing a transformation in a changed basis (4m)

Orthogonal matrices (6m)

The Gram–Schmidt process (6m)

Example: Reflecting in a plane (14m)

Non-square matrix multiplication (20m)

Example: Using non-square matrices to do a projection (12m)
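The Gram–Schmidt process mentioned above - building a convenient orthonormal basis from a set of linearly independent vectors - can be sketched in NumPy. The input vectors are hypothetical, and this is an illustrative implementation rather than the course's own code:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn linearly independent vectors into an orthonormal basis (as rows)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for e in basis:
            w = w - (w @ e) * e   # remove the component along each earlier basis vector
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

E = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
print(E)
# The resulting matrix is orthogonal: E @ E.T is the identity
print(np.allclose(E @ E.T, np.eye(2)))   # True
```

Orthogonal matrices like `E` are exactly what make transformations in a changed basis convenient: their inverse is just their transpose.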

Week 5

Eigenvectors are particular vectors that are unrotated by a transformation matrix, and eigenvalues are the amount by which the eigenvectors are stretched. These special 'eigen-things' are very useful in linear algebra and will let us examine Google's famous PageRank algorithm for presenting web search results. Then we'll apply this in code, which will wrap up the course.

9 videos (Total 44 min), 1 reading, 5 quizzes

What are eigenvalues and eigenvectors? (4m)

Special eigen-cases (3m)

Calculating eigenvectors (10m)

Changing to the eigenbasis (5m)

Eigenbasis example (7m)

Introduction to PageRank (8m)

Summary (1m)

Wrap up of this linear algebra course (1m)

Did you like the course? Let us know! (10m)

Selecting eigenvectors by inspection (20m)

Characteristic polynomials, eigenvalues and eigenvectors (30m)

Diagonalisation and applications (20m)

Eigenvalues and eigenvectors (25m)
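The eigen-concepts and the PageRank idea described in this module can be sketched in NumPy. The link matrix `L` is a made-up three-page example (columns sum to 1), and power iteration stands in for a full eigen-decomposition:

```python
import numpy as np

# Eigenvectors are unrotated by the transformation; eigenvalues are the stretch.
# For this diagonal matrix the axis vectors are eigenvectors with stretches 2 and 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)
print(vals)   # [2. 3.]

# Toy PageRank: the ranking is the eigenvector of the link matrix with
# eigenvalue 1, found here by power iteration on a hypothetical 3-page web.
L = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
r = np.ones(3) / 3        # start with equal rank for every page
for _ in range(100):
    r = L @ r             # repeatedly apply the link matrix until r settles
print(r)                  # for this symmetric web, every page ranks equally
```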

4.6 (484 reviews)

By CS • Apr 1st 2018

Amazing course, great instructors. The amount of working linear algebra knowledge you get from this single course is substantial. It has already helped solidify my learning in other ML and AI courses.

By PL • Aug 26th 2018

Great way to learn about applied Linear Algebra. Should be fairly easy if you have any background with linear algebra, but looks at concepts through the scope of geometric application, which is fresh.

Imperial College London is a world top-ten university with an international reputation for excellence in science, engineering, medicine and business, located in the heart of London. Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.
Imperial students benefit from a world-leading, inclusive educational experience, rooted in the College's world-leading research. Our online courses are designed to promote interactivity, learning and the development of core skills, through the use of cutting-edge digital technology.

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.
In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them.
The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.
The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require basic Python and numpy knowledge.
At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning.

When will I be able to access the lectures and assignments?

Once you enroll in the course, you get access to all of the videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing it, you may not be able to access certain assignments.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

Is financial aid available?

More questions? Visit the Learner Help Center.