
Attention mechanism

Course video 32 of 43

Nearly any task in NLP can be formulated as a sequence-to-sequence task: machine translation, summarization, question answering, and many more. In this module we will learn a general encoder-decoder-attention architecture that can be used to solve them. We will cover machine translation in more detail, and you will see how the attention technique resembles the word alignment step in a traditional translation pipeline. A minimal sketch of the idea follows.
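To make the idea concrete, here is a minimal sketch of dot-product attention in Python with NumPy, under illustrative assumptions (function names, dimensions, and random toy data are ours, not from the course): at each decoding step, the decoder state is scored against every encoder hidden state, the scores are normalized with a softmax, and the result is a weighted sum of encoder states.

import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def dot_product_attention(decoder_state, encoder_states):
    """Compute an attention-weighted context vector.

    decoder_state:  (hidden_dim,)         current decoder hidden state
    encoder_states: (src_len, hidden_dim) one hidden state per source token
    """
    # Score each source position against the decoder state.
    scores = encoder_states @ decoder_state   # (src_len,)
    # Normalize into an alignment distribution over source tokens.
    weights = softmax(scores)                 # (src_len,)
    # Context vector: weighted sum of encoder states.
    context = weights @ encoder_states        # (hidden_dim,)
    return context, weights

# Toy example: 5 source tokens, hidden size 4 (illustrative values only).
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 4))
dec = rng.normal(size=(4,))
context, weights = dot_product_attention(dec, enc)
print("alignment weights:", np.round(weights, 3))  # sums to 1

The printed weights can be read as a soft word alignment: each weight says how much the current target position attends to each source token, which is the connection to the alignment step in traditional machine-translation pipelines mentioned above.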
