Trigger Word Detection

Skills You Will Learn

Natural Language Processing, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Network, Attention Models

Reviews

4.8 (26,729 ratings)
  • 5 stars: 83.69%
  • 4 stars: 13.07%
  • 3 stars: 2.49%
  • 2 stars: 0.47%
  • 1 star: 0.26%
SS
January 1, 2020

Learnt a lot about new concepts in RNNs and LSTMs. I really wanted to learn about these models, and this course helped a lot. Everything was new and so fascinating. Loved this course and our teacher Andrew Ng.

NM
February 20, 2018

I hope the course can elaborate on the backpropagation of RNNs much more. Backpropagation through time is a bit tricky, though we do not need to think about it during implementation when using most existing deep learning frameworks.
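As this reviewer notes, modern frameworks compute the backpropagation-through-time (BPTT) gradients automatically. The sketch below, using PyTorch's built-in nn.RNN on an arbitrary toy batch (all sizes and the regression target are illustrative assumptions, not course code), shows that a single loss.backward() call propagates gradients through every time step with no manual unrolling.

```python
# Minimal sketch: autograd handles BPTT for a recurrent layer automatically.
import torch
import torch.nn as nn

torch.manual_seed(0)

rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)  # simple Elman RNN
head = nn.Linear(16, 1)                                       # map last hidden state to a scalar

x = torch.randn(4, 20, 8)        # toy batch: 4 sequences, 20 time steps, 8 features (assumed)
y = torch.randn(4, 1)            # toy regression targets (assumed)

out, h_n = rnn(x)                # out: (4, 20, 16), hidden state at every time step
pred = head(out[:, -1, :])       # predict from the final time step's hidden state
loss = nn.functional.mse_loss(pred, y)

# One call unrolls the gradient computation through all 20 time steps (BPTT).
loss.backward()
print(rnn.weight_hh_l0.grad.shape)   # (16, 16): the recurrent weights received gradients
```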

From the Lesson
Sequence Models & Attention Mechanism
Augment your sequence models using an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs. Then, explore speech recognition and how to deal with audio data.
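The lesson description above summarizes attention as weighting the input positions before producing each output. Below is a minimal NumPy illustration of one additive (Bahdanau-style) attention step: score each encoder hidden state against the previous decoder state, softmax the scores into weights, and take the weighted sum as the context vector. All array shapes, layer sizes, and variable names are assumptions for illustration, not the course's reference implementation.

```python
# Minimal sketch of one additive attention step over a sequence of encoder states.
import numpy as np

rng = np.random.default_rng(0)

Tx, n_a, n_s = 10, 32, 64            # input length, encoder units, decoder units (assumed)
a = rng.normal(size=(Tx, n_a))        # encoder hidden states a<1>..a<Tx>
s_prev = rng.normal(size=(n_s,))      # previous decoder hidden state s<t-1>

# Small dense layers that score each encoder position (weights are random here, trained in practice).
W1 = rng.normal(size=(n_a + n_s, 16)) * 0.1
W2 = rng.normal(size=(16, 1)) * 0.1

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# 1) Concatenate s<t-1> with every a<t'> and compute an "energy" score per position.
s_rep = np.repeat(s_prev[None, :], Tx, axis=0)          # (Tx, n_s)
concat = np.concatenate([a, s_rep], axis=-1)            # (Tx, n_a + n_s)
energies = np.tanh(concat @ W1) @ W2                    # (Tx, 1)

# 2) Softmax over input positions gives the attention weights alpha<t, t'>.
alphas = softmax(energies, axis=0)                      # (Tx, 1), sums to 1

# 3) The context vector is the alpha-weighted sum of the encoder states.
context = (alphas * a).sum(axis=0)                      # (n_a,)

print("attention weights sum to", alphas.sum())         # ~1.0
print("context vector shape:", context.shape)           # (32,)
```

In a full model, W1 and W2 would be trained jointly with the rest of the network, and the resulting context vector would feed the decoder at each output step.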

Instructors:

  • Andrew Ng, Instructor
  • Kian Katanforoosh, Senior Curriculum Developer
  • Younes Bensouda Mourri, Curriculum Developer
