Beam Search

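As a rough illustration of the decoding algorithm this lesson covers, here is a minimal beam-search sketch in plain Python. The `next_token_log_probs` callable and the toy lookup table are hypothetical stand-ins for a trained sequence model (in practice an RNN, LSTM, or attention decoder would supply these scores); this is not the course's assignment code.

```python
import math

def beam_search(next_token_log_probs, start_token, end_token, beam_width=3, max_len=10):
    """Keep the `beam_width` highest-scoring partial sequences at every step."""
    # Each beam is a (tokens, cumulative log-probability) pair.
    beams = [([start_token], 0.0)]

    for _ in range(max_len):
        candidates = []
        all_finished = True
        for tokens, score in beams:
            if tokens[-1] == end_token:
                candidates.append((tokens, score))   # finished beams carry over unchanged
                continue
            all_finished = False
            for token, logp in next_token_log_probs(tokens).items():
                candidates.append((tokens + [token], score + logp))
        # Prune: keep only the best `beam_width` hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
        if all_finished:
            break

    # Length-normalize so longer hypotheses are not unfairly penalized.
    beams.sort(key=lambda c: c[1] / len(c[0]), reverse=True)
    return beams[0]

# Toy "model" (hypothetical): a lookup table of next-token log-probabilities.
toy_table = {
    ("<s>",): {"hello": math.log(0.6), "hi": math.log(0.4)},
    ("<s>", "hello"): {"world": math.log(0.7), "there": math.log(0.3)},
    ("<s>", "hi"): {"there": math.log(0.9), "world": math.log(0.1)},
}

def toy_model(tokens):
    # Any prefix not in the table deterministically emits the end token.
    return toy_table.get(tuple(tokens), {"</s>": 0.0})

print(beam_search(toy_model, "<s>", "</s>", beam_width=2))
# -> (['<s>', 'hello', 'world', '</s>'], log(0.42))
```

With `beam_width=1` this degenerates to greedy decoding; wider beams trade extra compute for a better chance of finding the highest-probability output sequence.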

Skills you'll gain

Natural Language Processing, Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Networks, Attention Models

Reviews

4.8 (26,738 ratings)
  • 5 stars: 83.69%
  • 4 stars: 13.07%
  • 3 stars: 2.49%
  • 2 stars: 0.47%
  • 1 star: 0.26%
SD
September 27, 2018

Great hands-on instruction on how RNNs work and how they are used to solve real problems. It was particularly useful to incorporate Conv1D, Bidirectional, and Attention layers into RNNs and see how they work.

JR
May 25, 2019

I am so grateful that Andrew and the team provided such a good course. I learned so much from it, and I was excited to see the wake word detection model actually work in the programming exercise.

From the lesson
Sequence Models & Attention Mechanism
Augment your sequence models using an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs. Then, explore speech recognition and how to deal with audio data.
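As a rough sketch of that idea (plain dot-product attention in NumPy, with illustrative names rather than the course's exact notation): each input step is scored for relevance against the current decoder state, a softmax turns the scores into weights, and the context passed onward is the weighted average of the inputs.

```python
import numpy as np

def attention(query, keys, values):
    # query: (d,) current decoder state; keys, values: (T, d) for a length-T input.
    scores = keys @ query                    # one relevance score per input step
    weights = np.exp(scores - scores.max())  # numerically stable softmax:
    weights /= weights.sum()                 # "where to focus", weights sum to 1
    context = weights @ values               # weighted sum of the input representations
    return context, weights

# Example: querying with the key of step 2 should put most of the weight on step 2.
rng = np.random.default_rng(0)
keys = values = rng.normal(size=(5, 8))
context, weights = attention(keys[2], keys, values)
print(weights.round(3))
```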

Instructors:

  • Andrew Ng
    Instructor
  • Kian Katanforoosh
    Senior Curriculum Developer
  • Younes Bensouda Mourri
    Curriculum Developer
