Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization by deeplearning.ai

4.9
40,139 ratings
4,265 reviews

About the Course

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance and be able to get good results more systematically. You will also learn TensorFlow. After 3 weeks, you will:

- Understand industry best practices for building deep learning applications.
- Be able to effectively use common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.

This is the second course of the Deep Learning Specialization.
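
To give a flavor of the optimization algorithms mentioned above, here is a minimal, illustrative NumPy sketch of a single Adam update step. It is not taken from the course assignments; the function name adam_step and the hyperparameter defaults are just the commonly used conventions.

    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # First-moment estimate (the Momentum idea).
        m = beta1 * m + (1 - beta1) * grad
        # Second-moment estimate (the RMSprop idea).
        v = beta2 * v + (1 - beta2) * grad ** 2
        # Bias correction for the early steps (t starts at 1).
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        # Parameter update.
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
        return theta, m, v

In mini-batch gradient descent, a step like this would be applied to each parameter array after every mini-batch; it combines the Momentum and RMSprop ideas that the course builds up separately before introducing Adam.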

Top Reviews

CV

Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction of TensorFlow.

Thanks.

AO

Apr 06, 2018

Fantastic course! For the first time, I now have a better intuition for optimizing and tuning hyperparameters used for deep neural networks. I got motivated to learn more after completing this course.

226–250 of 4,203 reviews for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

By 唐章源

Mar 27, 2019

perfect

By HEF

Mar 27, 2019

This course taught me a lot of things that I cannot usually find in a school curriculum, yet the content is extremely useful in helping me accelerate my algorithms. This course is very important for handling deep learning projects, I think.

By Abhinay P

Mar 27, 2019

This is one of the best courses. The use of hyperparameters and tuning is explained very well.

By KEMAL S

Mar 27, 2019

These courses give a new perspective on things. I gained lots of valuable information.

By Ravikant C

Mar 28, 2019

I really enjoyed doing this assignment. A perfect combination of hands-on and concept discussion.

By Lucifer Z

Mar 28, 2019

awesome!

By RAJ S

Mar 28, 2019

I can't stop myself from watching the next course.

By Sergio L M

Mar 28, 2019

Great!

By Celia C

Mar 27, 2019

I hope the TensorFlow homework could be more clearly explained, and I wish there were more TensorFlow parts in the homework.

By Pieter J V V V

Mar 28, 2019

Very clear explanations, well guided exercises.

By Sherif M

Mar 29, 2019

Great as always; the notebooks especially appear more mature now and are educational by themselves.

Great job by Andrew and team!

By Gabriel

Mar 29, 2019

another fantastic entry

By Matthew J B

Mar 29, 2019

Fantastic

By Muhammad H B K

Mar 30, 2019

Was an amazing course. There are a few mistakes in the final exercise of week 3.

By Federico A G C

Mar 30, 2019

Fantastic in every way!

By ChenTianyi

Mar 31, 2019

amazing

By Raksha Y

Mar 31, 2019

Excellent

By Lydia N

Mar 30, 2019

Learning can be so much fun!!! I like to read the book Deep Learning by Ian Goodfellow and Yoshua Bengio for additional support :-)

By yasser s

Mar 30, 2019

Thank you! Very helpful!

By zhou

Mar 31, 2019

Very detailed and especially well taught, and there are lab exercises as well.

By Haoqiu W

Mar 31, 2019

nice teacher

By Raza F

Mar 31, 2019

Best course for learning hyperparameter tuning, regularization, and optimization.

By Bassel G

Mar 31, 2019

Thank you, Coursera. I am thankful to you, Andrew, and all the team members of this course.

By Aishwarya R

Mar 19, 2019

deeplearning.ai courses are worth the time and effort

By Sagar K

Mar 19, 2019

I liked the content of this course. I would have liked optional videos about the mathematics behind the optimization algorithms. Appreciate the focus on building the optimization algorithms from the ground up before learning a framework.