
Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization by deeplearning.ai

4.9
43,022 ratings
4,618 reviews

About the Course

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance and be able to get good results more systematically. You will also learn TensorFlow.

After 3 weeks, you will:
- Understand industry best practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.

This is the second course of the Deep Learning Specialization....
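To give a flavor of the optimization algorithms named above, here is a minimal NumPy sketch of a single Adam update step. The function name `adam_step` and all variable names are hypothetical, and the hyperparameter defaults (beta1=0.9, beta2=0.999, eps=1e-8) are the commonly used values rather than anything prescribed by the course itself.

```python
import numpy as np

# Hypothetical sketch of one Adam update step for a parameter array w.
# m and v are the running first- and second-moment estimates; t is the
# 1-based step counter used for bias correction.
def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad           # momentum-style first moment
    v = beta2 * v + (1 - beta2) * grad ** 2      # RMSprop-style second moment
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # parameter update
    return w, m, v

# Usage sketch: loop over mini-batches, compute grad for each, then call
# w, m, v = adam_step(w, grad, m, v, t) with t incremented every step.
```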

Top reviews

NA

Jan 14, 2020

After completing this course, I know which values to look at if my ML model is not performing up to the task. It is a detailed but not too complicated course for understanding the parameters used in ML.

XG

Oct 31, 2017

Thank you Andrew!! I now start to use TensorFlow; however, this tool is not well suited for research goals. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.


Showing reviews 4076-4100 of 4,549 for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

By Flaviu I V

Apr 07, 2018

I feel like the second course was better than the first one. But there are a couple of typos in some assignments, and the assignments are still too easy.

By Mark M

Oct 30, 2017

The introduction to hyperparameters was, from a mathematical point of view, as good as the basics of week 1; however, the practical relevance never becomes really clear.

By Stephan W

Sep 02, 2017

As always - excellent lectures by Andrew Ng. However, I think that the programming assignments tend to be a bit too easy and a bit too much "copy/paste".

By Sergey

Oct 06, 2019

I wish Prof. Ng provided more intuition into the underlying math, particularly why gradient optimization techniques help. But I like it anyway, very useful!

By Anthony

Nov 08, 2017

Great material, with a few minor errors that need fixing throughout. Noted in the forums. I expect this will improve as more people take the course and feedback is applied.

By Isaac S

Nov 27, 2019

What I missed in the course was an explanation, and possibly a programming assignment, covering different tuning algorithms such as random search and Bayesian search.

By James D B

Jun 23, 2019

Probably a little too "follow your nose" at this point in the specialisation. But nonetheless very good. Would give 4.5 stars if that were an option.

By Christoph S

Mar 03, 2019

Still some flaws + inaccuracies + video sequences that should be cut out. I think the organizers should really do it as people are now paying for it!

By Teodor C

Dec 28, 2018

The last TensorFlow assignment has some output typos and bugs when using operators like @ and +. The course was OK, but that assignment took me way too long.

By HongZhang

Jun 14, 2018

Great course to deepen my knowledge after the first course. However, I would like to have access to more programming exercises for practice. That would be perfect!

By Daniel E B G

Aug 26, 2019

I think this course would benefit from a little more explaining. There are a lot of new concepts and some explanations were too quick in my opinion.

By Stephen R

Oct 26, 2018

Enjoyed this course, especially the material that goes a bit deeper (different optimization methods, parameter tuning) and the intro to TensorFlow.

By Huang C H

Nov 24, 2017

Less exciting than the first course, but this course is important to understanding the parameters that could affect a neural network's performance.

By Youssouf B

Apr 22, 2019

What I did notice in the Deep Learning Specialization is that there are no further reading suggestions or a reading syllabus like in the other courses.

By Harsh T

Feb 26, 2019

This course is one of the best courses for a good understanding of hyperparameter tuning.

It also lets you know the effect of various hyperparameters.

By Nicolás E C

Apr 09, 2019

Nice course. TensorFlow might need some more in-depth explanation because it's different from regular Python programming, but it was really nice.

By Vinicius J S

Aug 08, 2018

Nice course and a nice TensorFlow introduction, but there are errors in the lectures and in the final test. Be prepared to use the forum sometimes.

By Daniel F

Feb 10, 2020

The course was awesome, but there is an error with the grader for one of the programming assignments, which took some time to find a workaround for.

By Collin J O

Mar 15, 2019

Valuable lessons, but the TensorFlow lesson and assignment at the end were a bit vague and hard to follow to the point of passing the test cases.

By Giuseppe N

Jul 09, 2018

It's very good, but I would have spent more time explaining the difference between adding layers and adding neurons, and how to decide the next move.

By Jeremy Z

Dec 11, 2017

A few of the examples and expected outputs for the programming exercises seemed not to be correct. Otherwise, great course. Highly recommended.

By David A S

Sep 27, 2017

Good course. Kinda skips over the hard bits, which only leaves one with more questions. Hopefully these details are covered in the later courses.

By Dinh T T

Feb 09, 2019

It's a wonderful course because it shows me how to improve deep neural networks and delves into some techniques for obtaining good hyperparameters.

By John S T L

Feb 01, 2019

Would have given 5 stars if the Jupyter exercise did not give me too much of a hard time looking for errors in syntax. Overall, great lesson!

By Parag P

Oct 19, 2018

Loved the easy-to-understand explanations given by Prof. Andrew Ng for some of the most complex concepts in Deep Learning, like Regularisation.