
Learner reviews and feedback for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization by deeplearning.ai

4.9
39,861 ratings
4,245 reviews

About the Course

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance and be able to get good results more systematically. You will also learn TensorFlow. After 3 weeks, you will:

- Understand industry best practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.

This is the second course of the Deep Learning Specialization.
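As a taste of the optimization algorithms the description lists, here is a minimal NumPy sketch of a single Adam update step, written in the standard textbook form rather than taken from the course's assignments:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array (standard formulation)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate (Momentum part)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate (RMSprop part)
    m_hat = m / (1 - beta1 ** t)                # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

Momentum and RMSprop correspond to keeping only the first- or second-moment estimate, respectively.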

Top reviews

CV

Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction of TensorFlow. Thanks.

AO

Apr 06, 2018

Fantastic course! For the first time, I now have a better intuition for optimizing and tuning the hyperparameters used for deep neural networks. I got motivated to learn more after completing this course.


4101–4125 of 4,179 Reviews for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

By Alexander V

Feb 25, 2018

Tests are very easy, and the programming exercises are very straightforward, to the point where it is really obvious what to do. I could have learned more if both were more challenging.

By Tushar B

Jun 12, 2018

Assignments vs. lectures: the difference is huge.

By QUINTANA-AMATE, S

Mar 11, 2018

Again, nice videos but not

By FREDERIC T

May 13, 2018

Good courses, but the sound quality is very poor (high-pitched noise).

By Dimitrios G

Nov 28, 2017

The course continues on the path that the previous Deep Learning course set, but I found the use of TensorFlow somewhat limiting. It is a great tool that simplifies the training and running of NNs, but it does not allow for easy debugging or for easily looking inside the built-in functions to spot problems. I felt that we were treating many tf.functions as black boxes, and I am not so fond of this. Otherwise the course was fairly useful.

By Maysa M G d M

Mar 05, 2018

Some exercises were wrong, like Z3 in the TensorFlow model: you said Z3 = W3*Z2 + b3, but it should have been A2, not Z2.

Several exercises did not check the result for each function, so when I arrived at the huge model function, it was hard to discover where I was wrong.

I think this third week could be split into two. I missed an exercise on normalization; the exercises were all in TensorFlow.
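For reference, the relation this reviewer is pointing at: in the course's layer notation, the linear step of layer 3 uses the activation of layer 2, not its pre-activation (a standard identity, stated here independently of the assignment text):

```latex
Z^{[3]} = W^{[3]} A^{[2]} + b^{[3]}, \qquad A^{[2]} = g^{[2]}\!\left(Z^{[2]}\right)
```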

By Dartois S

Aug 17, 2017

Somewhat weaker than the previous course. It would have been good to have a chance to concretely implement batch normalization. The tutorial on TensorFlow also needs more details and explanation of the what and why of its conventions. Anyway, I was really happy to learn a bit about TensorFlow; I hope I will use it more throughout the course.

By Amod J

Mar 18, 2018

Want to download my own work but cannot.

By Alex I E

Sep 05, 2017

The Tensorflow part should have started sooner in the course.

By Laura L

Mar 22, 2018

It does not make you think about the problems, just fill in the gaps. The first course was better.

By Li X

May 12, 2018

Good: the content on TensorFlow.

Bad: no really useful content compared to Course 1.

By Maisam S W

Oct 04, 2017

I still find TensorFlow hard.

By Peter G

Dec 05, 2017

Nice course, but again the main emphasis is on the practical side, with a 'never mind, you don't need to know the details' approach. Optional parts showing the theory behind the batch-normalization implementation and the derivation of the softmax derivative would be very desirable. Another not-so-great thing is that the final TensorFlow-related practice exercises are too 'quick', in the sense that 99% of the code is written for you and hints are given in such a way that you literally don't even have to use half of your brain. It is also frustrating when everything is already done for you.
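For readers who want the derivation this reviewer asks for, a short sketch in standard notation (not taken from the course materials): with softmax outputs s_i = e^{z_i} / sum_k e^{z_k}, the quotient rule gives

```latex
\frac{\partial s_i}{\partial z_j}
  = \frac{e^{z_i}\,\delta_{ij}\sum_k e^{z_k} - e^{z_i} e^{z_j}}{\left(\sum_k e^{z_k}\right)^2}
  = s_i\left(\delta_{ij} - s_j\right),
\qquad
\delta_{ij} =
\begin{cases}
1 & i = j \\
0 & i \neq j
\end{cases}
```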

By 笛 王

Jan 19, 2018

Harder to understand. Overall quality is not as good as the first class.

By Chaobin Y

Nov 03, 2017

Maybe this course could be merged with the first one; they both cover too little material.

By Joshua P J

Jun 08, 2018

I've loved Andrew Ng's other courses, but this course was boring and not well organized. The lectures were unfocused and rambled a lot; they're nearly the opposite in style to Prof. Ng's other material, which I found extremely well organized. Most topics could be shortened 33-50% with no loss of clarity.

The course structure itself could use improvement:

The first part of Week 3 (Hyperparameter Tuning) belongs in Week 2.

The third part of Week 3 (Multi-Class Classification) should be its own week and its own assignment and could really be its own course. This is *THE* problem that almost every "applied" machine learning paper I've read is attempting to solve, whether by deep learning or some other class of algorithms. (Context and full disclosure: I'm a Ph.D. Geophysicist and my research is in seismology and volcanology.)

The introduction to TensorFlow needs to explain how objects and data structures work in TF. It really needs to explain the structure and syntax of the feed dictionary.

In the programming assignment for Week 3, there are three issues: (a) The correct use of feed_dict in 1.3 is completely new and cannot be guessed from the instructions or the TF website, and it's not clear why we use float32 for Y instead of int64; (b) In 1.4, "tf.one_hot(labels, depth, axis)" should be "tf.one_hot(labels, depth, axis=axis_number)". (c) In 2.1, the expected output for Y should have shape (6,?), not (10,?).
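To make the two API points above concrete, here is a minimal sketch using the TensorFlow 1.x style this course relied on at the time; the shapes and variable names are illustrative, not copied from the assignment:

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x, as used by the course when this review was written

labels = np.array([1, 2, 3, 0, 2, 1])   # class indices (illustrative)
C = 6                                    # number of classes

# Passing axis as a keyword: with axis=0 the one-hot dimension comes first,
# giving shape (C, m), which matches the (6, ?) expected shape for Y.
one_hot_matrix = tf.one_hot(labels, depth=C, axis=0)

# Y is declared float32 so it can be fed straight into the cross-entropy cost.
Y = tf.placeholder(tf.float32, shape=[C, None], name="Y")
cost = tf.reduce_mean(Y)                 # stand-in for the real cost node

with tf.Session() as sess:
    print(sess.run(one_hot_matrix))
    print(sess.run(cost, feed_dict={Y: np.eye(C)[:, :4]}))
```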

By Aliaksei A

Sep 07, 2017

Still looks raw.

By Adrian C

Nov 30, 2017

So far, I think this course is weak on theory, seems rushed, and should provide more in-depth lecture notes.

By Kenneth Z

Mar 20, 2018

It is a bit abrupt to jump into TensorFlow without explaining it in depth.

By Gadiel S

Sep 21, 2018

The course is good. It covers important ideas, and they are well explained in the videos. However, the formulation of the assignments is sloppy. There are mistakes and inconsistencies, in some cases necessary explanations are missing, and in some cases the instructions are misleading (I suspect the assignment has changed over time, but the instructions have not been consistently updated).

By K K R

Sep 17, 2018

Some of the videos are very abstract and need a bit of mathematical intuition. These intuitions are best obtained by calculation rather than a lecture :)

By Salim S I

Aug 12, 2018

Would have liked a programming assignment in plain Python to understand the various initializations and optimizations. Although the TensorFlow introduction was good, it felt like being left stranded without a Python assignment to cement the things learned in class.

By Ha S C

Oct 29, 2018

A much sloppier and poorer course than the previous one. Grading mishaps (the fault of the grader), a few errors in the lectures (the variance in the normalization), and very basic and unhelpful feedback from staff made for a course that did not live up to the level of its predecessor. If at any point you need further help, it is generally unavailable, or at best difficult to find.
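For context on the normalization this reviewer mentions, the standard input-normalization formulas are (stated here independently of the lecture slides):

```latex
\mu = \frac{1}{m}\sum_{i=1}^{m} x^{(i)}, \qquad
\sigma^2 = \frac{1}{m}\sum_{i=1}^{m} \left(x^{(i)} - \mu\right)^2, \qquad
x^{(i)}_{\text{norm}} = \frac{x^{(i)} - \mu}{\sqrt{\sigma^2 + \varepsilon}}
```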

By Imad M

Nov 04, 2018

Weeks 1 and 2 need more examples of Python programming in the videos. The videos for week 3 were a lot more interesting. Without Python implementation examples in the videos, the course can be very dry.

By Peiyu H

Oct 12, 2018

Lots of errors in the final exercise. It seems some errors have existed since previous sessions. I hope the teaching team will fix them and make learning less confusing for us.