
Learner reviews and feedback for deeplearning.ai's Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

50,728 ratings
5,725 reviews

About the Course

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to get good results more systematically. You will also learn TensorFlow. After 3 weeks, you will:

- Understand industry best practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, momentum, RMSprop and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.

This is the second course of the Deep Learning Specialization.
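To make one of the optimization algorithms listed above concrete, here is a minimal NumPy sketch of a single Adam update; the function name, shapes, and defaults are illustrative, not taken from the course materials:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponentially weighted averages of the gradient (m)
    and of its elementwise square (v), with bias correction for early steps."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # correct m's bias toward zero at small t
    v_hat = v / (1 - beta2 ** t)   # correct v's bias toward zero at small t
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize f(w) = w^2 for a few hundred steps.
w, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 201):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
```

In mini-batch gradient descent, `grad` would be computed on one mini-batch per step rather than on the full training set.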

Top Reviews


Jan 14, 2020

After completing this course, I know which values to look at if my ML model is not performing up to the task. It is a detailed but not too complicated course for understanding the parameters used by ML.


Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction of TensorFlow.

Thanks.

Filter by:

Reviews 76–100 of 5,670 for Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

By 江小鱼

Feb 12, 2019

This time I finished Regularization. I think this was an interesting experience, since you implement your algorithms step by step. I got some magic (not black magic) algorithms, like RMSprop, momentum and Adam. Finally, the most fascinating part was building the TensorFlow model: just like a pipeline, step by step, with every step made in only one line, from forward propagation (without backward) to the full model. TensorFlow is really black magic.

(I have to say TensorFlow is a bit difficult; forgive my poor English, thanks.)

By Nathan Y

Oct 16, 2017

Neural networks are not new. What we learned in this course is some of the critical implementation details and tricks, from the past decades, of making them work in practice. Going beyond gradient descent to types of regularization and hyperparameter searching, we get a set of robust tools that quickly find good solutions in extremely high-dimensional spaces. As Professor Ng says, our understanding of optimization rules of thumb in low-dimensional spaces doesn't carry over to deep learning.

By José A

Oct 31, 2017

Seamlessly continues the previous course. If you know the basic structures of neural networks (how to initialize weights, sigmoid and tanh activations, and so forth), this will help you understand terms such as L2 regularization, gradient descent with momentum, RMSProp, Adam, exponentially weighted averages, and many others.

Don't let the 3 weeks put you off. It has a lot of micro-content material that builds on top of the previous work. Thanks to all the mentors for this great course.
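Several of the terms this reviewer lists (momentum, RMSProp, Adam) are built on the same primitive, the exponentially weighted average. A minimal sketch, with the function name my own:

```python
def ewa(xs, beta=0.9):
    """Exponentially weighted average: v_t = beta * v_{t-1} + (1 - beta) * x_t.
    With beta = 0.9 this averages roughly the last 1 / (1 - beta) = 10 points."""
    v, out = 0.0, []
    for x in xs:
        v = beta * v + (1 - beta) * x
        out.append(v)
    return out

smoothed = ewa([1.0] * 100)
```

Note that, starting from v = 0, the early values are biased low (the first output is 0.1, not 1.0); Adam corrects this bias by dividing by 1 - beta^t.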

By Raimond L

Aug 20, 2017

Really nice course. A lot of good information about how to prepare and divide data for training, hyperparameter optimization strategy, regularization techniques, learning algorithms, mini-batches, batch normalization and more. Very useful information with clear explanations! Highly recommended course.

Very positive course, except the TensorFlow practical assignment, which caused some stress, because for me that framework is a bit alienating, forcing me to look into the manual every minute.

By P M K

Nov 26, 2017

Hi, the course content was definitely good, and it helped me understand a lot of internals quite easily. I do, however, have one suggestion: the introduction to TensorFlow felt quite fast and could have been done better by giving more slides about TensorFlow before going on to the examples. Please ensure that you correct any errors pointed out by the members taking this course, so that it benefits others, avoids wasting time, and reduces frustration.

Regards, PMK

By Sachita N

Jun 18, 2018

Professor Ng explains the most complicated concepts in the most intuitive fashion I have ever seen. The explanations are simple, straightforward and they encompass so many perspectives and alternatives to doing things. The exercises are immensely educational - they strike a great balance between guiding the student and letting them figure stuff out on their own. This is a great specialisation and I would whole-heartedly recommend it for anyone wanting to start with Deep Learning.

By kindalin

Jul 31, 2019

This is the best course I have ever seen. Previous MOOC classes gave me some bad impressions, as they seemed to be created by some scholars just for KPIs. I believe that such a well-designed course will eventually replace the traditional curriculum. This also gives good hope to students at lesser-known schools.

The only downside is that the coursework instructions are too detailed, as many people have noted. I can see a lot of good and careful design in it, but I hope it can take a better form.

By Joppe G

Aug 13, 2017

This course is simply brilliant. You start with implementing the low-level functions that make up a deep learning framework. It's only in the last assignment that you explore TensorFlow. At that point, you have a full understanding of what the API encapsulates.

This really gives you confidence in your capability to get started with your own projects, knowing that you can come back at any time to brush up on some of the lower-level details.

Thank you Andrew and the whole team!

By Rajeev B A

Nov 18, 2017

The assignments are very good. All the parameter update methods are explained very well. I would recommend it very strongly for anyone who is looking for an in-depth understanding of why we do what we do for tuning, regularization, and optimization of NNs. All the implementation in the assignments is also from scratch, so that really helps a lot. I felt this is better than the Stanford CS231n course material; after all, this is a whole course on this specific purpose :).

By Marcel M

Jun 01, 2018

This course offers a practical way of fine-tuning your model in order to improve its performance, rather than Deep Learning being a "so-called" black box. It turns out that Machine Learning models are not black boxes: there are proven techniques not only for finding out what happens inside them but also for fine-tuning them in a systematic manner in order to improve their results. It is an excellent course for the practical Deep Learning engineer. Good job and keep it up!

By Artem M

Apr 22, 2018

Found a lot of interesting details about NNs that I did not know. This is a much better course than the first one. Includes TensorFlow exercises, which is useful. Nevertheless, proofs are still omitted for some results, like initializations. It is not hard to google them, but I bet the lecturers could explain them much faster than diving into the scientific literature. Otherwise, the intuitive explanations of Adam using exponential smoothing, or the physics analogy for momentum, are just brilliant.
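The physics analogy the reviewer mentions can be made concrete: in gradient descent with momentum, a velocity term accumulates past gradients like a ball rolling downhill, smoothing out oscillations. A minimal sketch, with names and defaults of my own choosing:

```python
def momentum_step(w, grad, vel, lr=0.1, beta=0.9):
    """Gradient descent with momentum: the velocity is an exponentially
    weighted average of past gradients; the parameter moves along it."""
    vel = beta * vel + (1 - beta) * grad
    return w - lr * vel, vel

# Minimize f(w) = w^2: the iterate spirals in toward the minimum at 0.
w, vel = 5.0, 0.0
for _ in range(200):
    w, vel = momentum_step(w, 2 * w, vel)
```

RMSprop follows the same pattern but tracks an average of squared gradients to rescale the step, and Adam combines both ideas.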

By Daniel C

Jan 14, 2018

True to the claimed learning objectives, the course Improving Deep Neural Networks shows some of the magic behind deep learning algorithms. The programming assignments solidify abstract concepts discussed in lecture videos. In fact, some portions like seeing cost decreases in real-time for Adam Optimization are truly eye-opening experiences.

One possible improvement is better editing of the instructions and code comments of the TensorFlow Tutorial programming assignment in Week 3.

By Pedro B M

Feb 04, 2019

As always, Andrew Ng is very didactic in explaining different and complex hyperparameter tuning techniques and optimization algorithms, giving intuitive explanations and examples. I've been learning a lot in these courses! And more than that, the content is presented in a way that motivates the student to go beyond and explore/try different implementations and problems. I highly recommend the course for anyone who wants to become a serious ML practitioner!

By Johnathan T

Sep 01, 2017

This class was awesome! Thank you to Andrew Ng and his team for putting this Specialization together. It is amazing for someone with so much experience in this field to be willing to share their wisdom with everyone, practically for free. The course content is filled with information that would have taken me years to acquire. I am fortunate to have the opportunity to build a strong foundation in this field at a time when A.I. is becoming society's new electricity!

By Anton V

Jun 13, 2018

A very valuable course to improve your understanding and develop a better toolset for using NNs. The instructor gives great tips on how to approach problems and explains the latest techniques very well. Also features a nice introduction to TensorFlow. As an experienced programmer, I found this course to be a breezy and fast hands-on tutorial to get you going quickly if you are taking these courses to apply to something you are interested in (e.g. a personal project).

By AVADH P

Jan 07, 2020

Excellent course!! Really glad to have taken this course as part of the Deep Learning Specialization. This course is a breakthrough in designing neural networks and deep networks through a thorough understanding of all the major aspects to be considered. The course also helps in learning widely used open-source industry frameworks such as TensorFlow. The assignments are well designed to build step-by-step understanding and practice of the material.

By Matheus H B d A

Sep 22, 2017

One of the courses I have enjoyed most so far. Since I started studying deep learning I have heard about many techniques that seemed impossible to understand and implement, but this course not only teaches how to implement some of them, it also helps you understand why these techniques are so good for neural network models, giving good intuition for how each method works. It also presents and helps demystify the TensorFlow framework.

By Joe M

Jul 14, 2019

This course was a great continuation of the first. The lecture pace is great (and ability to speed up or slow down the video speed helps a lot), the reiteration of past lessons helps with some of the denser materials, and the overall presentation is excellent. Also very nice that the problem sets aren't out to trick you! The material is new enough to many of us to begin with! The emphasis on practical application of the material is key (for me, at least).

By Nidhi V S

Apr 27, 2020

This course is very well designed, and the instructor does an amazing job of explaining the concepts, making them easy to learn even for a novice in the field. This course helped me gain a greater understanding of Neural Networks. I learned how to enhance the performance of a Neural Network by selecting appropriate hyperparameters, using regularization, normalization, and various other techniques. It was interesting to learn about the Softmax function.

By Ricardo S

Dec 17, 2017

The course covers an extremely important topic (I know I've been lost in the hyperparameter maze before), and allowed me to get a good feeling for what, when and how to use hyperparameters. I guess that to actually master the topic, students will have to practice with their own models and data sets; getting that practice would be out of the scope of the course, and thus I think the programming assignments were adequate.

By Holger O

May 23, 2019

Prof. Andrew did it again! I took the "classical" Machine Learning course and I'm pleased to see that this continuation is as good or even better. A total equilibrium between the mathematical depth you need to understand the basis of the algorithms and the practical skills you need to put them into practice in the real world, in the exact amount to fit in an 18-hour course. As a starting point, this course is perfect! Eager to keep on learning...

By David F

Sep 16, 2017

These courses are awesome. Andrew Ng is a very clear professor and the interviews with other ML practitioners are enlightening. My one criticism is that the assignments are put on a plate for you so they're pretty easy to complete but then difficult to replicate in real life (since so much of the scaffolding was taken care of for you while learning). But maybe that helps to preserve the flow of the class, rather than getting you bogged down in details.

By Sergio B S

Aug 01, 2018

I began using Deep Learning Frameworks before this course, but...

I realise now, after this second course and the first one, that learning the maths behind Neural Networks helps enormously in understanding and internalizing the real use of some of the most important hyperparameters, and the whats and whys of good strategies for regularizing models. As A. Ng repeats sometimes, this specialization helped me "get the intuition" to improve my models.

By Amit K

Dec 04, 2018

This is a good course for students who want to do real stuff with NNs. Some of the tricks are well explained, like L2, dropout, Adam, momentum, mini-batches, etc. I think these are much-needed tricks if I need to implement and tune my own NN on my own problems. I would prefer a second-level course which really talks about the challenges of real-life NNs and how to solve them. Once again, thanks a lot to the entire team for pulling this together.
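Of the tricks this reviewer names, dropout is perhaps the easiest to sketch. Below is a hedged illustration of inverted dropout for one layer's activations; the function name and keep probability are my own choices, not taken from the course:

```python
import numpy as np

def dropout_forward(a, keep_prob=0.8, rng=None):
    """Inverted dropout: zero each unit with probability 1 - keep_prob, then
    scale the survivors by 1 / keep_prob so expected activations are unchanged."""
    rng = rng or np.random.default_rng(0)
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob, mask

out, mask = dropout_forward(np.ones(10_000))
```

At test time dropout is disabled; because of the 1 / keep_prob scaling during training, no rescaling is needed at inference.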

By Eleanna S

Mar 04, 2018

Very useful course. Gives great insight into hyperparameter tuning, regularisation and optimisation. One request I have is to provide a Docker image which we can use to run the exercises locally. Sometimes I found it hard to build an environment where I could run the coursework. Some of the installations clash, and it is not clear which versions of the libraries are used in the coursework environment. It sometimes requires unnecessary effort.