Back to Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

4.9 stars

42,913 ratings • 4,599 reviews

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow.
After 3 weeks, you will:
- Understand industry best-practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence.
- Understand new best-practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
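As one illustration of the optimization algorithms listed above, here is a minimal NumPy sketch of a single Adam update, which combines Momentum's first-moment estimate with RMSprop's second-moment scaling. The hyperparameter values are the commonly used defaults, assumed here for illustration, not taken from the course materials:

```python
import numpy as np

def adam_step(w, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # First moment: exponentially weighted average of gradients (Momentum)
    v = beta1 * v + (1 - beta1) * grad
    # Second moment: exponentially weighted average of squared gradients (RMSprop)
    s = beta2 * s + (1 - beta2) * grad ** 2
    # Bias correction for the zero-initialized moment estimates
    v_hat = v / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    # Parameter update
    w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
    return w, v, s

# Toy check: minimize f(w) = w**2 (gradient 2*w) starting from w = 5.0
w = np.array(5.0)
v = np.zeros_like(w)
s = np.zeros_like(w)
for t in range(1, 2001):          # t starts at 1 so bias correction is defined
    w, v, s = adam_step(w, 2 * w, v, s, t, lr=0.05)
```

The same `v` update on its own is plain Momentum, and the `s` update on its own is RMSprop; Adam's extra ingredient is the bias correction of both moments.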
This is the second course of the Deep Learning Specialization....

Dec 06, 2019

I enjoyed it; it is really helpful. I'd like to have the opportunity to implement all of this deeply in a real example.

The only thing I didn't have completely clear is the batch norm; it is so confusing.
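For readers who share this sticking point, a minimal sketch of the batch norm forward pass may help: normalize each feature over the mini-batch, then apply a learnable scale and shift. This is a generic NumPy illustration with assumed shapes (features in rows, examples in columns), not the course's assignment code:

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize each feature across the mini-batch, then scale and shift."""
    mu = Z.mean(axis=1, keepdims=True)        # per-feature batch mean
    var = Z.var(axis=1, keepdims=True)        # per-feature batch variance
    Z_norm = (Z - mu) / np.sqrt(var + eps)    # zero mean, unit variance per feature
    return gamma * Z_norm + beta              # learnable scale (gamma) and shift (beta)

rng = np.random.default_rng(1)
Z = rng.standard_normal((3, 8)) * 10 + 5      # 3 features, batch of 8 examples
out = batchnorm_forward(Z, gamma=np.ones((3, 1)), beta=np.zeros((3, 1)))
```

With `gamma = 1` and `beta = 0` the output is simply the standardized pre-activations; during training the network learns `gamma` and `beta` so it can undo the normalization where that helps.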

Dec 24, 2017

Exceptional course. The hyperparameter explanations are excellent; every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction to TensorFlow.

Thanks.

Filtered by:

By Jeroen M

• Jan 10, 2018

Great course, a few rough edges in the exercises and I also feel the exercise comments give away a bit too much (would be better if the student needed to figure out things by himself a little more). But these are minor details, I've learned a great deal in an amazingly short span of time, from one of the top minds in AI today!

By Jeff R

• Oct 02, 2017

I appreciate the large amount of time that has gone into preparing this course. I note that there are a large number of corrections in the errata forum that have not been reviewed by staff. In particular there are some obvious errors in the programming assignments that could easily be corrected with a small investment in time.

By Murat T

• Dec 24, 2018

Topics are cut into well-defined, clear sections. The programming assignments definitely give you hands-on experience. Also, the math is demystified so that you can follow it with high-school math. If you have used a framework like Keras and you want to know why and when you need a given function, parameter, etc., you will love this course.

By Gilles D

• Sep 06, 2017

Finally, a clear and definitive explanation of network initialization, regularization and optimization. Good insights shared on hyperparameter prioritization.

We learn the how and why and suddenly, it all becomes a little bit less mysterious. It is all clearly explained in a very accessible way.

Great value for my needs.

By Xuefeng P

• Aug 29, 2017

This course really gives you fundamental and practical ideas about the hyperparameters of DNNs and how to tune them. The part I liked most is the last programming assignment: playing with TensorFlow! The assignment walks you through TensorFlow structure and basics in a very organized fashion.

Highly recommended!

By Mihai L

• Jan 21, 2018

This course is also interesting. The art of tuning hyperparameters and the other optimization techniques are very interesting and nicely explained.

The introduction to TensorFlow and its assignment is also interesting. Overall the difficulty is not high, since most of the scaffolding is done for you, but the concepts are really powerful and important.

By Vlad M

• Sep 07, 2018

The course part is overall good.

The last assignment can be improved in two key ways:

The comment # Z3 = np.dot(W3,Z2) + b3 should be # Z3 = np.dot(W3,A2) + b3; I figured this out by myself without help from the forums. :)

Also, the Adam optimization step is not very apparent in the instructions; I had to search the forums for issues.
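The fix this reviewer points out can be illustrated with a minimal NumPy forward-propagation sketch. The layer sizes here are illustrative assumptions, and the variable names only follow the course's general convention (Z for pre-activations, A for activations), not the actual assignment code:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(Z):
    return np.maximum(0, Z)

# Illustrative layer sizes: 4 inputs -> 5 -> 3 -> 2 outputs, batch of 6 examples
X = rng.standard_normal((4, 6))
W1, b1 = rng.standard_normal((5, 4)), np.zeros((5, 1))
W2, b2 = rng.standard_normal((3, 5)), np.zeros((3, 1))
W3, b3 = rng.standard_normal((2, 3)), np.zeros((2, 1))

Z1 = np.dot(W1, X) + b1
A1 = relu(Z1)                     # activation of layer 1
Z2 = np.dot(W2, A1) + b2
A2 = relu(Z2)                     # activation of layer 2
# The linear step of layer 3 consumes the *activation* A2, not the
# pre-activation Z2 -- the substance of the comment fix above:
Z3 = np.dot(W3, A2) + b3
```

Each layer's linear step always takes the previous layer's post-nonlinearity output, which is why `np.dot(W3, Z2)` in a comment would mislead anyone using it as a template.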

By Brad M

• Aug 22, 2019

In my deep learning classes in academia, hyperparameter tuning was always "hand-waved" away - my questions were always deflected, or put off. This class answered every one of my questions, and made me more confident I'd be able to implement a DL system in industry, and be satisfied with the results. Very good course!

By Toby K

• Nov 01, 2019

I am working through the DL specialisation. Consistently good teaching style and the programming assignments are suitably pitched for getting the learner to pick up methods quickly e.g. Tensorflow syntax for self-application later. Good course and looking forward to the next in the series. Well done Andrew and team.

By Ankur T

• Nov 21, 2018

Words are not sufficient; sign up and experience it. A deep learning beginner who already has a math background can easily understand the concepts behind it, but for implementation you need to refer to extra materials on the internet and in books too. Andrew Ng explains the concepts and the recipe, but for practice you will have to work hard.

By afshin m

• Feb 05, 2018

This course is a continuation of the first course, which is a prerequisite. I really liked the learning style: the first course and the first two weeks of the second course taught neural networks by doing all the math and calculations manually, and finally introduced TensorFlow with parallels to what was taught in class.

By arulvenugopal

• Dec 17, 2017

This is another excellent course in this specialization. I enjoyed the programming assignments. The instructions and tips made the TensorFlow coding section easy. However, a few blocks consumed more than a few hours, due to placeholder logic, and the TF documentation is overwhelming. I am proceeding to the next course.

By Wei L

• Aug 26, 2017

This course is harder than the previous one. It teaches more details of tuning parameters and optimization in deep learning. At the end it also teaches TensorFlow, which is really helpful. It's like a programming course: nearly all the commands have already been provided, so it's not hard to get the code correct.

By 姜云鹏

• Nov 21, 2017

It is really good and taught me a basic understanding of deep learning back propagation and gradient optimization like Momentum, RMSprop, and Adam; finally I learned how to use TensorFlow to train my model.

But there are some mistakes in the assignments and also in the grader, so it cost me a lot of time to no purpose.

By Vinodh R

• Nov 12, 2017

The course content was excellent. The only issue is that there were some glitches with the grading of the second week programming assignment, in that I could obtain the expected output, but with repeated submissions, there would be (different) sections which could not be graded due to unnamed technical issues.

By Renato L

• Jul 03, 2019

Excellent content and very well explained. Thanks for this amazing course.

The course covers the building blocks of a neural network. Andrew (and his team) did a great job organizing the content in an evolving way, in which you have the chance to build up knowledge from each piece of a (deep) neural network.

By Bryan H

• May 28, 2018

Practical programming lessons, and well-paced enjoyable lectures.

Comments:

Move the TensorFlow tutorials, which were the most obscure part of the course, to Course 3. TensorFlow isn't as intuitive as other numerical toolboxes, so spending more time on its foundations might reduce the learning curve.

By Mojtaba H

• Feb 11, 2020

It covers very good tips and tricks to build and enhance deep learning models.

Andrew is the best teacher for ML and deep learning; he covers theory and practice simultaneously.

In this course you can understand all the mathematical intuitions and implement a neural network from scratch with your own code.

By Rob v P

• Oct 02, 2017

This second course in the specialization is really great. I have gained a lot of insight into hyperparameter tuning and the reasons why these techniques work (or don't ;-). It is much easier now to understand what models are doing and why we need certain techniques. This is again one of the best courses for deep learning.

By Abdallah D

• Feb 03, 2020

Fantastic course providing a broad overview of hyperparameter tuning in deep neural networks. The introduction to TensorFlow is informative. Looking forward to the three remaining courses of this great specialization on machine learning. Thanks to Andrew and his assistants for putting these courses together!

By Daniel R B

• Jun 06, 2018

I really liked the course. The forum is very helpful navigating programming errors during the assignments.

One thing to improve would be feeding corrections from the forums back into the lectures, especially corrections to programming assignments whose instructions don't match the expected result. Thanks.

By Steve S

• Dec 11, 2017

Provided a lot of deeper insights passed over in the previous course in the specialization. Between this course and the previous course, you feel like you have a very solid beginner's understanding of deep learning, but one that is also practical enough and comprehensive enough to start coding on your own.

By Marcin G

• Oct 15, 2017

Andrew Ng is a great teacher and will get you excited about improving deep networks. In this course you will get to know how to increase the performance of your network. An essential course for deep-network specialists and amateurs alike. Additionally, you will get to know the most influential people behind the technology.

By Shashank S S

• Jul 08, 2019

All areas of improving deep learning models were covered in detail. I liked the lucid and intelligible explanations. Since the topics are vast, I would recommend extending the course by one week, with one more programming assignment on TensorFlow and a capstone project.

By Vincenzo M

• Sep 11, 2017

This course will become a foundational course for people who aim to work in the machine learning / deep learning area, because it presents the recent innovations in deep learning clearly. In production, people will probably use an open-source framework, but this course clarifies what is behind it.