Back to Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

4.9 stars • 48,112 ratings • 5,363 reviews

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow.
After 3 weeks, you will:
- Understand industry best-practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence.
- Understand new best practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
This is the second course of the Deep Learning Specialization.
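As a rough illustration of the optimization algorithms the syllabus lists (Momentum, RMSprop, Adam), here is a minimal NumPy sketch of one Adam update step. This is not course code; the function name is mine, and the hyperparameter defaults are the conventional ones.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: a momentum term plus an RMSprop-style term,
    each with bias correction for the early iterations (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSprop)
    m_hat = m / (1 - beta1 ** t)              # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

Momentum and RMSprop fall out as special cases: drop the second-moment term for plain Momentum, or the first-moment term for RMSprop.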

Apr 19, 2020

Very good course that gives you deep insight into how to enhance your algorithm and neural network and improve its accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.

Oct 09, 2019

I really enjoyed this course. Many details are given here that are crucial for gaining experience, and tips on things that look easy at first sight but are important for faster ML project implementation.

Filtered by:

By: Itsido C A • Dec 17, 2019

This is a must to really understand and master the art of machine learning. With this course I understood that building a model and training it is not even half the story of being a machine learning engineer; without knowledge of how to tune the model's parameters, you might not be able to deliver a product on schedule. Thanks to Dr. Andrew and the team for awesome content and a great learning experience.

By: sunshineren • Aug 31, 2019

It is a really EXTREMELY GOOD course for a student with a weak foundation. Through this course I learned not only the theory but also worked on practical projects. I do think I now understand batch normalization, hyperparameters, regularization, and more in the deep learning field! It will be very helpful for me as I step into AI!

Both the videos and the lectures are very important for newcomers to deep learning! THANKS A LOT!

By: Nouroz R A • Sep 13, 2017

This is one of the best MOOCs I have ever come across. Very informative, well explained, and easily put. This course helped me learn so many new things that I had missed in books and research papers. Thanks, Andrew Ng; I feel indebted to you. As a wannabe deep learning researcher/engineer, your contribution to helping me grasp the basic concepts will always be remembered. :-)

Yes, highly recommended.

By: Rohit • Jul 06, 2018

This course has really helped me a lot in gaining better insight into improving deep neural networks by tuning the required hyperparameters. It has also deepened my understanding of the previous course, and I would definitely recommend it. I would like to express my gratitude from the bottom of my heart to the Coursera team and the specialization team for such an amazing course.

By: XiaoLong L • Aug 14, 2017

After reading the Deep Learning book written by Ian Goodfellow, it was much easier for me to complete this course within two days. I've gotten a lot out of this course and now know more detail about deep learning hyperparameter tuning, regularization, and optimization methods. Thanks so much to Prof. Andrew and the TAs. I will keep going with the 3rd course in this Deep Learning Specialization.

By: Ram N • Jan 01, 2020

The course covers the theory and implementation details of advanced optimization algorithms. A good amount of intuition was provided in the explanation of these algorithms. A basic explanation of bias and variance, and how hyperparameters affect them both, is given clearly. I liked the hands-on part, as it allowed me to implement the algorithms discussed and gain more clarity in the process.

By: Harry ( D • Jul 21, 2018

Very useful follow-up to the first course in this specialization. I learned all the details of how to tune and optimize a deep neural network, plus a nice introduction to TensorFlow. There are some typos in the comments of the final assignments, but they were easy to spot. This time the Jupyter notebooks worked better than during the previous course, with few or no resets required.

By: Mukund C • Oct 15, 2019

Excellent Course. Really structured way of learning the importance of hyper parameters and their effects on the learning/training and hammering concepts like "regularization" home.

Just an observation, but it seems the mentors are not that engaged in these courses anymore; still, there are enough help threads that one can figure out the questions, specifically on the programming exercises.

By: Ayush K • Jun 16, 2018

What an amazing course this is. Perfect explanation of how we can optimize our cost more efficiently and effectively. This course also includes techniques to overcome problems like overfitting, i.e., regularization and dropout. The information about Batch Normalization is splendid. I also got a little intuition about TensorFlow. Thank you, Andrew Ng, for providing such a wonderful course.

By: colinyu • Jan 15, 2018

Prof. Ng is a great teacher and is good at making difficult material very easy to learn. I am very interested in DL. Before I took this class, I found that, since this field is very new, all the material you can find is piecemeal and unsystematic. This specialization is wonderful and systematic, easy to learn, and fun. Thanks for the great work these teachers have done.

By: Zhou S • Mar 08, 2018

Awesome illustration of deep networks' regularization techniques, weight initialization techniques, gradient checking, and more. This class provides hands-on experience with how to tune a deep network efficiently. You will not only learn the techniques but also understand many of the intuitions behind how each technique works. A must-take if you are dedicated to machine learning!
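The gradient checking this review mentions can be sketched as a quick numerical test: compare an analytic gradient against a centered finite-difference estimate. A minimal NumPy illustration (not course code; the function name and threshold are my own):

```python
import numpy as np

def grad_check(f, grad_f, x, eps=1e-7):
    """Relative difference between an analytic gradient grad_f(x) and a
    centered finite-difference estimate of the gradient of f at x."""
    num = np.zeros_like(x)
    for i in range(x.size):
        x_plus, x_minus = x.copy(), x.copy()
        x_plus[i] += eps
        x_minus[i] -= eps
        num[i] = (f(x_plus) - f(x_minus)) / (2 * eps)  # centered difference
    ana = grad_f(x)
    # values around 1e-7 or smaller suggest the analytic gradient is correct
    return np.linalg.norm(ana - num) / (np.linalg.norm(ana) + np.linalg.norm(num))
```

In practice this check is run once on a small network to validate a hand-written backprop implementation, then switched off, since it costs two forward passes per parameter.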

By: Rusty M • Dec 07, 2018

I learned a lot about the area that is not much talked about in deep learning, which is hyperparameter tuning! The forum was very helpful in debugging the programming assignments! Thank you Prof. Ng for the wonderful course. I thank Coursera as well for believing in me and granting me Financial Aid. It wouldn't have been possible without your help, Coursera Team. THANK YOU VERY MUCH! :D

By: Neeraj B • Oct 03, 2019

This was an excellent follow-up to the first course. Having used Adam optimization for almost all the neural network models I have built, it was great to understand the mathematical intuition behind Adam optimizers. The programming assignment also gave a wonderful refresher and practice of TensorFlow. Overall, I'm glad hyperparameter tuning and optimization was chosen as a separate course.

By: MANRAJ S C • Oct 16, 2019

The course is great and will help you understand how to optimize your deep learning algorithm and tune your hyperparameters. It also provides insight into the exponentially weighted averages concept, which helps you understand how things work behind the scenes when trying to optimize your algorithm. Dropout and regularization have also been explained to a good extent.
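The exponentially weighted averages this review refers to underlie Momentum, RMSprop, and Adam. A minimal pure-Python sketch (my own illustration, not course material), including the bias correction the course covers for fixing the cold start:

```python
def ewa(xs, beta=0.9, correct_bias=True):
    """Exponentially weighted average of a sequence:
    v_t = beta * v_{t-1} + (1 - beta) * x_t,
    optionally divided by (1 - beta**t) to correct the zero-initialized start."""
    v, out = 0.0, []
    for t, x in enumerate(xs, start=1):
        v = beta * v + (1 - beta) * x
        out.append(v / (1 - beta ** t) if correct_bias else v)
    return out
```

With beta = 0.9 this behaves roughly like an average over the last 10 values; without bias correction the first few outputs are biased toward zero.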

By: Chan-Se-Yeun • May 01, 2018

This course is very useful for practical purposes. I've learnt a systematic method to develop and iterate my algorithms, which saves me a lot of time. It's also the first time I've gotten to know so many variants of the gradient descent method, such as Adam and RMSprop. By the way, the programming assignments get a bit hard, but they helped me better understand the algorithms. Thanks a lot!

By: Andreea A • Feb 02, 2019

This was a useful course for newbies in neural networks. It gave useful hints regarding how to update the model one is using based on what problems one observes, as well as how to tune the hyperparameters (if there is enough computational power or one runs a small problem). Obviously, this is just a starting point and one should invest a lot of time and energy to become experienced.

By: Jay G • Sep 24, 2018

All the quality of the first course, but even better. My 4-star concerns from course one were addressed in these Jupyter notebooks. They were still manageable, but the prompts provided very good reinforcement of the various tuning algorithms. A top-notch offering, one I'll be sure to recommend broadly. I'm very much looking forward to the remaining courses in the Specialization. Thanks!

By: Sarthak k • Aug 12, 2019

I had a very good time in the teaching sessions from Andrew Ng. I am a second-year student and entered this field of deep learning a few months ago; then I encountered this specialization, and with Andrew Ng's deep treatment of the concepts I am now able to build much more complicated models than ever before. I hope I can get an autograph from my idol in this field,

Mr. Andrew Ng

By: SUJITH V • Oct 26, 2018

This is a great course to learn about practical aspects of neural networks. Some parts are challenging to consume as most of the material relies on intuition rather than detailed mathematical explanation. This helps to involve more people in the course who are intimidated by mathematical equations. A great addition would be to have optional mathematical details in separate videos.

By: Shangjin T • Mar 02, 2018

I've learnt much from this course, including preprocessing (mini-batch, regularization, normalization), gradient descent algorithms (batch gradient descent, stochastic gradient descent, mini-batch gradient descent) and their variants (momentum, RMSProp, Adam). There are also TensorFlow tutorials, which I love best.

Thanks to Andrew Ng for bringing us such an amazing fundamental DNN course!

By: sourabh • Oct 17, 2019

This course really helped me gain deep insight into the hyperparameters that need to be tuned for optimal learning, along with the different algorithms necessary for improving the learning rate. Andrew Ng really simplified the tough things and arranged them in a proper series of videos that is easy to understand. This will really help me a lot in the future. Cheers!

By: Danilo Đ • Dec 04, 2017

I suppose hyperparameter tuning, regularization, and optimization are some of the most important aspects of deep learning, since 90% of most DL projects come down to just that. Andrew masterfully dives into the intuitions behind some of the most widely used approaches, and the programming assignments are designed to show the impact good tuning can have on a DL algorithm.

By: Mohammed A • Jan 07, 2018

Great explanation of optimizations that can help speed up deep learning algorithms. Loved the little tips and tricks covered in different sections. The ease with which Prof. Ng explains complex concepts and analogies is commendable. The programming assignments are very helpful to people without expert programming experience too, which makes the experience very smooth.

By: Anirudh S • Nov 06, 2017

In my opinion, it would be good to have a short video describing how to drive an ML project in a company. As I am taking the ML course and this specialization, I started by working in Octave, then NumPy, then TensorFlow, so it would be good to have some advice/tips on when to use Octave, NumPy, or TensorFlow to build a model when you get an ML project in your job.

By: saad a • Oct 04, 2017

After the first course, this course is the one that is going to make you feel like a deep learning practitioner. You get to understand why deep learning is sometimes called an art, and how much difference in terms of speed and accuracy can be made just by tuning the hyperparameters. Highly recommended if you know deep neural networks and are willing to dive deeper into them.