Back to Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

4.9

star rating

52,700 ratings

•

5,966 reviews

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow.
After 3 weeks, you will:
- Understand industry best-practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, Batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence.
- Understand new best-practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
This is the second course of the Deep Learning Specialization....
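One of the "tricks" listed above, inverted dropout regularization, can be sketched in a few lines of NumPy. This is an illustrative sketch, not the course's assignment code; the function name and shapes are made up for the example:

```python
import numpy as np

def dropout_forward(a, keep_prob, rng):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then scale the survivors by 1/keep_prob so the expected activation
    is unchanged (no rescaling is needed at test time)."""
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob, mask

rng = np.random.default_rng(0)
a = np.ones((4, 1000))                      # pretend layer activations
a_drop, mask = dropout_forward(a, keep_prob=0.8, rng=rng)
# Roughly 80% of units survive, and the mean activation stays near 1.0.
```

The division by `keep_prob` is what makes this the "inverted" variant: it keeps the scale of the activations consistent between training and inference.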

Jan 14, 2020

After completing this course I know which values to look at if my ML model is not performing up to the task. It is a detailed but not too complicated course for understanding the parameters used in ML.

Dec 24, 2017

Exceptional course; the hyperparameter explanations are excellent, and every tip and piece of advice provided helped me so much to build better models. I also really liked the introduction of TensorFlow.

Thanks.

Filter by:

By: Shangjin T

• Mar 02, 2018

I've learnt much from this course, including preprocessing (mini-batch, regularization, normalization), gradient descent algorithms (batch gradient descent, stochastic gradient descent, mini-batch gradient descent) and their variants (momentum, RMSProp, Adam). There are also TensorFlow tutorials, which I love best.

Thanks to Andrew Ng for bringing us such an amazing fundamental course on DNNs!
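The gradient descent variants this reviewer lists differ only in how they form the parameter update. As a minimal sketch (names and the toy objective are illustrative, not course code), gradient descent with momentum keeps an exponentially weighted average of past gradients:

```python
def momentum_step(w, grad, v, lr=0.1, beta=0.9):
    """One gradient-descent-with-momentum update:
    v <- beta * v + (1 - beta) * grad;  w <- w - lr * v."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

# Minimize the toy objective f(w) = w^2 (gradient 2w) from w = 5.0.
w, v = 5.0, 0.0
for _ in range(500):
    w, v = momentum_step(w, 2 * w, v)
# w converges toward the minimum at 0.
```

Plain (batch, stochastic, or mini-batch) gradient descent is the special case `beta = 0`; the averaging is what damps oscillations across steep dimensions.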

By: sourabh

• Oct 17, 2019

This course really helped me gain deep insight into the hyper-parameters that need to be tuned for optimal learning, along with the different algorithms necessary for improving the learning rate. Andrew Ng really simplified the tough things and arranged them in a proper series of videos that is easy to understand. This will really help me a lot in the future. Cheers!

By: Danilo Đ

• Dec 04, 2017

I suppose hyperparameter tuning, regularization, and optimization are some of the most important aspects of deep learning, since 90% of most DL projects come down to just that. Andrew masterfully dives into the intuitions behind some of the most widely used approaches, and the programming assignments are designed to show the impact good tuning can have on a DL algorithm.
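One concrete tuning habit the course teaches is sampling hyperparameters that span several orders of magnitude, such as the learning rate, uniformly on a log scale rather than a linear one. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def sample_learning_rates(n, low=1e-4, high=1e-1, rng=None):
    """Sample n learning rates uniformly on a log10 scale, so small
    values like 1e-4 are explored as thoroughly as large ones."""
    if rng is None:
        rng = np.random.default_rng()
    exponents = rng.uniform(np.log10(low), np.log10(high), size=n)
    return 10.0 ** exponents

rates = sample_learning_rates(5, rng=np.random.default_rng(42))
# All samples fall in [1e-4, 1e-1].
```

Sampling linearly in `[1e-4, 1e-1]` would spend almost all trials near `1e-1`; the log-scale trick is what makes random search over learning rates effective.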

By: Mohammed A

• Jan 07, 2018

Great explanation of optimizations that can help speed up deep learning algorithms. Loved the little tips and tricks that are covered in different sections. The ease with which Prof. Ng explains complex concepts and analogies is commendable. The programming assignments are very helpful to people without expert programming experience too, which makes the experience very smooth.

By: Anirudh S

• Nov 06, 2017

In my opinion it would be good to have a short video describing how to drive an ML project in a company. As I am taking the ML course and this specialization, I started working in Octave, then numpy, then tensorflow, so it would be good to have some advice/tips on when to use Octave, numpy, or tensorflow for building a model when you get an ML project in your job.

By: Saad A

• Oct 04, 2017

After the first course, this is the one that is going to make you feel like a deep learning practitioner. You get to understand why deep learning is sometimes called an art, and how much difference in speed and accuracy can be made just by tuning the hyper-parameters. Highly recommended if you know deep neural networks and are willing to dive deeper into them.

By: xun y

• Apr 07, 2019

Again, a great course about deep learning. The course structure is very well defined, building technical foundations step by step in the beginning and later using an open source deep learning framework to connect all the pieces together. Dr. Andrew Ng made all of it very easy to learn, and sometimes I feel like I should jump out of the comfort zone he created for us.

By: Willismar M C

• May 22, 2018

Very nice course on important subjects of vanilla neural networks, such as optimization algorithms, regularization methods, the hyper-parameters used, and how to implement them in practice. A very nice chapter in the sequence of the specialization that gave me an understanding of important aspects of it, how to use them, and how to implement them. I really enjoyed every detail of it.

By: Bharath S

• Jul 08, 2019

This course gives a very good idea of the overfitting problem in deep learning and different ways to overcome it. It also introduces commonly used optimization methods in deep learning. A nice introduction to tensorflow is provided in the last week's programming assignment. Overall it is a very satisfying course. Many thanks to the instructor and the entire team!

By: Hari K M

• Jan 04, 2018

A key course in the specialization, covering a wide array of topics responsible for improving DNNs. More complicated than the first course, but very well explained by Andrew Ng. Things definitely get clearer after doing the programming assignments. One should definitely complete this course if one has already completed the first. I totally recommend it.

By: Bilal A

• Jan 12, 2020

The course was amazing, the content was amazing, and the assignments were amazing.

Andrew Ng is the best teacher I have ever experienced in my life. I learned a lot from this course; these things are very difficult to learn from research papers and take a lot of time, but a person with a great passion for deep learning can learn all of them in just three weeks. Highly recommended.

By: Hiep P

• Nov 29, 2017

In the era of the deep learning boom, knowing how to control a network model is important. And this course has it all, from tuning the learning rate to speed up convergence, to applying dropout to avoid overfitting, etc. It shows you the under-the-hood theory and gives you the knowledge to grasp the basics yourself and actually apply them back in your projects.

By: WALEED E

• Jan 08, 2019

The course is very useful for getting acquainted with tuning hyper-parameters and modern optimization algorithms like momentum, RMSProp, and Adam. It also introduces how to prevent over-fitting efficiently, drawing on recent papers, in addition to mini-batching the training data. Although it introduces TensorFlow only briefly, the overall assessment needs some revision.
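For reference, the Adam update mentioned here combines a momentum-style first moment with an RMSProp-style second moment, plus bias correction for both. A minimal NumPy sketch on a toy objective (illustrative, not library or course code):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias-corrected moment estimates (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (RMSProp)
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize the toy objective f(w) = w^2 (gradient 2w) from w = 5.0.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t)
```

The bias-correction terms matter early on, when the moving averages are still warming up from their zero initialization; without them the first steps would be much too small.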

By: Ruthuparna K

• Jul 09, 2020

Gives you an in-depth understanding of how to fine-tune your neural network hyperparameters and introduces you to the various optimization methods. Finally, an introduction to TensorFlow offers a more practical way to develop your code quickly and easily. Yet again, Andrew Ng is nothing short of brilliant, and his ML content is always the best in the world.

By: 石啸

• Feb 16, 2020

I strongly recommend this course since I passed an interview after finishing the first and second courses of the specialization. Although it is not enough for some high-demand companies, it is a really good lecture and experience for beginners in neural networks. But I have to say that the projects are too easy so far; I wish we had more great exercises and projects!

By: Saimur R A

• Aug 02, 2020

This course truly goes deeper into deep learning, and I learned a lot of things that improved my understanding of neural networks. Andrew gave excellent lessons like in the first course and simplified everything, and his quote, "if you don't understand anything, don't worry too much about it", really makes sense; over time the concepts get clearer.

By: Jaime A

• Sep 08, 2017

Very clear, straight-to-the-point explanations with very well guided programming assignments in Python to hammer home the concepts. A lot of knowledge and experience condensed into just a few hours and materials. I recommend previous exposure to Python and machine learning to make the most of this course (Ng's Coursera course provides a very solid foundation).

By: Amaranath B

• Oct 13, 2019

This is an amazing course; the way they designed the transition from numpy to tensorflow was amazing. The concepts from gradient descent with momentum to the Adam optimizer were great coming after your previous course; I can't express how much this has grounded my understanding. I'm pushing myself to complete the specialization. Thanks a lot, everyone!

By: Naveen K

• Sep 25, 2017

The course is very well structured. Can't think of any improvement in the course structure. I would like to thank Andrew Sir for this great effort.

As an improvement, it would be great if people could be encouraged to solve problems on different datasets on the internet, such as Kaggle. Such sources, along with other help, could be provided as work to do after completing the course.

By: Daniel V I

• Feb 09, 2020

A fine continuation of the previous course in this specialization.

Learning optimization algorithms to improve our parameter updates, how to normalize the inputs at each and every layer, and how to prioritize certain hyperparameters over others when testing.

All culminating with Tensorflow, a platform that saves us a lot of time in programming Neural Networks.
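The per-layer input normalization this reviewer mentions is batch normalization. A minimal training-time forward pass might look like the following NumPy sketch (the function name and shapes are illustrative assumptions, not course code):

```python
import numpy as np

def batchnorm_forward(z, gamma, beta, eps=1e-5):
    """Normalize pre-activations z over the batch axis to zero mean and
    unit variance, then scale and shift with learned parameters gamma
    and beta (training-time path; inference uses running statistics)."""
    mu = z.mean(axis=0)
    var = z.var(axis=0)
    z_norm = (z - mu) / np.sqrt(var + eps)
    return gamma * z_norm + beta

rng = np.random.default_rng(1)
z = rng.normal(loc=3.0, scale=2.0, size=(64, 8))   # batch of 64, 8 units
out = batchnorm_forward(z, gamma=np.ones(8), beta=np.zeros(8))
# Per-feature mean is ~0 and variance ~1 after normalization.
```

With `gamma = 1` and `beta = 0` the output is simply the standardized input; learning those two parameters lets the network undo the normalization wherever that helps.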

By: GAURAB B

• Jun 19, 2019

Brilliant material altogether; almost a compulsory course for researchers diving into the ocean of deep learning. While I was reading papers on deep learning I came across all these terms but couldn't understand them. Now the picture is pretty clear. Thanks, Prof. Andrew Ng, for this wonderful effort. I have already recommended this course to everyone.

By: zhijun l

• Dec 06, 2018

A great course that covers the details of building neural networks. With the first course as a foundation, students taking this will definitely get a better understanding of hyperparameter tuning and optimization, in addition to training neural networks. I recommend this course to those who would like to know neural networks beyond just the concepts!

By: shaila a

• Jul 26, 2020

The details covered in the course are very important for practical use. They are not commonly available on the Internet otherwise. Also, with the new libraries that make the task of coding easier, the knowledge of tuning parameters and optimizing learning curves is often overlooked. This course highlights the importance of that knowledge. Thank you!

By: Oliver M

• Aug 14, 2017

Having completed Udacity 730 on Tensorflow, I found Andrew Ng filled crucial gaps in my understanding. He is not afraid of presenting some maths to build intuition, but he always presents it in a straightforward way. Compare his explanation of Adam optimisation with the source paper on the subject. Andrew boils it down and serves it up beautifully.

By: Adail M R

• Sep 13, 2017

Once more, Prof. Ng shows in his simple style how to tackle the tough subject of hyperparameter tuning, pointing to several techniques and helping us select the most appropriate ones for the task at hand. The TensorFlow introduction is also very effective and engaging! Looking forward to advancing my knowledge and experience in the next courses!