By: Mona H•
He does not explain anything about the logic behind what he is doing. It felt as if for an hour and a half I was just typing the same things he did, without much understanding of what I was doing.
By: Nattapat S•
Lack of explanation. He just shows how to code but doesn't explain how each line of code works. The coding website is not easy to use. A noise like a motorcycle appears in the clips.
By: MOGAN P K S•
More explanation of the functions and libraries used would make this project better.
By: Shanshan W•
The instructor explains very well how to use BERT to train a sentiment classifier. Very cool project.
By: C K•
I didn't expect syntax errors after paying 7 pounds, no matter how minor they were - I had to correct two variable names myself. The instructor should be more careful about coding errors during the recording rather than repeatedly apologizing - why not record again? I agree that his effort to create his own algorithm deserves to be paid for, but I'm not sure why he neglects the presentation. As long as it is published publicly (for a charge), please pay attention to the details under your name.
By: Ganapathy ( K•
I really did not like using Rhyme. I preferred to use the Jupyter notebook like in the other exercises. Also, the instructor made mistakes in the code, which was distracting. He also did not explain the intuition and concepts very well. Aspire to teach like Andrew Ng.
By: John B•
I found the course useful. It provides a hands-on, working example of a BERT implementation. The finished model can be downloaded, and training it for your own purposes would not be too big a leap. A grasp of Python and neural networks is needed.
By: Yesica C•
The course is very informative; basic machine learning concepts are needed, and there is a full explanation of each line of code. I really liked this project and will definitely use it at work.
By: Fiodar R•
Clean, clear and helpful. Thanks a lot!
Would also be nice to see the approaches to tune BERT for the particular task (e.g. custom tokenization, pre-processing of data, etc.)
By: Phan C N•
Like others have said, this course lacks a lot of explanation of why you should do this and that, and of what it is we are actually doing. If you have some background in NLP and sentiment analysis, you can understand why the lecturer is doing what he does, but otherwise you might feel confused. There is additional reading from the PyTorch manual and about BERT, and you can google the concepts yourself, so it's not impossible to understand, but you need to spend significantly more time on your reading.
The project itself, and the way he carries it out, is pretty good. I think the quality of the project is high, and if you manage to make something similar, you can show it as a good personal project.
By: Ravinder S•
Ari Anastassiou has done very well to keep it crisp and has taken great care in explaining the implementation. His style is lucid and sincere. I would recommend this short course to anyone who needs an introduction to this heavy concept in a simple and less intimidating manner. Nice work by Ari! I would love to see a pithy tutorial from him (maybe 30 mins) explaining the concepts of BERT as well. That would make it a PERFECT 10 for me. Thank you!
By: TADONFOUET T L•
The project and the concepts presented were good, but more explanation of the libraries and comments on some lines of code would be welcome. Despite this, I learned a lot.
By: Unnmesh M•
There could have been more explanation about the libraries, and modules 6, 7, 8 and 9 could have been covered more deeply.
By: Vaibhav D•
I recommend this project to newcomers and freshers in the field of ML.
By: Siao S•
The instructor should be better prepared, instead of making so many small typing/coding mistakes.
Also, a little enthusiasm would be nice.
By: Ali A•
More theory is needed. Also, some benchmarks could be added to show in which ways BERT outperforms other models.
By: Vaibhav J•
Could have been better.
By: Matheus S•
Great instructor! The comments about theory and shortcut commands were right on point. After finishing the project, you'll have a good foundation, both in code and in theory, to fine-tune attention models for other specific tasks. Kudos to all involved!
By: Oleksandra P•
Great course! This was my first time participating in guided projects. The topic is relevant to my job, so it was very useful to go through building models using BERT with the instructor. I had issues with Rhyme, though.
By: Federico C•
The instructor is excellent. Value for money is very high. I would recommend offering the coding session on Google Colab in the future. The virtual machine is a nice idea, but it is not as convenient as having the code in Colab.
By: Grace G N B•
Thanks to Mr. Ari Anastassiou.
Sentiment Analysis with Deep Learning using BERT has really been a wonderful project. Enjoyed it!
By: Sanathraj N•
Thanks, Ari Anastassiou for the wonderful tutorial. Hoping you do a complete course on NLP using BERT soon.
By: Patil B•
A very effective course for understanding the concept of sentiment analysis using deep learning. Thank you, team!