4.32 (763 reviews)
☑ Understand the history of BERT and why it has changed NLP more than any other algorithm in recent years
☑ Understand how BERT differs from other standard algorithms and is closer to how humans process language
☑ Use the tokenizing tools provided with BERT to preprocess text data efficiently
☑ Use the BERT layer as an embedding to plug into your own NLP model
☑ Use BERT as a pre-trained model and then fine-tune it to get the most out of it
☑ Explore the GitHub project from the Google Research team to get the tools we need
☑ Get models from TensorFlow Hub, the platform where you can find already-trained models
☑ Clean text data
☑ Create datasets for AI from that data
☑ Use Google Colab and TensorFlow 2.0 for your AI implementations
☑ Create custom layers and models in TF 2.0 for specific NLP tasks
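The tokenizing tools mentioned above are based on the WordPiece algorithm. As a rough illustration (not the actual `tokenization.py` from the Google Research repository), here is a minimal sketch of the greedy longest-match-first WordPiece splitting that BERT's tokenizer performs; the tiny vocabulary is invented for the example, whereas the real tokenizer loads a ~30,000-entry `vocab.txt` shipped with each pretrained checkpoint.

```python
# Minimal sketch of WordPiece tokenization (greedy longest-match-first),
# the sub-word algorithm behind BERT's tokenizer. TOY_VOCAB is a made-up
# stand-in for the real vocab.txt file.

TOY_VOCAB = {"[UNK]", "play", "##ing", "##ed", "the", "un", "##believ", "##able"}

def wordpiece(word, vocab=TOY_VOCAB, max_chars=100):
    """Split one lowercase word into WordPiece sub-tokens."""
    if len(word) > max_chars:
        return ["[UNK]"]
    tokens, start = [], 0
    while start < len(word):
        end, cur = len(word), None
        # Greedily take the longest vocabulary entry matching at `start`.
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # "##" marks a word-continuation piece
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return ["[UNK]"]  # no piece matches: the whole word is unknown
        tokens.append(cur)
        start = end
    return tokens

print(wordpiece("playing"))       # ['play', '##ing']
print(wordpiece("unbelievable"))  # ['un', '##believ', '##able']
print(wordpiece("xyz"))           # ['[UNK]']
```

Because rare words decompose into known sub-pieces, BERT can represent words it never saw during pre-training, which is one reason the tokenizer matters so much in practice.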
Dive deep into BERT's intuition and applications:
Suitable for everyone: We dive into the history of BERT from its origins, detailing every concept so that anyone can follow along and finish the course mastering this state-of-the-art NLP algorithm, even if you are new to the subject.
Powerful and disruptive: Learn the concepts behind BERT, getting rid of RNNs, CNNs and other heavy deep learning models to implement a more intuitive way of processing language that will suit a wide range of NLP purposes, including yours!
User-friendly and efficient: We’ve designed the course around the latest technologies, TensorFlow 2.0 and Google Colab, ensuring that you won’t have any local machine/software version/compatibility issues and that you are using the most up-to-date tools.
Welcome to the course
Course curriculum, Colab toolkit and data links
BERT - Intuition
What is BERT?
Old-fashioned seq2seq
Transformer general understanding
Application: using BERT's tokenizer
Application: using BERT as an embedder
Application: fine-tuning BERT to create a question answering system
Summary: how to use BERT
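All three applications in the curriculum feed BERT the same three inputs: token ids, an attention mask, and segment (token-type) ids. As a hedged sketch, independent of the course notebooks, here is how a question/context pair is typically packed with `[CLS]` and `[SEP]` markers, assuming the tokens were already produced by the tokenizer; the token-to-id mapping below is a toy stand-in for the real vocabulary lookup.

```python
# Sketch of packing a sentence pair into BERT's three input tensors:
# input_ids, input_mask (1 = real token, 0 = padding) and segment_ids
# (0 = first segment, 1 = second segment). toy_vocab is invented for
# illustration; real ids come from the checkpoint's vocab file.

def pack_pair(tokens_a, tokens_b, vocab, max_len=16):
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"] + tokens_b + ["[SEP]"]
    segment_ids = [0] * (len(tokens_a) + 2) + [1] * (len(tokens_b) + 1)
    input_ids = [vocab.get(t, vocab["[UNK]"]) for t in tokens]
    input_mask = [1] * len(input_ids)
    # Pad everything to max_len so batches have a fixed shape.
    pad = max_len - len(input_ids)
    input_ids += [0] * pad
    input_mask += [0] * pad
    segment_ids += [0] * pad
    return input_ids, input_mask, segment_ids

toy_vocab = {"[PAD]": 0, "[UNK]": 1, "[CLS]": 2, "[SEP]": 3,
             "who": 4, "wrote": 5, "bert": 6, "google": 7, "did": 8}

ids, mask, segs = pack_pair(["who", "wrote", "bert"], ["google", "did"], toy_vocab)
print(ids)   # [2, 4, 5, 6, 3, 7, 8, 3, 0, 0, 0, 0, 0, 0, 0, 0]
print(mask)  # [1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
print(segs)  # [0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
```

For the question-answering application, the question goes in segment 0 and the context passage in segment 1; for single-sentence tasks the second segment is simply omitted.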
Great course. It covers all the basics with enough detail to actually use BERT at the end. Well worth the time.
The slides in the theoretical part are just a handful of images in a video of ~20 minutes, without any example words or vectors to indicate what goes where, which makes it harder to follow than necessary.
Very informative course, covering the basics of BERT up to a good level of BERT applications. Thanks
Martin teaches this course in a wonderful way. Understood the basics of BERT along with some of its applications.
The explanations are quite thorough. Going through the code is great, too, but there is a little too much time spent on the "easy stuff" (like using pip, etc.)
Overall this course is very well put together and step by step increases the understanding of how to apply BERT to tackle different problems. Absolutely my preferred way to learn, so thank you for that! One thing to note is that the third application is very tied to SQuAD, and it would be great to have a session on how to transfer this knowledge to constructing a custom Q/A application from scratch, as one fears learning too much that is specific to SQuAD. I would also have preferred a session on how to fine-tune the training for the sentiment classification model, just to have seen it. That being said, I can highly recommend this course to anyone ready to start digging deeper into BERT
Would like to get a bit more detail on BERT customization. At least documentation pointers from Google.
I like this course! I appreciate the level of detail the instructor brought to the coding section, and his insight and experience sharing. Sometimes a couple of sentences of explanation saved me quite some time of web searching. I like the project examples with actual hands-on coding. I admit it was initially harder to follow since I am new to this area, but it gets easier, as the 3 projects are related to each other and carry over to the next project. I have a solid sense of having learned. The only thing is that a real transcript, rather than an auto-generated one, would help me understand some words and terminology more easily. This is one of the few courses I've actually completed end to end.
There are topics that need to be explained better, because there is a lot of code but, at the end, I'm not sure if I can apply this to a personal project. All courses from SDC have had an intuition part with a presentation deck that helps with understanding.
So far, just reviewing stuff I already know. BTW, I've seen a few spelling / typing errors in the slides. Update: the instructions in the course to get the data for the first practical section didn't work. I also didn't see any link to the notebook (I expected to have a template).
The course explains BERT at an overall average level. The application part of BERT is quite good, but the theory part could be more mathematical, and relevant research papers should have been referenced for further exploration.
I find it rather ambiguous when the instructor mentions the shape and dimensions of the layers in the network. I know he did mention it in the first part, but still, the learner has to go back and find the diagram, and that really affects the flow. I would suggest that while constructing the network, he add a diagram illustrating the dimensions of each layer.
I find it hard to follow; it's partly the French accent, but also a lot of theory that requires a lot of concentration and could benefit from shorter lectures and more examples to explain things.
I was hoping to learn everything about using BERT, but I haven't. The most important things (how to prepare data and what is returned by BERT) took just a few minutes without ANY visuals. Don't show your typing; show well-commented code and tell us why you wrote this and that. And don't use SQuAD: your students will have their own tasks!
The course is excellent as it details the state-of-the-art BERT algorithm on SQuAD 1.1. It would have been REALLY AWESOME if the course also included a further section on the newer SQuAD 2.0, and perhaps (please, PLEASE!) on the more general and more interesting topic of open-domain question answering. Kudos to the author and the team!!!