Platform: Udemy
Language: English
Category: Data Science
Natural Language Processing: NLP With Transformers in Python

Learn next-generation NLP with transformers for sentiment analysis, Q&A, similarity search, NER, and more

Rating: 4.60 (170 reviews)

Content: 11.5 hours
Last update: Jun 2021


What you will learn

Industry-standard NLP using transformer models

Build full-stack question-answering transformer models

Perform sentiment analysis with transformer models in PyTorch and TensorFlow

Advanced search technologies like Elasticsearch and Facebook AI Similarity Search (FAISS)

Create fine-tuned transformer models for specialized use-cases

Measure performance of language models using advanced metrics like ROUGE

Vector-building techniques like BM25 or Dense Passage Retrieval (DPR)

An overview of recent developments in NLP

Understand attention and other key components of transformers

Learn about key transformers models such as BERT

Preprocess text data for NLP

Named entity recognition (NER) using spaCy and transformers

Fine-tune language classification models


Description

Transformer models are the de facto standard in modern NLP. They have proven themselves to be the most expressive and powerful models for language by a large margin, beating all major language benchmarks time and time again.

In this course, we cover everything you need to know to get started building cutting-edge NLP applications using transformer models like Google AI's BERT and Facebook AI's DPR.

We cover several key NLP frameworks including:

  • HuggingFace's Transformers

  • TensorFlow 2

  • PyTorch

  • spaCy

  • NLTK

  • Flair

And learn how to apply transformers to some of the most popular NLP use-cases:

  • Language classification/sentiment analysis

  • Named entity recognition (NER)

  • Question answering

  • Similarity/comparative learning

For each of these use-cases we work through a variety of examples to ensure we understand what transformers can do, how they work, and why they are so important. Alongside these sections we also work through two full-size NLP projects: one performing sentiment analysis on financial Reddit data, and another building a fully-fledged open-domain question-answering application.
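
As a taste of how approachable modern tooling makes this, here is a minimal sentiment-analysis sketch using HuggingFace's pipeline API; the default checkpoint (a DistilBERT model fine-tuned on SST-2) is a library detail that may change between versions:

    from transformers import pipeline

    # High-level API: downloads a default sentiment model on first use
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers have made NLP development far easier."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.999...}]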

All of this is supported by several other sections that help us learn how to better design, implement, and measure the performance of our models, such as:

  • History of NLP and where transformers come from

  • Common preprocessing techniques for NLP

  • The theory behind transformers

  • How to fine-tune transformers

We cover all this and more. I look forward to seeing you in the course!


Content

Introduction

Introduction

Course Overview

Environment Setup

CUDA Setup

NLP and Transformers

The Three Eras of AI

Pros and Cons of Neural AI

Word Vectors

Recurrent Neural Networks

Long Short-Term Memory

Encoder-Decoder Attention

Self-Attention

Multi-head Attention

Positional Encoding

Transformer Heads
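
As a taste of this section's material, here is a minimal NumPy sketch of the sinusoidal positional encoding from "Attention Is All You Need"; the function name and toy dimensions are illustrative, not taken from the course:

    import numpy as np

    def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
        """Sinusoidal positional encodings from 'Attention Is All You Need'."""
        pos = np.arange(seq_len)[:, None]        # token positions (seq_len, 1)
        i = np.arange(d_model)[None, :]          # embedding dims (1, d_model)
        angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
        pe = np.zeros((seq_len, d_model))
        pe[:, 0::2] = np.sin(angles[:, 0::2])    # even dimensions use sine
        pe[:, 1::2] = np.cos(angles[:, 1::2])    # odd dimensions use cosine
        return pe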

Preprocessing for NLP

Stopwords

Tokens Introduction

Model-Specific Special Tokens

Stemming

Lemmatization

Unicode Normalization - Canonical and Compatibility Equivalence

Unicode Normalization - Composition and Decomposition

Unicode Normalization - NFD and NFC

Unicode Normalization - NFKD and NFKC
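
The Unicode normalization lectures revolve around what Python's standard unicodedata module exposes; a minimal sketch of canonical (NFC/NFD) and compatibility (NFKC/NFKD) forms:

    import unicodedata

    composed = "é"                                       # one code point, U+00E9
    decomposed = unicodedata.normalize("NFD", composed)  # 'e' + combining accent
    assert len(composed) == 1 and len(decomposed) == 2
    assert unicodedata.normalize("NFC", decomposed) == composed

    # NFKD/NFKC additionally fold compatibility characters to plain forms
    print(unicodedata.normalize("NFKD", "ℌ"))            # -> 'H'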

Attention

Attention Introduction

Alignment With Dot-Product

Dot-Product Attention

Self-Attention

Bidirectional Attention

Multi-head and Scaled Dot-Product Attention
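
The core operation of this section fits in a few lines; a minimal NumPy sketch of scaled dot-product attention (toy shapes, no masking or multiple heads):

    import numpy as np

    def scaled_dot_product_attention(q, k, v):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
        d_k = q.shape[-1]
        scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)  # alignment scores
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
        return weights @ v                              # weighted sum of values

    q = k = v = np.random.rand(4, 8)  # 4 tokens, d_k = 8: self-attention
    out = scaled_dot_product_attention(q, k, v)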

Language Classification

Introduction to Sentiment Analysis

Prebuilt Flair Models

Introduction to Sentiment Models With Transformers

Tokenization And Special Tokens For BERT

Making Predictions
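
A minimal sketch of the tokenize-then-predict flow this section covers, assuming an off-the-shelf checkpoint purely for illustration (any sequence-classification model follows the same pattern):

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    name = "distilbert-base-uncased-finetuned-sst-2-english"  # illustrative
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name)

    # The tokenizer inserts the model-specific special tokens for us
    tokens = tokenizer("markets rallied after earnings", return_tensors="pt")
    with torch.no_grad():
        logits = model(**tokens).logits
    probs = torch.softmax(logits, dim=-1)               # class probabilities
    label = model.config.id2label[int(probs.argmax())]  # e.g. 'POSITIVE'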

[Project] Sentiment Model With TensorFlow and Transformers

Project Overview

Getting the Data (Kaggle API)

Preprocessing

Building a Dataset

Dataset Shuffle, Batch, Split, and Save

Build and Save

Loading and Prediction

Long Text Classification With BERT

Classification of Long Text Using Windows

Window Method in PyTorch
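
The window method amounts to chunking a long token sequence into overlapping spans that each fit the model's limit; a minimal sketch (510 leaves room for [CLS] and [SEP] in BERT's 512-token budget, and the 50% stride is an illustrative choice, not a fixed rule):

    def window_split(token_ids, window=510, stride=255):
        """Split a long token sequence into overlapping windows."""
        windows = []
        for start in range(0, len(token_ids), stride):
            windows.append(token_ids[start:start + window])
            if start + window >= len(token_ids):
                break
        return windows

    # Per-window logits can then be averaged to classify the whole document.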

Named Entity Recognition (NER)

Introduction to spaCy

Extracting Entities

Authenticating With The Reddit API

Pulling Data With The Reddit API

Extracting ORGs From Reddit Data

Getting Entity Frequency

Entity Blacklist

NER With Sentiment

NER With RoBERTa
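
A minimal sketch of the spaCy side of this section; the example text is invented and the extracted entities vary with the pipeline version:

    import spacy

    nlp = spacy.load("en_core_web_sm")  # small English pipeline (pip-installed)
    doc = nlp("ARK Invest added more Tesla after chatter on r/investing.")
    orgs = [ent.text for ent in doc.ents if ent.label_ == "ORG"]
    print(orgs)  # e.g. ['ARK Invest', 'Tesla']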

Question Answering

Open Domain and Reading Comprehension

Retrievers, Readers, and Generators

Intro to SQuAD 2.0

Processing SQuAD Training Data

(Optional) Processing SQuAD Training Data with Match-Case

Our First Q&A Model
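
A first Q&A model can be a few lines with an off-the-shelf reader; this sketch assumes the deepset/roberta-base-squad2 checkpoint, a reader trained on SQuAD 2.0:

    from transformers import pipeline

    qa = pipeline("question-answering", model="deepset/roberta-base-squad2")
    result = qa(
        question="What does a retriever do?",
        context="In open-domain QA, a retriever fetches relevant documents "
                "and a reader extracts the answer span from them.",
    )
    print(result["answer"])  # e.g. 'fetches relevant documents'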

Metrics For Language

Q&A Performance With Exact Match (EM)

ROUGE in Python

Applying ROUGE to Q&A

Recall, Precision and F1

Longest Common Subsequence (LCS)

Q&A Performance With ROUGE
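
Exact match and token-level F1 are simple enough to sketch from scratch; this simplified version skips the article/punctuation normalization used in the official SQuAD script:

    from collections import Counter

    def exact_match(prediction: str, truth: str) -> int:
        """1 if the strings match after lowercasing and stripping, else 0."""
        return int(prediction.strip().lower() == truth.strip().lower())

    def token_f1(prediction: str, truth: str) -> float:
        """Harmonic mean of token-level precision and recall."""
        pred_tokens = prediction.lower().split()
        truth_tokens = truth.lower().split()
        overlap = sum((Counter(pred_tokens) & Counter(truth_tokens)).values())
        if overlap == 0:
            return 0.0
        precision = overlap / len(pred_tokens)
        recall = overlap / len(truth_tokens)
        return 2 * precision * recall / (precision + recall)

    print(token_f1("in the kitchen", "the kitchen"))  # 0.8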

Reader-Retriever QA With Haystack

Intro to Retriever-Reader and Haystack

What is Elasticsearch?

Elasticsearch Setup (Windows)

Elasticsearch Setup (Linux)

Elasticsearch in Haystack

Sparse Retrievers

Cleaning the Index

Implementing a BM25 Retriever

What is FAISS?

FAISS in Haystack

What is DPR?

The DPR Architecture

Retriever-Reader Stack
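
Independent of Haystack, the FAISS primitive underneath dense retrieval is small; a minimal sketch with random stand-in vectors (768 dimensions, matching BERT-sized encoders):

    import faiss
    import numpy as np

    d = 768                                                  # embedding size
    doc_vectors = np.random.rand(1000, d).astype("float32")  # stand-in docs
    query = np.random.rand(1, d).astype("float32")           # stand-in query

    index = faiss.IndexFlatIP(d)          # exact inner-product (dot) search
    index.add(doc_vectors)
    scores, ids = index.search(query, 5)  # top-5 nearest documents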

[Project] Open-Domain QA

ODQA Stack Structure

Creating the Database

Building the Haystack Pipeline

Similarity

Introduction to Similarity

Extracting The Last Hidden State Tensor

Sentence Vectors With Mean Pooling

Using Cosine Similarity

Similarity With Sentence-Transformers
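
A minimal sketch of mean pooling plus cosine similarity, assuming the sentence-transformers/all-MiniLM-L6-v2 checkpoint purely for illustration (any BERT-like encoder exposes last_hidden_state the same way):

    import torch
    from transformers import AutoTokenizer, AutoModel

    name = "sentence-transformers/all-MiniLM-L6-v2"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)

    sentences = ["the weather is lovely today", "it is sunny outside"]
    tokens = tokenizer(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**tokens).last_hidden_state  # (batch, seq_len, dim)

    # Mean pooling: average token embeddings, masking out padding positions
    mask = tokens["attention_mask"].unsqueeze(-1).float()
    embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)

    score = torch.nn.functional.cosine_similarity(embeddings[:1], embeddings[1:])
    print(float(score))  # closer to 1.0 means more similar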

Fine-Tuning Transformer Models

Visual Guide to BERT Pretraining

Introduction to BERT For Pretraining Code

BERT Pretraining - Masked-Language Modeling (MLM)

BERT Pretraining - Next Sentence Prediction (NSP)

The Logic of MLM

Fine-tuning with MLM - Data Preparation

Fine-tuning with MLM - Training

Fine-tuning with MLM - Training with Trainer

The Logic of NSP

Fine-tuning with NSP - Data Preparation

Fine-tuning with NSP - DataLoader

The Logic of MLM and NSP

Fine-tuning with MLM and NSP - Data Preparation
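
The heart of MLM data preparation is random masking; a minimal sketch (the 15% rate follows the original BERT paper, and the token-skipping details are a common simplification rather than the course's exact code):

    import torch
    from transformers import BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    batch = tokenizer(["the quick brown fox jumps over the lazy dog"],
                      return_tensors="pt", padding=True, truncation=True)
    input_ids = batch["input_ids"]
    labels = input_ids.clone()        # the unmasked tokens become the targets

    # Mask ~15% of tokens, never touching [CLS], [SEP], or [PAD]
    rand = torch.rand(input_ids.shape)
    mask = ((rand < 0.15)
            & (input_ids != tokenizer.cls_token_id)
            & (input_ids != tokenizer.sep_token_id)
            & (input_ids != tokenizer.pad_token_id))
    input_ids[mask] = tokenizer.mask_token_id

    # input_ids and labels can now feed a BertForMaskedLM forward pass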


Reviews

Serge, 14 July 2021

Strange exercises: you can submit a task only once, in a notebook, and it then has to be reviewed by other students. This is very wrong; reviewing should be automatic (unit tests), and it should be possible to submit many attempts, otherwise how do you learn new things? That said, the material is very up to date, and I learned modern frameworks (e.g. Haystack). The author of the course has a good blog, and I will keep working through his articles, running and modifying the original code.

Christopher, 8 July 2021

Exceptional course; James is an expert at NLP. Out of the numerous NLP courses I've taken, this is the best one.

Nicholas, 30 June 2021

This course covered a lot of really interesting topics. However, it felt like more of an overview. Personally, I would have preferred a deeper dive into individual topics rather than such a broad approach.

Trading, 4 June 2021

I am a course creator as well, and this is one of the best courses I have taken on Udemy. I strongly recommend it.


Coupons

Date        Discount    Status
6/5/2021    100% OFF    Expired

Udemy ID: 3754106
Course created: 1/6/2021
Course indexed: 6/5/2021
Submitted by: Bot