Platform: Udemy
Language: English
Category: Data Science

NLP - Natural Language Processing in Python (Theory & Projects)

Natural Language Processing using spaCy, NLTK, and PyTorch, with hands-on projects

Rating: 4.80 (5 reviews)
Students:
Content: 23.5 hours
Last Update: Sep 2021
Regular Price:


What you will learn

• The importance of Natural Language Processing (NLP) in Data Science.

• The reasons to move from classical sequence models to deep learning-based sequence models.

• The essential concepts, explained from the absolute beginning and fully unpacked with examples in Python.

• Details of deep learning models for NLP with examples.

• A summary of the concepts of Deep Learning theory.

• Practical description and live coding with Python.

• A deep dive into PyTorch (the deep learning framework by Facebook).

• The use and applications of state-of-the-art NLP models.

• Building your own applications for automatic text generation and language translators.

• And much more…


Description

Comprehensive Course Description:

Natural Language Processing (NLP), a subdivision of Artificial Intelligence (AI), is the ability of a computer to understand human language the way it’s spoken and written. Human language is typically referred to as natural language.

Humans have different sensors: ears perform the function of hearing, and eyes perform the function of seeing. Similarly, computers have programs for reading text and microphones for collecting audio. Just as the human brain processes an input, a computer program processes a specific input, converting it into code that the computer understands.

This course, Natural Language Processing (NLP), Theory and Practice in Python, introduces you to the concepts, tools, and techniques of machine learning for text data. You will learn the elementary concepts as well as emerging trends in the field of NLP. You will also learn about the implementation and evaluation of different NLP applications using deep learning methods.

Why Use Python for NLP?

Python is the most preferred language for NLP thanks to its expansive tools and libraries for text analysis and computer-usable data extraction. This course will take you through numerous techniques for text pre-processing, from basics such as regular expressions and text normalization to complex topics such as string matching, language models, and word embeddings.
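To give a flavour of what that pre-processing looks like in practice, here is a minimal sketch (illustrative only, not taken from the course materials) that lowercases a sentence, strips punctuation with Python's re module, and tokenizes on whitespace; the sample sentence and the normalize function name are just placeholders.

    import re

    def normalize(text):
        # Toy pre-processing: case folding, punctuation removal, whitespace tokenization.
        text = text.lower()
        text = re.sub(r"[^a-z0-9\s']", " ", text)   # keep letters, digits, apostrophes
        return text.split()

    print(normalize("NLP is fun; it's also everywhere!"))
    # ['nlp', 'is', 'fun', "it's", 'also', 'everywhere']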

Most of the examples used to explain the algorithms are drawn from English, but the algorithms can be adapted to any language (there is no language/grammar dependency). You will get exposure to state-of-the-art packages (NLTK, Gensim, spaCy) as well as frameworks (PyTorch), along with extensive implementation-oriented content in Python. The main focus of the course is on preparing text data for machine learning models.

Although we have separate courses on deep learning, we briefly cover the relevant concepts here to make this course more self-contained.


How Is This NLP Course Different?

The course content is very specific and to the point. The learning material is a perfect blend of theoretical concepts and practical applications. Examples and sample code have been included to help you grasp each concept with more clarity. Each NLP concept is structured and presented in such a way that makes it easy for you to understand.

High-quality video content, compelling course material, assessment questions, course notes, and handouts are additional perks that you will get. You can contact our friendly team in case of any queries.

This course encourages you to make quick progress. At the end of each module, you will get an opportunity to revise everything you have learned through homework, tasks, and activities. These have been designed to evaluate, and further build on, the concepts and methods you have learned. Most of the assignments are coding-based and will help you get up and running with your own implementations.

The two mini-projects in the last module (Neural Machine/Language Translator, and Modify Language Translator a Bit and Build a Chatbot) focus on innovative applications in this field. These mini-projects help you apply the concepts of text pre-processing: you will use techniques such as part-of-speech tagging, lemmatization, and tokenization with Python libraries.
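As a small taste of the pre-processing those projects rely on, the sketch below (an illustration, not code from the course) shows tokenization, lemmatization, and part-of-speech tagging with spaCy; it assumes the small English model en_core_web_sm has been downloaded.

    import spacy

    # Assumes: python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")
    doc = nlp("The striped bats were hanging on their feet.")

    for token in doc:
        # surface form, dictionary form (lemma), and coarse part-of-speech tag
        print(token.text, token.lemma_, token.pos_)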

NLP has made tremendous advances in the last decade, and it’s made the leap from research labs to real-world applications. While getting started in this field can be a challenging pursuit, this course presents you with a clear and actionable roadmap. It makes the task of accomplishing your career goals that much easier.

This course is competitively priced and delivers value for money, as you can learn the concepts and methodologies of NLP at a relatively low cost. The series of brief videos and the detailed code notebooks shorten your learning curve.

Get started with this NLP course without delay!


Course Content:

This complete course consists of the following topics:

1. Introduction

a. Motivation

i. What is Natural Language Processing (NLP)?

ii. Why is NLP important?

iii. What is Neural Language Modeling?

iv. How are language models used in Speech recognition?

v. Chatbots

b. Software

i. SpaCy

ii. NLTK

iii. Gensim

iv. PyTorch

2. Text Pre-Processing

a. Regular Expressions

i. Regular Expression Patterns

b. Text Normalization

i. Word Tokenization

ii. Byte Pair Encoding

iii. Subwords

iv. Word Normalization, Lemmatization, and Stemming

v. Sentence Segmentation

c. String Matching

i. Edit Distance

ii. Minimum Edit Distance

iii. Dynamic Programming

iv. Implementation of Minimum Edit Distance in NumPy (see the sketch after this outline)

3. Word Embeddings

a. Language Models

i. Vocabulary

ii. Markov models

iii. N-Grams

iv. Novel Sequence Generation

v. Language Modeling Using One-Hot Vectors

vi. Limitations of One-Hot Encoding

b. Linear Subspaces for Word Embeddings

i. Term-Document Matrix

ii. TF-IDF

iii. Latent Semantic Analysis: SVD

iv. Word Co-occurrence Matrix

v. Word embeddings: SVD

vi. Limitations

c. Word2Vec

i. Skip-gram model

ii. Context and target sampling

iii. Hierarchical Softmax

iv. Negative Sampling

d. More on Embeddings

i. GloVe

ii. FastText

iii. BERT

e. Analogies

i. Cosine Similarity

ii. Examples of Analogies

iii. Bias in Embeddings

4. Natural Language Processing with Deep Learning

a. Neural Networks

b. Types of Recurrent Neural Networks

i. One to One

ii. One to Many

iii. Many to One

iv. Many to Many

v. Bi-directional RNNs

vi. Deep RNNs

c. Advanced RNN architectures for NLP

i. Encoder-decoder models

ii. Attention models

5. Projects

a. Neural Machine/Language Translator

b. Modify Language Translator a Bit and Build a Chatbot.
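For item 2.c.iv above (Implementation of Minimum Edit Distance in NumPy), the idea is the classic dynamic-programming table. A minimal sketch assuming unit insertion, deletion, and substitution costs (the course's own implementation may differ) looks like this:

    import numpy as np

    def min_edit_distance(source, target):
        # Levenshtein distance via a dynamic-programming table (unit costs assumed).
        n, m = len(source), len(target)
        D = np.zeros((n + 1, m + 1), dtype=int)
        D[:, 0] = np.arange(n + 1)        # cost of deleting all of source
        D[0, :] = np.arange(m + 1)        # cost of inserting all of target
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                sub = 0 if source[i - 1] == target[j - 1] else 1
                D[i, j] = min(D[i - 1, j] + 1,         # deletion
                              D[i, j - 1] + 1,         # insertion
                              D[i - 1, j - 1] + sub)   # substitution or match
        return int(D[n, m])

    print(min_edit_distance("intention", "execution"))   # 5 under unit costs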


After completing this course successfully, you will be able to:


• Apply the concepts to any language to build customized NLP models.

• Learn machine learning concepts in a more practical way.

• Understand the methodology of NLP using real datasets.


Who this course is for:

• Complete beginners to Natural Language Processing.

• People who want to upgrade their Python programming skills for NLP.

• Individuals who are passionate about numbers and programming.

• Data Scientists.

• Data Analysts.

• Machine Learning Practitioners.



Content

Introduction

Introduction to Course

Introduction to Instructor

Introduction to Co-Instructor

Course Introduction

Request for Your Honest Review

Links for the Course's Materials and Codes

Introduction(Regular Expressions)

What Is Regular Expression

Why Regular Expression

ELIZA Chatbot

Python Regular Expression Package

Meta Characters(Regular Expressions)

Meta Characters

Meta Characters Big Brackets Exercise

Meta Characters Big Brackets Exercise Solution

Meta Characters Big Brackets Exercise 2

Meta Characters Big Brackets Exercise 2 Solution

Meta Characters Cap

Meta Characters Cap Exercise 3

Meta Characters Cap Exercise 3 Solution

Backslash

Backslash Continued

Backslash Continued 01

Backslash Squared Brackets Exercise

Backslash Squared Brackets Exercise Solution

Backslash Squared Brackets Exercise Another Solution

Backslash Exercise

Backslash Exercise Solution And Special Sequences Exercise

Solution And Special Sequences Exercise Solution

Meta Character Asterisk

Meta Character Asterisk Exercise

Meta Character Asterisk Exercise Solution

Meta Character Asterisk Homework

Meta Character Asterisk Greedy Matching

Meta Character Plus And Question Mark

Meta Character Curly Brackets Exercise

Meta Character Curly Brackets Exercise Solution

Pattern Objects(Regular Expressions)

Pattern Objects

Pattern Objects Match Method Exercise

Pattern Objects Match Method Exercise Solution

Pattern Objects Match Method Vs Search Method

Pattern Objects Finditer Method

Pattern Objects Finditer Method Exercise Solution

More Meta Characters(Regular Expressions)

Meta Characters Logical Or

Meta Characters Beginning And End Patterns

Meta Characters Parenthesis

String Modification(Regular Expressions)

String Modification

Word Tokenizer Using Split Method

Sub Method Exercise

Sub Method Exercise Solution

Words and Tokens(Text Preprocessing)

What Is A Word

Definition Of Word Is Task Dependent

Vocabulary And Corpus

Tokens

Tokenization In Spacy

Sentiment Classification(Text Preprocessing)

Yelp Reviews Classification Mini Project Introduction

Yelp Reviews Classification Mini Project Vocabulary Initialization

Yelp Reviews Classification Mini Project Adding Tokens To Vocabulary

Yelp Reviews Classification Mini Project Look Up Functions In Vocabulary

Yelp Reviews Classification Mini Project Building Vocabulary From Data

Yelp Reviews Classification Mini Project One Hot Encoding

Yelp Reviews Classification Mini Project One Hot Encoding Implementation

Yelp Reviews Classification Mini Project Encoding Documents

Yelp Reviews Classification Mini Project Encoding Documents Implementation

Yelp Reviews Classification Mini Project Train Test Splits

Yelp Reviews Classification Mini Project Feature Computation

Yelp Reviews Classification Mini Project Classification

Language Independent Tokenization(Text Preprocessing)

Tokenization In Detail Introduction

Tokenization Is Hard

Tokenization Byte Pair Encoding

Tokenization Byte Pair Encoding Example

Tokenization Byte Pair Encoding On Test Data

Tokenization Byte Pair Encoding Implementation Get Pair Counts

Tokenization Byte Pair Encoding Implementation Merge In Corpus

Tokenization Byte Pair Encoding Implementation BPE Training

Tokenization Byte Pair Encoding Implementation BPE Encoding

Tokenization Byte Pair Encoding Implementation BPE Encoding One Pair

Tokenization Byte Pair Encoding Implementation BPE Encoding One Pair 1

Text Normalization(Text Preprocessing)

Word Normalization Case Folding

Word Normalization Lemmatization

Word Normalization Stemming

Word Normalization Sentence Segmentation

String Matching and Spelling Correction(Text Preprocessing)

Spelling Correction Minimum Edit Distance Intro

Spelling Correction Minimum Edit Distance Example

Spelling Correction Minimum Edit Distance Table Filling

Spelling Correction Minimum Edit Distance Dynamic Programming

Spelling Correction Minimum Edit Distance Pseudocode

Spelling Correction Minimum Edit Distance Implementation

Spelling Correction Minimum Edit Distance Implementation Bugfixing

Spelling Correction Implementation

Language Modeling

What Is A Language Model

Language Model Formal Definition

Language Model Curse Of Dimensionality

Language Model Markov Assumption And N-Grams

Language Model Implementation Setup

Language Model Implementation Ngrams Function

Language Model Implementation Update Counts Function

Language Model Implementation Probability Model Function

Language Model Implementation Reading Corpus

Language Model Implementation Sampling Text

Topic Modelling with Word and Document Representations

One Hot Vectors

One Hot Vectors Implementation

One Hot Vectors Limitations

One Hot Vectors Uses As Target Labeling

Term Frequency For Document Representations

Term Frequency For Document Representations Implementations

Term Frequency For Word Representations

TFIDF For Document Representations

TFIDF For Document Representations Implementation Reading Corpus

TFIDF For Document Representations Implementation Computing Document Frequency

TFIDF For Document Representations Implementation Computing TFIDF

Topic Modeling With TFIDF 1

Topic Modeling With TFIDF 3

Topic Modeling With TFIDF 4

Topic Modeling With TFIDF 5

Topic Modeling With Gensim

Word Embeddings LSI

Word Co-occurrence Matrix

Word Co-occurrence Matrix vs Document-term Matrix

Word Co-occurrence Matrix Implementation Preparing Data

Word Co-occurrence Matrix Implementation Preparing Data 2

Word Co-occurrence Matrix Implementation Preparing Data Getting Vocabulary

Word Co-occurrence Matrix Implementation Final Function

Word Co-occurrence Matrix Implementation Handling Memory Issues On Large Corpora

Word Co-occurrence Matrix Sparsity

Word Co-occurrence Matrix Positive Pointwise Mutual Information PPMI

PCA For Dense Embeddings

Latent Semantic Analysis

Latent Semantic Analysis Implementation

Word Semantics

Cosine Similarity

Cosine Similarity Getting Norms Of Vectors

Cosine Similarity Normalizing Vectors

Cosine Similarity With More Than One Vector

Cosine Similarity Getting Most Similar Words In The Vocabulary

Cosine Similarity Getting Most Similar Words In The Vocabulary Fixing Bug Of D

Cosine Similarity Word2Vec Embeddings

Words Analogies

Words Analogies Implementation 1

Words Analogies Implementation 2

Words Visualizations

Words Visualizations Implementation

Words Visualizations Implementation 2

Word2vec(Optional)

Static And Dynamic Embeddings

Self Supervision

Word2Vec Algorithm Abstract

Word2Vec Why Negative Sampling

Word2Vec What Is Skip Gram

Word2Vec How To Define Probability Law

Word2Vec Sigmoid

Word2Vec Formalizing Loss Function

Word2Vec Loss Function

Word2Vec Gradient Descent Step

Word2Vec Implementation Preparing Data

Word2Vec Implementation Gradient Step

Word2Vec Implementation Driver Function

Need of Deep Learning for NLP(NLP with Deep Learning DNN)

Why RNNs For NLP

Pytorch Installation And Tensors Introduction

Automatic Differentiation Pytorch

Introduction(NLP with Deep Learning DNN)

Why DNNs In Machine Learning

Representational Power And Data Utilization Capacity Of DNN

Perceptron

Perceptron Implementation

DNN Architecture

DNN Forwardstep Implementation

DNN Why Activation Function Is Required

DNN Properties Of Activation Function

DNN Activation Functions In Pytorch

DNN What Is Loss Function

DNN Loss Function In Pytorch

Training(NLP with Deep Learning DNN)

DNN Gradient Descent

DNN Gradient Descent Implementation

DNN Gradient Descent Stochastic Batch Minibatch

DNN Gradient Descent Summary

DNN Implementation Gradient Step

DNN Implementation Stochastic Gradient Descent

DNN Implementation Batch Gradient Descent

DNN Implementation Minibatch Gradient Descent

DNN Implementation In Pytorch

Hyperparameters(NLP with Deep Learning DNN)

DNN Weights Initializations

DNN Learning Rate

DNN Batch Normalization

DNN Batch Normalization Implementation

DNN Optimizations

DNN Dropout

DNN Dropout In Pytorch

DNN Early Stopping

DNN Hyperparameters

DNN Pytorch CIFAR10 Example

Introduction(NLP with Deep Learning RNN)

What Is RNN

Understanding RNN With A Simple Example

RNN Applications Human Activity Recognition

RNN Applications Image Captioning

RNN Applications Machine Translation

RNN Applications Speech Recognition Stock Price Prediction

RNN Models

Mini-project Language Modelling(NLP with Deep Learning RNN)

Language Modeling Next Word Prediction

Language Modeling Next Word Prediction Vocabulary Index

Language Modeling Next Word Prediction Vocabulary Index Embeddings

Language Modeling Next Word Prediction Rnn Architecture

Language Modeling Next Word Prediction Python 1

Language Modeling Next Word Prediction Python 2

Language Modeling Next Word Prediction Python 3

Language Modeling Next Word Prediction Python 4

Language Modeling Next Word Prediction Python 5

Language Modeling Next Word Prediction Python 6

Mini-project Sentiment Classification(NLP with Deep Learning RNN)

Vocabulary Implementation

Vocabulary Implementation Helpers

Vocabulary Implementation From File

Vectorizer

RNN Setup

RNN Setup 1

RNN in PyTorch(NLP with Deep Learning RNN)

RNN In Pytorch Introduction

RNN In Pytorch Embedding Layer

RNN In Pytorch Nn Rnn

RNN In Pytorch Output Shapes

RNN In Pytorch Gated Units

RNN In Pytorch Gated Units GRU LSTM

RNN In Pytorch Bidirectional RNN

RNN In Pytorch Bidirectional RNN Output Shapes

RNN In Pytorch Bidirectional RNN Output Shapes Separation

RNN In Pytorch Example

Advanced RNN models(NLP with Deep Learning RNN)

RNN Encoder Decoder

RNN Attention

Neural Machine Translation

Introduction To Dataset And Packages

Implementing Language Class

Testing Language Class And Implementing Normalization

Reading Datafile

Reading Building Vocabulary

EncoderRNN

DecoderRNN

DecoderRNN Forward Step

DecoderRNN Helper Functions

Training Module

Stochastic Gradient Descent

NMT Training

NMT Evaluation


Reviews

John, 21 August 2021

Your use of re.findall is not working.

    s = 'asdfjl;ajdf;la3534l2k3o;lkcagj;qi4touoq'
    L = re.findall('[0123456789]',s)
    print(L)
    len(L)

    ['3', '5', '3', '4', '2', '3', '4']
    TypeError: 'list' object is not callable   (raised at len(L))

    if len(re.findall('[0-9][0-9][0-9][0-9]','asdfjl;ajdf;la3534l2k3o;lkdagj;qi4touoq'))>0:
        print("Found")
    else:
        print("Not Found")

    TypeError: 'list' object is not callable   (raised at the len(...) call)

    s = 'asdfjl;ajdf;la3534l2k3o;lkcagj;qi4touoq'
    L = re.findall('[0-9][0-9][0-9][0-9]',s)
    print(L)
    if len(L)>0:
        print("Found")
    else:
        print("Not Found")

    ['3534']
    TypeError: 'list' object is not callable   (raised at if len(L)>0)

    documents = ['asdfj;laieorkdjf;aliejr;akjdf23k4j;lajds;l',
                 'asdfjoqweitulad;ai@weutadg;lajoetiuaodkgjier',
                 'asdkfjqoitlskdnfoqwiekhas;ioew=adgoie',
                 'askdfl_asdkfei_asdjkfla****askeasfff',
                 '{{{{{asdfjowei@@##askdfoie}}}}}']
    regExp = '[0-9:"{}()@#&]'
    for doc in documents:
        if len(re.findall(regExp,doc))>0:
            pass
        else:
            print(doc)

    TypeError: 'list' object is not callable   (raised at the len(...) call)
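A note on the errors above: each TypeError is raised at the len(...) call, not inside re.findall, which is the typical symptom of the built-in len having been rebound to a list earlier in the notebook session. The same pattern runs cleanly in a fresh interpreter; a minimal standalone check (illustrative, not from the course notebooks):

    import re

    s = 'asdfjl;ajdf;la3534l2k3o;lkcagj;qi4touoq'
    L = re.findall('[0-9][0-9][0-9][0-9]', s)
    print(L)            # ['3534']
    print(len(L) > 0)   # True, provided len is still the built-in function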

Thomas, 20 July 2021

I have a lot of confidence that this instructor is going to teach me everything I need to know in this area. I am looking forward to each lesson and I look forward to the day I can make an A.I. that will have conversations with me and my family.


Udemy ID: 4076522
Course created date: 5/25/2021
Course indexed date: 7/4/2021
Course submitted by: Bot