Platform: Udemy
Language: English
Category: Data Science

Create a Text Generation Web App with 100% Python (NLP)

Harness GPT-Neo -- a natural language processing (NLP) text generation model. Demonstrate it with a 100% Python web app

Rating: 4.45 (19 reviews)
Content: 2 hours
Last Update: Jul 2021


What you will learn

How to implement state-of-the-art text generation AI models

Background information about GPT-Neo, a state-of-the-art text generation NLP model

How to use Happy Transformer -- a Python library for implementing NLP Transformer models

How to train/implement GPT-2

How to implement different text generation algorithms

How to fetch data using Hugging Face's Datasets library

How to train GPT-Neo using Happy Transformer

How to create a web app with 100% Python using Anvil

How to host a Transformer model on Paperspace


Description

GPT-3 is a state-of-the-art text generation natural language processing (NLP) model created by OpenAI. You can use it to generate text that resembles text generated by a human.

This course will cover how to create a web app that uses an open-source version of GPT-3 called GPT-Neo with 100% Python. That’s right: no HTML, JavaScript, CSS or any other web technology is required. Just 100% Python!


You will learn how to:

  1. Implement GPT-Neo (and GPT-2) with Happy Transformer

  2. Train GPT-Neo to generate unique text for a specific domain

  3. Create a web app using 100% Python with Anvil!

  4. Host your language model using Google Colab and Paperspace
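The course covers several decoding strategies (greedy, generic sampling, and top-k sampling). As a rough illustration of the difference — this is my own toy sketch over a hypothetical five-token vocabulary, not code from the course:

```python
import random

# Toy next-token distribution: token -> probability (sums to 1).
probs = {"the": 0.5, "a": 0.2, "cat": 0.15, "dog": 0.1, "ran": 0.05}

def greedy(probs):
    """Greedy decoding: always pick the single most likely token."""
    return max(probs, key=probs.get)

def top_k_sample(probs, k, rng=random):
    """Top-k sampling: keep only the k most likely tokens,
    renormalize their probabilities, and sample from them."""
    top = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    total = sum(p for _, p in top)
    r = rng.random() * total
    for token, p in top:
        r -= p
        if r <= 0:
            return token
    return top[-1][0]

print(greedy(probs))             # always "the"
print(top_k_sample(probs, k=2))  # "the" or "a", chosen at random
```

Generic (unrestricted) sampling is the special case where k equals the vocabulary size; greedy is the limit k=1.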


Installations:

NONE! All of the tools we use in this tutorial are web-based: Google Colab, Anvil and Paperspace. So whether you’re on Mac, Windows or Linux, you won’t have to worry about downloading any software.


Technologies:

  1. Model: GPT-Neo -- an open-source version of GPT-3 created by Eleuther AI

  2. Framework: Happy Transformer -- an open-source Python package that allows us to implement and train GPT-Neo with just a few lines of code

  3. Web technologies: Anvil -- a platform that allows us to develop web apps using Python

  4. Backend technologies: We’ll cover how to use both Google Colab and Paperspace to host the model. Anvil automatically covers hosting the web app.

About the instructor:

My name is Eric Fillion, and I’m from Canada. I’m on a mission to make state-of-the-art NLP accessible by creating open-source tools and educational content. In early 2020, I led a team that launched an open-source Python package called Happy Transformer. Happy Transformer allows programmers to implement and train state-of-the-art Transformer models with just a few lines of code. Since its release, it has won awards and has been downloaded over 13k times.

Requirements:

  • A basic understanding of Python

  • A Google account -- for Google Colab



Content

Introduction
  • Introduction
  • Course Overview
  • What is GPT-Neo?

Run GPT-Neo
  • Introduction
  • Links
  • Basic Implementation
  • Generate Text
  • Greedy
  • Generic Sampling
  • Top-k Sampling
  • GPT-2
  • Knowledge Test

Training GPT-Neo
  • Introduction
  • Links
  • Set up
  • Basic Training
  • Overview of Learning Parameters
  • How to Modify Learning Parameters
  • Training GPT-Neo Knowledge Quiz

Mini-Project: Train a Bill Generator
  • Introduction
  • Links
  • Mini-Project!
  • Knowledge Quiz
  • Next steps

Create a Web App With 100% Python
  • Introduction
  • Links
  • Create a Basic Anvil Web App
  • Create the UI
  • Connect Anvil to Google Colab
  • Connect GPT-Neo
  • Upgrade UI
  • Error Handling
  • Implement Finetuned Bill Generation Model

Deploy
  • Introduction
  • Links
  • Host GPT-Neo on Paperspace
  • Deploy the Anvil App to the web!

Conclusion
  • Conclusion
  • Bonus Lecture


Reviews

Ademuyiwa, 23 July 2021

seems like advertisement of gpt-neo. Nothing much to learn, unless you are looking for how to use neo

Denis, 16 June 2021

Course is very short. Materials cover only some very-very top details and almost all information is scrapped from official documentation. So if you are familiar with GPT at any level, it will be waste of time for you.

Emily, 6 June 2021

I'm really looking forward to learning more about Happy Transformer after taking this course. The assignments are interesting and the instructor does a great job of making the content clear and concise.


Udemy ID: 3993540
Course created: 4/20/2021
Course indexed: 6/7/2021
Submitted by: Bot