4.45 (19 reviews)
☑ How to implement state-of-the-art text generation AI models
☑ Background information about GPT-Neo, a state-of-the-art text generation NLP model
☑ How to use Happy Transformer -- a Python library for implementing NLP Transformer models
☑ How to train/implement GPT-2
☑ How to implement different text generation algorithms
☑ How to fetch data using Hugging Face's Datasets library
☑ How to train GPT-Neo using Happy Transformer
☑ How to create a web app with 100% Python using Anvil
☑ How to host a Transformer model on Paperspace
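As a taste of the text generation algorithms the course covers, greedy search and top-k sampling can be sketched over a toy next-token distribution in plain Python (the vocabulary and probabilities below are made up for illustration):

```python
import random

# Toy next-token distribution (hypothetical values, for illustration only)
probs = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "lamp": 0.05}

def greedy(probs):
    """Greedy search: always pick the single most probable token."""
    return max(probs, key=probs.get)

def top_k_sample(probs, k=2, rng=random):
    """Top-k sampling: keep the k most probable tokens, renormalize, sample."""
    top = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    total = sum(p for _, p in top)
    tokens = [t for t, _ in top]
    weights = [p / total for _, p in top]
    return rng.choices(tokens, weights=weights, k=1)[0]

print(greedy(probs))             # always "cat"
print(top_k_sample(probs, k=2))  # "cat" or "dog", chosen at random
```

Greedy search always returns the most probable token, while top-k sampling adds controlled randomness — the kind of trade-off you weigh when tuning generation settings.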
GPT-3 is a state-of-the-art text generation natural language processing (NLP) model created by OpenAI. You can use it to generate text that resembles human-written text.
You will learn how to:
Implement GPT-Neo (and GPT-2) with Happy Transformer
Train GPT-Neo to generate unique text for a specific domain
Create a web app using 100% Python with Anvil!
Host your language model using Google Colab and Paperspace
None! All of the tools we use in this tutorial are web-based: Google Colab, Anvil, and Paperspace. So regardless of whether you're on Mac, Windows, or Linux, you will not have to worry about downloading any software.
Model: GPT-Neo -- an open-source version of GPT-3 created by Eleuther AI
Framework: Happy Transformer -- an open-source Python package that allows us to implement and train GPT-Neo with just a few lines of code
Web technologies: Anvil -- a website that allows us to develop web apps using 100% Python
Backend technologies: We’ll cover how to use both Google Colab and Paperspace to host the model. Anvil automatically covers hosting the web app.
About the instructor:
My name is Eric Fillion, and I'm from Canada. I'm on a mission to advance the field of NLP by creating open-source tools and educational content. In early 2020, I led a team that launched an open-source Python package called Happy Transformer. Happy Transformer allows programmers to implement and train state-of-the-art Transformer models with just a few lines of code. Since its release, it has won awards and has been downloaded over 13k times.
A basic understanding of Python
A Google account -- for Google Colab
What is GPT-Neo?
Overview of Learning Parameters
How to Modify Learning Parameters
Training GPT-Neo Knowledge Quiz
Mini-Project: Train a Bill Generator
Create a Web App With 100% Python
Create a Basic Anvil Web App
Create the UI
Connect Anvil to Google Colab
Implement Finetuned Bill Generation Model
Host GPT-Neo on Paperspace
Deploy the Anvil App to the web!
Seems like an advertisement for GPT-Neo. Not much to learn, unless you are looking for how to use Neo.
The course is very short. The materials cover only surface-level details, and almost all of the information is scraped from the official documentation. So if you are familiar with GPT at any level, it will be a waste of time for you.
I'm really looking forward to learning more about Happy Transformer after taking this course. The assignments are interesting and the instructor does a great job of making the content clear and concise.