Create a Text Generation Web App with 100% Python (NLP)

Harness GPT-Neo — a natural language processing (NLP) text generation model. Demonstrate it with a 100% Python web app

GPT-3 is a state-of-the-art text generation natural language processing (NLP) model created by OpenAI. You can use it to generate text that reads as though a human wrote it.

What you’ll learn

  • How to implement state-of-the-art text generation AI models.
  • Background information about GPT-Neo, a state-of-the-art text generation NLP model.
  • How to use Happy Transformer — a Python library for implementing NLP Transformer models.
  • How to train/implement GPT-2.
  • How to implement different text generation algorithms.
  • How to fetch data using Hugging Face’s Datasets library.
  • How to train GPT-Neo using Happy Transformer.
  • How to create a web app with 100% Python using Anvil.
  • How to host a Transformer model on Paperspace.
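One of the topics above is text generation algorithms. The core idea can be shown without any model at all: at each step a language model outputs a probability distribution over its vocabulary, and the decoding algorithm decides which token to emit. Below is a toy sketch (the four-word "vocabulary" and its probabilities are made up for illustration) contrasting greedy decoding with top-k sampling:

```python
import random

# Toy next-token distribution. A real model such as GPT-Neo produces a
# distribution like this over its whole vocabulary at every step; the
# decoding algorithm decides which token to actually emit.
probs = {"cat": 0.5, "dog": 0.3, "fish": 0.15, "zebra": 0.05}

def greedy(dist):
    """Greedy decoding: always pick the single most likely token."""
    return max(dist, key=dist.get)

def top_k_sample(dist, k=2, rng=random):
    """Top-k sampling: keep only the k most likely tokens, then draw
    one at random in proportion to its probability."""
    top = sorted(dist.items(), key=lambda kv: kv[1], reverse=True)[:k]
    tokens, weights = zip(*top)
    return rng.choices(tokens, weights=weights, k=1)[0]

print(greedy(probs))             # always "cat"
print(top_k_sample(probs, k=2))  # "cat" or "dog", chosen at random
```

Greedy decoding is deterministic and tends to repeat itself; sampling strategies like top-k trade a little predictability for much more varied output, which is why they are the default choice for creative text generation.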

Course Content

  • Introduction –> 3 lectures • 12min.
  • Run GPT-Neo –> 8 lectures • 25min.
  • Training GPT-Neo –> 6 lectures • 14min.
  • Mini-Project: Train a Bill Generator –> 4 lectures • 15min.
  • Create a Web App With 100% Python –> 9 lectures • 42min.
  • Deploy –> 4 lectures • 8min.
  • Conclusion –> 2 lectures • 1min.


Requirements

  • A solid understanding of basic Python syntax.
  • A Google account (for Google Colab).


This course will cover how to create a web app that uses an open-source version of GPT-3 called GPT-Neo with 100% Python. That’s right, no HTML, CSS, JavaScript or any other web technology is required. Just 100% Python!

 

You will learn how to:

  1. Implement GPT-Neo (and GPT-2) with Happy Transformer
  2. Train GPT-Neo to generate unique text for a specific domain
  3. Create a web app using 100% Python with Anvil!
  4. Host your language model using Google Colab and Paperspace

     

Installations:

NONE!!! All of the tools we use in this tutorial are web-based. They include Google Colab, Anvil and Paperspace. So regardless of whether you’re on Mac, Windows or Linux, you will not have to worry about downloading any software.

 

Technologies:

  1. Model: GPT-Neo — an open-source version of GPT-3 created by EleutherAI
  2. Framework: Happy Transformer — an open-source Python package that allows us to implement and train GPT-Neo with just a few lines of code
  3. Web technologies: Anvil — a website that allows us to develop web apps using Python
  4. Backend technologies: We’ll cover how to use both Google Colab and Paperspace to host the model. Anvil automatically covers hosting the web app.

About the instructor:

My name is Eric Fillion, and I’m from Canada. I’m on a mission to make state-of-the-art NLP accessible by creating open-source tools and educational content. In early 2020, I led a team that launched an open-source Python package called Happy Transformer. Happy Transformer allows programmers to implement and train state-of-the-art Transformer models with just a few lines of code. Since its release, it has won awards and has been downloaded over 13,000 times.
