Welcome to Using Transformers Locally, a simple guide to using transformer models on your machine! This guide will walk you through the fundamental concepts, setup, and practical applications of transformers in natural language processing (NLP).
This chapter introduces the basics of transformer models and provides an overview of their applications in NLP.
This chapter covers text classification tasks using transformer models, demonstrating various NLP use cases.
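A minimal sketch of what local text classification looks like with the Hugging Face `pipeline` API (the default sentiment model is downloaded on first use; the input sentence here is just an illustration):

```python
from transformers import pipeline

# Load a text-classification pipeline; with no model specified,
# Hugging Face falls back to a default English sentiment model.
classifier = pipeline("text-classification")

result = classifier("Transformers run great on my laptop!")
print(result)  # a list like [{'label': ..., 'score': ...}]
```

Once the model is cached, everything runs locally with no further network access.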
This chapter provides a practical example of using transformers to detect hate speech in Arabic text.
Transformer models are often seen as black boxes. Here we visualize the model's attention weights to see the relationships it draws between tokens.
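The quantity being visualized is the attention-weight matrix: for each query token, a softmax distribution over the other tokens. A toy NumPy sketch of scaled dot-product attention (random vectors stand in for real token embeddings) shows the matrix you would plot as a heatmap:

```python
import numpy as np

def attention_weights(Q, K):
    """Scaled dot-product attention weights: softmax(Q K^T / sqrt(d_k))."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 "tokens", embedding dim 8
K = rng.normal(size=(4, 8))

A = attention_weights(Q, K)
print(A.round(2))  # each row is a distribution over the 4 tokens
```

In a real model you would pull these matrices out of each attention head (e.g. via `output_attentions=True` in transformers) rather than computing them from random data.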
After visualizing attention in Part 1, in Part 2 we extract the weights from a GPT model so we can learn how it makes its predictions.
Here we explore the Mamba model and compare it to an 8B transformer model.
In this project, I develop a pipeline to help doctors diagnose autism from their clinical notes.
Here we use Stable Diffusion to convert text to images. Have a crazy idea? Make it into a picture in seconds!
In this project, we convert images to text. And since we already have text, we add text-to-speech (TTS) so the description can be read aloud!
Here are some additional resources to help you deepen your understanding of transformer models:
- Hugging Face Transformers Library
- Attention is All You Need (Original Paper)
- TensorFlow Transformer Tutorial
By Daniel K Baissa