<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Using Transformers Locally</title>
<link href="https://fonts.googleapis.com/css2?family=Roboto:wght@400;700&display=swap" rel="stylesheet">
<link rel="stylesheet" href="styles.css">
</head>
<body>
<header>
<nav>
<div class="logo">Daniel K Baissa</div>
<ul>
<li><a href="#home">Home</a></li>
<li><a href="#intro">Introduction</a></li>
<li><a href="#chapters">Chapters</a></li>
<li><a href="#resources">Resources</a></li>
</ul>
</nav>
<div class="hero">
<div class="hero-text">
<h1>Using Transformers Locally</h1>
<p>A simple guide to using transformer models on your machine!</p>
<a href="#chapters" class="btn">Start Learning</a>
</div>
<div class="hero-image">
<img src="An_illustration_of_a_transformer_model_used_in_nat.png" alt="Transformers Illustration">
</div>
</div>
</header>
<section id="intro">
<h2>Introduction</h2>
<p>Welcome to this guide to using transformer models locally. It walks you through the fundamental concepts, setup, and practical applications of transformers in natural language processing (NLP).</p>
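<p>For a quick taste of what this looks like in practice, here is a minimal sketch, assuming the Hugging Face <code>transformers</code> library is installed (<code>pip install transformers</code>). The default model is downloaded once and then runs entirely on your machine:</p>
<pre><code class="language-python">from transformers import pipeline

# Downloads a small default sentiment model on the first run, then executes locally
classifier = pipeline("sentiment-analysis")
print(classifier("Running transformer models locally is easier than it looks!"))
# Example output (exact score will vary): [{'label': 'POSITIVE', 'score': 0.99}]
</code></pre>
<p>The chapters below build on this idea, from text classification to looking inside the models themselves.</p>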
</section>
<section id="chapters">
<h2>Chapters</h2>
<div class="chapter">
<h3>Chapter 1: Introduction to Learning Transformers</h3>
<p>This chapter introduces the basics of transformer models and provides an overview of their applications in NLP.</p>
<a href="https://danbaissa.github.io/NLP/Intro_Learning_Transformers" class="btn">Chapter 1</a>
</div>
<div class="chapter">
<h3>Chapter 2: Text Classification with Transformers</h3>
<p>This chapter covers text classification tasks using transformer models, demonstrating various NLP use cases.</p>
<a href="https://danbaissa.github.io/NLP/Text_classification/Text_classification" class="btn">Chapter 2</a>
</div>
<div class="chapter">
<h3>Chapter 3: Example - Detecting Hate Speech in Arabic</h3>
<p>This chapter provides a practical example of using transformers to detect hate speech in Arabic text.</p>
<a href="https://danbaissa.github.io/NLP/Example_Detecting_hatespeach_Arabic" class="btn">Chapter 3</a>
</div>
<div class="chapter">
<h3>Chapter 4a: Looking into the Black Box (Part 1)</h3>
<p>Transformer models are often seen as black boxes. Here we visualize the model's attention to see the relationships it forms between tokens.</p>
<a href="https://danbaissa.github.io/NLP/Visualizing_attention/Visualizing_attention" class="btn">Chapter 4a</a>
<p></p>
<h3>Chapter 4b: Looking into the Black Box (Part 2)</h3>
<p>After visualizing the attention in Part 1, we extract the weights from a GPT model in Part 2 so we can learn how it makes its predictions.</p>
<a href="https://danbaissa.github.io/NLP/Visualizing_attention/attention_weights" class="btn">Chapter 4b</a>
</div>
<div class="chapter">
<h3>Chapter 5: Example - Mamba, the Future of NLP?</h3>
<p>Here we explore the Mamba model and compare it to an 8B-parameter transformer model.</p>
<a href="https://danbaissa.github.io/NLP/Mamba/Mamba" class="btn">Chapter 5</a>
</div>
<div class="chapter">
<h3>Chapter 6: Diagnosing Autism by Reading Simulated Doctors' Notes</h3>
<p>In this project, I develop a pipeline that helps diagnose autism simply by reading simulated doctors' notes.</p>
<a href="https://danbaissa.github.io/NLP/Autism_Notes_Model_Training_and_SHAP_Explanation/Autism_Notes_Model_Training_and_SHAP_Explanation" class="btn">Chapter 6</a>
</div>
<div class="chapter">
<h3>Chapter 7: Text to Image</h3>
<p>Here we use Stable Diffusion to convert text to images. Have a crazy idea? Turn it into a picture in seconds!</p>
<a href="https://danbaissa.github.io/NLP/sd/Stable_Diffusion" class="btn">Chapter 7</a>
</div>
<div class="chapter">
<h3>Chapter 8: Image to Text and Speech</h3>
<p>In this project we convert images to text, and since we already have the text, we add text-to-speech (TTS) as well!</p>
<a href="https://danbaissa.github.io/NLP/itt/Image_to_text" class="btn">Chapter 8</a>
</div>
</section>
<section id="resources">
<h2>Resources</h2>
<p>Here are some additional resources to help you deepen your understanding of transformer models:</p>
<ul>
<li><a href="https://github.com/huggingface/transformers">Hugging Face Transformers Library</a></li>
<li><a href="https://arxiv.org/abs/1706.03762">Attention is All You Need (Original Paper)</a></li>
<li><a href="https://www.tensorflow.org/tutorials/text/transformer">TensorFlow Transformer Tutorial</a></li>
</ul>
</section>
<footer>
<p>by Daniel K Baissa</p>
</footer>
</body>
</html>