English | 中文
This project is designed as a one-stop learning resource for anyone interested in large language models and their application in AI-Generated Content (AIGC) scenarios. By providing theoretical foundations, development basics, and hands-on examples, it offers comprehensive guidance on these cutting-edge topics.
- Theory and Development Basics of Large Language Models: Deep dive into the inner workings of large language models like GPT-4, including their architecture, training methods, applications, and more.
- AIGC Application Development with LangChain: Hands-on examples and tutorials using LangChain to develop AIGC applications, demonstrating the practical application of large language models.
You can start by cloning this repository to your local machine:
```shell
git clone https://github.com/DjangoPeng/openai-quickstart.git
```
Then navigate into the repository directory and follow the instructions for each module to get started.
Date | Description | Course Materials | Events |
---|---|---|---|
Wed Jul 12 Week 1 | Fundamentals of Large Models: Evolution of Theory and Technology <br> - An Initial Exploration of Large Models: Origin and Development <br> - Warm-up: Decoding the Attention Mechanism <br> - Milestone of Transformation: The Rise of the Transformer <br> - Taking Different Paths: The Choices of GPT and BERT | Suggested Readings: <br> - Attention Mechanism: Neural Machine Translation by Jointly Learning to Align and Translate <br> - An Attentive Survey of Attention Models <br> - Transformer: Attention Is All You Need <br> - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | [Homework] |
Sun Jul 16 | The GPT Model Family: From Start to Present <br> - From GPT-1 to GPT-3.5: The Evolution <br> - ChatGPT: Where It Wins <br> - GPT-4: A New Beginning <br>Prompt Learning <br> - Chain-of-Thought (CoT): The Pioneering Work <br> - Self-Consistency: Multi-path Reasoning <br> - Tree-of-Thoughts (ToT): Continuing the Story | Suggested Readings: <br> - GPT-1: Improving Language Understanding by Generative Pre-training <br> - GPT-2: Language Models are Unsupervised Multitask Learners <br> - GPT-3: Language Models are Few-Shot Learners <br>Additional Readings: <br> - GPT-4: Architecture, Infrastructure, Training Dataset, Costs, Vision, MoE <br> - GPTs are GPTs: An Early Look at the Labor Market Impact Potential of Large Language Models <br> - Sparks of Artificial General Intelligence: Early experiments with GPT-4 | [Homework] |
Wed Jul 19 Week 2 | Fundamentals of Large Model Development: OpenAI Embedding <br> - The Eve of General Artificial Intelligence <br> - "Three Worlds" and the "Turing Test" <br> - Computer Data Representation <br> - Representation Learning and Embedding <br>Embeddings Dev 101 <br> - Course Project: GitHub openai-quickstart <br> - Getting Started with OpenAI Embeddings | Suggested Readings: <br> - Representation Learning: A Review and New Perspectives <br> - Word2Vec: Efficient Estimation of Word Representations in Vector Space <br> - GloVe: Global Vectors for Word Representation <br>Additional Readings: <br> - Improving Distributional Similarity with Lessons Learned from Word Embeddings <br> - Evaluation methods for unsupervised word embeddings | [Homework] <br>Code: [embedding] |
Sun Jul 23 | OpenAI Large Model Development and Application Practice <br> - OpenAI Large Model Development Guide <br> - Overview of OpenAI Language Models <br> - OpenAI GPT-4, GPT-3.5, GPT-3, Moderation <br> - OpenAI Token Billing and Calculation <br>OpenAI API Introduction and Practice <br> - OpenAI Models API <br> - OpenAI Completions API <br> - OpenAI Chat Completions API <br> - Completions vs Chat Completions <br>OpenAI Large Model Application Practice <br> - Initial Exploration of Text Completion <br> - Initial Exploration of Chatbots | Suggested Readings: <br> - OpenAI Models <br> - OpenAI Completions API <br> - OpenAI Chat Completions API | Code: [models] [tiktoken] |
Wed Jul 26 Week 3 | Best Practices for Applying Large AI Models <br> - How to Improve the Efficiency and Quality of GPT Model Use <br> - Best Practices for Applying Large AI Models <br> - Text Creation and Generation <br> - Article Abstracts and Summaries <br> - Novel Generation and Content Supervision <br> - Executing Complex Tasks Step by Step <br> - Evaluating the Quality of Model Output <br> - Constructing Training Annotation Data <br> - Code Debugging Assistant <br> - New Features: Function Calling Introduction and Practical Application | Suggested Readings: <br> - GPT Best Practices <br> - Function Calling | Code: Function Calling |
Sun Jul 30 | Practical: OpenAI-Translator <br> - Market demand analysis for OpenAI-Translator <br> - Product definition and feature planning for OpenAI-Translator <br> - Technical solutions and architecture design for OpenAI-Translator <br> - OpenAI module design <br> - OpenAI-Translator practical application | Code: pdfplumber | |
Wed Aug 2 Week 4 | ChatGPT Plugin Development Guide <br> - Introduction to ChatGPT Plugins <br> - Sample project: Todo management plugin <br> - Deployment and testing of practical examples <br> - ChatGPT developer mode <br>Practical: Weather Forecast plugin development <br> - Weather Forecast Plugin design and definition <br> - Weather Forecast function service <br> - Integration with a third-party weather query platform <br> - Practical Weather Forecast Plugin | Code: [todo list] [weather forecast] | |
Sun Aug 6 | LLM Application Development Framework LangChain (Part 1) <br> - LangChain 101 <br> - What is LangChain <br> - Why LangChain is Needed <br> - Typical Use Cases of LangChain <br> - Basic Concepts and Modular Design of LangChain <br>Introduction and Practice of LangChain Core Modules <br> - Standardized Large-Scale Model Abstraction: Model I/O <br> - Template Input: Prompts <br> - Language Model: Models <br> - Standardized Output: Output Parsers | Code: [model io] | |
Wed Aug 9 Week 5 | LLM Application Development Framework LangChain (Part 2) <br>Best Practices for LLM Chains <br> - Getting Started with Your First Chain: LLM Chain <br> - Sequential Chain: A Chained Call with Sequential Arrangement <br> - Transform Chain: A Chain for Processing Long Texts <br> - Router Chain: A Chain for Implementing Conditional Judgments <br>Memory: Endowing Applications with Memory Capabilities <br> - The Relationship between the Memory System and Chain <br> - BaseMemory and BaseChatMessageMemory: Memory Base Classes <br> - Memory System for Service Chatting <br> - ConversationBufferMemory <br> - ConversationBufferWindowMemory <br> - ConversationSummaryBufferMemory | Code: [chain] [memory] | |
Sun Aug 13 | LLM Application Development Framework LangChain (Part 3) <br>Native data processing flow of the framework: Data Connection <br> - Document Loaders <br> - Document Transformers <br> - Text Embedding Models <br> - Vector Stores <br> - Retrievers <br>Agent Systems for Building Complex Applications: Agents <br> - Theoretical Foundation of Agents: ReAct <br> - LLM Reasoning Capabilities: CoT, ToT <br> - LLM Operation Capabilities: WebGPT, SayCan <br> - LangChain Agents Module Design and Principle Analysis <br> - Module: Agent, Tools, Toolkits <br> - Runtime: AgentExecutor, PlanAndExecute, AutoGPT <br> - Getting Started with Your First Agent: Google Search + LLM <br> - Practice with ReAct: SerpAPI + LLM-MATH | Code: [data connection] [agents] | |
Wed Aug 23 Week 6 | Practical: LangChain version OpenAI-Translator v2.0 <br>In-depth understanding of Chat Model and Chat Prompt Template <br> - Review: LangChain Chat Model usage and process <br> - Design translation prompt templates using Chat Prompt Template <br> - Implement bilingual translation using Chat Model <br> - Simplify Chat Prompt construction using LLMChain <br>Optimize OpenAI-Translator architecture design based on LangChain <br> - Hand over large model management to the LangChain framework <br> - Focus on application-specific Prompt design <br> - Implement the translation interface using TranslationChain <br> - More concise and unified configuration management <br>Development of OpenAI-Translator v2.0 features <br> - Design and implementation of a graphical interface based on Gradio <br> - Design and implementation of a Web Server based on Flask | Code: [openai-translator] | |
Sun Aug 27 | Practical: LangChain version Auto-GPT <br>Auto-GPT project positioning and value interpretation <br> - Introduction to the Auto-GPT open source project <br> - Auto-GPT positioning: an independent GPT-4 experiment <br> - Auto-GPT value: an attempt at AGI based on Agents <br>LangChain version Auto-GPT technical solution and architecture design <br> - In-depth understanding of LangChain Agents <br> - LangChain Experimental module <br> - Auto-GPT autonomous agent design <br> - Auto-GPT Prompt design <br> - Auto-GPT Memory design <br> - In-depth understanding of LangChain VectorStore <br> - Auto-GPT OutputParser design <br> - Practical LangChain version Auto-GPT | Code: [autogpt] | |
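The Week 1 warm-up decodes the attention mechanism at the heart of the Transformer. As a preview, here is a minimal sketch of scaled dot-product attention in NumPy (an illustration only, not course code from this repository; the function name and toy matrices are made up for the example):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Score each query against each key, scaled by sqrt(d_k)
    # as described in "Attention Is All You Need".
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # The output is an attention-weighted mixture of the value vectors.
    return weights @ V, weights

# Toy example: the query points in the same direction as the first key,
# so the output leans toward the first value vector.
Q = np.array([[1.0, 0.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[10.0, 0.0], [0.0, 10.0]])
output, weights = scaled_dot_product_attention(Q, K, V)
```

The attention weights form a probability distribution over the keys, which is why each row of `weights` sums to 1.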
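The Week 2 embeddings material ultimately comes down to comparing vectors: semantically similar texts map to nearby points in embedding space. A pure-Python cosine-similarity sketch shows the core operation (the toy 3-dimensional vectors below are invented stand-ins; real OpenAI embeddings such as text-embedding-ada-002 have 1536 dimensions):

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|); 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": related concepts get similar vectors.
cat = [0.9, 0.1, 0.0]
kitten = [0.8, 0.2, 0.0]
car = [0.0, 0.1, 0.9]

cat_kitten = cosine_similarity(cat, kitten)
cat_car = cosine_similarity(cat, car)
```

Here `cat_kitten` comes out much closer to 1.0 than `cat_car`, which is the property that semantic search and recommendation over embeddings rely on.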
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated. If you have any suggestions or feature requests, please open an issue first to discuss what you would like to change.
This project is licensed under the terms of the Apache-2.0 License. See the LICENSE file for details.
Django Peng - pjt73651@email.com
Project Link: https://github.com/DjangoPeng/openai-quickstart