Generative-AI-with-Large-Language-Models

This repository is from the Coursera course Generative AI with Large Language Models, offered by AWS.

In Lab 1, I worked with a dataset of conversations, such as customer support dialogues, and the task was to summarise them with a machine learning model. The lab runs on Python 3 and uses several libraries: PyTorch, torchdata, and Transformers and Datasets from Hugging Face. These libraries are installed and tested in the notebook; even though some warnings and errors may appear during installation, the notebook is designed to run smoothly.

The lab uses the public DialogSum dataset together with the FLAN-T5 model from Hugging Face to generate conversation summaries. Zero-shot, one-shot, and few-shot prompt engineering techniques are applied to improve the accuracy of the generated summaries, and generation settings such as the sampling temperature can be adjusted to change how the model produces results. Tokenizers convert raw text into numbers (token IDs), which are then used as input to the FLAN-T5 model.
