Optimum-TPU

Get the most out of Google Cloud TPUs with the ease of 🤗 Transformers

Tensor Processing Units (TPUs) are AI accelerators made by Google, designed to optimize performance and cost across AI workloads, from training to inference.

This repository exposes an interface similar to the one the Hugging Face transformers library provides, to interact with a multitude of models developed by research labs, institutions and the community.

We aim to provide our users the best possible performance on Google Cloud TPUs, for both training and inference, working closely with Google and Google Cloud to make this a reality.

Supported Models and Tasks

We currently support a few LLMs targeting text-generation scenarios:

  • 💎 Gemma (2b, 7b)
  • 🦙 Llama2 (7b) and Llama3 (8b)
  • 💨 Mistral (7b)

Installation

optimum-tpu comes with a handy package released on PyPI, compatible with your usual Python dependency management tools.

pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html

Inference

optimum-tpu provides a set of dedicated tools and integrations to leverage Cloud TPUs for inference, especially on the latest TPU generation, v5e.

Other TPU versions will be supported along the way.
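To give a flavor of what this enables, here is a minimal sketch of running one of the supported models on a TPU device. It uses plain transformers and PyTorch/XLA rather than any optimum-tpu-specific API, and the checkpoint name is just an example:

# A minimal sketch (not optimum-tpu's own API): load a supported model onto a
# TPU device with transformers and PyTorch/XLA, both assumed to be installed.
import torch
import torch_xla.core.xla_model as xm
from transformers import AutoModelForCausalLM, AutoTokenizer

device = xm.xla_device()  # acquire the Cloud TPU device through PyTorch/XLA

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
model = AutoModelForCausalLM.from_pretrained("google/gemma-2b", torch_dtype=torch.bfloat16)
model = model.to(device)

inputs = tokenizer("TPUs are", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))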

Text-Generation-Inference

As part of the integration, we support a text-generation-inference (TGI) backend, allowing you to deploy a server that accepts incoming HTTP requests and executes generation on Cloud TPUs.

Please see the TGI-specific documentation to get started.
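Once a TGI server is up, it can be queried over HTTP like any other TGI deployment; /generate is TGI's standard generation endpoint, while the host and port below are assumptions for illustration:

# Query a running TGI server; localhost:8080 is an assumed address.
import requests

response = requests.post(
    "http://localhost:8080/generate",  # standard TGI generation endpoint
    json={
        "inputs": "What are TPUs?",
        "parameters": {"max_new_tokens": 32},
    },
)
print(response.json()["generated_text"])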

Training

Fine-tuning is supported and tested on the TPU v5e. So far, we have tested the models below (a minimal fine-tuning sketch follows the list):

  • 🦙 Llama-2 7B and Llama-3 8B
  • 💎 Gemma 2B and 7B
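The sketch below is adapted from the repository's fine-tuning examples. It assumes the fsdp_v2 helpers (use_fsdp_v2, get_fsdp_training_args) behave as in those examples, and the dataset and hyperparameters are placeholders:

# A hedged sketch of FSDP v2 fine-tuning on TPU v5e, adapted from the
# repository's examples; helper names may differ across versions.
from datasets import load_dataset
from optimum.tpu import fsdp_v2
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

fsdp_v2.use_fsdp_v2()  # enable PyTorch/XLA FSDP v2 (SPMD-based sharding)

model_id = "google/gemma-2b"  # any of the tested models above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Toy dataset, tokenized for causal language modeling.
data = load_dataset("Abirate/english_quotes", split="train")
data = data.map(lambda x: tokenizer(x["quote"]), batched=True, remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="./output",
        per_device_train_batch_size=1,
        num_train_epochs=1,
        optim="adafactor",
        # Model-specific FSDP v2 sharding arguments from optimum-tpu.
        **fsdp_v2.get_fsdp_training_args(model),
    ),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()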

You can check the examples: