[Docs] Bootstrap Getting Started page
- As preparation for the first Alpha release, we should create a Getting Started page that helps our users navigate the first few steps needed to run our models.
- Next steps should include:
1. Populating details for wheel installation
2. Link to the list of supported models (requires an outlined release process and a dedicated `.md` section)
3. More detailed next steps (when applicable)

Fix #1114
nvukobratTT committed Jan 28, 2025
1 parent eac7778 commit 2ad7623
Showing 2 changed files with 59 additions and 3 deletions.
4 changes: 1 addition & 3 deletions docs/src/SUMMARY.md
```diff
@@ -2,6 +2,7 @@
 
 # Introduction
 - [Introduction](./introduction.md)
+- [Getting Started](./getting-started.md)
 - [Architecture Overview](./architecture_overview.md)
 
 # Project setup
@@ -13,6 +14,3 @@
 # Dev Notes
 - [Running standalone FFE generated TTIRs](./dev_notes/standalone_ttir_run.md)
 - [Verification in tests](./dev_notes/verification.md)
-
-# User Guide
-- [Getting Started](./getting-started.md)
```
58 changes: 58 additions & 0 deletions docs/src/getting-started.md
@@ -0,0 +1,58 @@
# Getting Started

## Setup
You can choose between two ways to set up our project:
- Installing the pre-built wheel
- Building from source

### Install using Wheel

*Wheel installation instructions will be provided soon. Stay tuned!*
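
In the meantime, the wheel-based setup will most likely amount to a single `pip install`; the package name below is a placeholder for illustration only, not an officially published wheel:
```bash
# Hypothetical command -- the actual package name and index will be announced
# together with the first wheel release.
pip install tt-forge-fe
```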

### Build from Source

To build Forge-FE from source, you need to clone the project from our GitHub page:
```bash
git clone https://github.com/tenstorrent/tt-forge-fe.git
```

Afterwards, you can follow our [build instructions](https://docs.tenstorrent.com/tt-forge-fe/build.html), which outline the prerequisites as well as how to build the dependencies and the project itself.
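
The linked build instructions are the authoritative source; purely as a rough sketch of what a from-source build generally looks like (the environment script and CMake options here are assumptions, not the project's documented commands):
```bash
cd tt-forge-fe

# Assumed flow for illustration only -- follow the linked build instructions
# for the actual environment setup, generator, and build targets.
source env/activate        # activate the project environment (assumed path)
cmake -B build -G Ninja    # configure the build (assumed options)
cmake --build build        # compile the dependencies and the project
```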

## Run First Example Case

To confirm that your environment is set up properly, let's run a sanity test for the element-wise add operation:
```bash
pytest forge/test/mlir/operators/eltwise_binary/test_eltwise_binary.py::test_add
```

Within a few seconds, you should see confirmation that the test passed. Once that's done, we can run one of our model tests as well:
```bash
pytest forge/test/mlir/llama/tests/test_llama_prefil.py::test_llama_prefil_on_device_decode_on_cpu
```
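
For intuition, the add sanity test boils down to compiling a tiny element-wise add module and checking its output. The snippet below is a minimal sketch of that idea, not the actual test code, and the multi-input calling convention for `forge.compile` is an assumption based on the single-input example later on this page:
```py
import torch
import forge

# Minimal sketch, not the actual test: compile a tiny add module with Forge
# and compare its output against plain PyTorch.
class Add(torch.nn.Module):
    def forward(self, a, b):
        return a + b

a = torch.rand(1, 32, 32)
b = torch.rand(1, 32, 32)

compiled_model = forge.compile(Add(), [a, b])  # passing sample inputs as a list is an assumption
output = compiled_model(a, b)

print(output)  # expected to match a + b up to numerical tolerance
```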

## Where to Go Next

Now that you have set up Forge-FE, you can try to compile and run your own models!

For a quick start, here is an example of how to run your own model. Note the introduction of the `forge.compile` call:

```py
import torch
import forge  # needed for the forge.compile call below
from transformers import ResNetForImageClassification

def resnet():
    # Load image, pre-process, etc.
    ...

    # Load model (e.g. from HuggingFace)
    framework_model = ResNetForImageClassification.from_pretrained("microsoft/resnet-50")

    # Compile the model using Forge
    compiled_model = forge.compile(framework_model, input_image)

    # Run compiled model
    logits = compiled_model(input_image)

    ...
    # Post-process output, return results, etc.
```
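
The image loading and pre-processing steps are elided above. One possible way to fill them in is sketched below; the example URL, the use of `AutoImageProcessor`, and the post-processing are illustrative assumptions, not part of the original example:
```py
import requests
import torch
import forge
from PIL import Image
from transformers import AutoImageProcessor, ResNetForImageClassification

# Example input image (any RGB image works; this URL is for illustration only).
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# Pre-process with the model's own image processor from HuggingFace.
processor = AutoImageProcessor.from_pretrained("microsoft/resnet-50")
input_image = processor(image, return_tensors="pt")["pixel_values"]

# Load, compile, and run the model with Forge, as in the example above.
framework_model = ResNetForImageClassification.from_pretrained("microsoft/resnet-50")
compiled_model = forge.compile(framework_model, input_image)
logits = compiled_model(input_image)

# Post-process: pick the highest-scoring class. Depending on how Forge
# returns outputs, the logits may need unwrapping first.
predicted_label = logits.argmax(-1).item()
print(framework_model.config.id2label[predicted_label])
```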
