Autogenerate TTNN Tests for Explorer CI to consume #1333

Open · vprajapati-tt wants to merge 27 commits into main from vprajapati/ttnn-explorer-tests
Changes from 18 commits

Commits (27)
282a371
Added maybe_downcast & hardened TT Attrs and Types to include better …
vprajapati-tt Nov 13, 2024
677b0d6
Removed manual maybe_downcast, added tt_class
vprajapati-tt Nov 14, 2024
f71f547
Removed redundant imports
vprajapati-tt Nov 14, 2024
5ee502a
Lint Fixes
vprajapati-tt Nov 14, 2024
87244a0
new MLIR module for parsing TTNN modules
vprajapati-tt Nov 15, 2024
6712f7d
Merged from main
vprajapati-tt Nov 18, 2024
97e6fbd
Added TTNNLayout Support + Fixes
vprajapati-tt Nov 18, 2024
d634d38
editable on Debug, minor fixes
vprajapati-tt Nov 19, 2024
026836e
Merge branch 'main' into vprajapati/issue-933
vprajapati-tt Nov 19, 2024
c9f972c
Use check-ttmlir to generate test-cases for tt-explorer
vprajapati-tt Nov 19, 2024
d463881
Attempted to fix tt-explorer job
vprajapati-tt Nov 19, 2024
688c223
Merge branch 'main' into vprajapati/ttnn-explorer-tests
vprajapati-tt Nov 25, 2024
302d068
Merge branch 'main' into vprajapati/ttnn-explorer-tests
vprajapati-tt Dec 9, 2024
611375b
Fixes + Moved tests to tests/Explorer
vprajapati-tt Dec 9, 2024
a8ec9d2
Shifted to visualize_from_config and added LIT check
vprajapati-tt Dec 9, 2024
4263949
Upload test from other workflow to explorer workflow
vprajapati-tt Dec 10, 2024
c72ce19
Upload test from build workflow instead of run
vprajapati-tt Dec 10, 2024
09f1570
Changed artifact name
vprajapati-tt Dec 10, 2024
d11367e
Fixed artifact name -- again
vprajapati-tt Dec 12, 2024
8725c96
Merge branch 'main' into vprajapati/ttnn-explorer-tests
vprajapati-tt Jan 13, 2025
499e011
Updated PR + Fixed artifact name
vprajapati-tt Jan 13, 2025
1b60bcc
Small fixes
vprajapati-tt Jan 13, 2025
6cf26f7
Remove race condition
vprajapati-tt Jan 13, 2025
8b2349c
Needs list
vprajapati-tt Jan 21, 2025
d351434
Merge branch 'main' into vprajapati/ttnn-explorer-tests
vprajapati-tt Jan 21, 2025
f028538
Silly Spelling Error
vprajapati-tt Jan 22, 2025
02b7761
Another stupid mistake
vprajapati-tt Jan 22, 2025
12 changes: 12 additions & 0 deletions .github/workflows/build-and-test.yml
@@ -158,6 +158,12 @@ jobs:
cmake --build ${{ steps.strings.outputs.build-output-dir }} -- check-ttmlir
cp build/test/report.xml ${{ steps.strings.outputs.test_report_path }}

- name: Upload tests
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.build.runs-on }}-${{ matrix.build.name }}-tests
path: ${{ steps.strings.outputs.build-output-dir }}/test

- name: Upload Test Report
uses: actions/upload-artifact@v4
with:
@@ -640,9 +646,15 @@ jobs:
source env/activate
cmake --build ${{ steps.strings.outputs.build-output-dir }} -- explorer

- name: Download Tests
uses: actions/download-artifact@v4
with:
name: ${{ matrix.build.runs-on }}-${{ matrix.build.name }}-tests

- name: Run tt-explorer tests
shell: bash
run: |
source env/activate
export TT_EXPLORER_GENERATED_TEST_DIR=${{ steps.strings.outputs.build-output-dir }}/test/ttmlir/Silicon/TTNN
pytest tools/explorer/test/run_tests.py
# collect results
1 change: 1 addition & 0 deletions CMakeLists.txt
@@ -9,6 +9,7 @@ option(TT_RUNTIME_ENABLE_PERF_TRACE "Enable performance mode" OFF)
option(TTMLIR_ENABLE_RUNTIME "Enable runtime" OFF)
option(TTMLIR_ENABLE_STABLEHLO "Enable StableHLO support" OFF)
option(TTMLIR_ENABLE_OP_MODEL "Enable OpModel support" OFF)
option(TT_EXPLORER_EDITABLE "Enable editable install mode for explorer" OFF)

if (TTMLIR_ENABLE_STABLEHLO)
add_compile_definitions(TTMLIR_ENABLE_STABLEHLO)
9 changes: 4 additions & 5 deletions docs/src/tt-explorer.md
@@ -5,12 +5,11 @@ Welcome to the tt-explorer wiki! The Wiki will serve as a source for documentation…
## Quick Start
TT-Explorer is made to be as painless as possible; as such, installation on top of the pre-existing [`tt-mlir`](https://github.com/tenstorrent/tt-mlir) project is kept as minimal as possible.

1. Build `tt-mlir`
1. Build `tt-mlir`, adding the `-DTT_EXPLORER_EDITABLE=ON` flag to the CMake build to install the `tt-explorer` package in editable mode.
2. Run `source env/activate` to be in `tt-mlir` virtualenv for the following steps
3. Install [`tt-adapter`](https://github.com/vprajapati-tt/tt-adapter) using `pip install -e .` in tt-adapter root directory.
4. Install `tt-explorer` using `pip install -e .` in tt-explorer root directory
5. Run `tt-explorer` in terminal to start tt-explorer instance. (Refer to CLI section in API for specifics)
6. Ensure server has started in `tt-explorer` shell instance (check for message below)
3. Install the explorer tool by building the `explorer` target with `cmake --build build -- explorer`.
4. Run `tt-explorer` in a terminal to start a tt-explorer instance. (Refer to the CLI section in the API for specifics.)
5. Ensure the server has started in the `tt-explorer` shell instance (check for the message below):
```sh
Starting Model Explorer server at:
http://localhost:8080
```
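For reference, a minimal readiness check for step 5 above: it polls the URL printed in the startup message (default `http://localhost:8080`). This sketch is illustrative only; the use of the `requests` library and the polling and timeout values are assumptions, not part of this PR.

```python
import time

import requests  # assumed to be available in the tt-mlir virtualenv


def wait_for_explorer(url="http://localhost:8080", timeout=60.0):
    """Poll the tt-explorer server until it answers or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            # Any HTTP response means the server is up and serving.
            requests.get(url, timeout=5)
            return True
        except requests.exceptions.RequestException:
            time.sleep(1)  # not up yet, retry shortly
    return False


if __name__ == "__main__":
    ok = wait_for_explorer()
    print("tt-explorer is up" if ok else "tt-explorer did not start in time")
```

The test suite in `tools/explorer/test/run_tests.py` further below relies on the same idea: it starts its own server on port 8002 and drives it over HTTP.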
@@ -1,3 +1,6 @@
// RUN: ttmlir-opt --ttir-to-ttnn-backend-pipeline %s
// Need to ensure that model is valid MLIR module

module @SimpleModel attributes {} {
func.func @forward(%arg0: tensor<1x784xf32> {ttir.name = "input_1"}, %arg1: tensor<10x784xf32> {ttir.name = "linear.weight"}, %arg2: tensor<10xf32> {ttir.name = "linear.bias"}) -> (tensor<1x10xf32> {ttir.name = "SimpleModel_472.output_softmax_1495"}) {
%0 = tensor.empty() : tensor<784x10xf32>
@@ -1,3 +1,5 @@
// RUN: ttmlir-opt --ttir-to-ttnn-backend-pipeline %s

module @LinearAE attributes {} {
func.func @forward(%arg0: tensor<1x784xf32> {ttir.name = "input_1"}, %arg1: tensor<784x128xf32> {ttir.name = "encoder_lin1.weight"}, %arg2: tensor<128xf32> {ttir.name = "encoder_lin1.bias"}, %arg3: tensor<128x64xf32> {ttir.name = "encoder_lin2.weight"}, %arg4: tensor<64xf32> {ttir.name = "encoder_lin2.bias"}, %arg5: tensor<64x12xf32> {ttir.name = "encoder_lin3.weight"}, %arg6: tensor<12xf32> {ttir.name = "encoder_lin3.bias"}, %arg7: tensor<12x3xf32> {ttir.name = "encoder_lin4.weight"}, %arg8: tensor<3xf32> {ttir.name = "encoder_lin4.bias"}, %arg9: tensor<3x12xf32> {ttir.name = "decoder_lin1.weight"}, %arg10: tensor<12xf32> {ttir.name = "decoder_lin1.bias"}, %arg11: tensor<12x64xf32> {ttir.name = "decoder_lin2.weight"}, %arg12: tensor<64xf32> {ttir.name = "decoder_lin2.bias"}, %arg13: tensor<64x128xf32> {ttir.name = "decoder_lin3.weight"}, %arg14: tensor<128xf32> {ttir.name = "decoder_lin3.bias"}, %arg15: tensor<128x784xf32> {ttir.name = "decoder_lin4.weight"}, %arg16: tensor<784xf32> {ttir.name = "decoder_lin4.bias"}) -> (tensor<1x784xf32> {ttir.name = "LinearAE.output_add_29"}) {
%0 = tensor.empty() : tensor<1x128xf32>
@@ -1,3 +1,5 @@
// RUN: ttmlir-opt --ttir-to-ttnn-backend-pipeline %s

#any_device = #tt.operand_constraint<dram|l1|scalar|tile|none|interleaved|single_bank|height_sharded|width_sharded|block_sharded|any_layout|any_device|any_device_tile|l1_block_sharded>
#loc = loc("LlamaForCausalLM":0:0)
#system_desc = #tt.system_desc<[{role = host, target_triple = "x86_64-pc-linux-gnu"}], [{arch = <wormhole_b0>, grid = 8x8, l1_size = 1499136, num_dram_channels = 12, dram_channel_size = 1073741824, noc_l1_address_align_bytes = 16, pcie_address_align_bytes = 32, noc_dram_address_align_bytes = 32, l1_unreserved_base = 1024, erisc_l1_unreserved_base = 1024, dram_unreserved_base = 1024, dram_unreserved_end = 1073741824, physical_cores = {worker = [ 0x0, 0x1, 0x2, 0x3, 0x4, 0x5, 0x6, 0x7, 1x0, 1x1, 1x2, 1x3, 1x4, 1x5, 1x6, 1x7, 2x0, 2x1, 2x2, 2x3, 2x4, 2x5, 2x6, 2x7, 3x0, 3x1, 3x2, 3x3, 3x4, 3x5, 3x6, 3x7, 4x0, 4x1, 4x2, 4x3, 4x4, 4x5, 4x6, 4x7, 5x0, 5x1, 5x2, 5x3, 5x4, 5x5, 5x6, 5x7, 6x0, 6x1, 6x2, 6x3, 6x4, 6x5, 6x6, 6x7, 7x0, 7x1, 7x2, 7x3, 7x4, 7x5, 7x6, 7x7] dram = [ 8x0, 9x0, 10x0, 8x1, 9x1, 10x1, 8x2, 9x2, 10x2, 8x3, 9x3, 10x3]}, supported_data_types = [<f32>, <f16>, <bf16>, <bfp_f8>, <bfp_bf8>, <bfp_f4>, <bfp_bf4>, <bfp_f2>, <bfp_bf2>, <u32>, <u16>, <u8>], supported_tile_sizes = [ 4x16, 16x16, 32x16, 4x32, 16x32, 32x32], num_cbs = 32}], [0], [3 : i32], [ 0x0x0x0]>
8 changes: 7 additions & 1 deletion tools/explorer/CMakeLists.txt
@@ -15,9 +15,15 @@ ExternalProject_Add(
INSTALL_COMMAND ""
)

set(PIP_EDITABLE_FLAG "")

if (TT_EXPLORER_EDITABLE)
set(PIP_EDITABLE_FLAG "-e")
endif()

add_custom_target(explorer
COMMENT "Building tt-explorer... ${TTMLIR_BIN_DIR}"
COMMAND pip install $<$<CONFIG:Debug>:-e> ${CMAKE_CURRENT_SOURCE_DIR}/tt_adapter
COMMAND pip install ${PIP_EDITABLE_FLAG} ${CMAKE_CURRENT_SOURCE_DIR}/tt_adapter
COMMAND pip install ${CMAKE_CURRENT_SOURCE_DIR}/model-explorer/src/model-explorer/src/server/package

DEPENDS TTMLIRPythonModules model-explorer ttrt
2 changes: 1 addition & 1 deletion tools/explorer/run.py
@@ -9,4 +9,4 @@
# TODO(odjuricic): Hack to make our extension default for .mlir files.
# This can be handled better when we switch to our model-explorer fork.
model_explorer.extension_manager.ExtensionManager.BUILTIN_ADAPTER_MODULES = []
model_explorer.visualize(extensions=["tt_adapter"])
model_explorer.visualize_from_config(extensions=["tt_adapter"], no_open_in_browser=True)
19 changes: 15 additions & 4 deletions tools/explorer/test/run_tests.py
@@ -8,23 +8,29 @@
import multiprocessing
import pytest
import glob
import os

HOST = "localhost"
PORT = 8002
COMMAND_URL = "http://" + HOST + ":" + str(PORT) + "/apipost/v1/send_command"
TEST_LOAD_MODEL_PATHS = [
"test/ttmlir/Dialect/TTNN/optimizer/mnist_sharding.mlir",
"tools/explorer/test/models/*.mlir",
"test/ttmlir/Explorer/*.mlir",
]
TEST_EXECUTE_MODEL_PATHS = [
"test/ttmlir/Silicon/TTNN/optimizer/mnist_sharding_tiled.mlir",
]

if "TT_EXPLORER_GENERATED_TEST_DIR" in os.environ:
TEST_LOAD_MODEL_PATHS.append(
os.environ["TT_EXPLORER_GENERATED_TEST_DIR"] + "/**/*.mlir"
)


def get_test_files(paths):
files = []
for path in paths:
files.extend(glob.glob(path))
files.extend(glob.glob(path, recursive=True))
return files


@@ -47,8 +53,13 @@ def execute_command(model_path, settings):
@pytest.fixture(scope="function", autouse=True)
def start_server(request):
server_thread = multiprocessing.Process(
target=model_explorer.visualize,
kwargs={"extensions": ["tt_adapter"], "host": HOST, "port": PORT},
target=model_explorer.visualize_from_config,
kwargs={
"extensions": ["tt_adapter"],
"host": HOST,
"port": PORT,
"no_open_in_browser": True,
},
)
server_thread.start()

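For context on the `glob.glob(path, recursive=True)` change above, here is a standalone sketch of how the `**` pattern resolves the generated tests. The fallback directory and the example layout in the comments are hypothetical; only the environment variable name comes from this PR.

```python
import glob
import os

# Hypothetical layout: TT_EXPLORER_GENERATED_TEST_DIR points at a tree such as
#   <dir>/simple_matmul.mlir
#   <dir>/eltwise/add.mlir
generated_dir = os.environ.get(
    "TT_EXPLORER_GENERATED_TEST_DIR", "build/test/ttmlir/Silicon/TTNN"
)
pattern = os.path.join(generated_dir, "**", "*.mlir")

# Without recursive=True, "**" behaves like "*" and matches exactly one
# directory level; with it, "**" matches zero or more directories, so the
# whole tree (including files directly under generated_dir) is picked up.
shallow = glob.glob(pattern)
deep = glob.glob(pattern, recursive=True)

print(f"{len(shallow)} file(s) without recursive=True")
print(f"{len(deep)} file(s) with recursive=True")
```

This is why `get_test_files` switched to the recursive form: the tests generated by `check-ttmlir` can sit several directories below `TT_EXPLORER_GENERATED_TEST_DIR`.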