Fix dead links in cpp docs #2937

Merged 1 commit on Feb 10, 2024
18 changes: 9 additions & 9 deletions cpp/README.md
@@ -34,30 +34,30 @@ cd serve
torchserve --ncs --start --model-store model_store
```
## Backend
- TorchServe cpp backend can run as a process, which is similar to [TorchServe Python backend](https://github.com/pytorch/serve/tree/master/ts). By default, TorchServe supports torch scripted model in cpp backend. Other platforms such as MxNet, ONNX can be supported through custom handlers following the TorchScript example [src/backends/handler/torch_scripted_handler.hh](https://github.com/pytorch/serve/blob/master/src/backends/handler/torch_scripted_handler.hh).
+ The TorchServe cpp backend can run as a process, similar to the [TorchServe Python backend](https://github.com/pytorch/serve/tree/master/ts). By default, TorchServe supports TorchScript models in the cpp backend. Other platforms such as MXNet and ONNX can be supported through custom handlers following the TorchScript example [src/backends/handler/torch_scripted_handler.hh](https://github.com/pytorch/serve/blob/master/cpp/src/backends/handler/torch_scripted_handler.hh).
### Custom Handler
- By default, TorchServe cpp provides a handler for TorchScript [src/backends/handler/torch_scripted_handler.hh](https://github.com/pytorch/serve/blob/master/src/backends/handler/torch_scripted_handler.hh). Its uses the [BaseHandler](https://github.com/pytorch/serve/blob/master/src/backends/handler/base_handler.hh) which defines the APIs to customize handler.
- * [Initialize](serve/blob/cpp_backend/cpp/src/backends/handler/base_handler.hh#L29)
- * [LoadModel](serve/blob/cpp_backend/cpp/src/backends/handler/base_handler.hh#L37)
- * [Preprocess](serve/blob/cpp_backend/cpp/src/backends/handler/base_handler.hh#L40)
- * [Inference](serve/blob/cpp_backend/cpp/src/backends/handler/base_handler.hh#L46)
- * [Postprocess](serve/blob/cpp_backend/cpp/src/backends/handler/base_handler.hh#L53)
+ By default, TorchServe cpp provides a handler for TorchScript, [src/backends/handler/torch_scripted_handler.hh](https://github.com/pytorch/serve/blob/master/cpp/src/backends/handler/torch_scripted_handler.hh). It uses the [BaseHandler](https://github.com/pytorch/serve/blob/master/cpp/src/backends/handler/base_handler.hh), which defines the APIs for customizing a handler:
+ * [Initialize](https://github.com/pytorch/serve/blob/ba8f96a6e68ca7f63b55d72a21aad364334e4d8e/cpp/src/backends/handler/base_handler.hh#L34)
+ * [LoadModel](https://github.com/pytorch/serve/blob/ba8f96a6e68ca7f63b55d72a21aad364334e4d8e/cpp/src/backends/handler/base_handler.hh#L41)
+ * [Preprocess](https://github.com/pytorch/serve/blob/ba8f96a6e68ca7f63b55d72a21aad364334e4d8e/cpp/src/backends/handler/base_handler.hh#L43)
+ * [Inference](https://github.com/pytorch/serve/blob/ba8f96a6e68ca7f63b55d72a21aad364334e4d8e/cpp/src/backends/handler/base_handler.hh#L49)
+ * [Postprocess](https://github.com/pytorch/serve/blob/ba8f96a6e68ca7f63b55d72a21aad364334e4d8e/cpp/src/backends/handler/base_handler.hh#L55)
#### Usage
##### Using TorchScriptHandler
* set runtime as "LSP" in model archiver option [--runtime](https://github.com/pytorch/serve/tree/master/model-archiver#arguments)
* set handler as "TorchScriptHandler" in model archiver option [--handler](https://github.com/pytorch/serve/tree/master/model-archiver#arguments)
```
torch-model-archiver --model-name mnist_base --version 1.0 --serialized-file mnist_script.pt --handler TorchScriptHandler --runtime LSP
```
- Here is an [example](https://github.com/pytorch/serve/tree/cpp_backend/cpp/test/resources/examples/mnist/base_handler) of unzipped model mar file.
+ Here is an [example](https://github.com/pytorch/serve/tree/master/cpp/test/resources/examples/mnist/base_handler) of an unzipped model mar file.
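A `.mar` file is a plain zip archive, so its contents can be inspected directly. The sketch below builds a minimal stand-in for the archive layout by hand; the file names and manifest fields are assumptions based on the archiver flags above, not copied from the linked example.

```shell
# Sketch of the unzipped mnist_base.mar layout (names are illustrative):
# a MAR-INF/MANIFEST.json describing the model, plus the serialized model file.
mkdir -p mnist_base/MAR-INF
printf '%s\n' '{"runtime": "LSP", "model": {"modelName": "mnist_base", "serializedFile": "mnist_script.pt", "handler": "TorchScriptHandler", "modelVersion": "1.0"}}' > mnist_base/MAR-INF/MANIFEST.json
touch mnist_base/mnist_script.pt   # placeholder for the TorchScript model
find mnist_base -type f | sort
```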
##### Using Custom Handler
* build a customized handler shared library, for example the [Mnist handler](https://github.com/pytorch/serve/blob/cpp_backend/cpp/src/examples/image_classifier/mnist)
* set runtime as "LSP" in model archiver option [--runtime](https://github.com/pytorch/serve/tree/master/model-archiver#arguments)
* set handler as "libmnist_handler:MnistHandler" in model archiver option [--handler](https://github.com/pytorch/serve/tree/master/model-archiver#arguments)
```
torch-model-archiver --model-name mnist_handler --version 1.0 --serialized-file mnist_script.pt --handler libmnist_handler:MnistHandler --runtime LSP
```
- Here is an [example](https://github.com/pytorch/serve/tree/cpp_backend/cpp/test/resources/examples/mnist/mnist_handler) of unzipped model mar file.
+ Here is an [example](https://github.com/pytorch/serve/tree/master/cpp/test/resources/examples/mnist/mnist_handler) of an unzipped model mar file.
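As the archiver command above shows, the `--handler` value for a custom C++ handler packs two pieces of information into one string, `<shared-lib>:<class>`. A minimal sketch of that split (the colon convention is taken from the example above; the parsing itself is illustrative, not the backend's actual code):

```shell
# "libmnist_handler:MnistHandler" names the handler shared library
# (libmnist_handler -> libmnist_handler.so) and the C++ class inside it.
handler="libmnist_handler:MnistHandler"
lib="${handler%%:*}"     # shared library base name
class="${handler##*:}"   # handler class name
echo "$lib $class"
```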

#### Examples
We have created a couple of examples that can get you started with the C++ backend.
2 changes: 1 addition & 1 deletion examples/cpp/aot_inductor/llama2/README.md
@@ -20,7 +20,7 @@ We will need the handler .so file as well as the stories15M.so file containing t
cd ~/serve/examples/cpp/aot_inductor/llama2
cp ../../../../cpp/_build/test/resources/examples/aot_inductor/llama_handler/libllama_so_handler.so ./
```
- We will leave the model .so file in place and just use its [path](../../../../cpp/_build/test/resources/examples/aot_inductor/llama_handler/stories15M.so) in the next step.
+ We will leave the model .so file in place and just use its absolute path in the next step.

4. Create a [config.json](config.json) with the path of the downloaded model and tokenizer:

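Such a config.json might look like the following sketch; the key names `checkpoint_path` and `tokenizer_path` and the absolute paths are assumptions for illustration, to be replaced with the locations on your machine.

```json
{
  "checkpoint_path": "/home/user/serve/cpp/_build/test/resources/examples/aot_inductor/llama_handler/stories15M.so",
  "tokenizer_path": "/home/user/serve/examples/cpp/aot_inductor/llama2/tokenizer.bin"
}
```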
4 changes: 2 additions & 2 deletions examples/cpp/babyllama/README.md
@@ -1,6 +1,6 @@
## BabyLlama example

- This example is adapted from https://github.com/karpathy/llama2.c. The handler C++ source code for this examples can be found [here](../../../cpp/src/examples/babyllama/).
+ This example is adapted from https://github.com/karpathy/llama2.c. The handler C++ source code for this example can be found [here](./src/).

### Setup
1. Follow the instructions in [README.md](../../../cpp/README.md) to build the TorchServe C++ backend.
@@ -29,7 +29,7 @@ echo '{

5. Copy the handler .so file

- While building the C++ backend the `libbabyllama_handler.so` file is generated in the [babyllama_handler](../../../cpp/test/resources/examples/babyllama/babyllama_handler) folder.
+ While building the C++ backend, the `libbabyllama_handler.so` file is generated in the [babyllama_handler](../../../cpp/_build/test/resources/examples/babyllama/babyllama_handler/) folder.

```bash
cp ../../../cpp/_build/test/resources/examples/babyllama/babyllama_handler/libbabyllama_handler.so ./
4 changes: 2 additions & 2 deletions examples/cpp/llamacpp/README.md
@@ -1,7 +1,7 @@
## Llama.cpp example

This example uses [llama.cpp](https://github.com/ggerganov/llama.cpp) to deploy a Llama-2-7B-Chat model using the TorchServe C++ backend.
- The handler C++ source code for this examples can be found [here](../../../cpp/src/examples/llamacpp/).
+ The handler C++ source code for this example can be found [here](./src/).

### Setup
1. Follow the instructions in [README.md](../../../cpp/README.md) to build the TorchServe C++ backend.
@@ -28,7 +28,7 @@ echo '{

5. Copy the handler .so file

- While building the C++ backend the `libllamacpp_handler.so` file is generated in the [llamacpp_handler](../../../cpp/test/resources/examples/llamacpp/llamacpp_handler) folder.
+ While building the C++ backend, the `libllamacpp_handler.so` file is generated in the [llamacpp_handler](../../../cpp/_build/test/resources/examples/llamacpp/llamacpp_handler) folder.

```bash
cp ../../../cpp/_build/test/resources/examples/llamacpp/llamacpp_handler/libllamacpp_handler.so ./