Commit
Merge branch 'staging_0_1_1' into issue_57
maaquib authored May 2, 2020
2 parents 5f279b1 + 71681cb commit 0b66894
Showing 32 changed files with 747 additions and 435 deletions.
50 changes: 50 additions & 0 deletions ISSUE_TEMPLATE/bug_template.md
@@ -0,0 +1,50 @@
---
name: "\U0001F41B Bug report"
about: Create a report to help us improve

---

Your issue may already be reported!
Please search on the [issue tracker](https://github.com/pytorch/serve/issues) before creating one.

## Context
<!--- How has this issue affected you? What are you trying to accomplish? -->
<!--- Providing context helps us come up with a solution that is most useful in the real world -->
* torchserve version:
* torch version:
* torchvision version [if any]:
* torchtext version [if any]:
* torchaudio version [if any]:
* java version:
* Operating System and version:

## Expected Behavior
<!--- If you're describing a bug, tell us what should happen -->

## Current Behavior
<!--- If describing a bug, tell us what happens instead of the expected behavior -->

## Possible Solution
<!--- Not obligatory, but suggest a fix/reason for the bug -->

## Steps to Reproduce
<!--- Provide a link to a live example, or an unambiguous set of steps to -->
<!--- reproduce this bug. Include code to reproduce, if relevant -->
1.
2.
...

## Failure Logs [if any]
<!--- Provide any relevant log snippets or files here. -->

## Your Environment
<!--- Include as many relevant details about the environment you experienced the bug in -->
* Installed from source? [yes/no]:
* Are you planning to deploy it using a Docker container? [yes/no]:
* Is it a CPU or GPU environment?:
* Using a default/custom handler? [If possible, upload/share the custom handler/model]:
* What kind of model is it, e.g. vision, text, audio?:
* Are you planning to use local models from the model store or a public URL, e.g. an S3 bucket?
[If a public URL, provide the link.]:
* Provide config.properties, logs [ts.log] and parameters used for model registration/update APIs:
* Link to your project [if any]:
9 changes: 9 additions & 0 deletions ISSUE_TEMPLATE/doc_template.md
@@ -0,0 +1,9 @@
---
name: "\U0001F4DA Documentation"
about: Report an issue related to https://pytorch.org/serve/

---

## 📚 Documentation

<!-- A clear and concise description of what content in https://pytorch.org/serve/ is an issue. If this has to do with the general https://pytorch.org website, please file an issue at https://github.com/pytorch/pytorch.github.io/issues/new/choose instead. If this has to do with https://pytorch.org/tutorials, please file an issue at https://github.com/pytorch/tutorials/issues/new -->
20 changes: 20 additions & 0 deletions ISSUE_TEMPLATE/feature_template.md
@@ -0,0 +1,20 @@
---
name: "\U0001F680 Feature request"
about: Suggest an idea for this project

---

<!--
Thank you for suggesting an idea to improve the TorchServe model serving experience.
Please fill in as much of the template below as you're able.
-->

## Is your feature request related to a problem? Please describe.
<!-- Please describe the problem you are trying to solve. -->

## Describe the solution
<!-- Please describe the desired behavior. -->

## Describe alternative solutions
<!-- Please describe alternative solutions or features you have considered. -->
72 changes: 51 additions & 21 deletions README.md
@@ -17,48 +17,62 @@ Conda instructions are provided in more detail, but you may also use `pip` and `
**Note:** Java 11 is required. Instructions for installing Java 11 for Ubuntu or macOS are provided in the [Install with Conda](#install-with-conda) section.

### Install with pip

To use `pip` to install TorchServe and the model archiver:

```
pip install torch torchtext torchvision sentencepiece
```bash
pip install torch torchtext torchvision sentencepiece psutil future
pip install torchserve torch-model-archiver
```

### Install with Conda
_Ubuntu_

#### Ubuntu

1. Install Java 11

```bash
sudo apt-get install openjdk-11-jdk
```

1. Install Conda (https://docs.conda.io/projects/conda/en/latest/user-guide/install/linux.html)
1. Create an environment and install torchserve and torch-model-archiver
For CPU

```bash
conda create --name torchserve torchserve torch-model-archiver pytorch torchtext torchvision -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision -c pytorch -c powerai
```

For GPU

```bash
conda create --name torchserve torchserve torch-model-archiver pytorch torchtext torchvision cudatoolkit=10.1 -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision cudatoolkit=10.1 -c pytorch -c powerai
```

1. Activate the environment

```bash
source activate torchserve
```

_macOS_
#### macOS

1. Install Java 11

```bash
brew tap AdoptOpenJDK/openjdk
brew cask install adoptopenjdk11
```

1. Install Conda (https://docs.conda.io/projects/conda/en/latest/user-guide/install/macos.html)
1. Create an environment and install torchserve and torch-model-archiver

```bash
conda create --name torchserve torchserve torch-model-archiver pytorch torchtext torchvision -c pytorch -c powerai
conda create --name torchserve torchserve torch-model-archiver psutil future pytorch sentencepiece torchtext torchvision -c pytorch -c powerai
```

1. Activate the environment

```bash
source activate torchserve
```
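Whichever install route you take, a quick sanity check that both CLIs landed on your `PATH` can save debugging later. This is a sketch, not part of the official instructions; the exact version strings will vary:

```shell
# Check that the torchserve and torch-model-archiver commands are available.
# CHECKED counts how many commands were probed; MISSING counts absent ones.
CHECKED=0
MISSING=0
for cmd in torchserve torch-model-archiver; do
  CHECKED=$((CHECKED + 1))
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: NOT FOUND"
    MISSING=$((MISSING + 1))
  fi
done
echo "checked $CHECKED commands, $MISSING missing"
```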
Expand All @@ -68,18 +82,25 @@ Now you are ready to [package and serve models with TorchServe](#serve-a-model).
### Install TorchServe for development

If you plan to develop with TorchServe and change some of the source code, you must install it from source code.
First, clone the repo with:

```bash
git clone https://github.com/pytorch/serve
cd serve
```
1. Install dependencies

Then make your changes executable with this command:
```bash
pip install psutil future
```

```bash
pip install -e .
```
1. Clone the repo

```bash
git clone https://github.com/pytorch/serve
cd serve
```

1. Make your changes executable

```bash
pip install -e .
```

* To develop with torch-model-archiver:

@@ -90,6 +111,7 @@

* To upgrade TorchServe or model archiver from source code and make changes executable, run:

For CPU, run the following command:
```bash
pip install -U -e .
```
@@ -103,8 +125,8 @@ This section shows a simple example of serving a model with TorchServe. To compl
To run this example, clone the TorchServe repository and navigate to the root of the repository:

```bash
cd ~
git clone https://github.com/pytorch/serve.git
cd serve
```

Then run the following steps from the root of the repository.
@@ -117,8 +139,8 @@ You can also create model stores to store your archived models.
1. Create a directory to store your models.

```bash
mkdir ~/model_store
cd ~/model_store
mkdir ./model_store
cd ./model_store
```

1. Download a trained model.
@@ -130,7 +152,7 @@ You can also create model stores to store your archived models.
1. Archive the model by using the model archiver. The `extra-files` param uses a file from the `TorchServe` repo, so update the path if necessary.

```bash
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ~/serve/examples/image_classifier/densenet_161/model.py --serialized-file ~/model_store/densenet161-8d451a50.pth --extra-files ~/serve/examples/image_classifier/index_to_name.json --handler image_classifier
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file ./model_store/densenet161-8d451a50.pth --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
```

For more information about the model archiver, see [Torch Model archiver for TorchServe](model-archiver/README.md)
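One gotcha worth flagging (an assumption based on `torch-model-archiver`'s default of writing the `.mar` to the current working directory): make sure `densenet161.mar` ends up inside the model store before serving. A minimal sketch:

```shell
# torch-model-archiver writes densenet161.mar to the current directory by
# default; ensure it sits inside ./model_store so the serve step can find it.
mkdir -p ./model_store
if [ -f ./densenet161.mar ]; then
  mv ./densenet161.mar ./model_store/
fi
MODEL_STORE_READY=$([ -d ./model_store ] && echo yes || echo no)
echo "model store ready: $MODEL_STORE_READY"
```

Alternatively, the archiver's `--export-path` option can write the `.mar` into the store directly.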
@@ -140,7 +162,7 @@ For more information about the model archiver, see [Torch Model archiver for Tor
After you archive and store the model, use the `torchserve` command to serve the model.

```bash
torchserve --start --model-store ~/model_store --models ~/model_store/densenet161.mar
torchserve --start --model-store ./model_store --models ./model_store/densenet161.mar
```

After you execute the `torchserve` command above, TorchServe runs on your host, listening for inference requests.
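With the server up, an inference request is a plain HTTP POST against the prediction endpoint. The sketch below composes the endpoint URL; 8080 is TorchServe's default inference port, and the model name matches the archive created above:

```shell
# Compose the prediction endpoint URL for the densenet161 model.
# 8080 is TorchServe's default inference port.
HOST="http://localhost:8080"
MODEL="densenet161"
PRED_URL="$HOST/predictions/$MODEL"
echo "$PRED_URL"
# With the server running, you would then send an image for classification:
#   curl -X POST "$PRED_URL" -T kitten.jpg
```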
@@ -249,6 +271,14 @@ To run your TorchServe Docker image and start TorchServe inside the container wi
```bash
./start.sh
```
For GPU, run the following command:
```bash
./start.sh --gpu
```
For GPU with specific GPU device IDs, run the following command:
```bash
./start.sh --gpu_devices 1,2,3
```
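The flags above can be wrapped in a small launcher sketch. The flag names are taken from the commands shown here; the assumption that `--gpu_devices` supersedes a bare `--gpu` is illustrative, not confirmed by `start.sh` itself:

```shell
# Pick start.sh arguments based on the target environment.
# GPU_DEVICES empty is treated here as "use all GPUs".
USE_GPU=true
GPU_DEVICES="1,2,3"
ARGS=""
if [ "$USE_GPU" = true ]; then
  if [ -n "$GPU_DEVICES" ]; then
    ARGS="--gpu_devices $GPU_DEVICES"
  else
    ARGS="--gpu"
  fi
fi
echo "./start.sh $ARGS"
```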

## Learn More

16 changes: 2 additions & 14 deletions benchmarks/README.md
@@ -7,20 +7,8 @@ The benchmarks measure the performance of TorchServe on various models and bench
### Ubuntu

The script is mainly intended to run on an Ubuntu EC2 instance. For this reason, we have provided an `install_dependencies.sh` script to install everything needed to execute the benchmark on this environment. All you need to do is run this file and clone the TorchServe repo.

While installing JMeter through brew, the `install_dependencies.sh` script asks for the following command-line input.
```bash
Installing JMeter through Brew
+ yes ''
+ brew update
==> Select the Linuxbrew installation directory
- Enter your password to install to /home/linuxbrew/.linuxbrew (recommended)
- Press Control-D to install to /home/ubuntu/.linuxbrew
- Press Control-C to cancel installation
[sudo] password for ubuntu:
```

Here `Press Control-D to install to /home/ubuntu/.linuxbrew` as the `ubuntu` user on EC2 node has password-less sudo access.
On a CPU-based instance, use `install_dependencies.sh`.
On a GPU-based instance, use `install_dependencies.sh True`.

### MacOS

