Commit 31b7e71: Add PSPNet

mitmul committed Aug 10, 2017 (initial commit)

Showing 17 changed files with 25,617 additions and 0 deletions.
108 changes: 108 additions & 0 deletions .gitignore
@@ -0,0 +1,108 @@
*.jpg
*.png
.DS_Store
.sync-config.cons
weights/*.chainer
weights/*.caffemodel

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# pyenv
.python-version

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# dotenv
.env

# virtualenv
.venv
venv/
ENV/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
3 changes: 3 additions & 0 deletions .isort.cfg
@@ -0,0 +1,3 @@
[settings]
force_single_line = True
sections = FUTURE,STDLIB,DJANGO,THIRDPARTY,FIRSTPARTY,LOCALFOLDER
26 changes: 26 additions & 0 deletions .sync-config.cson
@@ -0,0 +1,26 @@
remote:
  host: "sakura227",
  # host: "sakura37",
  user: "shunta",
  path: "/mnt/sakuradata3/mitmul/codes/chainer-pspnet"
  # path: "/mnt/sakuradata10-striped/mitmul/codes/segmentation"

behaviour:
  uploadOnSave: true
  syncDownOnOpen: false
  forgetConsole: false
  autoHideConsole: true
  alwaysSyncAll: false

option:
  deleteFiles: true
  autoHideDelay: 1500
  exclude: [
    '.sync-config.cson'
    '.git'
    'node_modules'
    'tmp'
    'vendor'
  ]
  flags: 'avpur'
  shell: 'ssh'
66 changes: 66 additions & 0 deletions README.md
@@ -0,0 +1,66 @@
PSPNet
======

This is an unofficial implementation of Pyramid Scene Parsing Network (PSPNet) in Chainer.
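
The pyramid pooling module at the core of PSPNet average-pools the backbone feature map into several bin grids (1×1, 2×2, 3×3, and 6×6 in the paper) before upsampling and concatenating them with the original features. A minimal NumPy sketch of just the pooling step (the bin sizes follow the paper; the function name and shapes are illustrative, not this repository's API):

```python
import numpy as np

def pyramid_pool(feat, bin_sizes=(1, 2, 3, 6)):
    """Average-pool a (C, H, W) feature map into each b x b bin grid."""
    c, h, w = feat.shape
    pooled = []
    for b in bin_sizes:
        out = np.zeros((c, b, b), dtype=feat.dtype)
        for i in range(b):
            for j in range(b):
                ys, ye = i * h // b, (i + 1) * h // b
                xs, xe = j * w // b, (j + 1) * w // b
                out[:, i, j] = feat[:, ys:ye, xs:xe].mean(axis=(1, 2))
        pooled.append(out)
    return pooled

# Toy 2-channel 6x6 feature map.
feat = np.arange(2 * 6 * 6, dtype=np.float64).reshape(2, 6, 6)
outs = pyramid_pool(feat)
print([o.shape for o in outs])  # [(2, 1, 1), (2, 2, 2), (2, 3, 3), (2, 6, 6)]
```

In the real network each pooled grid is followed by a 1×1 convolution and bilinear upsampling back to the input resolution; this sketch covers only the multi-scale averaging idea.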

# Training

## Requirements

- Python 3.4.4+
- Chainer 3.0.0b1+
- ChainerMN master@distributed-batch-normalization
  - `pip install git+https://github.com/chainer/chainermn@distributed-batch-normalization`
- CuPy 2.0.0b1+
- ChainerCV 0.6.0+
- cffi 1.10.0+
- NumPy 1.12.0+
- mpi4py 2.0.0+

# Inference using converted weights

## Requirements

- Python 3.4.4+
- Chainer 3.0.0b1+
- ChainerCV 0.6.0+
- Matplotlib 2.0.0+

## 1. Download converted weights

```
$ bash download.sh
```

## 2. Run demo.py

```
$ python demo.py sample.jpg
```
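
The demo presumably renders the predicted per-pixel label map as a color image. As a hypothetical illustration of that final step only (the palette, class count, and shapes here are invented for the example and are not the dataset's real palette):

```python
import numpy as np

# Hypothetical 4-class RGB palette; real demos use the dataset's own palette.
palette = np.array(
    [[0, 0, 0], [128, 0, 0], [0, 128, 0], [0, 0, 128]], dtype=np.uint8)

def colorize(labels):
    """Map an (H, W) integer label map to an (H, W, 3) RGB image."""
    return palette[labels]  # NumPy fancy indexing broadcasts per pixel

labels = np.array([[0, 1], [2, 3]])  # toy 2x2 prediction
rgb = colorize(labels)
print(rgb.shape)  # (2, 2, 3)
```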


# Convert weights by yourself

**Caffe is not needed** to convert a `.caffemodel` into a Chainer model; the bundled `caffe_pb2.py` handles the parsing.

## Requirements

- Python 3.4.4+
- protobuf 3.2.0+
- Chainer 3.0.0b1+

## 1. Download the original weights

Please download the weights below from the author's repository:

- pspnet50\_ADE20K.caffemodel: [GoogleDrive](https://drive.google.com/open?id=0BzaU285cX7TCN1R3QnUwQ0hoMTA)
- pspnet101\_VOC2012.caffemodel: [GoogleDrive](https://drive.google.com/open?id=0BzaU285cX7TCNVhETE5vVUdMYk0)
- pspnet101\_cityscapes.caffemodel: [GoogleDrive](https://drive.google.com/open?id=0BzaU285cX7TCT1M3TmNfNjlUeEU)

**and then put them into the `weights` directory.**

## 2. Convert weights

```
$ python -m convert
```
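
Conceptually, a converter like this parses the `.caffemodel` protobuf and reshapes each layer's flat blob data into the corresponding Chainer parameter array. A hypothetical NumPy sketch of that reshape step in isolation (the function name and blob shape are assumptions for illustration, not this repository's code):

```python
import numpy as np

def blob_to_array(data, shape):
    """Reshape a flat Caffe blob (a list of floats plus a shape) into a
    parameter array, e.g. (out_ch, in_ch, kh, kw) for a convolution."""
    return np.asarray(data, dtype=np.float32).reshape(shape)

# A fake 2x1x3x3 convolution kernel blob, stored flat as in the protobuf.
w = blob_to_array(list(range(18)), (2, 1, 3, 3))
print(w.shape)  # (2, 1, 3, 3)
```

The actual conversion also has to match Caffe layer names to Chainer link attributes and handle batch-normalization statistics, which this sketch omits.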
6,532 changes: 6,532 additions & 0 deletions caffe_pb2.py

Large diffs are not rendered by default.
