Reuse data memory if not propagate down #86
Merged
Make a dry run of the forward pass. If a layer does not propagate down (or during the test phase), reuse its top blob's data memory.
Change the keys in the slot array from the blob address to `blob_name + "_data"` (or `"_diff"`).
Add an option `optimize_mem` to `NetParameter` that controls the memory optimization strategy. It has three enum values:

- `NO_OPTIM`: no memory optimization
- `TRAIN_ONLY`: optimize forward and backward blob memory only during the training phase
- `ALL_OPTIM`: optimize for both the training and test phases

It is set to `TRAIN_ONLY` by default. Note that memory optimization may corrupt the data/diff of internal blobs, so you may wish to use `NO_OPTIM` for feature extraction.
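As a sketch, the option might be set in a net prototxt like this (the field and enum names come from the description above; the surrounding fields and exact placement in the prototxt are assumptions):

```protobuf
# Hypothetical net definition fragment using the new option.
name: "example_net"
optimize_mem: NO_OPTIM  # NO_OPTIM | TRAIN_ONLY | ALL_OPTIM
```

Here `NO_OPTIM` is chosen explicitly because the net is used for feature extraction, where intermediate blob data must stay intact; omitting the field would give the default `TRAIN_ONLY`.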