
ReductionLayer #2089

Merged 1 commit on Jun 3, 2015

Conversation

jeffdonahue
Contributor

Performs a "reduction" operation (currently SUM, MEAN, ASUM for sum of absolute values, and SUMSQ for sum of squares) to turn a number of "tail" axes into a single scalar value. The MEAN operation, in combination with a loss_weight, is useful for creating custom losses that don't have an obnoxious amount of output. For example, this EuclideanLoss:

layer {
  type: "EuclideanLoss"
  bottom: "preds"
  bottom: "targets"
  top: "mean_squared_error"
  loss_weight: 1
}

...is equivalent to this Reduction:

layer {
  type: "Eltwise"
  bottom: "preds"
  bottom: "targets"
  top: "error"
  eltwise_param {
    operation: SUM
    coeff: 1
    coeff: -1
  }
}
layer {
  type: "Reduction"
  bottom: "error"
  top: "sum_squared_error_per_instance"
  reduction_param { operation: SUMSQ axis: 1 }
}
layer {
  type: "Reduction"
  bottom: "sum_squared_error_per_instance"
  top: "mean_squared_error"
  reduction_param { operation: MEAN axis: 0 coeff: 0.5 }
  loss_weight: 1
}

(It would be more efficient to do this as a single Reduction with SUMSQ and a certain coeff setting, but then you have to compute the batch size and everything is less pretty...)
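For illustration, the single-layer variant alluded to above might look like the following sketch. The coeff has to fold in both the 1/2 factor and 1/batch_size (a hypothetical batch size of 64 is assumed here), which is exactly why it is less pretty:

```protobuf
layer {
  type: "Reduction"
  bottom: "error"
  top: "mean_squared_error"
  # coeff = 0.5 / batch_size; 64 is a made-up batch size for this sketch
  reduction_param { operation: SUMSQ axis: 0 coeff: 0.0078125 }
  loss_weight: 1
}
```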

Eventually this should support reduction along inner axes as well (e.g., via an end_axis parameter), but that would make the implementation substantially more involved than it is now...
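To make the "tail" semantics concrete: reducing from axis k collapses axes k through the last into one value per remaining outer index. A stand-alone sketch of the SUM case (plain C++, not the actual layer code; the function name is invented for illustration):

```cpp
#include <cstddef>
#include <vector>

// Sum-reduce all "tail" axes starting at `axis`: a blob of shape
// (d0, ..., d_{axis-1}, d_axis, ..., d_{n-1}) becomes (d0, ..., d_{axis-1}),
// each output element summing over the d_axis * ... * d_{n-1} tail elements.
std::vector<float> reduce_sum_tail(const std::vector<float>& data,
                                   const std::vector<size_t>& shape,
                                   size_t axis) {
  size_t num = 1, dim = 1;
  for (size_t i = 0; i < axis; ++i) num *= shape[i];             // outer count
  for (size_t i = axis; i < shape.size(); ++i) dim *= shape[i];  // tail size
  std::vector<float> out(num, 0.f);
  for (size_t n = 0; n < num; ++n)
    for (size_t d = 0; d < dim; ++d)
      out[n] += data[n * dim + d];
  return out;
}
```

With shape (2, 3), the input {1,2,3,4,5,6} reduces to {6, 15} at axis 1, and to the single scalar 21 at axis 0.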

@jeffdonahue jeffdonahue changed the title Reduction Layer ReductionLayer Mar 10, 2015
@jeffdonahue jeffdonahue force-pushed the reduction-layer branch 2 times, most recently from 536cbc6 to 8995235 Compare March 13, 2015 01:29
myfavouritekk added a commit to myfavouritekk/caffe that referenced this pull request Mar 16, 2015
ReductionLayer

* jeffdonahue/reduction-layer:
  Add ReductionLayer to reduce any number of "tail" axes to a scalar value

Conflicts:
	src/caffe/proto/caffe.proto
  case ReductionParameter_ReductionOp_ASUM:
    *top_data = caffe_cpu_asum(dim_, bottom_data);
    break;
  case ReductionParameter_ReductionOp_SUM_OF_SQUARES:
Member

SUM_OF_SQUARES is a little unlike the other op names. SUMSQ fits in with ASUM. Your pick.

Contributor Author

fair enough -- I'll change it to SUMSQ

@shelhamer
Member

Once the comments are addressed this looks fine, although there could be a test for reducing over a tail that isn't all dimensions.

p.s. N-D blobs are nice. count() and CanonicalAxisIndex() make for more obvious code.
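For reference, the axis canonicalization mentioned here maps a possibly negative axis index into [0, num_axes), so that -1 names the last axis. A minimal stand-alone sketch of that behavior (not the actual Blob method):

```cpp
#include <stdexcept>

// Map an axis index in [-num_axes, num_axes) to its canonical form in
// [0, num_axes); e.g. with 4 axes, -1 canonicalizes to 3 (the last axis).
int canonical_axis_index(int axis, int num_axes) {
  if (axis < -num_axes || axis >= num_axes)
    throw std::out_of_range("axis index out of range");
  return axis < 0 ? axis + num_axes : axis;
}
```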

@jeffdonahue
Contributor Author

Thanks for all the reviews @shelhamer! Tests for non-zero axis added; will merge after Travis.

@jeffdonahue jeffdonahue force-pushed the reduction-layer branch 5 times, most recently from 5228452 to 823d055 Compare June 3, 2015 03:46
Currently implements operations SUM, MEAN, ASUM (sum of absolute
values), and SUMSQ (sum of squares)
@jeffdonahue
Contributor Author

Wow... especially thanks for the suggestion to test other axes. I had a bug in Backward for asum and sumsq (wasn't incrementing bottom_data). Should be good now...

@shelhamer
Member

Tests are more trustworthy than me. Good catch!

jeffdonahue added a commit that referenced this pull request Jun 3, 2015
@jeffdonahue jeffdonahue merged commit 0cc7e18 into BVLC:master Jun 3, 2015
@jeffdonahue jeffdonahue deleted the reduction-layer branch June 5, 2015 03:36