This repository has been archived by the owner on Aug 15, 2019. It is now read-only.

Add RMSProp and Adagrad optimizers. #102

Merged: 23 commits merged into tensorflow:master on Sep 16, 2017

Conversation

@mnottheone (Contributor) commented on Sep 6, 2017

Also resolved the texture leakage; the optimizers now inherit from optimizer.ts.


@mnottheone (Contributor, Author)

Indentation still needs fixing. Will do it soon.

@nsthorat (Contributor) commented on Sep 8, 2017

Are you using VS Code? If so, can you merge with master and make sure you have the clang-format extension installed? If you're not using VS Code, make sure you format with clang-format and clang-format.style = "Google" (I won't comment on the individual stylistic changes this will fix).

Overall looks great after these minor changes!


Reviewed 1 of 7 files at r1.
Review status: 1 of 7 files reviewed at latest revision, 5 unresolved discussions, some commit checks failed.


demos/model-builder/model-builder.ts, line 853 at r1 (raw file):

}

document.registerElement(ModelBuilder.prototype.is, ModelBuilder);

revert this file


src/adagrad_optimizer.ts, line 54 at r1 (raw file):

        const gradientSquare = math.multiply(gradient, gradient);
        const cache = math.add(oldCache, gradientSquare);
        const variable = math.scaledArrayAdd(this.c!,

remove ! everywhere.
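For reference, the rule this snippet implements is the standard Adagrad update: the cache accumulates the element-wise sum of squared gradients, and each parameter is moved by the gradient scaled by 1 / (sqrt(cache) + eps). A minimal standalone sketch in plain TypeScript (illustration only, not the PR's code; adagradStep and the plain-array setup are made up for this example):

    // Adagrad over plain arrays: cache holds the running sum of squared gradients.
    function adagradStep(
        weights: Float32Array, gradients: Float32Array, cache: Float32Array,
        learningRate: number, eps = 1e-8): void {
      for (let i = 0; i < weights.length; i++) {
        cache[i] += gradients[i] * gradients[i];
        weights[i] -= learningRate * gradients[i] / (Math.sqrt(cache[i]) + eps);
      }
    }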


src/rmsprop_optimizer.ts, line 20 at r1 (raw file):

import {TensorArrayMap, SummedTensorArrayMap} from './tensor_array_map';

export class RmspropOptimizer extends Optimizer {

call this RMSPropOptimizer


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

    super.beforeBatch(math, batchSize, runtime,
      activationArrayMap, gradientArrayMap);
      if (this.cache.size() === 0) {

can we rename this from cache? How about accumulatedGradients


src/rmsprop_optimizer.ts, line 55 at r1 (raw file):

        const oldCache = this.cache.get(node.output);
        const gradientSquare = math.multiply(gradient, gradient);
        const cache = math.scaledArrayAdd(this.g!, oldCache,

remove ! everywhere
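For reference, RMSProp differs from Adagrad only in how the cache is accumulated: instead of summing squared gradients, it keeps an exponentially discounted moving average (the this.g scalar in the quoted line appears to play the role of the decay rate). A rough standalone sketch, again in plain TypeScript with a made-up helper name rmspropStep:

    // RMSProp over plain arrays: cache is a discounted moving average of squared gradients.
    function rmspropStep(
        weights: Float32Array, gradients: Float32Array, cache: Float32Array,
        learningRate: number, decay = 0.9, eps = 1e-8): void {
      for (let i = 0; i < weights.length; i++) {
        cache[i] = decay * cache[i] + (1 - decay) * gradients[i] * gradients[i];
        weights[i] -= learningRate * gradients[i] / (Math.sqrt(cache[i]) + eps);
      }
    }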



@mnottheone (Contributor, Author)

src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, nsthorat (Nikhil Thorat) wrote…

can we rename this from cache? How about accumulatedGradients

We have already used accumulatedGradients in optimizer.ts's afterExample, so reusing it may be confusing. Here we are storing the sum of squared gradients, so we could use something like accumulatedSquaredGradients; for RMSProp the previous cache is discounted, so we could use accumulatedDiscountedSquaredGradients, or just accumulatedSquaredGradients. What do you think?
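To make the naming question concrete, the two accumulators differ only in their update rule (g_t is the gradient at step t, gamma the RMSProp decay rate):

    Adagrad:  cache_t = cache_{t-1} + g_t^2
    RMSProp:  cache_t = gamma * cache_{t-1} + (1 - gamma) * g_t^2

Both optimizers then divide the step by sqrt(cache_t) + eps, so "accumulated squared gradients" describes Adagrad exactly and RMSProp up to the discounting.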



@nsthorat changed the title from "resolved texture leakage and optimizers inherited from optimizer.ts" to "Add RMSProp and Adagrad optimizers." on Sep 10, 2017
@nsthorat (Contributor)

Let me know when you're auto formatting and I will merge this!


Reviewed 2 of 7 files at r1, 4 of 4 files at r2.
Review status: all files reviewed at latest revision, 3 unresolved discussions.


src/adagrad_optimizer.ts, line 22 at r2 (raw file):

export class AdagradOptimizer extends Optimizer {
  constructor(protected learningRate: number,
    protected momentum: number, specifiedVariableList?: Node[]) {

With clang-format this should be indented; can you make sure you've got it running? (Same for the other files.)

Sorry about that, everything else looks good after the code gets formatted


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, mnottheone (Aman Kumar Singh) wrote…

We have already used accumulatedGradients in optimizer.ts's afterExample, so reusing it may be confusing. Here we are storing the sum of squared gradients, so we could use something like accumulatedSquaredGradients; for RMSProp the previous cache is discounted, so we could use accumulatedDiscountedSquaredGradients, or just accumulatedSquaredGradients. What do you think?

Ah, got it. accumulatedSquaredGradients sounds good.



@nsthorat (Contributor)

Review status: all files reviewed at latest revision, 4 unresolved discussions, some commit checks failed.


src/adagrad_optimizer.ts, line 1 at r2 (raw file):

/* Copyright 2017 Google Inc. All Rights Reserved.

Quick update: this license format changed slightly, can you update it (after merging)?



@mnottheone (Contributor, Author)

Please review and let me know if anything is left. I guess I have the clang-format setting right now \m/ :)

@nsthorat (Contributor)

Haha, unfortunately the indentation is still off because clang-format isn't quite set up properly (it's not formatting with clang, it's formatting with the vscode settings). Do you have both the command-line clang-format tool and the Visual Studio Code extension installed?

Sorry to be pedantic, it's just important to have consistent-looking code :)


Review status: 0 of 8 files reviewed at latest revision, 4 unresolved discussions.



@nsthorat (Contributor)

Review status: 0 of 8 files reviewed at latest revision, 3 unresolved discussions.


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, nsthorat (Nikhil Thorat) wrote…

Ah, got it. accumulatedSquaredGradients sounds good.

Still need to change the name from cache :)



@mnottheone (Contributor, Author)

Hopefully it's okay now.



@nsthorat (Contributor)

One more tiny comment then I will merge, thanks for bearing with me :)


Reviewed 2 of 8 files at r3, 6 of 6 files at r4.
Review status: all files reviewed at latest revision, 1 unresolved discussion.


src/index.ts, line 1 at r4 (raw file):

/**

Open this file and hit save to format it, so we don't commit these old-formatter changes.



@nsthorat (Contributor)

Reviewed 1 of 1 files at r5.
Review status: all files reviewed at latest revision, all discussions resolved.



@nsthorat (Contributor)

:lgtm_strong:


Review status: all files reviewed at latest revision, all discussions resolved.



@nsthorat nsthorat merged commit 53afaa6 into tensorflow:master Sep 16, 2017
@mnottheone (Contributor, Author)

Thanks! :)

Next things to do:

  • Implement Adadelta, Adam, and Adamax.
  • Add documentation in the API docs for the new optimizers.

You can add or change anything you want.

@nsthorat (Contributor)

Awesome! BTW, I'm about to do a big directory structure change (optimizers in a directory, etc.). You can start working on those, but just know merging will be... fun ;)

We'll update the API docs with your new optimizers, since they're automatically generated.

mnottheone added a commit to mnottheone/deeplearnjs that referenced this pull request Dec 1, 2018
* Sgd+ momentum optimizer added

* momentum optimizer extended from sgd

* momentum optimizer used in model-builder

* cleanup

* -_-

* redundant code removed in momentumOptimizer

* tabs replaced with spaces

* space added

* space added

* resolved conflicts

* rmsprop and adagrad optimizer added

*  resolved texture leakage and optimizers inherited from optimizer.ts

* Merge branch 'master' into master

* Merge remote-tracking branch 'upstream/master'

* minor changes in optimizers

* Merge branch 'master' into master

*  resolved conflicts

* Merge branch 'master' of https://github.com/mnottheone/deeplearnjs

* formatting done

* license updated

* cache -> accumulatedSquaredGradients

* formatted

* formatted