Conversation
Indentation is still left. Will do it soon.
Are you using vscode? If so, can you merge with master and make sure you have the clang-format extension installed? If not using vscode, make sure you format with clang-format and clang-format.style = "Google" (I won't comment on individual stylistic changes that this will fix). Overall looks great after these minor changes!

Reviewed 1 of 7 files at r1.

demos/model-builder/model-builder.ts, line 853 at r1: revert this file.

src/adagrad_optimizer.ts, line 54 at r1: remove ! everywhere.

src/rmsprop_optimizer.ts, line 20 at r1: call this RMSPropOptimizer.

src/rmsprop_optimizer.ts, line 36 at r1: can we rename this from cache? How about accumulatedGradients?

src/rmsprop_optimizer.ts, line 55 at r1: remove ! everywhere.

Comments from Reviewable
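For context, the "remove !" comments refer to TypeScript's non-null assertion operator. A minimal sketch (a hypothetical class, not the PR's actual optimizer code) of how initializing state eagerly and handling `undefined` explicitly makes the `!` unnecessary:

```typescript
// Hypothetical sketch: avoiding the non-null assertion operator (!).
// Instead of declaring a possibly-undefined map and writing cache!.get(...),
// initialize the map at declaration so its type is never undefined.
class SquaredGradientStore {
  // Initialized up front, so no `!` is needed at use sites.
  private accumulatedSquaredGradients = new Map<string, number>();

  accumulate(name: string, gradient: number): number {
    // Map.get can return undefined; default to 0 instead of asserting with `!`.
    const prev = this.accumulatedSquaredGradients.get(name) || 0;
    const next = prev + gradient * gradient;
    this.accumulatedSquaredGradients.set(name, next);
    return next;
  }
}
```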
src/rmsprop_optimizer.ts, line 36 at r1: Previously, nsthorat (Nikhil Thorat) wrote…

We have already used accumulatedGradients in optimizer.ts's afterExample, so it may be confusing. Here we are storing the sum of gradient squares, so we could use something like accumulatedSquaredGradients; for rmsprop we have a discounted previous cache, so we could use accumulatedDiscountedSquaredGradients, or just accumulatedSquaredGradients. What do you think?

Comments from Reviewable
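The quantity being named here is RMSProp's discounted running average of squared gradients. A standalone scalar sketch of the update rule (hypothetical function and parameter names; not the deeplearn.js API):

```typescript
// Sketch of one RMSProp step on a single scalar weight w with gradient g:
//   accumulatedSquaredGradients = gamma * prev + (1 - gamma) * g^2
//   w -= learningRate * g / (sqrt(accumulatedSquaredGradients) + eps)
function rmspropStep(
    w: number, g: number, accumulatedSquaredGradients: number,
    learningRate = 0.01, gamma = 0.9, eps = 1e-8): [number, number] {
  // Discounted running average of squared gradients ("cache" in the PR).
  const cache = gamma * accumulatedSquaredGradients + (1 - gamma) * g * g;
  // Per-parameter adaptive step: divide by the root of the running average.
  const wNew = w - learningRate * g / (Math.sqrt(cache) + eps);
  return [wNew, cache];
}
```

The discounting (`gamma`) is what distinguishes RMSProp from Adagrad, which simply keeps the undiscounted sum of squared gradients.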
Let me know when you're auto-formatting and I will merge this!

Reviewed 2 of 7 files at r1, 4 of 4 files at r2.

src/adagrad_optimizer.ts, line 22 at r2: with clang-format, this should be indented; can you make sure you've got it running (same for the other files)? Sorry about that, everything else looks good after the code gets formatted.

src/rmsprop_optimizer.ts, line 36 at r1: Previously, mnottheone (Aman Kumar Singh) wrote…

Ah, got it. accumulatedSquaredGradients sounds good.
Review status: all files reviewed at latest revision, 4 unresolved discussions, some commit checks failed.

src/adagrad_optimizer.ts, line 1 at r2: quick update, this license format slightly changed; can you update it (after merging)?
Please review and let me know if anything is left. I guess I now have the clang-format settings right \m/ :)
Haha, unfortunately this indentation is still off because clang-format isn't quite set up properly (it's not formatting with clang, it's formatting with the vscode settings). Do you have both the command-line clang-format tool and the visual studio code extension installed? Sorry to be pedantic, it's just important to have consistent-looking code :)

Review status: 0 of 8 files reviewed at latest revision, 4 unresolved discussions.
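For reference, a minimal setup matching the request above (an assumed configuration fragment, not a file from this repo): install the command-line clang-format binary, install the vscode Clang-Format extension, and point the extension at the Google style in `.vscode/settings.json`:

```json
{
  "clang-format.style": "Google",
  "editor.formatOnSave": true
}
```

The same formatting can be applied from the command line with `clang-format -i -style=Google src/*.ts` (clang-format handles JavaScript/TypeScript as well as C++).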
Review status: 0 of 8 files reviewed at latest revision, 3 unresolved discussions.

src/rmsprop_optimizer.ts, line 36 at r1: Previously, nsthorat (Nikhil Thorat) wrote…

Still need to change from cache :)
Hopefully okay now.
One more tiny comment, then I will merge; thanks for bearing with me :)

Reviewed 2 of 8 files at r3, 6 of 6 files at r4.

src/index.ts, line 1 at r4: open this file and hit save to format, so we don't commit these old formatter changes.
Reviewed 1 of 1 files at r5.
Review status: all files reviewed at latest revision, all discussions resolved.
Thanks! :) Next things to do:
You can add or change anything you want.
Awesome! BTW, I'm about to do a big directory structure change (optimizers in a directory, etc.). You can start working on those, but just know merging will be... fun ;) We'll update the API docs with your new optimizers, since they're automatically generated.
* Sgd + momentum optimizer added
* momentum optimizer extended from sgd
* momentum optimizer used in model-builder
* cleanup
* -_-
* redundant code removed in momentumOptimizer
* tabs replaced with spaces
* space added
* space added
* resolved conflicts
* rmsprop and adagrad optimizer added
* resolved texture leakage and optimizers inherited from optimizer.ts
* Merge branch 'master' into master
* Merge remote-tracking branch 'upstream/master'
* minor changes in optimizers
* Merge branch 'master' into master
* resolved conflicts
* Merge branch 'master' of https://github.com/mnottheone/deeplearnjs
* formatting done
* license updated
* cache -> accumulatedSquaredGradients
* formatted
* formatted
Also resolved texture leakage; the optimizers now inherit from optimizer.ts.