Pso optimizer #38
Conversation
Overall I have no questions about the work itself, but I do have questions about the abstractions and the logic of interaction with the user. See the comments.
verbose=1,
learning_rate=1e-3,
eps=1e-6,
optimizer_mode='Adam',
In addition, we end up with two kinds of logic: one is optimizer_mode='Adam' and the other is optimizer_mode=PSO(). It feels like this should all be unified.
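
As a rough illustration of the point above, here is a minimal sketch of how the two call styles could be funneled into one code path. This is not the project's actual API: PSO here is a stand-in class, and OPTIMIZER_REGISTRY / resolve_optimizer are placeholder names introduced only for the example.

import torch

class PSO:
    # Stand-in for the PSO optimizer object added in this PR (illustration only).
    pass

# Hypothetical registry mapping the old string keys to optimizer classes.
OPTIMIZER_REGISTRY = {'Adam': torch.optim.Adam, 'PSO': PSO}

def resolve_optimizer(optimizer_mode):
    # Accept either the old string key or a new optimizer object,
    # so both conventions end up in the same branch of the solver.
    if isinstance(optimizer_mode, str):
        return OPTIMIZER_REGISTRY[optimizer_mode]
    return optimizer_mode

resolve_optimizer('Adam')   # old style: optimizer_mode='Adam'
resolve_optimizer(PSO())    # new style: optimizer_mode=PSO()
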
x = torch.linspace(0, 1, grid_res+1)
t = torch.linspace(0, t_end, grid_res+1)

grid = grid_format_prepare([x,t], mode=mode).float()
For a custom loss the user has to pull out mode and, apparently, also this function, which should be hidden from the user.
It seems we need to think about how to combine the new optimizer abstraction with the old optimizer + mode approach.
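
To make the concern concrete, here is a hedged sketch of one way to hide the call from the user. build_grid is a hypothetical wrapper, and the import path for grid_format_prepare is an assumption inferred from the diff above, not a confirmed module location.

import torch
# Assumed import path; the actual location of grid_format_prepare may differ.
from tedeous.input_preprocessing import grid_format_prepare

def build_grid(axes, mode='NN'):
    # Hypothetical helper: keeps grid_format_prepare and the mode string
    # internal, so a user writing a custom loss only supplies the axes.
    return grid_format_prepare(axes, mode=mode).float()

x = torch.linspace(0, 1, 11)
t = torch.linspace(0, 1, 11)
grid = build_grid([x, t])
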
* the data interface is refactored
We found out that the interface has a lot of technical debt and should be refactored; the PR was moved to #39.
1. PSO optimizer is implemented for all modes (see the generic sketch after this list).
2. Small refactoring of the Solver class (further changes are required).
3. PSO example and tutorial are added.
4. All docstrings updated to Google style.
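
For context on what the optimizer in item 1 does, below is a generic particle swarm update (the standard velocity/position rule), written as a self-contained toy example and not taken from this PR's implementation.

import torch

def pso_step(pos, vel, pbest, gbest, w=0.5, c1=1.5, c2=1.5):
    # One PSO iteration: velocity update with inertia w, cognitive pull c1
    # towards each particle's own best, social pull c2 towards the swarm best,
    # followed by a position update.
    r1, r2 = torch.rand_like(pos), torch.rand_like(pos)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + vel, vel

# Toy usage: minimize a quadratic with a 20-particle swarm in 2D.
loss = lambda p: (p ** 2).sum(dim=1)
pos = torch.randn(20, 2)
vel = torch.zeros_like(pos)
pbest = pos.clone()
gbest = pos[loss(pos).argmin()].clone()
for _ in range(100):
    pos, vel = pso_step(pos, vel, pbest, gbest)
    improved = loss(pos) < loss(pbest)
    pbest[improved] = pos[improved]
    gbest = pbest[loss(pbest).argmin()].clone()
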