
Require uninitialized optimizers for our learners #119

Merged · 2 commits · Jul 1, 2020

Commits on Jun 17, 2020

  1. Require uninitialized optimizers for our learners

    An initialized optimizer is a TensorFlow object, which (at least in
    graph mode in tf1) is not deepcopy-able. Even if we were able to
    deepcopy it, we probably wouldn't want to, since it contains state.
    Scikit-learn needs to be able to deepcopy an estimator's arguments so
    that it can create copies and derivatives of it.

    Instead, we require the uninitialized optimizer and its parameters to
    be passed to our learners separately. The learner can then initialize
    the optimizer as needed (a sketch of this pattern follows this commit
    entry).
    timokau committed Jun 17, 2020
    95a0ad2
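To illustrate the pattern this commit describes, here is a minimal sketch of a scikit-learn-compatible learner that stores an uninitialized optimizer class together with its parameters and only instantiates the optimizer when it is needed. The class name `MyLearner`, the parameter names, and the default optimizer are hypothetical; only the optimizer-handling pattern is taken from the commit message.

```python
# Minimal sketch (hypothetical names) of the pattern from the commit
# message: store the *uninitialized* optimizer class and its parameters,
# and only instantiate the optimizer when it is actually needed.
import tensorflow as tf
from sklearn.base import BaseEstimator


class MyLearner(BaseEstimator):
    def __init__(self, optimizer=tf.keras.optimizers.SGD, optimizer_kwargs=None):
        # A plain class object and a dict are deepcopy-able, so
        # scikit-learn's get_params()/clone() machinery keeps working.
        self.optimizer = optimizer
        self.optimizer_kwargs = optimizer_kwargs

    def fit(self, X, y):
        # Initialize the optimizer only here, where it is needed.
        self.optimizer_ = self.optimizer(**(self.optimizer_kwargs or {}))
        # ... build and train the model using self.optimizer_ ...
        return self
```

With this split, `sklearn.base.clone(MyLearner())` only has to deepcopy a class reference and a kwargs dict, never a live, stateful optimizer object.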

Commits on Jun 26, 2020

  1. Pin keras to <2.4

    Newer keras versions delegate to tf.keras and therefore need tf2. See
    https://github.com/keras-team/keras/releases/tag/2.4.0 (a sketch of
    the pin follows this commit entry).
    timokau committed Jun 26, 2020
    702bbcc
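As a hypothetical illustration of the pin (the actual file layout and package metadata in the repository are assumptions), the constraint could be expressed in a `setup.py` like this:

```python
# Hypothetical setup.py fragment; only the "keras<2.4" pin reflects the
# commit, the rest of the metadata is elided.
from setuptools import setup

setup(
    install_requires=[
        # keras >= 2.4 delegates to tf.keras and therefore requires tf2;
        # this project still relies on tf1 graph mode, so pin below 2.4.
        "keras<2.4",
    ],
)
```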