From 2fd2e0c8e7788f292d15bf5fd4e57d9cf0525963 Mon Sep 17 00:00:00 2001 From: "Anthony D. Blaom" Date: Wed, 12 Jun 2024 17:52:54 +1200 Subject: [PATCH 1/6] minor doc correction --- docs/src/interface/Summary.md | 10 ++++------ 1 file changed, 4 insertions(+), 6 deletions(-) diff --git a/docs/src/interface/Summary.md b/docs/src/interface/Summary.md index cc607e53..911146ce 100644 --- a/docs/src/interface/Summary.md +++ b/docs/src/interface/Summary.md @@ -1,11 +1,9 @@ ## Models -MLJFlux provides four model types, for use with input features `X` and -targets `y` of the [scientific -type](https://alan-turing-institute.github.io/MLJScientificTypes.jl/dev/) -indicated in the table below. The parameters `n_in`, `n_out` and `n_channels` -refer to information passed to the builder, as described under -[Defining Custom Builders](@ref). +MLJFlux provides the model types below, for use with input features `X` and targets `y` of +the [scientific type](https://alan-turing-institute.github.io/MLJScientificTypes.jl/dev/) +indicated in the table below. The parameters `n_in`, `n_out` and `n_channels` refer to +information passed to the builder, as described under [Defining Custom Builders](@ref). | Model Type | Prediction type | `scitype(X) <: _` | `scitype(y) <: _` | |---------------------------------------------|-----------------|-----------------------------------------------------|-------------------------------------------------| From 37453bcc6255ea594bd3ebb0c6a26529bf0eb407 Mon Sep 17 00:00:00 2001 From: "Anthony D. Blaom" Date: Wed, 12 Jun 2024 17:54:00 +1200 Subject: [PATCH 2/6] fix broken link --- docs/src/interface/Summary.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/src/interface/Summary.md b/docs/src/interface/Summary.md index 911146ce..7a436232 100644 --- a/docs/src/interface/Summary.md +++ b/docs/src/interface/Summary.md @@ -1,7 +1,7 @@ ## Models MLJFlux provides the model types below, for use with input features `X` and targets `y` of -the [scientific type](https://alan-turing-institute.github.io/MLJScientificTypes.jl/dev/) +the [scientific type](https://juliaai.github.io/ScientificTypes.jl/dev/) indicated in the table below. The parameters `n_in`, `n_out` and `n_channels` refer to information passed to the builder, as described under [Defining Custom Builders](@ref). From dacdf97c70a73d725572dcf6d442854662669c54 Mon Sep 17 00:00:00 2001 From: "Anthony D. Blaom" Date: Wed, 12 Jun 2024 18:10:35 +1200 Subject: [PATCH 3/6] clarify rows versus columns in docs --- docs/src/interface/Custom Builders.md | 7 +++-- docs/src/interface/Summary.md | 39 ++++++++++++++------------- 2 files changed, 23 insertions(+), 23 deletions(-) diff --git a/docs/src/interface/Custom Builders.md b/docs/src/interface/Custom Builders.md index 42543ed2..cc9fd698 100644 --- a/docs/src/interface/Custom Builders.md +++ b/docs/src/interface/Custom Builders.md @@ -23,7 +23,7 @@ end ``` Note here that `n_in` and `n_out` depend on the size of the data (see -[Table 1](@ref Models). +[Table 1](@ref Models)). For a concrete image classification example, see [Using MLJ to classifiy the MNIST image dataset](@ref). 
@@ -41,9 +41,8 @@ This method must return a `Flux.Chain` instance, `chain`, subject to the following conditions: - `chain(x)` must make sense: - - for any `x <: Array{<:AbstractFloat, 2}` of size `(n_in, - batch_size)` where `batch_size` is any integer (for use with one - of the first three model types); or + - for any `x <: Array{<:AbstractFloat, 2}` of size `(n_in, batch_size)` where + `batch_size` is any integer (for all models except `ImageClassifier`); or - for any `x <: Array{<:Float32, 4}` of size `(W, H, n_channels, batch_size)`, where `(W, H) = n_in`, `n_channels` is 1 or 3, and `batch_size` is any integer (for use with `ImageClassifier`) diff --git a/docs/src/interface/Summary.md b/docs/src/interface/Summary.md index 7a436232..6f5f0aec 100644 --- a/docs/src/interface/Summary.md +++ b/docs/src/interface/Summary.md @@ -5,13 +5,13 @@ the [scientific type](https://juliaai.github.io/ScientificTypes.jl/dev/) indicated in the table below. The parameters `n_in`, `n_out` and `n_channels` refer to information passed to the builder, as described under [Defining Custom Builders](@ref). -| Model Type | Prediction type | `scitype(X) <: _` | `scitype(y) <: _` | -|---------------------------------------------|-----------------|-----------------------------------------------------|-------------------------------------------------| -| [`NeuralNetworkRegressor`](@ref) | `Deterministic` | `Table(Continuous)` with `n_in` columns | `AbstractVector{<:Continuous)` (`n_out = 1`) | -| [`MultitargetNeuralNetworkRegressor`](@ref) | `Deterministic` | `Table(Continuous)` with `n_in` columns | `<: Table(Continuous)` with `n_out` columns | -| [`NeuralNetworkClassifier`](@ref) | `Probabilistic` | `<:Table(Continuous)` with `n_in` columns | `AbstractVector{<:Finite}` with `n_out` classes | -| [`NeuralNetworkBinaryClassifier`](@ref) | `Probabilistic` | `<:Table(Continuous)` with `n_in` columns | `AbstractVector{<:Finite{2}}` (`n_out = 2`) | -| [`ImageClassifier`](@ref) | `Probabilistic` | `AbstractVector(<:Image{W,H})` with `n_in = (W, H)` | `AbstractVector{<:Finite}` with `n_out` classes | +| Model Type | Prediction type | `scitype(X) <: _` | `scitype(y) <: _` | +|---------------------------------------------|-----------------|-------------------------------------------------------------------------|-------------------------------------------------| +| [`NeuralNetworkRegressor`](@ref) | `Deterministic` | `AbstractMatrix{Continuous}` or `Table(Continuous)` with `n_in` columns | `AbstractVector{<:Continuous)` (`n_out = 1`) | +| [`MultitargetNeuralNetworkRegressor`](@ref) | `Deterministic` | `AbstractMatrix{Continuous}` or `Table(Continuous)` with `n_in` columns | `<: Table(Continuous)` with `n_out` columns | +| [`NeuralNetworkClassifier`](@ref) | `Probabilistic` | `AbstractMatrix{Continuous}` or `Table(Continuous)` with `n_in` columns | `AbstractVector{<:Finite}` with `n_out` classes | +| [`NeuralNetworkBinaryClassifier`](@ref) | `Probabilistic` | `AbstractMatrix{Continuous}` or `Table(Continuous)` with `n_in` columns | `AbstractVector{<:Finite{2}}` (but `n_out = 1`) | +| [`ImageClassifier`](@ref) | `Probabilistic` | `AbstractVector(<:Image{W,H})` with `n_in = (W, H)` | `AbstractVector{<:Finite}` with `n_out` classes | ```@raw html @@ -31,23 +31,24 @@ particular, an MLJ model does not store learned parameters. ``` ```@raw html -
Dealing with non-tabular input +
Are observations rows or columns?
 ```

-Any `AbstractMatrix{<:AbstractFloat}` object `Xmat` can be forced to
-have scitype `Table(Continuous)` by replacing it with ` X =
-MLJ.table(Xmat)`. Furthermore, this wrapping, and subsequent
-unwrapping under the hood, will compile to a no-op. At present this
-includes support for sparse matrix data, but the implementation has
-not been optimized for sparse data at this time and so should be used
-with caution.
-
-Instructions for coercing common image formats into some
-`AbstractVector{<:Image}` are
-[here](https://juliaai.github.io/ScientificTypes.jl/dev/#Type-coercion-for-image-data).
+
+In MLJ the convention for two-dimensional data (tables and matrices) is
+**rows=observations**. For matrices, Flux has the opposite convention. If your data is a
+matrix whose columns index the observations, the optimal solution is to present the
+`adjoint` or `transpose` of your matrix to MLJFlux models. Otherwise, you can use the
+matrix as is, or make a one-time copy with `permutedims` and then present its `adjoint`
+or `transpose`, which is again optimal for MLJFlux training.
+
 ```@raw html
``` +Instructions for coercing common image formats into some `AbstractVector{<:Image}` are +[here](https://juliaai.github.io/ScientificTypes.jl/dev/#Type-coercion-for-image-data). + + ```@raw html
Fitting and warm restarts
 ```

From 3535d12dcf203306ecbc91144a61c716737679ec Mon Sep 17 00:00:00 2001
From: "Anthony D. Blaom" 
Date: Wed, 12 Jun 2024 18:17:55 +1200
Subject: [PATCH 4/6] address #236 dep warning

---
 src/types.jl | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/src/types.jl b/src/types.jl
index b6e9af9b..e7bb880d 100644
--- a/src/types.jl
+++ b/src/types.jl
@@ -950,7 +950,7 @@ We arrange for standardization of the the target by wrapping our model in
 model in a pipeline:

 ```julia
-pipe = Standardizer |> TransformedTargetModel(model, target=Standardizer)
+pipe = Standardizer |> TransformedTargetModel(model, transformer=Standardizer)
 ```

 If we fit with a high verbosity (>1), we will see the losses during training. We can also
@@ -1166,7 +1166,7 @@ We will arrange for standardization of the the target by wrapping our model in
 model in a pipeline:

 ```julia
-pipe = Standardizer |> TransformedTargetModel(model, target=Standardizer)
+pipe = Standardizer |> TransformedTargetModel(model, transformer=Standardizer)
 ```

 If we fit with a high verbosity (>1), we will see the losses during training. We can also

From a4e0d4619d46d759b544fc1495d68d1fb4c14b09 Mon Sep 17 00:00:00 2001
From: "Anthony D. Blaom" 
Date: Wed, 12 Jun 2024 18:21:20 +1200
Subject: [PATCH 5/6] doc improvement

---
 docs/src/index.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/src/index.md b/docs/src/index.md
index aba818d5..b24a0537 100644
--- a/docs/src/index.md
+++ b/docs/src/index.md
@@ -72,8 +72,9 @@ As in the example above, any MLJFlux model has a `builder` hyperparameter, an ob
 encoding instructions for creating a neural network given the data that the model
 eventually sees (e.g., the number of classes in a classification problem). While each
 MLJ model has a simple default builder, users may need to define custom builders to get
-optimal results, and this will require familiarity with the [Flux
-API](https://fluxml.ai/Flux.jl/stable/) for defining a neural network chain.
+optimal results (see [Defining Custom Builders](@ref)), and this will require familiarity
+with the [Flux API](https://fluxml.ai/Flux.jl/stable/) for defining a neural network
+chain.

 ## Flux or MLJFlux?

From 3427fcbff880cd5949be7d7d7bd458be32f61132 Mon Sep 17 00:00:00 2001
From: "Anthony D. 
Blaom" Date: Thu, 13 Jun 2024 08:26:09 +1200 Subject: [PATCH 6/6] fix readme; get rid of redundant appearances of colname -> true --- README.md | 15 +- .../architecture_search/notebook.ipynb | 10 +- .../architecture_search/notebook.jl | 2 +- .../architecture_search/notebook.md | 2 +- .../notebook.unexecuted.ipynb | 2 +- .../comparison/notebook.ipynb | 8 +- .../common_workflows/comparison/notebook.jl | 2 +- .../common_workflows/comparison/notebook.md | 2 +- .../comparison/notebook.unexecuted.ipynb | 2 +- .../composition/notebook.ipynb | 18 +- .../common_workflows/composition/notebook.jl | 2 +- .../common_workflows/composition/notebook.md | 4 +- .../composition/notebook.unexecuted.ipynb | 4 +- .../early_stopping/notebook.ipynb | 148 +++--- .../early_stopping/notebook.jl | 4 +- .../early_stopping/notebook.md | 4 +- .../early_stopping/notebook.unexecuted.ipynb | 4 +- .../hyperparameter_tuning/notebook.ipynb | 444 ++++++++++++++++++ .../hyperparameter_tuning/notebook.jl | 2 +- .../hyperparameter_tuning/notebook.md | 2 +- .../notebook.unexecuted.ipynb | 2 +- .../incremental_training/notebook.ipynb | 18 +- .../incremental_training/notebook.jl | 2 +- .../incremental_training/notebook.md | 9 +- .../notebook.unexecuted.ipynb | 14 +- .../live_training/notebook.jl | 2 +- .../live_training/notebook.md | 4 +- .../live_training/notebook.unexecuted.ipynb | 4 +- readme_figure.png | Bin 17581 -> 23421 bytes 29 files changed, 596 insertions(+), 140 deletions(-) create mode 100644 docs/src/common_workflows/hyperparameter_tuning/notebook.ipynb diff --git a/README.md b/README.md index bc4c72a2..8702c8e4 100644 --- a/README.md +++ b/README.md @@ -34,7 +34,7 @@ Grab some data and split into features and target: ```julia iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To optmise for GPUs ``` @@ -83,17 +83,16 @@ Train the wrapped model: julia> mach = machine(iterated_model, X, y) julia> fit!(mach) -[ Info: Training machine(ProbabilisticIteratedModel(model = NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …), …), …). -[ Info: No iteration parameter specified. Using `iteration_parameter=:(epochs)`. -[ Info: final loss: 0.10431026246922499 -[ Info: final training loss: 0.046286315 -[ Info: Stop triggered by Patience(4) stopping criterion. -[ Info: Total of 349 iterations. +[ Info: No iteration parameter specified. Using `iteration_parameter=:(epochs)`. +[ Info: final loss: 0.1284184007796247 +[ Info: final training loss: 0.055630706 +[ Info: Stop triggered by NumberSinceBest(5) stopping criterion. +[ Info: Total of 811 iterations. 
``` Inspect results: ```julia-repl -julia> plot(train_losses, label="Validation Loss", linewidth=2, size=(800,400)) +julia> plot(train_losses, label="Training Loss") julia> plot!(validation_losses, label="Validation Loss", linewidth=2, size=(800,400)) ``` diff --git a/docs/src/common_workflows/architecture_search/notebook.ipynb b/docs/src/common_workflows/architecture_search/notebook.ipynb index 958109de..286491d1 100644 --- a/docs/src/common_workflows/architecture_search/notebook.ipynb +++ b/docs/src/common_workflows/architecture_search/notebook.ipynb @@ -95,7 +95,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng = 123);\n", + "y, X = unpack(iris, ==(:Species), rng = 123);\n", "X = Float32.(X); # To be compatible with type of network network parameters\n", "first(X, 5)" ], @@ -130,7 +130,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (1, 1, 1), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (1, 1, 1), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 4 @@ -306,7 +306,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (21, 57, 25), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (45, 49, 21), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 8 @@ -341,9 +341,9 @@ { "output_type": "execute_result", "data": { - "text/plain": "\u001b[1m10×2 DataFrame\u001b[0m\n\u001b[1m Row \u001b[0m│\u001b[1m mlp \u001b[0m\u001b[1m measurement \u001b[0m\n │\u001b[90m MLP… \u001b[0m\u001b[90m Float64 \u001b[0m\n─────┼────────────────────────────────────────────\n 1 │ MLP(hidden = (21, 57, 25), …) 0.0867019\n 2 │ MLP(hidden = (45, 17, 13), …) 0.0929803\n 3 │ MLP(hidden = (33, 13, 49), …) 0.0973896\n 4 │ MLP(hidden = (21, 41, 61), …) 0.0981502\n 5 │ MLP(hidden = (57, 49, 61), …) 0.100331\n 6 │ MLP(hidden = (25, 25, 29), …) 0.101083\n 7 │ MLP(hidden = (29, 61, 21), …) 0.101466\n 8 │ MLP(hidden = (29, 61, 5), …) 0.107513\n 9 │ MLP(hidden = (21, 61, 17), …) 0.107874\n 10 │ MLP(hidden = (45, 49, 61), …) 0.111292", + "text/plain": "\u001b[1m10×2 DataFrame\u001b[0m\n\u001b[1m Row \u001b[0m│\u001b[1m mlp 
\u001b[0m\u001b[1m measurement \u001b[0m\n │\u001b[90m MLP… \u001b[0m\u001b[90m Float64 \u001b[0m\n─────┼────────────────────────────────────────────\n 1 │ MLP(hidden = (45, 49, 21), …) 0.0860875\n 2 │ MLP(hidden = (25, 45, 33), …) 0.0877367\n 3 │ MLP(hidden = (29, 17, 53), …) 0.0970372\n 4 │ MLP(hidden = (61, 9, 29), …) 0.0970978\n 5 │ MLP(hidden = (49, 49, 9), …) 0.0971594\n 6 │ MLP(hidden = (21, 33, 61), …) 0.0984172\n 7 │ MLP(hidden = (57, 61, 61), …) 0.099232\n 8 │ MLP(hidden = (41, 13, 25), …) 0.101498\n 9 │ MLP(hidden = (53, 29, 21), …) 0.105323\n 10 │ MLP(hidden = (57, 33, 45), …) 0.110168", "text/html": [ - "
10×2 DataFrame
Rowmlpmeasurement
MLP…Float64
1MLP(hidden = (21, 57, 25), …)0.0867019
2MLP(hidden = (45, 17, 13), …)0.0929803
3MLP(hidden = (33, 13, 49), …)0.0973896
4MLP(hidden = (21, 41, 61), …)0.0981502
5MLP(hidden = (57, 49, 61), …)0.100331
6MLP(hidden = (25, 25, 29), …)0.101083
7MLP(hidden = (29, 61, 21), …)0.101466
8MLP(hidden = (29, 61, 5), …)0.107513
9MLP(hidden = (21, 61, 17), …)0.107874
10MLP(hidden = (45, 49, 61), …)0.111292
" + "
10×2 DataFrame
Rowmlpmeasurement
MLP…Float64
1MLP(hidden = (45, 49, 21), …)0.0860875
2MLP(hidden = (25, 45, 33), …)0.0877367
3MLP(hidden = (29, 17, 53), …)0.0970372
4MLP(hidden = (61, 9, 29), …)0.0970978
5MLP(hidden = (49, 49, 9), …)0.0971594
6MLP(hidden = (21, 33, 61), …)0.0984172
7MLP(hidden = (57, 61, 61), …)0.099232
8MLP(hidden = (41, 13, 25), …)0.101498
9MLP(hidden = (53, 29, 21), …)0.105323
10MLP(hidden = (57, 33, 45), …)0.110168
" ] }, "metadata": {}, diff --git a/docs/src/common_workflows/architecture_search/notebook.jl b/docs/src/common_workflows/architecture_search/notebook.jl index 61ba5d49..a5e4a15a 100644 --- a/docs/src/common_workflows/architecture_search/notebook.jl +++ b/docs/src/common_workflows/architecture_search/notebook.jl @@ -25,7 +25,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng = 123); +y, X = unpack(iris, ==(:Species), rng = 123); X = Float32.(X); # To be compatible with type of network network parameters first(X, 5) diff --git a/docs/src/common_workflows/architecture_search/notebook.md b/docs/src/common_workflows/architecture_search/notebook.md index e995c68f..b355247a 100644 --- a/docs/src/common_workflows/architecture_search/notebook.md +++ b/docs/src/common_workflows/architecture_search/notebook.md @@ -28,7 +28,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example architecture_search iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng = 123); +y, X = unpack(iris, ==(:Species), rng = 123); X = Float32.(X); # To be compatible with type of network network parameters first(X, 5) ```` diff --git a/docs/src/common_workflows/architecture_search/notebook.unexecuted.ipynb b/docs/src/common_workflows/architecture_search/notebook.unexecuted.ipynb index 85b68135..6093c80e 100644 --- a/docs/src/common_workflows/architecture_search/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/architecture_search/notebook.unexecuted.ipynb @@ -75,7 +75,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng = 123);\n", + "y, X = unpack(iris, ==(:Species), rng = 123);\n", "X = Float32.(X); # To be compatible with type of network network parameters\n", "first(X, 5)" ], diff --git a/docs/src/common_workflows/comparison/notebook.ipynb b/docs/src/common_workflows/comparison/notebook.ipynb index 8163b302..d968843e 100644 --- a/docs/src/common_workflows/comparison/notebook.ipynb +++ b/docs/src/common_workflows/comparison/notebook.ipynb @@ -81,7 +81,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);" + "y, X = unpack(iris, ==(:Species), rng=123);" ], "metadata": {}, "execution_count": 3 @@ -107,7 +107,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 4 @@ -271,9 +271,9 @@ { "output_type": "execute_result", "data": { - "text/plain": "\u001b[1m4×2 
DataFrame\u001b[0m\n\u001b[1m Row \u001b[0m│\u001b[1m mlp \u001b[0m\u001b[1m measurement \u001b[0m\n │\u001b[90m Probabil… \u001b[0m\u001b[90m Float64 \u001b[0m\n─────┼────────────────────────────────────────────────\n 1 │ BayesianLDA(method = gevd, …) 0.0610826\n 2 │ NeuralNetworkClassifier(builder … 0.0857014\n 3 │ RandomForestClassifier(max_depth… 0.102881\n 4 │ ProbabilisticTunedModel(model = … 0.221056", + "text/plain": "\u001b[1m4×2 DataFrame\u001b[0m\n\u001b[1m Row \u001b[0m│\u001b[1m mlp \u001b[0m\u001b[1m measurement \u001b[0m\n │\u001b[90m Probabil… \u001b[0m\u001b[90m Float64 \u001b[0m\n─────┼────────────────────────────────────────────────\n 1 │ BayesianLDA(method = gevd, …) 0.0610826\n 2 │ NeuralNetworkClassifier(builder … 0.0857014\n 3 │ RandomForestClassifier(max_depth… 0.107885\n 4 │ ProbabilisticTunedModel(model = … 0.221056", "text/html": [ - "
4×2 DataFrame
Rowmlpmeasurement
Probabil…Float64
1BayesianLDA(method = gevd, …)0.0610826
2NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)0.0857014
3RandomForestClassifier(max_depth = -1, …)0.102881
4ProbabilisticTunedModel(model = XGBoostClassifier(test = 1, …), …)0.221056
" + "
4×2 DataFrame
Rowmlpmeasurement
Probabil…Float64
1BayesianLDA(method = gevd, …)0.0610826
2NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)0.0857014
3RandomForestClassifier(max_depth = -1, …)0.107885
4ProbabilisticTunedModel(model = XGBoostClassifier(test = 1, …), …)0.221056
" ] }, "metadata": {}, diff --git a/docs/src/common_workflows/comparison/notebook.jl b/docs/src/common_workflows/comparison/notebook.jl index 4d75c49d..6716ec52 100644 --- a/docs/src/common_workflows/comparison/notebook.jl +++ b/docs/src/common_workflows/comparison/notebook.jl @@ -23,7 +23,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); # ### Instantiating the models Now let's construct our model. This follows a similar setup diff --git a/docs/src/common_workflows/comparison/notebook.md b/docs/src/common_workflows/comparison/notebook.md index 1419ab55..8d689eb1 100644 --- a/docs/src/common_workflows/comparison/notebook.md +++ b/docs/src/common_workflows/comparison/notebook.md @@ -26,7 +26,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example comparison iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); nothing #hide ```` diff --git a/docs/src/common_workflows/comparison/notebook.unexecuted.ipynb b/docs/src/common_workflows/comparison/notebook.unexecuted.ipynb index b8517a90..65e472ff 100644 --- a/docs/src/common_workflows/comparison/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/comparison/notebook.unexecuted.ipynb @@ -73,7 +73,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);" + "y, X = unpack(iris, ==(:Species), rng=123);" ], "metadata": {}, "execution_count": null diff --git a/docs/src/common_workflows/composition/notebook.ipynb b/docs/src/common_workflows/composition/notebook.ipynb index ced33e3c..306a24c6 100644 --- a/docs/src/common_workflows/composition/notebook.ipynb +++ b/docs/src/common_workflows/composition/notebook.ipynb @@ -10,7 +10,7 @@ { "cell_type": "markdown", "source": [ - "This tutorial is available as a Jupyter notebook or julia script\n", + "This demonstration is available as a Jupyter notebook or julia script\n", "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/composition)." 
], "metadata": {} @@ -83,7 +83,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, @@ -146,7 +146,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "MLJFlux.NeuralNetworkClassifier" + "text/plain": "NeuralNetworkClassifier" }, "metadata": {}, "execution_count": 5 @@ -173,7 +173,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 6 @@ -284,7 +284,7 @@ "\rProgress: 13%|███████▏ | ETA: 0:00:01\u001b[K\rProgress: 100%|█████████████████████████████████████████████████████| Time: 0:00:00\u001b[K\n", "\rProgress: 67%|███████████████████████████████████▍ | ETA: 0:00:01\u001b[K\r\n", " class: virginica\u001b[K\r\u001b[A[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1, \"versicolor\" => 2).\n", - "\rOptimising neural net: 4%[> ] ETA: 0:05:10\u001b[K\rOptimising neural net: 6%[=> ] ETA: 0:03:22\u001b[K\rOptimising neural net: 8%[=> ] ETA: 0:02:29\u001b[K\rOptimising neural net: 10%[==> ] ETA: 0:01:56\u001b[K\rOptimising neural net: 12%[==> ] ETA: 0:01:35\u001b[K\rOptimising neural net: 14%[===> ] ETA: 0:01:20\u001b[K\rOptimising neural net: 16%[===> ] ETA: 0:01:08\u001b[K\rOptimising neural net: 18%[====> ] ETA: 0:00:59\u001b[K\rOptimising neural net: 20%[====> ] ETA: 0:00:52\u001b[K\rOptimising neural net: 22%[=====> ] ETA: 0:00:46\u001b[K\rOptimising neural net: 24%[=====> ] ETA: 0:00:41\u001b[K\rOptimising neural net: 25%[======> ] ETA: 0:00:37\u001b[K\rOptimising neural net: 27%[======> ] ETA: 0:00:33\u001b[K\rOptimising neural net: 29%[=======> ] ETA: 0:00:30\u001b[K\rOptimising neural net: 31%[=======> ] ETA: 0:00:28\u001b[K\rOptimising neural net: 33%[========> ] ETA: 0:00:25\u001b[K\rOptimising neural net: 35%[========> ] ETA: 0:00:23\u001b[K\rOptimising neural net: 37%[=========> ] ETA: 0:00:21\u001b[K\rOptimising neural net: 39%[=========> ] ETA: 0:00:20\u001b[K\rOptimising neural net: 41%[==========> ] ETA: 0:00:18\u001b[K\rOptimising neural net: 43%[==========> ] ETA: 0:00:17\u001b[K\rOptimising neural net: 45%[===========> ] ETA: 0:00:15\u001b[K\rOptimising neural net: 47%[===========> ] ETA: 0:00:14\u001b[K\rOptimising neural net: 49%[============> ] ETA: 0:00:13\u001b[K\rOptimising neural net: 51%[============> ] ETA: 0:00:12\u001b[K\rOptimising neural net: 53%[=============> ] ETA: 0:00:11\u001b[K\rOptimising neural net: 55%[=============> ] ETA: 0:00:10\u001b[K\rOptimising neural net: 57%[==============> ] ETA: 0:00:10\u001b[K\rOptimising 
neural net: 59%[==============> ] ETA: 0:00:09\u001b[K\rOptimising neural net: 61%[===============> ] ETA: 0:00:08\u001b[K\rOptimising neural net: 63%[===============> ] ETA: 0:00:08\u001b[K\rOptimising neural net: 82%[====================> ] ETA: 0:00:03\u001b[K\rOptimising neural net: 84%[=====================> ] ETA: 0:00:02\u001b[K\rOptimising neural net: 86%[=====================> ] ETA: 0:00:02\u001b[K\rOptimising neural net: 88%[======================> ] ETA: 0:00:02\u001b[K\rOptimising neural net: 90%[======================> ] ETA: 0:00:01\u001b[K\rOptimising neural net: 92%[=======================> ] ETA: 0:00:01\u001b[K\rOptimising neural net: 94%[=======================> ] ETA: 0:00:01\u001b[K\rOptimising neural net: 96%[========================>] ETA: 0:00:01\u001b[K\rOptimising neural net: 98%[========================>] ETA: 0:00:00\u001b[K\rOptimising neural net: 100%[=========================] Time: 0:00:12\u001b[K\n", + "\rOptimising neural net: 4%[> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 6%[=> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 8%[=> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 10%[==> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 12%[==> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 14%[===> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 16%[===> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 18%[====> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 20%[====> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 22%[=====> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 24%[=====> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 25%[======> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 27%[======> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 29%[=======> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 31%[=======> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 33%[========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 35%[========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 37%[=========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 39%[=========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 41%[==========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 43%[==========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 45%[===========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 47%[===========> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 49%[============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 51%[============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 53%[=============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 55%[=============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 57%[==============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 59%[==============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 61%[===============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 63%[===============> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 65%[================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 67%[================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 69%[=================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 71%[=================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 73%[==================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 75%[==================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 76%[===================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 78%[===================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 80%[====================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 82%[====================> ] ETA: 0:00:00\u001b[K\rOptimising 
neural net: 84%[=====================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 86%[=====================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 88%[======================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 90%[======================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 92%[=======================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 94%[=======================> ] ETA: 0:00:00\u001b[K\rOptimising neural net: 96%[========================>] ETA: 0:00:00\u001b[K\rOptimising neural net: 98%[========================>] ETA: 0:00:00\u001b[K\rOptimising neural net: 100%[=========================] Time: 0:00:00\u001b[K\n", "[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 3, \"versicolor\" => 1).\n", "[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 3, \"versicolor\" => 1).\n", "[ Info: After filtering, the mapping from each class to number of borderline points is (\"versicolor\" => 2).\n", @@ -298,18 +298,18 @@ "│ layer = Dense(4 => 5, relu) # 25 parameters\n", "│ summary(x) = \"4×8 Matrix{Float64}\"\n", "└ @ Flux ~/.julia/packages/Flux/Wz6D4/src/layers/stateless.jl:60\n", - "\rEvaluating over 5 folds: 40%[==========> ] ETA: 0:00:16\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1, \"versicolor\" => 2).\n", + "\rEvaluating over 5 folds: 40%[==========> ] ETA: 0:00:10\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1, \"versicolor\" => 2).\n", "[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1, \"versicolor\" => 2).\n", - "\rEvaluating over 5 folds: 60%[===============> ] ETA: 0:00:07\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1).\n", + "\rEvaluating over 5 folds: 60%[===============> ] ETA: 0:00:05\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1).\n", "┌ Warning: Cannot oversample a class with no borderline points. Skipping.\n", "└ @ Imbalance ~/.julia/packages/Imbalance/knJL1/src/oversampling_methods/borderline_smote1/borderline_smote1.jl:67\n", "\rProgress: 67%|███████████████████████████████████▍ | ETA: 0:00:00\u001b[K\r\n", " class: virginica\u001b[K\r\u001b[A[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 1).\n", "┌ Warning: Cannot oversample a class with no borderline points. 
Skipping.\n", "└ @ Imbalance ~/.julia/packages/Imbalance/knJL1/src/oversampling_methods/borderline_smote1/borderline_smote1.jl:67\n", - "\rEvaluating over 5 folds: 80%[====================> ] ETA: 0:00:03\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 3, \"versicolor\" => 3).\n", + "\rEvaluating over 5 folds: 80%[====================> ] ETA: 0:00:02\u001b[K[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 3, \"versicolor\" => 3).\n", "[ Info: After filtering, the mapping from each class to number of borderline points is (\"virginica\" => 3, \"versicolor\" => 3).\n", - "\rEvaluating over 5 folds: 100%[=========================] Time: 0:00:11\u001b[K\n" + "\rEvaluating over 5 folds: 100%[=========================] Time: 0:00:07\u001b[K\n" ] }, { diff --git a/docs/src/common_workflows/composition/notebook.jl b/docs/src/common_workflows/composition/notebook.jl index 182021eb..b617a4b6 100644 --- a/docs/src/common_workflows/composition/notebook.jl +++ b/docs/src/common_workflows/composition/notebook.jl @@ -26,7 +26,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters # To simulate an imbalanced dataset, we will take a random sample: diff --git a/docs/src/common_workflows/composition/notebook.md b/docs/src/common_workflows/composition/notebook.md index 0ef30b3b..949d5322 100644 --- a/docs/src/common_workflows/composition/notebook.md +++ b/docs/src/common_workflows/composition/notebook.md @@ -4,7 +4,7 @@ EditURL = "notebook.jl" # Model Composition with MLJFlux -This tutorial is available as a Jupyter notebook or julia script +This demonstration is available as a Jupyter notebook or julia script [here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/composition). In this workflow example, we see how MLJFlux enables composing MLJ models with MLJFlux @@ -28,7 +28,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example composition iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters nothing #hide ```` diff --git a/docs/src/common_workflows/composition/notebook.unexecuted.ipynb b/docs/src/common_workflows/composition/notebook.unexecuted.ipynb index 54b2439a..ef75b9ab 100644 --- a/docs/src/common_workflows/composition/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/composition/notebook.unexecuted.ipynb @@ -10,7 +10,7 @@ { "cell_type": "markdown", "source": [ - "This tutorial is available as a Jupyter notebook or julia script\n", + "This demonstration is available as a Jupyter notebook or julia script\n", "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/composition)." 
], "metadata": {} @@ -75,7 +75,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, diff --git a/docs/src/common_workflows/early_stopping/notebook.ipynb b/docs/src/common_workflows/early_stopping/notebook.ipynb index bbdda628..9f136402 100644 --- a/docs/src/common_workflows/early_stopping/notebook.ipynb +++ b/docs/src/common_workflows/early_stopping/notebook.ipynb @@ -3,7 +3,7 @@ { "cell_type": "markdown", "source": [ - "# Early Stopping with MLJFlux" + "# Early Stopping with MLJ" ], "metadata": {} }, @@ -81,7 +81,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, @@ -108,7 +108,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 50, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 4 @@ -148,7 +148,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "5-element Vector{Any}:\n IterationControl.Step(1)\n EarlyStopping.NumberLimit(100)\n EarlyStopping.Patience(5)\n EarlyStopping.NumberSinceBest(9)\n EarlyStopping.TimeLimit(Dates.Millisecond(1800000))" + "text/plain": "5-element Vector{Any}:\n Step(1)\n NumberLimit(100)\n Patience(5)\n NumberSinceBest(9)\n TimeLimit(Dates.Millisecond(1800000))" }, "metadata": {}, "execution_count": 5 @@ -179,7 +179,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "1-element Vector{IterationControl.WithLossDo{Main.var\"##351\".var\"#3#4\"}}:\n IterationControl.WithLossDo{Main.var\"##351\".var\"#3#4\"}(Main.var\"##351\".var\"#3#4\"(), false, nothing)" + "text/plain": "1-element Vector{WithLossDo{Main.var\"##267\".var\"#1#2\"}}:\n WithLossDo{Main.var\"##267\".var\"#1#2\"}(Main.var\"##267\".var\"#1#2\"(), false, nothing)" }, "metadata": {}, "execution_count": 6 @@ -250,7 +250,7 @@ "[ Info: Training machine(ProbabilisticIteratedModel(model = NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …), …), …).\n", "[ Info: final loss: 0.05287897645527522\n", "[ Info: final training loss: 0.045833383\n", - "[ Info: Stop triggered by EarlyStopping.NumberLimit(100) stopping criterion. \n", + "[ Info: Stop triggered by NumberLimit(100) stopping criterion. \n", "[ Info: Total of 100 iterations. 
\n" ] } @@ -290,101 +290,101 @@ "\n", "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", + "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", + "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n" + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n" ], "image/svg+xml": [ "\n", "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", + "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", + "\n", "\n", - " \n", + " \n", " \n", " \n", "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n" + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n" ] }, "metadata": {}, diff --git a/docs/src/common_workflows/early_stopping/notebook.jl b/docs/src/common_workflows/early_stopping/notebook.jl index a6c59da3..adcf39f7 100644 --- a/docs/src/common_workflows/early_stopping/notebook.jl +++ b/docs/src/common_workflows/early_stopping/notebook.jl @@ -10,7 +10,7 @@ using Pkg #!md Pkg.activate(@__DIR__); #!md Pkg.instantiate(); #!md -# **Julia version** is assumed to be 1.10.* +# **Julia version** is assumed to be 1.10.* # ### Basic Imports @@ -24,7 +24,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters diff --git a/docs/src/common_workflows/early_stopping/notebook.md b/docs/src/common_workflows/early_stopping/notebook.md index e6738259..076b7007 100644 --- a/docs/src/common_workflows/early_stopping/notebook.md +++ b/docs/src/common_workflows/early_stopping/notebook.md @@ -2,7 +2,7 @@ EditURL = "notebook.jl" ``` -# Early Stopping with MLJFlux +# Early Stopping with MLJ This demonstration is available as a Jupyter notebook or julia script [here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/early_stopping). 
@@ -26,7 +26,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example early_stopping iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters nothing #hide ```` diff --git a/docs/src/common_workflows/early_stopping/notebook.unexecuted.ipynb b/docs/src/common_workflows/early_stopping/notebook.unexecuted.ipynb index 5effdb73..4441ab52 100644 --- a/docs/src/common_workflows/early_stopping/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/early_stopping/notebook.unexecuted.ipynb @@ -3,7 +3,7 @@ { "cell_type": "markdown", "source": [ - "# Early Stopping with MLJFlux" + "# Early Stopping with MLJ" ], "metadata": {} }, @@ -73,7 +73,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, diff --git a/docs/src/common_workflows/hyperparameter_tuning/notebook.ipynb b/docs/src/common_workflows/hyperparameter_tuning/notebook.ipynb new file mode 100644 index 00000000..18a49f77 --- /dev/null +++ b/docs/src/common_workflows/hyperparameter_tuning/notebook.ipynb @@ -0,0 +1,444 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "source": [ + "# Hyperparameter Tuning with MLJFlux" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "This demonstration is available as a Jupyter notebook or julia script\n", + "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/hyperparameter_tuning)." + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "In this workflow example we learn how to tune different hyperparameters of MLJFlux\n", + "models with emphasis on training hyperparameters." + ], + "metadata": {} + }, + { + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Activating project at `~/GoogleDrive/Julia/MLJ/MLJFlux/docs/src/common_workflows/hyperparameter_tuning`\n" + ] + } + ], + "cell_type": "code", + "source": [ + "using Pkg\n", + "Pkg.activate(@__DIR__);\n", + "Pkg.instantiate();" + ], + "metadata": {}, + "execution_count": 1 + }, + { + "cell_type": "markdown", + "source": [ + "**Julia version** is assumed to be 1.10.*" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "### Basic Imports" + ], + "metadata": {} + }, + { + "outputs": [], + "cell_type": "code", + "source": [ + "using MLJ # Has MLJFlux models\n", + "using Flux # For more flexibility\n", + "import RDatasets # Dataset source\n", + "using Plots # To plot tuning results\n", + "import Optimisers # native Flux.jl optimisers no longer supported" + ], + "metadata": {}, + "execution_count": 2 + }, + { + "cell_type": "markdown", + "source": [ + "### Loading and Splitting the Data" + ], + "metadata": {} + }, + { + "outputs": [], + "cell_type": "code", + "source": [ + "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", + "X = Float32.(X); # To be compatible with type of network network parameters" + ], + "metadata": {}, + "execution_count": 3 + }, + { + "cell_type": "markdown", + "source": [ + "### Instantiating the model" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "Now let's construct our model. 
This follows a similar setup the one followed in the\n", + "[Quick Start](../../index.md#Quick-Start)." + ], + "metadata": {} + }, + { + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ Info: For silent loading, specify `verbosity=0`. \n", + "import MLJFlux ✔\n" + ] + }, + { + "output_type": "execute_result", + "data": { + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" + }, + "metadata": {}, + "execution_count": 4 + } + ], + "cell_type": "code", + "source": [ + "NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux\n", + "clf = NeuralNetworkClassifier(\n", + " builder=MLJFlux.MLP(; hidden=(5,4), σ=Flux.relu),\n", + " optimiser=Optimisers.Adam(0.01),\n", + " batch_size=8,\n", + " epochs=10,\n", + " rng=42,\n", + ")" + ], + "metadata": {}, + "execution_count": 4 + }, + { + "cell_type": "markdown", + "source": [ + "### Hyperparameter Tuning Example" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "Let's tune the batch size and the learning rate. We will use grid search and 5-fold\n", + "cross-validation." + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "We start by defining the hyperparameter ranges" + ], + "metadata": {} + }, + { + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": "NominalRange(optimiser = Adam(0.0001, (0.9, 0.999), 1.0e-8), Adam(0.00215443, (0.9, 0.999), 1.0e-8), Adam(0.0464159, (0.9, 0.999), 1.0e-8), ...)" + }, + "metadata": {}, + "execution_count": 5 + } + ], + "cell_type": "code", + "source": [ + "r1 = range(clf, :batch_size, lower=1, upper=64)\n", + "etas = [10^x for x in range(-4, stop=0, length=4)]\n", + "optimisers = [Optimisers.Adam(eta) for eta in etas]\n", + "r2 = range(clf, :optimiser, values=optimisers)" + ], + "metadata": {}, + "execution_count": 5 + }, + { + "cell_type": "markdown", + "source": [ + "Then passing the ranges along with the model and other arguments to the `TunedModel`\n", + "constructor." + ], + "metadata": {} + }, + { + "outputs": [], + "cell_type": "code", + "source": [ + "tuned_model = TunedModel(\n", + " model=clf,\n", + " tuning=Grid(goal=25),\n", + " resampling=CV(nfolds=5, rng=42),\n", + " range=[r1, r2],\n", + " measure=cross_entropy,\n", + ");" + ], + "metadata": {}, + "execution_count": 6 + }, + { + "cell_type": "markdown", + "source": [ + "Then wrapping our tuned model in a machine and fitting it." 
+ ], + "metadata": {} + }, + { + "outputs": [], + "cell_type": "code", + "source": [ + "mach = machine(tuned_model, X, y);\n", + "fit!(mach, verbosity=0);" + ], + "metadata": {}, + "execution_count": 7 + }, + { + "cell_type": "markdown", + "source": [ + "Let's check out the best performing model:" + ], + "metadata": {} + }, + { + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.0464159, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 1, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" + }, + "metadata": {}, + "execution_count": 8 + } + ], + "cell_type": "code", + "source": [ + "fitted_params(mach).best_model" + ], + "metadata": {}, + "execution_count": 8 + }, + { + "cell_type": "markdown", + "source": [ + "### Learning Curves" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "With learning curves, it's possible to center our focus on the effects of a single\n", + "hyperparameter of the model" + ], + "metadata": {} + }, + { + "cell_type": "markdown", + "source": [ + "First define the range and wrap it in a learning curve" + ], + "metadata": {} + }, + { + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[ Info: Training machine(ProbabilisticTunedModel(model = NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …), …), …).\n", + "[ Info: Attempting to evaluate 25 models.\n", + "\rEvaluating over 25 metamodels: 0%[> ] ETA: N/A\u001b[K\rEvaluating over 25 metamodels: 4%[=> ] ETA: 0:00:03\u001b[K\rEvaluating over 25 metamodels: 8%[==> ] ETA: 0:00:02\u001b[K\rEvaluating over 25 metamodels: 12%[===> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 16%[====> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 20%[=====> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 24%[======> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 28%[=======> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 32%[========> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 36%[=========> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 40%[==========> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 44%[===========> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 48%[============> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 52%[=============> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 56%[==============> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 60%[===============> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 64%[================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 68%[=================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 72%[==================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 76%[===================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 80%[====================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 84%[=====================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 88%[======================> ] ETA: 0:00:01\u001b[K\rEvaluating over 25 metamodels: 92%[=======================> ] ETA: 0:00:00\u001b[K\rEvaluating over 25 metamodels: 96%[========================>] ETA: 0:00:00\u001b[K\rEvaluating over 25 metamodels: 100%[=========================] Time: 
0:00:06\u001b[K\n" + ] + }, + { + "output_type": "execute_result", + "data": { + "text/plain": "(parameter_name = \"epochs\",\n parameter_scale = :log10,\n parameter_values = [1, 2, 3, 4, 5, 6, 7, 9, 11, 13 … 39, 46, 56, 67, 80, 96, 116, 139, 167, 200],\n measurements = [0.9231712033780419, 0.7672938542047157, 0.6736075721456418, 0.6064130950372606, 0.5595521804926612, 0.5270759259385482, 0.5048969423979114, 0.47993815474701584, 0.46130985568830307, 0.4449225600160762 … 0.1621185148276446, 0.12283639917434747, 0.09543014842693512, 0.07850181447968614, 0.06950203807005066, 0.063248279208185, 0.060053521895940286, 0.05921442672620914, 0.05921052970422136, 0.060379476300399186],)" + }, + "metadata": {}, + "execution_count": 9 + } + ], + "cell_type": "code", + "source": [ + "r = range(clf, :epochs, lower=1, upper=200, scale=:log10)\n", + "curve = learning_curve(\n", + " clf,\n", + " X,\n", + " y,\n", + " range=r,\n", + " resampling=CV(nfolds=4, rng=42),\n", + " measure=cross_entropy,\n", + ")" + ], + "metadata": {}, + "execution_count": 9 + }, + { + "cell_type": "markdown", + "source": [ + "Then plot the curve" + ], + "metadata": {} + }, + { + "outputs": [ + { + "output_type": "execute_result", + "data": { + "text/plain": "Plot{Plots.GRBackend() n=1}", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlgAAAGQCAIAAAD9V4nPAAAABmJLR0QA/wD/AP+gvaeTAAAgAElEQVR4nO3dZ0AU194G8DNl6SAgHQEBwQqoiCgqIE0EMfaaaIyKJYkmlsDNm6j3em8ssSSaxIhKLLFrbKiIWLBhw65oFLtIV0D6zM77YQ1BXBQNu7Pl+X3amR2GPzrw7Jk5hRIEgQAAAGgrWuwCAAAAxIQgBAAArYYgBAAArYYgBAAArYYgBAAArYYgBAAArYYgBAAArYYgBAAArYYgBAAArYYgBAAAraYGQbhnz54jR47U82CpVMrzvELrAai/qqoqsUsA+BsuSLnUIAhPnjx59uzZeh7M8zz+p0F1lJeXi10CwN9wQcqlBkEIAACgOAhCAADQaghCAADQaghCAADQaghCAADQaghCAADQaghCAADQaghCAADQapoWhDsekMBEOg9jRgEAoH5YsQtoYH2cyJUCodNubl8Pxr0RJXY5AAAie/z4cWpqqux1WVmZvr6+uPW8N2tra39/f0WcWdOCkCLkG0/B1ZT2T+C2h7BdrJGFAKDV4uLiduzY0bJlS0KIIAgUpZZ/FSsqKs6cOZOVlaWIk2taEMqMcqftDam+B7lfujADnDXt9i8AQP0JgjB48OBvvvlG7EL+kZycHA8PDwWdXGNDIsye2hfOTk6V/nxDKnYtAACgujQ2CAkhHSyoE1HMTzekk1N5qSB2NQAAoJI0OQgJIc7G1Mko9mK+MPgwX45lCgEA4DUaHoSEEHNdcrAny1AkIpF7Xil2NQAAoGI0PwgJIboM2RjEdLCkuu7hHr7ATVIAAPibVgQhIYQiZH5HZmxzusse/mI+shAAAF7SliCUmdyG/rEz3WM/l/gYWQgAoFoqKyuvXr368OFDJX9f7QpCQki/pvSuMHb0Mf73OxhWAQCgKqZNm2ZsbOzr6zt//nwlf2utC0JCSGcr6mAE822adNYFdCQFAFAJY8eOzcrKGj16tPK/tTYGISGklSmV2pvd81AYfYzn0DIEAFCWuLi4nj17Vm8OGzbsu+++I4Q0b97czMxMlJI0c4q1+rDRJymR7ODDXOQBblsIaywRuyAAAKX4s1C4XKCkfhKmOlSo/Suzmw4bNiwmJiYjI8PV1TU/P3/Pnj0LFixQTjF10d4gJIQYSciuUPbTU3zwPm5PGGutrnOyAwC8g/Tnwta7SgpCByMh1J6pucfIyGj48OG//fbbf//739WrV/fo0cPOzk45xdRFq4OQEMLSZHlXZt5lqd9ubl840xwrNwGApvvAif7AScwCPv300+Dg4JkzZ8bHx//www9ilkII0dpnhLXEeNHftqMDErgTWRhWAQCgWC1btnRzc5s+fXp5eXlwcLDY5Wh9i7Dax+50E0OqXzL3kx8zyAWfDwAAFGj8+PHDhg2bM2cOTb/8e3v48OGkpKRTp04RQmJjY8PCwoKCgpRTDILwbyH2VHIEG5XE55STz1ohCwEAFMXf319HR2fUqFHVe/T09MzMzAYNGlS9qbRiEISv8DSnUiKZiAP87UJhcSeGxhNDAICGxnHcnDlzBg8ebG1tXb3Tz8/Pz89PlHrQ7qmtqTF1Moq9lC8MwspNAAANraKiwtra+tKlS3PmzBG7lpcQhHKY6ZKknqyEJsH7uPwKsasBANAgurq6+fn5J06csLe3F7uWlxCE8ukyZEN3JtSe8sfKTQAAGg1BWCeKkFntmegWtN8e/kIeshAAQDMhCN9icht6aWc6PJHb/whZCACggRCEb9e3Kb07jP3kGLfiJubnBgDQNBg+US+drKjjUWzEAf5JqTCrPfP2LwAAUBnnz5+Pi4sTu4p/pLi4WHEnRxDWVzMT6lgvttcB7uELfnlXRoK2NACog9DQ0JycnLS0NEJIVVWVRKKuS+1MmjRJQWdGEL4DG32S0osdfIiLPMBtx8pNAKAO/P39/f39Za+Li4uNjY3FrUcFoV3zbgxZsiuMdTWhuu7hnpSg+wwAgNpDEL4zhiLLujDDXOluCfzN58hCAAD1hiB8TzFe9Mz2dOBe7jhWbgIAUGcIwvc30o1e350dcIjbfBfDKgAA1BWC8B8JtqOSe7LTz0jnX0EWAgCoJQThP+VhTqX2ZjbckU5O5aW4SwoAoG4QhA3A3pA6EsleLhAGHuLLOLGrAQCAd4EgbBhmuuRAOKvHkOB9XF652NUAAEC9IQgbjC5Dfu/OhDWhAhK4B1i5CQBATSAIG5Js5abxLekue/g0rNwEAKAOEI
QN7/PW9E9+dM9Ebh9WbgIAUHkKDEKe5+/du1dYWFjXAYIgZGZmZmZmCoKmBUYfJ3pPGDvmOLccKzcBAKg2RQXh9evX3dzcIiIimjZtunDhwtcPyMjI8PT0DAwMDAwM9PT0zMjIUFAlYvG1oo73Yhdelcae4zUt5wEANIiignDy5MkjR45MT08/e/bsrFmz7t69W+uAWbNmdejQ4c8///zzzz99fHxmzpypoEpE5GpCpfZmj2cJnxzjq9AyBABQSQoJwqysrCNHjkycOJEQ4ubmFhQUtHnz5lrHlJSUuLm5yV67u7uXlpYqohLRNdYlST3ZvHIh4gBXVCV2NQAA8BqFrEf48OFDExMTS0tL2aabm9v9+/drHfPtt99+8sknUqmUELJ9+/b4+Pi6zlZRUfHkyRPZqpKEEH19/VatWimibAUxZMnOUPbzU3zXPdy+HkwTQ0rsigAA4G8KCcLi4mJ9ff3qTQMDgydPntQ6huM4hmHu3btHURRN01VVdTaX7ty5c+HChRMnTsg2aZresmWLlZWV3IOrqqp4nuc4lZvfZb4X+eUW03W3dHtAVXMTPDTUFiUlJRSFjz6gKrTwgtTT02PZtySdQoLQ2tr6+fPn1ZsFBQXW1ta1jhk7duynn346duxYQsjKlSujo6MvXbok92ytW7fu2rVrTExMfb61LAj19PTet3YF+sqb2DaSRh6htgSz/jbadS1qLUEQjIyMxK4C4CVckHIp5Bmhi4uLRCK5fPmybPPMmTPt2rWrdUxOTo6Tk5PsddOmTXNychRRiar5qBm9oTs78BC3KQOdZwAAVIJCgtDAwGDUqFGTJ08+d+7c3LlzHz9+PHDgQEJIamqqt7e37JiIiIiZM2eePn36zJkzM2bMiIiIUEQlKijIjjoUwcack866wItdCwAAKObWKCFk3rx5//vf/7788ksHB4dDhw4ZGBgQQkxMTNq3by87YMmSJYsWLZoxYwYhJCIiYsqUKQqqRAW1MaNSezORB/hnFfziTgyNu6QAAOKhVH9Wl9jYWDMzMw14RlhLcRUZkMwZSqj1gYy+oj6QgMiKi4uNjY3FrgLgJVyQcmGuUdEYS8ieHqwBS4L2cblYuQkAQCQIQjHp0GRdINOjCeW3m7tdqOpNcwAAjYQgFJls5aYYL7r7PqzcBAAgAgShShjTnP7Zj+6ZyCU8RBYCACgVglBVfOBEJ/Rgo09wv6ZjiCEAgPIgCFVIR0vqRBS7+Jp0cipWbgIAUBIEoWpxMaaO92JP5whjjvEcWoYAAIqHIFQ5VvrkcCSbVSb0S+ZLVW7ycAAATYMgVEWGLNkVylrpk6B9XB6GGAIAKBKCUEWxNFnRjQlvQgUkcI9K8MQQAEBREISqSzbEMLoF7bebv1qALAQAUAgEoaqb3Iae60OH7udOZiMLAQAaHoJQDQxvRq/vzg5I5vY+QhYCADQwBKF6CLajdoexY49zK29hUAUAQENCEKoNH0vqeC923mWs6AsA0JAQhOrE1YQ61ovd9UCYnMpLcZcUAKAhIAjVjK0BORLJXsoXPjrKV+IuKQDAP4YgVD+mOiSpJ1spJT0TuaIqsasBAFBzCEK1pMuQTUGMmwkVvJfLKRO7GgAAdYYgVFcMRX7tygxwpjvv5u4U4YEhAMB7QhCqtxgveooHHZDAX8pHFgIAvA8Eodr7tBX9cxc6bD938AmyEADgnSEINUEfJ3pLMPvhUW7LXXQkBQB4NwhCDRFoSx2KYKeflf6ajiwEAHgHCELN0caMOhrJLL4mjT2HqWcAAOoLQahRnI2pY73Y5CfCJ8d4Di1DAIB6QBBqGmt9cjSSzSwVBh7iyzixqwEAUHkIQg1kJCG7w1g9lvQ8wBVWil0NAIBqQxBqJh2abOjOdLCguu7hnpRgWAUAQJ0QhBqLImSBLzOmOd0tgb9ViCwEAJCPFbsAUKzJbWgzXRKQwO0IZTtbUWKXAwCgctAi1Hwj3OgV3ZgPkrjEx2gXAgDUhiDUClGO9J4w9pNj3MYMDKoAAHgFglBb+FpRyRFs7Dnp91eQhQAAf0MQapFWplRqb2bdHenkVB43SQEAZBCE2sXOgEqJZNPyhI9T+Cq0DAEAEIRayEyXJPVk88uFfslcKaaeAQCthyDURgYs2RXG2uhT3fdyeeViVwMAICoEoZZiKBLXjenpQPkncA9f4IkhAGgvBKH2ogiZ1Z4Z35L228NfKUAWAoCWQhBqu0mt6Xk+dOh+7kQWshAAtBGCEMjwZvTaALb/IW7fI2QhAGgdBCEQQkiPJtTeHuzoY1z8nxhUAQDaBZNuw0sdLKgjkWx4Il9QQaZ54BMSAGgL/L2Dv7UwpVJ7s2tvS2PPYeoZANAWCEJ4ha0BSYlkj2cJo1J4DndJAUALIAihNtnUMznlwoBDfBmmngEATYcgBDkMWbI7lDXTJT0PcIWVYlcDAKBICEKQj6VJvD/TwYLqlsBlluKJIQBoLAQh1IkiZIEvM9yV7raHv1OELAQAzYQghLeI8aK/bUd338tfykcWAoAGQhDC233sTi/1o3smcscxDRsAaBwEIdRLHyd6QxDbP5n74z4GVQCARpEfhCtWrMjLy1NyKaDiuttS+8PZz07xq24hCwFAc8gPwhkzZjRp0mT48OEpKSlKLghUmbcFdSKKnXtZOv8KshAANIT8IExLS/v3v/996tSpwMDA5s2bz5s3Dw1EkHExpo71YjfckU5OxTRsAKAJ5AehnZ1dTExMRkbGwYMHvby8vvnmmyZNmgwaNCg5OVkQ8NdP29kakCORbFqe8DGmYQMA9femzjI0TYeEhGzZsuXevXtjxozZunVraGioh4fHqlWrKisx3YhWk03Dllcu9EvGNGwAoN7e0mtUEIQjR45Mnz595cqVRkZGY8aMcXJyio6ODggIqKioUE6JoJoMWLIrlLXUI+GJmIYNANRYnUGYm5u7YMGCFi1aBAUFXblyZcGCBY8fP16xYsXevXtPnTp14cKFpKQkZRYKKoilyUp/pqMl1XUP96QE98wBQC3JX5h39OjR69evFwShX79+cXFxAQEBNd/19fV1dnbOyclRSoWg0ihCvvdlLPSk3RL4A+GMWyNK7IoAAN6N/CC8du3ajBkzRo8ebW1tLfeA3377zdHRUZGFgTqJ8aJtDEj3fXxCGNO2MbIQANSJ/CA8ffo0Rb3pz1nnzp0VUw+oq5FutKkO6ZnIbQ5m/W2QhQCgNuQHIUVRgiCcPHny4sWLmZmZVlZWHh4e3bt3ZxhGyfWBGvnAiW6kQw08xP3ix/R3xux9AKAe5Adhbm7ugAEDjh07RghhWZbjOEJI27Ztd+zY0bRp0/qct7Ky8ocffkhNTXVycoqNjbWxsXn9mKysrMWLF//5559WVlaff/55mzZt3v/nANUQaEsl92QjD/AFFWRsC2QhAKgB+X+qPv7440uXLi1fvjwvL6+qqur58+cbNmzIzs7u169fPQfUT5s2bffu3dHR0eXl5WFhYVJp7XHXWVlZHTt2LCoqGjlyZPv27XNzc//pjwKqwcOcOtqL+f6qdNYFXuxaAADej
no92J4/f25ubr527doPP/yw5v4jR44EBQWlp6e3aNHizSctLCy0s7O7cOFC8+bNBUFo2rRpXFxcjx49ah4zceLE4uLidevWvbXE2NhYMzOzmJiY+vw8VVVVPM/r6enV52BQnMxSoWci72dNLenMSLS4ZVhcXGxsbCx2FQAv4YKUS86fqKqqKkEQfHx8au2X7anPOPpr164ZGRk1b96cEEJRVJcuXc6ePVvrmJSUlNDQ0Llz537xxRf79+9/z/JBVdkZUCej2KxSErSPyykTuxoAgLrJeUZoaWnZpk2b5ORkWZJVS05OtrS0bNmy5VtPmp2dbW5uXr3ZuHHjrKysWsfcv39/5syZkyZNatOmzZgxY2bOnBkdHS33bLdu3bp69eqBAwdkmwzDrFixwsLCQu7Bshah7KEmiG5tZzLnGtNhp7ChS2Vbc20ccV9SUvLmDtgAyqSFF6Senh7Lyu8NU03+20uXLh02bJjsoaCNjU1eXt6+fft+/PHHZcuWlZSUlJSUEEIMDAx0dXXlfrmBgUHNhmN5eXnNXJTR19cfMmTIl19+SQjR1dWdO3duXUHo4OBgYWExePBg2SbDMI6OjjQt/3Ybbo2qmu86ES8raf/jOr92Yfo21bqbpIIgGBkZiV0FwEu4IOWSH4RDhgzJzs6ePXv27Nmza+4fMGBA9esVK1aMGTNG7pc7ODhkZWWVl5fLAun+/fuenp61jnF0dHRwcKh+/YbOMgYGBvb29iEhIfX4cUAVDXahmzei+hzkLxcIM9sz2vVxFABUnvwg/P7778vK3vJgp2vXrnW91bp1a2dn540bN44aNerOnTunTp367bffCCH3798/ffr0kCFDCCFDhgxJTEycMGECRVF79+59/ZEkaJK2janU3mzfZC79Of+bP2PwlhsVAADKI/8P0kcfffQPz7tkyZKhQ4euXbv2+vXrsvXuCSHnz5+fOnWqLAgnTpyYkJDg5eVlYmKSm5u7e/fuf/gdQcXZGpCUSHbcCb7LHm5XKONohJYhAKiEt3wyz8vLe/z4sa2tbV2TjtYlODg4IyPj+vXrjo6OdnZ2sp29e/cOCgqSvTYyMjp27Nj169elUmmLFi0kEsl7VA/qRZchqwOYH69Ju+zhtwUzvlbIQgAQX52dF1asWOHg4GBpadmuXTsbGxsrK6v58+e/Pi7+DYyNjTt16lSdgoQQHR2dWr1mWrdu7eHhgRTUKpPb0Cu6MR8c5NbcxvL2ACA++S3CZcuWTZw40dvbe/LkybJeo7t27YqJiSkuLq7VfQbgPYQ3oY71Ynsn8RfyhEWdGPSfAQARyZlZRiqV2tvbh4WFrV69uuaIk6+//nrx4sV5eXmGhobKLBEzy2iqggoy6BAnocmmILaRjtjVKAYm8gCVggtSLjm3RnNycrKysiZNmlRr3OWkSZPKy8tv3bqlrNpAw5nrksRw1qsx1XEXd6tQG4fbA4AqkBOEOjo6hJCioqJa+2V76hpED/AeWJrM9WGmedCBCdyhTGQhAIhAThCam5t7eXlNnz49Ozu7emdRUdEXX3xha2v71hm3Ad7V2Bb0thB2xFF+3mV0nwEAZatzirWwsDBnZ+fAwEBbW9ucnJzjx4+/ePFiy5YtWJsXFKGLNXX6A6bPQT6jWPjJj9HRurnYAEA08v/edOvWLS0tbdCgQbdu3dqxY8eVK1fCw8NPnTrVr18/JdcH2sPBkEqJZPPKSdBeLFgBAMojp0VYWlq6dOnSqKio1atXK70e0GpGErI9hPn3Bb7DTm5nKNPeAuMqAEDh5LQIi4qKYmNji4uLlV8NAEXIrPbMvI50zwPcpgw8MgQAhZMThFZWVtbW1vfu3VN+NQAyQ13pxHD2Pxelw4/wzyvFrgYANJqcIKRpeu7cud9+++3169eVXxCATLvG1IW+rIMRafsHd+QpRlYAgKLI7zW6b9++oqIiT09PV1fXJk2a1OwpevDgQWXVBtpOjyFzfZgQO2HkUb6nA7W4E9ZvAoCGV2cvdU9Pz6CgICcnJ4yXAHGF2FNX+rMlHOm4i7uYj6YhADQw+R+wt2zZouQ6AN7AVIf8HshsvScNT+QmtKS/bYd5ugGgwchvEa5bt67mtDIy2dnZcXFxii8JQL6BzvS5D9iUp4J/ApdRhKYhADQM+UE4ffr0jIyMWjvv3r07btw4xZcEUCdHI+pwJDvIme68m4u7icEVANAA3mEmqxcvXhgZGSmuFID6oAiZ3IY+EskuS5cOPMTnV4hdEACouVeeEV69ejU1NZUQUlZWtmvXrmvXrlW/VV5evmHDBsy4DSqitRmV2puddYH33M6t6MZEOOCZIQC8p1eCMDk5ecqUKbLX8+fPr3Wou7s7nhGC6pANrgizF0Yd48ObYHAFALynV26NTpgwoaCgoKCgwNLSMjExsaCGsrKyW7duBQQEiFUogFxBdtSVfmwZRzrs5C7koQcNALyzVz5C6+np6enpEULOnTtnbW0tew2g4hrpkLWBzIYMac8D3GetmGketD6ahgBQb/I7yzg5OSEFQb0Mc6XP92GvFggtt3EbM6RoGwJAPckPwvz8/M8//1w2uRr1KiXXB1B/DobUlmBmSzDz0w2p7y7ueBbSEADeTv4tpEGDBp08eXLo0KHu7u40jcXCQZ10tKRORLHb7klHpPBtzMiPnRkXY3yAA4A6yQnCsrKylJSUZcuWjR07VvkFAfxzFCEDneleDvSS61LfXdwwV/o/3kwjHbHLAgCVJKe1V1JSwvN8hw4dlF8NQAPSZ0mMF50+QEIIcd9a9eM1KYe5aADgNXKC0MLCon379rKR9QDqzkKP/NiZORLJHngi9fiD2/sIDw4B4BXynxEuWbJkxIgRenp6PXr0MDAwqPmWmZmZUgoDaEitTKl9PdjkJ8IXp/kfrpFFvoyHOR4cAgAhdQVh//79s7OzR48e/fpbgoAP1KCuQuypi33ZX25IQ/ZzA5zpf3nRTQwRhwDaTn4QfvPNNyUlJUouBUAJJDSZ3Ib+yI3+7hLf9g/O14oa3ZyOcqQl6BwNoK3kB+Fnn32m5DoAlMlclyzwZf7Xgdn9ULosXTr2OD/Amf60Fe2J+6UA2ucdPgbzPF9aWqq4UgCUTJchA53pgz3ZS/1YF2MqKonvsJOLuyl9USV2ZQCgRK8EYceOHX/66SfZa0EQhg0bVrPv6ObNmw0NDZVaHYBSOBhSMV70vcHsXB8m+YngtKlqxFE++QkehwNohVeCMDs7u7i4WPZaEISNGzc+ePBAjKoAREBTJMSe2hLMXB8gaW1GTTjJt9rGzbsszSsXuzIAUCT0EACozUafxHjRtwex6wKZu8WC+9aqQYf45CfoMA2gmRCEAHXytqCWd2XuDZaE2FMx53jHjVzsOf7BCwQigEZBEAK8RSMdEt2CTuvDJoYzhBCfnVzofm7rPWkVJmwD0AgIQoD6am1GzfVhHg6VRLeg425KnTZVTU7lrz1DAxFAvVE1Z4pxcnJ69OhR9aYgCLUWIBQEQfkzy8TGxpqZmcXExNTn4KqqKp7nsaowKMGfhcKqW9K1t6UuJtRH
zehBLrS5bu1jiouLjY2NxagOQA5ckHK9MqB+yJAheXl5YpUCoF7cG1HzOjLf+TBHMoW1t6X/d77Kz5oa6EwPdKb15c9UAQCq6JXf13nz5olVB4CaYigSYk+F2DNFVczO+9Kt96RTTvMRDvQINzrYHvPUAKgBfHAFaBgmEjLCjR7hRj8pEbbdE6af5QsqSH8HdkIbwa0REhFAdaGzDEADszekJrehL/Zl9/VgCCH+CVyHndyP16S5GJgPoJIQhACK0tqM+rcn93iYZK4Pk5YntNhaFZXEbb0nrcS4CwBVglujAIpV/RCxsJLZfl/68w3pxJP8IBf6w2Z0ZyvcMgUQH4IQQEka6ZBP3OlP3OmHL4Tf7wijj/ECIZ+40x+50Tb6YhcHoMVwaxRA2RyNqK/b0jcGsL8HMneKhJZbqzBVDYCI5AfhyZMnqxdgKisrmzZtmr+//5QpU7AeIUADks1l+nT4K1PVXC3AVDUASiU/CD/88MPz58/LXs+YMWPx4sUURcXHx48dO1aJtQFoBb2/1gc+Gsma6ZJeWB8YQLnkBOGLFy/u37/fpUsXQgjP82vWrPniiy9SUlK2bdu2efPmwsJCpRcJoBXcG1Gz2jPV6wPbb8DyTwDKIKezTFFRESGkcePGhJALFy7k5uYOHDiQENKtWzee5+/fv+/l5aXkKgG0B/1XL9OCCmbbPem0M3w5T0a50x+709boUwOgAHJahJaWljRN3759mxCybdu2Ro0aeXt7E0Jki9czDKPkEgG0k7kuiW5BX+rHxvszt4uEVtuqRqZgsQuAhienRSiRSCIiIsaNG9e/f/+4uLj+/ftLJBJCyJUrV2iadnR0VHqRAFrNz5rys2YWd2Lib0nDE/mmRiTGi45yRJdvgIYh/3dp+fLlbdq02bBhQ0BAwJw5c2Q7V69e3bZtWxMTEyWWBwAvGUvI5Db03cFsdAv6q7PSDju5tbelPNqHAP8Ypfz1Bd8V1iME9aWg5d+kAtn7SPrdJWleOfmsFT2uJa2HRxZQD1iPUK563V15+vRpUlLS06dPFV0NANQHTZEoRzq1N/ubP5OcKXXeVDXrAl9YKXZZAOpJfhAOGTJk5syZstfHjh1r1qxZjx49XFxcdu/ercTaAOAtutpQe8LYxJ7s3SLisrlqciqfWarq93gAVI2cIOQ4bufOnbJxhISQ2NhYV1fXlJSUwYMHT5o0ied55VYIAG/hZU6tDWTO9WGrpMRjOzftDF/KiV0TgPqQE4QFBQUVFRUuLi6EkNzc3DNnzsTExPj7+8+ZM+fBgwePHj1SepEA8HYuxtQvXZj0AZLsMtJ2B3cqG01DgHqRE4SywRJVVVWEkP379wuCEBwcTAgxNzcnhOTl5Sm3QgB4B1b6ZF0g80MnZvBhftwJNA0B3k5OEJqZmdnZ2cXHx7948WLlypVt27a1sbEhhDx48IAQYmlpqewaAeAdRThQV/uzZRzpsJM7l4umIcCbyO8sM3v27EWLFhkbG588efLrr7+W7dy7d6+FhQUG1AOoBVMdsjaQ+bc33TuJiz3HV+DhPkAd5C/M+8knn7Rv3/7ixYtt27Zt166dbKednd3SpUspCmtqA6iNgc50gA094STvvZNbE5CqzJYAAB6fSURBVMB4W+D3F6C2Oleob9u2bdu2bWvuGTx48DudOi8v7+bNm66urra2tm84LDMzk6KoNx8DAO/NSp9sD2G23pNGHuA+dqf/483oYHY2gBrq/IV48eJFXFzcxIkTo6KioqOjlyxZ8uzZs/qfd9u2bc2bN//3v//t4eHx66+/1nVYenq6q6vriBEj3q1qAHhHA53pS/0kN58Tn53cxXw8NQT4m/wp1h48eBAcHJyRkdGoUSMbG5vc3NyCggIbG5sDBw54enq+9aRVVVVOTk4rV66MiIi4ePGiv7//48ePGzVqVOswqVQaGBjYpEmT3NzcgwcP1nU2TLEG6ksFZ7Taek/62Sl+lDs925uRoGmoZVTwglQF8n8Pxo8fX1xcvG/fvmfPnt28eTM/P//EiRNGRkYfffRRfU568uRJQRB69uxJCGnXrl2zZs327t37+mGLFy/28fHp1KnTP/kBAOCdDHSmL/WVXH8mdN3D3XyOpiGAvGeEJSUlBw8e3LhxoyzJZLp06bJmzZouXbrcvXtXNtb+DR4+fOjk5FTdraZp06avD8O/d+9efHz8mTNn4uPj33y2kpKSvLy86iYjTdP+/v51LYso/cubzwmgHKp5NVrrkZ0h9Mpbgn8CN6UNPdWDYtCHRjuo5gWpUDT99vsecoKwsLCQ5/kWLVrU2t+yZUtCSH5+/luDsLy8XEdHp3pTV1e3tLS05gFSqXTUqFGLFi0yMjJ6a4lPnjy5cuXK3bt3q8/WrFmzxo0byz1YdmtU2/6nQWWVlZWp7FrWwx1IN3Nqwhl2xz2yvBPXzBitQ82nyhekgujp6bFsnd1CZeS8bWlpaWRktGfPHg8Pj5r79+zZQ9O0s7PzW7+xtbV1fn5+9WZ+fn6tTqFJSUl37969dOnSpUuXTp06df/+/YULF06dOlXu2dzd3X19ffGMENSRIAj1+bQnllZG5GhvsuKmNPQQPc2Dme5J02gaajQVvyDFIn+F+tGjR8+YMSMnJ2fgwIF2dna5ubkJCQkLFy4cMGCAhYXFW0/q7e19586dnJwcKyurioqKs2fPzps3r+YBzs7On376aYP9EADwvihColvQIfbUqBR+z0Pp6gCmmQnCELSL/F6jlZWV48ePX7NmTc17jH369FmzZk09V6gfNmxYfn7+pEmT1q5dm5ube/jwYUJIfHz8b7/9dvz48ZpHLlmyZM+ePeg1ChpJjTrpSQWy8pb0/87z37RlJrVBy1AzqdEFqUzy75zq6OjEx8fPmDHj5MmTz549MzEx6dSpk7u7e/3Pu2rVqkWLFq1evbp58+ZxcXGyna1atRowYECtIzt06GBgYPB+1QNAQ6EpEt2C7mpDfZzC738s3didNdMVuyYApZDTInz69Kmdnd3evXsjIiJEqakWtAhBfanjB3BOSr46yx95KhzsyVrgN0mzqOMFqQRy+pUaGxvTNI0HqgDaiaXJok7MEBe6WwKH9e5BG8gJQiMjo8jIyC1btii/GgBQETFe9MdudNBe/nEJshA0nPxnhCNHjpw4cWJ2dnZUVJStrW3NFSdCQkKUVRsAiCnGi2Zp0i2BPxTBuBij9wxoLPlB+Omnn+bk5Gzbtm3btm213pLbyxQANNJUD9qQJcH7+IM9MawCNJb8IExKSqqqqlJyKQCggsa3pBmKBCTwST2Z1mbIQtBA8oOwPktMAICWGNuCNpSQsP18YjjjYY4sBE3zSmeZ0tLSuLi406dPv37cjRs34uLicnJylFUYAKiQYa704k50yH7ufB4ejoCmeaVFuHTp0u++++7mzZuvH+fk5BQVFXXp0qVffvlFWbUBgAoZ5EIbsFTUAW5nKOtrhXYhaI5XWoRr164dM2ZMrQmyZQwNDadOnfr7779zHKes2gBAtfRypOL92Q8Ocqey0S4EzfF3EJaVlaWnpwc
GBtZ1aGBgYHFx8e3bt5VRFwCopJ4O1KYgtl8ydzgTWQga4u8gLC0tFQThDbPvyOaaefHihTLqAgBVFWhLbQlmhx3hDj5BFoIm+DsITU1NJRJJRkZGXYfK3rKyslJGXQCgwvxtqO0h7PAj3O4HWAQb1N7fQcgwTJcuXVauXMnzvNxDly9f7uTk5OTkpKzaAEB1dbGm9oSxY0/wJ/G8ENTcK51lvvrqq9OnT48YMeL58+c195eVlcXExGzevHn69OnKLQ8AVJevFfV7IDv4MP8I85GCOntl+ETPnj1nz549Y8aMXbt2BQQEODs7Mwzz6NGjlJSUgoKCjz/+eOLEiWIVCgAqKNSe+rIN3TuJPxnFGsifnwNA1dW+cr/55pvOnTvPmzfv8OHD5eXlhBCWZTt27Dh58uRBgwaJUSEAqLSpHnT6c2FECr81mMHoQlBHcj7CBQcHBwcHV1ZW5uTkSKVSKysrrHMLAG/wkx8TuJebe1n6Ly85K7sBqLg672Xo6Og0adJEmaUAgJrSY8jOUNZ3F9fGjEQ5IgtBzeCSBYAGYKNPdoQwY47z156h4wyoGQQhADSM9hbUQl+mdxKfVy52KQDvAkEIAA3mw2Z0v6bU0CMch3H2oD4QhADQkOZ3ZHRp8tVZ+fNyAKggBCEANCSaIhuC2AOPhZW30CoE9YAgBIAGZiIhf4QyX5/jj2eh4wyoAQQhADS85o2odYHs0CP8Y8y+BioPQQgACtGjCTWpNd07iS/FYt6g2hCEAKAoX3nSbRtT0SfQcQZUGoIQABToly7M7ULh+yvoOAOqC0EIAAokm31t6XVpwkM8LAQVhSAEAMWyNSCbg5lRx7jrmH0NVBKCEAAUrrMVtdCX6ZfMP68UuxSA1yAIAUAZRrjREQ7UoEMcj2YhqBgEIQAoyQJfhqVJLGZfAxWDIAQAJWEosj6Q3fVQiP8TnUhBhSAIAUB5zHTJ7lDm63P8mRzcIQVVgSAEAKVqYUqt6MYMPMRnliILQSUgCAFA2aIc6egW9PAjvBRRCCoAQQgAIvi6LU1TZD5mnAEVgCAEABHQFFkbwPxwDQ8LQXwIQgAQh70htawL8+FRvrhK7FJAuyEIAUA0fZvSAbbUl6cxshDEhCAEADH92Jk5kSVsvouHhSAaBCEAiMmQJeu7M5NS+Ycv8LAQxIEgBACReVtQX7ZhPjzKYxpSEAWCEADE95UnLaEJ1u8FUSAIAUB8GE0BIkIQAoBKkI2mGI7RFKB0CEIAUBV9m9KBGE0BSocgBAAVgtEUoHwIQgBQIRhNAcqHIAQA1YLRFKBkCEIAUDkYTQHKhCAEAJWD0RSgTAhCAFBFGE0BSoMgBAAVhdEUoBwIQgBQXRhNAUqAIAQA1WXIkg0YTQEKhiAEAJXWHqMpQMEQhACg6r7ypFmKzL2MG6SgEAhCAFB1NEXWBTK/3JAefIJWITQ8BCEAqAF7Q2prMPPRUe5eMbIQGhiCEADUg5819ZUn0y+ZL+PELgU0C4IQANTGFA/aw4wadwIjC6EhIQgBQJ0s68pczBdW3ETHGWgwCgzCXbt2hYSEBAQErFy58vV3b9++PX369O7duwcHB8+dO7eiokJxlQCAxjBkyR8hzP+d509m42EhNAxWQedNS0sbOXLkmjVrTE1Nhw8fbmpqOmDAgJoHHDt2TFdX99tvv+V5furUqZmZmUuWLFFQMQCgSdwaUav8mWFH+PN9WEs9sasB9UcJgkI+VY0ePdrExGTx4sWEkF9++WXLli1Hjx6t6+AdO3ZMmzYtIyND7ruxsbFmZmYxMTH1+b5VVVU8z+vp4ZcDVEJxcbGxsbHYVWim/zvPp2YLST1ZFk946g0XpFyKuoIuX77s6+sre+3r63vp0qU3HHz16lVXV1cFVQIAGmm2N6PHkv87j44z8E8p6tZoTk6Oqamp7LW5uXlhYWF5ebnchtrFixcXLlx4+PDhuk517dq1U6dO/frrr7JNXV3dvXv3WllZyT1Y1iKsqsLCLaASXrx4IXYJmuzXDlRAkk5ro/K+Dug7Uy9aeEHq6elJJJI3H6OoIDQ2Ni4tLZW9fvHiha6urq6u7uuHpaenR0ZGrly50tvbu65TNW/evHXr1uPGjZNt0jTdtGnTug7GrVFQNbgTpTjGhOzsIYTtp7xt2dZmlNjlqAdckK9TVBA2bdq0+plfRkZG06ZNKar2ZXr79u2wsLD58+cPHDjwDaeSSCRmZmYuLi4KKhUA1JeXObXQl+mXzJ/9gG2kI3Y1oJ4U9Yxw2LBhq1evLikp4Xl+2bJlQ4cOle1funRpeno6IeTBgwdhYWHffPPNhx9+qKAaAEAbfNiM7m5LjUzB6hTwnhQVhIMHD/b29nZ2dnZychIEYcqUKbL9ixYtunHjBiFkxYoV9+/fHz9+PEVRFEUZGBgoqBIA0HhL/Zi8cuH7K3hSCO9DUcMnZPLz86uqqmxsbP7JSTB8AtQXeqsrzdNS4rOLW9WN6dEEDwvrhAtSLsUOwGncuPE/TEEAgPqwNSDrApkRKVieAt4ZRqICgIbobktN92T6Y3kKeEcIQgDQHFM9aLdG1OTTGGUP7wBBCACagyJkVTfmVLaw8hY6zkB9IQgBQKMYSV4uT3EuFw8LoV4QhACgadwbUSu6MgMP8bnlYpcC6gBBCAAaqLcTPdSVGnKYwzB7eCsEIQBopv91YHRo8g2Wp4C3QRACgGaiKbIukN2YIay7g44z8CaKmnQbAEB0FnpkXzgTvp8vriQTW+FzP8iHKwMANFkrU+p4FPPDdWnsOdwjBfkQhACg4ZyMqOO92MRHwqRUdJ0BORCEAKD5rPXJ0V7shTxh5FGewxNDeBWCEAC0gqkOSerJ5pQLAw7x5bhLCjUgCAFAWxiwZHcYq0OTiESuuErsakBlIAgBQIvo0GRjEONqQgXv4/IrxK4GVAOCEAC0C0ORuG5MgA0VkMBllqL3DCAIAUD7UIR878t81IzuuofPKEIWajsEIQBoqRgveronHbCXv1qALNRqmFkGALTXhJZ0Ix0Ssp/bFcp2sqLELgfEgRYhAGi1Ya70qm5s7yQu+QnahVoKQQgA2q6XI7UthB12hPvjPgbbayPcGgUAIP421L5wNuoAV8aR4c3QQtAu+P8GACCEkA4W1OFI9ts06aBD/B10JdUmCEIAgJdamlI3B7JdrCm/3dy4E3x2mdgFgVIgCAEA/qZDk8lt6JsDJWa6pPW2qthzPCZj03gIQgCA2sx1yVwf5kJf9lkFab2Ni7spxZoVGgxBCAAgn6MRtbwr80cosylD6vEHt/UewlAzIQgBAN5E1olmaWfmfxelfru5k9noR6NpEIQAAG8XYk9d6Mt+6UF/dJRHt1INgyAEAKgXmiIDnelr/VlvC3Qr1SgIQgCAd2DAkhgvOv2vbqWzLvBlnNg1wT+DIAQAeGeNdclcHyatL3u3iLhv5eJuSnncK1VbCEIAgPfkZEStDWS2BTPr70hbb+PmXpY+LkEeqh8EIQDAP+JrRaX0Yn8LYB68ENr+wYXu536/Iy3B/VL1gSAEAGgAna2oZV2YJ8Mkk1
rTux8IduurBh3ik58IaCGqPgQhAECD0WVIlCO9JZi5N0QSYk/NusA7buRiz/G3CxGIqgtBCADQ8Mx1SXQL+kQUe6AnQwjxT+A67OR+vCbNKxe7MngNghAAQIFamVJzfZgnwyRzfZi0PMF9a1VUErf1nrQK87WpDCzMCwCgcDRFQuypEHvmeSWz+a70h2vSSaf4fs50R0vK24JqaUoxlNglajEEIQCA8pjqkHEt6HEt6NuFwu6HwoHHwneXpJmlgqc51b4x5W3xMhdZ3K1TIgQhAIAI3BpRUz1eNgOLq8jlfCEtTzicKXx/RXq3WHAxfhmK3haUjyWly4hbrIZDEAIAiMxYQrraUF1t/s7Fi/lCWp5wLk/49ab00QuhjfnLUPQ0p1yMKXNdcevVNAhCAADVYiwh/jaUf41cvJQvpOUJRzKFn29IM4oEihBXE8rFmHIxIS7GlKsJ5WJMHAxxQ/U9IQgBAFSasYR0s6G62fzdneZZBblbLNwtFu4WkbQ8Yes96d0iklkq2Bm8jMbqjGxmQjXSEbF29YAgBABQM2a6xFuX8rZ4padppZTcLxbuFpOMIuFusXA6h2QUS+8WCToMsdCjzHSIqQ4xZnQsDXkzHWKmS5npElPZCx1ipkvMdClTbY1MBCEAgCbQoYl7I8q9ESHklYDMKycFFcKzCvK8kmQ+ryhnJM8qSE6ZcKuQPKsgzyulzyrIs0ryrEIoqiRmusRUhzLTJWa6hBCiz1B6f/XTMdEhsjEeDEVM/opMuQe8k+IqwkmJlJDCSoEQUs4T2bJWRVWElxJOIMVVAiGklCMVPCGEPK8kk1rTk1o35F1gBCEAgCaz0CMWen89bmwkNTauM0KkwstEfF5JnlcQQkgZL5TzL98trCRSgRBCeIEUVb7cWcYLz/56ff/FywPeiZGESGhCEWKmQxFC9BiizxJCiImEMDRhKGIioQkh+iyRJa6pDrExaOBBlwhCAAAghBCaIo11SWPdmjGjFeP80ccIAAC0GoIQAAC0mqYF4a1bt86fPy92FQAv7dq1q7S0VOwqAAghpKioKCEhQewqVJGmBWFiYuKWLVvErgLgpTlz5ty9e1fsKgAIIeTmzZsLFy4UuwpVpGlBCAAA8E4QhAAAoNUQhAAAoNUoQXj3AZDK1bdv3/Pnz1tbW9fn4JycnMrKyiZNmii6KoD6uHHjhouLi56entiFAJDS0tKHDx+2aNFC7EKUaujQoVOnTn3zMWoQhPfu3Xv8+LGBgUF9Di4tLeU4zsTERNFVAdRHdna2lZUVRWnFqGRQcVKpNC8vz8rKSuxClMre3t7GxubNx6hBEAIAACgOnhECAIBWQxACAIBWQxACAIBW0/wgXLJkSVRU1OzZs3mef/vRAIq0fv36L7744qeffhK7EABy586dzz//vEePHuPGjXv48KHY5YhJw4NwzZo1qamp69atKygomDdvntjlgLbLzc21sbFJTEwUuxAAkp2dHRkZ+fvvv/v4+AwePFjscsSk4b1Gw8PDZ82a1alTp0ePHkVGRl65ckXsikDbnThxYu7cuZj7GFRHUVGRu7t7VlaW2IWIRsNbhI8fP5YNrre3t3/y5InY5QAAqJw5c+aMGjVK7CrEpOEr1EskEtmjQZ7nJRKJ2OUAAKiWX3/99erVqzt27BC7EDGpaxA+f/78woULd+7cCQwMdHd3r97/9OnTtWvXvnjxom/fvu3bt3dzc0tPT3dycrp161azZs1ELBg027Nnz86fP3/v3r3Q0FBnZ+fq/Y8ePVq3bl1FRUX//v09PT1FrBC0B8/z6enply5dMjQ07Nu3b/V+QRC2bduWlpbm5uY2YsQIiUQSHx+/Y8eOXbt2aXk7gZk1a5bYNbyPdu3aHTlyZPv27a1bt27btq1sZ0FBQfv27S0tLS0sLKKjo319fTt27Pjtt99aWVnNnj17/Pjxbdq0Ebds0FTu7u5nzpzZtGmTr69vy5YtZTuzsrLatWvn5ORkZGQ0duzYoKCg8+fPJycnp6WlMQxja2trbGwsbtmgkZYsWTJx4sS0tLTU1NTo6Ojq/bGxsStWrOjWrdumTZuSkpKMjIyio6PHjh1748aNtLS09u3ba+1cgOraWYbjOJZlO3XqNGHChJEjR8p2Lliw4ODBgwcOHCCELFq0KDExMSkp6cyZM0ePHvXx8QkKChK1ZNBksguydevWs2fP7tevn2znrFmzrl69un37dkLIf/7zn8uXL3/00Uc5OTmyd3v16mVnZydaxaC5ZFfj77///uOPP547d0628/nz5/b29mlpaS1atCgqKrKzs1u/fn12dnb1V40dO1Zrg1Bdb42yrJzKjx49Gh4eLnsdHh7+r3/9SyqV+vr6+vr6Krc60DpyL8gjR458+OGHstfh4eE//PCDLBQBFEru1XjmzBkrKyvZ0hMmJiadOnXKzMycMGGC0qtTRRrVazQrK8vS0lL22traurKyMi8vT9ySQJvVuiCfPXtWVlYmbkmgtZ4+fVpz3Qlra+unT5+KWI9K0aggZBhGKpXKXnMcRwjR8ifAIK5aFyRFUQzDiFsSaC2WZauvRkIIz/NyG47aSaOC0M7OLjMzU/Y6MzNTX1/f1NRU3JJAm9W6IC0tLXV0dMQtCbSWra1t9dVICHny5AkeUVfTqCCMjIzcsWOH7FPP9u3bIyMjtfbZL6iCyMjI7du3y/qjyS5IsSsC7eXn51deXn7q1ClCyJMnT9LS0qp7VIC69hqdMWNGamrquXPnmjRpYmtr+9133/n4+JSWlnbr1q1Ro0aOjo579+49dOgQRm6BckybNu3y5cupqamurq5WVlY//PBD69atCwsL/fz8HBwcLCwsDh48ePz48ZpjXgEU5MKFCzExMVlZWQ8fPuzYsaOvr+9///tfQsjPP//8v//9r0+fPgcPHuzTp8/3338vdqWqQl2D8PLly7m5udWb7dq1a9y4MSGkoqIiMTGxuLg4NDTU2tpavAJBu6SlpT179qx6s0OHDrLb8qWlpQcOHCgrKwsLC7OwsBCvQNAiBQUFFy5cqN60sLCoHmx95cqVtLS05s2b+/n5iVSdKlLXIAQAAGgQGvWMEAAA4F0hCAEAQKshCAEAQKshCAEAQKshCAEAQKshCAEAQKshCAEAQKshCAE039ChQ0eNGiV2FQAqCrOPA2i+p0+f6uvri10FgIpCixAAALQaWoQAypabm7t58+aMjAxTU9OoqKj27dvL9vM8v2rVqs6dOxsZGW3cuDE/P9/Hx2fQoEE0/fcH1tLS0k2bNl27ds3AwCAkJCQwMLDmmXmeT0hIOHv2bHl5ebNmzXr16uXg4FD9bnl5+fr169PT052dnQcOHFhzmdZr164lJCTk5OSYmJh4enqGhYUZGRkp9l8BQGVgrlEApTp8+HC/fv309fW9vb0fP3585cqV77//furUqYSQiooKPT29wYMHHzx4sG3bthUVFadOnerVq9eOHTtkK/o+fPiwe/fuWVlZfn5+ubm5ly9fHjVq1KpVq2TLjeXm5kZERFy8eLFdu3a2trbXr19v0qRJSkoKI
SQwMFAqlZaVlZWUlFhbW585c6Zx48bXr183MTEhhMTHx48ZM8bT09Pd3T0vLy8tLW3NmjV9+vQR9d8JQIkEAFCWgoICc3PzXr16lZSUyPbMnDmTYZjr168LglBeXk4IoWn68OHDsnfj4+MJIStXrpRtRkREmJiYXLlyRbYpW1tnw4YNsk1ZvqakpFR/uz///FP2IiAggBAyf/582eaZM2coilqwYIFss3nz5iNHjqz+qtLS0vz8/Ib/4QFUFYIQQHl+/fVXQsjt27er91RVVRkYGCxatEj4Kwh79epV/a5UKm3VqlWPHj0EQSguLqYoaurUqdXvVlZW2tvbh4eHC4KQnZ1d692aAgICHB0deZ6v3tOqVavq8HN0dOzXr19paWkD/qQAagTPCAGU58qVKzRNf/nll7LMkxEE4c6dO9Wb7dq1q35NUVTbtm3PnDlDCMnIyBAEofqBIiFEIpF4eXnduHGDEHLjxg1BEDp16lTXt3Zzc6v5rNHCwqJ6Rc/p06dPnjzZ2to6MjIyLCysT58+ZmZmDfDTAqgJBCGA8lRWVrIs27Vr15o7Q0JCPDw8qjdZ9pXfSh0dnYqKCkJIYWGhbLPWuzzPE0I4jnv93ZokEknNTdljRZnPPvvM399/27Zthw4dio6O/uqrr/bt2+fj4/PuPx+AWkIQAiiPq6trZWXlkCFDnJyc6jrm9u3bNTdv3brl6upKCGnatKlss+a76enpsv3NmjUjhFy7dq13797vUZinp6enp+d//vOf+/fvd+rUac6cOX/88cd7nAdAHWEcIYDyDBo0SCKRxMbGyhpwMiUlJc+ePave3L59+/3792WvT506dfr06bCwMEKIo6Ojt7d3XFxcQUGB7N0dO3bcunWrX79+hJCmTZv6+fn9+OOPjx8/rj6VrCn5ZlKpNDMzs3rTycnJ1ta2ZnkAGg8tQgDlcXFxWbZs2fjx469evRoeHq6np3fnzp3ExMRNmzaFh4fLjmnfvr2fn9+wYcMqKirWrFnTqlWryZMny9765ZdfgoODfXx8BgwYkJOTs379+s6dO0+YMEH27sqVK7t37+7l5TVgwABbW9sbN24UFhYeOHDgzSVxHOfk5BQWFubh4WFoaHj8+PGrV6/OmTNHcf8IAKoG4wgBlO3y5curVq26fv26RCJxcHAIDQ3t1auXgYGBbBzhwoULW7ZsGR8fn5+f37Fjx5iYmJpdV27fvr106dKrV68aGhoGBwePHz++5txp2dnZP//889mzZ3med3Z2HjZsmGzE/fLlyyUSySeffFJ95PLly3V0dEaNGiUIwsaNG48dO/bw4UOO45o1azZmzJiaXXIANB6CEEBVVAfhlClTxK4FQIvgGSEAAGg1BCGACjEzM9PT0xO7CgDtglujAACg1dAiBAAArYYgBAAArfb/ks8yH29TyCcAAAAASUVORK5CYII=", + "text/html": [ + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n" + ], + "image/svg+xml": [ + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + " \n", + " \n", + " \n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "\n" + ] + }, + "metadata": {}, + "execution_count": 10 + } + ], + "cell_type": "code", + "source": [ + "plot(\n", + " curve.parameter_values,\n", + " curve.measurements,\n", + " xlab=curve.parameter_name,\n", + " xscale=curve.parameter_scale,\n", + " ylab = \"Cross Entropy\",\n", + ")" + ], + "metadata": {}, + "execution_count": 10 + }, + { + "cell_type": "markdown", + "source": [ + "---\n", + "\n", + "*This notebook was generated using [Literate.jl](https://github.com/fredrikekre/Literate.jl).*" + ], + "metadata": {} + } + ], + "nbformat_minor": 3, + "metadata": { + "language_info": { + "file_extension": ".jl", + "mimetype": "application/julia", + "name": "julia", + "version": "1.10.3" + }, + "kernelspec": { + "name": "julia-1.10", + "display_name": "Julia 1.10.3", + "language": "julia" + } + }, + "nbformat": 4 +} diff --git a/docs/src/common_workflows/hyperparameter_tuning/notebook.jl b/docs/src/common_workflows/hyperparameter_tuning/notebook.jl index aa39830d..3c85ec16 100644 --- a/docs/src/common_workflows/hyperparameter_tuning/notebook.jl +++ b/docs/src/common_workflows/hyperparameter_tuning/notebook.jl @@ -24,7 +24,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = 
Float32.(X); # To be compatible with type of network network parameters diff --git a/docs/src/common_workflows/hyperparameter_tuning/notebook.md b/docs/src/common_workflows/hyperparameter_tuning/notebook.md index ae50dd14..d6649fe0 100644 --- a/docs/src/common_workflows/hyperparameter_tuning/notebook.md +++ b/docs/src/common_workflows/hyperparameter_tuning/notebook.md @@ -26,7 +26,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example hyperparameter_tuning iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters nothing #hide ```` diff --git a/docs/src/common_workflows/hyperparameter_tuning/notebook.unexecuted.ipynb b/docs/src/common_workflows/hyperparameter_tuning/notebook.unexecuted.ipynb index 2060f391..bbb6280a 100644 --- a/docs/src/common_workflows/hyperparameter_tuning/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/hyperparameter_tuning/notebook.unexecuted.ipynb @@ -73,7 +73,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, diff --git a/docs/src/common_workflows/incremental_training/notebook.ipynb b/docs/src/common_workflows/incremental_training/notebook.ipynb index b85e848b..e3b44f52 100644 --- a/docs/src/common_workflows/incremental_training/notebook.ipynb +++ b/docs/src/common_workflows/incremental_training/notebook.ipynb @@ -7,6 +7,14 @@ ], "metadata": {} }, + { + "cell_type": "markdown", + "source": [ + "This demonstration is available as a Jupyter notebook or julia script\n", + "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training)." + ], + "metadata": {} + }, { "cell_type": "markdown", "source": [ @@ -36,9 +44,7 @@ { "cell_type": "markdown", "source": [ - "**Julia version** is assumed to be 1.10.* This tutorial is available as a Jupyter\n", - "notebook or julia script\n", - "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training)." 
+ "**Julia version** is assumed to be 1.10.*" ], "metadata": {} }, @@ -73,7 +79,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X) # To be compatible with type of network network parameters\n", "(X_train, X_test), (y_train, y_test) = partition(\n", " (X, y), 0.8,\n", @@ -113,7 +119,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = ComputationalResources.CPU1{Nothing}(nothing))" + "text/plain": "NeuralNetworkClassifier(\n builder = MLP(\n hidden = (5, 4), \n σ = NNlib.relu), \n finaliser = NNlib.softmax, \n optimiser = Adam(0.01, (0.9, 0.999), 1.0e-8), \n loss = Flux.Losses.crossentropy, \n epochs = 10, \n batch_size = 8, \n lambda = 0.0, \n alpha = 0.0, \n rng = 42, \n optimiser_changes_trigger_retraining = false, \n acceleration = CPU1{Nothing}(nothing))" }, "metadata": {}, "execution_count": 4 @@ -161,7 +167,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "trained Machine; caches model-specific representations of data\n model: NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)\n args: \n 1:\tSource @068 ⏎ ScientificTypesBase.Table{AbstractVector{ScientificTypesBase.Continuous}}\n 2:\tSource @767 ⏎ AbstractVector{ScientificTypesBase.Multiclass{3}}\n" + "text/plain": "trained Machine; caches model-specific representations of data\n model: NeuralNetworkClassifier(builder = MLP(hidden = (5, 4), …), …)\n args: \n 1:\tSource @547 ⏎ Table{AbstractVector{Continuous}}\n 2:\tSource @645 ⏎ AbstractVector{Multiclass{3}}\n" }, "metadata": {}, "execution_count": 5 diff --git a/docs/src/common_workflows/incremental_training/notebook.jl b/docs/src/common_workflows/incremental_training/notebook.jl index 20d38b53..6d44c046 100644 --- a/docs/src/common_workflows/incremental_training/notebook.jl +++ b/docs/src/common_workflows/incremental_training/notebook.jl @@ -22,7 +22,7 @@ import Optimisers # native Flux.jl optimisers no longer supported # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X) # To be compatible with type of network network parameters (X_train, X_test), (y_train, y_test) = partition( (X, y), 0.8, diff --git a/docs/src/common_workflows/incremental_training/notebook.md b/docs/src/common_workflows/incremental_training/notebook.md index 94be1207..3810f90c 100644 --- a/docs/src/common_workflows/incremental_training/notebook.md +++ b/docs/src/common_workflows/incremental_training/notebook.md @@ -4,11 +4,12 @@ EditURL = "notebook.jl" # Incremental Training with MLJFlux +This demonstration is available as a Jupyter notebook or julia script +[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training). + In this workflow example we explore how to incrementally train MLJFlux models. 
-**Julia version** is assumed to be 1.10.* This tutorial is available as a Jupyter -notebook or julia script -[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training). +**Julia version** is assumed to be 1.10.* ### Basic Imports @@ -23,7 +24,7 @@ import Optimisers # native Flux.jl optimisers no longer supported ````@example incremental_training iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X) # To be compatible with type of network network parameters (X_train, X_test), (y_train, y_test) = partition( (X, y), 0.8, diff --git a/docs/src/common_workflows/incremental_training/notebook.unexecuted.ipynb b/docs/src/common_workflows/incremental_training/notebook.unexecuted.ipynb index 4d12d4d7..b9227430 100644 --- a/docs/src/common_workflows/incremental_training/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/incremental_training/notebook.unexecuted.ipynb @@ -7,6 +7,14 @@ ], "metadata": {} }, + { + "cell_type": "markdown", + "source": [ + "This demonstration is available as a Jupyter notebook or julia script\n", + "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training)." + ], + "metadata": {} + }, { "cell_type": "markdown", "source": [ @@ -28,9 +36,7 @@ { "cell_type": "markdown", "source": [ - "**Julia version** is assumed to be 1.10.* This tutorial is available as a Jupyter\n", - "notebook or julia script\n", - "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/incremental_training)." + "**Julia version** is assumed to be 1.10.*" ], "metadata": {} }, @@ -65,7 +71,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X) # To be compatible with type of network network parameters\n", "(X_train, X_test), (y_train, y_test) = partition(\n", " (X, y), 0.8,\n", diff --git a/docs/src/common_workflows/live_training/notebook.jl b/docs/src/common_workflows/live_training/notebook.jl index 16bae98a..de1a6fb8 100644 --- a/docs/src/common_workflows/live_training/notebook.jl +++ b/docs/src/common_workflows/live_training/notebook.jl @@ -23,7 +23,7 @@ using Plots # ### Loading and Splitting the Data iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters diff --git a/docs/src/common_workflows/live_training/notebook.md b/docs/src/common_workflows/live_training/notebook.md index edc1b140..14b77358 100644 --- a/docs/src/common_workflows/live_training/notebook.md +++ b/docs/src/common_workflows/live_training/notebook.md @@ -4,7 +4,7 @@ EditURL = "notebook.jl" # Live Training with MLJFlux -This tutorial is available as a Jupyter notebook or julia script +This demonstration is available as a Jupyter notebook or julia script [here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/live_training). 
**Julia version** is assumed to be 1.10.* @@ -26,7 +26,7 @@ using Plots ````@example live_training iris = RDatasets.dataset("datasets", "iris"); -y, X = unpack(iris, ==(:Species), colname -> true, rng=123); +y, X = unpack(iris, ==(:Species), rng=123); X = Float32.(X); # To be compatible with type of network network parameters nothing #hide ```` diff --git a/docs/src/common_workflows/live_training/notebook.unexecuted.ipynb b/docs/src/common_workflows/live_training/notebook.unexecuted.ipynb index a647a39a..fb86f8e7 100644 --- a/docs/src/common_workflows/live_training/notebook.unexecuted.ipynb +++ b/docs/src/common_workflows/live_training/notebook.unexecuted.ipynb @@ -10,7 +10,7 @@ { "cell_type": "markdown", "source": [ - "This tutorial is available as a Jupyter notebook or julia script\n", + "This demonstration is available as a Jupyter notebook or julia script\n", "[here](https://github.com/FluxML/MLJFlux.jl/tree/dev/docs/src/common_workflows/live_training)." ], "metadata": {} @@ -73,7 +73,7 @@ "cell_type": "code", "source": [ "iris = RDatasets.dataset(\"datasets\", \"iris\");\n", - "y, X = unpack(iris, ==(:Species), colname -> true, rng=123);\n", + "y, X = unpack(iris, ==(:Species), rng=123);\n", "X = Float32.(X); # To be compatible with type of network network parameters" ], "metadata": {}, diff --git a/readme_figure.png b/readme_figure.png index c5ad0267d655cae529b8ec334ddbbc66fb4b3b9d..77554bfd918f79955204fb7b73db1bcd3eb964ef 100644 GIT binary patch literal 23421 zcmb4rbzGEN)b0QxNQg8N(%mVI3W&6HH_{!_p$HNJ(%sVC-O?p3jdXYC-2>;l-|zl; z=gc`X!#lJ0+H0@9*0Y|q-V^XqMhq1R9|;11po)JGk%vIwNFWfHpNO#FFH>R>2H@qX zzNDB4=Q&zQh}+lUH0Q^gBt%AAAWzE znsc9LzK>1f=es7-M*FvX%zuqSE{*{MWOy^G7sUi#AuXdMy3jvxus&e`FWyvs-tge{ z&+iu{-~|$o2a5w<-e3y1Kv!U*kEja$vos0}GI&W1ApP&h9k3J00&Ww98cw6uxKGla z1;L#ACPUaxp8QOs_Le$KPE~D@kdd-x^K6aKnkjKI%QV3Vtb6+wym5zNixazTHWH((n*3ISB&&6T5!?PGN@o=l^0K1HywDB27?J+7wI%> z3*vn9yU!%#u`_)0rsdSL_xPsqe7>*Di;kr~MM#L6dVNTs$s%x-_ak%-p;kK2X>y_8|5$-|9D*K($+5d7PGcV#XkGw2Ni zxAnBKx3{rjf&{c@<#5xLwbj*C<@{4?uWJocQ`1kMRxb{hW0-ZAUb`&?P|GCoii!|>Ui|YopF1Dq z+aDVnlS<}Sv(l|p?+PK>w&C_S8BFAHns-@86T11RsX5#1b?<|Kh9T$T;xbvJb-p*J z0SxQ&&R@6v#_F=NvexO(d$UuGZs+#vgP#ly9d`c8CC_R$I3E_Ml~ai46%^cTWQ6*g zY)=$sybsRH%`KhT-EY2cqNp8s3cJU9XE4)bG@XfA)$nR8D|n<59Akvu*OIdPn4*){ z2p;1tQxht5 zcQO4`*bl{Jw{p_I+p4-)tvpMk+UovlvKj2|S<3y*cmI)*5v%&$s!l~tx3e7;ZI@MW zq^QYsbQWTCJYCxTQ29RZ#PZm$MG@D56_pxv4aTu&AD(ZMfdhcv{vxd1Orh>M1actt z9+kZcd)f6eaGK!$?_~Ye$tF4RhwzhW^Ws8{>OrG&Q;zr`TxRW=8nMtJqMO{e|HP4GLhR>AB?wbt*K^3P~wb@&a?A5a~s(8(63b!3wkH55B8aL zs~_`Kisx%=mw`py+}_G2a+4O@)FBP&Jb#MPKE_AvWC_R2%v?UjkW!(_kp4GkoR2Q5 zchjM_>-l%#LS@Y1Ij04W>)mSW`39HO-YDpC4Zx)4b=q@3TZft=Kf^CS;+gV#HB3Jz( zAu1w+0CnweGQbgD+pS(mX3_gPWpNiV!=$;H?6x;m=V-34ukY;aTx~VyoIGj5G0By{ zWf@;xUG2^fOvUSFnHXr1-~FPbL%(NWV6MT19X%MIQzBEoqqFmsfU5&Io^o<>>_+|F z-Q6D~B|~S{u+GOwhkobcTXf z5Nj@y78i9dz4cWHEp`xu)vZ#l?lWtgD;b`R>elFS}V~ofUfHk55y; zXf>^4TvXPYU?ipSpwrsG1Ox=kR9mmR+-1gS*KHToM>2I$)|j`1XCsP2LlcC?@@S<8 zC_}jpX>y6nK2Aj;H{io>L})NaN54I2M?12TcO-n?mB?d%bbMTF>Auz7S=EM+?C~%lM_abDiJZUNssfN{$QRGr+w|9 zj*$$pP#CzUFNV{Eze`b#4kqzQQz@94nK?UOM6p@{2Mdik-~%}GAUwLmdwJcL8xOtW zcP=rXECPs{26*{|hzJ7~ zyr*cd62rsoBD}sz_jiOP4kB}7%gHyJ( zJp`_K9!vy{bH{4~w6wIBmzQ&Ojx&vJPVe5eMeVz;#ThFXYg=;QzqXpa_#=!aaCS6FN7{f!NikmXlyJEe!q zU6{i=Q{~N%K(r%`K>$j(q1m#BC|GpCo zf^hpg0Q3HO;lAnq?i#rKMQ(s4JUxIH2f^LSY7am{Dydi&$E~sH>1l8NHg=_cN8o&;TUQ740d+xEEw;1NDegyC+r^V^t1;TH zroi1E?J45~Wm?Ug0K@D>AO}%_MYjc+xq-#7`zLrj!=ysN=kWKpv$I=y3J(t-gu}?| za%AY>aH1e9Fb<-CtkCUAnAj`Q^t0WWJ~Uzu{9xb(O<8n585?^YcHrVNX|mu4vl|ae 