
Commit eec8485: typos

Borda committed Nov 5, 2021
1 parent d1be93c
Showing 1 changed file with 1 addition and 3 deletions.
README.md: 4 changes (1 addition & 3 deletions)
@@ -209,7 +209,7 @@ model = ImageClassifier(backbone="resnet18", num_classes=2, optimizer="Adam", lr
model = ImageClassifier(backbone="resnet18", num_classes=2, optimizer="Adam", lr_scheduler=functools.partial(CyclicLR, step_size_up=1500, mode='exp_range', gamma=0.5))

# - Tuple[string, dict]: (The dict takes in the scheduler kwargs)
- model = ImageClassifier(backbone="resnet18", num_classes=2, optimizer="Adam", lr_scheduler=("StepLR", {"step_size": 10]))
+ model = ImageClassifier(backbone="resnet18", num_classes=2, optimizer="Adam", lr_scheduler=("StepLR", {"step_size": 10}))
```

You can also register your own custom scheduler recipes beforehand and use them as shown above:
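
The registration example itself falls outside this hunk. As a hedged sketch (not taken from this commit), a custom recipe might be registered and used roughly as follows; the decorator-style use of the `ImageClassifier.lr_schedulers` registry and the recipe name `my_steplr_recipe` are assumptions based on the surrounding README, not verified against this version of Flash:

```py
import torch
from flash.image import ImageClassifier


# Assumed API: the task's `lr_schedulers` registry can be used as a decorator,
# registering the function under its own name as a scheduler recipe.
@ImageClassifier.lr_schedulers
def my_steplr_recipe(optimizer):
    # A recipe receives the already-configured optimizer and returns a scheduler.
    return torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)


# The registered recipe can then be referenced by name, like the options shown above.
model = ImageClassifier(
    backbone="resnet18",
    num_classes=2,
    optimizer="Adam",
    lr_scheduler="my_steplr_recipe",
)
```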
@@ -238,10 +238,8 @@ The example also uses our [`merge_transforms`](https://lightning-flash.readthedo

```py
import torch
from typing import Any
import numpy as np
import albumentations
from torchvision import transforms as T
from flash.core.data.transforms import ApplyToKeys, merge_transforms
from flash.image import ImageClassificationData
from flash.image.classification.transforms import default_transforms, AlbumentationsAdapter
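
# Hedged sketch (not part of this commit's diff): one way the imports above are
# typically combined, based on the merge_transforms docs linked above. The
# transform-dict key, folder path, and image size are assumptions for illustration.
train_transform = {
    "pre_tensor_transform": ApplyToKeys(  # target key assumed; the real README example may differ
        "input",
        AlbumentationsAdapter(albumentations.HorizontalFlip(p=0.5)),
    ),
}

datamodule = ImageClassificationData.from_folders(
    train_folder="data/train",  # hypothetical folder layout
    # merge_transforms composes the extra augmentation into the default transforms.
    train_transform=merge_transforms(default_transforms((196, 196)), train_transform),
)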
