
docs: update finetuner docs #843

Merged: 9 commits, Oct 21, 2022

docs/user-guides/finetuner.md: 1 addition & 4 deletions

````diff
@@ -92,6 +92,7 @@ run = finetuner.fit(
     learning_rate=1e-5,
     loss='CLIPLoss',
     cpu=False,
+    to_onnx=True,
 )
 ```
````
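
For orientation, this hunk lands inside the tutorial's `finetuner.fit` call. A minimal sketch of what the full call might look like, assuming hypothetical model and dataset names and an assumed `epochs` value (the real tutorial's values are not visible in this diff):

```python
import finetuner

finetuner.login()  # Finetuner runs jobs in the cloud, so a login comes first

run = finetuner.fit(
    model='openai/clip-vit-base-patch32',  # hypothetical backbone name
    train_data='my-clip-train-data',       # hypothetical dataset name
    epochs=5,                              # assumed value, not from the diff
    learning_rate=1e-5,
    loss='CLIPLoss',
    cpu=False,
    to_onnx=True,  # the line this PR adds: export the tuned model to ONNX
)
```
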
A reviewer (Member) left a comment on the `to_onnx=True` line:

> As Finetuner supports open_clip, can we fine-tune `model='ViT-B-32::openai'` in this tutorial?

Reply (Member):

> This model name does not match the one used in Finetuner.

jemmyshin marked this conversation as resolved.
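
Some context on the naming mismatch: Finetuner keeps its own catalogue of backbone identifiers, which need not follow clip_server's `ViT-B-32::openai` convention. A minimal sketch for checking what Finetuner actually accepts, assuming the client API of that release:

```python
import finetuner

# Prints the catalogue of backbone names Finetuner accepts in fit(model=...);
# compare these against clip_server's naming before fine-tuning.
finetuner.describe_models()
```
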

````diff
@@ -174,10 +175,6 @@ executors:
     replicas: 1
 ```
 
-```{warning}
-Note that Finetuner only supports the ViT-B/32 CLIP model currently. The model name should match the fine-tuned model, or you will get incorrect output.
-```
-
 You can now start the `clip_server` using the fine-tuned model to get a performance boost:
 
 ```bash
````
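
The bash block that starts the server is truncated in this view. Once the server is running with the fine-tuned ONNX model, a quick smoke test from Python can confirm it serves embeddings; a minimal sketch, assuming clip_server listens on its default gRPC address:

```python
from clip_client import Client

# Connect to the locally running clip_server (default gRPC port assumed).
client = Client('grpc://0.0.0.0:51000')

# A ViT-B/32 backbone returns 512-dimensional embeddings.
embeddings = client.encode(['a photo of a cat', 'a photo of a dog'])
print(embeddings.shape)  # expected: (2, 512)
```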