
docs: add encrypted fine-tuning documentation #887

Merged
merged 7 commits into main from docs/add_lora_docs on Sep 26, 2024

Conversation

@andrei-stoian-zama (Collaborator) commented Sep 25, 2024

Explains the LORA functionality added by #823

@cla-bot cla-bot bot added the cla-signed label Sep 25, 2024

@jfrery jfrery left a comment

Is it normal to have line breaks everywhere?

docs/deep-learning/lora_training.md (outdated)
Comment on lines 60 to 71
Concrete ML requires a conversion step for the `peft` model, adding FHE-compatible layers. In this step, the number of gradient accumulation steps can also be set. For LORA, it is common to accumulate gradients over several gradient descent steps before updating weights.

<!--pytest-codeblocks:skip-->

```python
gradient_accum_steps = 2
lora_training = LoraTraining(peft_model, gradient_accum_steps)
```

I realize we don't handle that gradient accumulation properly. It's not going to be easy to fix; we really need to integrate the training loop within the LoRA training. I will fix this in the second PR.

Currently, users need to call `lora_training.run_optimizer(False|True)` every `gradient_accum_steps` iterations. So basically we should probably limit `gradient_accum_steps=1` for now, and not talk about it here.
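For illustration, a minimal sketch of what this constraint means for the user-side training loop; how batches are passed to `lora_training` and the exact `run_optimizer` call form are assumptions based on this comment, not the settled API:

```python
# Hypothetical user-side loop: the optimizer update must be triggered manually
# every `gradient_accum_steps` iterations (True on the update step, False otherwise).
gradient_accum_steps = 2

for step, (x, y) in enumerate(train_loader):
    # Forward/backward through the LoRA training wrapper (call form is an assumption)
    loss = lora_training((x, y))

    # Apply the optimizer update only on the last accumulation step
    is_update_step = (step + 1) % gradient_accum_steps == 0
    lora_training.run_optimizer(is_update_step)
```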

Comment on lines +121 to +122
Once fine-tuned, the LORA hybrid FHE model can perform inference only, through the
`model.inference_model` attribute of the hybrid FHE model.
@jfrery (Collaborator) Sep 26, 2024

So here I think we can simply re-use `peft_model`, which should be updated. I will check quickly.

EDIT: I checked, and indeed we can replace all the `hybrid_model....` calls by `lora_training` or `peft_model`, depending on what we call, which makes it much easier. And it's exactly the usage when doing torch + peft fine-tuning.
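A quick sketch of that simplification, assuming `peft_model` and an input batch `x` are already defined as in the earlier examples; this mirrors plain torch + peft usage rather than the documented `hybrid_model` path:

```python
import torch

# After fine-tuning, run inference directly through the peft model
# instead of going through hybrid_model.model.inference_model.
with torch.no_grad():
    logits = peft_model(x)
```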

<!--pytest-codeblocks:skip-->

```python
hybrid_model.model.inference_model(x)
```

Same here, we can just call `peft_model`.

Comment on lines +138 to +142
hybrid_model.model.inference_model.disable_adapter_layers()
hybrid_model.model.inference_model(x)

# Re-enable the LORA weights
hybrid_model.model.inference_model.enable_adapter_layers()

Same, use `peft_model` everywhere.
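A sketch of the suggested usage, assuming `peft_model` and `x` are defined as before; `disable_adapter_layers`/`enable_adapter_layers` are the standard peft toggles, and replacing the `hybrid_model` calls with them is the reviewer's proposal rather than something confirmed in this PR:

```python
# Toggle the LoRA adapters directly on the peft model
peft_model.disable_adapter_layers()  # run the base model without the LoRA weights
y_base = peft_model(x)

peft_model.enable_adapter_layers()   # re-enable the LoRA weights
y_lora = peft_model(x)
```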

@andrei-stoian-zama andrei-stoian-zama marked this pull request as ready for review September 26, 2024 13:46
@andrei-stoian-zama andrei-stoian-zama requested a review from a team as a code owner September 26, 2024 13:46
@jfrery (Collaborator) left a comment

Looks good to me, thanks!

@andrei-stoian-zama andrei-stoian-zama merged commit 3f65ec2 into main Sep 26, 2024
16 checks passed
@andrei-stoian-zama andrei-stoian-zama deleted the docs/add_lora_docs branch September 26, 2024 13:52