
add option for Burst inference #4925

Merged (3 commits) Feb 9, 2021
Conversation

@chriselion (Contributor) commented Feb 8, 2021

Proposed change(s)

Follow-up from Slack with Florent: add InferenceDevice.Burst and make it the default. Changing the existing behavior of InferenceDevice.CPU in a minor version isn't allowed, though.

Also, CPU needs to stay the default for Agent.SetModel for now:

public void SetModel(
    string behaviorName,
    NNModel model,
    InferenceDevice inferenceDevice = InferenceDevice.CPU)
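A minimal sketch of what the change described above might look like. The Burst member name comes from this PR; the explicit enum values and comments are assumptions made for illustration, chosen so existing serialized scenes and prefabs that stored CPU or GPU keep their meaning:

```csharp
namespace Unity.MLAgents.Policies
{
    // Sketch only: existing members keep their values so that data
    // serialized with older versions still deserializes correctly.
    public enum InferenceDevice
    {
        CPU = 0,
        GPU = 1,
        Burst = 2  // new: Burst-compiled CPU inference path
    }
}
```

Under this scheme, new BehaviorParameters can default to InferenceDevice.Burst while Agent.SetModel keeps InferenceDevice.CPU as its default parameter, preserving the documented behavior of existing callers within the minor version.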

Useful links (Github issues, JIRA tickets, ML-Agents forum threads etc.)

https://jira.unity3d.com/browse/MLA-1763
https://jira.unity3d.com/browse/MLA-1765 (followup to break the behavior in the next major version)

Types of change(s)

  • New feature

Checklist

  • Added tests that prove my fix is effective or that my feature works
  • Updated the changelog (if applicable)

@chriselion (Contributor, Author) commented:

Prefab changes coming soon.

@chriselion chriselion merged commit beef587 into master Feb 9, 2021
@delete-merged-branch delete-merged-branch bot deleted the MLA-1763-barracuda-burstCpu branch February 9, 2021 00:52
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Feb 9, 2022
Labels: none · Projects: none · 2 participants