Support multiple data fidelity dimensions in MultiFidelityGP models #1956
Conversation
…o allow for multiple data fidelity dimensions
Codecov Report
```
@@            Coverage Diff            @@
##              main    #1956    +/-  ##
=========================================
  Coverage   100.00%   100.00%
=========================================
  Files          178       178
  Lines        15748     15739       -9
=========================================
- Hits         15748     15739       -9
=========================================
```
Thanks for adding this, and for the thorough contribution. I have mostly cosmetic comments, as well as a suggestion for how to accept data fidelities going forward (which may make fixing the lint errors unnecessary).
```diff
@@ -66,7 +66,7 @@ def __init__(
     train_X: Tensor,
     train_Y: Tensor,
     iteration_fidelity: Optional[int] = None,
-    data_fidelity: Optional[int] = None,
+    data_fidelity: Optional[int | List[int]] = None,
```
While I like this, the `|` operator only works in py3.10+, and we still need to be compatible with py3.9.
```diff
-    data_fidelity: Optional[int | List[int]] = None,
+    data_fidelity: Optional[Union[int, List[int]]] = None,
```
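For illustration, a py3.9-compatible version of the overloaded annotation could normalise the argument internally (a hypothetical sketch; `normalize_fidelity` is not a BoTorch function):

```python
from typing import List, Optional, Union


def normalize_fidelity(
    data_fidelity: Optional[Union[int, List[int]]] = None,
) -> List[int]:
    # Accept None, a single int, or a list of ints, and return a list,
    # so downstream code only has to handle one shape.
    if data_fidelity is None:
        return []
    if isinstance(data_fidelity, int):
        return [data_fidelity]
    return list(data_fidelity)
```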
Could also make sense to have a `data_fidelities` argument that only accepts lists (or tuples) and require people to use that even for the single-fidelity case; that way we wouldn't have to overload the arg types. This could be done nicely by keeping the `data_fidelity` arg around but marking it as deprecated (and raising a deprecation warning if used in the constructor). Wdyt?
cc @saitcakmak, @esantorella for thoughts.
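The suggested deprecation path could look roughly like this (an illustrative sketch, not actual BoTorch code; the class name and argument handling here are assumptions):

```python
import warnings
from typing import List, Optional


class MultiFidelityGPSketch:
    """Hypothetical constructor accepting only `data_fidelities` going
    forward, with `data_fidelity` kept as a deprecated alias."""

    def __init__(
        self,
        data_fidelities: Optional[List[int]] = None,
        data_fidelity: Optional[int] = None,  # deprecated
    ) -> None:
        if data_fidelity is not None:
            warnings.warn(
                "`data_fidelity` is deprecated, use `data_fidelities` instead.",
                DeprecationWarning,
            )
            if data_fidelities is not None:
                raise ValueError(
                    "Specify only one of `data_fidelity` and `data_fidelities`."
                )
            # Wrap the single fidelity dimension in a list.
            data_fidelities = [data_fidelity]
        self.data_fidelities = data_fidelities
```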
Summary: Pull Request resolved: pytorch#1972

Motivation: Made a couple of minor corrections to the BOPE tutorial, bringing the text in line with the code. The code and text probably drifted apart over time.

Have you read the [Contributing Guidelines on pull requests](https://github.com/pytorch/botorch/blob/main/CONTRIBUTING.md#pull-requests)? Yes

Pull Request resolved: pytorch#1971
Test Plan: Check that it renders okay in GH
Reviewed By: esantorella
Differential Revision: D48175913
Pulled By: saitcakmak
fbshipit-source-id: f312366955ffc69683b06c37ea971e595f1cf738
Summary: Pull Request resolved: pytorch#1945 This diff deprecates `FixedNoiseDataset` and merges it into `SupervisedDataset` with Yvar becoming an optional field. This also simplifies the class hierarchy a bit, removing `SupervisedDatasetMeta` in favor of an `__init__` method. I plan to follow up on this by adding optional metric names to datasets and introducing a MultiTaskDataset, which will simplify some of the planned work in Ax MBM. Reviewed By: esantorella Differential Revision: D47729430 fbshipit-source-id: 551cd78a02755505573b10ea1f075aa21f838ab7
Summary: X-link: facebook/Ax#1771 Pull Request resolved: pytorch#1973 Currently, constraints are not used in single objective AFs in MBM due to a name mismatch between `outcome_constraints` and `constraints`. Reviewed By: SebastianAment Differential Revision: D48176978 fbshipit-source-id: 9495708002c11a874bb6b8c06327f0f4643039df
Summary: see title Reviewed By: saitcakmak Differential Revision: D48230494 fbshipit-source-id: ae5a2355a4ffe30357fdc4a333e3235e6106b6a0
Summary: `posterior_transform` is `Optional` in the constructor of the acquisition function, so it should be optional in the private methods as well. Reviewed By: saitcakmak Differential Revision: D48244486 fbshipit-source-id: f0850ecde5259d626cacbacd8c36401483a69d15
Summary: see title. Deploy on release failed due to running on py3.8 https://github.com/pytorch/botorch/actions/runs/5826595266/job/15801310995#step:6:37 Reviewed By: saitcakmak, Balandat Differential Revision: D48250526 fbshipit-source-id: 11412ad94bd237957f7e61c24f2f7bd890b630b1
Thanks for making these edits. The only outstanding things would be to
…d set of exceptions (pytorch#1872) Summary: X-link: facebook/Ax#1772. Pull Request resolved: pytorch#1872
- [x] Remove unused arguments from input constructors and related functions. The idea is especially not to let unused keyword arguments disappear into `**kwargs` and be silently ignored
- [x] Add arguments to some input constructors so they don't need any `**kwargs`
- [x] Add a decorator that ensures that each input constructor can accept a certain set of keyword arguments, even if those are not used by the constructor, while still erroring on unexpected arguments
- [ ] Prevent arguments from having different defaults in the input constructors than in acquisition functions

Reviewed By: lena-kashtelyan
Differential Revision: D46519588
fbshipit-source-id: 8d727cb991f2899cc9e7ee68da7998e981ca8802
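The decorator idea described in that commit can be sketched roughly as follows (a hypothetical illustration, not the actual BoTorch implementation; the name `allow_only_kwargs` and its exact behaviour are assumptions):

```python
import functools
import inspect


def allow_only_kwargs(allowed):
    """Let a constructor tolerate a fixed set of extra keyword arguments
    (silently dropped if unused) while erroring on anything unexpected."""

    def decorator(fn):
        sig_params = set(inspect.signature(fn).parameters)

        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            extra = set(kwargs) - sig_params - set(allowed)
            if extra:
                raise TypeError(f"Unexpected keyword arguments: {sorted(extra)}")
            # Drop allowed-but-unused kwargs instead of passing them through.
            kwargs = {k: v for k, v in kwargs.items() if k in sig_params}
            return fn(*args, **kwargs)

        return wrapper

    return decorator
```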
Summary: X-link: facebook/Ax#1782 Pull Request resolved: pytorch#1985 See previous diff Reviewed By: lena-kashtelyan Differential Revision: D48338443 fbshipit-source-id: f5f674fe224cf5a1ef06a6de9cdad10dfda09382
Hi @Balandat, I've restored the code coverage to 100%, made the changes to the tutorials' documentation, and rebased onto the newest changes. :)
@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Hmm, not sure why the doc build is failing; it seems like there is an issue with the torch install?
I'll retry; hopefully this is just a temporary issue with the package on the GHA runner.
Summary: Pull Request resolved: pytorch#1990 Sphinx 7.2.0 or later leads to doc build failures. Reviewed By: esantorella Differential Revision: D48487417 fbshipit-source-id: d2ceafa62e94f6a19f82529e868bd342cb691eb9
Hi @Balandat, is this an issue that I can fix from my side?
This turned out to be an issue caused by a sphinx version bump and was fixed in #1990; if you rebase on the latest changes in master, the issue should go away.
I've rebased to main again, so it should be fixed now @Balandat
@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Support multiple data fidelity dimensions in MultiFidelityGP models
Motivation
Both the SingleTaskMultiFidelityGP and the FixedNoiseMultiFidelityGP only allowed for one data-fidelity dimension. This PR generalises them by adding the option of passing in a list of data-fidelity dimensions. In the case of multiple data-fidelity dimensions, a kernel is created for each dimension, and the resulting kernels are combined in a product kernel. This is useful for situations such as the one discussed in #1942 with @Balandat.
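The product-kernel construction can be illustrated with a minimal pure-Python sketch (the 1-D RBF kernel here is only for illustration; the actual models use BoTorch/GPyTorch fidelity kernels combined in a GPyTorch `ProductKernel`):

```python
import math


def rbf_1d(x: float, y: float, lengthscale: float = 1.0) -> float:
    # Simple 1-D RBF kernel, standing in for a per-dimension fidelity kernel.
    return math.exp(-((x - y) ** 2) / (2.0 * lengthscale ** 2))


def product_fidelity_kernel(x, y, fidelity_dims) -> float:
    # One kernel per data-fidelity dimension, multiplied together,
    # mirroring the product-kernel construction described above.
    k = 1.0
    for d in fidelity_dims:
        k *= rbf_1d(x[d], y[d])
    return k
```

Because the factors multiply, two points are only considered similar if they are close along every fidelity dimension.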
Have you read the Contributing Guidelines on pull requests?
Yes.
Test Plan
I changed code in `botorch/models/gp_regression_fidelity.py` and the accompanying test code in `test/models/test_gp_regression_fidelity.py`. The logic remains unchanged in the case of a single data-fidelity dimension. The changes allowing multiple fidelity dimensions were tested by adding appropriate cases to the `FIDELITY_TEST_PAIRS` list in the `TestSingleTaskMultiFidelityGP` class.

Furthermore, the changes were tested on a minimisation toy problem with two data-fidelity dimensions and used to create the following plots:
where `g11` has fidelities `[1.0, 1.0]`, `g01` has fidelities `[0.75, 1.0]`, `g10` has fidelities `[1.0, 0.75]`, and `g00` has fidelities `[0.75, 0.75]`.
Related PRs
N/A