
Prior based on weights #153

Draft
wants to merge 14 commits into main
Conversation

@raphaelshirley (Member) commented Jun 5, 2024

We would like to enable general priors. Here is a first pass at enabling any prior on any model parameter.

Closes #60

In a first test I set a function from the Python side which updated the chi2. This caused significant speed issues due to the Python global interpreter lock.

Now we just set a weight for each model in the fullLib library. Given that we have access to fullLib, we can then use any feature in the SED to determine the prior.
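
A minimal sketch of the idea, assuming a much-simplified stand-in for the real fullLib/onesource structures: one weight per model is computed from a feature of the SED (here redshift, with a made-up functional form) and folded into the chi2 as an additive -2 ln(w) term, so a weight of 1 leaves the chi2 unchanged.

```cpp
// Illustrative sketch only: "Model" and the chi2 update below are simplified
// stand-ins for the real fullLib/onesource structures in this PR.
#include <cmath>
#include <iostream>
#include <vector>

struct Model {
    double redshift;   // example SED feature available in the library
    double chi2;       // chi2 of this model for the current source
};

int main() {
    std::vector<Model> lib = {{0.5, 10.0}, {1.0, 12.0}, {2.0, 11.0}};

    // One weight per model, derived from any feature of the SED library.
    // Here: a toy prior that mildly favours low redshift (hypothetical form).
    std::vector<double> weights;
    for (const auto& m : lib) weights.push_back(std::exp(-0.1 * m.redshift));

    // Fold the prior into the fit by adding -2 ln(w) to each model's chi2.
    for (size_t i = 0; i < lib.size(); ++i) {
        lib[i].chi2 -= 2.0 * std::log(weights[i]);
        std::cout << "z=" << lib[i].redshift << "  chi2'=" << lib[i].chi2 << "\n";
    }
}
```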

Change Description

A new prior class that is applied to each onesource object before the fit.

  • My PR includes a link to the issue that I am addressing

Solution Description

Code Quality

  • I have read the Contribution Guide
  • My code follows the code style of this project
  • My code builds (or compiles) cleanly without any errors or warnings
  • My code contains relevant comments and necessary documentation

New Feature Checklist

  • I have added or updated the docstrings associated with my feature using the NumPy docstring format
  • I have updated the tutorial to highlight my new feature (if appropriate)
  • I have added unit/End-to-End (E2E) test cases to cover my new feature
  • My change includes a breaking change
    • My change includes backwards compatibility and deprecation warnings (if possible)

Documentation Change Checklist

Build/CI Change Checklist

  • If required or optional dependencies have changed (including version numbers), I have updated the README to reflect this
  • If this is a new CI setup, I have added the associated badge to the README

Other Change Checklist

  • Any new or updated docstrings use the NumPy docstring format.
  • I have updated the tutorial to highlight my new feature (if appropriate)
  • I have added unit/End-to-End (E2E) test cases to cover any changes
  • My change includes a breaking change
    • My change includes backwards compatibility and deprecation warnings (if possible)

@raphaelshirley raphaelshirley self-assigned this Jun 5, 2024

codecov bot commented Jun 5, 2024

Codecov Report

Attention: Patch coverage is 64.64646% with 35 lines in your changes missing coverage. Please review.

Project coverage is 66.68%. Comparing base (017703a) to head (87bef7a).
Report is 4 commits behind head on main.

| Files | Patch % | Lines |
|---|---|---|
| src/lib/prior.cpp | 52.54% | 28 Missing ⚠️ |
| src/lib/prior.h | 57.14% | 6 Missing ⚠️ |
| src/lib/onesource.cpp | 75.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #153      +/-   ##
==========================================
+ Coverage   66.32%   66.68%   +0.36%     
==========================================
  Files          50       52       +2     
  Lines        6153     6211      +58     
  Branches      934      931       -3     
==========================================
+ Hits         4081     4142      +61     
+ Misses       2072     2069       -3     


@olivierilbert (Collaborator)

"apply_nz = 1; // Apply n of z if it is specified unless we switch it off"
I don't think N(z) prior should be applied by default. I would use a uniform prior by default.

"It must not take source properties which do not change during fit."
I would have said the reverse. If it changes during the fit, how could you prepare the prior in advance? Maybe I missed the general idea here.

"// General function allowing pybind overwriting
std::function<vector(vector, onesource*)> weights_function ="
If I understand well, this function should be prepared before the fit, object per object, in the python interface. That's a good way to proceed. The difficulty is if you need to use anything related to the normalisation dm. But as a first approximation, one could simply use a ratio of observed/predicted flux in a given band.
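
A minimal, self-contained sketch of the overridable weight-function hook discussed here (the `Source` struct and both lambdas are hypothetical stand-ins; in the PR the hook takes an `onesource*` and is exposed to Python via pybind11):

```cpp
// Sketch only: default is a uniform prior; an override computes weights from
// the ratio of observed to predicted flux in a reference band.
#include <functional>
#include <iostream>
#include <vector>

struct Source {
    double observed_flux;   // flux in a reference band (hypothetical)
};

// Default: uniform prior, i.e. every model keeps weight 1.
std::function<std::vector<double>(std::vector<double>, const Source*)>
    weights_function = [](std::vector<double> predicted_flux, const Source*) {
        return std::vector<double>(predicted_flux.size(), 1.0);
    };

int main() {
    Source src{2.0};
    std::vector<double> predicted = {1.0, 2.0, 4.0};

    // Override per object before the fit, e.g. with a weight based on the
    // observed/predicted flux ratio, penalising large mismatches.
    weights_function = [](std::vector<double> predicted_flux, const Source* s) {
        std::vector<double> w;
        for (double p : predicted_flux) {
            double r = s->observed_flux / p;
            w.push_back(r > 1.0 ? 1.0 / r : r);
        }
        return w;
    };

    for (double w : weights_function(predicted, &src)) std::cout << w << " ";
    std::cout << "\n";
}
```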

@raphaelshirley (Member, Author)

I actually added the apply_nz switch so that the N(z) prior can be turned off for individual objects. As currently implemented, it is only applied if set in the config as before, but the switch lets you disable it on a per-object basis. But yes, I agree it needs a bit of thought if it is applied alongside another set of weights.
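
A hypothetical sketch of how a per-object apply_nz switch could interact with another set of weights; the names and the simple multiplicative combination are assumptions, not the actual implementation in this PR:

```cpp
// Combine custom per-model weights with the N(z) weights only when the
// per-object switch is on.
#include <iostream>
#include <vector>

std::vector<double> combine_weights(const std::vector<double>& custom,
                                    const std::vector<double>& nz,
                                    bool apply_nz) {
    std::vector<double> out = custom;
    if (apply_nz) {
        for (size_t i = 0; i < out.size(); ++i) out[i] *= nz[i];
    }
    return out;
}

int main() {
    std::vector<double> custom = {1.0, 0.8, 0.5};
    std::vector<double> nz     = {0.9, 1.0, 0.3};

    // apply_nz can be switched off for an individual object even when the
    // N(z) prior is enabled in the config.
    for (double w : combine_weights(custom, nz, /*apply_nz=*/false))
        std::cout << w << " ";
    std::cout << "\n";
}
```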

@johannct (Member) commented Jun 6, 2024

It is important to use the GitHub issue mechanism for RFCs and for introducing ideas and implementation proposals, but the ultimate way to illuminate the usage and behaviour of a new piece of code is unit testing. I know it is easier said than done :)

@raphaelshirley (Member, Author)

I'll work on a unit test...

Raphael Shirley added 6 commits June 7, 2024 14:28
tabs are required in parts of the makefile depending on what language that part is in
It might be more efficient to calculate all weights before the loop if the fulllib is large
What else should we test and how do we get high coverage of the c side

Need to test the nz prior at least
@raphaelshirley (Member, Author)

I have noticed that moving the N(z) prior out of onesource is causing an issue with the z distribution, with peaks at z=1, 2. Looking into it...

@raphaelshirley raphaelshirley marked this pull request as draft June 18, 2024 16:58
@raphaelshirley (Member, Author)

Possibly interesting test priors: the HSC mass-SFR relation in Tanaka et al., emission line ratios, and a luminosity function. Some work on performance impact and benchmarking is also needed.
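
A purely illustrative sketch of what such a test prior could look like as a weight function: a Gaussian weight around a mass-SFR relation. The slope, intercept and scatter below are placeholders, not the values from Tanaka et al.

```cpp
// Toy weight from distance to a hypothetical mass-SFR relation.
#include <cmath>
#include <iostream>

double mass_sfr_weight(double log_mass, double log_sfr) {
    const double slope = 1.0, intercept = -10.0, scatter = 0.3;  // placeholders
    double expected = slope * log_mass + intercept;
    double d = (log_sfr - expected) / scatter;
    return std::exp(-0.5 * d * d);
}

int main() {
    std::cout << mass_sfr_weight(10.5, 0.5) << "\n";   // on the relation
    std::cout << mass_sfr_weight(10.5, -2.0) << "\n";  // far off the relation
}
```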
