Expose hypergeometric_3F2 function #2797

Merged (22 commits) on Oct 23, 2022

Conversation

andrjohns
Collaborator

Summary

This PR renames and exposes the internal F32 function as hypergeometric_3F2, and updates the calculations to be on the log-scale for additional stability.

The function currently uses autodiff for the gradients, even though there is an internal grad_F32 function; I'll be updating that in a separate PR.
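
As a rough illustration (not taken from this PR), a call to the newly exposed function might look like the sketch below; the vector-based argument format is an assumption here and should be checked against the merged header.

```cpp
#include <stan/math.hpp>
#include <iostream>

int main() {
  // Hypothetical usage sketch: three upper parameters, two lower
  // parameters, and the argument z. The vector-based signature is an
  // assumption for illustration, not a quote of the PR's header.
  Eigen::VectorXd a(3);
  a << 1.0, 1.0, 1.0;
  Eigen::VectorXd b(2);
  b << 2.0, 2.0;
  double z = 0.5;

  std::cout << stan::math::hypergeometric_3F2(a, b, z) << std::endl;
  return 0;
}
```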

Tests

No additional tests, as all existing tests should still pass

Side Effects

N/A

Release notes

Exposed the hypergeometric_3F2 function and improved its numerical stability

Checklist

  • Math issue Hypergeometric function naming #2664

  • Copyright holder: Andrew Johnson

    The copyright holder is typically you or your assignee, such as a university or company. By submitting this pull request, the copyright holder is agreeing to license the submitted work under the following licenses:
    - Code: BSD 3-clause (https://opensource.org/licenses/BSD-3-Clause)
    - Documentation: CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/)

  • the basic tests are passing

    • unit tests pass (to run, use: ./runTests.py test/unit)
    • header checks pass (make test-headers)
    • dependencies checks pass (make test-math-dependencies)
    • docs build (make doxygen)
    • code passes the built in C++ standards checks (make cpplint)
  • the code is written in idiomatic C++ and changes are documented in the doxygen

  • the new changes are tested

@bgoodri (Contributor) commented on Jul 26, 2022

I think all of these hypergeometric functions should (eventually) call the ones in Boost, which have been included in the last several versions.

https://www.boost.org/doc/libs/1_79_0/libs/math/doc/html/math_toolkit/hypergeometric.html

It is hard to do well with double precision in general, but I think they have put more effort into the numerics than we have or would want to do ourselves.

@andrjohns (Collaborator, Author)

This is the only one that doesn't call the Boost implementation, since the F32 header was already here (I'm guessing the Boost hypergeometrics didn't exist when that was written). I'll update the PR to call the Boost implementation instead.
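
For reference, a minimal sketch of calling Boost's generic pFq directly, as discussed above; the parameter values are arbitrary and this is only an illustration of the Boost API, not code from this PR.

```cpp
#include <boost/math/special_functions/hypergeometric_pFq.hpp>
#include <iostream>

int main() {
  // 3F2 via Boost's generic pFq: three upper and two lower parameters.
  // Requires a recent Boost.Math release that ships hypergeometric_pFq.
  double val = boost::math::hypergeometric_pFq({1.0, 1.0, 1.0},
                                               {2.0, 2.0}, 0.5);
  std::cout << val << std::endl;
  return 0;
}
```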

@andrjohns (Collaborator, Author)

This is now ready for review. I've updated hypergeometric_3F2 to delegate to hypergeometric_pFq (and Boost's implementation). However, there are some combinations of arguments for which Boost reports non-convergence even though they do converge (as covered by the current tests), so I've kept the naive infinite-sum approach as a fallback for those cases.

The gradients are currently calculated by the grad_pFq function; I'll refactor this to use the existing grad_F32 function in a future PR.
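
For readers unfamiliar with the fallback mentioned above, here is a rough sketch of a naive 3F2 series accumulated on the log scale. It is illustrative only (it assumes positive parameters and 0 < z < 1) and is not the code in this PR.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative only: naive 3F2 series accumulated on the log scale
// (log-sum-exp over terms). Assumes positive parameters and 0 < z < 1;
// convergence and sign handling are simplified relative to real code.
double hypergeometric_3F2_naive(double a1, double a2, double a3,
                                double b1, double b2, double z) {
  double log_term = 0.0;  // log of the current term (the k = 0 term is 1)
  double log_sum = 0.0;   // log of the running partial sum
  for (int k = 0; k < 1000; ++k) {
    // log of the term ratio t_{k+1} / t_k for the 3F2 series
    log_term += std::log(a1 + k) + std::log(a2 + k) + std::log(a3 + k)
                - std::log(b1 + k) - std::log(b2 + k)
                + std::log(z) - std::log1p(k);
    // log-sum-exp update: log(exp(log_sum) + exp(log_term))
    log_sum = std::max(log_sum, log_term)
              + std::log1p(std::exp(-std::fabs(log_sum - log_term)));
    if (log_term - log_sum < std::log(1e-16)) {
      break;  // current term is negligible relative to the sum
    }
  }
  return std::exp(log_sum);
}
```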

spinkney previously approved these changes on Oct 20, 2022

Three review comments on stan/math/prim/fun/hypergeometric_3F2.hpp (outdated, resolved)