Implement DiffTransform for RandomWalk distributions #6098

Open
Tracked by #7053
ricardoV94 opened this issue Sep 4, 2022 · 1 comment

Comments

ricardoV94 (Member) commented Sep 4, 2022

My guess is that unobserved [Gaussian/MvNormal/MvStudentT]RandomWalk variables would sample much better if NUTS could propose innovation values instead of the "absolute" timeseries values. This is what happens when one manually creates a random walk model:

```python
import pymc as pm

with pm.Model():
    # the innovations are the free variables NUTS actually samples
    x_raw = pm.Normal("x_raw", shape=10)
    # the absolute walk values are a deterministic cumsum of the innovations
    x = pm.Deterministic("x", x_raw.cumsum())
```
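The forward/backward pair such a DiffTransform would need can be sketched in plain numpy (the names `forward`/`backward` here are illustrative, not the actual PyMC transform API): differencing while keeping the first value is a bijection whose inverse is cumsum, and its Jacobian is triangular with unit diagonal, so the log-det term is zero.

```python
import numpy as np

def forward(x):
    # map absolute walk values to innovations; keeping the first
    # value makes the map invertible
    return np.concatenate([x[:1], np.diff(x)])

def backward(d):
    # recover absolute walk values from the innovations
    return np.cumsum(d)

x = np.array([0.5, 1.2, 0.9, 2.0])
assert np.allclose(backward(forward(x)), x)
assert np.allclose(forward(backward(x)), x)
```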

If we just added the transform, there would be some redundancy in the logp: the proposed diffed values would be cumsummed (in the back-transform) and then differenced again in the random-walk logprob, but we can probably teach Aesara to optimize those away.
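The redundancy is easy to see numerically: if the sampler proposes innovations `d`, the back-transform cumsums them and the random-walk logp immediately diffs them again, so `diff(cumsum(d))` just returns `d[1:]`. A graph rewrite could eliminate the round trip entirely.

```python
import numpy as np

# diff after cumsum cancels out (up to dropping the first element),
# which is exactly the pattern a graph rewrite could remove
d = np.array([0.3, -0.1, 0.7, 0.2])
assert np.allclose(np.diff(np.cumsum(d)), d[1:])
```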

Once we have that, the user won't be worse off when creating unobserved random walks via the specialized PyMC distribution classes.

ricardoV94 (Member, Author) commented:

Alternatively, we can rewrite it once we have something like pymc-devs/pymc-extras#111 working.
