Clarification Questions #214
Comments
Thanks!
Thanks so much Gufeng for the quick reply! Very clear; just a few follow-up questions.
Lastly, I must say that Robyn has massively helped me save time and effort on an MMM model. I was actually writing MMM model code myself when I stumbled upon the Robyn project!! Thank you so much for making this open source. Regards
Glad to know it helps.
Hi Gufeng, thanks a lot for your explanations. A few final thoughts/questions.
Thanks so much again!! Regards
Hey, sorry I missed your last questions:
Thanks so much Gufeng! I have one more question for your advice (if I may). I have a variable (in-store display) which isn't really media and doesn't need adstock. If I put it as a context variable, the model output shows a very high contribution for it (despite displays being present in only 10% of our store universe). I know that 'decomp.rssd' is the way to avoid this error, but it applies only to 'paid media' in the config file (and I don't want to put 'display' in paid media given adstock isn't applicable). Can you please advise how I can solve this dilemma? Is there any other way I can specify the costs of 'displays' that allows decomp.RSSD to pick it up without using an adstock factor? Thanks so much for your help
No worries. You can use this in-store display as paid_media_vars normally and just restrict the adstocking parameter(s) to very narrow bounds, or even fix it to 0. E.g. for geometric adstock, you can do something like the sketch below.
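A minimal sketch of the idea, assuming the variable appears in paid_media_vars as in_store_display (the name is illustrative; hyperparameter names follow Robyn's `<variable>_thetas`/`_alphas`/`_gammas` convention):

```r
# Hedged sketch: bound the geometric adstock decay (theta) near 0 so the
# variable carries effectively no adstock. "in_store_display" is a
# placeholder for the actual name used in paid_media_vars.
hyperparameters <- list(
  in_store_display_alphas = c(0.5, 3),    # Hill saturation: standard range
  in_store_display_gammas = c(0.3, 1),    # Hill saturation: standard range
  in_store_display_thetas = c(0, 0.0001)  # decay pinned near 0 = no carryover
)
```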
Thanks very much Gufeng! Much appreciated. On this, I assume weibull alpha and gamma don't need any change to avoid adstock. Please correct me if I am wrong. Thank you
alpha and gamma are saturation parameters. I think it's reasonable to expect saturation for any paid media, so I'd recommend keeping the standard hyperparameter ranges.
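For intuition, a rough sketch of a Hill-type saturation curve like the one Robyn applies (simplified; Robyn's actual gamma scaling differs in detail):

```r
# Simplified Hill saturation: the response flattens as exposure x grows.
# alpha shapes the curve; gamma positions its inflexion.
hill <- function(x, alpha, gamma) {
  gamma_trans <- gamma * max(x)  # crude stand-in for Robyn's gamma scaling
  x^alpha / (x^alpha + gamma_trans^alpha)
}
round(hill(seq(0, 100, by = 20), alpha = 2, gamma = 0.5), 3)
```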
Thanks Gufeng. Actually, for displays there shouldn't be any saturation, because this variable just counts the # of stores with a display. Every new store with a display should add incremental sales. In this scenario, which alpha/gamma values do you recommend to ensure saturation isn't applied by the algorithm? Thank you
Ah ok, then this isn't a media variable at all; it should indeed go into context_vars. For now, Robyn doesn't have many features to "tame" context vars. You can try to run more iterations, allow wider hyperparameters for your media, and output more pareto_fronts to look for models that have stronger media decomposition (see the sketch below).
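A hedged sketch of those knobs, assuming the robyn_run()/robyn_outputs() workflow from Robyn's demo (the values are illustrative, not recommendations):

```r
# Illustrative values only: search longer and export more Pareto fronts
# to widen the pool of candidate models to inspect.
OutputModels <- robyn_run(
  InputCollect = InputCollect,  # assumed to exist from robyn_inputs()
  iterations = 5000,            # more iterations per trial
  trials = 5                    # more independent trials
)
OutputCollect <- robyn_outputs(
  InputCollect, OutputModels,
  pareto_fronts = 3             # export beyond the first front
)
```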
Thanks very much Gufeng! This is a helpful perspective!! Really appreciate it! :)
Hi FB Team
Great initiative and really impressive work!!! I have a few questions to better understand the package:
1. Where should I add the following independent variables: Distribution, Store Display, and Pricing? I understand we can't add these to the context variables (as those relate to trend/seasonality), and we can't add them to 'organic_vars' either, because the code will apply adstock/saturation transformations to 'organic_vars'. Please advise. (Also, I am not sure what the purpose of 'factor_vars' is; kindly help me understand.)
2. I only have monthly data and my category is non-seasonal. How can I remove the context variable from the modelling?
3. I see that the objective of identifying the most optimal models doesn't include 'R Square' and 'MAPE'. May I understand why you haven't included these criteria while identifying the optimal models?
4. How can I obtain the 'Durbin-Watson', 'VIF', and 'p-value' for each independent variable, to better understand the model parameters?
5. Lastly, I have only 45 time-series data points (national data), but I have these broken down at a 'product SKU' and 'regional' level. Any advice on how I can use the regional and product-SKU data to compensate for the relatively low (45) number of time-series data points?
Sorry for the long list of questions; I would really appreciate it if you could advise on the above.
Thanks very much
Ravi