T3 initialization improvement #988
Comments
I see what you're suggesting now that I've looked at this, but I am still working through Tillson's intent here. This sounds like a potential improvement worth investigating to see if it's helpful. My interpretation of EMA2 -- Tillson shows this as |
@mihakralj is this what you were suggesting: manual calculations (Excel)? |
the way that MetaStock would work through the data:
all EMAs start their warm-up at data[0] and all finish warm-up at data[period]. Your Excel calculation for T3(5) should calculate SMA(5) [actually, that is mean(5)] for the first 5 rows and then switch to EMA(5) from row 6 onwards: |
How do you do a mean(5) with less than 5 rows? |
when n<p, you do mean(n) instead of mean(p). :) |
lol, that’s cheating! |
that is the point of warming up - you calculate the mean (average) of what is there already... |
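To make the warm-up idea above concrete, here is a minimal sketch (in C#, with illustrative names; this is not the library's API) of an EMA that uses a progressive mean while fewer than `period` values are available and then switches to the standard recursion:

```csharp
public static class WarmupSketch
{
    // EMA with the progressive-mean warm-up described above:
    // mean(n) while fewer than `period` values exist, then the usual EMA recursion.
    public static double[] EmaWithWarmup(double[] prices, int period)
    {
        double k = 2d / (period + 1);          // standard EMA smoothing factor
        double[] ema = new double[prices.Length];
        double sum = 0;

        for (int i = 0; i < prices.Length; i++)
        {
            if (i < period)
            {
                // warm-up: average of whatever is available so far
                sum += prices[i];
                ema[i] = sum / (i + 1);
            }
            else
            {
                // standard EMA recursion once a full period exists
                ema[i] = (prices[i] * k) + (ema[i - 1] * (1 - k));
            }
        }

        return ema;
    }
}
```

This returns a value from the very first bar, which is exactly the trade-off being debated: earlier output in exchange for less reliable early values.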
I'm not entirely sold on the idea that any data is better than no data in most warmup scenarios, though I can see it being useful in these EMA-of-EMA cases. I think it misleads users into thinking they're getting reliable data sooner. I'll do a before-and-after analysis to see if it actually improves convergence time. It did not when analyzing DEMA. |
if you can reduce the warm-up period from 6*p down to p, that is a huge win. |
It really comes down to visibility and size of errors vs. acceptance of errors (personal preference). For example, some people may be okay with asking for a 10-period moving average that is mathematically inaccurate during the warmup period; some may not. I think I can do a statistical analysis using your quote randomizer to compute a Gaussian profile for time to convergence and for error size and decay. |
I believe I did something wrong with T3 - or perhaps it is just an underwhelming indicator... I created 16 testing datasets (based on Jurik's Lab Test Results) to visually compare trend indicators and ran all of my currently implemented indicators through the gauntlet. T3 should be rather similar to TEMA or HEMA (based on its construction), but my implementation comes out very laggy and non-responsive. See it for yourself; I tried to sort the trend indicators from least responsive to most. All of them use period=10 and the same data input: |
Yeah, my T3 indicator implementation is valid - but its performance is very underwhelming compared to other moving averages. |
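For reference, Tillson's T3 is conventionally built from a chain of six EMAs blended with coefficients derived from the volume factor, which is why it should sit in the same family as TEMA. A rough sketch of that construction, reusing the hypothetical `WarmupSketch.EmaWithWarmup` helper above (again, not the library's code):

```csharp
public static class T3Sketch
{
    // Tillson T3: six chained EMAs combined with volume-factor coefficients.
    public static double[] T3(double[] prices, int period, double volumeFactor = 0.7)
    {
        // each EMA smooths the output of the previous one
        double[] e1 = WarmupSketch.EmaWithWarmup(prices, period);
        double[] e2 = WarmupSketch.EmaWithWarmup(e1, period);
        double[] e3 = WarmupSketch.EmaWithWarmup(e2, period);
        double[] e4 = WarmupSketch.EmaWithWarmup(e3, period);
        double[] e5 = WarmupSketch.EmaWithWarmup(e4, period);
        double[] e6 = WarmupSketch.EmaWithWarmup(e5, period);

        // blend coefficients derived from the volume factor (a), typically 0.7
        double a = volumeFactor;
        double c1 = -a * a * a;
        double c2 = 3 * a * a + 3 * a * a * a;
        double c3 = -6 * a * a - 3 * a - 3 * a * a * a;
        double c4 = 1 + 3 * a + a * a * a + 3 * a * a;

        double[] t3 = new double[prices.Length];
        for (int i = 0; i < prices.Length; i++)
        {
            t3[i] = (c1 * e6[i]) + (c2 * e5[i]) + (c3 * e4[i]) + (c4 * e3[i]);
        }

        return t3;
    }
}
```

The six-deep EMA chain is also why a conventional implementation needs roughly 6*p quotes before it produces anything - the issue discussed below.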
How do you know that this |
I am still trying to find an example of how MetaStock handles indicators with early data - but when I briefly used it, I remember that it returned 'stuff' very early on; I used a yearly moving average (period=222) on data that was shorter than 222 bars... |
I'll probably just start at a full SMA and then do the EMA chain from it, as I did for DEMA and TEMA. I'm not much of a fan of adding mysterious magic numbers to fill the void simply for the sake of filling the void. Not sure yet, because I also kind of like starting the EMA from the second period right away. I'm testing both scenarios to see if there's any observable difference in time to convergence and will probably base my update on what it shows. |
Yeah, you are coming closer and closer to my realization (and design goal): if we follow the logic and intent of the indicator's algorithm, we should be able to calculate data within the period. That might horrify you, but look at it from the other side: T3 (in its current implementation) uses 6 periods before it returns anything. On long periods (i.e. yearly cycles) the indicator becomes useless, as it wants six years of quotes... |
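For contrast with the progressive warm-up sketched earlier, here is a rough sketch of the other option being weighed: an SMA standoff that seeds the EMA recursion, as described above for DEMA and TEMA. Names and structure are illustrative, not the library's implementation:

```csharp
using System;

public static class SeededEmaSketch
{
    // Alternative initialization: wait for a full period, seed with an SMA,
    // then run the EMA recursion. Earlier slots stay NaN (no result yet).
    public static double[] EmaWithSmaSeed(double[] prices, int period)
    {
        double k = 2d / (period + 1);
        double[] ema = new double[prices.Length];
        Array.Fill(ema, double.NaN);

        if (prices.Length < period)
        {
            return ema;   // not enough quotes to seed the average
        }

        // seed value: simple average of the first full period
        double seed = 0;
        for (int i = 0; i < period; i++)
        {
            seed += prices[i];
        }
        ema[period - 1] = seed / period;

        // standard EMA recursion from the seed onward
        for (int i = period; i < prices.Length; i++)
        {
            ema[i] = (prices[i] * k) + (ema[i - 1] * (1 - k));
        }

        return ema;
    }
}
```

With this approach each EMA in a chain waits a full period before producing output, which is what stacks up to the long warm-up on deeply nested indicators like T3.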
A bit, but only in these cases where there's a large convergence zone in any case. The problem I have is shown here with what happens when you shortcut the standoff periods: you get more early values, but they're wildly wrong and converge off course, though they eventually arrive at the same number at the same time (not sooner).

With shortcutting (close with 160 historical quotes):
T3 on 09/04/2020 with 28 periods: 2,332.93645296
T3 on 09/04/2020 with 40 periods: 4,241.81825723
T3 on 09/04/2020 with 50 periods: 4,132.37240686
T3 on 09/04/2020 with 75 periods: 3,499.85102644
T3 on 09/04/2020 with 100 periods: 3,420.98240164
T3 on 09/04/2020 with 110 periods: 3,422.04470867
T3 on 09/04/2020 with 120 periods: 3,424.05612587
T3 on 09/04/2020 with 130 periods: 3,425.46858259
T3 on 09/04/2020 with 140 periods: 3,426.36887285
T3 on 09/04/2020 with 150 periods: 3,426.86692823
T3 on 09/04/2020 with 160 periods: 3,427.10644902
T3 on 09/04/2020 with 175 periods: 3,427.24142698
T3 on 09/04/2020 with 200 periods: 3,427.27659193
T3 on 09/04/2020 with 250 periods: 3,427.27699705
T3 on 09/04/2020 with 350 periods: 3,427.27685725
T3 on 09/04/2020 with 500 periods: 3,427.27685719
T3 on 09/04/2020 with 600 periods: 3,427.27685719

Previous, without shortcutting (close with 120 historical quotes):
T3 on 09/04/2020 with 28 periods:
T3 on 09/04/2020 with 40 periods:
T3 on 09/04/2020 with 50 periods:
T3 on 09/04/2020 with 75 periods:
T3 on 09/04/2020 with 100 periods:
T3 on 09/04/2020 with 110 periods:
T3 on 09/04/2020 with 120 periods: 3,427.08697171
T3 on 09/04/2020 with 130 periods: 3,426.87923394
T3 on 09/04/2020 with 140 periods: 3,427.13219948
T3 on 09/04/2020 with 150 periods: 3,427.32001791
T3 on 09/04/2020 with 160 periods: 3,427.33524747
T3 on 09/04/2020 with 175 periods: 3,427.29436050
T3 on 09/04/2020 with 200 periods: 3,427.27498512
T3 on 09/04/2020 with 250 periods: 3,427.27686038
T3 on 09/04/2020 with 350 periods: 3,427.27685718
T3 on 09/04/2020 with 500 periods: 3,427.27685719
T3 on 09/04/2020 with 600 periods: 3,427.27685719 |
Starting the EMA in the first period seems to be a bit better than using an initial single-SMA standoff, but this could just be chance with my test quotes.

T3 on 09/04/2020 with 5 periods: 3,501.60518534
T3 on 09/04/2020 with 14 periods: 3,418.29210316
T3 on 09/04/2020 with 28 periods: 3,404.59005955
T3 on 09/04/2020 with 40 periods: 3,417.78061671
T3 on 09/04/2020 with 50 periods: 3,433.13183184
T3 on 09/04/2020 with 75 periods: 3,426.62257254
T3 on 09/04/2020 with 100 periods: 3,427.23441313
T3 on 09/04/2020 with 110 periods: 3,426.99039451
T3 on 09/04/2020 with 120 periods: 3,426.94860927
T3 on 09/04/2020 with 130 periods: 3,427.24651935
T3 on 09/04/2020 with 140 periods: 3,427.28878303
T3 on 09/04/2020 with 150 periods: 3,427.27945719
T3 on 09/04/2020 with 160 periods: 3,427.27934236
T3 on 09/04/2020 with 175 periods: 3,427.27743046
T3 on 09/04/2020 with 200 periods: 3,427.27686089
T3 on 09/04/2020 with 250 periods: 3,427.27685544
T3 on 09/04/2020 with 350 periods: 3,427.27685718
T3 on 09/04/2020 with 500 periods: 3,427.27685719
T3 on 09/04/2020 with 600 periods: 3,427.27685719
T3 on 09/04/2020 with 700 periods: 3,427.27685719
T3 on 09/04/2020 with 800 periods: 3,427.27685719
T3 on 09/04/2020 with 900 periods: 3,427.27685719
T3 on 09/04/2020 with 1000 periods: 3,427.27685719

I'm going with this approach. |
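For anyone wanting to reproduce tables like the ones above, the check amounts to holding the evaluation date fixed, recomputing the indicator from only the most recent n quotes, and watching the final value settle as n grows. A hedged sketch of that harness, using the hypothetical `T3Sketch.T3` from earlier in this thread (the lookback counts and output format are illustrative):

```csharp
using System;

public static class ConvergenceCheck
{
    // Recompute the indicator at a fixed evaluation date (the last bar)
    // using only the most recent n quotes, for increasing n.
    public static void Run(double[] closes, int period)
    {
        int[] lookbacks = { 28, 40, 50, 75, 100, 120, 150, 200, 250, 350, 500, 600 };

        foreach (int n in lookbacks)
        {
            if (n > closes.Length)
            {
                continue;   // not enough history available for this lookback
            }

            double[] subset = closes[^n..];               // last n closes only
            double[] t3 = T3Sketch.T3(subset, period);    // hypothetical sketch from above
            Console.WriteLine($"T3 with {n} quotes: {t3[^1]:N8}");
        }
    }
}
```

Convergence to a stable value (here, somewhere around 250-350 quotes in both tables) indicates how much history the indicator really needs before the initialization choice stops mattering.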
This was implemented in v2.4.8 |
Originally posted by @mihakralj in #980 (comment)