Super Massive Data: Sunshine

Hi, I have a question: it is not clear how to ‘Build a new target by neutralizing an existing target to the Meta Model’. You can only neutralize a prediction (or a target), but not build one out of the other, right?

Basically, subtract the metamodel from one of the given targets and train on the residuals. (You can only do this for eras >= 888, where we have the metamodel predictions.) Then, if the future metamodel is like the old metamodel, and the new eras are like the old eras, the mistakes it makes going forward should be similar, and your model will pick up the slack. The first assumption is not bad – the metamodel seems to change slowly. However, it may not stay that way, precisely because people can now build models in this way. The second assumption is more dubious, but we have that problem of market non-stationarity no matter what strategy we use.
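A minimal sketch of the idea, per era: regress the target on the metamodel predictions and keep the residuals as the new training target. The column names (`era`, `target`, `meta_model`) are placeholders, and this uses plain least squares rather than whatever ranking/gaussianizing Numerai's own neutralization code applies first.

```python
import numpy as np
import pandas as pd

def residual_target(df: pd.DataFrame, target_col: str,
                    mm_col: str, era_col: str = "era") -> pd.Series:
    """For each era, fit target ~ metamodel by least squares and
    return the residuals, to be used as a new training target."""
    out = pd.Series(index=df.index, dtype=float)
    for _, g in df.groupby(era_col):
        y = g[target_col].to_numpy(dtype=float)
        # design matrix: metamodel predictions plus an intercept
        X = np.column_stack([g[mm_col].to_numpy(dtype=float),
                             np.ones(len(g))])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        out.loc[g.index] = y - X @ beta  # what the metamodel missed
    return out
```

By construction the residuals are orthogonal to the metamodel within each era, so a model trained on them targets exactly the part of the signal the metamodel does not capture.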


Am I missing something?
Where does the total number of features, 1586, come from? 1050 (v4) + 405 (v4.1) = 1455, so where are the other 131 features?

1050 was v3. v4 had 141 more = 1191. But then 10 features from v3 & v4 were designated “bad”, and those were removed for v4.1. So 1181 v4 features carried over to v4.1, plus 405 more = 1586.
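The bookkeeping above can be checked in a few lines; the counts are taken directly from the answer:

```python
v3_features = 1050
v4_features = v3_features + 141    # v4 added 141 features
carried_over = v4_features - 10    # 10 "bad" features dropped for v4.1
v41_features = carried_over + 405  # v4.1 added 405 new features
print(v4_features, carried_over, v41_features)  # → 1191 1181 1586
```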


Is the gap between the last era with targets in the validation data and the live era known? Is it constant?