Bob – have to say I don't know much about the Kullback-Leibler Divergence, other than that it's a measure of "relative entropy": the information you lose when you use a theoretical distribution to approximate the true one.
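For the record, and so we're talking about the same thing, the formula in plain text is

D(P || Q) = sum over outcomes i of p_i * log(p_i / q_i)

where P is the true distribution (the actual win probabilities) and Q is your approximating line. It's zero only when your line matches reality exactly, and it grows as the line drifts from the truth.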
I think your reference to its application in horse race handicapping is apropos. Good handicappers, whether they do it intuitively in their heads or with sophisticated computer models, create some type of "fair odds" line, be it rudimentary or sophisticated, that they use to approximate the "true" win probabilities and hence determine value.
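Here's a minimal sketch of the fair-odds idea in Python; the race, the line and the board odds are all invented for illustration:

def implied_prob(odds):
    # 3-1 on the board means an implied win chance of 1 / (3 + 1) = 25%
    return 1.0 / (odds + 1.0)

# Hypothetical four-horse race: your fair line vs. fractional odds on the board.
fair_line = {"A": 0.40, "B": 0.30, "C": 0.20, "D": 0.10}
board_odds = {"A": 1.2, "B": 2.5, "C": 3.0, "D": 8.0}

for horse, p_fair in fair_line.items():
    p_board = implied_prob(board_odds[horse])
    flag = "overlay" if p_fair > p_board else ""
    print(f"{horse}: fair {p_fair:.1%} vs board {p_board:.1%} {flag}")

(The board's implied probabilities sum to more than 100%; that's the takeout.)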
The computer guys take this to the extreme with their sophisticated multinomial logit and probit models, but they're still using a theoretical model to estimate the "true" probability distributions. They test it empirically by comparing the probabilities their models compute against the win rates actually observed over time. If and when a computer team gets its model probabilities to coincide with the observed rates to a high enough degree of accuracy, and the edge holds up across the set of observations they've deemed overlays, then it's game, set and match.
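To give a flavor of what's under the hood (a toy version, not anyone's actual model, and the scores here are invented): in a multinomial logit, each horse gets a score from its handicapping factors, and the within-race win probabilities are the softmax of those scores.

import math

def win_probs(scores):
    # Multinomial logit: P(horse i wins) = exp(score_i) / sum_j exp(score_j)
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Invented scores, e.g. a weighted blend of speed figures, class, pace, etc.
for i, p in enumerate(win_probs([1.1, 0.6, 0.3, -0.4])):
    print(f"horse {i + 1}: {p:.1%}")

The real work, of course, is in fitting the weights that produce those scores and proving they hold up out of sample.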
The problem is that most horseplayers and wannabe programmers haven't computed, and perhaps can't compute, probability lines whose overlays are proven to consistently perform as predicted over time. They may think the even-money shots they've identified that go off at 3-1 on the board win at a 50% rate, but when they go and look at all such cases over time, they find those horses actually winning closer to a 25% rate, which is just about what the board odds implied all along.
This is the crux of the problem. The most successful computer teams (and some professional handicappers, perhaps on this board) have solved it. But it takes a lot of time, and a lot of comparing of theoretical predictions against actual observations, to get to the point where there's enough certainty that the Kullback-Leibler divergence is small enough for a consistent profit to be generated over time.
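That back-testing step can itself be boiled down to a few lines: bucket your past plays by predicted win probability, compare each bucket's prediction to the rate actually observed, and score the gap. A sketch with fabricated results (toy data chosen so no bucket comes out 0% or 100%, which would blow up the logs):

import math

# (predicted win prob, won?) for a hypothetical history of plays
plays = [(0.50, True), (0.50, False), (0.50, False), (0.50, False),
         (0.25, True), (0.25, False), (0.25, False), (0.25, False)]

buckets = {}
for p, won in plays:
    wins, total = buckets.get(p, (0, 0))
    buckets[p] = (wins + int(won), total + 1)

for p, (wins, total) in sorted(buckets.items()):
    obs = wins / total
    # Per-play KL contribution of calling it p when the observed rate is obs
    kl = obs * math.log(obs / p) + (1 - obs) * math.log((1 - obs) / (1 - p))
    print(f"predicted {p:.0%}, observed {obs:.0%}, KL ~ {kl:.3f} nats")

The "even money shots" from the previous paragraph show up here as the 50% bucket winning at 25%, a KL of about 0.13 nats per play; the well-calibrated 25% bucket scores zero.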
It’s probably no coincidence that Alan Woods, the co-developer of the renowned Benter/Woods model, went by the handle “Entropy” on the Pace Advantage board.
Be happy to discuss it more over a couple of beers in the backyard at Saratoga some afternoon. ;-)
Rocky