About this Event
Title: A complete generalization of Kelly betting
Abstract:
In the 1950s, John Kelly (working at Bell Labs, like Claude Shannon) fundamentally connected gambling on coin tosses with Shannon's information theory, and this was soon extended by Leo Breiman (1963) to more general settings. In an excellent 1999 Yale PhD thesis, Jonathan Li defined and studied a concept called the reverse information projection (RIPr), which is a Kullback-Leibler projection of a given probability measure onto a given set of probability measures. Grünwald et al. (2024) showed that the RIPr characterizes the log-optimal bet/e-variable of a point alternative hypothesis against a composite null hypothesis, albeit under several assumptions (convexity, absolute continuity, etc.). In this talk, we will show how to fully generalize the theory underlying Kelly betting and the RIPr, showing that the RIPr is always well defined, without *any* assumptions. Further, a strong duality result identifies it as the dual to an optimal bet/e-variable called the numeraire, which is unique and also always exists without assumptions. This fully generalizes classical Kelly betting to arbitrary composite nulls; the same assumptionless strong duality also holds for Rényi/Hellinger projections (replacing the logarithmic utility by power utilities).
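To make Kelly's connection between betting and information theory concrete, here is a minimal sketch of the classical setup the abstract starts from: even-money bets on a biased coin with win probability p. In that case the log-optimal (Kelly) fraction is f* = 2p - 1, and the resulting optimal growth rate equals log 2 minus the Shannon entropy H(p) (in nats). This illustrates only the classical special case, not the generalization presented in the talk; the function names are for illustration.

```python
import math

def growth_rate(f, p):
    # Expected log-wealth growth per round when betting fraction f
    # of wealth on an even-money coin with win probability p.
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

def kelly_fraction(p):
    # Kelly's log-optimal fraction for an even-money bet: f* = 2p - 1.
    return 2 * p - 1

p = 0.6
f_star = kelly_fraction(p)

# Kelly's link to Shannon's information theory: the optimal growth
# rate equals log 2 - H(p), where H is the binary entropy in nats.
H = -p * math.log(p) - (1 - p) * math.log(1 - p)
assert abs(growth_rate(f_star, p) - (math.log(2) - H)) < 1e-12

# Sanity check that f* maximizes the growth rate: a coarse grid
# search over admissible fractions f in (-1, 1) agrees.
grid = [i / 1000 for i in range(-999, 1000)]
f_best = max(grid, key=lambda f: growth_rate(f, p))
assert abs(f_best - f_star) < 1e-3
```

The talk's result replaces this simple coin-tossing null with an arbitrary composite null, where the optimal bet (the numeraire) and its dual KL projection (the RIPr) are shown to exist without any assumptions.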
The talk will not assume any prior knowledge of these topics. This is joint work with Martin Larsson and Johannes Ruf (https://arxiv.org/abs/2402.18810), which appeared in the Annals of Statistics (2025).