A probability distribution governing the evolution of a stochastic process has infinitely many Bayesian representations of the form μ = ∫ μ_θ dλ(θ). Among these, a natural representation is one whose components (the μ_θ's) are 'learnable' (one can approximate μ_θ by conditioning on observation of the process) and 'sufficient for prediction' (μ_θ's predictions are not aided by conditioning on observation of the process). We show the existence and uniqueness of such a representation under a suitable asymptotic mixing condition on the process. This representation can be obtained by conditioning on the tail-field of the process, and any learnable representation that is sufficient for prediction is asymptotically like the tail-field representation. This result is related to the celebrated de Finetti theorem, but with exchangeability weakened to an asymptotic mixing condition, and with his conclusion of a decomposition into i.i.d. component distributions weakened to components that are learnable and sufficient for prediction.
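For intuition, the classical de Finetti benchmark that the paper weakens can be made explicit. For an exchangeable {0,1}-valued process, the representation specializes to a mixture of i.i.d. Bernoulli(θ) laws, and both properties can be read off directly; the following is a standard illustration, not part of the paper's statement:

```latex
% Classical de Finetti case: an exchangeable {0,1}-valued process
% decomposes into i.i.d. Bernoulli(\theta) components \mu_\theta.
\[
  \mu(X_1 = x_1, \dots, X_n = x_n)
  = \int_0^1 \theta^{k} (1-\theta)^{n-k} \, d\lambda(\theta),
  \qquad k = \sum_{i=1}^{n} x_i .
\]
% Learnable: conditioning on observations recovers \theta, since the
% empirical frequency \tfrac{1}{n}\sum_{i=1}^{n} X_i \to \theta
% \mu_\theta-almost surely, by the law of large numbers.
% Sufficient for prediction: under \mu_\theta the one-step forecast is
% \mu_\theta(X_{n+1} = 1 \mid X_1, \dots, X_n) = \theta,
% so past observations do not improve \mu_\theta's predictions.
```

The paper's contribution is that an analogous decomposition survives when exchangeability is weakened to asymptotic mixing, with the i.i.d. structure of the components relaxed to learnability and sufficiency for prediction.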
MLA
Jackson, Matthew O., et al. “Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited.” Econometrica, vol. 67, no. 4, Econometric Society, 1999, pp. 875–893, https://doi.org/10.1111/1468-0262.00055.
Chicago
Jackson, Matthew O., Ehud Kalai, and Rann Smorodinsky. “Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited.” Econometrica 67, no. 4 (1999): 875–893. https://doi.org/10.1111/1468-0262.00055.
APA
Jackson, M. O., Kalai, E., & Smorodinsky, R. (1999). Bayesian representation of stochastic processes under learning: de Finetti revisited. Econometrica, 67(4), 875–893. https://doi.org/10.1111/1468-0262.00055