Rotor-routing, smoothing kernels, and reduction of variance: breaking the O(1/n) barrier
Probability

Speaker: James Propp, UMass Lowell (currently visiting MSRI and UC Berkeley)
Location: 1147 MSB
Start time: Wed, Mar 14 2012, 4:10PM
Description
If a random variable X is easier to simulate than to analyze,
one way to estimate its expected value E(X) is to generate
n samples that are distributed according to the law of X
and take their average. If the samples are independent, then (assuming
X has finite variance) the estimate will have typical error
O(1/n^{1/2}). But often one can do better by introducing
appropriate forms of negative dependence. In the toy context of simulating
Markov chains to estimate their absorption probabilities, I'll describe a
scheme that uses maximally anticorrelated identically distributed Bernoulli
random variables (aka rotor routers) and has typical error O(1/n).
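As a concrete, purely illustrative sketch of the kind of scheme involved, consider estimating the probability that simple random walk on {0, 1, ..., N}, started at an interior site, is absorbed at N rather than at 0. In the rotor-routed version, each interior site carries a rotor that deterministically alternates between sending the next departing walker left and right, so successive departures from a site are maximally anticorrelated. The chain, function names, and parameters below are hypothetical choices for illustration, not the exact setup of the talk.

```python
import random

def iid_estimate(N, start, n_walkers):
    """Plain Monte Carlo: independent walkers; typical error O(1/sqrt(n_walkers))."""
    hits = 0
    for _ in range(n_walkers):
        x = start
        while 0 < x < N:
            x += random.choice((-1, +1))
        hits += (x == N)
    return hits / n_walkers

def rotor_estimate(N, start, n_walkers):
    """Rotor-routed walkers: each interior site alternates the direction it sends
    successive walkers, so its departures are maximally anticorrelated Bernoulli
    moves; typical error O(1/n_walkers)."""
    rotor = {i: random.choice((-1, +1)) for i in range(1, N)}  # initial rotor directions
    hits = 0
    for _ in range(n_walkers):
        x = start
        while 0 < x < N:
            step = rotor[x]
            rotor[x] = -step          # advance the rotor for the next visit to x
            x += step
        hits += (x == N)
    return hits / n_walkers

if __name__ == "__main__":
    # Exact absorption probability for this chain is start/N = 0.3.
    print(iid_estimate(10, 3, 2000), rotor_estimate(10, 3, 2000))
```

Because the rotors retain their state from one walker to the next, an excess of left-moves taken by one walker tends to be compensated by the next, which is what drives the typical error down to O(1/n).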
This might seem to be optimal, and indeed one cannot expect the average
(X_1 + ... + X_n)/n of Bernoulli samples, which is always a multiple of 1/n,
to differ from its expected value E(X) by less than O(1/n). However, by
using weighted averages that assign X_i less weight when
i is near 1 or n and greater weight when i is near
n/2, one can get estimators for E(X) with typical error
significantly smaller than O(1/n).
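To illustrate the weighted-average idea, here is a hedged sketch: the weights below come from a raised-cosine kernel, small near i = 1 and i = n and largest near n/2, and are normalized to sum to 1. The particular kernel and the function names are stand-ins for illustration; the talk's choice of smoothing kernel may differ.

```python
import math

def kernel_weights(n):
    """Smoothing-kernel weights w_1, ..., w_n: small near the ends, largest
    near n/2, normalized so that sum_i w_i = 1.  (A raised-cosine shape is
    used here purely as an illustration.)"""
    raw = [1.0 - math.cos(2.0 * math.pi * (i + 0.5) / n) for i in range(n)]
    total = sum(raw)
    return [w / total for w in raw]

def smoothed_estimate(samples):
    """Estimate E(X) by the weighted average sum_i w_i * X_i rather than
    the plain average (X_1 + ... + X_n)/n."""
    w = kernel_weights(len(samples))
    return sum(wi * xi for wi, xi in zip(w, samples))
```

Applied to an anticorrelated sequence X_1, ..., X_n such as the rotor-routed samples above, weights of this down-at-the-ends shape are the ingredient that, per the abstract, yields typical error significantly smaller than O(1/n).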
The methods and ideas are mostly probabilistic and combinatorial.
No prior knowledge of rotor-routing or smoothing kernels, and no
familiarity with (or fondness for) statistics, will be assumed.
