Coupling (probability)

Proof technique in probability theory

For other uses, see Coupling (disambiguation).

In probability theory, coupling is a proof technique that allows one to compare two unrelated random variables (distributions) X and Y by creating a random vector W whose marginal distributions correspond to X and Y respectively. The choice of W is generally not unique, and the whole idea of "coupling" is about making such a choice so that X and Y can be related in a particularly desirable way.

Definition

Using the standard formalism of probability theory, let X_1 and X_2 be two random variables defined on probability spaces (Ω_1, F_1, P_1) and (Ω_2, F_2, P_2). Then a coupling of X_1 and X_2 is a new probability space (Ω, F, P) over which there are two random variables Y_1 and Y_2 such that Y_1 has the same distribution as X_1 while Y_2 has the same distribution as X_2.

An interesting case is when Y_1 and Y_2 are not independent.
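
To make the non-uniqueness concrete, here is a minimal sketch (not part of the original article; Python with NumPy is assumed, and the variable names are illustrative) that builds two different couplings of the same pair of marginals, Bernoulli(p) and Bernoulli(q): an independent coupling, and a coupling driven by a single shared uniform variable. Both give a vector W = (Y_1, Y_2) with the prescribed marginals; only the second relates the two coordinates sample by sample.

```python
# A minimal sketch: two different couplings of the same pair of marginals,
# Bernoulli(p) and Bernoulli(q). The marginal distributions agree in both cases;
# only the joint behaviour of the pair differs.
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 0.3, 0.6, 100_000

# Coupling 1: independent coupling -- Y1 and Y2 use separate randomness.
y1_ind = rng.random(n) < p
y2_ind = rng.random(n) < q

# Coupling 2: monotone coupling -- both coordinates are driven by one shared uniform U,
# so Y1 <= Y2 holds on every sample point, not just on average.
u = rng.random(n)
y1_mon = u < p
y2_mon = u < q

print(y1_ind.mean(), y1_mon.mean())  # both approximately p
print(y2_ind.mean(), y2_mon.mean())  # both approximately q
print((y1_ind <= y2_ind).all())      # False: the independent coupling gives no pathwise order
print((y1_mon <= y2_mon).all())      # True by construction
```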

Examples

Random walk

Assume two particles A and B perform a simple random walk in two dimensions, but they start from different points. The simplest way to couple them is simply to force them to walk together. On every step, if A walks up, so does B; if A moves to the left, so does B; and so on. Thus, the difference between the two particles' positions stays fixed. As far as A is concerned, it is doing a perfect random walk, while B is the copycat. B holds the opposite view, i.e. that it is, in effect, the original and that A is the copy. And in a sense they both are right. In other words, any mathematical theorem or result that holds for a regular random walk will also hold for both A and B.

Consider now a more elaborate example. Assume that A starts from the point (0,0) and B from (10,10). First couple them so that they walk together in the vertical direction, i.e. if A goes up, so does B, and so on, but they are mirror images in the horizontal direction, i.e. if A goes left, B goes right, and vice versa. We continue this coupling until A and B have the same horizontal coordinate, or in other words are on the vertical line (5, y). If they never meet, we continue this process forever (the probability of that is zero, though). After this event, we change the coupling rule: we let them walk together in the horizontal direction, but follow a mirror-image rule in the vertical direction. We continue this rule until they meet in the vertical direction too (if they do), and from that point on, we just let them walk together.

This is a coupling in the sense that neither particle, taken on its own, can "feel" anything we did: neither the fact that the other particle follows it in one way or the other, nor the fact that we changed the coupling rule or when we did it. Each particle performs a simple random walk. And yet, our coupling rule forces them to meet almost surely and to continue from that point on together permanently. This allows one to prove many interesting results that say that, "in the long run", it is not important where you started in order to obtain that particular result.
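
The coupling above can be simulated directly. The following sketch (not part of the original article; plain Python standard library, with an illustrative function name and step cap of my choosing) implements the two phases of the mirror coupling and reports how many steps it takes for the walkers to coincide; each walker, viewed on its own, still performs a simple random walk.

```python
# Simulation sketch of the mirror coupling: B's step is derived from A's step,
# first mirrored horizontally, then (once the x-coordinates agree) mirrored vertically.
import random

def mirror_coupling(start_a=(0, 0), start_b=(10, 10), seed=1, max_steps=10_000_000):
    rng = random.Random(seed)
    ax, ay = start_a
    bx, by = start_b
    mirror_x = True                    # phase 1: mirror horizontal moves, copy vertical moves
    for steps in range(1, max_steps + 1):
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])  # A's step
        ax, ay = ax + dx, ay + dy
        if mirror_x:
            bx, by = bx - dx, by + dy
            if ax == bx:
                mirror_x = False       # phase 2: copy horizontal moves, mirror vertical moves
        else:
            bx, by = bx + dx, by - dy
            if ay == by:
                return steps           # the walkers have met; from here on they would move together
    return None                        # meeting is almost sure, but the waiting time can be long

print(mirror_coupling())
```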

Biased coins

Assume two biased coins, the first with probability p of turning up heads and the second with probability q > p of turning up heads. Intuitively, if both coins are tossed the same number of times, we should expect the first coin to turn up fewer heads than the second one. More specifically, for any fixed k, the probability that the first coin produces at least k heads should be less than the probability that the second coin produces at least k heads. However, proving such a fact can be difficult with a standard counting argument. Coupling easily circumvents this problem.

Let X_1, X_2, ..., X_n be indicator variables for heads in a sequence of flips of the first coin. For the second coin, define a new sequence Y_1, Y_2, ..., Y_n such that

if X_i = 1, then Y_i = 1;
if X_i = 0, then Y_i = 1 with probability (q − p)/(1 − p), and Y_i = 0 otherwise.

Then the sequence Y_1, ..., Y_n has exactly the probability distribution of tosses made with the second coin. However, because Y_i depends on X_i, a toss-by-toss comparison of the two coins is now possible. That is, for any k ≤ n,

{\displaystyle \Pr(X_{1}+\cdots +X_{n}>k)\leq \Pr(Y_{1}+\cdots +Y_{n}>k).}
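
The construction above is easy to check empirically. The following sketch (not part of the original article; plain Python standard library, with illustrative names and parameter values) generates each Y_i from the corresponding X_i exactly as described, so Y_i ≥ X_i on every toss, and the comparison of head counts follows immediately.

```python
# Sketch of the coin coupling: X_i ~ Bernoulli(p); Y_i is built from X_i so that
# Y_i ~ Bernoulli(q) and Y_i >= X_i on every toss.
import random

def coupled_tosses(p, q, n, rng):
    boost = (q - p) / (1 - p)   # chance of upgrading a tail of coin 1 to a head of coin 2
    xs, ys = [], []
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        y = 1 if x == 1 else (1 if rng.random() < boost else 0)
        xs.append(x)
        ys.append(y)
    return xs, ys

rng = random.Random(0)
p, q, n, k, trials = 0.3, 0.6, 20, 8, 10_000
count_x = count_y = 0
for _ in range(trials):
    xs, ys = coupled_tosses(p, q, n, rng)
    assert all(y >= x for x, y in zip(xs, ys))   # the coupling is monotone toss by toss
    count_x += sum(xs) > k
    count_y += sum(ys) > k
print(count_x / trials, count_y / trials)        # empirical Pr(sum X > k) <= Pr(sum Y > k)
```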

Convergence of Markov chains to a stationary distribution

Initialize one process X_n outside the stationary distribution and initialize another process Y_n inside the stationary distribution. Couple these two independent processes together as the pair (X_n, Y_n). As you let time run, the two processes evolve independently. Under certain conditions, they will eventually meet, and from that point on they can be considered the same process. This means that the process started outside the stationary distribution converges to the stationary distribution.
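
A minimal sketch of this argument for a small finite chain (not part of the original article; plain Python standard library, with an arbitrary two-state chain chosen purely for illustration and illustrative function names): one copy starts from a fixed state, the other from a draw of the stationary distribution; they run independently until they first collide and identically afterwards, so after the collision the first copy is also stationary.

```python
# Sketch: classical coupling of two copies of a two-state Markov chain.
# X starts from a fixed state, Y starts from the stationary distribution; they move
# independently until they first meet and move together from then on.
import random

P = [[0.9, 0.1],     # transition matrix of an arbitrary illustrative chain
     [0.2, 0.8]]
PI = [2 / 3, 1 / 3]  # its stationary distribution (solves pi = pi P)

def step(state, rng):
    return 0 if rng.random() < P[state][0] else 1

def coupled_run(n_steps, rng):
    x = 0                                      # X_0: fixed, non-stationary start
    y = 0 if rng.random() < PI[0] else 1       # Y_0: drawn from the stationary distribution
    met = False
    for _ in range(n_steps):
        met = met or x == y
        if met:
            x = y = step(x, rng)               # after meeting: one shared step
        else:
            x, y = step(x, rng), step(y, rng)  # before meeting: independent steps
    return x, met

rng = random.Random(0)
runs = [coupled_run(50, rng) for _ in range(10_000)]
print(sum(met for _, met in runs) / len(runs))           # fraction of runs that have coupled
print(sum(x == 0 for x, _ in runs) / len(runs), PI[0])   # X_50 is already close to stationary
```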

See also

Copula (probability theory)
Kruskal count

Notes

Dubhashi, Devdatt; Panconesi, Alessandro (June 15, 2009). Concentration of Measure for the Analysis of Randomized Algorithms (1st ed.). Cambridge University Press. p. 91. ISBN 978-0-521-88427-3.

References

Lindvall, T. (1992). Lectures on the coupling method. New York: Wiley. ISBN 0-471-54025-0.
Thorisson, H. (2000). Coupling, Stationarity, and Regeneration. New York: Springer.