Binocular disparity

Binocular disparity refers to the difference in the image location of an object seen by the left and right eyes, resulting from the eyes' horizontal separation (parallax). The mind uses binocular disparity to extract depth information from the two-dimensional retinal images in stereopsis. In computer vision, binocular disparity refers to the difference in the coordinates of similar features within two stereo images.

A similar disparity can be used in rangefinding by a coincidence rangefinder to determine the distance and/or altitude to a target. In astronomy, the disparity between observations made from different locations on the Earth can be used to determine the parallax of various celestial objects, and the Earth's orbit can be used for stellar parallax.

Definition

Human eyes are horizontally separated by about 50–75 mm (the interpupillary distance), depending on the individual. Each eye therefore has a slightly different view of the world. This can easily be seen by alternately closing one eye while looking at a vertical edge: the binocular disparity appears as an apparent horizontal shift of the edge between the two views.

At any given moment, the lines of sight of the two eyes meet at a point in space. This point projects to the same location (i.e. the centre) on the retina of each eye. Because of the different viewpoints of the left and right eyes, however, many other points in space do not fall on corresponding retinal locations. Visual binocular disparity is defined as the difference between the points of projection in the two eyes and is usually expressed in degrees as a visual angle.

The term "binocular disparity" refers to geometric measurements made external to the eye. The disparity of the images on the actual retinas depends on factors internal to the eye, especially the location of the nodal points, even if the cross-section of the retina is a perfect circle. Disparity on the retina agrees with binocular disparity when measured in degrees, but differs considerably when measured as a distance, owing to the complicated structure inside the eye.

Figure 1. Definition of binocular disparity (far and near). The full black circle is the point of fixation. The blue object lies nearer to the observer, so it has a "near" disparity d_n; objects lying farther away (green) correspondingly have a "far" disparity d_f. Binocular disparity is the angle between two lines of projection: the real projection from the object to the actual point of projection, and the imaginary projection running through the nodal point of the fixation point.

In computer vision, binocular disparity is calculated from stereo images taken by a pair of stereo cameras. The distance between these cameras, called the baseline, affects the disparity of a given point on their respective image planes: as the baseline increases, the disparity increases, because a greater angle is needed to align each line of sight on the point. In computer vision, however, binocular disparity is expressed as the coordinate difference of the point between the right and left images rather than as a visual angle, and the units are usually pixels.

Tricking neurons with 2D images

Brain cells (neurons) in a part of the brain responsible for processing visual information coming from the retinae (the primary visual cortex) can detect the existence of disparity in their input from the eyes. Specifically, these neurons are active when an object with "their" particular disparity lies within the part of the visual field to which they have access (their receptive field).

Researchers investigating the precise properties of these neurons with respect to disparity present visual stimuli with different disparities to the cells and test whether they are active or not. One possibility is to place objects at varying depths in front of the eyes. The drawback of this method is that it may not be precise enough for objects placed farther away, since these possess smaller disparities, while closer objects have greater disparities. Instead, neuroscientists use the alternative method schematised in Figure 2.

Figure 2. Simulation of disparity from depth in the plane (relates to Figure 1). The disparity of an object at a depth different from that of the fixation point can alternatively be produced by presenting an image of the object to one eye and a laterally shifted version of the same image to the other eye. The full black circle is the point of fixation. Objects at varying depths are placed along the line of fixation of the left eye. The same disparity that is produced by a shift in the depth of an object (filled coloured circles) can also be produced by laterally shifting the object at constant depth in the picture one eye sees (black circles with coloured margins). Note that for near disparities the lateral shift has to be larger than for far disparities to correspond to the same change in depth. This is what neuroscientists usually do with random dot stimuli to study the disparity selectivity of neurons, since the lateral distance required to test disparities is smaller than the distances required using depth tests. This principle has also been applied in autostereogram illusions.

Computing disparity using digital stereo images

The disparity of features between two stereo images is usually computed as a shift to the left of an image feature when viewed in the right image. For example, a single point that appears at the x coordinate t (measured in pixels) in the left image may be present at the x coordinate t − 3 in the right image. In this case, the disparity at that location in the right image would be 3 pixels.

Stereo images may not always be correctly aligned to allow for quick disparity calculation. For example, the pair of cameras may be slightly rotated off level. Through a process known as image rectification, both images are rotated so that disparities exist only in the horizontal direction (i.e. there is no disparity in the y image coordinates). This property can also be achieved by precise alignment of the stereo cameras before image capture.

Computer algorithm

After rectification, the correspondence problem can be solved using an algorithm that scans both the left and right images for matching image features. A common approach is to form a small image patch around every pixel in the left image and compare it with the patch at every possible disparity in the right image. For example, for a disparity of 1, the patch in the left image would be compared to a similar-sized patch in the right image, shifted to the left by one pixel. The comparison between two patches is made by computing a matching score with one of the following equations, each of which compares the patches pixel by pixel. In all of the following equations, L and R refer to the left and right images, r and c refer to the current row and column of the patches being examined, and d refers to the disparity of the right image.

Normalized correlation:

    \frac{\sum_{r}\sum_{c} L(r,c) \cdot R(r,c-d)}{\sqrt{\left(\sum_{r}\sum_{c} L(r,c)^{2}\right) \cdot \left(\sum_{r}\sum_{c} R(r,c-d)^{2}\right)}}

Sum of squared differences:

    \sum_{r}\sum_{c} \left(L(r,c) - R(r,c-d)\right)^{2}

Sum of absolute differences:

    \sum_{r}\sum_{c} \left|L(r,c) - R(r,c-d)\right|
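
The three measures translate directly into code. The following Python sketch (NumPy-based; the function names and the assumption of float-valued, equal-sized patches are illustrative, not from any particular library) computes each score for one pair of patches:

    import numpy as np

    def sad(left_patch, right_patch):
        # Sum of absolute differences: 0 for identical patches, lower is better.
        return np.sum(np.abs(left_patch - right_patch))

    def ssd(left_patch, right_patch):
        # Sum of squared differences: like SAD, but penalises large errors more.
        return np.sum((left_patch - right_patch) ** 2)

    def ncc(left_patch, right_patch):
        # Normalized correlation: close to 1 for similar patches, higher is better.
        denom = np.sqrt(np.sum(left_patch ** 2) * np.sum(right_patch ** 2))
        return np.sum(left_patch * right_patch) / denom

The patches are assumed to be float arrays; with 8-bit integer images the subtractions would wrap around and silently corrupt the scores.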

The disparity with the lowest computed value using one of the above difference measures is considered the disparity for the image feature (for normalized correlation, which measures similarity rather than difference, it is the highest value). This optimal score indicates that the algorithm has found the best match of corresponding features in both images.
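
A minimal brute-force implementation of this search, assuming rectified float-valued images and the sad helper from the sketch above (the window size and disparity range are arbitrary illustrative choices), could look like this:

    def disparity_map(left, right, max_disparity=64, half_win=3):
        # For every pixel, try every candidate disparity and keep the one
        # whose right-image patch gives the lowest SAD score.
        rows, cols = left.shape
        disp = np.zeros((rows, cols))
        for r in range(half_win, rows - half_win):
            for c in range(half_win + max_disparity, cols - half_win):
                patch_l = left[r - half_win:r + half_win + 1,
                               c - half_win:c + half_win + 1]
                best_d, best_score = 0, np.inf
                for d in range(max_disparity):
                    patch_r = right[r - half_win:r + half_win + 1,
                                    c - d - half_win:c - d + half_win + 1]
                    score = sad(patch_l, patch_r)
                    if score < best_score:
                        best_d, best_score = d, score
                disp[r, c] = best_d
        return disp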

The method described above is a brute-force search algorithm. With large patch and/or image sizes, this technique can be very time-consuming, as pixels are constantly re-examined to find the best score. It also performs a great deal of redundant work, because neighbouring patches overlap almost entirely. A more efficient algorithm involves remembering all the values from the previous pixel, and an even more efficient one additionally remembers the column sums from the previous row. Techniques that save previous information in this way can greatly increase the algorithmic efficiency of this image-analysing process.
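
One common way to realise this kind of reuse (a sketch under the assumption that SciPy is available; the structure and parameters are illustrative, not the exact row-and-column bookkeeping described above) is to compute the pixel differences once per candidate disparity and then sum them over all windows with a separable box filter, whose internal running row and column sums play the role of the remembered values:

    from scipy.ndimage import uniform_filter

    def disparity_map_fast(left, right, max_disparity=64, half_win=3):
        # For each disparity, every pixel difference is computed once and
        # shared by all of the overlapping windows that contain it.
        rows, cols = left.shape
        best_score = np.full((rows, cols), np.inf)
        disp = np.zeros((rows, cols))
        for d in range(max_disparity):
            shifted = np.roll(right, d, axis=1)   # column c now holds R(r, c-d)
            sad_img = uniform_filter(np.abs(left - shifted),
                                     size=2 * half_win + 1)
            better = sad_img < best_score
            best_score[better] = sad_img[better]
            disp[better] = d
        return disp

Note that np.roll wraps around at the image border, so the first max_disparity columns of the result are meaningless and would be masked out in practice; uniform_filter returns the window mean rather than the sum, which does not change which disparity wins.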

Uses of disparity from images

Knowledge of disparity can be used in the further extraction of information from stereo images. The case in which disparity is most useful is depth/distance calculation: disparity and distance from the cameras are inversely related, so as the distance from the cameras increases, the disparity decreases. This is what allows for depth perception in stereo images. Using geometry and algebra, the points that appear in the 2D stereo images can be mapped as coordinates in 3D space.
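
The inverse relation can be made concrete for the standard rectified pinhole-camera model (the usual formalisation of the setup described above, though the text does not spell it out): with focal length f (in pixels), baseline B, and disparity d, the depth Z of a point is

    Z = \frac{f \cdot B}{d}

so halving the disparity doubles the estimated distance, and small disparities make the estimate increasingly sensitive to matching errors.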

This concept is particularly useful for navigation. For example, the Mars Exploration Rover uses a similar method for scanning the terrain for obstacles. The rover captures a pair of images with its stereoscopic navigation cameras, and disparity calculations are performed in order to detect elevated objects (such as boulders). Additionally, location and speed data can be extracted from subsequent stereo images by measuring the displacement of objects relative to the rover. In some cases this is the best source of such information, as the encoder sensors in the wheels may be inaccurate due to tire slippage.

See also

Binocular summation
Binocular vision
Cyclodisparity
Epipolar geometry

References

Qian, N., Binocular Disparity and the Perception of Depth, Neuron, 18, 359–368, 1997.
Gonzalez, F. and Perez, R., Neural mechanisms underlying stereoscopic vision, Prog Neurobiol, 55(3), 191–224, 1998.
Linda G. Shapiro and George C. Stockman (2001). Computer Vision. Prentice Hall, 371–409. ISBN 0-13-030796-3.
"The Computer Vision Laboratory." JPL.NASA.GOV. JPL/NASA, n.d. Web. 5 Jun 2011.
"Spacecraft: Surface Operations: Rover." JPL.NASA.GOV. JPL/NASA, n.d. Web. 5 Jun 2011. http://marsrovers.jpl.nasa.gov/mission/spacecraft_rover_eyes.html