Much of our behavior depends upon accurate perception of the three-dimensional spatial layout of the environment. Multiple sources of visual information are available to specify this layout, including stereopsis, motion parallax, shading, and perspective. This raises the question of how the visual system combines these sources of information, a question also referred to as the problem of sensor fusion. This project explores the role of perceptual surfaces in mediating the integration of two kinds of information for space perception: stereopsis and perspective. By integrating the local stereoscopic information relating an object to a surface with the stereoscopic and perspective information operating across the entire extent of the surface, the visual system can substantially enhance the accuracy and precision with which the object's spatial location is perceived. It has already been shown that the perceived slant of a background surface can influence the perceived relative depth of two probes suspended in front of it. Our project uses this paradigm, with computer-generated stereoscopic displays, to investigate the processes by which local depth signals and surface slant signals are integrated. The first experiments ask what surface characteristics facilitate surface-mediated integration of information for probe depths: does integration depend on the density of the surface texture, on whether the surface is continuous or interrupted by a gap, and on whether the surface lies behind or merely adjacent to the probes? A second series of experiments investigates what we call the "topography of surface mediation": how is surface mediation affected by the probes' separation from each other, by their separation from the boundaries of the surface, and by the presence or absence of eye movements? A third series of experiments examines the relative effects on perceived probe depth of different types of information specifying surface slant.
We examine perspective, gradients of stereoscopic discontinuity at surface boundaries, and a closely related form of information produced by gradients of texture discontinuity at surface boundaries. A final experiment examines the perceptual response to inconsistent information about the relative depth of the probes. The importance of these experiments lies in their contributions to two related problems in the contemporary study of visual space perception. The first is the rarely studied question of how objects and extended surfaces interact in the perception of complex spatial layouts. The second is the current debate about the mechanisms underlying the integration of spatial information (sensor fusion).
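The stereoscopic displays described above rest on the standard small-angle geometry of binocular disparity: a point at distance d, viewed with interocular distance I while fixating at distance f, carries a horizontal disparity of roughly I(1/d - 1/f) radians. As a hedged illustration only (this is not the project's rendering code, and the viewing distances are assumed for the example), the relative disparity between two depth probes could be computed as follows:

```python
import math

def disparity_rad(iod_m, fix_dist_m, obj_dist_m):
    """Approximate horizontal binocular disparity (radians) of a point at
    obj_dist_m when the eyes fixate at fix_dist_m, for interocular distance
    iod_m. Small-angle approximation: eta ~ I * (1/d - 1/f); crossed
    (nearer-than-fixation) disparity comes out positive."""
    return iod_m * (1.0 / obj_dist_m - 1.0 / fix_dist_m)

# Hypothetical viewing situation: two probes 2 cm apart in depth,
# fixation at 50 cm, typical interocular distance of 6.5 cm.
iod = 0.065
near = disparity_rad(iod, 0.50, 0.49)   # probe 1 cm nearer than fixation
far = disparity_rad(iod, 0.50, 0.51)    # probe 1 cm farther than fixation

# Relative disparity between the probes, converted to arc minutes;
# this is the signal a relative-depth judgment would draw on.
rel_arcmin = math.degrees(near - far) * 60.0
```

With these assumed values the two probes differ by roughly a quarter of a degree of relative disparity, comfortably above typical stereoacuity thresholds; the experiments manipulate how surface context modulates the use of such a signal.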