The present invention relates generally to the field of digital image processing and, more particularly, to a method and apparatus to enable tactile perception of visual images.
There is an increasing interest in various forms of virtual reality for business, entertainment, and educational purposes. In its purest form, virtual reality involves user interaction with a computer-simulated virtual environment. An early application of virtual reality was in various types of simulators, such as flight simulators. Today, the most common use of virtual reality is in connection with on-line video games, such as Linden Labs' Second Life, where the user interacts with a virtual world.
Recently, there has been interest in augmented reality, which combines computer-generated, virtual reality elements with real world experiences. Examples of augmented reality include the yellow “first down” line seen in television broadcasts of football games and the colored trail showing the motion of the puck in television broadcasts of hockey games. Current research in the field of augmented reality focuses primarily on the use of digital images which are processed and “augmented” by the addition of computer-generated graphics.
The present invention relates generally to a method and apparatus for augmenting visual perception of a digital image that enables a user to “feel” remote objects depicted in a visual image. Exemplary embodiments of the invention detect image texture in a digital image and generate tactile feedback control signals as a function of the detected image texture. A tactile feedback device, such as a vibrator, converts the tactile feedback control signals into tactile sensations. The vibrator may vary the intensity, frequency, and/or duty cycle of the vibration responsive to the tactile feedback control signals. In one exemplary embodiment, edge detection techniques are used to detect discontinuities in the digital image, such as sharp changes in image luminous intensity.
In one exemplary embodiment, tactile feedback is generated for the user of a video camera while the user captures a scene. As the user pans the scene with a video camera, the image captured by the video camera changes. The successive frames of the video may be processed in real time and detected changes from frame-to-frame may be used to generate tactile feedback for the user.
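The frame-to-frame change detection described above can be illustrated with a minimal sketch. The function name `frame_change_score`, the grayscale 2-D array representation of frames, and the threshold value are illustrative assumptions, not elements of the claimed apparatus:

```python
import numpy as np

def frame_change_score(prev_frame, cur_frame, threshold=10):
    """Return the fraction of pixels whose intensity changed by more
    than `threshold` between two consecutive grayscale frames."""
    diff = np.abs(cur_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(np.mean(diff > threshold))

# A flat frame followed by one with a bright vertical stripe:
prev = np.zeros((4, 4), dtype=np.uint8)
cur = prev.copy()
cur[:, 2] = 200
print(frame_change_score(prev, cur))  # 0.25: one of four columns changed
```

A score such as this could be sampled once per frame and fed to the tactile feedback stage, so that panning across detailed regions of the scene produces stronger sensations than panning across uniform ones.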
In another exemplary embodiment, a still image stored in memory is displayed to the user on a display. The user moves a cursor over the digital image to “feel” the objects depicted in the image. The cursor functions as a “digital finger.” As the digital finger moves over the image, discontinuities in the image where the digital finger traces a path are detected and used to generate tactile feedback for the user.
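The “digital finger” tracing of a still image can be sketched as sampling intensity along the cursor path and flagging discontinuities. The function name `feedback_along_path`, the path representation as (row, column) pairs, and the jump threshold are hypothetical choices for illustration only:

```python
import numpy as np

def feedback_along_path(image, path, jump=30):
    """Return a list of booleans, one per step of the cursor path,
    marking where the 'digital finger' crosses an intensity jump."""
    events = []
    for (r0, c0), (r1, c1) in zip(path, path[1:]):
        step = abs(int(image[r1, c1]) - int(image[r0, c0]))
        events.append(step > jump)
    return events

# Image with a dark left half and bright right half:
img = np.zeros((3, 6), dtype=np.uint8)
img[:, 3:] = 255
path = [(1, 0), (1, 1), (1, 2), (1, 3), (1, 4)]
print(feedback_along_path(img, path))  # edge crossed between columns 2 and 3
```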
Referring now to the drawings, exemplary embodiments of an augmented reality system 10 to enhance visual perception of a recorded scene with a simulated sense of touch will be described. The augmented reality system 10, shown in
The basic function of the touch simulator 20 is to translate digital images of remote objects in a digital scene 50 into tactile feedback control signals representing tactile sensation. The touch simulator 20 may comprise one or more processors, hardware, or a combination thereof for processing digital scenes, identifying image textures within the digital scene, and generating tactile feedback control signals based on the detected image textures. In one exemplary embodiment, the touch simulator 20 comprises an image processor 22 and a tactile feedback processor 24. The image processor 22 receives a digital scene 50 from the image source 12, analyzes the visual content of the digital scene 50, and outputs image texture information to the tactile feedback processor 24. For example, the image texture information may reflect the discontinuities in the digital scene 50, such as when an edge is encountered by the reference object 54. The tactile feedback processor 24 processes the image texture information from the image processor 22 to generate a tactile feedback control signal to control a tactile feedback device 30.
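The two-stage split between the image processor 22 and the tactile feedback processor 24 can be sketched as follows. The class name, the crude max-minus-min texture measure, and the 0-255 control range are illustrative assumptions rather than the claimed implementation:

```python
import numpy as np

class TouchSimulator:
    """Toy sketch of the image-processor / feedback-processor split:
    the image stage measures local contrast, and the feedback stage
    maps it to a 0-255 vibration-intensity control value."""

    def image_texture(self, patch):
        # Crude texture measure: intensity range within the patch.
        return float(patch.max()) - float(patch.min())

    def control_signal(self, texture, full_scale=255.0):
        return int(round(255 * min(texture / full_scale, 1.0)))

    def process(self, patch):
        return self.control_signal(self.image_texture(patch))

sim = TouchSimulator()
flat = np.full((5, 5), 128, dtype=np.uint8)
edge = flat.copy()
edge[:, 2:] = 255
print(sim.process(flat), sim.process(edge))  # 0 for flat, 127 for an edge
```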
The tactile feedback device 30 may comprise any transducer that converts electrical signals into tactile sensations. For example, the tactile feedback device 30 may comprise one or more vibrators that convert electrical signals into vibrations that may be sensed by the user. The tactile feedback device 30 may be incorporated into an image capture device, such as a video camera, so that tactile feedback is provided to the user while the digital scene 50 is being captured. The tactile feedback device 30 may also be incorporated into a mouse or other pointing device that controls movement of the reference object 54 relative to the objects in the digital scene 50. In embodiments where the tactile feedback device 30 is incorporated in a device (e.g. a mouse) that is separate from other elements of the augmented reality system 10, the tactile feedback control signal may be sent to the tactile feedback device 30 via a wired or wireless link.
The tactile feedback control signals generated by the tactile feedback processor 24 may be used to control one or more properties of the tactile feedback device 30. In the case of a vibrator, for example, the tactile feedback control signals may be used to control the intensity, frequency, duration, or other properties of the vibration depending on the image texture. For example, when the reference object 54 crosses the edges of the columns shown in the digital scene 50 in
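The mapping from image texture to vibration intensity, frequency, and duty cycle might be sketched as below. The particular ranges (50-200 Hz, 10-100% duty cycle) and the linear mapping are purely illustrative assumptions:

```python
def vibration_params(edge_strength, max_strength=255.0):
    """Map an edge-strength value to hypothetical vibrator settings:
    stronger edges give higher intensity and a longer duty cycle."""
    s = max(0.0, min(edge_strength / max_strength, 1.0))
    return {
        "intensity": round(s, 2),               # 0.0 .. 1.0 drive amplitude
        "frequency_hz": 50 + int(150 * s),      # 50 .. 200 Hz
        "duty_cycle": round(0.1 + 0.9 * s, 2),  # 10% .. 100%
    }

print(vibration_params(255))  # strongest edge: full intensity
print(vibration_params(0))    # no edge: only the baseline settings
```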
It will typically not be necessary to analyze the entire digital scene 50. Instead, the image processor 22 may restrict analysis of the digital scene 50 to a small area around the reference object 54. Thus, the reference object 54 functions somewhat like a virtual finger.
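Restricting analysis to a small neighbourhood around the reference object 54 can be sketched as a clamped window crop. The function name and window size are illustrative:

```python
import numpy as np

def window_around(image, center, half=2):
    """Crop a (2*half+1)-square window around the reference object,
    clamped to the image borders, so only the local neighbourhood
    of the 'virtual finger' needs to be analysed."""
    r, c = center
    h, w = image.shape
    r0, r1 = max(0, r - half), min(h, r + half + 1)
    c0, c1 = max(0, c - half), min(w, c + half + 1)
    return image[r0:r1, c0:c1]

img = np.arange(100, dtype=np.uint8).reshape(10, 10)
print(window_around(img, (5, 5)).shape)  # (5, 5) interior window
print(window_around(img, (0, 0)).shape)  # (3, 3) clamped at the corner
```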
In one exemplary embodiment, the image processor 22 detects the visual texture of an image based on the spatial variations in pixel intensity and/or pixel color. The visual textures detected may comprise edges, lines, boundaries, texture patterns, etc. in the digital scene 50. The visual textures in the digital scene 50 result from the physical characteristics or properties of the objects captured in the digital scene 50. For example, changes in depth in a real scene may result in edges or lines that may be detected by the image processor 22. Similarly, surface features of objects captured in a digital scene 50 may produce texture patterns that may be detected.
The image processor 22 may apply known edge detection and/or texture analysis algorithms to analyze the image and output image texture information to the tactile feedback processor 24. Edge detection is a fundamental process used in image processing applications to obtain information about images as a first step in feature extraction and object segmentation. There are many known techniques for edge detection. The majority of edge detection techniques may be classified into two groups, referred to as gradient methods and Laplacian methods. The gradient methods detect edges by looking for maxima and minima in the first derivative (e.g., gradient) of the image. The Laplacian methods search for zero crossings in the second derivative of the image to find edges. Exemplary edge detection techniques suitable for the present invention include Sobel edge detection, Canny edge detection, and differential edge detection.
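As an illustration of the gradient method, Sobel edge detection convolves the image with a pair of 3x3 kernels and takes the gradient magnitude. The sketch below is a direct (unoptimized) implementation for the valid interior region:

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_magnitude(image):
    """Gradient-method edge detection: convolve with the Sobel kernels
    and return the gradient magnitude (valid region only)."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    img = image.astype(float)
    for r in range(h - 2):
        for c in range(w - 2):
            patch = img[r:r + 3, c:c + 3]
            gx = np.sum(patch * SOBEL_X)
            gy = np.sum(patch * SOBEL_Y)
            out[r, c] = np.hypot(gx, gy)
    return out

# A vertical step edge produces a strong response along the boundary:
img = np.zeros((5, 6))
img[:, 3:] = 100
mag = sobel_magnitude(img)
print(mag[1])  # response peaks on either side of the step
```

In practice a library routine (e.g. an optimized convolution) would be used instead of the explicit loops, but the output is the same gradient-magnitude map from which edge locations are read.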
In some embodiments of the invention, image processor 22 may also perform texture analysis to detect the surface properties of objects captured in the digital scene 50. For example, a picture of a stone wall or brick wall will produce a near regular texture pattern that may be detected through texture analysis. Also, surface properties of the depicted objects, such as the degree of roughness and coloration, may result in texture patterns in the visual image. The texture patterns may be structured or stochastic. Texture analysis may be used to identify regions of an image where the texture pattern is homogenous. The regions of an image having a homogenous texture pattern may be classified and tactile feedback may be generated based on the classification of texture patterns. For example, the textures may be classified based on varying degrees of roughness. When tactile feedback is in the form of vibration, one or more of the frequency, intensity, and duty cycle of the vibration may be varied, depending upon the roughness of the textures in an image.
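One simple way to classify regions by roughness, as described above, is to use the local intensity standard deviation. The class labels and thresholds below are illustrative assumptions, not values specified by the invention:

```python
import numpy as np

def roughness_class(region):
    """Classify a homogeneous region as 'smooth', 'medium', or 'rough'
    from its intensity standard deviation (thresholds are illustrative)."""
    sigma = float(np.std(region.astype(float)))
    if sigma < 5:
        return "smooth"
    if sigma < 40:
        return "medium"
    return "rough"

smooth = np.full((8, 8), 120, dtype=np.uint8)
rough = np.zeros((8, 8), dtype=np.uint8)
rough[::2] = 255  # alternating rows: a coarse, high-contrast pattern
print(roughness_class(smooth), roughness_class(rough))  # smooth rough
```

The resulting class could then select among vibration presets, e.g. a higher frequency and duty cycle for rougher textures.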
In some embodiments, it may be advantageous to preprocess an entire digital scene embodied in a previously captured and stored image to create an image map to facilitate generation of tactile feedback control signals. The image map includes the edges and other textural features of the image. Thus, when the user pans or zooms the image, the current location of the reference object 54 may be compared with the predetermined location of edges and other textural features of the image map. The preprocessing may be performed when the image is opened for viewing, and the image map created can be stored either temporarily or permanently in memory. In the latter case, the image map may be stored as metadata with the image, or otherwise associated with the image.
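Such a precomputed image map can be sketched as a boolean edge map built once per image, after which a lookup at the reference-object position is a constant-time array access. The simple difference-based detector and threshold here are illustrative stand-ins for whichever edge detector is actually used:

```python
import numpy as np

def build_edge_map(image, threshold=50):
    """Preprocess the whole image once into a boolean edge map using
    horizontal and vertical intensity differences; later lookups at
    the reference-object position are then O(1)."""
    img = image.astype(int)
    edges = np.zeros(img.shape, dtype=bool)
    edges[:, 1:] |= np.abs(np.diff(img, axis=1)) > threshold
    edges[1:, :] |= np.abs(np.diff(img, axis=0)) > threshold
    return edges

img = np.zeros((4, 6), dtype=np.uint8)
img[:, 3:] = 200
edge_map = build_edge_map(img)
print(bool(edge_map[2, 3]), bool(edge_map[2, 5]))  # True at the step, False inside
```

Stored alongside the image (temporarily, or permanently as metadata), this map removes the need to re-run edge detection on every cursor movement.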
The augmented reality system 10 may be incorporated into an image capture device or image display device. For example, the augmented reality system 10 may be incorporated into a video camera or still camera, a cellular phone with video capability, or a computer.
The video camera 200 further comprises a tactile feedback processor 216 and a tactile feedback device 218 to generate tactile sensations responsive to the image texture of the digital scene 50 captured by the video camera 200. The image processor 206 and tactile feedback processor 216 function as the touch simulator 20 shown in
The video camera 200 is used in a conventional manner to capture video of a real scene. The captured video is stored in memory 214 and may be output to the display 210 in real time while the video is being recorded. The reference object 54 may be shown in the display 210 as previously described. As the user moves the camera 200 to record a digital scene 50, the reference object 54 will move within the recorded scene 50. The captured video is processed in real time and tactile feedback is generated to provide the user with a sense of touch. Those skilled in the art will appreciate that the tactile feedback can be generated even when image recording is turned off and the scene is being viewed but not recorded.
The cellular phone 300 may store digital images including digital video in memory 304, which the user may view on the display 306. Additionally, the cellular phone 300 may include an integrated video camera 312. The images captured by the camera 312 may be stored in memory 304 for subsequent viewing or output to the display 306 in real time while the video is being captured.
The cellular phone 300 may include an image processor 314, tactile feedback processor 316 and tactile feedback device 318. The image processor 314 may perform conventional image processing functions, such as compression and decompression. Additionally, the image processor 314, along with the tactile feedback processor 316, function as the touch simulator 20 shown in
The cellular phone 300 may be used as a video camera as previously described. In this case, tactile feedback may be generated in real time while a scene 50 is being captured and displayed on the viewfinder and/or stored in memory. Also, digital scenes 50 stored in memory 304 may be retrieved from memory 304 and displayed for viewing on display 306. The reference object 54 as shown in
The computer 400 further includes an image processor 414, a tactile feedback processor 416, and a tactile feedback device 418 that may be connected to the computer 400 either by wire or wirelessly. The image processor 414 may perform conventional image processing functions, such as compression and decompression. Additionally, the image processor 414, along with the tactile feedback processor 416, function as the touch simulator 20 shown in
The computer 400 may store digital images, including digital video, in memory 404, which the user may view on the display 406. The displayed image may be a video image or a still image. During viewing, a reference object 54 may be displayed for the user on the display 406 overlying the image. The user may pan and zoom the image using standard user controls 408, such as a mouse, trackball, jog dial, navigation keys, etc. The reference object 54 may remain fixed in the center of the display 406. As the user navigates (i.e., pans and zooms) the image, the position of the reference object 54 relative to the image changes. In other embodiments, the user may use the user controls 408 to move the reference object 54 over the image. In either case, the relative position of the reference object 54 with respect to the image changes. The touch simulator 20 analyzes the visual content of the image as the relative position of the reference object changes and provides tactile feedback to the user. In this case, the tactile feedback device 418 may be contained in a mouse, keyboard, or other input control.
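The fixed-center navigation model can be sketched as a coordinate mapping: the reference object stays at the display center while pan and zoom change which image pixel lies under it. The function name and the simple pan-plus-scaled-center model are illustrative assumptions about the viewer's navigation state:

```python
def reference_position(pan_offset, zoom, display_center, image_shape):
    """Map the fixed display-center reference object back into image
    coordinates given the current pan offset and zoom factor
    (a simplified model of the viewer's navigation state)."""
    h, w = image_shape
    r = pan_offset[0] + display_center[0] / zoom
    c = pan_offset[1] + display_center[1] / zoom
    # Clamp so the virtual finger never leaves the image:
    return (min(max(r, 0), h - 1), min(max(c, 0), w - 1))

# Display center (120, 160); panning by 30 rows shifts the sampled pixel:
print(reference_position((0, 0), 1.0, (120, 160), (480, 640)))   # (120.0, 160.0)
print(reference_position((30, 0), 1.0, (120, 160), (480, 640)))  # (150.0, 160.0)
```

The image pixel returned by this mapping is the one whose local texture the touch simulator 20 analyzes on each navigation step.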
The present invention may, of course, be carried out in other ways than those specifically set forth herein without departing from essential characteristics of the invention. The present embodiments are to be considered in all respects as illustrative and not restrictive, and all changes coming within the meaning and equivalency range of the appended claims are intended to be embraced therein.