The video game industry has seen many changes over the years. As computing power has expanded, developers of video games have likewise created game software that takes advantage of these increases in computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic game experience.
Example gaming platforms include the Sony Playstation or Sony Playstation2 (PS2), each of which is sold in the form of a game console. As is well known, the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers. The game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometry transformations, and other glue hardware, firmware, and software. The game console is further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online gaming is also possible, wherein a user can interactively play against or with other users over the Internet.
As game complexity continues to intrigue players, gaming software and hardware manufacturers have continued to innovate to enable additional interactivity. In reality, however, the way in which users interact with a game has not changed dramatically over the years. Commonly, users still play computer games using handheld controllers or interact with programs using mouse pointing devices.
In view of the foregoing, there is a need for methods and systems that enable more advanced user interactivity with game play.
Broadly speaking, the present invention fills these needs by providing methods, systems and apparatus that enable dynamic user interactivity with a computing system. In one embodiment, the computing system will be executing a program and the interactivity will be with the program. The program may be, for example, a video game that defines interactive objects or features. The interactivity includes providing users with an ability to adjust a gearing component that will adjust a degree by which processing is performed.
In one embodiment, gearing can be applied to the relationship between movement of an input device and an amount of movement processed by an object or feature of a computer program.
In another embodiment, gearing can be applied to a feature of a computer program, and detection of an input device can be based on processing by an inertial analyzer. The inertial analyzer will track an input device for inertial activity, and the inertial analyzer can then convey the information to a program. The program will then take the output from the inertial analyzer so that a gearing amount can be applied to the output. The gearing amount will then dictate a degree or ratio by which a program will compute an operation. The operation can take on any number of forms, and one example of the operation can be to generate a noise, a variable noise, a movement by an object, or a variable. If the output is a variable, the variable (e.g., a multiplier or the like) may be used to complete the execution of a process, such that the process will take into account the amount of gearing. The amount of gearing can be preset, set dynamically by the user or adjusted on demand.
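By way of a non-limiting illustration only, the following Python sketch shows the basic idea of applying a gearing amount to the output of an inertial analyzer; the function and variable names, as well as the numeric values, are assumptions made for this example and do not appear in the embodiments described herein.

    def apply_gearing(analyzer_output, gearing):
        """Scale a value reported by the inertial analyzer by the current gearing amount."""
        return analyzer_output * gearing

    # The same controller motion yields different program responses depending on
    # the gearing in effect at that moment (values below are illustrative).
    controller_motion = 0.5
    print(apply_gearing(controller_motion, 0.5))   # damped response: 0.25
    print(apply_gearing(controller_motion, 4.0))   # amplified response: 2.0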
In one embodiment, the tracking can be by way of an acoustic analyzer. The acoustic analyzer is configured to receive acoustic signals from an input device, and the acoustic analyzer can convey a gearing amount to be applied to the command or interaction being performed. The acoustic analyzer can be in the form of a computer program segment(s) or specifically defined on a circuit that is designed to process acoustic signal information. The acoustic signal information can therefore include gearing data that may be dynamically set by a program or set on demand by the user through the input device (e.g., by selecting a button on a controller, a voice command, or the like).
In one embodiment, the tracking of the input device may be through an image analyzer. The image analyzer, as will be discussed below, can include a camera that captures images of a space where a user and an input device are located. In this example, the image analyzer is determining position of the controller to cause some respective action to a feature of a processing program. The program may be a game, and the feature may be an object that is being controlled by the input device. The image analyzer is further configured to mix the position data with an input gearing value. The gearing value can be provided by the user dynamically during execution or can be set by a program depending on activity within an execution session. The gearing input will set a relative impact on some processing by the computer program based on an input gesture or action by the user. In one embodiment, the gearing will translate a command or action from a user or user input device to a feature of a program. The feature of the program need not be an object that is visible, but can also include the adjustment of a variable used to calculate some parameter, estimation or translation of either sound, vibration or image movement. Gearing will therefore provide an additional sense of control to the interactivity provided to and with a program and features of the program.
In still another embodiment, a mixer analyzer is provided. The mixer analyzer is designed to generate a hybrid effect on a feature of the game. For instance, the mixer analyzer can take input from a combination of the image analyzer, the acoustic analyzer, the inertial analyzer, and the like. The mixer analyzer can therefore, in one embodiment, receive several gearing variables, which can then be mixed and synthesized to generate a hybrid result, command or interaction with a feature of a program. Again, the feature of the program should be broadly understood to include visual and non-visual objects, variables used in the processing of an operation, adjustments to audible reactions, and the like.
In one specific example, the amount by which movement by a user's controller will translate to movement or action on a feature of a game will be at least partially related to the set or determined gearing. The gearing can be dynamically set, preset for the game or adjusted during game play by the user, and the response is mapped to the video game object or feature to provide for another level of user interactivity and an enhanced experience.
It should be appreciated that the present invention can be implemented in numerous ways, including as a process, an apparatus, a system, a device, or a method. Several inventive embodiments of the present invention are described below.
In one embodiment, a method for interactive interfacing with a computer gaming system is provided. The computer gaming system includes a video capture device for capturing image data. The method includes displaying an input device to the video capture device, where the input device has a plurality of lights that are modulated so as to convey positioning of the input device and communication data that is to be interpreted by the computer gaming system based on analysis of the captured image data and a state of the plurality of lights. The method also includes defining movement of an object of a computer game that is to be executed by the computer gaming system, such that the movement of the object may be mapped to movements in position of the input device as detected in the captured image data. The method then establishes a gearing between the movement of the object of the computer game versus the movements in position of the input device. The gearing establishes a ratio between movements in position of the input device and movements of the object. The gearing can be set dynamically by the game, by the user, or can be preset by software or user configured in accordance with a gearing algorithm.
In another embodiment, a method for interactive interfacing with a computer gaming system is disclosed. The computer gaming system includes a video capture device for capturing image data. The method includes displaying an input device to the video capture device, such that the input device conveys positioning of the input device and communication data that is to be interpreted by the computer gaming system based on analysis of the captured image data. The method then defines movement of an object of a computer game that is to be executed by the computer gaming system, and the movement of the object may be mapped to movements in position of the input device as detected in the captured image data. The method then establishes a gearing between the movement of the object of the computer game versus the movements in position of the input device. The gearing establishes a user adjustable ratio between movements in position of the input device and movements of the object.
In yet another embodiment, computer readable media including program instructions for interactive interfacing with a computer gaming system is provided. The computer gaming system includes a video capture device for capturing image data. The computer readable media includes program instructions for detecting the display of an input device to the video capture device, the input device conveying positioning of the input device and communication data that is to be interpreted by the computer gaming system based on analysis of the captured image data. The computer readable media further includes program instructions for defining movement of an object of a computer game that is to be executed by the computer gaming system, where the movement of the object is mapped to movements in position of the input device as detected in the captured image data. The computer readable media also includes program instructions for establishing a gearing between the movement of the object of the computer game versus the movements in position of the input device, where the gearing may be configured to establish a user adjustable ratio between movements in position of the input device and movements of the object. The user adjustable ratio is capable of being dynamically changed during game play, before a game play session, or during certain action events during a game.
In still another embodiment, a system for enabling dynamic user interactivity between user actions and actions to be performed by an object of a computer program is provided. The system includes a computing system, a video capture device coupled to the computing system, a display, and the display receives input from the computing system. The system further includes an input device for interfacing with the computer program that is to be executed by the computing system. The input device having a gearing control for establishing a ratio between movement of the input device as mapped to movement of an object being controlled. The object being defined by the computer program and executed by the computing system. The movement of the input device being identified by the video capture device and the movement of the object as illustrated on the display having a set gearing value as set by the gearing control of the input device. In this embodiment, the gearing associated with the input device may be applied to a general application, and does not necessarily have to be for a game.
In another embodiment, an apparatus for enabling dynamic user interactivity between user actions and actions to be performed by an object of a computer program is provided. The apparatus includes an input device for interfacing with the computer program that is to be executed by a computing system. The input device has a gearing control for establishing a ratio between movement of the input device as mapped to movement of an object being controlled. The object being defined by the computer program and executed by the computing system, and the movement of the input device is identified by a video capture device and the movement of the object is illustrated on a display having a set gearing value as set by the gearing control of the input device.
The advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
The present invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, and like reference numerals designate like structural elements.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to obscure the present invention.
The technology described herein can be used to provide actively geared inputs for an interaction with a computer program. Gearing, in the general and broadest sense, can be defined as an input that can have varying degrees in magnitude and/or time. The degree of gearing can then be communicated to a computing system. The degree of gearing may be applied to a process executed by the computing system. By analogy, a process can be imagined as a bucket of fluid having an input and an output. The bucket of fluid is a process that is executing on a system, and the gearing therefore controls an aspect of the processing performed by the computing system. In one example, the gearing can control the rate at which the fluid is emptied from the fluid bucket relative to an input amount, which might be thought of as drops of fluid going into the bucket. Thus, the fill rate may be dynamic, the drain rate may be dynamic, and the drain rate might be impacted by the gearing. The gearing can thus be adjusted or timed so as to tune a changing value that may be streaming to a program, such as a game program. The gearing may also impact a counter, such as a moving data counter that then controls an action by a processor or eventually a game element, object, player, character, etc.
Taking this analogy to a more tangible computing example, the rate at which the fluid is emptied might be the rate at which control is passed to or executed by a feature of a computer program, in response to some input plus gearing. The feature of the computer program may be an object, a process, a variable, or predefined/custom algorithm, character, game player, mouse (2D or 3D), etc. The result of the processing, which may have been altered by the gearing, can be conveyed to an observer in any number of ways. One way may be visually on a display screen, audibly via sound, vibration acoustics via feel, a combination thereof, etc., or simply by the modified response of processing for an interactive element of a game or program.
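By way of a non-limiting illustration of the bucket analogy above, the following Python sketch models a process whose drain rate is scaled by a gearing amount; the class name, parameter names and numeric values are assumptions made for this example only.

    class GearedBucket:
        """Fluid-bucket analogy: input drops fill the bucket, gearing scales the drain rate."""

        def __init__(self, gearing):
            self.level = 0.0
            self.gearing = gearing

        def update(self, input_drops, dt):
            """Add input, drain at a geared rate, and return the amount conveyed downstream."""
            self.level += input_drops
            drained = min(self.level, self.gearing * dt)
            self.level -= drained
            return drained   # this value can drive a counter, object, or other program feature

    bucket = GearedBucket(gearing=2.0)
    for frame in range(5):
        conveyed = bucket.update(input_drops=1.0, dt=1.0 / 60.0)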
The input can be obtained by tracking performed via: (1) image analysis, (2) inertial analysis, (3) acoustic analysis, or a hybrid mixed analysis of (1), (2) or (3). Various examples are provided regarding image analysis and applied gearing, but it should be understood that the tracking is not limited to video, but can be accomplished in numerous ways, and in particular, by inertial analysis, acoustic analysis, mixtures of these and other suitable analyzers.
In various embodiments, a computer or gaming system having a video camera (e.g., image analysis) can process image data and identify various actions taking place in a zone of focus or given volume that may be in front of the video camera. Such actions typically include moving or rotating the object in three dimensional space or actuating any of a variety of controls such as buttons, dials, joysticks, etc. In addition to these techniques, the present technology further provides the additional functionality of adjusting a scaling factor, referred to herein as gearing, to adjust the sensitivity of the input with respect to one or more corresponding actions on a display screen or a feature of a program. For instance, the actions on the display screen may be of an object that may be the focus of a video game. The object may also be a feature of a program, such as a variable, a multiplier, or a computation that will then be rendered as sound, vibration, images on a display screen or a combination of these and other representations of the geared output.
In another embodiment, gearing can be applied to a feature of a computer program, and detection of an input device can be based on processing by an inertial analyzer. The inertial analyzer will track an input device for inertial activity, and the inertial analyzer can then convey the information to a program. The program will then take the output from the inertial analyzer so that a gearing amount can be applied to the output. The gearing amount will then dictate a degree or ratio by which a program will compute an operation. The operation can take on any number of forms, and one example of the operation can be to generate a noise, a variable noise, vibration, a movement by an object, or computation by a program that then outputs a visible and/or audible result. If the output is a variable, the variable may be used to complete the execution of a process, such that the process will take into account the amount of gearing. The amount of gearing can be preset, set dynamically by the user or adjusted on demand.
Various types of inertial sensor devices may be used to provide information on 6-degrees of freedom (e.g., X, Y and Z translation (e.g., acceleration) and rotation about X, Y and Z axes). Examples of suitable inertial sensors for providing information on 6-degrees of freedom include accelerometers, one or more single axis accelerometers, mechanical gyroscopes, ring laser gyroscopes or combinations of two or more of these.
Signals from the sensor(s) may be analyzed to determine the motion and/or orientation of the controller during play of a video game according to an inventive method. Such a method may be implemented as a series of processor executable program code instructions stored in a processor readable medium and executed on a digital processor. For example, a video game system may include one or more processors. Each processor may be any suitable digital processor unit, e.g., a microprocessor of a type commonly used in video game consoles or custom designed multi-processor cores. In one embodiment, the processor may implement an inertial analyzer through execution of processor readable instructions. A portion of the instructions may be stored in a memory. Alternatively, the inertial analyzer may be implemented in hardware, e.g., as an application specific integrated circuit (ASIC) or digital signal processor (DSP). Such analyzer hardware may be located on the controller or on the console or may be remotely located elsewhere. In hardware implementations, the analyzer may be programmable in response to external signals e.g., from the processor or some other remotely located source, e.g., connected by USB cable, Ethernet, over a network, the Internet, short range wireless connection, broadband wireless, Bluetooth, or a local network.
The inertial analyzer may include or implement instructions that analyze the signals generated by the inertial sensors and utilize information regarding position and/or orientation of a controller. The inertial sensor signals may be analyzed to determine information regarding the position and/or orientation of the controller. The position and/or orientation information may be utilized during play of a video game with the system.
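As a non-limiting illustration of such analysis, the sketch below numerically integrates single-axis accelerometer samples into velocity and position; a practical inertial analyzer would additionally compensate for gravity, sensor bias and drift, which are omitted here, and all names and sample values are assumptions for this example.

    def integrate_acceleration(samples, dt):
        """Integrate acceleration samples (one axis) into velocity and position histories."""
        velocity, position = 0.0, 0.0
        velocities, positions = [], []
        for a in samples:
            velocity += a * dt         # first integration: acceleration -> velocity
            position += velocity * dt  # second integration: velocity -> position
            velocities.append(velocity)
            positions.append(position)
        return velocities, positions

    # Illustrative samples at an assumed 100 Hz sampling rate
    v, p = integrate_acceleration([0.0, 0.5, 1.0, 0.5, 0.0], dt=0.01)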
In one embodiment, a game controller may include one or more inertial sensors, which may provide position and/or orientation information to a processor via an inertial signal. Orientation information may include angular information such as a tilt, roll or yaw of the controller. As noted above, and by way of example, the inertial sensors may include any number and/or combination of accelerometers, gyroscopes or tilt sensors. In one embodiment, the inertial sensors include tilt sensors adapted to sense orientation of the joystick controller with respect to tilt and roll axes, a first accelerometer adapted to sense acceleration along a yaw axis and a second accelerometer adapted to sense angular acceleration with respect to the yaw axis. An accelerometer may be implemented, e.g., as a MEMS device including a mass mounted by one or more springs with sensors for sensing displacement of the mass relative to one or more directions. Signals from the sensors that are dependent on the displacement of the mass may be used to determine an acceleration of the joystick controller. Such techniques may be implemented by instructions from the game program or general program, which may be stored in memory and executed by a processor.
By way of example an accelerometer suitable as an inertial sensor may be a simple mass elastically coupled at three or four points to a frame, e.g., by springs. Pitch and roll axes lie in a plane that intersects the frame, which is mounted to the joystick controller. As the frame (and the joystick controller) rotates about pitch and roll axes the mass will displace under the influence of gravity and the springs will elongate or compress in a way that depends on the angle of pitch and/or roll. The displacement of the mass can be sensed and converted to a signal that is dependent on the amount of pitch and/or roll. Angular acceleration about the yaw axis or linear acceleration along the yaw axis may also produce characteristic patterns of compression and/or elongation of the springs or motion of the mass that can be sensed and converted to signals that are dependent on the amount of angular or linear acceleration. Such an accelerometer device can measure tilt, roll angular acceleration about the yaw axis and linear acceleration along the yaw axis by tracking movement of the mass or compression and expansion forces of the springs. There are a number of different ways to track the position of the mass and/or the forces exerted on it, including resistive strain gauge material, photonic sensors, magnetic sensors, Hall-effect devices, piezoelectric devices, capacitive sensors, and the like.
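The following sketch illustrates, under the assumption of a gravity-dominated (slow-moving) reading and an assumed axis convention, how a three-axis accelerometer measurement can be converted to pitch and roll angles; it is offered only as an illustration and is not drawn from the embodiments themselves.

    import math

    def pitch_and_roll(ax, ay, az):
        """Estimate pitch and roll (radians) from a gravity-dominated accelerometer reading."""
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll = math.atan2(ay, az)
        return pitch, roll

    # Controller at rest, lying flat: gravity lies entirely on the z axis,
    # so both pitch and roll come out near zero.
    print(pitch_and_roll(0.0, 0.0, 9.81))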
In addition, light sources may provide telemetry signals to the processor, e.g., in pulse code, amplitude modulation or frequency modulation format. Such telemetry signals may indicate which buttons are being pressed and/or how hard such buttons are being pressed. Telemetry signals may be encoded into the optical signal, e.g., by pulse coding, pulse width modulation, frequency modulation or light intensity (amplitude) modulation. The processor may decode the telemetry signal from the optical signal and execute a game command in response to the decoded telemetry signal. Telemetry signals may be decoded from analysis of images of the joystick controller obtained by an image capture unit. Alternatively, an apparatus may include a separate optical sensor dedicated to receiving telemetry signals from the light sources.
A processor may use inertial signals from the inertial sensor in conjunction with optical signals from light sources detected by an image capture unit and/or sound source location and characterization information from acoustic signals detected by a microphone array to deduce information on the location and/or orientation of a controller and/or its user. For example, “acoustic radar” sound source location and characterization may be used in conjunction with a microphone array to track a moving voice while motion of the joystick controller is independently tracked (through inertial sensors and or light sources). In acoustic radar, a pre-calibrated listening zone is selected at runtime and sounds originating from sources outside the pre-calibrated listening zone are filtered out. The pre-calibrated listening zones may include a listening zone that corresponds to a volume of focus or field of view of the image capture unit.
In one embodiment, the tracking can be by way of an acoustic analyzer. The acoustic analyzer is configured to receive acoustic signals from an input device, and the acoustic analyzer can convey a gearing amount to be applied to the command or interaction being performed. The acoustic analyzer can be in the form of a computer program segment(s) or specifically defined on a circuit that is designed to process acoustic signal information. The acoustic signal information can therefore include gearing data that may be dynamically set by a program or set on demand by the user through the input device (e.g., by selecting a button on a controller, a voice command, or the like). An example acoustic analyzer is described in US Patent Application, filed May 4, 2006 entitled “SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING”, by inventors Xiadong Mao, Richard L. Marks and Gary M. Zalewski, the entire disclosure of which is hereby incorporated herein by reference.
The Analyzers can be configured with a mapping chain. Mapping chains can be swapped out by the game during game-play as can settings to the Analyzer and to the Mixer.
In one embodiment, the tracking of the input device may be through an image analyzer. The image analyzer, as will be discussed further below, can include a camera that captures images of a space where a user and an input device are located. In this example, the image analyzer is determining position of the controller to cause some respective action to a feature of a processing program. The program may be a game, and the feature may be an object that is being controlled by the input device. The image analyzer is further configured to mix the position data with an input gearing value. The gearing value can be provided by the user dynamically during execution or can be set by a program depending on activity within an execution session. The gearing input will set a relative impact on some processing by the computer program based on an input gesture or action by the user. In one embodiment, the gearing will translate a command or action from a user or user input device to a feature of a program. The feature of the program need not be an object that is visible, but can also include the adjustment of a variable used to calculate some parameter, estimation or translation of either sound, vibration or image movement. Gearing will therefore provide an additional sense of control to the interactivity provided to and with a program and features of the program.
In still another embodiment, a mixer analyzer is provided. The mixer analyzer is designed to generate a hybrid effect on a feature of the game. For instance, the mixer analyzer can take input from a combination of the image analyzer, the acoustic analyzer, the inertial analyzer, and the like. The mixer analyzer can therefore, in one embodiment, receive several gearing variables, which can then be mixed and synthesized to generate a hybrid result, command or interaction with a feature of a program. Again, the feature of the program should be broadly understood to include visual and non-visual objects, variables used in the processing of an operation, adjustments to audible reactions, and the like.
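As one non-limiting way to picture the mixer analyzer, the sketch below blends gearing values reported by several analyzers into a single hybrid gearing using per-analyzer weights; the weighting scheme, names and numeric values are all assumptions made for illustration.

    def mix_gearing(gearings, weights):
        """Weighted blend of gearing values keyed by analyzer name (e.g., image, acoustic, inertial)."""
        total = sum(weights.get(name, 0.0) for name in gearings)
        if total == 0.0:
            return 1.0   # neutral gearing if nothing is weighted
        return sum(g * weights.get(name, 0.0) for name, g in gearings.items()) / total

    hybrid = mix_gearing(
        {"image": 1.5, "acoustic": 0.8, "inertial": 2.0},
        {"image": 0.5, "acoustic": 0.2, "inertial": 0.3},
    )
    # 'hybrid' then scales the command or interaction applied to the program feature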
In one embodiment, image capture device 105 can be as simple as a standard web cam or can include more advanced technology. Image capture device 105 may be capable of capturing images, digitizing the images, and communicating the image data back to the computer 102. In some embodiments, the image capture device will have logic integrated therein for performing the digitizing, and in another embodiment the image capture device 105 will simply transmit an analog video signal to the computer 102 for digitizing. In either case, the image capture device 105 is capable of capturing either color or black and white images of any object located in front of the image capture device 105.
In one embodiment, the user 210 may also select to change or modify the degree of interactivity with the animated airplane 215′. The degree of interactivity may be modified by allowing the user 210 to adjust a “gearing” component that will adjust an amount by which movement by the user's controller 108 (or toy airplane 215) will be mapped to movement by the animated airplane 215′. Depending on the gearing, which can be dynamically set, preset for the game or adjusted during game play by the user 210, the response mapped to the animated airplane 215′ (e.g., video game object) will change to provide for another level of user interactivity and an enhanced experience. Further details regarding the gearing will be provided below with reference to
The video capture device 300 may be configured to provide a depth image. In this description, the terms “depth camera” and “three-dimensional camera” refer to any camera that is capable of obtaining distance or depth information as well as two-dimensional pixel information. For example, a depth camera can utilize controlled infrared lighting to obtain distance information. Another exemplary depth camera can be a stereo camera pair, which triangulates distance information using two standard cameras. Similarly, the term “depth sensing device” refers to any type of device that is capable of obtaining distance information as well as two-dimensional pixel information.
Camera 300 can therefore provide the ability to capture and map the third dimension in addition to normal two-dimensional video imagery. Similar to normal cameras, a depth camera captures two-dimensional data for a plurality of pixels that comprise the video image. These values are color values for the pixels, generally red, green, and blue (RGB) values for each pixel. In this manner, objects captured by the camera appear as two-dimensional objects on a monitor. However, unlike a conventional camera, a depth camera also captures the z-components of the scene, which represent the depth values for the scene. Since the depth values are typically assigned to the z-axis, the depth values are often referred to as z-values.
In operation, a z-value is captured for each pixel of the scene. Each z-value represents a distance from the camera to a particular object in the scene corresponding to the related pixel. In addition, a maximum detection range is defined beyond which depth values will not be detected. This maximum range plane can be utilized by the embodiments of the present invention to provide user defined object tracking. Thus, using a depth camera, each object can be tracked in three dimensions. As a result, a computer system of the embodiments of the present invention can utilize the z-values, along with the two-dimensional pixel data, to create an enhanced three-dimensional interactive environment for the user. For more information on depth analysis, reference may be made to U.S. patent application Ser. No. 10/448,614, entitled System and Method for Providing a Real-time three dimensional interactive environment, having a filing date of May 29, 2003, which is incorporated herein by reference.
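The sketch below gives a simplified, assumed illustration of how per-pixel z-values and a maximum detection range could be combined to track an object in three dimensions; here a depth image is represented as rows of z-values and the tracked position as the centroid of the in-range pixels, which is only one possible approach.

    def track_object(depth_image, max_range):
        """Return the centroid (x, y, z) of pixels closer than max_range, or None if none are."""
        points = [
            (x, y, z)
            for y, row in enumerate(depth_image)
            for x, z in enumerate(row)
            if z < max_range
        ]
        if not points:
            return None
        n = len(points)
        return (sum(p[0] for p in points) / n,
                sum(p[1] for p in points) / n,
                sum(p[2] for p in points) / n)

    # Tiny 2x3 depth image (z-values in meters); anything beyond 2.0 m is ignored
    print(track_object([[1.2, 5.0, 1.4],
                        [1.3, 5.0, 5.0]], max_range=2.0))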
Although a depth camera may be used in accordance with one embodiment, a depth camera should not be construed as necessary for identifying the location and coordinates of an object in three dimensional space. For example, in the scenario depicted in
Returning to
These and additional aspects of the present invention may be implemented by one or more processors which execute software instructions. According to one embodiment of the present invention, a single processor executes both input image processing and output image processing. However, as shown in the figures and for ease of description, the processing operations are shown as being divided between an input image processor 302 and an output image processor 304. It should be noted that the invention is in no way to be interpreted as limited to any special processor configuration, such as more than one processor. The multiple processing blocks shown in
IOP bus 428 couples CPU 424 to various input/output devices and other buses or devices. IOP bus 428 is connected to input/output processor memory 430, controller 432, memory card 434, Universal Serial Bus (USB) port 436, IEEE 1394 (also known as a Firewire interface) port 438, and bus 450. Bus 450 couples several other system components to CPU 424, including operating system (“OS”) ROM 440, flash memory 442, sound processing unit (“SPU”) 444, optical disc control unit 446, and hard disk drive (“HDD”) 448. In one aspect of this embodiment, the video capture device can be directly connected to IOP bus 428 for transmission therethrough to CPU 424, wherein data from the video capture device can be used to change or update the values used to generate the graphics images in GPU 426. Moreover, embodiments of the present invention can use a variety of image processing configurations and techniques, such as those described in U.S. patent application Ser. No. 10/365,120 filed Feb. 11, 2003, and entitled METHOD AND APPARATUS FOR REAL TIME MOTION CAPTURE, which is hereby incorporated by reference in its entirety. The computer processing system may run on a CELL™ processor.
Main memory 514, vector calculation unit 516, GIF 522, OSDROM 526, real time clock (RTC) 528 and input/output port 524 are connected to MPU 512 over data bus 530. Also connected to BUS 530 is image processing unit 538 which is a processor for expanding compressed moving images and texture images, thereby developing the image data. For example, the image processing unit 538 can serve functions for decoding and development of bit streams according to the MPEG2 or MPEG4 standard formats, macroblock decoding, performing inverse discrete cosine transformations, color space conversion, vector quantization and the like.
A sound system is constituted by sound processing unit SPU 571 for generating musical or other sound effects on the basis of instructions from MPU 512, sound buffer 573 into which waveform data may be recorded by SPU 571, and speaker 575 for outputting the musical or other sound effects generated by SPU 571. It should be understood that speaker 575 may be incorporated as part of monitor 110 or may be provided as a separate audio line-out connection attached to external speaker 575.
Communications interface 540 is also provided, connected to BUS 530, which is an interface having functions of input/output of digital data, and for input of digital contents according to the present invention. For example, through communications interface 540, user input data may be transmitted to, and status data received from, a server terminal on a network in order to accommodate on-line video gaming applications. Input device 532 (also known as a controller) for input of data (e.g. key input data or coordinate data) with respect to the console 510, and optical disk device 536 for reproduction of the contents of optical disk 569, for example a CD-ROM or the like on which various programs and data (i.e. data concerning objects, texture data and the like) are recorded, are connected to input/output port 524.
The present invention further includes digital video camera 105 which is connected to input/output port 524. Input/output port 524 may be embodied by one or more input interfaces, including serial and USB interfaces, wherein digital video camera 105 may advantageously make use of the USB input or any other conventional interface appropriate for use with camera 105.
The above-mentioned image processor 520 includes a rendering engine 570, interface 572, image memory 574 and a display control device 576 (e.g. a programmable CRT controller, or the like). The rendering engine 570 executes operations for rendering of predetermined image data in the image memory, through memory interface 572, and in correspondence with rendering commands which are supplied from MPU 512. The rendering engine 570 has the capability of rendering, in real time, image data of 320×240 pixels or 640×480 pixels, conforming to, for example, NTSC or PAL standards, and more specifically, at a rate greater than ten to several tens of times per interval of from 1/60 to 1/30 of a second.
BUS 578 is connected between memory interface 572 and the rendering engine 570, and a second BUS 580 is connected between memory interface 572 and the image memory 574. First BUS 578 and second BUS 580, respectively, have a bit width of, for example 128 bits, and the rendering engine 570 is capable of executing high speed rendering processing with respect to the image memory. Image memory 574 employs a unified memory structure in which, for example, a texture rendering region and a display rendering region, can be set in a uniform area.
Display controller 576 is structured so as to write the texture data which has been retrieved from optical disk 569 through optical disk device 536, or texture data which has been created on main memory 514, to the texture rendering region of image memory 574, via memory interface 572. Image data which has been rendered in the display rendering region of image memory 574 is read out via memory interface 572, outputting the same to monitor 110 whereby it is displayed on a screen thereof.
Initially, the pixel data input from the camera is supplied to game console 510 through input/output port interface 524, enabling the following processes to be performed thereon. First, as each pixel of the image is sampled, for example, on a raster basis, a color segmentation processing step S201 is performed, whereby the color of each pixel is determined and the image is divided into various two-dimensional segments of different colors. Next, for certain embodiments, a color transition localization step S203 is performed, whereby regions where segments of different colors adjoin are more specifically determined, thereby defining the locations of the image in which distinct color transitions occur. Then, a step for geometry processing S205 is performed which, depending on the embodiment, comprises either an edge detection process or performing calculations for area statistics, to thereby define in algebraic or geometric terms the lines, curves and/or polygons corresponding to the edges of the object of interest.
The three-dimensional position and orientation of the object are calculated in step S207, according to algorithms which are to be described in association with the subsequent descriptions of preferred embodiments of the present invention. The data of three-dimensional position and orientation also undergoes a processing step S209 for Kalman filtering to improve performance. Such processing is performed to estimate where the object is going to be at a point in time, and to reject spurious measurements that could not be possible, and therefore are considered to lie outside the true data set. Another reason for Kalman filtering is that the camera 105 may produce images at 30 Hz, whereas an example display runs at 60 Hz, so Kalman filtering may fill the gaps in the data used for controlling action in the game program. Smoothing of discrete data via Kalman filtering is well known in the field of computer vision and hence will not be elaborated on further.
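By way of a simplified, non-limiting illustration of such filtering, the scalar filter below smooths noisy per-frame position measurements; a practical tracker would typically also model velocity so that predictions can fill the display frames that fall between camera frames, and all class names, parameter values and sample data are assumptions for this example.

    class Kalman1D:
        """Minimal scalar Kalman filter for one tracked coordinate."""

        def __init__(self, process_var=1e-3, measurement_var=1e-2):
            self.x = 0.0   # estimated position
            self.p = 1.0   # estimate variance
            self.q = process_var
            self.r = measurement_var

        def predict(self):
            self.p += self.q   # uncertainty grows while no measurement arrives
            return self.x

        def update(self, z):
            k = self.p / (self.p + self.r)   # Kalman gain
            self.x += k * (z - self.x)
            self.p *= (1.0 - k)
            return self.x

    kf = Kalman1D()
    for measured in (0.00, 0.11, 0.19, 0.32, 0.41):   # 30 Hz camera samples
        kf.predict()
        smoothed = kf.update(measured)
        # Between camera frames, the current estimate (or a velocity-based
        # extrapolation) can supply the position used by the 60 Hz display update.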
Specifically, image capture device 105 includes a digital image sensor for generating image data representing an image formed by light impacting the sensor after passing through a lens as is generally known in the art. It is also possible that image capture device 105 comprises an analog video camera generating an analog signal representing the image formed by light. In the latter case, the analog signal is converted to a digital representation of the image prior to processing by recognizer 710. Image data representing successive two dimensional images of the three dimensional space 702 is passed to recognizer 710. Recognizer 710 may, in one embodiment, perform various processing steps as described above with reference to
In addition to position information, recognizer 710 may identify commands received from object 705. Commands can be interpreted from transmissions/deformation, sound and light generation etc., of object 705, for example, as described in related U.S. patent application Ser. No. 10/207,677, filed Jul. 27, 2002, entitled “MAN-MACHINE INTERFACE USING A DEFORMABLE DEVICE”; U.S. patent application Ser. No. 10/650,409, filed Aug. 27, 2003, entitled “AUDIO INPUT SYSTEM”; and U.S. patent application Ser. No. 10/759,782, filed Jan. 16, 2004 entitled “METHOD AND APPARATUS FOR LIGHT INPUT DEVICE”, the above listed patent applications being incorporated herein by reference in their entireties. Commands received from object 705 are interpreted by recognizer 710, and data corresponding to the received commands may be communicated to application 714. Application 714 may be a game application or other computer application that requested or is otherwise receptive to user input from image capture device 105. In one embodiment, mapper 712 may input absolute coordinates from recognizer 710 and map these coordinates to output coordinates that are scaled in accordance with a gearing amount. In another embodiment, mapper 712 receives successive coordinate information from recognizer 710 and converts the changes in coordinate information to vector movements of object 705. For example, if object 705 moves a distance x1 from time t1 to time t2, then a vector (x1, 0, 0) may be generated and passed to application 714. Time t1 to time t2 may be the interval of time between successive frames of the video generated by image capture device 105. Mapper 712 may scale the vector according to a scaling algorithm, e.g., by multiplying the vector by a gearing amount, G. In a different embodiment, each coordinate is multiplied by a corresponding gearing factor, e.g., Gx, Gy, and Gz. Thus, a corresponding motion of virtual object 705′ as shown in display 110, may be a distance x2, which is different from x1.
Application 714 may vary the gearing amount in accordance with commands 713 received from recognizer 710, or in accordance with the normal operation of the software, which may send mapper 712 gearing data causing the gearing amount to change. Gearing data may be sent to mapper 712 in response to a user command, various events, or modes of operation of application 714. Thus, the gearing amount may be varied in real time in response to user commands or as controlled by software. Mapper 712 may therefore transmit an output vector of the motion of object 705 to application 714, the output varying in relation to change in position of object 705 in space 702, and the gearing amount. Application 714, which in one embodiment is a video game, translates the output vector into a corresponding action which may then be displayed on display 110.
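A non-limiting sketch of the mapper behavior described above follows; it converts successive positions into movement vectors, applies per-axis gearing factors (denoted Gx, Gy and Gz above), and allows the gearing to be changed at any time. The class and method names are assumptions for this example and do not correspond to elements of the drawings.

    class Mapper:
        """Converts successive input-device positions into geared movement vectors."""

        def __init__(self, gearing=(1.0, 1.0, 1.0)):
            self.gearing = gearing
            self.last_position = None

        def set_gearing(self, gx, gy, gz):
            """Called by the application in response to commands, events, or modes of operation."""
            self.gearing = (gx, gy, gz)

        def map_position(self, position):
            """Return the geared movement vector between successive positions."""
            if self.last_position is None:
                self.last_position = position
                return (0.0, 0.0, 0.0)
            delta = tuple(c - p for c, p in zip(position, self.last_position))
            self.last_position = position
            return tuple(d * g for d, g in zip(delta, self.gearing))

    mapper = Mapper()
    mapper.map_position((0.0, 0.0, 0.0))          # first frame establishes a baseline
    mapper.set_gearing(2.0, 2.0, 1.0)             # gearing changed at run time
    print(mapper.map_position((0.1, 0.0, 0.0)))   # movement x1 = 0.1 becomes x2 = 0.2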
LED array 905 may generate infrared or visible light. Image capture device 105 (
In the transmission mode, other information, including commands or state information may be transmitted by the controller or device LEDs and according to known encoding and modulation schemes. On the receiver side, a video analyzer coupled to the video capture device may sync with and track the state of the LEDs and decode the information and controller movements. It is known that higher bandwidth may be achieved by modulating data across frames in the transmission mode cycle.
User interaction with interface 902 may cause one or more of LEDs in LED array 905 to modulate and/or change color. For example, as a user moves a joystick, LEDs may change brightness or transmit information. The changes in intensity or color can be monitored by the computer system and provided to a gaming program as an intensity value. Furthermore, each button may be mapped to a change in color or intensity of one or more of the LEDs in LED array 905.
As controller 900 is moved about in three-dimensional space and rotated in one of a roll, yaw, or pitch direction, image capture device 105 in conjunction with computer system 102 may be capable of identifying these changes and generating a two dimensional vector, for describing movement on the image plane, or a three dimensional vector, for describing movement in three dimensional space. The vector can be provided as a series of coordinates describing relative movement and/or an absolute position with respect to the image capture device 105. As would be evident to those skilled in the art, movement on a plane perpendicular to the line of sight of image capture device 105 (the image plane) can be identified by an absolute position within the image capture zone, while movement of controller closer to image capture device 105 can be identified by the LED array appearing to spread out.
The rectangular configuration of LEDs 905 allows movement of controller 900 on three axes and rotation about each axis to be detected. Although only four LEDs are shown, it should be recognized that this is for exemplary purposes only, and any number of LEDs in any configuration would be possible. As controller 900 is pitched forward or backward, the top and bottom LEDs will get closer to each other while the left and right LEDs remain the same distance apart. Likewise, as the controller yaws left or right, the left and right LEDs will appear to approach each other while the top and bottom LEDs remain the same distance apart. Rolling motion of the controller can be detected by identifying the orientation of the LEDs on the image plane. As the controller moves closer to image capture device 105 along the line of sight thereof, all the LEDs will appear to be closer to each other. Finally, the controller's movement along the image plane can be tracked by identifying the location of the LEDs on the image plane, thereby identifying movement along respective x and y axes.
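The sketch below illustrates these geometric cues under assumed pixel coordinates and nominal spacings: the vertical LED spacing shrinks with pitch, the horizontal spacing shrinks with yaw, the tilt of the left-right pair indicates roll, the overall spread indicates distance, and the center of the four LEDs gives translation on the image plane. All names and values are invented for this illustration.

    import math

    def estimate_pose(top, bottom, left, right, nominal_v=40.0, nominal_h=40.0):
        """Each argument is the (x, y) image position of one LED in the rectangular array."""
        vertical = math.dist(top, bottom)       # shrinks as the controller pitches
        horizontal = math.dist(left, right)     # shrinks as the controller yaws
        pitch_cue = vertical / nominal_v
        yaw_cue = horizontal / nominal_h
        roll = math.atan2(right[1] - left[1], right[0] - left[0])    # image-plane tilt
        spread = (vertical + horizontal) / (nominal_v + nominal_h)   # larger means closer
        center = ((top[0] + bottom[0] + left[0] + right[0]) / 4.0,
                  (top[1] + bottom[1] + left[1] + right[1]) / 4.0)   # x/y translation
        return pitch_cue, yaw_cue, roll, spread, center

    # Controller facing the camera squarely at the nominal distance
    print(estimate_pose(top=(100, 80), bottom=(100, 120), left=(80, 100), right=(120, 100)))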
Controller 900 may also include a speaker 915 for generating audible or ultrasonic sound. Speaker 915 may generate sound effects for increased interactivity, or can transmit commands issued from interface 902 to a computer system having a microphone or other elements for receiving the transmissions.
In one example, the user can take a few practice swings, and then the computer can map out a number of example time slots corresponding to the user's actual swing ability. Then, the user can custom assign specific gearing to each time interval, depending on how the user wants to impact his game interactivity. Once the gearing is set, the user's movement of the bat 1605 can then be mapped to the movement of the bat 1605′ (e.g., game object). Again, it should be noted that the gearing can be preset by the game during different action, set during game play by the user, and may be adjusted in real time during game play.
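As a non-limiting illustration of assigning gearing to time intervals of a swing, the sketch below looks up a gearing value from the elapsed time into the motion; the interval boundaries and gearing values are invented for this example and would in practice be derived from the user's practice swings.

    SWING_GEARING = [        # (start_s, end_s, gearing)
        (0.0, 0.2, 0.5),     # wind-up: damped
        (0.2, 0.5, 3.0),     # contact zone: amplified
        (0.5, 1.0, 1.0),     # follow-through: one-to-one
    ]

    def gearing_at(elapsed):
        """Return the gearing configured for the given elapsed time into the swing."""
        for start, end, gearing in SWING_GEARING:
            if start <= elapsed < end:
                return gearing
        return 1.0           # default outside the configured intervals

    print(gearing_at(0.30))  # 3.0 during the contact zone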
In operation 2006, a determination is made as to whether the position has changed. If the position has not changed, then the procedure returns to operation 2004. In one embodiment, operation 2004 is delayed until a new frame of image data is received from the image capture device. If, at operation 2006, it is determined that the position has changed, then the procedure flows to operation 2008.
In operation 2008, a vector of movement is calculated. The movement vector can be any number of dimensions. For example, if the movement is of an input object in three-dimensional space, the movement vector may describe the movement as a three dimensional vector. However, if the movement is of a one dimensional control input, such as a steering wheel, then the movement vector is a one dimensional vector that describes the amount of rotation of the steering wheel. After determining the movement vector the procedure flows to operation 2010.
In operation 2010, the movement vector is multiplied by a current gearing amount to determine an input vector. The current gearing amount may be a scalar quantity or a multidimensional value. If it is a scalar quantity, then all dimensions of the movement vector are multiplied by the same amount. If the gearing amount is multidimensional, then each dimension of the movement vector is multiplied by a corresponding dimension of the gearing amount. The gearing amount may vary in response to user input and may be under software control. Therefore, the current gearing amount may change from moment to moment. After multiplying the movement vector by the current gearing amount, the procedure flows to operation 2012.
In operation 2012, a new position of a virtual object is calculated using the input vector calculated in operation 2010. The new position may be a camera position, or an object position such as a virtual steering wheel. The virtual object may not be displayed on a display screen. Once the new position of the virtual object is calculated, the procedure flows to operation 2014 wherein data representing the new position is passed to the application program. The procedure then ends as indicated by finish block 2016. It should be noted, however, that this flow is only exemplary in nature, and other alternatives may be possible.
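The loop below is a compact, non-limiting sketch of operations 2004 through 2014 for a scalar gearing amount; the callables passed in stand for the image-based position reader, the current gearing source, and the application update, and all names are assumptions made for this example.

    def run_gearing_loop(read_position, get_gearing, apply_to_object, frames):
        last = read_position()                                  # operation 2004
        for _ in range(frames):
            current = read_position()                           # operation 2004 (next frame)
            if current == last:                                 # operation 2006: no change
                continue
            movement = [c - l for c, l in zip(current, last)]   # operation 2008
            gearing = get_gearing()                             # may change from moment to moment
            geared = [m * gearing for m in movement]            # operation 2010
            apply_to_object(geared)                             # operations 2012-2014
            last = current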
In one embodiment, the operations will include the detection of an input device and the movement of the input device. The movement of the input device is determined by a camera that is viewing the input device. By user control or preset or pre-programmed settings, a gearing value is applied. If set by the user, the gearing value can be selected, for example, by allowing the user to hit a button on the input device (e.g., a controller). Depending on the gearing amount, movement control is mapped to an object of a computer game. If the user is using the input device to control an action figure of a game, then the gearing that is set or controlled to be set will affect how the movement by the input device is mapped to the movement by the action figure of the computer game. Therefore, the gearing and the changes in gearing allow for the dynamic application of a mapped response by the object that may be part of a computer game.
In various embodiments, the image processing functions described above for determining the intensity value, controller player number, orientation and/or position of one or more input objects including controllers is carried out in a process executing on a computer system. The computing system is also executing a main process, referred to herein as an application program, which may be a gaming application, that requests or is otherwise receptive to the data generated from the image or audio processing, such data comprising controller player number, orientation and/or position of one or more input objects including controllers, controller actuation, etc. In various embodiments, the process performing the image and/or audio processing functions is a driver for a video camera or video/audio monitoring device, the driver providing the data to the main process via any type of inter-process communication which may be implementation specific as generally known and understood in the art. The process performing image or audio processing executes on the same processor or a different processor as the one executing the main process which is the gaming software or other application program. It is also possible to have a common process for both image or audio processing and game functionality in the same process, e.g., using a procedure call. Therefore, while it may be stated herein that the input vector or other information is provided “to the program” it should be recognized that the invention encompasses providing such data to one routine of a process using a procedure call or other software function such that a single process can both perform image processing functionality as well as gaming functionality, as well as separating the functions into different processes whereby one or more processes, which may execute on a common processor core or multiple processor cores, perform image and/or audio processing as described herein and a separate process performs gaming functions.
The present invention may be used as presented herein or in combination with other user input mechanisms and notwithstanding mechanisms that track the angular direction of the sound and/or mechanisms that track the position of the object actively or passively, mechanisms using machine vision, combinations thereof and where the object tracked may include ancillary controls or buttons that manipulate feedback to the system and where such feedback may include but is not limited to light emission from light sources, sound distortion means, or other suitable transmitters and modulators as well as buttons, pressure pad, etc. that may influence the transmission or modulation of the same, encode state, and/or transmit commands from or to the device being tracked.
The invention may be practiced with other computer system configurations including game consoles, gaming computers or computing devices, hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a network. For instance, on-line gaming systems and software may also be used.
With the above embodiments in mind, it should be understood that the invention may employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms, such as producing, identifying, determining, or comparing.
Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus may be specially constructed for the required purposes, such as the carrier network discussed above, or it may be a general purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general purpose machines may be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, FLASH based memory, CD-ROMs, CD-Rs, CD-RWs, DVDs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can also be distributed over a network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Still further, although gearing has been discussed in relation to video games, it should be understood that the gearing can be applied to any computer controlled environment. In one example, the gearing can be associated with a computer input device that allows for the interaction, selection, or input of information. Applying different gearing during different input or interactive operations can enable further degrees of operation not normally found in environments that have pre-configured control settings. Accordingly, the embodiments of gearing, as defined herein, should be given a broad encompassing application.
Once the gearing is determined, the gearing can be applied to a gesture that may be communicated to a computer program. As noted above, tracking of a gesture or input device can be accomplished via image analysis, inertial analysis, or audible analysis. Examples of gestures include, but are not limited to throwing an object such as a ball, swinging an object such as a bat or golf club, pumping a hand pump, opening or closing a door or window, turning a steering wheel or other vehicle control, martial arts moves such as punches, sanding movements, wax-on wax-off, paint the house, shakes, rattles, rolls, football pitches, baseball pitches, turning knob movements, 3D/2D mouse movements, scrolling movements, movements with known profiles, any recordable movement, movements along any vector back and forth i.e. pump the tire but at some arbitrary orientation in space, movements along a path, movements having precise stop and start times, any time based user manipulation that can be recorded, tracked and repeated within the noise floor, splines, and the like. Each of these gestures may be pre-recorded from path data and stored as a time-based model. The gearing, therefore, can be applied on any one of these gestures, depending on the degree of gearing set by the user or program.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 15/283,131, entitled, “METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO VISUAL TRACKING”, filed on Sep. 30, 2016, which is a continuation of U.S. patent application Ser. No. 11/382,036, entitled, “METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO VISUAL TRACKING”, filed on May 6, 2006; U.S. patent application Ser. No. 11/382,036 is a continuation in part (CIP) of U.S. patent application Ser. No. 10/207,677, entitled, “MAN-MACHINE INTERFACE USING A DEFORMABLE DEVICE”, filed on Jul. 27, 2002, now U.S. Pat. No. 7,102,615; U.S. patent application Ser. No. 10/650,409, entitled, “AUDIO INPUT SYSTEM”, filed on Aug. 27, 2003, now U.S. Pat. No. 7,613,310; U.S. patent application Ser. No. 10/663,236, entitled “METHOD AND APPARATUS FOR ADJUSTING A VIEW OF A SCENE BEING DISPLAYED ACCORDING TO TRACKED HEAD MOTION”, filed on Sep. 15, 2003, now U.S. Pat. No. 7,883,415; U.S. patent application Ser. No. 10/759,782, entitled “METHOD AND APPARATUS FOR LIGHT INPUT DEVICE”, filed on Jan. 16, 2004, now U.S. Pat. No. 7,623,115; U.S. patent application Ser. No. 10/820,469, entitled “VIDEO GAME CONTROLLER WITH NOISE CANCELING LOGIC”, filed on Apr. 7, 2004, now U.S. Pat. No. 7,970,147; and U.S. patent application Ser. No. 11/301,673, “METHODS AND SYSTEMS FOR ENABLING DIRECTION DETECTION WHEN INTERFACING WITH A COMPUTER PROGRAM”, filed on Dec. 12, 2005, now U.S. Pat. No. 7,646,372; U.S. patent application Ser. No. 11/381,729, to Xiao Dong Mao, entitled “ULTRA SMALL MICROPHONE ARRAY”, filed on May 4, 2006, now U.S. Pat. No. 7,809,145; application Ser. No. 11/381,728, to Xiao Dong Mao, entitled “ECHO AND NOISE CANCELLATION”, filed on May 4, 2006, now U.S. Pat. No. 7,545,926; U.S. patent application Ser. No. 11/381,725, to Xiao Dong Mao, entitled “METHODS AND APPARATUS FOR TARGETED SOUND DETECTION”, filed on May 4, 2006, now U.S. Pat. No. 7,783,061; U.S. patent application Ser. No. 11/381,727, to Xiao Dong Mao, entitled “NOISE REMOVAL FOR ELECTRONIC DEVICE WITH FAR FIELD MICROPHONE ON CONSOLE”, filed on May 4, 2006, now U.S. Pat. No. 7,697,700; U.S. patent application Ser. No. 11/381,724, to Xiao Dong Mao, entitled “METHODS AND APPARATUS FOR TARGETED SOUND DETECTION AND CHARACTERIZATION”, filed on May 4, 2006, now U.S. Pat. No. 8,073,157; U.S. patent application Ser. No. 11/381,721, to Xiao Dong Mao, entitled “CONTROLLING ACTIONS IN A VIDEO GAME UNIT”, filed on May 4, 2006, now U.S. Pat. No. 8,947,347, all of which are hereby incorporated by reference. This application is also related to co-pending application Ser. No. 11/418,988, to Xiao Dong Mao, entitled “METHODS AND APPARATUSES FOR ADJUSTING A LISTENING AREA FOR CAPTURING SOUNDS”, filed on May 4, 2006, now U.S. Pat. No. 8,160,269, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/418,989, to Xiao Dong Mao, entitled “METHODS AND APPARATUSES FOR CAPTURING AN AUDIO SIGNAL BASED ON VISUAL IMAGE”, filed on May 4, 2006, now U.S. Pat. No. 8,139,793, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/429,047, to Xiao Dong Mao, entitled “METHODS AND APPARATUSES FOR CAPTURING AN AUDIO SIGNAL BASED ON A LOCATION OF THE SIGNAL” filed on May 4, 2006, now U.S. Pat. No. 8,233,642, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 
11/429,133, to Richard Marks et al., entitled “SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING”, filed on May 4, 2006, now U.S. Pat. No. 7,760,248, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/429,414, to Richard Marks et al., entitled “Computer Image and Audio Processing of Intensity and Input Devices for Interfacing With A Computer Program”, filed on May 4, 2006, now U.S. Pat. No. 7,627,139, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,031, entitled “MULTI-INPUT GAME CONTROL MIXER”, filed on the same day as this application, now U.S. Pat. No. 7,918,733, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,032, entitled “SYSTEM FOR TRACKING USER MANIPULATIONS WITHIN AN ENVIRONMENT”, filed on the same day as this application, now U.S. Pat. No. 7,850,526, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,033, entitled “SYSTEM, METHOD, AND APPARATUS FOR THREE-DIMENSIONAL INPUT CONTROL”, filed on the same day as this application, now U.S. Pat. No. 8,686,939, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,035, entitled “INERTIALLY TRACKABLE HAND-HELD CONTROLLER”, filed on the same day as this application, now U.S. Pat. No. 8,797,260, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,041, entitled “METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO INERTIAL TRACKING”, filed on the same day as this application, now U.S. Pat. No. 7,352,359, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,038, entitled “METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO ACOUSTICAL TRACKING”, filed on the same day as this application, now U.S. Pat. No. 7,352,358, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,040, entitled “METHOD AND SYSTEM FOR APPLYING GEARING EFFECTS TO MULTI-CHANNEL MIXED INPUT”, filed on the same day as this application, now U.S. Pat. No. 7,391,409, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,034, entitled “SCHEME FOR DETECTING AND TRACKING USER MANIPULATION OF A GAME CONTROLLER BODY”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,037, entitled “SCHEME FOR TRANSLATING MOVEMENTS OF A HAND-HELD CONTROLLER INTO INPUTS FOR A SYSTEM”, filed on the same day as this application, now U.S. Pat. No. 8,313,380, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application Ser. No. 11/382,043, entitled “DETECTABLE AND TRACKABLE HAND-HELD CONTROLLER”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference. 
This application is also related to co-pending application Ser. No. 11/382,039, entitled “METHOD FOR MAPPING MOVEMENTS OF A HAND-HELD CONTROLLER TO GAME COMMANDS”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application No. 29/259,349, entitled “CONTROLLER WITH INFRARED PORT ((DESIGN PATENT))”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application No. 29/259,350, entitled “CONTROLLER WITH TRACKING SENSORS ((DESIGN PATENT))”, filed on the same day as this application, now U.S. Pat. No. D621,836, the entire disclosures of which are incorporated herein by reference. This application is also related to application No. 60/798,031, entitled “DYNAMIC TARGET INTERFACE”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference. This application is also related to co-pending application No. 29/259,348, entitled “FACE OF A TRACKED CONTROLLER DEVICE ((DESIGN))”, filed on the same day as this application, the entire disclosures of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3943277 | Everly et al. | Mar 1976 | A |
4263504 | Thomas | Apr 1981 | A |
4313227 | Eder | Jan 1982 | A |
4558864 | Medwedeff | Dec 1985 | A |
4565999 | King et al. | Jan 1986 | A |
4787051 | Olson | Nov 1988 | A |
4802227 | Elko et al. | Jan 1989 | A |
4823001 | Kobayashi et al. | Apr 1989 | A |
4843568 | Krueger et al. | Jun 1989 | A |
4963858 | Chien | Oct 1990 | A |
5034986 | Karmann et al. | Jul 1991 | A |
5055840 | Bartlett | Oct 1991 | A |
5111401 | Everett et al. | May 1992 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5144594 | Gilchrist | Sep 1992 | A |
5195179 | Tokunaga | Mar 1993 | A |
5260556 | Lake et al. | Nov 1993 | A |
5297061 | Dementhon et al. | Mar 1994 | A |
5335011 | Addeo et al. | Aug 1994 | A |
5394168 | Smith, III et al. | Feb 1995 | A |
5426450 | Drumm | Jun 1995 | A |
5435554 | Lipson | Jul 1995 | A |
5453758 | Sato | Sep 1995 | A |
5455685 | Mori | Oct 1995 | A |
5473701 | Cezanne et al. | Dec 1995 | A |
5485273 | Mark et al. | Jan 1996 | A |
5517333 | Tamura et al. | May 1996 | A |
5528265 | Harrison | Jun 1996 | A |
5534917 | MacDougall | Jul 1996 | A |
5543818 | Scott | Aug 1996 | A |
5554980 | Hashimoto et al. | Sep 1996 | A |
5557684 | Wang et al. | Sep 1996 | A |
5563988 | Maes et al. | Oct 1996 | A |
5568928 | Munson et al. | Oct 1996 | A |
5581276 | Cipolla et al. | Dec 1996 | A |
5583478 | Florent et al. | Dec 1996 | A |
5586231 | Florent et al. | Dec 1996 | A |
5611000 | Szeliski et al. | Mar 1997 | A |
5611731 | Bouton et al. | Mar 1997 | A |
5616078 | Oh | Apr 1997 | A |
5638228 | Thomas, III | Jun 1997 | A |
5649021 | Matey et al. | Jul 1997 | A |
5675825 | Dreyer et al. | Oct 1997 | A |
5675828 | Stoel et al. | Oct 1997 | A |
5677710 | Thompson-Rohrlich | Oct 1997 | A |
5706364 | Kopec et al. | Jan 1998 | A |
5768415 | Jagadish et al. | Jun 1998 | A |
5796354 | Cartabiano et al. | Aug 1998 | A |
5818424 | Korth | Oct 1998 | A |
5846086 | Bizzi et al. | Dec 1998 | A |
5850222 | Cone | Dec 1998 | A |
5850473 | Andersson | Dec 1998 | A |
5861910 | McGarry et al. | Jan 1999 | A |
5870100 | DeFreitas | Feb 1999 | A |
5883616 | Koizumi et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5900863 | Numazaki | May 1999 | A |
5913727 | Ahdoot | Jun 1999 | A |
5914723 | Gajewska | Jun 1999 | A |
5917493 | Tan et al. | Jun 1999 | A |
5917936 | Katto | Jun 1999 | A |
5923306 | Smith et al. | Jul 1999 | A |
5923318 | Zhai et al. | Jul 1999 | A |
5929444 | Leichner | Jul 1999 | A |
5930383 | Netzer | Jul 1999 | A |
5930741 | Kramer | Jul 1999 | A |
5937081 | O'Brill et al. | Aug 1999 | A |
5959596 | McCarten et al. | Sep 1999 | A |
5963250 | Parker et al. | Oct 1999 | A |
5978722 | Takasuka et al. | Nov 1999 | A |
5993314 | Dannenberg et al. | Nov 1999 | A |
6009210 | Kang | Dec 1999 | A |
6014167 | Suito et al. | Jan 2000 | A |
6021219 | Andersson et al. | Feb 2000 | A |
6022274 | Takeda et al. | Feb 2000 | A |
6031545 | Ellenby et al. | Feb 2000 | A |
6031934 | Ahmad et al. | Feb 2000 | A |
6037942 | Millington | Mar 2000 | A |
6044181 | Szeliski et al. | Mar 2000 | A |
6049619 | Anandan et al. | Apr 2000 | A |
6056640 | Schaaij | May 2000 | A |
6057909 | Yahav et al. | May 2000 | A |
6061055 | Marks | May 2000 | A |
6072494 | Nguyen | Jun 2000 | A |
6075895 | Qiao et al. | Jun 2000 | A |
6078789 | Bodenmann et al. | Jun 2000 | A |
6091905 | Yahav et al. | Jul 2000 | A |
6094625 | Ralston | Jul 2000 | A |
6097369 | Wambach | Aug 2000 | A |
6100517 | Yahav et al. | Aug 2000 | A |
6100895 | Miura et al. | Aug 2000 | A |
6101289 | Kellner | Aug 2000 | A |
6115052 | Freeman et al. | Sep 2000 | A |
6134346 | Berman et al. | Oct 2000 | A |
6144367 | Berstis | Nov 2000 | A |
6151009 | Kanade et al. | Nov 2000 | A |
6157368 | Faeger | Dec 2000 | A |
6160540 | Fishkin et al. | Dec 2000 | A |
6166744 | Jaszlics et al. | Dec 2000 | A |
6173059 | Huang et al. | Jan 2001 | B1 |
6175343 | Mitchell et al. | Jan 2001 | B1 |
6184863 | Sibert et al. | Feb 2001 | B1 |
6191773 | Maruno et al. | Feb 2001 | B1 |
6195104 | Lyons | Feb 2001 | B1 |
6215898 | Woodfill et al. | Apr 2001 | B1 |
6243074 | Fishkin et al. | Jun 2001 | B1 |
6243491 | Andersson | Jun 2001 | B1 |
6275213 | Tremblay et al. | Aug 2001 | B1 |
6281930 | Parker et al. | Aug 2001 | B1 |
6282362 | Murphy et al. | Aug 2001 | B1 |
6295064 | Yamaguchi | Sep 2001 | B1 |
6297838 | Chang et al. | Oct 2001 | B1 |
6304267 | Sata | Oct 2001 | B1 |
6307549 | King et al. | Oct 2001 | B1 |
6307568 | Rom | Oct 2001 | B1 |
6323839 | Fukuda et al. | Nov 2001 | B1 |
6323942 | Bamji | Nov 2001 | B1 |
6326901 | Gonzales | Dec 2001 | B1 |
6327073 | Yahav et al. | Dec 2001 | B1 |
6331911 | Manassen et al. | Dec 2001 | B1 |
6346929 | Fukushima et al. | Feb 2002 | B1 |
6351661 | Cosman | Feb 2002 | B1 |
6371849 | Togami | Apr 2002 | B1 |
6375572 | Masuyama et al. | Apr 2002 | B1 |
6392644 | Miyata et al. | May 2002 | B1 |
6393142 | Swain et al. | May 2002 | B1 |
6394897 | Togami | May 2002 | B1 |
6400374 | Lanier | Jun 2002 | B2 |
6409602 | Wiltshire et al. | Jun 2002 | B1 |
6411392 | Bender et al. | Jun 2002 | B1 |
6411744 | Edwards | Jun 2002 | B1 |
6417836 | Kumar et al. | Jul 2002 | B1 |
6441825 | Peters | Aug 2002 | B1 |
6473516 | Kawaguchi et al. | Oct 2002 | B1 |
6498860 | Sasaki et al. | Dec 2002 | B1 |
6504535 | Edmark | Jan 2003 | B1 |
6513160 | Dureau | Jan 2003 | B2 |
6516466 | Jackson | Feb 2003 | B1 |
6519359 | Nafis et al. | Feb 2003 | B1 |
6533420 | Eichenlaub | Mar 2003 | B1 |
6542927 | Rhoads | Apr 2003 | B2 |
6545706 | Edwards et al. | Apr 2003 | B1 |
6546153 | Hoydal | Apr 2003 | B1 |
6556704 | Chen | Apr 2003 | B1 |
6577748 | Chang | Jun 2003 | B2 |
6580414 | Wergen et al. | Jun 2003 | B1 |
6580415 | Kato et al. | Jun 2003 | B1 |
6587573 | Stam et al. | Jul 2003 | B1 |
6587835 | Treyz et al. | Jul 2003 | B1 |
6593956 | Potts et al. | Jul 2003 | B1 |
6595642 | Wirth | Jul 2003 | B2 |
6597342 | Haruta | Jul 2003 | B1 |
6621938 | Tanaka et al. | Sep 2003 | B1 |
6626756 | Sugimoto | Sep 2003 | B2 |
6628265 | Hwang | Sep 2003 | B2 |
6661914 | Dufour | Dec 2003 | B2 |
6674415 | Nakamura et al. | Jan 2004 | B2 |
6870526 | Zngf et al. | Mar 2005 | B2 |
6873747 | Askary | Mar 2005 | B2 |
6881147 | Naghi et al. | Apr 2005 | B2 |
6884171 | Eck et al. | Apr 2005 | B2 |
6890262 | Oishi et al. | May 2005 | B2 |
6917688 | Yu et al. | Jul 2005 | B2 |
6919824 | Lee | Jul 2005 | B2 |
6924787 | Kramer et al. | Aug 2005 | B2 |
6928180 | Stam et al. | Aug 2005 | B2 |
6930725 | Hayashi | Aug 2005 | B1 |
6931125 | Smallwood | Aug 2005 | B2 |
6931596 | Gutta et al. | Aug 2005 | B2 |
6943776 | Ehrenburg | Sep 2005 | B2 |
6945653 | Kobori et al. | Sep 2005 | B2 |
6947576 | Stam et al. | Sep 2005 | B2 |
6951515 | Ohshima et al. | Oct 2005 | B2 |
6952198 | Hansen | Oct 2005 | B2 |
6965362 | Ishizuka | Nov 2005 | B1 |
6970183 | Monroe | Nov 2005 | B1 |
6990639 | Wilson | Jan 2006 | B2 |
7006009 | Newman | Feb 2006 | B2 |
7016411 | Azuma et al. | Mar 2006 | B2 |
7016532 | Boncyk et al. | Mar 2006 | B2 |
7023475 | Bean et al. | Apr 2006 | B2 |
7039199 | Rui | May 2006 | B2 |
7039253 | Matsuoka et al. | May 2006 | B2 |
7042440 | Pryor et al. | May 2006 | B2 |
7043056 | Edwards et al. | May 2006 | B2 |
7054452 | Ukita | May 2006 | B2 |
7059962 | Watashiba | Jun 2006 | B2 |
7061507 | Tuomi et al. | Jun 2006 | B1 |
7071914 | Marks | Jul 2006 | B1 |
7084887 | Sato et al. | Aug 2006 | B1 |
7090352 | Kobor et al. | Aug 2006 | B2 |
7098891 | Pryor et al. | Aug 2006 | B1 |
7102615 | Marks | Sep 2006 | B2 |
7106366 | Parker et al. | Sep 2006 | B2 |
7107196 | Waterston | Sep 2006 | B2 |
7113635 | Robert et al. | Sep 2006 | B2 |
7116310 | Evans et al. | Oct 2006 | B1 |
7116330 | Marshall et al. | Oct 2006 | B2 |
7116342 | Dengler et al. | Oct 2006 | B2 |
7121946 | Paul et al. | Oct 2006 | B2 |
7139767 | Taylor et al. | Nov 2006 | B1 |
7148922 | Shimada | Dec 2006 | B2 |
7156311 | Attia et al. | Jan 2007 | B2 |
7158118 | Liberty | Jan 2007 | B2 |
7161634 | Long | Jan 2007 | B2 |
7164413 | Davis et al. | Jan 2007 | B2 |
7174312 | Harper et al. | Feb 2007 | B2 |
7183929 | Antebi et al. | Feb 2007 | B1 |
7212308 | Morgan | May 2007 | B2 |
7215323 | Gombert et al. | May 2007 | B2 |
7223173 | Masuyama et al. | May 2007 | B2 |
7224384 | Iddan et al. | May 2007 | B1 |
7227526 | Hildreth et al. | Jun 2007 | B2 |
7227976 | Jung et al. | Jun 2007 | B1 |
7239301 | Liberty et al. | Jul 2007 | B2 |
7245273 | Eberl et al. | Jul 2007 | B2 |
7259375 | Tichit et al. | Aug 2007 | B2 |
7262760 | Liberty | Aug 2007 | B2 |
7263462 | Funge et al. | Aug 2007 | B2 |
7274305 | Luttrell | Sep 2007 | B1 |
7277526 | Rifkin et al. | Oct 2007 | B2 |
7283679 | Okada et al. | Oct 2007 | B2 |
7296007 | Funge et al. | Nov 2007 | B1 |
7301530 | Lee et al. | Nov 2007 | B2 |
7301547 | Martins et al. | Nov 2007 | B2 |
7305114 | Wolff et al. | Dec 2007 | B2 |
7346387 | Wachter et al. | Mar 2008 | B1 |
7352359 | Zalewski et al. | Apr 2008 | B2 |
7364297 | Goldfain et al. | Apr 2008 | B2 |
7369117 | Evans et al. | May 2008 | B2 |
7379559 | Wallace et al. | May 2008 | B2 |
7391408 | Zalewski et al. | Jun 2008 | B1 |
7414611 | Liberty | Aug 2008 | B2 |
7436887 | Yeredor et al. | Oct 2008 | B2 |
7446650 | Schofield et al. | Nov 2008 | B2 |
7489298 | Liberty | Feb 2009 | B2 |
7489299 | Liberty et al. | Feb 2009 | B2 |
7545926 | Mao | Jun 2009 | B2 |
7555157 | Davidson et al. | Jun 2009 | B2 |
7558698 | Funge et al. | Jul 2009 | B2 |
7613610 | Zimmerman et al. | Nov 2009 | B1 |
7623115 | Marks | Nov 2009 | B2 |
7627139 | Marks et al. | Dec 2009 | B2 |
7636645 | Yen et al. | Dec 2009 | B1 |
7636697 | Dobson et al. | Dec 2009 | B1 |
7636701 | Funge et al. | Dec 2009 | B2 |
7640515 | Balakrishnan et al. | Dec 2009 | B2 |
7646372 | Marks et al. | Jan 2010 | B2 |
7665041 | Wilson et al. | Feb 2010 | B2 |
7697700 | Mao | Apr 2010 | B2 |
7721231 | Wilson | May 2010 | B2 |
8310656 | Zalewski | Nov 2012 | B2 |
9474968 | Zalewski et al. | Oct 2016 | B2 |
10099130 | Zalewski | Oct 2018 | B2 |
20010056477 | McTernan et al. | Dec 2001 | A1 |
20020010655 | Kjallstrom | Jan 2002 | A1 |
20020023027 | Simonds | Feb 2002 | A1 |
20020036617 | Pryor | Mar 2002 | A1 |
20020056114 | Fillebrown et al. | May 2002 | A1 |
20020072414 | Stylinski et al. | Jun 2002 | A1 |
20020075286 | Yonezawa et al. | Jun 2002 | A1 |
20020083461 | Hutcheson et al. | Jun 2002 | A1 |
20020085097 | Colmenarez et al. | Jul 2002 | A1 |
20020094189 | Navab et al. | Jul 2002 | A1 |
20020126899 | Farrell | Sep 2002 | A1 |
20020134151 | Naruoka et al. | Sep 2002 | A1 |
20020158873 | Williamson | Oct 2002 | A1 |
20030014212 | Ralston et al. | Jan 2003 | A1 |
20030022716 | Park et al. | Jan 2003 | A1 |
20030093591 | Hohl | May 2003 | A1 |
20030100363 | Ali | May 2003 | A1 |
20030123705 | Stam et al. | Jul 2003 | A1 |
20030160862 | Charlier et al. | Aug 2003 | A1 |
20030232649 | Gizis et al. | Dec 2003 | A1 |
20040001082 | Said | Jan 2004 | A1 |
20040017355 | Shim | Jan 2004 | A1 |
20040035925 | Wu et al. | Feb 2004 | A1 |
20040054512 | Kim et al. | Mar 2004 | A1 |
20040063480 | Wang | Apr 2004 | A1 |
20040063481 | Wang | Apr 2004 | A1 |
20040070565 | Nayar et al. | Apr 2004 | A1 |
20040087366 | Shum et al. | May 2004 | A1 |
20040095327 | Lo | May 2004 | A1 |
20040140955 | Metz | Jul 2004 | A1 |
20040150728 | Ogino | Aug 2004 | A1 |
20040178576 | Hillis et al. | Sep 2004 | A1 |
20040180720 | Nashi et al. | Sep 2004 | A1 |
20040189720 | Wilson et al. | Sep 2004 | A1 |
20040212589 | Hall et al. | Oct 2004 | A1 |
20040213419 | Varma et al. | Oct 2004 | A1 |
20040227725 | Calarco et al. | Nov 2004 | A1 |
20040254017 | Cheng | Dec 2004 | A1 |
20050037844 | Shum et al. | Feb 2005 | A1 |
20050047611 | Mao | Mar 2005 | A1 |
20050088369 | Yoshioka | Apr 2005 | A1 |
20050102374 | Moragne et al. | May 2005 | A1 |
20050105777 | Koslowski et al. | May 2005 | A1 |
20050117045 | Abdellatif et al. | Jun 2005 | A1 |
20050162384 | Yokoyama | Jul 2005 | A1 |
20050162385 | Doi et al. | Jul 2005 | A1 |
20050198095 | Du et al. | Sep 2005 | A1 |
20050226431 | Mao | Oct 2005 | A1 |
20050239548 | Ueshima et al. | Oct 2005 | A1 |
20060033713 | Pryor | Feb 2006 | A1 |
20060035710 | Festejo et al. | Feb 2006 | A1 |
20060038819 | Festejo et al. | Feb 2006 | A1 |
20060204012 | Marks et al. | Sep 2006 | A1 |
20060233389 | Mao et al. | Oct 2006 | A1 |
20060250681 | Park et al. | Nov 2006 | A1 |
20060252541 | Zalewski et al. | Nov 2006 | A1 |
20060252543 | Van Noland et al. | Nov 2006 | A1 |
20060264258 | Zalewski et al. | Nov 2006 | A1 |
20060264259 | Zalewski et al. | Nov 2006 | A1 |
20060264260 | Zalewski et al. | Nov 2006 | A1 |
20060269072 | Mao | Nov 2006 | A1 |
20060269073 | Mao | Nov 2006 | A1 |
20060274032 | Mao et al. | Dec 2006 | A1 |
20060274911 | Mao et al. | Dec 2006 | A1 |
20060280312 | Mao | Dec 2006 | A1 |
20060282873 | Zalewski et al. | Dec 2006 | A1 |
20060287084 | Mao et al. | Dec 2006 | A1 |
20060287085 | Mao et al. | Dec 2006 | A1 |
20060287086 | Zalewski et al. | Dec 2006 | A1 |
20060287087 | Zalewski et al. | Dec 2006 | A1 |
20070015559 | Zalewski et al. | Jan 2007 | A1 |
20070021208 | Mao et al. | Jan 2007 | A1 |
20070025562 | Zalewski et al. | Feb 2007 | A1 |
20070060336 | Marks et al. | Mar 2007 | A1 |
20070061413 | Larsen et al. | Mar 2007 | A1 |
20070066394 | Ikeda et al. | Mar 2007 | A1 |
20070072675 | Hammano et al. | Mar 2007 | A1 |
20070120834 | Boillot | May 2007 | A1 |
20070120996 | Boillot | May 2007 | A1 |
20070260340 | Mao | Nov 2007 | A1 |
20070260517 | Zalewski et al. | Nov 2007 | A1 |
20070261077 | Zalewski et al. | Nov 2007 | A1 |
20080056561 | Sawachi | Mar 2008 | A1 |
20080070684 | Haigh-Hutchinson | Mar 2008 | A1 |
20080091421 | Gustavsson | Apr 2008 | A1 |
20080208613 | Scibora | Aug 2008 | A1 |
20090010494 | Bechtel et al. | Jan 2009 | A1 |
20090016642 | Hart | Jan 2009 | A1 |
20090221368 | Yen et al. | Sep 2009 | A1 |
20090221374 | Yen et al. | Sep 2009 | A1 |
20090288064 | Yen et al. | Nov 2009 | A1 |
20100004896 | Yen et al. | Jan 2010 | A1 |
20100137064 | Shum et al. | Jun 2010 | A1 |
20110074669 | Marks et al. | Mar 2011 | A1 |
20110077082 | Marks et al. | Mar 2011 | A1 |
Number | Date | Country |
---|---|---|
0353200 | Jan 1990 | EP |
0652686 | May 1995 | EP |
0750202 | Dec 1996 | EP |
0835676 | Apr 1998 | EP |
1098686 | May 2001 | EP |
1402929 | Mar 2004 | EP |
1435258 | Jul 2004 | EP |
2814965 | Apr 2002 | FR |
2206716 | Jan 1989 | GB |
2376397 | Dec 2002 | GB |
2388418 | Nov 2003 | GB |
57-26549 | Jun 1962 | JP |
1284897 | Nov 1989 | JP |
9128141 | May 1997 | JP |
9185456 | Jul 1997 | JP |
9265346 | Oct 1997 | JP |
1138949 | Feb 1999 | JP |
2000172431 | Jun 2000 | JP |
2000259856 | Sep 2000 | JP |
2000350859 | Dec 2000 | JP |
2001166676 | Jun 2001 | JP |
2002369969 | Dec 2002 | JP |
2004145448 | May 2004 | JP |
2005046422 | Feb 2005 | JP |
6102980 | Mar 2017 | JP |
WO8805942 | Aug 1988 | WO |
WO9848571 | Oct 1998 | WO |
WO9926198 | May 1999 | WO |
WO9935633 | Jul 1999 | WO |
WO0227456 | Apr 2002 | WO |
WO02052496 | Jul 2002 | WO
WO03079179 | Sep 2003 | WO |
WO2004073814 | Sep 2004 | WO |
WO2004073815 | Sep 2004 | WO |
WO2005073838 | Aug 2005 | WO |
WO2005107911 | Nov 2005 | WO |
WO2007095082 | Aug 2007 | WO |
WO2008056180 | May 2008 | WO |
Entry |
---|
Bolt, R.A., “Put-that-there: voice and gesture at the graphics interface,” Computer Graphics, vol. 14, No. 3 (ACM SIGGRAPH Conference Proceedings) Jul. 1980, pp. 262-270. |
DeWitt, Thomas and Edelstein, Phil “Pantomation: A System for Position Tracking,” Proceedings of the 2nd Symposium on Small Computers in the Arts, Oct. 1982, pp. 61-69. |
Ephraim et al. “Speech Enhancement Using a Minimum Mean-Square Error Short-Time Spectral Amplitude Estimator,” 1984, IEEE. |
Ephraim et al. “Speech Enhancement Using a Minimum Mean-Square Error Log-Spectral Amplitude Estimator,” 1985, IEEE. |
“The Tracking Cube: A Three-Dimensional Input Device,” IBM Technical Disclosure Bulletin, Aug. 1, 1989, pp. 91-95, No. 3B, IBM Corp., New York, U.S. |
Hemmi, et al., “3-D Natural Interactive Interface-Using Marker Tracking from a Single View,” Sep. 9, 1991, Systems and Computers in Japan. |
K. B. Shimoga, et al., “Touch and Force Reflection for Telepresence Surgery,” Engineering in Medicine and Biology Opportunities of the IEEE, Baltimore, MD, USA, Nov. 3, 1994, New York, New York, USA, pp. 1049-1050. |
Kanade, et al., “A Stereo Machine for Video-rate Dense Depth Mapping and Its New Application” 1996, CVPR 96, IEEE Computer Society Conference, pp. 196-202 (022). |
Fujitsu, “Internet Development of Emulators” Abstract, Mar. 1997, vol. 48, No. 2. |
Richardson et al., “Virtual Network Computing” IEEE Internet Computing, vol. 2, No. 1, Jan./Feb. 1998. |
Nakagawa, et al., “A Collision Detection and Motion Image Synthesis Between a Background Image and a Foreground 3-Dimensional Object,” TVRSJ vol. 4, No. 2, pp. 425-430, 1999, Japan. |
Jojic, et al., “Tracking Self-Occluding Articulated Objects in Dense Disparity Maps,” Computer Vision, 1999, The Proceedings of the Seventh IEEE International Conference on Kerkyra, Greece Sep. 20-27, 1999, Los Alamitos, CA, US, IEEE Computer Society, US, Sep. 20, 1999, (Sep. 20, 1999), pp. 123-130. |
Klinker, et al., “Distributed User Tracking Concepts for Augmented Reality Applications,” pp. 37-44, Augmented Reality, 2000, IEEE and ACM Int'l Symposium, Oct. 2000, XP010520308, ISBN: 0-7695-0846-4, Germany. |
Iddan, et al., “3D Imaging in the Studio (and Elsewhere . . . ),” Proceedings of the SPIE, SPIE, Bellingham, WA, US, vol. 4298, Jan. 24, 2001, pp. 48-55, XP008005351. |
Nishida, et al., “A Method of Estimating Human Shapes by Fitting the Standard Human Model to Partial Measured Data”, D-II vol. J84-D-II, No. 7, pp. 1310-1318, Jul. 2001. |
Mihara, et al., “A Realtime Vision-Based Interface Using Motion Processor and Applications to Robotics,” vol. J84-D-11, No. 9, pp. 2070-2078, Sep. 2001, Japan. |
Lanier, Jaron, “Virtually there: three-dimensional tele-immersion may eventually bring the world to your desk,” Scientific American, ISSN: 00368733, Year: 2001. |
Wilson & Darrell, “Audio-Video Array Source Localization for Intelligent Environments,” 2002 IEEE Dept. of Electrical Eng and Computer Science, MIT, Cambridge, MA 02139. |
Fiala, et al., “A Panoramic Video and Acoustic Beamforming Sensor for Videoconferencing,” 2004 IEEE, Computational Video Group, National Research Council, Ottawa, Canada K1A 0R6. |
Nakamura, et al., “A Consideration on Reconstructing 3-D Model Using Object Views,” 2004-01601-003, pp. 17-21, Hokkaido University, Japan, nakamura@media.eng.hokudai.ac.jp. |
XP-002453974, “CFS and F595/98/2000: How to Use the Trim Controls to Keep Your Aircraft Level”, Aug. 10, 2007, http://support.microsoft.com/?scid=kb%3Ben-us%3B175195&x=13&y=15. |
Gvili, et al., “Depth Keying”, SPIE vol. 5006 (2003), 2003 SPIE-IS&T, pp. 564-574 (031). |
Number | Date | Country | |
---|---|---|---|
20190038977 A1 | Feb 2019 | US |
Number | Date | Country | |
---|---|---|---|
60718145 | Sep 2005 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15283131 | Sep 2016 | US |
Child | 16147365 | US | |
Parent | 11382036 | May 2006 | US |
Child | 15283131 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 10207677 | Jul 2002 | US |
Child | 11382036 | US | |
Parent | 10650409 | Aug 2003 | US |
Child | 10207677 | US | |
Parent | 10663236 | Sep 2003 | US |
Child | 10650409 | US | |
Parent | 10759782 | Jan 2004 | US |
Child | 10663236 | US | |
Parent | 10820469 | Apr 2004 | US |
Child | 10759782 | US | |
Parent | 11301673 | Dec 2005 | US |
Child | 10820469 | US | |
Parent | 11381729 | May 2006 | US |
Child | 11301673 | US | |
Parent | 11381728 | May 2006 | US |
Child | 11381729 | US | |
Parent | 11381725 | May 2006 | US |
Child | 11381728 | US | |
Parent | 11381727 | May 2006 | US |
Child | 11381725 | US | |
Parent | 11381724 | May 2006 | US |
Child | 11381727 | US | |
Parent | 11381721 | May 2006 | US |
Child | 11381724 | US |