Implementations of the technology disclosed generally relate to determining positional information and, more particularly, to determining position and/or distance and/or depth of an object or features of an object surface in space.
One way to measure the distance to a remote object is to broadcast a wave (e.g., a sound wave), start a timer, and wait to capture the portion of the wave reflected by the object. By measuring the time the wave takes to make the round trip (and knowing the propagation rate of the wave), the distance to the object can be calculated. The position of the object can be inferred (e.g., via triangulation) from the reflected wave. This method of distance and position determination can work over large distances when precision finer than a few meters is not required.
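For orientation, the round-trip calculation above reduces to halving the product of the wave's propagation speed and the measured elapsed time. The sketch below is illustrative only; the sound-speed constant and example timing are assumptions, not values from the text.

```python
# Minimal sketch of the round-trip distance calculation described above.
SPEED_OF_SOUND_M_S = 343.0  # assumed propagation rate (sound in air at ~20 C)

def round_trip_distance(elapsed_s, wave_speed_m_s=SPEED_OF_SOUND_M_S):
    """Distance to the reflector given the measured round-trip time."""
    return wave_speed_m_s * elapsed_s / 2.0

print(round_trip_distance(0.02))  # a 20 ms echo corresponds to roughly 3.4 m
```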
Unfortunately, such conventional techniques do not work well for more precise determinations and/or determinations made over shorter distances. The accuracy of the measurement depends heavily on recording the precise times of broadcast and capture, which is especially difficult for very fast-moving waves (e.g., light waves). Further, one or both of the angles between wave emitter, wave sensor and object are difficult or impossible to determine, and the transit time of the wave can be very difficult to measure. The result is that the distances computed using conventional techniques can be very inaccurate. A need therefore exists for better methods for determining the distance and position of an object.
The technology disclosed relates to determining positional information of an object in a field of view. In particular, it relates to calculating a distance of the object from a reference such as a sensor, including scanning the field of view by selectively illuminating directionally oriented light sources that have overlapping fields of illumination and measuring one or more differences in a property of returning light emitted from the light sources and reflected from the object. In some implementations, the property is intensity. In other implementations, the property is phase difference.
The technology disclosed also relates to finding an object in a region of space. In particular, it relates to scanning the region of space with directionally controllable illumination, determining a difference in a property of the illumination received for two or more points in the scanning, and determining positional information of the object based at least in part upon the points in the scanning corresponding to the difference in the property. In some implementations, the property is intensity. In other implementations, the property is phase difference.
Aspects of the systems and methods described herein also provide for determining positional information (e.g., location, distance, and/or depth) for at least a portion of a target object within a field of view. Among other aspects, implementations can enable objects and/or features of an object surface to be automatically (e.g. programmatically) determined using positional information in conjunction with receiving input, commands, communications and/or other user-machine interfacing, gathering information about objects, events and/or actions existing or occurring within an area being explored, monitored, or controlled, and/or combinations thereof.
In one implementation, a method includes emitting light from a plurality of light sources mounted on a surface or surfaces having a non-planar (e.g., curved, polygonal, or arc-based) shape and/or mounted to a planar surface or surfaces and directed at differing angles. Light sources comprising a transmitter can be integrally mounted to a common structure and/or non-integrally distributed over a plurality of structures and/or incorporated into other devices and/or combinations thereof. Light sources can be selectively illuminated (e.g., one-at-a-time, in groups, sequentially or according to some pattern) to advantageously “scan” a field of view. The emitted light reflects from an object in a field of view, enabling the reflected light to be captured by a sensor (e.g., video cameras based on CCD arrays and/or CMOS arrays, arrays constructed of photodiodes, phototransistors, photovoltaic devices, and/or other types of photo-detector devices capable of converting light into current or voltage, and/or sensors comprising single elements of such devices coupled to raster or other scanning hardware and/or software, and/or combinations thereof). Reflected light originating from each light source can have different properties (e.g., intensity, phase, or the like) as captured by the sensor. An analyzer (e.g., computer, specialized circuitry, microcontroller, custom silicon, and/or combinations thereof) can detect the differences in properties and, based at least in part thereon, can determine positional information (e.g., location, distance, and/or depth) for at least a portion of the object.
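The emit-capture-analyze cycle described above can be sketched as a simple scan loop. This is a hypothetical skeleton under stated assumptions: the `light_sources`, `sensor`, and `analyze` objects stand in for hardware drivers and analysis logic that the text does not specify.

```python
# Hypothetical scan loop: illuminate each directional source in turn, capture the
# reflected light, and hand the per-source captures to an analyzer.
def scan_field_of_view(light_sources, sensor, analyze):
    captures = []
    for index, source in enumerate(light_sources):
        source.on()                      # selectively illuminate one source
        frame = sensor.capture_frame()   # reflected light as seen by the sensor
        source.off()
        captures.append((index, frame))
    # The analyzer compares properties (e.g., intensity or phase) across captures
    # to estimate positional information for objects in the field of view.
    return analyze(captures)
```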
Variants exist, however; in implementations, depth can be determined from stereoscopic differences in the reflected light obtained from scanning the field of view along a single plane approximately co-planar to the direction of the light emitted from the light sources and/or from differences in the reflected light obtained from scanning the field of view along two or more intersecting planes, each approximately co-planar to the direction of the light illuminated by different sets of light sources arranged integrally or non-integrally to provide for cross-scanning of the field of view, and/or combinations thereof.
According to another aspect, the number of light sources illuminated can determine the accuracy of the scan. In one method implementation, a coarse scan can be achieved in which some light sources are skipped when situations call for less accuracy, i.e., light is transmitted from only a subset of the light sources to provide a low-resolution data set of distances to an object. A more accurate fine-grained scan can be achieved by selecting a relatively larger number of light sources to illuminate, thereby providing more data and greater accuracy.
According to a further aspect, an implementation can conduct a relatively coarse scan of a field of view to locate object(s) and then follow up with a relatively fine-grained scan in a subsection of the field of view in which the object has been located. The fine-grained scan can enable features of objects to be closely identified, thereby enabling different objects (e.g., hands of different human users, different pets walking across the field of view, etc.) to be distinguished.
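One way to realize the coarse-then-fine strategy is sketched below, under stated assumptions: a sparse pass over every k-th source roughly locates the object, and a dense pass is restricted to the sources covering that neighborhood. The `measure_response` helper and the synthetic response values are hypothetical.

```python
# Hedged sketch of a coarse scan (skipping sources) followed by a fine-grained
# scan restricted to the neighborhood of the coarse hit.
def coarse_then_fine(sources, measure_response, k=4, window=3):
    # Coarse pass: use only every k-th source to trade accuracy for speed.
    coarse_ids = list(range(0, len(sources), k))
    coarse = {i: measure_response(sources[i]) for i in coarse_ids}
    best = max(coarse, key=coarse.get)          # source nearest the object (roughly)

    # Fine pass: every source in a small neighborhood of the coarse hit.
    lo, hi = max(0, best - window * k), min(len(sources), best + window * k + 1)
    fine = {i: measure_response(sources[i]) for i in range(lo, hi)}
    return best, fine

# Example with a synthetic response peaked near source 13 out of 32:
responses = [1.0 / (1 + abs(i - 13)) for i in range(32)]
best, fine = coarse_then_fine(list(range(32)), lambda s: responses[s])
print(best, sorted(fine)[:3])
```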
In another implementation, the light sources can be illuminated to different levels of brightness to provide differences in properties for the light illuminating the target object. In another implementation, light sources can be illuminated at different frequencies to provide differences in color properties for the light illuminating the target object. In an implementation, a scan can be completed using light driven to achieve one set of properties, the resulting image data analyzed, a change in light property effected, and a subsequent scan then performed using the new light property. In another implementation, light source frequencies can be selected from different portions of the electromagnetic spectrum (e.g., ultraviolet, visible, infrared and/or combinations thereof) to illuminate the target object during the scan, thereby providing opportunities to capture additional data.
In a yet further implementation, a method provides for determining distance to an object in space. The method can include receiving at a sensor light defining at least a portion of an object, the light originating from a plurality of light sources directed at different angles and of known geometry. The method can also include determining differences in phase for the light received from at least two of the light sources. The method can also include determining a distance to the object based at least in part on the differences in phase.
In a still yet further implementation, a system provides for determining a distance to an object in space. The system can include a plurality of light sources mounted on a surface and directed at different angles. A sensor to capture light transmitted from the plurality of light sources and reflected from an object in a field of view of the sensor can also be part of the system. The system can also include a controller configured to determine differences in phases of the captured light and compute a distance to the object based at least in part on the phases.
In another aspect, implementations incorporating low resolution time-measurement based approaches can be used to conduct a relatively coarse scan of a field of view to locate object(s) and then follow up with a relatively fine-grained scan in a subsection of the field of view in which the object has been located.
In a yet further aspect, a set of illumination sources can be disposed to provide illumination to a field of view such that a plurality of cameras (and/or other sensors based upon light sensitive elements, i.e., pixels) disposed to receive light from the illumination sources can provide image information based upon the changing illumination when different ones of the illumination sources are activated. Differences in light properties (e.g., phase, intensity, wavelengths and/or combinations thereof) from the illumination sources will be detected by each camera (or other sensor) and therefore will appear in each of the images provided by the cameras (or other sensors). Correlating corresponding changes in light properties across the different images enables determining correspondence between pixels in the images of the camera(s). Such implementations can provide improved robustness to techniques for correlating objects viewed by multiple cameras (or other light sensors).
Reference throughout this specification to “one example,” “an example,” “one implementation,” or “an implementation” means that a particular feature, structure, or characteristic described in connection with the example is included in at least one example of the present technology. Thus, the occurrences of the phrases “in one example,” “in an example,” “one implementation,” or “an implementation” in various places throughout this specification are not necessarily all referring to the same example. Furthermore, the particular features, structures, routines, actions, or characteristics can be combined in any suitable manner in one or more examples of the technology. The headings provided herein are for convenience only and are not intended to limit or interpret the scope or meaning of the claimed technology.
Advantageously, these and other aspects enable machines, computers, and/or other types of intelligent devices and automata to obtain information about objects, events, actions, and/or users employing gestures, signals, and/or other motions conveying meaning, and/or combinations thereof. These and other advantages and features of the implementations herein described will become more apparent through reference to the following description, the accompanying drawings, and the claims. Furthermore, it is to be understood that the features of the various implementations described herein are not mutually exclusive and can exist in various combinations and permutations.
In the drawings, like reference characters generally refer to like parts throughout the different views. Also, the drawings are not necessarily to scale, with an emphasis instead generally being placed upon illustrating the principles of the technology disclosed. In the following description, various implementations of the technology disclosed are described with reference to the following drawings, in which:
Described herein are various implementations of methods and systems for determining the distance, position and/or depth of an object in space. Implementations can provide improved accuracy in positional and/or depth information capable of supporting object or object surface recognition, object change, event or action recognition and/or combinations thereof. An implementation provides for determining distance, position and/or depth of target object(s) relative to a reference (e.g., light transmitter and/or sensor). (The term “light,” as used herein, means electromagnetic radiation of any wavelength or wavelengths. For the purposes described herein, light is typically in the infrared, visible or ultraviolet spectral regions.)
The sensor 204 detects the intensity and angle of incoming light rays. In one implementation, the sensor 204 includes a lens and a charge-coupled device (“CCD”), such as the ones found in digital still or video cameras.
The angle of the incoming light relative to a normal line through lens 252 that strikes each pixel of the CCD (or other image sensor) can be inferred by the computer 208. The lens 252 focuses incoming light onto the CCD 256 in accordance with its shape; each pixel of the CCD 256 corresponds to a point and angle on the lens 252 at which the incoming light is received. Light striking the lens 252 from the object 206, for example, is mapped to a particular pixel (or set of pixels) on the CCD 256. The computer 208 can include a look-up table (or similar data structure) that maps each pixel of the image read from the CCD 256 to a corresponding incoming angle of light. The look-up table can be predetermined (based on the known properties of the lens, such as the size of its field of view) or generated dynamically from data read from the CCD. In one implementation, a test-pattern image is captured by the CCD to generate the look-up table and/or to calibrate the predetermined look-up table (to account for, for example, imperfections in the lens). Other methods of calibrating the lens 252 are also within the scope of the technology disclosed.
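An illustrative way to build such a pixel-to-angle look-up table is shown below using a simple pinhole model, with the focal length derived from a known horizontal field of view. The pixel count and field-of-view value are assumptions; a production table would come from lens calibration as described in the text.

```python
import numpy as np

def pixel_angle_table(num_pixels=640, fov_degrees=60.0):
    """Angle of the incoming ray (radians) for each pixel column, 0 at the optical axis."""
    half_fov = np.radians(fov_degrees) / 2.0
    focal_px = (num_pixels / 2.0) / np.tan(half_fov)       # focal length in pixel units
    offsets = np.arange(num_pixels) - (num_pixels - 1) / 2.0
    return np.arctan2(offsets, focal_px)

table = pixel_angle_table()
print(np.degrees(table[0]), np.degrees(table[-1]))  # approximately -30 and +30 degrees
```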
In one implementation, the lens 252 can be calibrated by capturing, on the CCD 256, a plurality of images of an object 206 having a flat surface (such as, for example, a computer display, mirror, or wall). The relative position between the lens 252 and the flat surface of the object 206 can be varied for each captured image by, for example, movement of the lens 252 and/or the object 206. The movement can include an increase or decrease in the distance between the lens 252 and the object 206, a rotation of the lens 252 and/or object 206 on any axis, and/or lateral movement of the lens 252 and/or object 206. Each captured image can be analyzed to determine a distance from the lens 252 to one or more points on the flat surface of the object 206; the determination of the distance(s) can be performed in accordance with the implementations of the technology disclosed described herein and/or other methods known in the art. The distances associated with each image are compared across all of the images; any discrepancies or deviations in the measured distances can be used to determine imperfections or defects in the lens 252. A deviation that changes its position in the captured images as the relative positions of the lens 252 and object 206 change can be deemed to be an inconsistency in the flat surface of the object 206; a deviation that does not change its position in the captured images as the relative positions of the lens 252 and object 206 change can be deemed to be an imperfection in the lens 252. The position of each imperfection, and the degree of the imperfection, can be used to construct the look-up table discussed above.
The memory 272 can be used to store instructions to be executed by processor 270 as well as input and/or output data associated with execution of the instructions. In particular, memory 272 contains instructions, conceptually illustrated as one or more modules, that control the operation of processor 270 and its interaction with the other hardware components. For example, the memory 272 can contain an image analysis module 278 for analyzing image data received from the sensor 204 and computing a distance to an object 206. An operating system directs the execution of low-level, basic system functions such as memory allocation, file management and operation of mass storage devices. The operating system can be or include a variety of operating systems such as the Microsoft WINDOWS operating system, the Unix operating system, the Linux operating system, the Xenix operating system, the IBM AIX operating system, the Hewlett Packard UX operating system, the Novell NETWARE operating system, the Sun Microsystems SOLARIS operating system, the OS/2 operating system, the BeOS operating system, the MAC OS operating system, the APACHE operating system, an OPENACTION operating system, iOS, Android or other mobile operating systems, or another operating system platform.
The computer 208 can also include other removable/non-removable, volatile/nonvolatile computer storage media, such as a solid-state or magnetic hard disk, an optical drive, flash memory, random-access memory, read-only memory, or any other similar type of storage medium. The processor 270 can be a general-purpose microprocessor, microcontroller, digital-signal processor, or any other type of computational engine. The transmitter/sensor interface 274 can include hardware and/or software that enables communication between the computer 208 and the transmitter 202 and/or sensor 204. For example, the transmitter/sensor interface 274 can include one or more data ports (such as USB ports) to which devices can be connected, as well as hardware and/or software signal processors to modify sent or received data signals (e.g., to reduce noise or reformat data). In some implementations, the interface 274 also transmits control signals to, e.g., activate or deactivate attached devices, control camera settings (frame rate, image quality, sensitivity, zoom level, etc.), or the like. Such signals can be transmitted, e.g., in response to control signals from processor 270, which can in turn be generated in response to user input or other detected events.
The transmitter 202 can include a driver circuit 214 for powering and controlling the light sources 210; the light sources can alternatively or in addition be powered and/or controlled via a network link 216 by the computer 208. The structure 212 can be made of any suitable material, such as plastic or metal. Each light source 210 shares a defined geometrical relationship with the other light sources 210 by being mounted on the rigid structure 212. In one implementation, the light sources 210 each share a common radius with respect to a central point of origin, and can have equal angular spacing. In other implementations, the radii of the light sources 210 with respect to the central point of origin can vary in accordance with other geometric relationships; for example, the light sources 210 can be disposed on the structure 212 such that their position conforms to a parabolic or hyperbolic shape.
The manner and level of illumination of each light source 210 can vary in accordance with implementations of the technology disclosed. In one implementation, each light source 210 is switched fully on to a maximum or high level of brightness and then switched off to a minimum or low level of brightness. Each light source 210 is thus switched on and off before a next light source 210 is switched on; there is no (or negligible) overlap between the illumination periods of a first light source 210 and a second light source 210. As explained in greater detail below, the overall accuracy of the system 200 in this implementation depends at least in part upon the number of light sources 210. In another implementation, the light sources 210 are illuminated to different levels of brightness; for example, each light source 210 can be first switched to a low dimming setting, then a medium dimming setting, then a full brightness setting, then back down to a medium dimming setting and a low dimming setting. In this implementation, a next light source 210 can begin illumination (at, e.g., a low dimming setting) while a first light source 210 is still illuminated. Only one light source 210 can be at a maximum setting at any given time, however. Any method of increasing and decreasing the dimming level of each light source 210 is within the scope of the technology disclosed; the dimming profile can be linear, logarithmic, quadratic, exponential, and/or Gaussian, and/or combinations thereof, for example. In these implementations, the overall accuracy of the system 200 can further depend at least in part upon the number of discrete dimming levels to which each light source 210 is illuminated. In one implementation, the accuracy can further depend at least in part upon the frequency at which the light sources 210 are illuminated.
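The overlapping-dimming variant can be sketched as staggered brightness schedules in which each source ramps up and back down while only one source is at its maximum at any instant. The Gaussian profile, step counts, and source count below are illustrative assumptions.

```python
import numpy as np

def dimming_schedules(num_sources=4, steps_per_source=20, sigma=0.35):
    """Per-source brightness over time; peaks are staggered so maxima never coincide."""
    total = num_sources * steps_per_source
    t = np.arange(total)
    schedules = np.zeros((num_sources, total))
    for i in range(num_sources):
        peak = (i + 0.5) * steps_per_source        # when source i reaches full brightness
        schedules[i] = np.exp(-0.5 * ((t - peak) / (sigma * steps_per_source)) ** 2)
    return schedules

s = dimming_schedules()
print(np.argmax(s, axis=1))   # staggered peak times: one source at maximum at a time
```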
The sensor 204 receives the light cast by the light sources A, B, C, D as reflected by the object 206. The received light varies in amplitude/intensity as a result of the angle of reflection between each light source A, B, C, D and the sensor 204, as shown in illustration 400B; a light source at a high or "steep" angle to the object 206 can illuminate the object 206 with less intensity than a light source more directly facing the object 206. The amplitude/intensity of the received light can also vary as a result of the differing path lengths between the light sources A, B, C, D and the sensor 204, as shown in illustration 400B; as a result, the amplitude/intensity that captured light exhibits when it arrives at the sensor 204 can vary as the light sources A, B, C, and D illuminate in turn. Thus, the phase of the light received at the sensor 204 will vary according to the different points in the illumination cycle.
In another implementation, the computer 208 determines the phase by performing a fast Fourier transform ("FFT") on a series of images read from the sensor 204. In one implementation, the frame rate of the sensor 204 equals and is synchronized with the frequency of the light emitted by the transmitter 202; each frame captured by the sensor 204, therefore, corresponds to a next pulse of emitted light. In other implementations, the frame rate of the sensor 204 is unsynchronized with the frequency of the transmitter 202 and thus captures random pulses of light from the transmitter 202. In any case, the detected phases can be stored and, after a number are collected, analyzed to determine the phases of adjacent light sources 210. As the frame rate of the sensor 204 increases, the accuracy of the distance measurement increases (as explained in greater detail below). In other implementations, the sensor 204 includes a rolling-shutter camera that reads every pixel a large number of times before proceeding to a next pixel, or a micro-electro-mechanical system ("MEMS") camera having a scanning mirror that raster-scans a scene using a photodiode.
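The following sketch illustrates FFT-based phase recovery from a per-pixel time series of frames, in the spirit of the implementation described above. The synthetic signal, frame rate, and modulation frequency are assumptions for demonstration only.

```python
import numpy as np

frame_rate = 240.0                      # frames per second (assumed)
mod_freq = 30.0                         # illumination modulation frequency (assumed)
n_frames = 240
t = np.arange(n_frames) / frame_rate

true_phase = 0.7                        # radians; the value we hope to recover
signal = 1.0 + 0.5 * np.cos(2 * np.pi * mod_freq * t + true_phase)  # one pixel over time

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(n_frames, d=1.0 / frame_rate)
bin_index = np.argmin(np.abs(freqs - mod_freq))     # bin closest to the modulation frequency
print(np.angle(spectrum[bin_index]))                # approximately 0.7
```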
The phase difference Δθ, unlike the light intensity, has a clear relationship with the position of the objects 206, 502 as a result of the known geometric relationship between the light sources on the transmitter 202. The phase difference Δθ between light rays received from the light sources closest to the objects 206, 502 is smaller than the phase difference Δθ between light rays received from light sources further from the objects 206, 502; based on the known geometric relationship between the light sources, the positions of the light sources closest to the objects 206, 502 can be determined. For example, the two light sources closest to the object 206 produce two light rays 504, 506 of very similar length; the phase difference Δθ between these two light rays 504, 506 is thus very small or zero. A third light ray 508, produced by a light source further from the object 206, is longer; the phase difference Δθ between, for example, light rays 506, 508 is thus greater. In general, the phase difference Δθ between each of the light sources on the transmitter 202, when analyzed, has a minimum value at the point on the transmitter 202 closest to the analyzed object.
The variation in the phase difference Δθ for an object depends on the distance between the object and the transmitter 202. An object closer to the transmitter 202, such as the object 206, can exhibit a greater variation in phase difference Δθ than an object farther from the transmitter 202, such as the object 502. For example, the phase difference Δθ 510 corresponding to the object 206 has a greater variation 512 than the variation 514 in the phase difference Δθ 516 corresponding to the object 502. The minima 518, 520 of the phase differences Δθ are also shown in illustration 500B.
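A minimal sketch of how such a phase-difference curve might be summarized is shown below: the location of the minimum indicates the transmitter point closest to the object, and the curve's peak-to-peak variation serves as a proxy for nearness. The synthetic curves are illustrative, not measured data.

```python
import numpy as np

def summarize_phase_curve(delta_theta):
    """Index of the minimum (closest transmitter point) and peak-to-peak variation."""
    idx_min = int(np.argmin(delta_theta))
    variation = float(np.ptp(delta_theta))     # deeper curve suggests a closer object
    return idx_min, variation

sources = np.linspace(-1.0, 1.0, 32)           # normalized positions along the arc
near_obj = 0.05 + 0.40 * (sources - 0.2) ** 2  # deep curve, minimum near +0.2
far_obj  = 0.05 + 0.10 * (sources + 0.5) ** 2  # shallow curve, minimum near -0.5

print(summarize_phase_curve(near_obj))
print(summarize_phase_curve(far_obj))
```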
The positions of the minima 518, 520 and/or the variations in phase difference Δθ 512, 514 can be used to determine the distance of the objects 206, 502 from the transmitter 202 and/or sensor 204.
In one implementation, the positions of the minima 518, 520 can be used to determine the angle between the line formed through a center point 524 of the transmitter 202 and the objects 206, 502 and a reference line (e.g., a horizontal line). In one implementation, the relative position of the minimum within the band of received light rays is mapped onto its corresponding position on the transmitter 202, and the angle of the line is thereby determined. For example, if the minimum occurs in the center of the band, the corresponding position on the transmitter 202 can be at 0° or “north.” As another example, if the minimum occurs at 75% of the distance from the left side of the band, the corresponding position on the transmitter 202 can be at 45° or “northeast.” In one implementation, the positions of the light sources on the transmitter 202 are used to determine the angle (i.e., the degree of the arc that the light sources sweep through).
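The band-position-to-angle mapping in the example above can be expressed as a linear map from the minimum's fractional position within the band onto the arc swept by the light sources. The 180-degree sweep assumed below reproduces the worked values in the text (the band's center maps to 0 degrees, and 75% from the left maps to 45 degrees).

```python
def band_position_to_angle(fraction_from_left, arc_sweep_degrees=180.0):
    """Map a fractional position in [0, 1] within the band to an angle on the transmitter arc."""
    return (fraction_from_left - 0.5) * arc_sweep_degrees

print(band_position_to_angle(0.5))    # 0.0  ("north")
print(band_position_to_angle(0.75))   # 45.0 ("northeast")
```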
Alternatively or in addition, the shapes of the phase difference Δθ curves 510, 516 can be used to determine the distance between the transmitter 202 and the objects 206, 502. As discussed above, objects closer to the transmitter 202 have "deeper" curves and objects further away from the transmitter 202 have "shallower" curves. The distance between the transmitter 202 and the objects 206, 502 can thus be determined by analyzing the shape of the curves 510, 516, by looking up the distance in a shape-to-distance look-up table, or by a combination of the two (or by any other suitable method). In one implementation, an ambiguity between two possible positions on the light ray 522 implied by the determined distance is resolved by analyzing the position of the minimum value of the curve.
Once the angle and/or distance of the object(s) relative to the transmitter 202 has been determined, the distance of the object relative to the transmitter 202, sensor 204, or any other known point in space can be determined by triangulation. For example, using the angles of the object relative to the transmitter 202 and sensor 204, and the distance between the transmitter 202 and sensor 204, the distance of the object to the camera 204 (or, say, the midpoint between the transmitter 202 and camera 204) can be found using, for example, the law of sines. One of skill in the art will understand that other unknown values (such as the angle of the lines intersecting at the object 206) can similarly be found.
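A worked law-of-sines triangulation consistent with the description above is sketched here. The baseline length and the two angles (measured from the baseline joining the transmitter and the sensor) are illustrative assumptions.

```python
import math

def distance_from_sensor(baseline_m, angle_at_transmitter_deg, angle_at_sensor_deg):
    """Sensor-to-object range from the baseline and the two baseline angles."""
    a = math.radians(angle_at_transmitter_deg)
    b = math.radians(angle_at_sensor_deg)
    c = math.pi - a - b                            # angle at the object
    # Law of sines: the side opposite the transmitter angle is the sensor-to-object range.
    return baseline_m * math.sin(a) / math.sin(c)

print(distance_from_sensor(baseline_m=0.3, angle_at_transmitter_deg=70.0,
                           angle_at_sensor_deg=80.0))   # roughly 0.56 m
```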
The accuracy of the distance measurement can be increased by increasing the number of light sources on the transmitter 202. With more light sources, the distance between each light source decreases, thereby allowing a more precise determination of the point on the transmitter 202 closest to the object 206. The accuracy of the measurement can be alternatively or in addition improved by increasing the frame rate of the sensor 204, thereby allowing the collection of more phase data. If the implementation of the phase detection at the sensor 204 is done in an analog fashion, the accuracy can be improved by increasing the “listening time” spent on each pixel, thereby similarly allowing the collection of more phase data. The frequency of the illumination of the light sources on the transmitter 202 can be also increased for the same reason.
The above discussion simplifies the operation of the technology disclosed to two-dimensional space for clarity of explanation, but the technology disclosed is not limited to two-dimensional space.
Illustrations of example implementations 600, 610 of the technology disclosed appear in the accompanying drawings.
At action 702, a field of view is scanned by selectively illuminating respective ones of a plurality of directionally oriented light sources that have overlapping fields of illumination. In one implementation, selectively illuminating the light sources includes at least periodically illuminating the light sources at different levels of brightness. In some implementations, periodically illuminating the light sources at different levels of brightness further includes switching the light sources to at least one of a low dimming setting, a medium dimming setting, or a high dimming setting. In other implementations, each of the light sources is illuminated at a different dimming setting.
In one implementation, selectively illuminating the light sources includes at least periodically illuminating the light sources at different frequencies to provide differences in color properties between the emitted light. In another implementation, selectively illuminating the light sources includes at least periodically illuminating the light sources one-at-a-time such that a first light source is turned off before a second light source is turned on. In yet another implementation, selectively illuminating the light sources includes at least periodically illuminating a subset of light sources from the plurality of light sources. Some other implementations include periodically illuminating the light sources sequentially based on at least one of a logarithmic, quadratic, exponential, and/or Gaussian pattern.
At action 704, one or more differences in intensity of returning light emitted from the respective light sources and reflected from the target object are measured using a sensor. In one implementation, the received light varies in intensity as a result of the angle of reflection between the respective light sources and the sensor. For example, a light source at a high or "steep" angle to the object can illuminate the object with less intensity than a light source more directly facing the object. In another implementation, the intensity of the received light can also vary as a result of the differing distances in the travel path between the respective light sources and the sensor. In one instance, light received from a light source that travels a greater distance to reach the sensor has lower intensity than light received from a light source that travels a lesser distance. As a result, the intensity that captured light exhibits when it arrives at the sensor can vary as the respective light sources illuminate in turn.
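As a rough illustration of why captured intensity varies from source to source, the sketch below combines a cosine factor for illumination angle with inverse-square falloff over path length. This particular falloff model is an assumption for illustration; the text does not prescribe one.

```python
import math

def relative_intensity(angle_to_surface_deg, path_length_m):
    """Idealized relative intensity: steeper angles and longer paths both dim the return."""
    cos_factor = math.cos(math.radians(angle_to_surface_deg))
    return max(cos_factor, 0.0) / (path_length_m ** 2)

print(relative_intensity(10.0, 1.0))   # nearly head-on, short path: brightest
print(relative_intensity(60.0, 1.5))   # steep angle, longer path: noticeably dimmer
```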
At action 706, positional information of the target object is determined based at least in part upon one or more measured differences in intensity of the returning light. In one implementation, one or more angles for the light reflected from the target object are determined with respect to the sensor by mapping pixels of a camera array that captured the reflected light to the one or more angles. In another implementation, when the sensor is positioned apart from the plurality of light sources and not between two of the light sources, an angle between the plurality of light sources and the target object is determined.
In some implementations, a distance of the target object from the light sources or the sensor is determined using an angle between at least one of the light sources and the target object and a second angle between the sensor and the target object. In other implementations, a depth of the field of view is determined by identifying stereoscopic differences between light reflected from the target object, including at least one of scanning the field of view along a single plane that is co-planar to a direction of the light emitted from the plurality of light sources or scanning the field of view along two or more intersecting planes that are co-planar to a direction of the light emitted from the plurality of light sources.
At action 802, the region of space is scanned with directionally controllable illumination from selected ones of a set of illumination sources. In one implementation, directionally controllable illumination includes at least periodically illuminating the illumination sources at different levels of brightness. In some implementations, periodically illuminating the illumination sources at different levels of brightness further includes switching the illumination sources to at least one of a low dimming setting, a medium dimming setting, or a high dimming setting. In other implementations, each of the illumination sources is illuminated at a different dimming setting.
In one implementation, directionally controllable illumination includes at least periodically illuminating the illumination sources at different frequencies to provide differences in color properties between the emitted light. In another implementation, directionally controllable illumination includes at least periodically illuminating the illumination sources one-at-a-time such that a first illumination source is turned off before a second illumination source is turned on. In yet another implementation, directionally controllable illumination includes at least periodically illuminating a subset of illumination sources from the set of illumination sources. Some other implementations include periodically illuminating the illumination sources sequentially based on at least one of a logarithmic, quadratic, exponential, and/or Gaussian pattern.
In one implementation, the illumination sources are arranged on one or more non-planar arcuate surfaces that include at least one of one or more segments of an arc or one or more segments of an N-sided polygon. In another implementation, the illumination sources are arranged on one or more planar surfaces and directed at different angles.
At action 804, illumination in the region of space is detected that includes illumination reflected by the object. In one implementation, a coarse scan of the space is performed to assemble a low-resolution estimate of the object position by illuminating a subset of illumination sources from the set of illumination sources. In another implementation, the coarse scan is followed by performing a fine-grained scan of a subsection of the space based on the low-resolution estimate of the object position, and distinguishing features of the object are identified based on a high-resolution data set collected during the fine-grained scan.
At action 806, a difference in a property of the illumination received for two or more points in the scanning is determined. In some implementations, the property is intensity of light. In one implementation, the received light varies in intensity as a result of the angle of reflection between the respective ones of the illumination sources and a sensor that captures the light. For example, an illumination source at a high or "steep" angle to the object can illuminate the object with less intensity than an illumination source more directly facing the object. In another implementation, the intensity of the received light can also vary as a result of the differing distances in the travel path between the respective ones of the illumination sources and the sensor. In one instance, light received from an illumination source that travels a greater distance to reach the sensor has lower intensity than light received from an illumination source that travels a lesser distance. As a result, the intensity that captured light exhibits when it arrives at the sensor can vary as the respective ones of the illumination sources illuminate in turn.
At action 808, positional information of the object is determined based at least in part upon the points in the scanning corresponding to the difference in the property. In one implementation, one or more angles for the light reflected from the object are determined with respect to a sensor by mapping pixels of a camera array that captured the reflected light to the one or more angles. In another implementation, when the sensor is positioned apart from the plurality of illumination sources and not between two of the illumination sources, an angle between the plurality of illumination sources and the object is determined.
In some implementations, a distance of the object from the illumination sources or the sensor is determined using an angle between at least one of the illumination sources and the object and a second angle between the sensor and the object. In other implementations, a depth of the field of view is determined by identifying stereoscopic differences between light reflected from the object, including at least one of scanning the space along a single plane that is co-planar to a direction of the light emitted from the plurality of illumination sources or scanning the field of view along two or more intersecting planes that are co-planar to a direction of the light emitted from the plurality of illumination sources.
Implementations of the technology disclosed can be used to map out the positions of objects in a room or similarly sized area in order to precisely locate the objects, people, or other things in the room, as well as the room walls and/or other room dimensions. This information can be used by a computer, television, or other device in the room to improve the experience of a user of the device by, for example, allowing the user to interact with the device based on the room dimensions. The device can adjust a property (e.g., a sound level, sound distribution, brightness, or user-interface perspective) based on objects in the room or the position of the user.
Implementations can be realized by incorporating time-measurement based approaches to obtain additional information about target objects. For example, light source A can emit a pulse of light at t=10 ns and light source B can emit a pulse of light at t=11 ns; the pulse from light source A can arrive at the sensor 204 at t=10.5 ns while the pulse from light source B can arrive at t=11.6 ns. In this example, θA=0.5 ns and θB=0.6 ns. While such approaches may not yield sufficient precision for many applications, they can be used to provide a "coarse" view of the target object upon which the more precise techniques described herein can be applied.
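Using the example timings above, each transit time converts directly to a total source-to-object-to-sensor path length (placing the object on an ellipse whose foci are the source and the sensor). The sketch below performs only that conversion; any further geometry is assumed.

```python
# Coarse time-of-flight sketch using the example timings from the text.
C_M_PER_NS = 0.299792458          # speed of light in metres per nanosecond

emit = {"A": 10.0, "B": 11.0}     # emission times (ns)
arrive = {"A": 10.5, "B": 11.6}   # arrival times at the sensor (ns)

for src in ("A", "B"):
    transit_ns = arrive[src] - emit[src]
    path_m = transit_ns * C_M_PER_NS                 # source -> object -> sensor path length
    print(src, transit_ns, round(path_m, 3))         # A: 0.5 ns ~0.15 m, B: 0.6 ns ~0.18 m
```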
Implementations can be employed in a variety of application areas, such as, for example and without limitation: consumer applications including interfaces for computer systems, laptops, tablets, televisions, game consoles, set top boxes, telephone devices and/or interfaces to other devices; medical applications including controlling devices for performing robotic surgery, medical imaging systems and applications such as CT, ultrasound, x-ray, MRI or the like, laboratory test and diagnostics systems and/or nuclear medicine devices and systems; prosthetics applications including interfaces to devices providing assistance to persons under handicap, disability, recovering from surgery, and/or other infirmity; defense applications including interfaces to aircraft operational controls, navigation systems control, on-board entertainment systems control and/or environmental systems control; automotive applications including interfaces to automobile operational systems control, navigation systems control, on-board entertainment systems control and/or environmental systems control; security applications including monitoring secure areas for suspicious activity or unauthorized personnel; manufacturing and/or process applications including interfaces to assembly robots, automated test apparatus, work conveyance devices such as conveyors, and/or other factory floor systems and devices, genetic sequencing machines, semiconductor fabrication related machinery, chemical process machinery and/or the like; and/or combinations thereof.
Implementations of the technology disclosed can further be mounted on automobiles or other mobile platforms to provide information to systems therein as to the outside environment (e.g., the positions of other automobiles). Further implementations of the technology disclosed can be used to track the motion of objects in a field of view or used in conjunction with other mobile-tracking systems. Object tracking can be employed, for example, to recognize gestures or to allow the user to interact with a computationally rendered environment; see, e.g., U.S. Patent Application Ser. No. 61/752,725 (filed on Jan. 15, 2013) and Ser. No. 13/742,953 (filed on Jan. 16, 2013), the entire disclosures of which are hereby incorporated by reference.
It should also be noted that implementations of the technology disclosed can be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The article of manufacture can be any suitable hardware apparatus, such as, for example, a floppy disk, a hard disk, a CD ROM, a CD-RW, a CD-R, a DVD ROM, a DVD-RW, a DVD-R, a flash memory card, a PROM, a RAM, a ROM, or a magnetic tape. In general, the computer-readable programs can be implemented in any programming language. Some examples of languages that can be used include C, C++, or JAVA. The software programs can be further translated into machine language or virtual machine instructions and stored in a program file in that form. The program file can then be stored on or in one or more of the articles of manufacture.
Particular Implementations
In one implementation, a method of tracking movement of an object portion in three-dimensional (3D) space is described. The method includes scanning a field of view by selectively illuminating respective ones of a plurality of directionally oriented light sources that have overlapping fields of illumination, measuring one or more differences in intensity of returning light emitted from the respective light sources and reflected from the target object using a sensor, and determining positional information of the target object based at least in part upon one or more measured differences in intensity of the returning light.
This method and other implementations of the technology disclosed can include one or more of the following features and/or features described in connection with additional methods disclosed. In the interest of conciseness, the combinations of features disclosed in this application are not individually enumerated and are not repeated with each base set of features. The reader will understand how features identified in this section can readily be combined with sets of base features identified as implementations.
In one implementation, selectively illuminating the respective light sources includes varying the brightness of pairs of overlapping light sources by dimming a first, initially on light source while brightening a second, initially off light source. In some implementations, the brightness of the two overlapping light sources is varied by applying a quadratic formula. In other implementations, the brightness of the two overlapping light sources is varied according to a Gaussian distribution.
In one implementation, the respective light sources are illuminated selectively one at a time. In another implementation, the sensor scans the field of view using a scanning mirror and a photo detector that rasterizes the field of view. In some implementations, the respective light sources are distinguished based on different frequencies of the respective light sources.
In one implementation, one or more angles are determined for the light reflected from the target object with respect to the sensor by mapping pixels of a camera array that captured the reflected light to the one or more angles. When the sensor is positioned apart from the plurality of light sources and not between two of the light sources, an angle between the plurality of light sources and the target object is determined. In some implementations, a distance of the target object from the light sources or the sensor is determined using an angle between at least one of the light sources and the target object and a second angle between the sensor and the target object.
In another implementation, two or more of the light sources are illuminated respectively at different intensities of illumination. In some implementations, a coarse scan of the field of view is performed to assemble a low-resolution estimate of the target object position by illuminating a subset of light sources from the plurality of light sources. In other implementations, the coarse scan is followed by performing a fine-grained scan of a subsection of the field of view based on the low-resolution estimate of the target object position and identifying distinguishing features of the target object based on a high-resolution data set collected during the fine-grained scan. In yet another implementation, a plurality of scans of the field of view is performed, varying the light properties emitted from the respective light sources among the scans.
In one implementation, the plurality of directional light sources is arranged on one or more non-planar arcuate surfaces that include at least one of an arc or an N-sided polygon. In another implementation, the plurality of directional light sources is arranged along a parabolic or hyperbolic curve. In some other implementations, determining phase differences includes performing a Fourier transform on a series of intensity measurements of the light reflected from the target object.
Other implementations may include a non-transitory computer readable storage medium storing instructions executable by a processor to perform any of the methods described above. Yet another implementation may include a system including memory and one or more processors operable to execute instructions, stored in the memory, to perform any of the methods described above.
In another implementation, a method of determining positional information of a target object in a field of view is described. The method includes scanning a field of view by selectively illuminating respective ones of a plurality of directionally oriented light sources that have overlapping fields of illumination, measuring one or more differences in property of returning light emitted from the respective light sources and reflected from the target object using a sensor, and determining positional information of the target object based at least in part upon one or more measured differences in property of the returning light. In some implementations, the property is intensity of light.
Other implementations may include a non-transitory computer readable storage medium storing instructions executable by a processor to perform any of the methods described above. Yet another implementation may include a system including memory and one or more processors operable to execute instructions, stored in the memory, to perform any of the methods described above.
In another implementation, a system of determining positional information of a target object in a field of view is described. The system includes a processor and a computer readable storage medium storing computer instructions configured to cause the processor to scan a field of view by selectively illuminating respective ones of a plurality of directionally oriented light sources that have overlapping fields of illumination, measure one or more differences in intensity of returning light emitted from the respective light sources and reflected from the target object using a sensor, and determine positional information of the target object based at least in part upon one or more measured differences in intensity of the returning light.
In another implementation, a method of finding an object in a region of space is described. The method includes scanning the region of space with directionally controllable illumination from selected ones of a set of illumination sources, detecting illumination in the region of space including illumination reflected by the object, determining a difference in a property of the illumination received for two or more points in the scanning, and determining positional information of the object based at least in part upon the points in the scanning corresponding to the difference in the property.
This method and other implementations of the technology disclosed can include one or more of the following features and/or features described in connection with additional methods disclosed.
In one implementation, the method includes conducting a second scanning of the region of space to obtain second positional information of the object and determining a change in the object based upon a comparison of a result from a first scanning and a result from the second scanning.
In another implementation, the method includes conducting a second scanning limited to a portion of the region of space corresponding to the positional information of the object obtained from a first scanning and determining additional positional information of the object based upon a result from the second scanning.
Other implementations may include a non-transitory computer readable storage medium storing instructions executable by a processor to perform any of the methods described above. Yet another implementation may include a system including memory and one or more processors operable to execute instructions, stored in the memory, to perform any of the methods described above.
Certain implementations of the technology disclosed were described above. It is, however, expressly noted that the technology disclosed is not limited to those implementations; rather, additions and modifications to what was expressly described herein are also included within the scope of the technology disclosed. For example, the techniques, devices, and systems described herein with reference to examples employing light waves are equally applicable to methods and systems employing other types of radiant energy waves, such as acoustical energy or the like. Moreover, it is to be understood that the features of the various implementations described herein are not mutually exclusive and can exist in various combinations and permutations, even if such combinations or permutations were not made express herein, without departing from the spirit and scope of the technology disclosed. In fact, variations, modifications, and other implementations of what is described herein will occur to those of ordinary skill in the art without departing from the spirit and the scope of the technology disclosed. As such, the technology disclosed is not to be defined only by the preceding illustrative description.
This application is a continuation of U.S. Ser. No. 15/625,856, entitled DETERMINING POSITIONAL INFORMATION OF AN OBJECT IN SPACE, filed 16 Jun. 2017, now U.S. Pat. No. 9,927,522, issued 27 Mar. 2018, which is a continuation of U.S. Ser. No. 14/214,605, entitled, “DETERMINING POSITIONAL INFORMATION OF AN OBJECT IN SPACE,” filed 14 Mar. 2014, now U.S. Pat. No. 9,702,977, issued 11 Jul. 2017 and which claims the benefit of three U.S. provisional Patent Applications, including: No. 61/801,479, entitled, “DETERMINING POSITIONAL INFORMATION FOR AN OBJECT IN SPACE,” filed 15 Mar. 2013; No. 61/792,025, entitled, “DETERMINING POSITIONAL INFORMATION FOR AN OBJECT IN SPACE,” filed 15 Mar. 2013; and No. 61/800,327, entitled, “DETERMINING POSITIONAL INFORMATION FOR AN OBJECT IN SPACE,” filed 15 Mar. 2013. The priority applications are hereby incorporated by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2665041 | Maffucci | Jan 1954 | A |
4175862 | DiMatteo et al. | Nov 1979 | A |
4876455 | Sanderson et al. | Oct 1989 | A |
4879659 | Bowlin et al. | Nov 1989 | A |
4893223 | Arnold | Jan 1990 | A |
5038258 | Koch et al. | Aug 1991 | A |
5134661 | Reinsch | Jul 1992 | A |
5282067 | Liu | Jan 1994 | A |
5434617 | Bianchi | Jul 1995 | A |
5454043 | Freeman | Sep 1995 | A |
5574511 | Yang et al. | Nov 1996 | A |
5581276 | Cipolla et al. | Dec 1996 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5659475 | Brown | Aug 1997 | A |
5691737 | Ito et al. | Nov 1997 | A |
5742263 | Wang et al. | Apr 1998 | A |
5900863 | Numazaki | May 1999 | A |
5940538 | Spiegel et al. | Aug 1999 | A |
6002808 | Freeman | Dec 1999 | A |
6031161 | Baltenberger | Feb 2000 | A |
6031661 | Tanaami | Feb 2000 | A |
6072494 | Nguyen | Jun 2000 | A |
6075895 | Qiao et al. | Jun 2000 | A |
6147678 | Kumar et al. | Nov 2000 | A |
6154558 | Hsieh | Nov 2000 | A |
6181343 | Lyons | Jan 2001 | B1 |
6184326 | Razavi et al. | Feb 2001 | B1 |
6184926 | Khosravi et al. | Feb 2001 | B1 |
6195104 | Lyons | Feb 2001 | B1 |
6204852 | Kumar et al. | Mar 2001 | B1 |
6252598 | Segen | Jun 2001 | B1 |
6263091 | Jain et al. | Jul 2001 | B1 |
6346933 | Lin | Feb 2002 | B1 |
6417970 | Travers et al. | Jul 2002 | B1 |
6463402 | Bennett et al. | Oct 2002 | B1 |
6492986 | Metaxas et al. | Dec 2002 | B1 |
6493041 | Hanko et al. | Dec 2002 | B1 |
6498628 | Iwamura | Dec 2002 | B2 |
6578203 | Anderson, Jr. et al. | Jun 2003 | B1 |
6603867 | Sugino et al. | Aug 2003 | B1 |
6661918 | Gordon et al. | Dec 2003 | B1 |
6674877 | Jojic et al. | Jan 2004 | B1 |
6702494 | Dumler et al. | Mar 2004 | B2 |
6734911 | Lyons | May 2004 | B1 |
6738424 | Allmen et al. | May 2004 | B1 |
6771294 | Pulli et al. | Aug 2004 | B1 |
6798628 | Macbeth | Sep 2004 | B1 |
6804654 | Kobylevsky et al. | Oct 2004 | B2 |
6804656 | Rosenfeld et al. | Oct 2004 | B1 |
6814656 | Rodriguez | Nov 2004 | B2 |
6819796 | Hong et al. | Nov 2004 | B2 |
6901170 | Terada et al. | May 2005 | B1 |
6919880 | Morrison et al. | Jul 2005 | B2 |
6950534 | Cohen et al. | Sep 2005 | B2 |
6993157 | Oue et al. | Jan 2006 | B1 |
7152024 | Marschner et al. | Dec 2006 | B2 |
7213707 | Hubbs et al. | May 2007 | B2 |
7215828 | Luo | May 2007 | B2 |
7244233 | Krantz et al. | Jul 2007 | B2 |
7257237 | Luck et al. | Aug 2007 | B1 |
7259873 | Sikora et al. | Aug 2007 | B2 |
7308112 | Fujimura et al. | Dec 2007 | B2 |
7340077 | Gokturk et al. | Mar 2008 | B2 |
7483049 | Aman et al. | Jan 2009 | B2 |
7519223 | Dehlin et al. | Apr 2009 | B2 |
7532206 | Morrison et al. | May 2009 | B2 |
7536032 | Bell | May 2009 | B2 |
7542586 | Johnson | Jun 2009 | B2 |
7598942 | Underkoffler et al. | Oct 2009 | B2 |
7606417 | Steinberg et al. | Oct 2009 | B2 |
7646372 | Marks et al. | Jan 2010 | B2 |
7656372 | Sato et al. | Feb 2010 | B2 |
7665041 | Wilson et al. | Feb 2010 | B2 |
7692625 | Morrison et al. | Apr 2010 | B2 |
7831932 | Josephsoon et al. | Nov 2010 | B2 |
7840031 | Albertson et al. | Nov 2010 | B2 |
7861188 | Josephsoon et al. | Dec 2010 | B2 |
7940885 | Stanton et al. | May 2011 | B2 |
7948493 | Klefenz et al. | May 2011 | B2 |
7961174 | Markovic et al. | Jun 2011 | B1 |
7961934 | Thrun et al. | Jun 2011 | B2 |
7971156 | Albertson et al. | Jun 2011 | B2 |
7980885 | Gattwinkel et al. | Jul 2011 | B2 |
8023698 | Niwa et al. | Sep 2011 | B2 |
8035624 | Bell et al. | Oct 2011 | B2 |
8045825 | Shimoyama et al. | Oct 2011 | B2 |
8064704 | Kim et al. | Nov 2011 | B2 |
8085339 | Marks | Dec 2011 | B2 |
8086971 | Radivojevic et al. | Dec 2011 | B2 |
8111239 | Pryor et al. | Feb 2012 | B2 |
8112719 | Hsu et al. | Feb 2012 | B2 |
8144233 | Fukuyama | Mar 2012 | B2 |
8185176 | Mangat et al. | May 2012 | B2 |
8213707 | Li et al. | Jul 2012 | B2 |
8218858 | Gu | Jul 2012 | B2 |
8229134 | Duraiswami et al. | Jul 2012 | B2 |
8235529 | Raffle et al. | Aug 2012 | B1 |
8244233 | Chang et al. | Aug 2012 | B2 |
8249345 | Wu et al. | Aug 2012 | B2 |
8270669 | Aichi et al. | Sep 2012 | B2 |
8289162 | Mooring et al. | Oct 2012 | B2 |
8290208 | Kurtz et al. | Oct 2012 | B2 |
8304727 | Lee et al. | Nov 2012 | B2 |
8319832 | Nagata et al. | Nov 2012 | B2 |
8363010 | Nagata | Jan 2013 | B2 |
8395600 | Kawashima et al. | Mar 2013 | B2 |
8432377 | Newton | Apr 2013 | B2 |
8471848 | Tschesnok | Jun 2013 | B2 |
8514221 | King et al. | Aug 2013 | B2 |
8553037 | Smith et al. | Oct 2013 | B2 |
8582809 | Halimeh et al. | Nov 2013 | B2 |
8593417 | Kawashima et al. | Nov 2013 | B2 |
8605202 | Muijs et al. | Dec 2013 | B2 |
8631355 | Murillo et al. | Jan 2014 | B2 |
8638989 | Holz | Jan 2014 | B2 |
8659594 | Kim et al. | Feb 2014 | B2 |
8659658 | Vassigh et al. | Feb 2014 | B2 |
8686943 | Rafii | Apr 2014 | B1 |
8693731 | Holz et al. | Apr 2014 | B2 |
8738523 | Sanchez et al. | May 2014 | B1 |
8744122 | Salgian et al. | Jun 2014 | B2 |
8768022 | Miga et al. | Jul 2014 | B2 |
8817087 | Weng et al. | Aug 2014 | B2 |
8842084 | Andersson et al. | Sep 2014 | B2 |
8843857 | Berkes et al. | Sep 2014 | B2 |
8872914 | Gobush | Oct 2014 | B2 |
8878749 | Wu et al. | Nov 2014 | B1 |
8891868 | Ivanchenko | Nov 2014 | B1 |
8907982 | Zontrop et al. | Dec 2014 | B2 |
8922590 | Luckett, Jr. et al. | Dec 2014 | B1 |
8929609 | Padovani et al. | Jan 2015 | B2 |
8930852 | Chen et al. | Jan 2015 | B2 |
8942881 | Hobbs et al. | Jan 2015 | B2 |
8954340 | Sanchez et al. | Feb 2015 | B2 |
8957857 | Lee et al. | Feb 2015 | B2 |
9014414 | Katano et al. | Apr 2015 | B2 |
9056396 | Linnell | Jun 2015 | B1 |
9070019 | Holz | Jun 2015 | B2 |
9119670 | Yang et al. | Sep 2015 | B2 |
9122354 | Sharma | Sep 2015 | B2 |
9124778 | Crabtree | Sep 2015 | B1 |
9459697 | Bedikian et al. | Oct 2016 | B2 |
9702977 | Holz | Jul 2017 | B2 |
9927522 | Holz | Mar 2018 | B2 |
20010044858 | Rekimoto | Nov 2001 | A1 |
20010052985 | Ono | Dec 2001 | A1 |
20020008139 | Albertelli | Jan 2002 | A1 |
20020008211 | Kask | Jan 2002 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020080094 | Biocca et al. | Jun 2002 | A1 |
20020105484 | Navab et al. | Aug 2002 | A1 |
20030053658 | Pavlidis | Mar 2003 | A1 |
20030053659 | Pavlidis et al. | Mar 2003 | A1 |
20030081141 | Mazzapica | May 2003 | A1 |
20030123703 | Pavlidis et al. | Jul 2003 | A1 |
20030152289 | Luo | Aug 2003 | A1 |
20030202697 | Simard et al. | Oct 2003 | A1 |
20040103111 | Miller et al. | May 2004 | A1 |
20040125228 | Dougherty | Jul 2004 | A1 |
20040125984 | Ito et al. | Jul 2004 | A1 |
20040145809 | Brenner | Jul 2004 | A1 |
20040155877 | Hong et al. | Aug 2004 | A1 |
20040212725 | Raskar | Oct 2004 | A1 |
20050007673 | Chaoulov et al. | Jan 2005 | A1 |
20050068518 | Baney et al. | Mar 2005 | A1 |
20050094019 | Grosvenor et al. | May 2005 | A1 |
20050131607 | Breed | Jun 2005 | A1 |
20050156888 | Xie et al. | Jul 2005 | A1 |
20050168578 | Gobush | Aug 2005 | A1 |
20050236558 | Nabeshima et al. | Oct 2005 | A1 |
20050238201 | Shamaie | Oct 2005 | A1 |
20060017807 | Lee et al. | Jan 2006 | A1 |
20060028656 | Venkatesh et al. | Feb 2006 | A1 |
20060029296 | King et al. | Feb 2006 | A1 |
20060034545 | Mattes et al. | Feb 2006 | A1 |
20060050979 | Kawahara | Mar 2006 | A1 |
20060072105 | Wagner | Apr 2006 | A1 |
20060098899 | King et al. | May 2006 | A1 |
20060204040 | Freeman et al. | Sep 2006 | A1 |
20060210112 | Cohen et al. | Sep 2006 | A1 |
20060262421 | Matsumoto et al. | Nov 2006 | A1 |
20060290950 | Platt et al. | Dec 2006 | A1 |
20070014466 | Baldwin | Jan 2007 | A1 |
20070042346 | Weller | Feb 2007 | A1 |
20070076224 | Alexander | Apr 2007 | A1 |
20070086621 | Aggarwal et al. | Apr 2007 | A1 |
20070130547 | Boillot | Jun 2007 | A1 |
20070206719 | Suryanarayanan et al. | Sep 2007 | A1 |
20070230929 | Niwa et al. | Oct 2007 | A1 |
20070238956 | Haras et al. | Oct 2007 | A1 |
20080013826 | Hillis et al. | Jan 2008 | A1 |
20080019576 | Senftner et al. | Jan 2008 | A1 |
20080030429 | Hailpern et al. | Feb 2008 | A1 |
20080031492 | Lanz | Feb 2008 | A1 |
20080056752 | Denton et al. | Mar 2008 | A1 |
20080064954 | Adams et al. | Mar 2008 | A1 |
20080106637 | Nakao et al. | May 2008 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080110994 | Knowles et al. | May 2008 | A1 |
20080118091 | Serfaty et al. | May 2008 | A1 |
20080126937 | Pachet | May 2008 | A1 |
20080187175 | Kim et al. | Aug 2008 | A1 |
20080244468 | Nishihara et al. | Oct 2008 | A1 |
20080246759 | Summers | Oct 2008 | A1 |
20080273764 | Scholl | Nov 2008 | A1 |
20080278589 | Thorn | Nov 2008 | A1 |
20080291160 | Rabin | Nov 2008 | A1 |
20080304740 | Sun et al. | Dec 2008 | A1 |
20080319356 | Cain et al. | Dec 2008 | A1 |
20090002489 | Yang et al. | Jan 2009 | A1 |
20090093307 | Miyaki | Apr 2009 | A1 |
20090102840 | Li | Apr 2009 | A1 |
20090103780 | Nishihara et al. | Apr 2009 | A1 |
20090116742 | Nishihara | May 2009 | A1 |
20090122146 | Zalewski et al. | May 2009 | A1 |
20090153655 | Ike et al. | Jun 2009 | A1 |
20090203993 | Mangat et al. | Aug 2009 | A1 |
20090203994 | Mangat et al. | Aug 2009 | A1 |
20090217211 | Hildreth et al. | Aug 2009 | A1 |
20090257623 | Tang et al. | Oct 2009 | A1 |
20090274339 | Cohen et al. | Nov 2009 | A9 |
20090309710 | Kakinami | Dec 2009 | A1 |
20100013832 | Xiao et al. | Jan 2010 | A1 |
20100020078 | Shpunt | Jan 2010 | A1 |
20100023015 | Park | Jan 2010 | A1 |
20100026963 | Faulstich | Feb 2010 | A1 |
20100027845 | Kim et al. | Feb 2010 | A1 |
20100046842 | Conwell | Feb 2010 | A1 |
20100053164 | Imai et al. | Mar 2010 | A1 |
20100053209 | Rauch et al. | Mar 2010 | A1 |
20100053612 | Ou-Yang et al. | Mar 2010 | A1 |
20100058252 | Ko | Mar 2010 | A1 |
20100066737 | Liu | Mar 2010 | A1 |
20100066975 | Rehnstrom | Mar 2010 | A1 |
20100091110 | Hildreth | Apr 2010 | A1 |
20100118123 | Freedman et al. | May 2010 | A1 |
20100121189 | Ma et al. | May 2010 | A1 |
20100125815 | Wang et al. | May 2010 | A1 |
20100127995 | Rigazio et al. | May 2010 | A1 |
20100141762 | Siann et al. | Jun 2010 | A1 |
20100158372 | Kim et al. | Jun 2010 | A1 |
20100177929 | Kurtz et al. | Jul 2010 | A1 |
20100194863 | Lopes et al. | Aug 2010 | A1 |
20100199230 | Latta et al. | Aug 2010 | A1 |
20100199232 | Mistry et al. | Aug 2010 | A1 |
20100201880 | Iwamura | Aug 2010 | A1 |
20100208942 | Porter et al. | Aug 2010 | A1 |
20100219934 | Matsumoto | Sep 2010 | A1 |
20100222102 | Rodriguez | Sep 2010 | A1 |
20100245289 | Svajda | Sep 2010 | A1 |
20100264833 | Van Endert et al. | Oct 2010 | A1 |
20100277411 | Yee et al. | Nov 2010 | A1 |
20100296698 | Lien et al. | Nov 2010 | A1 |
20100302015 | Kipman et al. | Dec 2010 | A1 |
20100302357 | Hsu et al. | Dec 2010 | A1 |
20100303298 | Marks et al. | Dec 2010 | A1 |
20100306712 | Snook et al. | Dec 2010 | A1 |
20100309097 | Raviv et al. | Dec 2010 | A1 |
20110007072 | Khan et al. | Jan 2011 | A1 |
20110025818 | Gallmeier et al. | Feb 2011 | A1 |
20110026765 | Ivanich et al. | Feb 2011 | A1 |
20110043806 | Guetta et al. | Feb 2011 | A1 |
20110057875 | Shigeta et al. | Mar 2011 | A1 |
20110066984 | Li | Mar 2011 | A1 |
20110080470 | Kuno et al. | Apr 2011 | A1 |
20110080490 | Clarkson et al. | Apr 2011 | A1 |
20110093820 | Zhang et al. | Apr 2011 | A1 |
20110107216 | Bi | May 2011 | A1 |
20110115486 | Frohlich et al. | May 2011 | A1 |
20110116684 | Coffman et al. | May 2011 | A1 |
20110119640 | Berkes et al. | May 2011 | A1 |
20110134112 | Koh et al. | Jun 2011 | A1 |
20110148875 | Kim et al. | Jun 2011 | A1 |
20110169726 | Holmdahl et al. | Jul 2011 | A1 |
20110173574 | Clavin et al. | Jul 2011 | A1 |
20110176146 | Alvarez Diez et al. | Jul 2011 | A1 |
20110181509 | Rautiainen et al. | Jul 2011 | A1 |
20110193778 | Lee et al. | Aug 2011 | A1 |
20110205151 | Newton et al. | Aug 2011 | A1 |
20110213664 | Osterhout et al. | Sep 2011 | A1 |
20110228978 | Chen et al. | Sep 2011 | A1 |
20110234840 | Klefenz et al. | Sep 2011 | A1 |
20110243451 | Oyaizu | Oct 2011 | A1 |
20110251896 | Impollonia et al. | Oct 2011 | A1 |
20110261178 | Lo et al. | Oct 2011 | A1 |
20110267259 | Tidemand et al. | Nov 2011 | A1 |
20110279397 | Rimon et al. | Nov 2011 | A1 |
20110286676 | El Dokor | Nov 2011 | A1 |
20110289455 | Reville et al. | Nov 2011 | A1 |
20110289456 | Reville et al. | Nov 2011 | A1 |
20110291925 | Israel et al. | Dec 2011 | A1 |
20110291988 | Bamji et al. | Dec 2011 | A1 |
20110296353 | Ahmed et al. | Dec 2011 | A1 |
20110299737 | Wang et al. | Dec 2011 | A1 |
20110304600 | Yoshida | Dec 2011 | A1 |
20110304650 | Campillo et al. | Dec 2011 | A1 |
20110310007 | Margolis et al. | Dec 2011 | A1 |
20110310220 | McEldowney | Dec 2011 | A1 |
20110314427 | Sundararajan | Dec 2011 | A1 |
20120038637 | Marks | Feb 2012 | A1 |
20120050157 | Latta et al. | Mar 2012 | A1 |
20120065499 | Chono | Mar 2012 | A1 |
20120068914 | Jacobsen et al. | Mar 2012 | A1 |
20120092254 | Wong et al. | Apr 2012 | A1 |
20120113316 | Ueta et al. | May 2012 | A1 |
20120159380 | Kocienda et al. | Jun 2012 | A1 |
20120163675 | Joo et al. | Jun 2012 | A1 |
20120194517 | Izadi et al. | Aug 2012 | A1 |
20120204133 | Guendelman et al. | Aug 2012 | A1 |
20120223959 | Lengeling | Sep 2012 | A1 |
20120236288 | Stanley | Sep 2012 | A1 |
20120250936 | Holmgren | Oct 2012 | A1 |
20120270654 | Padovani et al. | Oct 2012 | A1 |
20120274781 | Shet et al. | Nov 2012 | A1 |
20120281873 | Brown et al. | Nov 2012 | A1 |
20120293667 | Baba et al. | Nov 2012 | A1 |
20120314030 | Datta et al. | Dec 2012 | A1 |
20120320080 | Giese et al. | Dec 2012 | A1 |
20130019204 | Kotler et al. | Jan 2013 | A1 |
20130038694 | Nichani et al. | Feb 2013 | A1 |
20130044951 | Cherng et al. | Feb 2013 | A1 |
20130050425 | Im et al. | Feb 2013 | A1 |
20130086531 | Sugita et al. | Apr 2013 | A1 |
20130097566 | Berglund | Apr 2013 | A1 |
20130120319 | Givon | May 2013 | A1 |
20130148852 | Partis et al. | Jun 2013 | A1 |
20130182079 | Holz | Jul 2013 | A1 |
20130182897 | Holz | Jul 2013 | A1 |
20130187952 | Berkovich et al. | Jul 2013 | A1 |
20130191911 | Dellinger et al. | Jul 2013 | A1 |
20130208948 | Berkovich et al. | Aug 2013 | A1 |
20130222640 | Baek et al. | Aug 2013 | A1 |
20130239059 | Chen et al. | Sep 2013 | A1 |
20130241832 | Rimon et al. | Sep 2013 | A1 |
20130252691 | Alexopoulos | Sep 2013 | A1 |
20130257736 | Hou et al. | Oct 2013 | A1 |
20130258140 | Lipson et al. | Oct 2013 | A1 |
20130271397 | MacDougall et al. | Oct 2013 | A1 |
20130300831 | Mavromatis et al. | Nov 2013 | A1 |
20130307935 | Rappel et al. | Nov 2013 | A1 |
20130321265 | Bychkov et al. | Dec 2013 | A1 |
20140002365 | Ackley et al. | Jan 2014 | A1 |
20140010441 | Shamaie | Jan 2014 | A1 |
20140064566 | Shreve et al. | Mar 2014 | A1 |
20140081521 | Frojdh et al. | Mar 2014 | A1 |
20140085203 | Kobayashi | Mar 2014 | A1 |
20140125775 | Holz | May 2014 | A1 |
20140125813 | Holz | May 2014 | A1 |
20140132738 | Ogura et al. | May 2014 | A1 |
20140134733 | Wu et al. | May 2014 | A1 |
20140139425 | Sakai | May 2014 | A1 |
20140139641 | Holz | May 2014 | A1 |
20140157135 | Lee et al. | Jun 2014 | A1 |
20140161311 | Kim | Jun 2014 | A1 |
20140168062 | Katz et al. | Jun 2014 | A1 |
20140176420 | Zhou et al. | Jun 2014 | A1 |
20140177913 | Holz | Jun 2014 | A1 |
20140189579 | Rimon et al. | Jul 2014 | A1 |
20140192024 | Holz | Jul 2014 | A1 |
20140201666 | Bedikian et al. | Jul 2014 | A1 |
20140201689 | Bedikian et al. | Jul 2014 | A1 |
20140222385 | Muenster et al. | Aug 2014 | A1 |
20140223385 | Ton et al. | Aug 2014 | A1 |
20140225826 | Juni | Aug 2014 | A1 |
20140240215 | Tremblay et al. | Aug 2014 | A1 |
20140240225 | Eilat | Aug 2014 | A1 |
20140248950 | Bautista | Sep 2014 | A1 |
20140253512 | Narikawa et al. | Sep 2014 | A1 |
20140253785 | Chan et al. | Sep 2014 | A1 |
20140267098 | Na et al. | Sep 2014 | A1 |
20140282282 | Holz | Sep 2014 | A1 |
20140307920 | Holz | Oct 2014 | A1 |
20140344762 | Grasset et al. | Nov 2014 | A1 |
20140364209 | Perry | Dec 2014 | A1 |
20140364212 | Osman et al. | Dec 2014 | A1 |
20140369558 | Holz | Dec 2014 | A1 |
20140375547 | Katz et al. | Dec 2014 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009149 | Gharib et al. | Jan 2015 | A1 |
20150016777 | Abovitz et al. | Jan 2015 | A1 |
20150022447 | Hare et al. | Jan 2015 | A1 |
20150029091 | Nakashima et al. | Jan 2015 | A1 |
20150084864 | Geiss et al. | Mar 2015 | A1 |
20150097772 | Starner | Apr 2015 | A1 |
20150115802 | Kuti et al. | Apr 2015 | A1 |
20150116214 | Grunnet-Jepsen et al. | Apr 2015 | A1 |
20150131859 | Kim et al. | May 2015 | A1 |
20150172539 | Neglur | Jun 2015 | A1 |
20150193669 | Gu et al. | Jul 2015 | A1 |
20150205358 | Lyren | Jul 2015 | A1 |
20150205400 | Hwang et al. | Jul 2015 | A1 |
20150206321 | Scavezze et al. | Jul 2015 | A1 |
20150227795 | Starner et al. | Aug 2015 | A1 |
20150234569 | Hess | Aug 2015 | A1 |
20150253428 | Holz | Sep 2015 | A1 |
20150258432 | Stafford et al. | Sep 2015 | A1 |
20150261291 | Mikhailov et al. | Sep 2015 | A1 |
20150304593 | Sakai | Oct 2015 | A1 |
20150323785 | Fukata et al. | Nov 2015 | A1 |
20160062573 | Dascola et al. | Mar 2016 | A1 |
20160086046 | Holz et al. | Mar 2016 | A1 |
20160093105 | Rimon et al. | Mar 2016 | A1 |
Number | Date | Country |
---|---|---|
1984236 | Jun 2007 | CN |
201332447 | Oct 2009 | CN |
101729808 | Jun 2010 | CN |
101930610 | Dec 2010 | CN |
101951474 | Jan 2011 | CN |
102053702 | May 2011 | CN |
201859393 | Jun 2011 | CN |
102201121 | Sep 2011 | CN |
102236412 | Nov 2011 | CN |
4201934 | Jul 1993 | DE |
10326035 | Jan 2005 | DE |
102007015495 | Oct 2007 | DE |
102007015497 | Jan 2014 | DE |
0999542 | May 2000 | EP |
1477924 | Nov 2004 | EP |
1837665 | Sep 2007 | EP |
2369443 | Sep 2011 | EP |
2419433 | Apr 2006 | GB |
2480140 | Nov 2011 | GB |
2519418 | Apr 2015 | GB |
H02236407 | Sep 1990 | JP |
H08261721 | Oct 1996 | JP |
H09259278 | Oct 1997 | JP |
2000023038 | Jan 2000 | JP |
2002-133400 | May 2002 | JP |
2003256814 | Sep 2003 | JP |
2004246252 | Sep 2004 | JP |
2006019526 | Jan 2006 | JP |
2006259829 | Sep 2006 | JP |
2007272596 | Oct 2007 | JP |
2008227569 | Sep 2008 | JP |
2009031939 | Feb 2009 | JP |
2009037594 | Feb 2009 | JP |
2010-060548 | Mar 2010 | JP |
2011010258 | Jan 2011 | JP |
2011065652 | Mar 2011 | JP |
2011-107681 | Jun 2011 | JP |
4906960 | Mar 2012 | JP |
2012-527145 | Nov 2012 | JP |
101092909 | Jun 2011 | KR |
2422878 | Jun 2011 | RU |
200844871 | Nov 2008 | TW |
1994026057 | Nov 1994 | WO |
2004114220 | Dec 2004 | WO |
2006020846 | Feb 2006 | WO |
2007137093 | Nov 2007 | WO |
2010007662 | Jan 2010 | WO |
2010032268 | Mar 2010 | WO |
2010076622 | Jul 2010 | WO |
2010088035 | Aug 2010 | WO |
2010138741 | Dec 2010 | WO |
2011024193 | Mar 2011 | WO |
2011036618 | Mar 2011 | WO |
2011044680 | Apr 2011 | WO |
2011045789 | Apr 2011 | WO |
2011119154 | Sep 2011 | WO |
2012027422 | Mar 2012 | WO |
2013109608 | Jul 2013 | WO |
2013109609 | Jul 2013 | WO |
2014208087 | Dec 2014 | WO |
2015026707 | Feb 2015 | WO |
Entry |
---|
Heikkila, J., “Accurate Camera Calibration and Feature Based 3-D Reconstruction from Monocular Image Sequences”, Infotech Oulu and Department of Electrical Engineering, University of Oulu, 1997, 126 pages. |
PCT/US2013/021709—International Preliminary Report on Patentability dated Jul. 22, 2014, 22 pages. |
U.S. Appl. No. 13/414,485—Office Action dated Nov. 4, 29 pages. |
U.S. Appl. No. 15/253,741—Office Action dated Jan. 13, 2014, 53 pages. |
U.S. Appl. No. 14/123,370—Office Action dated Jan. 13, 2017, 33 pages. |
Gorce et al., “Model-Based 3D Hand Pose Estimation from Monocular Video”, Feb. 24, 2011 [retrieved Jul. 15, 2016], IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, Issue: 9, pp. 1793-1805. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5719617&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5719617>. |
Oka et al., “Real-Time Fingertip Tracking and Gesture Recognition”, Nov./Dec. 2002 [retrieved Jul. 15, 2016], IEEE Computer Graphics and Applications, vol. 22, Issue: 6, pp. 64-71. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=1046630&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D1046630>. |
JP 2010-060548 A—Japanese Patent with English Abstract filed Mar. 18, 2010, 19 pages. |
JP 2011-107681 A—Japanese Patent with English Abstract filed Jun. 2, 2011, 16 pages. |
JP 2012-527145 A—Japanese Patent with English Abstract filed Nov. 1, 2012, 30 pages. |
Pedersini, et al., Accurate Surface Reconstruction from Apparent Contours, Sep. 5-8, 2000 European Signal Processing Conference EUSIPCO 2000, vol. 4, Retrieved from the Internet: http://home.deib.polimi.it/sarti/CV_and_publications.html, pp. 1-4. |
PCT/US2014/028265, Application (Determining Positional Information for an Object in Space), May 9, 2014, 117 pages. |
VCNL4020 Vishay Semiconductors. Datasheet [online]. Vishay Intertechnology, Inc, Doc No. 83476, Rev. 1.3, Oct. 29, 2013 [retrieved Mar. 4, 2014]. Retrieved from the Internet: <www.vishay.com>. 16 pages. |
VCNL4020 Vishay Semiconductors. Application Note [online]. Designing VCNL4020 into an Application. Vishay Intertechnology, Inc, Doc No. 84136, Revised May 22, 2012 [retrieved Mar. 4, 2014]. Retrieved from the Internet: <www.vishay.com>. 21 pages. |
Schaar, R., VCNL4020 Vishay Semiconductors. Application Note [online]. Extended Detection Range with VCNL Family of Proximity Sensor Vishay Intertechnology, Inc, Doc No. 84225, Revised Oct. 25, 2013 [retrieved Mar. 4, 2014]. Retrieved from the Internet: <www.vishay.com>. 4 pages. |
PCT/US2014/028265, International Search Report and Written Opinion, dated Jan. 7, 2015, 15 pages. |
PCT/US2013/021713, International Search Report, dated Sep. 11, 2013, 7 pages (WO 2013/109609). |
Olsson, K., et al., “Shape from Silhouette Scanner—Creating a Digital 3D Model of a Real Object by Analyzing Photos From Multiple Views,” University of Linkoping, Sweden, Copyright VCG 2001, Retrieved from the Internet: <http://liu.diva-portal.org/smash/get/diva2:18671/FULLTEXT01> on Jun. 17, 2013, 52 pages. |
Forbes, K., et al., “Using Silhouette Consistency Constraints to Build 3D Models,” University of Cape Town, Copyright De Beers 2003, Retrieved from the Internet: <http://www.dip.ee.uct.ac.za/˜kforbes/Publications/Forbes2003Prasa.pdf> on Jun. 17, 2013, 6 pages. |
Cumani, A., et al., “Recovering the 3D Structure of Tubular Objects from Stereo Silhouettes,” Pattern Recognition, Elsevier, GB, vol. 30, No. 7, Jul. 1, 1997, 9 pages. |
May, S., et al., “Robust 3D-Mapping with Time-of-Flight Cameras,” 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, Piscataway, NJ, USA, Oct. 10, 2009, pp. 1673-1678. |
Kanhangad, V., et al., “A Unified Framework for Contactless Hand Verification,” IEEE Transactions on Information Forensics and Security, IEEE, Piscataway, NJ, US., vol. 6, No. 3, Sep. 1, 2011, pp. 1014-1027. |
Di Zenzo, S., et al., “Advances in Image Segmentation,” Image and Vision Computing, Elsevier, Guildford, GBN, vol. 1, No. 1, Copyright Butterworth & Co Ltd., Nov. 1, 1983, pp. 196-210. |
PCT/US2013/021709, International Preliminary Report on Patentability and Written Opinion, dated Sep. 12, 2013, 22 pages (WO 2013/109608). |
Butail, S., et al., “Three-Dimensional Reconstruction of the Fast-Start Swimming Kinematics of Densely Schooling Fish,” Journal of the Royal Society Interface, Jun. 3, 2011, retrieved from the Internet <http://www.ncbi.nlm.nih.gov/pubmed/21642367>, pp. 0, 1-12. |
Kim, et al., “Development of an Orthogonal Double-Image Processing Algorithm to Measure Bubble,” Department of Nuclear Engineering and Technology, Seoul National University Korea, vol. 39 No. 4, Published Jul. 6, 2007, pp. 313-326. |
Kulesza, et al., “Arrangement of a Multi Stereo Visual Sensor System for a Human Activities Space,” Source: Stereo Vision, Book edited by: Dr. Asim Bhatti, ISBN 978-953-7619-22-0, Copyright Nov. 2008, I-Tech, Vienna, Austria, www.intechopen.com, pp. 153-173. |
Arthington, et al., “Cross-section Reconstruction During Uniaxial Loading,” Measurement Science and Technology, vol. 20, No. 7, Jun. 10, 2009, Retrieved from the Internet: http://iopscience.iop.org/0957-0233/20/7/075701, pp. 1-9. |
Chung, et al., “International Journal of Computer Vision: Recovering LSHGCs and SHGCs from Stereo” [on-line], Oct. 1996 [retrieved on Apr. 10, 2014], Kluwer Academic Publishers, vol. 20, issue 1-2, Retrieved from the Internet: http://link.springer.com/article/10.1007/BF00144116#, pp. 43-58. |
Bardinet, et al., “Fitting of iso-Surfaces Using Superquadrics and Free-Form Deformations” [on-line], Jun. 24-25, 1994 [retrieved Jan. 9, 2014], 1994 Proceedings of IEEE Workshop on Biomedical Image Analysis, Retrieved from the Internet: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=315882&tag=1, pp. 184-193. |
Dombeck, D., et al., “Optical Recording of Action Potentials with Second-Harmonic Generation Microscopy,” The Journal of Neuroscience, Jan. 28, 2004, vol. 24(4): pp. 999-1003. |
Chinese Patent with English Abstract, CN 1984236 (A), Jun. 20, 2007, “Method for Collecting Characteristics in Telecommunication Flow Information Video Detection,” 17 pages. |
Chinese Patent with English Abstract, CN 101729808 (A), Jun. 9, 2010, “Remote Control Method for Television and System for Remotely Controlling Television by Same,” 19 pages. |
Chinese Patent with English Abstract, CN 101930610 (A), Dec. 29, 2010, “Method for Detecting Moving Object Using Adaptable Background Model,” 26 pages. |
Chinese Patent with English Abstract, CN 101951474 (A), Jan. 19, 2011, “Television Technology Based on Gesture Control,” 7 pages. |
Chinese Patent with English Abstract, CN 102201121 (A), Sep. 28, 2011, “System and Method for Detecting Article in Video Scene,” 12 pages. |
Chinese Patent with English Abstract, CN 201859393 (U), Jun. 8, 2011, “Three-Dimensional Gesture Recognition Box,” 6 pages. |
Chinese Patent with English Abstract, CN 201332447 (Y), Oct. 21, 2009, “Television for Controlling or Operating Game Through Gesture Change,” 8 pages. |
Germany Patent with English Abstract, DE 4201934 (A1), Jul. 29, 1993, “Interactive Computer System e.g. Mouse with Hand Gesture Controlled Operation—Has 2 or 3 Dimensional User Surface That Allows One or Two Hand Input Control of Computer,” 5 pages. |
Japanese Patent with English Abstract, JP 2006019526 (A), Jan. 19, 2006, “Optical Element, Package Substrate, and Device for Optical Communication,” 64 pages. |
Japanese Patent with English Abstract, JP 2009031939 (A), Feb. 12, 2009, “Image Processing Apparatus, Method and Program,” 30 pages. |
Japanese Patent with English Abstract, JP 2009037594 (A), Feb. 19, 2009, “System and Method for Constructing Three-Dimensional Image Using Camera-Based Gesture Input,” 21 pages. |
Japanese Patent with English Abstract, JP 2011065652 (A), Mar. 31, 2011, “Sign Based Man-Machine Interaction,” 29 pages. |
Japanese Patent with English Abstract, JP 4906960 (B2), Mar. 28, 2012, “Method and Apparatus for Performing Interaction Between Mobile Device and Screen,” 20 pages. |
Russian Patent with English Abstract, RU 2422878 (C1), Jun. 27, 2011, “Method of Controlling Television Using Multimodal Interface,” 17 pages. |
Taiwan Patent with English Abstract, TW 200844871 (A), Nov. 16, 2008, “Controlling Resource Access Based on User Gesturing in a 3D Captured Image Stream of the User,” 75 pages. |
Chinese Patent with English Abstract, CN 102236412 (A), Nov. 9, 2011, “Three Dimensional Gesture Recognition System and Vision-Based Gesture Recognition Method,” 17 pages. |
U.S. Appl. No. 13/742,845, Non-Final Office Action, dated Jul. 22, 2013, 21 pages (U.S. Pat. No. 8,693,731). |
U.S. Appl. No. 13/742,845, Notice of Allowance, dated Dec. 5, 2013, 7 pages (U.S. Pat. No. 8,693,731). |
U.S. Appl. No. 13/742,845, Issue Notification, dated Mar. 19, 2014, 1 page (U.S. Pat. No. 8,693,731). |
U.S. Appl. No. 13/742,953—Office Action dated Jun. 14, 2013, 13 pages (non HBW). |
U.S. Appl. No. 13/742,953—Notice of Allowance dated Nov. 4, 2013, 14 pages (non HBW). |
U.S. Appl. No. 13/742,953, Issue Notification, dated Jan. 8, 2014, 1 page (U.S. Pat. No. 8,638,989). |
PCT/US2013/021713—International Preliminary Report on Patentability dated Jul. 22, 2014, 13 pages, (WO 2013/109609). |
Chinese Patent with English Abstract, CN 102053702 (A), May 11, 2011, “Dynamic Gesture Control System and Method,” 8 pages. |
U.S. Appl. No. 13/742,953, Notice of Allowance, dated Nov. 4, 2013, 9 pages (U.S. Pat. No. 8,638,989). |
Germany Patent with English Abstract, DE 102007015497 (B4), Jan. 23, 2014, “Speech Recognition Device for Use in Vehicle, Has Memory Medium for Storing Dictionary Data Structured in Tree, Where Data Contain Multiple Words as Node in Tree, and Reverse-Speech-Comparison Unit Comparing Reverse Language,” 29 pages. |
Korea Patent with English Abstract, KR 101092909 (B1), Dec. 6, 2011, “Gesture Interactive Hologram Display Apparatus and Method,” 11 pages. |
Chung, et al., “Recovering LSHGCs and SHGCs from Stereo,” International Journal of Computer Vision, vol. 20, No. 1/2, 1996, pp. 43-58. |
Germany Patent with English Abstract, DE 102007015495 (A1), Oct. 4, 2007, “Control object e.g. Driver's Finger, Detection Device e.g. Vehicle Navigation System, Has Illumination Section to Illuminate One Surface of Object, and Controller to Control Illumination of Illumination and Image Recording Sections,” 17 pages. |
PCT/JP2008/062732, WO English Abstract with Japanese Publication of WO 2010/007662 A1, “Heat-Resistant Cushion material for Forming Press,” Jan. 21, 2010, Ichikawa Co Ltd, 35 pages. |
U.S. Appl. No. 14/214,605, Non Final Office Action dated Jul. 8, 2015, 38 pages. |
U.S. Appl. No. 14/214,605, Final Office Action dated Jan. 29, 2016, 11 pages. |
U.S. Appl. No. 14/280,018—Office Action dated Feb. 12, 2016, 38 pages. |
PCT/US2013/021713—International Search Report and Written Opinion dated Sep. 11, 2013, 18 pages. |
Barat et al., “Feature Correspondences From Multiple Views of Coplanar Ellipses”, 2nd International Symposium on Visual Computing, Author Manuscript, 2006, 10 pages. |
Cheikh et al., “Multipeople Tracking Across Multiple Cameras”, International Journal on New Computer Architectures and Their Applications (IJNCAA), vol. 2, No. 1, 2012, pp. 23-33. |
Davis et al., “Toward 3-D Gesture Recognition”, International Journal of Pattern Recognition and Artificial Intelligence, vol. 13, No. 03, 1999, pp. 381-393. |
Heikkila, J., “Accurate Camera Calibration and Feature Based 3-D Reconstruction from Monocular Image Sequences”, Infotech Oulu and Department of Electrical Engineering, University of Oulu, 1997, 126 pages. |
Rasmussen, Matthew K., “An Analytical Framework for the Preparation and Animation of a Virtual Mannequin for the Purpose of Mannequin-Clothing Interaction Modeling”, A Thesis Submitted in Partial Fulfillment of the Requirements for the Master of Science Degree in Civil and Environmental Engineering in the Graduate College of the University of Iowa, Dec. 2008, 98 pages. |
Zenzo et al., “Advances in Image Segmentation,” Image and Vision Computing, Elsevier, Guildford, GB, Nov. 1, 1983, pp. 196-210. |
U.S. Appl. No. 14/214,605—Office Action dated May 3, 2016, 12 pages. |
U.S. Appl. No. 14/212,485—Office Action dated Jul. 28, 2016, 44 pages. |
U.S. Appl. No. 14/214,605—Final Office Action dated Sep. 8, 2016, 16 pages. |
U.S. Appl. No. 14/214,605—Response to Office Action dated May 3, 2016 filed Aug. 3, 2016, 14 pages. |
U.S. Appl. No. 14/280,018—Replacement Response to Feb. 12, 2016 Office Action filed Jun. 8, 2016, 16 pages. |
U.S. Appl. No. 14/280,018—Notice of Allowance dated Sep. 7, 2016, 7 pages. |
U.S. Appl. No. 14/280,018—Response to Feb. 12, 2016 Office Action filed May 12, 2016, 15 pages. |
PCT/US2013/021709—International Search Report and Written Opinion dated Sep. 12, 2013, 22 pages. |
PCT/US2013/021713—International Search Report and Written Opinion dated Sep. 11, 2013, 7 pages. |
U.S. Appl. No. 13/742,845—Office Action dated Jul. 22, 2013, 19 pages. |
U.S. Appl. No. 13/742,845—Notice of Allowance dated Dec. 5, 2013, 11 pages. |
U.S. Appl. No. 13/742,953—Office Action dated Jun. 14, 2013, 13 pages. |
U.S. Appl. No. 13/742,953—Notice of Allowance dated Nov. 4, 2013, 14 pages. |
PCT/US2013/021709—International Preliminary Report on Patentability dated Jul. 22, 2014, 22 pages (WO 2013/109608). |
U.S. Appl. No. 13/414,485—Office Action dated May 19, 2014, 16 pages. |
U.S. Appl. No. 13/414,485—Final Office Action dated Feb. 12, 2015, 30 pages. |
U.S. Appl. No. 14/106,148—Office Action dated Jul. 6, 2015, 12 pages. |
PCT/US2013/069231—International Search Report and Written Opinion dated Mar. 13, 2014, 7 pages. |
U.S. Appl. No. 13/744,810—Office Action dated Jun. 7, 2013, 15 pages. |
U.S. Appl. No. 13/744,810—Final Office Action dated Dec. 16, 2013, 18 pages. |
Texas Instruments, “QVGA 3D Time-of-Flight Sensor,” Product Overview: OPT 8140, Dec. 2013, Texas Instruments Incorporated, 10 pages. |
Texas Instruments, “4-Channel, 12-Bit, 80-MSPS ADC,” VSP5324, Revised Nov. 2012, Texas Instruments Incorporated, 55 pages. |
Texas Instruments, “Time-of-Flight Controller (TFC),” Product Overview; OPT9220, Jan. 2014, Texas Instruments Incorporated, 43 pages. |
PCT/US2013/069231—International Preliminary Report with Written Opinion dated May 12, 2015, 8 pages. |
U.S. Appl. No. 14/250,758—Office Action dated Jul. 6, 2015, 8 pages. |
U.S. Appl. No. 13/414,485—Office Action dated Jul. 30, 2015, 22 pages. |
U.S. Appl. No. 14/106,148—Notice of Allowance dated Dec. 2, 2015, 41 pages. |
CN 2013800122765—Office Action dated Nov. 2, 2015, 17 pages. |
U.S. Appl. No. 14/959,880—Notice of Allowance dated Mar. 2, 2016, 51 pages. |
Matsuyama et al. “Real-Time Dynamic 3-D Object Shape Reconstruction and High-Fidelity Texture Mapping for 3-D Video,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 3, Mar. 2004, pp. 357-369. |
Fukui et al. “Multiple Object Tracking System with Three Level Continuous Processes” IEEE, 1992, pp. 19-27. |
U.S. Appl. No. 14/250,758—Final Office Action dated Mar. 10, 2016, 10 pages. |
Mendez, et al., “Importance Masks for Revealing Occluded Objects in Augmented Reality,” Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology, 2 pages, ACM, 2009. |
U.S. Appl. No. 13/414,485—Office Action dated Apr. 21, 2016, 24 pages. |
U.S. Appl. No. 14/710,512—Notice of Allowance dated Apr. 28, 2016, 25 pages. |
U.S. Appl. No. 14/959,891—Office Action dated Apr. 11, 2016, 47 pages. |
U.S. Appl. No. 14/250,758—Response to Final Office Action dated Mar. 10, 2016 filed May 5, 2016, 12 pages. |
U.S. Appl. No. 14/106,148—Response to Office Action dated Jul. 6, 2015 filed Nov. 6, 2015, 41 pages. |
U.S. Appl. No. 14/959,880—Notice of Allowance dated Jul. 12, 2016, 22 pages. |
U.S. Appl. No. 14/106,148—Notice of Allowance dated Jul. 20, 2016, 30 pages. |
U.S. Appl. No. 14/959,891—Notice of Allowance dated Jul. 28, 2016, 19 pages. |
JP 2014-552391—First Office Action dated Dec. 9, 2014, 6 pages. |
U.S. Appl. No. 13/742,845—Response to Office Action dated Jul. 22, 2013 filed Sep. 26, 2013, 7 pages. |
U.S. Appl. No. 14/959,891—Response to Office Action dated Apr. 11, 2016 filed Jun. 8, 2016, 25 pages. |
DE 11 2013 000 590.5—First Office Action dated Nov. 5, 2014, 7 pages. |
DE 11 2013 000 590.5—Response to First Office Action dated Nov. 5, 2014 filed Apr. 24, 2015, 1 page. |
DE 11 2013 000 590.5—Second Office Action dated Apr. 29, 2015, 7 pages. |
DE 11 2013 000 590.5—Response to Second Office Action dated Apr. 29, 2015 filed Sep. 16, 2015, 11 pages. |
DE 11 2013 000 590.5—Third Office Action dated Sep. 28, 2015, 4 pages. |
DE 11 2013 000 590.5—Response to Third Office Action dated Sep. 28, 2015 filed Dec. 14, 2015, 64 pages. |
DE 11 2013 000 590.5—Notice of Allowance dated Jan. 18, 2016, 8 pages. |
CN 2013800122765—Response to First Office Action dated Nov. 2, 2015 filed May 14, 2016, 14 pages. |
JP 2014-552391—Response to First Office Action dated Dec. 9, 2014 filed Jun. 8, 2016, 9 pages. |
JP 2014-552391—Second Office Action dated Jul. 7, 2015, 7 pages. |
JP 2014-552391—Response to Second Office Action dated Jul. 7, 2015 filed Dec. 25, 2015, 4 pages. |
JP 2014-552391—Third Office Action dated Jan. 26, 2016, 5 pages. |
CN 2013800122765—Second Office Action dated Jul. 27, 2016, 6 pages. |
U.S. Appl. No. 14/250,758—Office Action dated Sep. 8, 2016, 9 pages. |
U.S. Appl. No. 14/710,499—Notice of Allowance dated Sep. 12, 2016, 28 pages. |
U.S. Appl. No. 14/710,499—Office Action dated Apr. 14, 2016, 30 pages. |
U.S. Appl. No. 14/710,499—Response to Office Action dated Apr. 14, 2016 filed Jul. 14, 2016, 37 pages. |
CN 2013800122765—Response to Second Office Action dated Jul. 27, 2016 filed Oct. 11, 2016, 3 pages. |
U.S. Appl. No. 14/154,730—Office Action dated Nov. 6, 2015, 9 pages. |
U.S. Appl. No. 14/155,722—Office Action dated Nov. 20, 2015, 14 pages. |
U.S. Appl. No. 14/281,817—Office Action, dated Sep. 28, 2015, 5 pages. |
U.S. Appl. No. 14/262,691—Office Action dated Dec. 11, 2015, 31 pages. |
U.S. Appl. No. 14/154,730—Response to Office Action dated Nov. 6, 2015, filed Feb. 4, 2016, 9 pages. |
U.S. Appl. No. 14/154,730—Notice of Allowance dated May 3, 2016, 5 pages. |
U.S. Appl. No. 14/474,068—Office Action dated Sep. 12, 2016, 23 pages. |
U.S. Appl. No. 14/474,077—Office Action dated Jul. 26, 2016, 30 pages. |
Ballan et al., “Lecture Notes Computer Science: 12th European Conference on Computer Vision: Motion Capture of Hands in Action Using Discriminative Salient Points”, Oct. 7-13, 2012 [retrieved Jul. 14, 2016], Springer Berlin Heidelberg, vol. 7577, pp. 640-653. Retrieved from the Internet: <http://link.springer.com/chapter/10.1007/978-3-642-33783-3_46>. |
Cui et al., “Applications of Evolutionary Computing: Vision-Based Hand Motion Capture Using Genetic Algorithm”, 2004 [retrieved Jul. 15, 2016], Springer Berlin Heidelberg, vol. 3005 of LNCS, pp. 289-300. Retrieved from the Internet: <http://link.springer.com/chapter/10.1007/978-3-540-24653-4_30>. |
Delamarre et al., “Finding Pose of Hand in Video Images: A Stereo-based Approach”, Apr. 14-16, 1998 [retrieved Jul. 15, 2016], Third IEEE Intern Conf on Auto Face and Gesture Recog, pp. 585-590. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=671011&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D671011>. |
Gorce et al., “Model-Based 3D Hand Pose Estimation from Monocular Video”, Feb. 24, 2011 [retrieved Jul. 15, 2016], IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, Issue: 9, pp. 1793-1805. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5719617&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5719617>. |
Guo et al., Featured Wand for 3D Interaction, Jul. 2-5, 2007 [retrieved Jul. 15, 2016], 2007 IEEE International Conference on Multimedia and Expo, pp. 2230-2233. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=4285129&tag=1&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D4285129%26tag%3D1>. |
Melax et al., “Dynamics Based 3D Skeletal Hand Tracking”, May 29, 2013 [retrieved Jul. 14, 2016], Proceedings of Graphics Interface, 2013, pp. 63-70. Retrieved from the Internet: <http://dl.acm.org/citation.cfm?id=2532141>. |
Oka et al., “Real-Time Fingertip Tracking and Gesture Recognition”, Nov./Dec. 2002 [retrieved Jul. 15, 2016], IEEE Computer Graphics and Applications, vol. 22, Issue: 6, pp. 64-71. Retrieved from the Internet: <http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=1046630&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D1046630>. |
Schlattmann et al., “Markerless 4 gestures 6 DOF real-time visual tracking of the human hand with automatic initialization”, 2007 [retrieved Jul. 15, 2016], Eurographics 2007, vol. 26, No. 3, 10 pages, Retrieved from the Internet: <http://cg.cs.uni-bonn.de/aigaion2root/attachments/schlattmann-2007-markerless.pdf>. |
Wang et al., “Tracking of Deformable Hand in Real Time as Continuous Input for Gesture-based Interaction”, Jan. 28, 2007 [retrieved Jul. 15, 2016], Proceedings of the 12th International Conference on Intelligent User Interfaces, pp. 235-242. Retrieved from the Internet: <http://dl.acm.org/citation.cfm?id=1216338>. |
Zhao et al., “Combining Marker-Based Mocap and RGB-D Camera for Acquiring High-Fidelity Hand Motion Data”, Jul. 29, 2012 [retrieved Jul. 15, 2016], Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 33-42, Retrieved from the Internet: <http://dl.acm.org/citation.cfm?id=2422363>. |
PCT/US2014/013012—International Search Report and Written Opinion dated May 14, 2014, published as WO 2014116991, 12 pages. |
U.S. Appl. No. 14/151,394—Office Action dated Oct. 22, 2015, 26 pgs. |
U.S. Appl. No. 14/506,596—Notice of Allowance dated Nov. 9, 2016, 15 pages. |
U.S. Appl. No. 14/151,394—Office Action dated Apr. 5, 2016, 26 pages. |
U.S. Appl. No. 14/151,394—Office Action dated Sep. 30, 2016, 40 pages. |
U.S. Appl. No. 14/214,605—Advisory Action dated Dec. 21, 2016, 5 pages. |
U.S. Appl. No. 14/214,605—Response to Final Office Action dated Sep. 8, 2016 filed Dec. 7, 2016, 16 pages. |
U.S. Appl. No. 14/214,605—Response to Final Office Action dated Jan. 29, 2016 filed Apr. 1, 2016, 17 pages. |
U.S. Appl. No. 14/214,605—Response to Non Final Office Action dated Jul. 8, 2015 filed Dec. 1, 2015, 16 pages. |
U.S. Appl. No. 14/212,485—Response to Non-Final Office Action dated Jul. 28, 2016 filed Dec. 28, 2016, 16 pages. |
U.S. Appl. No. 14/214,605, Notice of Allowance dated Mar. 6, 2017, 20 pages. |
U.S. Appl. No. 14/212,485—Final Office Action dated Mar. 22, 2017, 9 pages. |
U.S. Appl. No. 15/392,920—Notice of Allowance dated Jun. 7, 2017, 89 pages. |
Number | Date | Country | |
---|---|---|---|
20190018141 A1 | Jan 2019 | US |
Number | Date | Country | |
---|---|---|---|
61801479 | Mar 2013 | US | |
61792025 | Mar 2013 | US | |
61800327 | Mar 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15625856 | Jun 2017 | US |
Child | 15936185 | US | |
Parent | 14214605 | Mar 2014 | US |
Child | 15625856 | US |