The present invention relates to the field of computer vision based tracking of objects and control of electronic devices based on the tracked objects. Specifically, the invention relates to tracking an object having a shape of a hand.
The need for more convenient, intuitive and portable input devices increases, as computers and other electronic devices become more prevalent in our everyday life.
Recently, human gesturing, such as hand gesturing, has been suggested as a user interface input tool, in which a hand gesture is detected by a camera and is translated into a specific command. Gesture recognition enables humans to interface with machines naturally, without any mechanical appliances. The development of alternative computer interfaces (forgoing the traditional keyboard and mouse), video games and remote controlling are only some of the fields that may implement human gesturing techniques.
Recognition of a hand gesture usually requires identification of an object as a hand and tracking the identified hand to detect a posture or gesture that is being performed.
Known gesture recognition systems identify a user's hand by using color, shape and/or contour detectors. The hand is then tracked by following features, such as pixels, determined to represent the hand, throughout a plurality of images.
However, tracking a hand in a “noisy” environment (e.g., a moving background or a background having designs similar to a human hand) may prove to be a challenge for known methods of tracking. A system for controlling a device based on tracking of a hand, may, in non-ideal environments, lose sight of the hand and/or end up tracking an object that is not the hand, causing inaccurate and unreliable performance of the system.
The method for computer vision based tracking of a hand, according to embodiments of the invention, provides an efficient process for accurate tracking of a hand, regardless of the background environment and of other complications such as quick movement of the hand.
A method according to embodiments of the invention verifies that a tracked object is a hand, based on the shape of the object, and, during the process of tracking, updates the location and optionally other parameters related to the hand, such as its size and orientation, to facilitate identification of the hand.
Embodiments of the invention may ensure efficient, accurate, continuous and uninterrupted tracking.
The invention will now be described in relation to certain examples and embodiments with reference to the following illustrative figures so that it may be more fully understood. In the drawings:
Computer vision based identification and tracking of a hand during a process of user-machine interaction may need to deal with diverse image backgrounds (e.g., image portions behind or separate from the hand) which may cause interruption of tracking of the hand.
A method for computer vision based tracking of a hand and control of a device, according to embodiments of the invention, verifies and updates the location, and optionally other parameters of the hand such as its size and orientation, and updates the tracking based on the verified, updated location and/or additional parameters of the hand.
In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well known features may be omitted or simplified in order not to obscure the present invention.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulates and/or transforms data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.
Methods according to embodiments of the invention may be implemented in a user-device interaction system which includes a device to be operated and controlled by user commands and an image sensor. An exemplary system, according to one embodiment of the invention, is described in
According to embodiments of the invention user commands or input are based on identification and tracking of the user's hand. The system identifies the user's hand in the images obtained by the image sensor. Once a user's hand is identified it is tracked such that movement of the hand may be followed and translated into operating, input and control commands. For example, the device may include a display and movement of a hand may be translated into movement on the display of an icon or symbol, such as a cursor or any other displayed object. Movement of the hand may be translated into another manipulation of content on the display.
The image sensor may be a standard two dimensional (2D) camera and may be associated with a processor and a storage device for storing image data. The storage device may be integrated within the image sensor or may be external to the image sensor. According to some embodiments, image data may be stored in the processor, for example in a cache memory. In some embodiments image data of a field of view (which includes a user's hand) is sent to the processor for analysis. A user command or input is generated by the processor, based on the image analysis, and is sent to a device, which may be any electronic device that can accept user commands, e.g., television (TV), DVD player, personal computer (PC), mobile phone, camera, STB (Set Top Box), streamer, etc. According to one embodiment the device is an electronic device available with an integrated standard 2D camera. According to other embodiments a camera is an external accessory to the device. According to some embodiments more than one 2D camera is provided to enable obtaining three dimensional (3D) information. According to some embodiments the system includes a 3D camera.
One or more detectors may be used for correct identification of a moving object and for identification of different postures of a hand. For example, a contour detector may be used together with a feature detector.
Methods for tracking a user's hand may include using an optical flow algorithm or other known tracking methods.
An embodiment of tracking or determining the changing location of a hand shaped object is schematically illustrated in
Detecting and selecting features may be done by using feature detection algorithms such as goodFeaturesToTrack™ or cornerHarris™ or other appropriate feature detection algorithms.
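By way of illustration only, a minimal sketch of this feature selection step is given below in Python using OpenCV, whose goodFeaturesToTrack function (named above) implements the Shi-Tomasi corner detector. The helper name, the masking approach and the parameter values are illustrative assumptions and not part of the claimed invention.

```python
import cv2
import numpy as np

def select_hand_features(gray_frame, hand_rect, max_corners=30):
    """Select trackable features inside a detected hand bounding rectangle.

    Illustrative sketch only; names and parameter values are assumptions.
    """
    x, y, w, h = hand_rect
    # Restrict the corner search to the region believed to contain the hand.
    mask = np.zeros(gray_frame.shape, dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255
    # Shi-Tomasi corner detection within the masked region.
    corners = cv2.goodFeaturesToTrack(
        gray_frame, maxCorners=max_corners, qualityLevel=0.01,
        minDistance=5, mask=mask)
    return corners  # (N, 1, 2) float32 points, or None if no corners found
```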
In a subsequent image frame, assuming movement between the frames, the features 11, 12 and 13 will be located in a new location. In prior art tracking systems, the features 11, 12 and 13 are searched for in the subsequent image, their new locations are determined, their movement or transformation is calculated, and a new bounding rectangle 15′, which includes features 11, 12 and 13, is then created.
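The following sketch illustrates this style of tracking step, assuming OpenCV's pyramidal Lucas-Kanade optical flow is used to follow the features and a new bounding rectangle is rebuilt around them; the helper name and parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

def track_features(prev_gray, next_gray, prev_points):
    """Follow previously selected features into the next frame and rebuild
    a bounding rectangle around them (illustrative sketch only)."""
    # Pyramidal Lucas-Kanade optical flow estimates each feature's new position.
    next_points, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_points, None,
        winSize=(21, 21), maxLevel=3)
    good = next_points[status.flatten() == 1]
    if len(good) == 0:
        return None, None
    # New bounding rectangle (e.g., rectangle 15') around the surviving features.
    x, y, w, h = cv2.boundingRect(good.reshape(-1, 1, 2).astype(np.float32))
    return good, (x, y, w, h)
```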
It should be appreciated that using a bounding shape, such as a rectangle, in tracking is one possible technique; however, this explanation relates also to the use of other bounding shapes or other techniques not using a bounding shape.
Bounding rectangle 15′ is typically considered to represent the hand shaped object 10 in its new location or position. However, as schematically shown in
To avoid this situation, embodiments of the invention may verify that the object being tracked has a shape of a hand. A method for computer vision based tracking of a hand, according to an embodiment of the invention, is schematically illustrated in
Detecting a shape of a hand may be done for example by applying a shape recognition algorithm (for example, an algorithm which calculates Haar-like features in a Viola-Jones object detection framework), using machine learning techniques and other suitable shape detection methods, and optionally checking additional parameters, such as color parameters.
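As one possible example of such shape detection, the sketch below uses OpenCV's CascadeClassifier, which implements the Viola-Jones framework with Haar-like features. The cascade file "hand_cascade.xml" is an assumed, separately trained classifier for the desired hand posture (OpenCV does not ship one), and the helper name and parameters are illustrative.

```python
import cv2

# Assumed, pre-trained Haar cascade for the desired hand posture.
hand_cascade = cv2.CascadeClassifier("hand_cascade.xml")

def detect_hand_shape(gray_frame, search_rect=None):
    """Return hand candidate rectangles, optionally only within a search region."""
    region = gray_frame
    offset = (0, 0)
    if search_rect is not None:
        x, y, w, h = search_rect
        region = gray_frame[y:y + h, x:x + w]
        offset = (x, y)
    detections = hand_cascade.detectMultiScale(
        region, scaleFactor=1.1, minNeighbors=4)
    # Shift detections back to full-frame coordinates.
    return [(dx + offset[0], dy + offset[1], dw, dh)
            for dx, dy, dw, dh in detections]
```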
It should be appreciated that a “shape of a hand” may refer to a shape of a hand in different positions or postures, such as a hand with all fingers extended (open hand) or a hand with all fingers brought together such that their tips are touching or almost touching (as if the hand is holding a bulb) or other postures.
Thus, referring back to
Typically, a hand shape is detected by applying a shape recognition algorithm at a suspected or possible location in the subsequent image. The suspected or possible location is a location having a probability, which is above a certain threshold, of being the location of the user's hand, as is explained in detail further herein.
This process, of looking for a shape of a hand at a suspected or possible location and, once detected, selecting a second, or other, set of features from within the newly detected shape, and tracking the newly selected features, may be iterated or repeated, thus enabling accurate tracking of a hand shaped object throughout or across, or for, a plurality of images.
Thus, if a shape of a hand is detected at the suspected location, the hand shape may be tracked (e.g., as described above) and a device may be controlled based, among other things, on the tracking of the hand. Referring to
Determining that no shape of a hand has been detected may usually be done based on several frames. Shape recognition algorithms may be applied to one or more image frames and a probability grade or rating may be assigned to or associated with the detected shape in each frame. E.g., each of a plurality of frames may have a probability grade assigned to it based on shapes within the images. The probability grades may be assigned based on considerations such as the likeness of the detected shape to a hand shape, color parameters and other suitable parameters. According to one embodiment the determination that an object does not have a shape of a hand is done if the probability grade is below a predetermined threshold. For example, an object may be determined not to have a shape of a hand if the probability grade of a shape is below 50%. The final determination that no shape of a hand has been detected may be based on a summation or other calculation of several probability grades. The probability grades may be used, e.g., by adding or averaging, to obtain a combined probability grade. For example, a shape algorithm can be applied to 10 images, the detected shape in each image (and thus each image) being assigned its own grade. The final probability grade may be an average of all 10 grades and the determination whether the shape is a hand shape or not is based on all 10 frames.
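A minimal sketch of this multi-frame determination, assuming the per-frame probability grades are simply averaged and compared to an illustrative 50% threshold:

```python
def no_hand_detected(frame_grades, threshold=0.5):
    """Decide, over several frames, that no hand shape is present.

    frame_grades is a list of per-frame probability grades (0.0-1.0) assigned
    to the detected shape in each frame; averaging them is one of the
    combination options mentioned above, and the threshold is illustrative.
    """
    if not frame_grades:
        return True
    combined = sum(frame_grades) / len(frame_grades)
    return combined < threshold

# e.g., grades from 10 frames are averaged before the final determination:
# no_hand_detected([0.8, 0.3, 0.4, 0.6, 0.5, 0.7, 0.4, 0.3, 0.5, 0.6])
```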
To save computational power, a shape of a hand is typically searched for in a certain, limited area within an image frame rather than in the whole frame. This area is typically where a user's hand is expected or suspected to be, e.g., a suspected location (where a location can mean an area). In other words, a suspected location is a location having a probability, which is above a certain threshold, of being the location of a user's hand.
According to one embodiment the probability of a location being a suspected location may be based on parameters such as relative location within the image frame or on the direction of the hand shaped object based on the tracking of the object throughout a sequence of images.
Typically, the probability of a location being a suspected location is based on the tracking of the object, for example, a suspected location may be in the direction of movement of the object as determined by the tracking of the object in previous frames.
In one embodiment the probability of a location being a location of the user's hand, is based on distance from the location of the hand in a previous frame. For example, as schematically illustrated in
It should be appreciated that using a bounding shape, such as a rectangle, in tracking is one possible technique; however, other bounding shapes or other techniques not using a bounding shape may also be used.
According to one embodiment, which is schematically illustrated in
Once the size or dimension of the hand is known, units such as “width of a hand” may be calculated and used in determining a suspected location (as shown in
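For example, the distance-based heuristic above may be realised by searching a window that extends a few "widths of a hand" around the previous bounding rectangle. In the sketch below, the helper name and the default margin of two hand-widths are illustrative assumptions.

```python
def suspected_search_window(prev_rect, frame_shape, widths=2.0):
    """Build a search area (suspected location) around the hand's previous
    location, measured in hand-width units (illustrative sketch only)."""
    x, y, w, h = prev_rect
    margin_x = int(widths * w)
    margin_y = int(widths * h)
    frame_h, frame_w = frame_shape[:2]
    # Clip the expanded window to the image boundaries.
    nx = max(0, x - margin_x)
    ny = max(0, y - margin_y)
    nw = min(frame_w, x + w + margin_x) - nx
    nh = min(frame_h, y + h + margin_y) - ny
    return (nx, ny, nw, nh)
```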
Another parameter that can be used to facilitate searching and finding a shape of a hand is the posture, rotation, angle of the hand or a combination of these parameters. If a hand is found to be in a certain posture (e.g., at a certain angle relative to the camera) or having a certain shape (e.g., having a few fingers extended and a few fingers folded or other hand postures), this specific angle or posture may then be searched in a subsequent image.
Tracking the first set of features and/or the second set of features results in tracking a hand shaped object, which is in most cases, a user's hand. According to embodiments of the invention a device may be controlled according to the tracking of the user's hand. For example, an icon on a display of the device may be moved according to movement of the shape of the hand. According to one embodiment the icon is a cursor. Other icons, symbols or displayed content may be manipulated according to movement of the user's hand.
According to some embodiments the location of the user's hand may be periodically or continuously updated to keep accurate tracking of the user's hand, however, the location of the icon (e.g., cursor) on the display need not be updated and changed each time the location of the hand is updated since such updating may cause uneven movement of the icon on the display. Thus, embodiments of the invention may include a step of stopping movement of the icon (e.g., cursor) when tracking of the hand (or of the hand shaped object) is ended or terminated. Movement of the icon may then be resumed, possibly from the last location of the icon (when the movement of the icon was stopped), when tracking is resumed.
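A possible realisation of this stop-and-resume behaviour is sketched below; the class and method names are illustrative, and relative (rather than absolute) cursor updates are an assumed design choice that lets the icon resume from its last location when tracking resumes.

```python
class CursorController:
    """Move a display cursor from tracked hand motion, stopping when tracking
    ends and resuming from the last cursor position (illustrative sketch)."""

    def __init__(self):
        self.cursor_pos = (0, 0)
        self.last_hand_pos = None

    def on_hand_update(self, hand_pos, tracking_ok):
        if not tracking_ok:
            # Tracking ended or was terminated: freeze the cursor where it is.
            self.last_hand_pos = None
            return self.cursor_pos
        if self.last_hand_pos is not None:
            dx = hand_pos[0] - self.last_hand_pos[0]
            dy = hand_pos[1] - self.last_hand_pos[1]
            # Relative motion, so the cursor resumes from where it stopped.
            self.cursor_pos = (self.cursor_pos[0] + dx, self.cursor_pos[1] + dy)
        self.last_hand_pos = hand_pos
        return self.cursor_pos
```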
Examples of devices that may be controlled according to embodiments of the invention include TVs, DVD players, PCs, mobile telephones, cameras, STBs (Set Top Boxes), streamers, and other appliances.
As discussed above, a direction of movement of a hand shaped object from previous frames can be used in determining a suspected location. For example, a suspected location can be determined as being in an area of the image which is in the direction of movement of the hand shaped object, based on the tracking of the object in previous images. Direction and other parameters (such as location within the image and/or size or posture of the hand, as discussed above) may be considered together when determining a suspected location. However, sometimes, an image frame may have several possible suspected locations and the actual location of the hand shaped object needs to be determined from these several possible locations. An example of such an embodiment is schematically illustrated in
The probability of a shape being a hand shape may be determined by comparing to a model hand or by comparing to a “hand” and “non hand” (e.g., background) database (e.g., a shape most resembling a hand shape and/or being most different than a “non-hand”) or by applying other suitable shape recognition algorithms.
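As one illustrative realisation of comparing a detected shape to a model hand, the sketch below uses OpenCV's Hu-moment based matchShapes; treating the resulting dissimilarity as a probability-like grade via the mapping shown is an assumption, not the claimed method.

```python
import cv2

def shape_probability(candidate_contour, model_hand_contour):
    """Grade how hand-like a candidate contour is by comparing it to a model
    hand contour (illustrative sketch only)."""
    # cv2.matchShapes returns a dissimilarity score (0 means identical shapes).
    dissimilarity = cv2.matchShapes(
        candidate_contour, model_hand_contour, cv2.CONTOURS_MATCH_I1, 0.0)
    # Map dissimilarity to a rough probability-like grade in [0, 1].
    return 1.0 / (1.0 + dissimilarity)
```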
Thus, for example, all the vectors having the same direction and/or speed as vector 31 or 32 or 33 are tracked in separate groups and their new locations 31′, 32′ and 33′ in frame 39′ may be possible suspected locations. Shape parameters (such as specific contours or machine learning parameters) and optionally additional parameters (such as size, angle or posture of the hand) may be searched at each new location 31′, 32′ and 33′ and the locations may be graded according to the shapes detected at each location. The shape having the highest grade may be selected and new features for further tracking are selected from this shape.
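A minimal sketch of grouping the motion vectors by direction so that each group yields one candidate location to be graded; the binning scheme, the (N, 2) point layout and the helper name are illustrative assumptions.

```python
import numpy as np

def group_motion_vectors(old_pts, new_pts, angle_bins=8):
    """Group feature motion vectors by direction; each group's centroid is one
    possible suspected location to be graded by the shape detectors.

    old_pts and new_pts are float arrays of shape (N, 2).
    """
    vectors = new_pts - old_pts
    angles = np.arctan2(vectors[:, 1], vectors[:, 0])
    # Quantize directions into a small number of bins.
    bins = ((angles + np.pi) / (2 * np.pi) * angle_bins).astype(int) % angle_bins
    groups = {}
    for b, pt in zip(bins, new_pts):
        groups.setdefault(b, []).append(pt)
    return [np.mean(pts, axis=0) for pts in groups.values()]
```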
According to some embodiments a display of a device may be controlled according to the detection of the shape of the hand. For example, the display may change (e.g., an icon may appear or the display may change color or brightness or transparency) when a shape of a hand is not detected, to warn the user that tracking may be terminated and/or to motivate the user to more clearly present his hand to the camera of the system.
According to one embodiment schematically illustrated in
The object is an object that is controlled by the user, for example, the object may be a stick or ball held by the user and may be controlled by the user's hand. According to one embodiment the object is a user's body part, for example, the user's hand. According to this embodiment, the predetermined shape is a shape of a hand. A “shape of a hand” may refer to a shape of a hand in a specific posture, for example, a posture in which all fingers of the hand are extended or a hand with all fingers brought together such that their tips are touching or almost touching.
Thus, for example, as schematically illustrated in
For example, the display, parts of the display or specific icons on the display may change color or transparency. Other visible changes may occur.
According to one embodiment the icon on the display is an icon of a cursor. According to another embodiment the icon 46 represents a hand (e.g., an icon having the appearance of a hand). In a case where hand 45 is clearly visible and the system can determine that the shape of the hand 45 detected by the system 40 is similar to a predetermined shape of a hand, the icon 46 on the display 42 will be opaque. If the user's hand 45′ is held in a different posture (for example), the shape of the hand 45′ which is detected by the system 40 will not be similar to the predetermined shape of a hand and therefore the icon 46′ will become transparent.
According to some embodiments the change may be gradual, for example, the icon 46 may be completely opaque if the probability grade is 90% or more and may be partially transparent if the probability grade is around 50% and may become almost completely transparent if the probability grade is 20% or less. In another example, the icon 46 may have one color for a high probability grade and a different color for a low probability grade.
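This gradual feedback may be sketched as a simple mapping from probability grade to icon opacity; the 90% and 20% endpoints follow the example above, while the linear ramp between them is an assumption.

```python
def icon_alpha(probability_grade):
    """Map a shape probability grade (0.0-1.0) to icon opacity."""
    if probability_grade >= 0.9:
        return 1.0          # completely opaque
    if probability_grade <= 0.2:
        return 0.1          # almost completely transparent
    # Linear interpolation between the two example endpoints.
    return 0.1 + (probability_grade - 0.2) / 0.7 * 0.9
```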
According to one embodiment assigning a probability grade to the detected shape is based on a probability that the detected shape is the predetermined shape and on another parameter, such as color or motion. For example, an object, such as an object held by the user or such as the user's hand, arm, leg, head or other body part, may be determined to be only partly similar to a predetermined shape, thus being assigned a low probability grade. But if, for example, the object is moving, or if the object is moving in a predetermined pattern (such as in a waving gesture), then the probability grade assigned to the shape may be higher.
Embodiments of the invention may use known methods for tracking selected features, such as optical flow techniques.
Detecting a shape of a hand may be done using known methods, for example by using machine learning techniques in which a shape of an object is compared to a learnt database of hands and to a database of “non-hand” (e.g., “non hand” may include background features, hands in postures other than a desired posture and other objects that are different than the desired hand shape).
In all the embodiments described above a “shape of a hand” may refer to a shape of a hand in any specific posture, such as a hand with all fingers extended or a hand with all fingers brought together such that their tips are touching or almost touching.
A system operable according to embodiments of the invention is schematically illustrated in
Processor 502 may include, for example, one or more processors and may be a central processing unit (CPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), or any other suitable multi-purpose or specific processor or controller. Memory unit(s) 52 may include, for example, a random access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
The device 501 may be any electronic device that can accept user commands, e.g., TV, DVD player, PC, mobile phone, camera, etc. According to one embodiment, device 501 is an electronic device available with an integrated standard 2D camera. The device 501 may include a display 51 or a display 51 may be independent, not connected to the device 501.
The processor 502 may be integral to the image sensor 503 or may be a separate unit. Alternatively, the processor 502 may be integrated within the device 501. According to other embodiments a first processor may be integrated within the image sensor and a second processor may be integrated within the device.
The communication between the image sensor 503 and processor 502 and/or between the processor 502 and the device 501 may be through a wired or wireless link, such as through infrared (IR) communication, radio transmission, Bluetooth technology and other suitable communication routes.
According to one embodiment the image sensor 503 is a camera such as a forward facing camera. The image sensor 503 may be a standard 2D camera such as a webcam or other standard video capture device, typically installed on PCs or other electronic devices.
The image sensor 503 may obtain frames at varying frame rates. According to embodiments of the invention the image sensor 503 obtains image data of a user's hand 505 when the hand enters the field of view 504.
According to some embodiments image data may be stored in processor 502, for example in a cache memory. Processor 502 can apply image analysis algorithms, such as motion detection and shape recognition algorithms, to identify and further track the user's hand. Processor 502 may perform methods according to embodiments discussed herein by, for example, executing software or instructions stored in memory 52. When discussed herein, a processor such as processor 502, which may carry out all or part of a method as discussed herein, may be configured to carry out the method by, for example, being associated with or connected to a memory such as memory 52 storing code or software which, when executed by the processor, carries out the method.
Optionally, the system 500 may include an electronic display 51. According to embodiments of the invention, mouse emulation and/or control of a cursor on a display, are based on computer visual identification and tracking of a user's hand, for example, as detailed above.
For example, the system 500 may include a device 501, an imager, such as image sensor 503, to receive a sequence of images of a field of view and a processor, such as processor 502, which is in communication with the image sensor 503 and with the device 501. The processor 502 (or several processors) may detect within an image from the sequence of images an object having a shape of a hand; track at least one first selected feature from within the object; detect a shape of a hand at a suspected location of the object; select at least one second feature to be tracked from within the detected shape of the hand; track the second feature; and control the device 501 based on the tracking of the second feature.
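The sketch below strings together the illustrative helpers defined earlier in this description (detect_hand_shape, select_hand_features, track_features, suspected_search_window and CursorController) into one processing loop that mirrors the steps listed above; all names, the re-detection policy and the camera handling are assumptions, not the patented implementation.

```python
import cv2

def run_tracking_loop(camera_index=0):
    """End-to-end sketch of the processor's steps, composed from the
    illustrative helpers sketched earlier (assumed names)."""
    cap = cv2.VideoCapture(camera_index)
    cursor = CursorController()
    prev_gray, points, hand_rect = None, None, None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if hand_rect is None:
            # Detect an object having a shape of a hand and select features.
            detections = detect_hand_shape(gray)
            if detections:
                hand_rect = detections[0]
                points = select_hand_features(gray, hand_rect)
        elif points is not None and prev_gray is not None:
            # Track the selected features into the current frame.
            points, hand_rect = track_features(prev_gray, gray, points)
            if hand_rect is not None:
                # Verify a hand shape at the suspected location and
                # re-select features from within the detected shape.
                window = suspected_search_window(hand_rect, gray.shape)
                detections = detect_hand_shape(gray, window)
                if detections:
                    hand_rect = detections[0]
                    points = select_hand_features(gray, hand_rect)
        if points is None:
            hand_rect = None  # nothing trackable; fall back to re-detection
        center = None
        if hand_rect is not None:
            x, y, w, h = hand_rect
            center = (x + w // 2, y + h // 2)
        # Control the device (here, a cursor) based on the tracking.
        cursor.on_hand_update(center, tracking_ok=center is not None)
        prev_gray = gray
    cap.release()
```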
Different embodiments are disclosed herein. Features of certain embodiments may be combined with features of other embodiments; thus certain embodiments may be combinations of features of multiple embodiments.
Embodiments of the invention may include an article such as a computer or processor readable non-transitory storage medium, such as for example a memory, a disk drive, or a USB flash memory encoding, including or storing instructions, e.g., computer-executable instructions, which when executed by a processor or controller, cause the processor or controller to carry out methods disclosed herein.
The foregoing description of the embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. It should be appreciated by persons skilled in the art that many modifications, variations, substitutions, changes, and equivalents are possible in light of the above teaching. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.
The present application is a continuation of prior PCT International Application No. PCT/IL2013/050396, International Filing Date May 9, 2013, which claims priority from U.S. Provisional application No. 61/645,212, filed on May 10, 2012, all of which are incorporated by reference in their entirety.