The present disclosure relates to systems and methods for three-dimensional (3D) sensing technology. In particular, the disclosure relates to systems and methods for determining objects' three-dimensional (3D) absolute coordinates for enhanced human-machine interaction.
Human-machine interfaces encompass a variety of technologies, including capacitive, resistive, and infrared sensing, and are widely used in many applications. In devices such as cell phones and personal computing systems, these interfaces allow users to communicate with the devices via touchscreens or other sensing mechanisms. Motion sensing and object tracking have also become popular, especially for entertainment, gaming, educational, and training applications. For example, sales of Microsoft's Kinect®, a motion-sensing input device for gaming consoles, topped 10 million units after its release in late 2010.
However, traditional tracking technologies such as time-of-flight (TOF), laser tracking, and stereo vision may, in some designs or applications, lack the ability to provide certain information about the detected object or environment. For example, many do not provide an object's three-dimensional (3D) absolute coordinates in space.
It may therefore be desirable to have systems, methods, or both that can determine the three-dimensional (3D) absolute coordinates of objects under analysis. Applications may include object sensing, motion sensing, and scanning and recreating a three-dimensional (3D) image. Further, with the introduction of affordable three-dimensional (3D) displays, it may be desirable to have systems and methods that determine three-dimensional (3D) absolute coordinates for various applications, such as human-machine interaction, surveillance, etc.
The disclosed embodiments may include systems, display devices, and methods for determining three-dimensional coordinates.
The disclosed embodiments include a non-contact coordinate sensing system for identifying three-dimensional coordinates of an object. The system may include at least one light source configured to illuminate the object and to be controlled for object detection; a first detecting device configured to detect light reflected from the object to a first location, the first location being identified by a first set of three-dimensional coordinates; a second detecting device configured to detect light reflected from the object to a second location, the second location being identified by a second set of three-dimensional coordinates; a third detecting device configured to detect light reflected from the object to a third location, the third location being identified by a third set of three-dimensional coordinates; and a control circuit coupled to the at least one light source and to the first, second, and third detecting devices. The control circuit may be configured to determine the three-dimensional coordinates of the object based at least on the phase differences between the reflected light detected at one of the first, second, and third locations and the reflected light detected at the remaining locations.
The disclosed embodiments further include an interactive three-dimensional (3D) display system including at least one light source for illuminating an object and to be controlled for object detection; a first light detecting device for detecting light reflected from the object to a first location, the first location being identified by a first set of three-dimensional coordinates; a second light detecting device for detecting light reflected from the object to a second location, the second location being identified by a second set of three-dimensional coordinates; a third light detecting device for detecting light reflected from the object to a third location, the third location being identified by a third set of three-dimensional coordinates; and a control circuit coupled to the at least one light source and the first, second, and third light detecting devices. The control circuit may be configured to determine the three-dimensional coordinates of the object. The control circuit may also be configured to produce 3D images with three-dimensional coordinates and to determine an interaction between the object and the 3D images based on the three-dimensional coordinates of the object and the three-dimensional coordinates of the 3D images.
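By way of a purely illustrative sketch (the disclosure does not specify how the comparison is performed), the interaction determination may be as simple as testing whether the object's computed coordinates fall within a threshold distance of a displayed 3D image element. The ImageElement structure, its radius field, and all values below are hypothetical:

```python
# A minimal sketch, assuming (hypothetically) that each displayed 3D image
# element can be summarized by a center point and an interaction radius.
from dataclasses import dataclass
import math

@dataclass
class ImageElement:
    x: float
    y: float
    z: float
    radius: float  # interaction threshold around the displayed element (metres)

def interacts(obj_xyz: tuple[float, float, float], elem: ImageElement) -> bool:
    """True if the tracked object is within the element's interaction radius."""
    dx = obj_xyz[0] - elem.x
    dy = obj_xyz[1] - elem.y
    dz = obj_xyz[2] - elem.z
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= elem.radius

# e.g. a fingertip at (0.10, 0.05, 0.30) near a virtual button at (0.10, 0.05, 0.32)
print(interacts((0.10, 0.05, 0.30), ImageElement(0.10, 0.05, 0.32, 0.03)))  # True
```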
The disclosed embodiments further include a method of identifying three-dimensional (3D) coordinates of an object. The method may include illuminating the object with light and sensing light reflected by the object with at least three sensing devices at different locations, each location being identified by a different set of three-dimensional coordinates. The method may also include calculating, by a processor, the three-dimensional coordinates of the object based on the phase differences between the reflected light detected at one of the locations and the reflected light detected at the remaining locations.
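Although the conversion is not spelled out here, phase-based ranging conventionally maps a phase difference measured at the light source's modulation frequency to a path-length difference. The helper below is a minimal sketch under that assumption; the 10 MHz modulation frequency and the 0.3 rad phase value are placeholders, not values from the disclosure:

```python
# A small sketch of the conventional phase-to-distance relation assumed above.
import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def phase_to_path_difference(delta_phi_rad: float, mod_freq_hz: float) -> float:
    """Path-length difference implied by a phase difference at mod_freq_hz.

    Valid only within one modulation wavelength; a real system would also
    unwrap phase to resolve the ambiguity.
    """
    wavelength = C_LIGHT / mod_freq_hz          # modulation wavelength (metres)
    return (delta_phi_rad / (2.0 * math.pi)) * wavelength

# e.g. 0.3 rad measured at a 10 MHz modulation -> roughly 1.43 m path difference
print(phase_to_path_difference(0.3, 10e6))
```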
It is to be understood that both the foregoing general description and the following detailed description are exemplary only and are not restrictive of the claimed subject matter.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate various disclosed embodiments and, together with the description, serve to explain the various embodiments.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
In sensing system 100, a central processing unit/controller 110 controls a light source 120 to emit light. In one exemplary embodiment, the light source 120 is a laser diode generating light modulated in the MHz range, which may be adjusted by the central processing unit 110. The light from light source 120 is directed to a path altering unit 130, which changes the path of the light. The path altering unit 130 may be composed of at least one MEMS mirror, or may be another device that can reflect light and/or be controlled. In one embodiment, the processor 110 may continuously and automatically adjust the path altering unit 130 based on specifications appropriate for the various applications. When the redirected light from the path altering unit 130 shines on an object O, such as a hand or a fingertip, light reflected from object O is captured by sensing unit 140. In other embodiments, the light source 120 directly illuminates the object O, and a path altering unit 130 is not required.
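As one purely hypothetical picture of how processor 110 might continuously and automatically adjust a MEMS-mirror path altering unit, the sketch below steps the mirror through a raster pattern of angles; the angle ranges, resolution, and the set_mirror_angles(...) hook are all assumptions, since the disclosure leaves the control scheme open:

```python
# A purely illustrative raster-scan sketch for a MEMS-mirror path altering unit;
# angle ranges, resolution, and the set_mirror_angles(...) hook are hypothetical.
from typing import Callable, Iterator

def raster_angles(x_range=(-10.0, 10.0), y_range=(-10.0, 10.0),
                  nx: int = 64, ny: int = 64) -> Iterator[tuple[float, float]]:
    """Yield (x_deg, y_deg) mirror angles covering the field of view."""
    for j in range(ny):
        y = y_range[0] + j * (y_range[1] - y_range[0]) / (ny - 1)
        for i in range(nx):
            x = x_range[0] + i * (x_range[1] - x_range[0]) / (nx - 1)
            yield (x, y)

def scan_once(set_mirror_angles: Callable[[float, float], None]) -> None:
    """Drive one full scan; set_mirror_angles stands in for the real actuator."""
    for x_deg, y_deg in raster_angles():
        set_mirror_angles(x_deg, y_deg)  # point the beam, then sample detectors
```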
Sensing unit 140 includes three or more light detectors or sensors, each of which may be controlled by the processor 110. Information from the sensing unit 140, including the detectors' positions and the phase differences among the detectors, may be provided or made available to the central processing unit 110. The exemplary calculations performed by central processing unit 110 are described in detail below. In alternative embodiments, the light source 120 may include one or more illumination elements, which may operate at different frequencies and may be used in conjunction with the sensing unit 140 or with a plurality of sensing units 140.
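To make the data flow from sensing unit 140 to the central processing unit 110 concrete, the sketch below collects each detector's known position and measured phase and forms the phase differences relative to the first detector; the structure and field names are illustrative assumptions only:

```python
# An illustrative container for what sensing unit 140 might report to
# processor 110: each detector's known location plus its measured phase.
from dataclasses import dataclass

@dataclass
class DetectorReading:
    position: tuple[float, float, float]  # detector's 3D coordinates (metres)
    phase_rad: float                      # measured phase of the reflected light

def phase_differences(readings: list[DetectorReading]) -> list[float]:
    """Phase of each remaining detector relative to the first one."""
    ref = readings[0].phase_rad
    return [r.phase_rad - ref for r in readings[1:]]
```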
Steps 220, 230, and 240 may be repeated according to the various applications or specifications of the method or system. For example, these steps may be repeated to provide enhanced or continuous tracking of an object, or to calculate more accurate absolute coordinates of tracked objects.
The distance between the object O, for example a fingertip located at coordinates (x_o, y_o, z_o), and each of sensors A, B, and C, located at (x_A, y_A, z_A), (x_B, y_B, z_B), and (x_C, y_C, z_C), respectively, may be expressed as follows:
$\sqrt{(x_o - x_A)^2 + (y_o - y_A)^2 + (z_o - z_A)^2} = d$  (Equation 1)
$\sqrt{(x_o - x_B)^2 + (y_o - y_B)^2 + (z_o - z_B)^2} = d + \alpha$  (Equation 2)
$\sqrt{(x_o - x_C)^2 + (y_o - y_C)^2 + (z_o - z_C)^2} = d + \beta$  (Equation 3)
Equation 1 represents the spatial distance from sensor A to the fingertip; Equation 2 represents the spatial distance from sensor B to the fingertip; and Equation 3 represents the spatial distance from sensor C to the fingertip, where d denotes the distance from the object to sensor A, and α and β denote the additional path lengths (obtainable from the measured phase differences) to sensors B and C, respectively.
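Note that Equations 1-3 alone contain four unknowns (x_o, y_o, z_o, and d), so some additional constraint is needed; one natural assumption, consistent with the steered illumination described above, is that the object lies on the known illumination ray leaving the path altering unit. The sketch below solves the system numerically under that assumption; the sensor positions, ray geometry, and synthesized α and β values are placeholders, not values from the disclosure:

```python
# A minimal numeric sketch of solving Equations 1-3, assuming (hypothetically)
# that the object lies on the known illumination ray P(t) = S + t*u from the
# path altering unit; all positions and values below are placeholders.
import numpy as np
from scipy.optimize import least_squares

A = np.array([0.0, 0.0, 0.0])     # sensor A position (metres)
B = np.array([0.3, 0.0, 0.0])     # sensor B position
C = np.array([0.0, 0.3, 0.0])     # sensor C position
S = np.array([0.05, 0.02, 0.0])   # illumination ray origin (assumed known)
u = np.array([0.0, 0.0, 1.0])     # illumination ray unit direction

# Synthesize alpha and beta from a known object position so the recovery
# below can be checked; in practice they come from the measured phases.
t_true = 0.5
P_true = S + t_true * u
alpha = np.linalg.norm(P_true - B) - np.linalg.norm(P_true - A)
beta = np.linalg.norm(P_true - C) - np.linalg.norm(P_true - A)

def residuals(t):
    P = S + t[0] * u                 # candidate object position on the ray
    dA = np.linalg.norm(P - A)       # Equation 1: d
    dB = np.linalg.norm(P - B)       # Equation 2: d + alpha
    dC = np.linalg.norm(P - C)       # Equation 3: d + beta
    return [dB - dA - alpha, dC - dA - beta]

sol = least_squares(residuals, x0=[1.0], bounds=(1e-6, np.inf))
P = S + sol.x[0] * u
print("object coordinates:", P, " distance d:", np.linalg.norm(P - A))
# Recovers P ~ (0.05, 0.02, 0.50), matching P_true.
```

In practice, α and β would be derived from the measured phase differences (for example, via the phase-to-path conversion sketched earlier), and a fourth detector could stand in for the ray constraint.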
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed systems and methods. For example, the three-dimensional (3D) absolute coordinate sensing system may be modified and used in various settings, including but not limited to security screening systems, motion tracking systems, medical imaging systems, entertainment and gaming systems, image creation systems, etc. Further, the three-dimensional (3D) display as disclosed above may be another type of display, such as a volumetric display or a holographic display.
In the foregoing Description of the Embodiments, various features are grouped together in a single embodiment for purposes of streamlining the disclosure. The disclosure is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim.
Moreover, it will be apparent to those skilled in the art from consideration of the specification and practice of the present disclosure that various modifications and variations can be made to the disclosed systems and methods without departing from the scope of the disclosed embodiments, as claimed. For example, one or more steps of a method and/or one or more components of an apparatus or a device may be omitted, changed, or substituted without departing from the scope of the disclosed embodiments. Thus, it is intended that the specification and examples be considered as exemplary only, with the scope of the present disclosure being indicated by the following claims and their equivalents.