The present disclosure relates to human-computer interaction systems. More specifically, this disclosure pertains to methods and systems for three-dimensional pointing, using a system integrated for both absolute position detection and relative position detection. Embodiments of a pointing/input device useful in the methods and systems are also disclosed.
As shown in
In systems where a pointer such as a cursor, cross-hair, or other indicator is always displayed at the point of the display surface where the pointing device 2 is aimed, the pointing device 2 is referred to as a direct or absolute pointing device and the process is referred to as a direct or absolute pointing process. On the other hand, in systems where the pointer is not necessarily displayed at the point of the display surface where the pointing device 2 is aimed, the pointing device 2 is referred to as a relative pointing device and the process is referred to as a relative pointing process. These concepts are depicted in more detail in
However, almost all modern 3D pointing devices and 3D remotes, such as those for smart televisions, are relative pointing devices. This is because such devices use a motion sensor module to provide information for determining the location of a pointer on the device screen, and the information provided by the motion sensors is relative information. As an example, a motion sensor module may include a G-sensor, a gyroscope sensor, and a magnetic field sensor. These sensors provide rotation information (azimuth, pitch, and roll) and acceleration/velocity information (linear acceleration, angular velocity) relative to the previous location of the pointing device 2. A number of compensation techniques are known to correct a difference between an aimed point 20 and a displayed point 21 of a pointing device 2. However, because the location of the pointing device 2 relative to the display surface is not known, such compensation techniques are of reduced effectiveness.
In addition to a motion sensor module, it is known to include an imaging device in a pointing device to provide additional information for determination of an aimed point. As shown in
The system as depicted in
There is accordingly a need identified in the art for direct pointing systems that can utilize the computing capabilities of a console rather than relying only on the computing power provided by a pointing device, and that can utilize imaging devices incorporated into or added onto a display surface such as a computing device screen, smart TV screen, etc. The ability to interface with multiple imaging devices would also be desirable. Further, such systems would ensure that the displayed point of a pointing device on a display surface always coincides with the aimed point of the pointing device, even when the aimed point departs from a boundary of the display surface and then reenters that boundary.
To solve the foregoing problems and address the identified need in the art, the present disclosure provides a direct pointing system that can take advantage of the computing power of the console and the imaging capability of a display surface equipped with one or more imaging devices. The described system can be operated with more flexibility, and can still provide good direct pointing results.
In one aspect, the present disclosure describes a computing system for direct three-dimensional pointing and command input. The system includes at least one computing device and a graphical user interface. A pointing/input device is provided including at least one light source and a motion sensor module. The motion sensor module provides information regarding an absolute and a relative displacement of the pointing/input device to the at least one computing device. At least one imaging device is operably linked to the computing device processor, and may be associated with the graphical user interface. In embodiments, a plurality of imaging devices are associated with the graphical user interface and operably linked to the computing device processor.
The imaging device is configured for capturing a plurality of image frames each including a view of the at least one light source as the pointing/input device is held and/or moved in a three-dimensional space and within a field of view of the at least one imaging device. A non-transitory computer program product operable on the computing device processor includes executable instructions for calculating at least a position and/or a motion of the at least one light source in three-dimensional space from the plurality of sequential image frames and from the pointing/input device absolute and relative displacement information, and for rendering on the graphical user interface a visual indicator corresponding to the calculated position and/or motion of the at least one light source.
In embodiments, the computer program product includes executable instructions for determining a position and/or a motion of the at least one light source in three-dimensional space by determining a current position of the at least one light source in a captured image frame and determining a prior position of the at least one light source in at least one prior captured image frame. A relative displacement of the pointing/input device is determined by the computing device processor from information provided by the motion sensor module. Next a location of the at least one light source in three-dimensional space is calculated from the determined relative displacement information and the determined at least one light source current and prior positions. A pointing direction of the pointing/input device defining an axis of the pointing/input device is determined from information provided by the motion sensor module. From the calculated three-dimensional location of the at least one light source and the determined pointing direction of the pointing/input device provided by the motion sensor module, an intersection point of the axis of the pointing/input device and the graphical user interface is calculated by the computing device processor and displayed as a visual indicator (an icon, cross-hairs, a pointer, etc.) in the graphical user interface. Movements of the pointing/input device may be interpreted as particular command inputs by the computing device processor.
In embodiments, the computer program product includes executable instructions for calculating a location of the at least one light source in three-dimensional space from the determined current and prior positions of the at least one light source by identifying a first region corresponding to a position of the at least one light source in the captured image frame and a second region corresponding to a position of the at least one light source in the at least one prior captured image frame. From those identified regions, a first position of the at least one light source in the captured image frame and a second position of the at least one light source in the at least one prior captured image frame are calculated. Then, a displacement vector of the pointing/input device caused by translating the at least one light source between the first region and the second region is calculated. This process is repeated in succeeding sets of current and prior captured image frames.
In another aspect, a method for tracking a pointing/input device is provided using the light source position and pointing/input device absolute and relative displacement information obtained from the system as described above.
These and other embodiments, aspects, advantages, and features will be set forth in the description which follows, and in part will become apparent to those of ordinary skill in the art by reference to the following description and referenced drawings or by practice of the invention. The aspects, advantages, and features are realized and attained by means of the instrumentalities, procedures, and combinations particularly pointed out in the appended claims. Unless otherwise indicated, any patent and/or non-patent citations discussed herein are specifically incorporated by reference in their entirety into the present disclosure.
The accompanying drawings, incorporated in and forming a part of the specification, illustrate several aspects of the present invention and together with the description serve to explain the principles of the invention. In the drawings:
In the following detailed description of the illustrated embodiments, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Also, it is to be understood that other embodiments may be utilized and that process, reagent, materials, software, and/or other changes may be made without departing from the scope of the present disclosure.
At a high level, the present disclosure provides systems and methods for obtaining an absolute location of a visual marker, including a pointer such as a cursor, cross-hair, etc., on an image display apparatus using integrated absolute and relative position detection. With reference to
In use, the pointing/input device 50 functions substantially in the manner of a laser pointer, except that it moves an indicator such as a cursor, cross-hairs, etc. instead of a red dot. When an operator O uses the pointing device 50 to aim at a point (e.g., point 70 in
The pointing/input device 50 of this disclosure can also be used as an input device, substantially similar in function to a computer mouse or other such input device. The position specified by the pointing/input device 50 on the image display apparatus 52 is computed in an x-y coordinate system defined on the image display apparatus 52, and the (x, y) coordinates of that specified position can be used to identify an item or icon displayed on the image display apparatus 52. Therefore, by manipulating the pointing/input device 50, a user can interact with most operating systems (e.g., Android® or Microsoft® Windows®), for example selecting files, programs, or actions from lists, groups of icons, etc., and can freely move files, programs, etc., issue commands, or perform specific actions, such as those used in a drawing program.
At least three components are embedded in the pointing/input device 50 (
The motion sensor module 66 comprises a set of motion-detecting sensors that provide absolute and relative motion information of the device (e.g., acceleration, rotation, etc.) to the computing device 56 in real time over a wireless channel. The set of motion-detecting sensors contained in the motion sensor module 66 can include a G-sensor, a gyroscope sensor, a magnetic field sensor, and others.
The image capture device 54 functions as a viewing device for the computing device 56, taking images of the scene in front of the image display apparatus 52 at a fixed frame rate. Those images are sent to the computing device 56 for subsequent processing. Use of any conventional image capture device/imaging device 54, including single-lens imaging devices such as standard webcams, is contemplated. In an embodiment, an image capture device 54 having a frame rate of at least 30 frames per second is contemplated.
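The disclosure does not tie the image capture device 54 to any particular capture software; purely as a sketch, frames could be grabbed from a standard webcam with the OpenCV package. The device index, the requested frame rate, and the simple display loop below are illustrative assumptions, not requirements of the system.

```python
import cv2

# Open the default webcam; index 0 and the 30 fps request are assumptions,
# since any conventional single-lens imaging device 54 could be used.
capture = cv2.VideoCapture(0)
capture.set(cv2.CAP_PROP_FPS, 30)

while capture.isOpened():
    ok, frame = capture.read()  # one image of the scene in front of the display
    if not ok:
        break
    # Each frame would be handed to the computing device 56 processor for
    # light source identification (Step 80) and the computations described below.
    cv2.imshow("scene", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```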
The computing device 56 processor provides the functionality of light source location identification, which identifies the location of the light source 62 in sequential images sent by the image capture device 54, and of computing the point that the pointing/input device 50 points to on the image display apparatus 52 (see
The computation process of the computing device 56 processor is shown in flow chart form in
In Step 80, the computing device 56 processor identifies the region corresponding to the light source in the image taken at time $t_{k+1}$ (see
The center of the region will be denoted 74 in the image recording portion 76 (such as a CCD) of the image capture device 54 (see
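Step 80 is not limited to any particular detection technique. As a sketch only, one way to identify the region corresponding to the light source 62 and its center 74 is intensity thresholding followed by a centroid computation; the threshold value and the use of OpenCV here are assumptions, not requirements of the disclosure.

```python
import cv2
import numpy as np

def light_source_center(frame_bgr, threshold=240):
    """Return the (x, y) pixel center of the brightest blob, or None.

    A stand-in for Step 80: threshold the grayscale image to isolate the
    LED light source 62, keep the largest bright region, and take its
    centroid as the center 74 on the image recording portion 76.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```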
In Step 82, the computing device 56 processor computes the location of the light source 62 in 3D space using information obtained from Steps 88 and 90. Since the operator O can move and rotate the pointing device 50, the light source 62 experiences both translation and spin. Steps 88 and 90 provide a displacement vector of the light source 62 between $t_k$ and $t_{k+1}$ (or, $t_{k+i}$ and $t_{k+i+1}$ for some i > 0; for simplicity, we assume i = 0 here) to Step 82. To make the illustration easier, we enlarge the portion of
$B - A = \vec{V}$
Since the parametric representations of lines 1110 and 1114 are known, finding A and B is a straightforward process; point 1102 is A and point 1104 is B. The task, then, is to find $\vec{V}$ (1108). This process is discussed below.
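Before turning to that derivation, note that once $\vec{V}$ is known, A and B follow from a small linear solve. As an illustrative sketch only, assuming lines 1110 and 1114 are given as point-plus-direction rays (for example, back-projection rays through the light source's image positions at $t_k$ and $t_{k+1}$), points A and B satisfying $B - A = \vec{V}$ can be found in a least-squares sense; the disclosure states only that finding them is straightforward, so this particular formulation is an assumption.

```python
import numpy as np

def points_from_displacement(p1, d1, p2, d2, v):
    """Find A on line 1110 (A = p1 + s*d1) and B on line 1114 (B = p2 + t*d2)
    such that B - A matches the displacement vector V as closely as possible.

    p1, p2 are points on the two lines and d1, d2 their direction vectors
    (all 3-vectors). The least-squares formulation is an assumption about
    how the "straightforward process" is realized.
    """
    p1, d1, p2, d2, v = map(lambda a: np.asarray(a, dtype=float), (p1, d1, p2, d2, v))
    # B - A = (p2 - p1) + t*d2 - s*d1 = V  ->  [-d1  d2] [s t]^T = V - (p2 - p1)
    m = np.column_stack((-d1, d2))          # 3x2 coefficient matrix
    rhs = v - (p2 - p1)
    (s, t), *_ = np.linalg.lstsq(m, rhs, rcond=None)
    return p1 + s * d1, p2 + t * d2
```

Here A is the light source's 3D location at $t_k$ (1102) and B its location at $t_{k+1}$ (1104).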
If we use r(t) to represent the location of the light source 62 on an image at time t, then the displacement vector $\vec{V}$ can be expressed as
$\vec{V} = r(t_{k+1}) - r(t_k) = \tfrac{1}{2}\left(\dot{r}(t_{k+1}) + \dot{r}(t_k)\right)\Delta t_k$
where $\dot{r}(t)$ stands for the velocity of the LED at time t and $\Delta t_k \equiv t_{k+1} - t_k$. However, since a hand-held device has negligible inertia, one can ignore $\dot{r}(t_k)$ in the above equation and simply use the following equation to compute $\vec{V}$:
$\vec{V} = \dot{r}(t_{k+1})\,\Delta t_k$  (1)
Since the pointing/input device 50 experiences both translation and spin, $\dot{r}(t)$ can be expressed as
$\dot{r}(t) = \omega(t) \times (r(t) - x(t)) + v(t)$  (2)
where $\omega(t)$ is the angular velocity of r(t) at time t, x(t) is the spin pivot of r(t) at time t, × denotes the cross product, and v(t) is the linear velocity of r(t) at time t. Theoretically, the value of v(t) at $t_{k+1}$ is computed as follows:
$v(t_{k+1}) = v(t_k) + a(t_{k+1})\,\Delta t_k$  (3)
where $a(t_{k+1})$ is the linear acceleration of r(t) at $t_{k+1}$. Again, since a hand-held device has negligible inertia, one can ignore the term $v(t_k)$ in equation (3) and simply use the following equation to compute $v(t_{k+1})$:
$v(t_{k+1}) = a(t_{k+1})\,\Delta t_k$  (4)
Therefore, from equations (1), (2), and (4), we have the following equation for $\vec{V}$:
$\vec{V} = \omega(t_{k+1}) \times \left(r(t_{k+1}) - x(t_{k+1})\right)\Delta t_k + a(t_{k+1})\,(\Delta t_k)^2$  (5)
The values of $\omega(t_{k+1})$ and $a(t_{k+1})$ can be obtained from the gyroscope sensor and the G-sensor, respectively, of the motion sensor module 66. On the other hand, $r(t_{k+1}) - x(t_{k+1})$ is simply $D_2$ (the pointing direction of the pointing/input device 50 at $t_{k+1}$, defined below) times L, the distance between 1104 and 1106 in
$r(t_{k+1}) - x(t_{k+1}) = L \cdot D_2$  (6)
Hence, by substituting (6) into (5), we have the following formula for $\vec{V}$ (1108):
$\vec{V} = L\left(\omega(t_{k+1}) \times D_2\right)\Delta t_k + a(t_{k+1})\,(\Delta t_k)^2$  (7)
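As an illustrative sketch of evaluating equation (7), with ω and a supplied by the gyroscope sensor and the G-sensor and $D_2$ computed from equation (9) below; the numeric readings in the example call are placeholders only.

```python
import numpy as np

def displacement_vector(omega, accel, d2, length, dt):
    """Equation (7): V = L * (omega x D2) * dt + a * dt^2.

    omega  -- angular velocity at t_{k+1} from the gyroscope sensor (rad/s)
    accel  -- linear acceleration at t_{k+1} from the G-sensor (m/s^2)
    d2     -- pointing direction of the device at t_{k+1} (unit 3-vector)
    length -- L, the distance between the light source and the spin pivot
    dt     -- the frame interval, t_{k+1} - t_k
    """
    omega, accel, d2 = (np.asarray(x, dtype=float) for x in (omega, accel, d2))
    return length * np.cross(omega, d2) * dt + accel * dt ** 2

# Placeholder sensor readings, for illustration only.
v = displacement_vector(omega=[0.0, 0.5, 0.0], accel=[0.1, 0.0, 0.0],
                        d2=[0.0, 0.0, -1.0], length=0.12, dt=1.0 / 30.0)
```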
To define $D_2$, let $N = (x_N, 0, z_N)$ be a vector in the direction of north with unit length, i.e.,
$\sqrt{x_N^2 + z_N^2} = 1$
Let the azimuth, pitch, and roll angles provided by the sensors of the motion sensor module 66 of the pointing/input device 50 at $t_{k+1}$ be α, β, and γ, respectively (see
The pitch is a clockwise rotation of β about the x-axis. The rotation matrix is given by
The roll is a clockwise rotation of γ about the z-axis. The rotation matrix is given by
Multiplying these three matrices together gives a single rotation matrix
To find the direction $D_2$ of the pointing/input device 50 at $t_{k+1}$, simply multiply the unit vector N by the rotation matrix R(γ, β, α):
$D_2 = R(\gamma, \beta, \alpha)\,N$  (9)
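The individual rotation matrices (and equation (8)) are not reproduced in the text above, so the sketch below assumes standard rotation matrices, models each clockwise rotation as a rotation by the negative angle, applies the azimuth rotation about the y-axis (since N lies in the x-z plane), and picks one multiplication order; the actual conventions in the original drawings may differ.

```python
import numpy as np

def rot_x(angle):  # counterclockwise rotation about the x-axis for +angle
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(angle):  # counterclockwise rotation about the y-axis
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(angle):  # counterclockwise rotation about the z-axis
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def pointing_direction(azimuth, pitch, roll, north=(0.0, 0.0, -1.0)):
    """Equation (9): D2 = R(gamma, beta, alpha) * N.

    Clockwise rotations are modeled as rotations by the negative angle; the
    multiplication order and the default north vector (of the form
    (x_N, 0, z_N) with unit length) are assumptions for this sketch.
    """
    r = rot_z(-roll) @ rot_x(-pitch) @ rot_y(-azimuth)
    n = np.asarray(north, dtype=float)
    return r @ (n / np.linalg.norm(n))

# Example: device facing north and pitched up by 10 degrees.
d2 = pointing_direction(azimuth=0.0, pitch=np.radians(10), roll=0.0)
```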
In Step 84, the computing device 56 processor computes the intersection point 70 of the axis of the pointing/input device 50 with the image display apparatus 52 at time $t_{k+1}$ (see
$L(u) = B + u \cdot D_2, \quad u \in \mathbb{R}$  (10)
where B (1104) is the location of the LED light source 62 and $D_2$ is the pointing direction of the pointing/input device 50 at $t_{k+1}$ (see
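For illustration only, if the display surface is modeled as a plane given by a point and a unit normal (how the display plane is parameterized is not specified in the excerpt above), the intersection point 70 of equation (10) with the image display apparatus 52 can be computed as follows.

```python
import numpy as np

def intersect_display(b, d2, plane_point, plane_normal):
    """Intersect the pointing axis L(u) = B + u*D2 (equation (10)) with the
    plane of the image display apparatus 52.

    b            -- 3D location of the LED light source 62 at t_{k+1}
    d2           -- pointing direction of the pointing/input device 50
    plane_point  -- any point on the display plane (e.g., a screen corner)
    plane_normal -- unit normal of the display plane
    Returns the 3D intersection point 70, or None if the axis is parallel to
    the display. Modeling the display as a point-and-normal plane is an
    assumption made for this sketch.
    """
    b, d2, p0, n = (np.asarray(x, dtype=float) for x in (b, d2, plane_point, plane_normal))
    denom = np.dot(n, d2)
    if abs(denom) < 1e-9:
        return None  # axis parallel to the display plane
    u = np.dot(n, p0 - b) / denom
    return b + u * d2
```

The resulting 3D point would then be mapped into the x-y coordinate system defined on the image display apparatus 52 so that the cursor can be shown or moved there in Step 86.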
In Step 86, the computing device 56 processor sends instructions to show a cursor at the point computed in Step 84 or move the cursor from its previous location to the new location computed in Step 84. The computing device 56 processor will also perform required actions to accommodate interactions specified by the operator O through keys or buttons of the control panel 64 of the pointing/input device 50.
Of course, the computing device 56 repeats the above steps for each new image sent by the image capture device.
The foregoing discussion describes a system comprising a single imaging device 54, but multiple imaging devices may be incorporated into the presently disclosed system. In one embodiment as shown in
In Step 1402, the work is similar to Step 80 in
In Step 1404 (
Once the location of the light source in 3D space is known, the remaining steps depicted in
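The details of Step 1404 are not reproduced above. Purely as a sketch of one common way two imaging devices can yield a 3D location, the two back-projection rays through the light source's image positions may be triangulated by taking the midpoint of their closest approach; the ray construction and the midpoint method are assumptions, not necessarily the computation used in Step 1404.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Return the midpoint of the closest approach of two camera rays.

    o1, o2 -- optical centers of the two imaging devices
    d1, d2 -- unit direction vectors of the back-projection rays through the
              light source's image positions in the two views
    The midpoint method is one standard triangulation; the disclosure's
    Step 1404 may use a different formulation.
    """
    o1, d1, o2, d2 = (np.asarray(x, dtype=float) for x in (o1, d1, o2, d2))
    # Solve for s, t minimizing |(o1 + s*d1) - (o2 + t*d2)|.
    m = np.column_stack((d1, -d2))
    s, t = np.linalg.lstsq(m, o2 - o1, rcond=None)[0]
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))
```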
Summarizing, the present disclosure provides robust methods and systems for three-dimensional pointing using light tracking and relative position detection techniques. Advantageously, the disclosed methods and systems are economical and simple: aside from the pointing/input device 50 and software, the system 10 requires for additional hardware only a computing device 56 and conventional imaging devices 54 such as standard webcams, much of which is likely already available in many homes, and has no requirement for any specific wired or wireless connection (such as wiring or cabling, or a specialized IR or other signal) between the pointing/input device 50 and the imaging device 54. Exemplary advantages of the disclosed system include allowing an operator to point and/or input gesture commands to a computing device, a "smart" television, and the like in 3D mode.
The foregoing description is presented for purposes of illustration and description of the various aspects of the invention, and one of ordinary skill in the art will recognize that additional embodiments of the invention are possible without departing from the teachings herein. This detailed description, and particularly the specific details of the exemplary embodiments, is given primarily for clarity of understanding, and no unnecessary limitations are to be imported, for modifications will become obvious to those skilled in the art upon reading this disclosure and may be made without departing from the spirit or scope of the invention. Relatively apparent modifications, of course, include combining the various features of one or more figures with the features of one or more other figures. All such modifications and variations are within the scope of the invention as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally, and equitably entitled.
This utility patent application claims the benefit of priority in U.S. Provisional Patent Application Ser. No. 62/028,335 filed on Jul. 24, 2014, the entirety of the disclosure of which is incorporated herein by reference.
| Number | Date | Country |
| --- | --- | --- |
| Parent 14573008 | Dec 2014 | US |
| Child 16538432 | | US |