The present invention can relate to multi-dimensional remote control systems.
Some electronic systems can permit a user to interact with software applications, e.g., video games, by manipulating a remote control. For example, the systems can permit a user to interact with an image shown on a display by pointing a remote control at desired locations on or proximate to the display. Using infrared (IR) light sources and photodetectors, the remote control systems can detect light produced or reflected by those light sources. The systems then can determine the location to which the remote control is pointing based on the detected light. The remote control systems or electronic devices coupled thereto can then perform one or more predetermined actions.
The present invention can include multi-dimensional (e.g., 2-D or 3-D) remote control systems that can detect an absolute location to which a remote control is pointing in first and second orthogonal axes (e.g., the x- and y-axes). Remote control systems of the present invention also can detect the absolute position of the remote control in a third orthogonal axis (e.g., the z-axis).
To determine the absolute position of the remote control, remote control systems of the present invention can employ absolute position detection with relative position detection. Absolute position detection can indicate an initial absolute position of the remote control. Relative position detection can indicate changes in the position of the remote control. When the initial absolute position is combined with a change in the position of the remote control, an updated absolute position can be determined. Because relative position detection can provide greater resolution than some techniques used in absolute position detection, the updated absolute position can be more precise than the initial absolute position determined for the remote control.
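By way of illustration only, the combination described above can be sketched as follows. The class and attribute names are hypothetical and not part of the disclosed system; the sketch assumes a one-dimensional position for simplicity:

```python
class PositionTracker:
    """Minimal sketch: maintain an absolute position by combining an
    initial absolute fix (coarse) with accumulated relative changes
    (fine)."""

    def __init__(self, initial_absolute):
        # Coarse initial position from absolute position detection.
        self.position = initial_absolute

    def apply_relative_change(self, delta):
        # Fine-grained change reported by relative position detection;
        # adding it to the stored fix yields an updated absolute position.
        self.position += delta
        return self.position
```

Because the relative deltas can be reported at higher resolution than the initial fix, the updated position can be more precise than the initial absolute estimate alone.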
The remote control system of the present invention also can zoom into and out of an image or a portion thereof based on the absolute position of the remote control in the third axis.
The above and other advantages of the present invention will be apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
The present invention can incorporate a three-dimensional remote control system that can detect an absolute location to which a remote control is pointing in x- and y-axes and can detect the absolute position of the remote control in the z-axis with respect to one or more reference locations. The remote control system of the present invention can employ absolute position detection with relative position detection.
Remote control system 10 can permit a user to move object 28 (e.g., a cursor) displayed on display 30 in the x- and y-axes by pointing remote control 16 at desired locations on display 30. Ray R in
Remote control system 10 also can permit a user to control other parameters of the image shown on display 30 (e.g., size of object 28) by moving remote control 16 in a z-axis that may be orthogonal to the x- and y-axes. Remote control system 10 can determine the absolute position of remote control 16 in the z-axis with respect to a reference location and correlate one or more parameters of the image thereto. Thus, for example, as a user moves remote control 16 towards or away from display 30 in the z-axis, remote control system 10 can enlarge or reduce at least a portion of the image shown on display 30 (e.g., the size of object 28). In one embodiment of the present invention, the reference location in the z-axis may be substantially co-planar with a screen of display 30 on which an image is shown. As used herein, the position of the remote control in the x-, y- and z-axes also may be referred to as the x-, y- and z-positions of the remote control (respectively).
Absolute position detection sub-system 12 can detect one or more of the following absolute positions with respect to one or more reference locations: (1) the x- and y-positions of remote control 16; (2) the x- and y-positions of the location on or proximate to display 30 to which the remote control is pointing; and (3) the z-position of remote control 16. Relative position detection sub-system 14 can detect changes in the position of remote control 16 as the user manipulates the remote control. For example, relative position detection sub-system 14 can detect the direction in which remote control 16 is moving and/or the speed at which remote control 16 is moving.
To detect the x-, y-, and z-positions, absolute position detection sub-system 12 can include one or more electro-optical components, e.g., one or more light sources and/or a photodetector. For example, as illustrated in
Predetermined light sources 22 can emit, e.g., infrared (IR) light 24 to remote control 16, which can detect the emitted light using photodetector 26. Photodetector 26 can include CCD arrays, CMOS arrays, two-dimensional position sensitive photodiode arrays, other types of photodiode arrays, other types of light detection devices known in the art or otherwise, or any combination thereof.
In one embodiment of the present invention, transmitter 20 can be disposed such that predetermined light sources 22 are substantially co-planar with the screen of display 30. In alternative embodiments of the present invention, transmitter 20 and/or predetermined light sources 22 can be disposed at another location near or on display 30. In one embodiment of the present invention, remote control system 10 can be configured to determine the absolute z-position of remote control 16 with respect to the light transmitter and/or one or more predetermined light sources. That is, the light transmitter and/or one or more predetermined light sources may serve as the reference location in the z-axis. One of the predetermined light sources also may serve as the reference location in the x- and y-axes.
Controller 32, which may be disposed within remote control 16, can determine the x- and y-positions of the display location to which a user is pointing remote control 16 based on the IR light detected by photodetector 26. Controller 32 also can be configured to generate signals for rendering display 30 that move object 28 to the determined x- and y-positions. Based on the IR light detected by photodetector 26, controller 32 also can be configured to determine an absolute z-position of remote control 16 with respect to a reference location. The controllers described herein may include processors, memory, ASICs, circuits and/or other electronic components.
Relative position detection sub-system 14 can include relative motion sensor 34 disposed within remote control 16. Relative motion sensor 34 can include any sensor that can detect relative motion or change in position of an object to which it is coupled. Controller 32 can incorporate data from relative motion sensor 34 in calculating the absolute z-position of remote control 16. This can provide additional resolution of the determined z-position and can permit remote control system 10 to more accurately track movement of remote control 16.
In one embodiment of the present invention, relative motion sensor 34 can include a single or multi-dimensional accelerometer. In alternative embodiments of the present invention, relative motion sensor 34 can include a gyroscope, an accelerometer, any other sensor that can detect relative motion, or any combination thereof.
Remote control 16 can incorporate user input component 38. A user may actuate user input component 38 when the user wants remote control system 10 to perform an action. For example, a user may actuate user input component 38 when the user is pointing to a location on display 30 to which the user wants object 28 to be moved or when the user moves remote control 16 in the z-axis to, e.g., zoom in on or zoom out of the image shown on display 30. When the user is not actuating user input component 38, remote control system 10 can be configured to take no action.
User input component 38 can be a scrollwheel similar to that incorporated by a portable media player sold under the trademark iPod™ by Apple Computer, Inc. of Cupertino, Calif. The scrollwheel can include one or more buttons and a capacitive touchpad. The touchpad can permit a user to scroll through software menus by running the user's finger around the track of the scrollwheel. User input component 38 also can include, for example, one or more buttons, a touchpad, a touchscreen display, or any combination thereof.
Remote control system 10 also can include optional console 40. Console 40 can have controller 42 that can perform some or all of the processing described for controller 32. For example, remote control 16 can be configured to transmit data representing detected IR light 24 to console 40. Controller 42 in console 40 then can (1) determine the absolute x-, y-, and z-positions described above; and (2) generate signals for rendering display 30 based on the determined x-, y-, and z-positions.
Alternatively, controller 32 can determine the absolute x-, y-, and z-positions described above and controller 42 can generate signals for rendering display 30 based on the determined x-, y-, and z-positions.
In one embodiment of the present invention, console 40 can communicate with remote control 16 using cable 44 and/or one or more wireless communication protocols known in the art or otherwise. Console 40 also can communicate with display 30 using cable 46 and/or one or more wireless communication protocols known in the art or otherwise. Alternatively, console 40 can be integrated with display 30 as one unit.
Console 40 also can have one or more connectors 43 to which accessories can be coupled. Accessories can include cables 44 and/or 46, game cartridges, portable memory devices (e.g., memory cards, external hard drives, etc.), adapters for interfacing with another electronic device (e.g., computers, camcorders, cameras, media players, etc.), or combinations thereof.
Techniques for determining the x- and y-positions may be known in the art. For example, U.S. Pat. No. 6,184,863 to Sibert et al., issued on Feb. 6, 2001, and U.S. Pat. No. 7,053,932 to Lin et al., issued on May 30, 2006, the entireties of which are incorporated herein by reference, describe two techniques that can be employed by controller 32 or 42. U.S. Patent Application Publication No. 2004/0207597 to Marks, published on Oct. 21, 2004; No. 2006/0152489 to Sweetser et al., published on Jul. 13, 2006; No. 2006/0152488 to Salsman et al., published on Jul. 13, 2006; and No. 2006/0152487 to Grunnet-Jepsen et al., published on Jul. 13, 2006, the entireties of which also are incorporated herein by reference, describe additional techniques that can be employed by controller 32 or 42. Remote control system 10 also can employ other techniques known in the art or otherwise.
In step 54, controller 32 or 42 can use the data from photodetector 26 to determine an initial absolute z-position of remote control 16 using, e.g., an averaging technique. One embodiment of an averaging technique can include accepting multiple frames of data collected by photodetector 26 and determining an average absolute z-position based on the multiple frames of data. More details about one embodiment of the averaging technique are discussed below with respect to
In step 56, controller 32 or 42 can accept data or signals from accelerometer 34. Based on the accelerometer data/signals, controller 32 or 42 can extract information about changes in the z-position of remote control 16 (if any). For example, the sign of the slope of a signal waveform derived from accelerometer data can indicate whether a user is moving remote control 16 in the positive or negative z-direction with respect to a reference condition. The magnitude of signals derived from accelerometer data can indicate the rate at which the user is moving remote control 16. The controller can extract this information from the accelerometer signals and correlate the information to the direction and rate of change of remote control 16 in the z-axis. Given the direction, rate of change, and amount of time elapsed, controller 32 or 42 can determine changes in the position of remote control 16 in the z-axis.
In step 60, controller 32 or 42 can combine the average absolute z-position determined in step 54 with the change in z-position determined in step 58 to provide an updated absolute z-position. For example, controller 32 or 42 can add the average absolute z-position determined in step 54 with the change in z-position determined in step 58. Controller 32 or 42 also can weight either the average absolute z-position determined in step 54 or the change in z-position determined in step 58 before combining the values, e.g., to account for differences in accuracy, error rates, characteristics of the hardware, etc.
The value resulting from the combination can be a more precise indication of the absolute z-position of remote control 16 as compared to the average z-position determined in step 54. The updated z-position determined in step 60 can provide additional resolution and thereby permit remote control system 10 to more accurately track movement of remote control 16.
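Steps 54 through 60 can be sketched as follows. The function names, the double-integration of the accelerometer signal, and the linear weighting are illustrative assumptions; the description above leaves the exact signal processing and weighting open:

```python
def average_absolute_z(frame_estimates):
    # Step 54 sketch: average the z estimates derived from multiple
    # frames of photodetector data.
    return sum(frame_estimates) / len(frame_estimates)

def delta_z_from_accelerometer(accel_samples, dt):
    # Steps 56-58 sketch: integrate z-axis acceleration samples twice
    # (acceleration -> velocity -> displacement) to estimate the change
    # in z over the sampling interval; dt is the sample period.
    velocity = 0.0
    delta_z = 0.0
    for a in accel_samples:
        velocity += a * dt
        delta_z += velocity * dt
    return delta_z

def updated_absolute_z(avg_z, delta_z, w_abs=1.0, w_rel=1.0):
    # Step 60 sketch: weighted combination of the coarse absolute
    # estimate and the fine relative change, e.g., to account for
    # differences in accuracy between the two sub-systems.
    return w_abs * avg_z + w_rel * delta_z
```

The sign of the accumulated displacement indicates whether the remote control moved in the positive or negative z-direction, consistent with the slope-sign discussion above.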
Controller 32 or 42 can be configured to perform steps 50-54 simultaneously with steps 56-60. The controller also can continuously reiterate steps 50-60, thereby continuously updating the absolute z-position of remote control 16.
In the embodiment of
While
Remote control system 10 also can compare the signal intensities of light rays 24.1 and 24.2 received by photodetector 26 to determine the absolute z-position of remote control 16. For example, as a user moves remote control 16 from position Z(a) to Z(b), the intensities of light rays 24.1 and 24.2 received by photodetector 26 may decrease. Also, as a user moves remote control 16 from side to side in the x-axis, the intensity of light 24.1 received by photodetector 26 may differ from that received from light 24.2. To determine the absolute z-position of remote control 16, controller 32 or 42 can correlate the detected intensities of light rays 24.1 and 24.2 to a z-position for the remote control. For example, controller 32 or 42 can be configured to calculate the z-position using formula(s) that are function(s) of the intensities of the detected light rays. The formulas can be determined by empirical testing or by using principles of light propagation. Alternatively, remote control system 10 can have a database that associates detected intensities to predetermined z-positions. Controller 32 or 42 can be configured to access this database to determine the z-position of remote control 16. In one embodiment of the present invention, controller 32 or 42 can correlate the z-position of remote control 16 using perceived distance D, angle Θ, intensities of light detected by photodetector 26, or any combination thereof. While
Remote control system 10 also can employ other techniques known in the art or otherwise for determining initial absolute z-positions of remote control 16. For example, U.S. Patent Application Publication Nos. 2006/0152489 to Sweetser et al.; 2006/0152488 to Salsman et al.; and 2006/0152487 to Grunnet-Jepsen et al., the entireties of which are incorporated herein by reference above, describe techniques that can be employed by controller 32 or 42 to determine the z-position of a remote control when two, three, or four predetermined light sources are provided.
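The database-lookup variant of the intensity-based correlation described above can be sketched as follows. The calibration table maps a combined detected intensity to a predetermined z-position; the table contents and the nearest-match rule are hypothetical, and the description above equally contemplates closed-form formulas derived empirically or from principles of light propagation:

```python
def z_from_intensities(i1, i2, calibration):
    """Sketch: map the detected intensities of two light rays to a
    z-position via a calibration table {combined_intensity: z}."""
    total = i1 + i2
    # Pick the calibrated entry whose combined intensity is closest
    # to the detected total (intensity falls as the remote moves away).
    best_z, _ = min(
        ((z, abs(total - cal_total)) for cal_total, z in calibration.items()),
        key=lambda pair: pair[1],
    )
    return best_z
```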
In one embodiment of the present invention, remote control system 10 can use only the techniques described above to determine the initial absolute z-position of remote control 16 in step 54 of
In the embodiment of
In the embodiment of
In an alternative embodiment of the present invention, controller 32 or 42 can enlarge or zoom in on at least a portion of an image shown on the display (e.g., object 28) when the remote control is moved away from the display or transmitter 20 in the z-axis. Controller 32 or 42 also can reduce the size or zoom out of at least a portion of the image shown on the display (e.g., object 28) when the remote control is moved towards the display or transmitter 20 in the z-axis.
In step 110, the controller can determine how much the displayed image needs to be translated in the x- and y-directions so that the resulting image rendered in step 114 shows the desired feature at which the remote control is pointed. For example, the image rendered in step 114 can be centered about the location in the x- and y-axes to which the remote control is pointing.
In step 112, the controller can determine how much to scale the displayed image from its reference size in accordance with the z-position of remote control 16. In step 114, the controller can generate signals for rendering display 30 with an image that is translated and scaled in accordance with the translation and scaling factor determined in steps 110 and 112.
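Steps 110 through 112 can be sketched as follows. The centering rule and the scale function (reference z divided by current z, so that moving the remote control closer enlarges the image) are assumptions for illustration; the description leaves the exact mappings open:

```python
def render_transform(point_x, point_y, z_position, reference_z,
                     image_w, image_h):
    """Sketch of steps 110-112: compute the translation that centers
    the image on the pointed-at location, and the scale factor
    relative to the image's reference size."""
    # Step 110: translate so the pointed-at location becomes the center.
    tx = image_w / 2 - point_x
    ty = image_h / 2 - point_y
    # Step 112: scale by the ratio of the reference z to the current z.
    scale = reference_z / z_position
    return tx, ty, scale
```

Step 114 then renders the image using the returned translation and scale factor.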
Although particular embodiments of the present invention have been described above in detail, it will be understood that this description is merely for purposes of illustration. Alternative embodiments of those described hereinabove also are within the scope of the present invention. For example, predetermined light sources can be disposed in a remote control and a photodetector can be disposed in a display, in a frame disposed proximate to the display, or at any location proximate to, on, or near a display.
Also, a controller in the display can perform some or all of the processing described above for controllers 32 and/or 42. Thus, multiple controllers may be used to control remote control systems of the present invention.
A remote control of the present invention can be any electronic device in a system that may need to determine the absolute positions of the electronic device with respect to one or more reference locations. For example, the remote control can be any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistants, programmable remote controls, pagers, laptop computers, printers, or combinations thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or combinations thereof.
While the above description may have described certain components as being physically separate from other components, one or more of the components may be integrated into one unit. For example, photodetector 26 may be integrated with relative motion sensor 34. Controller 32 also may be integrated with photodetector 26 and relative motion sensor 34. Furthermore, the absolute and relative position detection sub-systems can share components, e.g., controller 32.
Furthermore, while the illustrative remote control systems described above may have included predetermined light sources that output light waves, one or more of the predetermined light sources can be replaced with component(s) that output or reflect other types of energy waves either alone or in conjunction with light waves. For example, the component(s) can output radio waves.
The above described embodiments of the present invention are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims which follow.
This application is a continuation of U.S. patent application Ser. No. 11/594,342 filed on Nov. 7, 2006 (now U.S. Pat. No. 8,291,346), which is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4229009 | Ohta | Oct 1980 | A |
4395045 | Baer | Jul 1983 | A |
4813682 | Okada | Mar 1989 | A |
5115230 | Smoot | May 1992 | A |
5302968 | Heberle | Apr 1994 | A |
5467288 | Fasciano et al. | Nov 1995 | A |
5502459 | Marshall et al. | Mar 1996 | A |
5504501 | Hauck et al. | Apr 1996 | A |
5515079 | Hauck et al. | May 1996 | A |
5554980 | Hashimoto et al. | Sep 1996 | A |
5703623 | Hall et al. | Dec 1997 | A |
5736975 | Lunetta | Apr 1998 | A |
5892501 | Kim et al. | Apr 1999 | A |
5926168 | Fan | Jul 1999 | A |
6130662 | Umeda | Oct 2000 | A |
6146278 | Kobayashi | Nov 2000 | A |
6171190 | Thanasack et al. | Jan 2001 | B1 |
6184863 | Sibert et al. | Feb 2001 | B1 |
6184884 | Nagahara et al. | Feb 2001 | B1 |
6252720 | Haseltine | Jun 2001 | B1 |
6287198 | McCauley | Sep 2001 | B1 |
6331848 | Stove et al. | Dec 2001 | B1 |
6377242 | Sweed | Apr 2002 | B1 |
6476797 | Kurihara et al. | Nov 2002 | B1 |
6538665 | Crow et al. | Mar 2003 | B2 |
6618076 | Sukthankar et al. | Sep 2003 | B1 |
6650822 | Zhou | Nov 2003 | B1 |
6683628 | Nakagawa et al. | Jan 2004 | B1 |
6727885 | Ishino et al. | Apr 2004 | B1 |
6765555 | Wu | Jul 2004 | B2 |
7024228 | Komsi et al. | Apr 2006 | B2 |
7030856 | Dawson et al. | Apr 2006 | B2 |
7053932 | Lin et al. | May 2006 | B2 |
7102616 | Sleator | Sep 2006 | B1 |
7165227 | Ubillos | Jan 2007 | B2 |
7219308 | Novak et al. | May 2007 | B2 |
7302181 | Ng et al. | Nov 2007 | B2 |
7458025 | Crow et al. | Nov 2008 | B2 |
7496277 | Ackley et al. | Feb 2009 | B2 |
7627139 | Marks et al. | Dec 2009 | B2 |
7705799 | Niwa | Apr 2010 | B2 |
7728823 | Lyon et al. | Jun 2010 | B2 |
7954065 | Ubillos | May 2011 | B2 |
7984385 | Ubillos | Jul 2011 | B2 |
20010002830 | Rahn et al. | Jun 2001 | A1 |
20010050672 | Kobayashi | Dec 2001 | A1 |
20020045960 | Phillips et al. | Apr 2002 | A1 |
20030050092 | Yun | Mar 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030201984 | Falvo | Oct 2003 | A1 |
20030210286 | Gerpheide et al. | Nov 2003 | A1 |
20040070564 | Dawson et al. | Apr 2004 | A1 |
20040095317 | Zhang et al. | May 2004 | A1 |
20040105264 | Spero et al. | Jun 2004 | A1 |
20040207597 | Marks | Oct 2004 | A1 |
20040218104 | Smith et al. | Nov 2004 | A1 |
20040261037 | Ording et al. | Dec 2004 | A1 |
20050012723 | Pallakoff | Jan 2005 | A1 |
20050055624 | Seeman et al. | Mar 2005 | A1 |
20050174324 | Liberty et al. | Aug 2005 | A1 |
20050210417 | Marvit et al. | Sep 2005 | A1 |
20050210418 | Marvit et al. | Sep 2005 | A1 |
20050212749 | Marvit et al. | Sep 2005 | A1 |
20050212766 | Reinhardt et al. | Sep 2005 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060123360 | Anwar et al. | Jun 2006 | A1 |
20060152487 | Grunnet-Jepsen et al. | Jul 2006 | A1 |
20060152488 | Salsman et al. | Jul 2006 | A1 |
20060152489 | Sweetser et al. | Jul 2006 | A1 |
20060170874 | Yumiki et al. | Aug 2006 | A1 |
20060184966 | Hunleth et al. | Aug 2006 | A1 |
20060262105 | Smith et al. | Nov 2006 | A1 |
20060277571 | Marks et al. | Dec 2006 | A1 |
20060284841 | Hong et al. | Dec 2006 | A1 |
20070067798 | Wroblewski | Mar 2007 | A1 |
20070109324 | Lin | May 2007 | A1 |
20070132725 | Kitaura | Jun 2007 | A1 |
20080106517 | Kerr et al. | May 2008 | A1 |
20080235617 | Kim et al. | Sep 2008 | A1 |
20100058220 | Carpenter | Mar 2010 | A1 |
20100303440 | Lin et al. | Dec 2010 | A1 |
20110035700 | Meaney et al. | Feb 2011 | A1 |
20110235990 | Anzures et al. | Sep 2011 | A1 |
20110235998 | Pond et al. | Sep 2011 | A1 |
20110276881 | Keng et al. | Nov 2011 | A1 |
Number | Date | Country |
---|---|---|
1877506 | Dec 2006 | CN |
2004057454 | Jul 2004 | WO |
2007060287 | May 2007 | WO |
Entry |
---|
U.S. Appl. No. 60/967,835, filed Sep. 7, 2007. |
Agilent ADNK-2623 Optical Mouse Designer's Kit, “Product Overview,” Agilent Technologies pp. 1-4, Jul. 3, 2003. |
Agilent ADNS-2620 Optical Mouse Sensor, “Data Sheet,” Agilent Technologies, pp. 1-27, May 13, 2005. |
Agilent ADNB-6031 and ADNB-6032 Low Power Laser Mouse Bundles, “Data Sheet,” Agilent Technologies, pp. 1-44, Apr. 21, 2006. |
HeadMouse Extreme, Prentke Romich Company, p. 1, Feb. 10, 2008: <http://store.prentrom.com/cgi-bin/store/HE-X.html>. |
HiBall-3100 Wide-Area Tracker and 3D Digitizer, 3rdTech, pp. 1 and 2, 2006. |
LCD Topgun User Manual, pp. 1-4, Sep. 26, 2006: <http://www.hkems.com/m—main.htm>. |
RGT Tracking, eReal Games, pp. 1-9, Sep. 26, 2006: <http://www.erealgames.com/aboutrgt.php>. |
SpaceMouse Plus, 3D Connexion, pp. 1 and 2, 2006. |
Wii Remote, pp. 1-4, Sep. 26, 2006: <http://wii.nintendo.com/controllers.html>. |
F. Madritsch, “CCD-Camera Based Optical Tracking for Human-Computer Interaction,” Proceedings 1st European Conference on Disability, Virtual Reality and Associated Technologies, Maidenhead, pp. 161-170 (1996). |
B. A. Myers et al., “Interacting at a Distance: Measuring the Performance of Laser Pointers and Other Devices,” Proceedings of CHI'2002: Human factors in computing systems, pp. 33-40 (2002). |
D. R. Olsen, et al., “Laser pointer interactions,” Proceedings of the SIGCHI conference on Human factors in computing systems, pp. 17-22 (2001). |
R. Sukthankar, et al., “Smarter Presentations: Exploiting homography in camera-projector systems,” Proceedings of International Conference on Computer Vision, 2001. |
Number | Date | Country | |
---|---|---|---|
20130027297 A1 | Jan 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11594342 | Nov 2006 | US |
Child | 13647088 | US |