The present invention relates to the use of mobile phones to control external devices. More particularly, the invention relates to the use of a mobile phone camera as part of a system that operates external devices.
Many devices are nowadays capable of connecting and exchanging data with mobile devices. As a result, it is now possible to integrate mobile devices as the controlling element of a system that operates such devices.
So far, the art has failed to provide useful applications of mobile phones to display control systems. It is an object of the present invention to provide a system and method that exploits the functionality of mobile phones to provide efficient control means for external devices.
A method for controlling a device comprises:
a) providing a mobile device comprising a camera;
b) positioning said mobile device such that said camera acquires the image of an operator's hands;
c) analyzing the movements of said operator's hands to derive a control command therefrom; and
d) transmitting said control command to a controlled device.
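Steps (a)-(d) above can be sketched as a simple acquire-analyze-transmit loop. This is a minimal illustration only; the class and method names are assumptions introduced for the sketch and do not appear in the specification.

```python
class FakeCamera:
    """Stand-in for the mobile device camera: yields captured frames."""
    def __init__(self, frames):
        self._frames = frames

    def frames(self):
        return iter(self._frames)


class SwipeAnalyzer:
    """Stand-in for step (c): derives a control command from a frame.

    The frame labels and command names here are purely illustrative."""
    def analyze(self, frame):
        return {"swipe_left": "NEXT_SLIDE", "swipe_right": "PREV_SLIDE"}.get(frame)


class Recorder:
    """Stand-in for step (d): records transmitted commands."""
    def __init__(self):
        self.sent = []

    def send(self, command):
        self.sent.append(command)


def control_loop(camera, analyzer, transmitter):
    """(b) acquire images, (c) derive a command, (d) transmit it."""
    for frame in camera.frames():
        command = analyzer.analyze(frame)
        if command is not None:
            transmitter.send(command)
```

In this sketch the analyzer returns `None` for frames containing no recognizable gesture, so only deliberate gestures result in a transmitted command.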
In one embodiment of the invention the mobile device is a mobile phone. In another embodiment of the invention the controlled device is integral to the mobile phone, and in a further embodiment of the invention the controlled device is separate from the mobile device.
Different controlled devices can be used in conjunction with the invention, and in one embodiment of the invention the controlled device is a projector.
Control commands can be transmitted from the mobile device to the external device via any suitable channel, e.g., via WiFi.
In another aspect the invention encompasses a system comprising:
A) a controlled device;
B) a mobile device comprising a camera;
C) circuitry for deriving control commands from movements captured by said camera and for transmitting them to the controlled device.
As explained above, the camera and the controlled device can be integral to a mobile device, or separate therefrom.
The invention is further directed to a mobile phone comprising a camera, a projector and circuitry for deriving control commands from movements captured by said camera and for operating said projector therewith.
In the drawings:
As will be appreciated by the skilled person, the invention is not limited to use with any particular device. However, for the purpose of illustration, in the description to follow reference will be made to a projector as the device that is controlled by the mobile phone.
Modern mobile phones can incorporate a built-in projector (referred to hereinafter as "phone projector") and, therefore, in the example to follow reference will be made to such an integrated phone projector. It is however understood by the skilled person, as will be further elaborated in the following description, that the invention is also suitable for use with non-integrated devices, e.g., a projector that is not integrated within the mobile phone.
Referring now to the drawings,
According to the invention the camera captures the movement of the presenter's hands, which are translated into instructions to the projector. For instance, swiping the hand from right to left may mean “next slide”, and from left to right may mean “previous slide”.
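The swipe-to-command mapping described above can be sketched as follows. The function name, the use of normalized x-coordinates, and the travel threshold are illustrative assumptions, not details taken from the specification.

```python
def classify_swipe(x_positions, min_travel=0.2):
    """Classify a horizontal swipe from the hand's normalized x-positions (0..1).

    Right-to-left motion maps to "next slide"; left-to-right motion maps to
    "previous slide". Returns None when the total travel is below min_travel,
    i.e., no deliberate gesture was made.
    """
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]
    if abs(travel) < min_travel:
        return None
    return "next slide" if travel < 0 else "previous slide"
```

A small dead zone (`min_travel`) prevents incidental hand jitter from being interpreted as a slide-change command.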
An additional example is a movie player with virtual button controls. When a user wants to play a movie, he makes with his hands a gesture that is configured to be interpreted as a "Play" command. A camera pointed at the user captures the user's movements, and an algorithm engine analyzes the frames captured by the camera. The algorithm engine detects that the user made a gesture that is recognized as pressing a "Play" button with his hand and delivers the corresponding event to the movie player application. The movie player application then initiates the playing of a movie.
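The routing of a recognized gesture event to the movie player application can be sketched as below. The dispatcher, the gesture label, and the class names are illustrative assumptions introduced for this sketch.

```python
class MoviePlayer:
    """Stand-in for the movie player application."""
    def __init__(self):
        self.playing = False

    def play(self):
        self.playing = True


class GestureDispatcher:
    """Routes gestures recognized by the algorithm engine to application
    callbacks. The gesture labels are illustrative."""
    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        self._handlers[gesture] = handler

    def dispatch(self, gesture):
        handler = self._handlers.get(gesture)
        if handler is not None:
            handler()


player = MoviePlayer()
dispatcher = GestureDispatcher()
dispatcher.on("press_play", player.play)
# The algorithm engine has recognized the user pressing the virtual "Play" button:
dispatcher.dispatch("press_play")
```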
As will be appreciated by the skilled person, it is important to be able to capture and analyze images in poor light, particularly when operating a projector in relative darkness. This result is made possible by exploiting the capabilities of the mobile phone camera. A robust algorithm performs the gesture detection task well in poor lighting conditions by filtering background noise and detecting the user's hand (or another object held by the user) and its movement pattern and direction. The detection is based on analyzing a one-dimensional signal obtained by integrating the two-dimensional images along the direction orthogonal to the movement direction. Analysis of the peaks of this signal then yields the gesture detection.
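The integration of the two-dimensional image orthogonal to the movement direction can be sketched as follows, assuming NumPy, grayscale frames, and a horizontal swipe (so the integration runs along the vertical axis). The function names and the background-subtraction step are assumptions made for the sketch.

```python
import numpy as np

def column_profile(frame):
    """Integrate a 2-D grayscale frame along the vertical axis (orthogonal to
    a horizontal swipe), yielding a 1-D signal over the x coordinate."""
    return frame.sum(axis=0)

def hand_position(frame, background_profile):
    """Estimate the hand's x position as the dominant peak of the
    background-subtracted column profile.

    `background_profile` is the column profile of an empty scene; subtracting
    it filters static background content from the 1-D signal."""
    signal = column_profile(frame) - background_profile
    return int(np.argmax(np.abs(signal)))
```

Tracking this peak across successive frames gives the movement pattern and direction referred to above.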
In order to recognize hand gestures in low light, in one embodiment of the invention the following procedure is followed:
Detection of the gesture in the Z axis (vertical)

Detection criteria:

|Sn − A| > D · EnvTh    (1)

|Sn| > BaseTh    (2)

Sn − Sn−1 > Sth (for moving-heads cancellation)    (3)

Where,

Direction estimation:

Features for gesture separation:
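Detection criteria (1)-(3) above can be sketched as a single predicate. The symbol definitions were not preserved in this text, so this sketch assumes that Sn is the current signal sample, Sn−1 the previous sample, A a running average of the signal, D a scale factor, and EnvTh, BaseTh and Sth tunable thresholds; all of these interpretations are assumptions.

```python
def gesture_detected(s_n, s_prev, avg, d, env_th, base_th, s_th):
    """Apply detection criteria (1)-(3).

    Assumed meanings (not confirmed by the specification): s_n is the current
    signal sample, s_prev the previous sample, avg (A) a running average of
    the signal, d (D) a scale factor; env_th, base_th and s_th are thresholds.
    """
    return (abs(s_n - avg) > d * env_th   # (1) deviates from the envelope
            and abs(s_n) > base_th        # (2) exceeds the base threshold
            and s_n - s_prev > s_th)      # (3) moving-heads cancellation
```

All three criteria must hold simultaneously for a sample to count toward a detected gesture; criterion (3) rejects slow drifts such as a moving head in the frame.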
As will be appreciated by the skilled person, the invention also permits controlling other equipment, such as for instance a standalone projector or a computer, as long as connectivity is established between the mobile phone and such equipment. For instance, the connection between the cellular phone and the equipment can be established over WiFi, in which case the mobile phone operates as in the above example by "reading" the hand movements of the presenter, but then transmits the appropriate commands to the external equipment over WiFi, instead of internally to a built-in device.
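Transmission of a command to external equipment over a WiFi network can be sketched with a plain TCP socket. The wire format (one JSON object per line) and the function name are assumptions introduced for this sketch; the specification does not prescribe any particular protocol.

```python
import json
import socket

def send_command(command, host, port):
    """Send one control command to external equipment over the network.

    Assumed wire format: a single JSON object terminated by a newline,
    sent over a TCP connection (illustrative only)."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        payload = json.dumps({"cmd": command}) + "\n"
        sock.sendall(payload.encode("utf-8"))
```

The receiving equipment (e.g., a standalone projector or a computer driving it) would read one line per command and act on the `cmd` field.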
The above examples and description have been provided for the purpose of illustration and are not intended to limit the invention in any way. Many different types of mobile devices (e.g., PDAs) provided with a camera can be used, in conjunction with any suitable built-in or external device, without exceeding the scope of the invention.
Number | Name | Date | Kind |
---|---|---|---|
6002808 | Freeman | Dec 1999 | A |
6095989 | Hay et al. | Aug 2000 | A |
6266061 | Doi | Jul 2001 | B1 |
20040189720 | Wilson | Sep 2004 | A1 |
20040190776 | Higaki | Sep 2004 | A1 |
20060093186 | Ivanov | May 2006 | A1 |
20070229650 | McKay | Oct 2007 | A1 |
20080273755 | Hildreth | Nov 2008 | A1 |
20080291160 | Rabin | Nov 2008 | A1 |
20100039378 | Yabe | Feb 2010 | A1 |
20100138797 | Thorn | Jun 2010 | A1 |
20100159981 | Chiang | Jun 2010 | A1 |
20100315413 | Izadi et al. | Dec 2010 | A1 |
20110119638 | Forutanpour | May 2011 | A1 |
20120056971 | Kumar et al. | Mar 2012 | A1 |
Number | Date | Country |
---|---|---|
0823683 | Feb 1998 | EP |
WO2008083205 | Jul 2008 | WO |
WO2008139399 | Nov 2008 | WO |
WO2010086866 | Aug 2010 | WO |
Entry |
---|
XP 025272185, Hands and face tracking for VR applications, Varona et al., Computer & Graphics 29 (2005) 179-187. |
European search report for European application serial 11180613.9-2224; mailed Dec. 19, 2011, 10 pages. |
Number | Date | Country | |
---|---|---|---|
20120062453 A1 | Mar 2012 | US |