According to the invention, the behavior of at least a part of the human body, e.g. the head including the face, or any part of the face such as the eyes, chin, mouth, lips, nose, ears, cheeks, hair, or beard, or any combination thereof, is associated with remote-control operations of a multimedia system, and this association is the basis for generating an indication signal to the multimedia system, thereby changing the operation behavior of the multimedia system. In this way, the multimedia device can be operated more easily and directly. In addition, an intelligent remote-control method can be developed according to the invention and applied in a remote-control system to generate an indication signal in response to detection results obtained in a situation, so as to control the operation of the multimedia device without the need for the user to give direct commands. The following embodiments illustrate different ways to implement the invention, wherein face and eye behavior, for example, is associated with remote-control operations of the multimedia system, such as playing contents or changing settings of the multimedia system.
The multimedia system 100 is a system for processing multimedia data or contents and reproducing the multimedia contents accordingly. The multimedia system 100 has at least one operation mode, e.g. a playing mode, and the operation mode has at least one control operation, such as playing, pausing, or stopping. As another example, the multimedia system 100 has a menu mode and the menu mode has at least one menu selection operation, and the menu selection operation is used for setting or controlling the multimedia system 100. At least one conditional parameter is associated with at least one control operation.
The remote-control system 130 generates an indication signal according to the behavior of a person, based on the face or at least one part of the face, to change the operation of the multimedia system 100. The remote-control system 130 includes a detection device 140 and an image-analyzing-and-indication-generating unit 150. The remote-control system 130 can make use of one or more detection or tracking techniques, such as object detection, object tracking, human face detection, or any combination thereof, to detect the behavior of any person present.
The detection device 140 is a device for detecting human bodies or acquiring images, such as a charge-coupled device (CCD), a digital camera, a webcam, or a digital video capture device. The detection device 140 is used for detecting any existing face, or at least one part of any existing face, according to an operation mode of the multimedia system in order to generate a face image signal. The detection device 140 is configured to perform, for example, auto-zooming, human face feature acquisition, or eyeball tracking and locating.
The image-analyzing-and-indication-generating unit 150 is coupled to the detection device 140 and is used for analyzing at least the part of the detected existing face according to the face image signal to generate at least one characteristic parameter correspondingly. In addition, the image-analyzing-and-indication-generating unit 150 is further used to determine whether to generate at least one indication signal to the multimedia system 100 according to the at least one characteristic parameter and at least one conditional parameter. For implementation, the image-analyzing-and-indication-generating unit 150 compares the at least one characteristic parameter to the at least one conditional parameter associated with at least one control operation of the operation mode in order to generate a comparison result, and generates at least one indication signal to the multimedia system according to the comparison result. For example, the image-analyzing-and-indication-generating unit 150 can be implemented with or include a digital image processor, such as a digital image acquisition device, a microprocessor, or a system-on-chip. In addition, the image-analyzing-and-indication-generating unit 150 can include memory for storing data.
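For illustration only, the following Python sketch shows one possible shape of the data handled by the image-analyzing-and-indication-generating unit 150; the class names, fields, and the boolean comparison result are hypothetical simplifications, not the exact implementation of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class CharacteristicParameters:
    # Derived by analyzing the face image signal (hypothetical fields).
    face_count: int       # e.g. the face detection number Nf
    open_eye_count: int   # e.g. the eye detection number Ne

@dataclass
class ConditionalParameters:
    # Associated with a control operation of an operation mode (hypothetical fields).
    viewer_number: int               # Na
    delay_time_s: float              # Td, in seconds
    remaining_viewer_percent: float  # p %

def compare(chars: CharacteristicParameters, conds: ConditionalParameters) -> bool:
    """Generate a simplified comparison result: do the characteristic
    parameters satisfy the conditional parameters?  The full mapping of
    comparison results to play/pause/stop indications is sketched in the
    playing-mode embodiment below."""
    return chars.face_count >= conds.viewer_number and chars.open_eye_count > 0
```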
The playing device 110 is, for example, a DVD player, a digital recording/reproducing device, or a multimedia player that supports a wireless network. In response to at least the indication signal, the playing device 110 processes multimedia data; it can also be configured or set according to the indication signal. The playing device 110 processes and outputs multimedia signals. The playing device 110 can be a processing device capable of processing multimedia data, such as a computer system or an audio/video player or recorder. The playing device 110 can be implemented and configured to read storage media, such as optical disks, hard disks, or memory cards, and can also be implemented and configured to access digital image, video, or audio data for processing or reproduction in either a wired or wireless manner.
The display device 120, such as a flat-panel display, a liquid crystal display, a cathode-ray tube display, or a projection display, is employed to reproduce video and audio according to the multimedia signal outputted from the playing device 110.
In addition, the playing device 110 and display device 120 can be combined as a unit for multimedia data processing and multimedia reproduction, such as a desktop or notebook computer system, digital television, analog television, or video-conference system.
Unlike the multimedia system 100 in
In
In the following embodiments of the invention, the multimedia system may have one or more operation modes. Firstly, a playing mode is taken as an example of the operation mode to illustrate an embodiment of a remote-control method according to the invention. Next, a menu mode is taken as an example of the operation mode to illustrate another embodiment of a remote-control method according to the invention. It is noted that setting conditional parameters can be implemented in different ways, and thus the step of setting conditional parameters in the following embodiments can be performed in different sequences or ways, as mentioned above.
As shown in step 410, a number of conditional parameters are set, including (1) a viewer number Na, (2) a delay time Td, and (3) a remaining viewer threshold p %, which are associated with the playing, pausing, and stopping operations of the playing mode. As indicated in step 420, according to the playing mode, any face of any existing viewer and at least any eye of any existing face are detected by using the detection device so as to generate a face image signal accordingly. In step 430, any detected existing face and at least any eye of any existing face are analyzed according to the face image signal so as to generate corresponding characteristic parameters, including (1) a face detection number Nf indicating how many faces are detected and (2) an eye detection number Ne indicating how many open eyes are detected. As indicated in step 440, the characteristic parameters generated according to the face image signal are compared to at least one conditional parameter associated with at least one control operation of the playing mode so as to generate an indication signal to the multimedia system. In addition, after step 440, the method according to this embodiment can return to step 420 to repeat the face and eye detection and to generate further indication signals to the multimedia system.
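Purely as a sketch of steps 410 to 440 and the return to step 420, the following Python loop models the playing-mode flow; the detector, analyzer, and multimedia_system objects, the detection interval, and the default parameter values are assumptions made for illustration.

```python
import time

def playing_mode_loop(detector, analyzer, multimedia_system, compare_and_indicate,
                      viewer_number_na=2, delay_time_td_s=600,
                      remaining_viewer_threshold_p=50):
    """Step 410: the conditional parameters Na, Td and p % are set (modeled
    here as default arguments; the values are assumptions). Steps 420 to 440
    are then repeated."""
    conditional = {"Na": viewer_number_na, "Td": delay_time_td_s,
                   "p": remaining_viewer_threshold_p}
    state = {"paused_since": None}  # tracks when a pausing operation started
    while True:
        # Step 420: detect any existing face and any eye of any existing face
        # to generate a face image signal.
        face_image_signal = detector.capture()
        # Step 430: analyze the face image signal to obtain the characteristic
        # parameters Nf (faces detected) and Ne (open eyes detected).
        nf, ne = analyzer.count_faces_and_open_eyes(face_image_signal)
        # Step 440: compare characteristic and conditional parameters and, if
        # needed, send an indication signal to the multimedia system.
        signal = compare_and_indicate(nf, ne, conditional, state)
        if signal is not None:
            multimedia_system.apply(signal)
        time.sleep(1)  # detection interval; an assumption, not specified
```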
Step 430 of this embodiment takes the face detection number Nf and the eye detection number Ne as examples of characteristic parameters. Step 440 can be implemented with different comparisons based on these two characteristic parameters to generate at least one indication signal to the multimedia system. Referring to
In
In step 441 or 442, if the result is negative, step 443 is performed to determine whether the face detection number Nf is zero. If so, step 444 is performed to determine whether the pausing operation has been started and the delay time Td has elapsed. If the result of step 443 or 444 is negative, the indication signal indicates performing the pausing operation, as indicated in steps 449 and 460.
If the result of step 444 is affirmative (i.e. the pausing operation has been started for at least the delay time Td, e.g. for 11 minutes), the indication signal indicates performing the stopping operation, as indicated in steps 448 and 460.
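The decision logic of steps 441 to 449 can be pictured with the sketch below. The excerpt above does not reproduce the conditions checked in steps 441 and 442, so the ones used here (enough viewers remaining and at least one open eye) and the branch that indicates the playing operation are assumptions filled in for illustration only.

```python
import time
from typing import Optional

def compare_and_indicate(nf, ne, conditional, state) -> Optional[str]:
    """Step 440, sketched from steps 441 to 449.
    nf, ne      -- characteristic parameters (face count, open-eye count)
    conditional -- dict with "Na", "Td" (seconds) and "p" (percent)
    state       -- mutable dict remembering when a pausing operation started
    """
    # Steps 441/442 (assumed conditions): enough viewers remain and at least
    # one eye is open, so playback may continue or resume.
    enough_viewers = nf * 100 >= conditional["Na"] * conditional["p"]
    if enough_viewers and ne > 0:
        state["paused_since"] = None
        return "play"  # assumed playing branch
    # Step 443: is the face detection number Nf zero?
    if nf == 0:
        # Step 444: has the pausing operation been started for at least Td?
        started = state.get("paused_since")
        if started is not None and time.time() - started >= conditional["Td"]:
            return "stop"  # steps 448 and 460
    # Steps 449 and 460: otherwise indicate the pausing operation.
    if state.get("paused_since") is None:
        state["paused_since"] = time.time()
    return "pause"
```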
In
In the embodiments illustrated by
As such, the above embodiments of remote control can be referred to as "intelligent" remote control. That is, a remote-control system according to one or more of these embodiments can generate an indication signal, in response to detection results obtained in a situation, so as to control the operation of the multimedia device without the need for the user to give direct commands.
Moreover, another embodiment of the remote-control method according to the invention is shown, taking a menu mode as the operation mode of the multimedia system. Referring to
When the multimedia system is in the menu mode, the display device 520 displays one or more options, called remote control options, for the user to select. To select an option, the user focuses on one of the objects representing the options, such as a graphic, image, block, region, or region of text. If the duration of the eye focus satisfies a corresponding condition, e.g. the eye focus stays on a remote control option for a few seconds (e.g. 3 or more seconds), the option is selected successfully.
In
Referring to
Next, in step 340, it is determined whether the direction of the eye focus points at a display screen of the display device 520. If so, the location at which the direction of the eye focus points is determined, and it is further determined whether that location corresponds to a remote control option on the display screen, such as the option 522 illustrated in
In step 350, if the comparison result in step 340 is affirmative, that is, the direction of the eye focus points at a remote control option and the duration of the eye focus Tf (e.g. about 3.7 seconds) is larger than or equal to the eye-focus-duration threshold TF0 (e.g. about 3 seconds), an indication signal is generated, indicating that the remote control option is selected.
In step 340 of this embodiment, the detection device 540 and the image-analyzing-and-indication-generating unit 550 can determine the direction and distance from the detection device 540 to any detected face or eye, as illustrated by an arrow 580, and determine the direction of the eye focus, as illustrated by an arrow 590. Further, whether the direction of the eye focus points at an option on the display screen can be determined by table lookup or by geometrical operations. For example, a look-up table is predetermined, indicating the relationship between different directions of eye focus (e.g. in angles) and different regions on the display screen (e.g. in numbers), so that the region on the display screen corresponding to a given direction of eye focus can be determined through table lookup.
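A table lookup of the kind just described could look like the following Python sketch; the angular bins, the region numbering, and the mapping of regions to options are hypothetical example values, not values given in the disclosure.

```python
from typing import Optional

# Hypothetical look-up table: ranges of eye-focus direction (horizontal and
# vertical angles in degrees, relative to the detection device) mapped to
# numbered regions on the display screen.
FOCUS_DIRECTION_TABLE = [
    # (h_min, h_max, v_min, v_max, region number)
    (-30, -10, -10, 10, 1),
    (-10,  10, -10, 10, 2),
    ( 10,  30, -10, 10, 3),
]

# Hypothetical association of screen regions with remote control options.
REGION_TO_OPTION = {1: "backward", 2: "play/pause", 3: "forward"}

EYE_FOCUS_DURATION_THRESHOLD_TF0 = 3.0  # seconds, per the example above

def select_option(h_angle: float, v_angle: float,
                  focus_duration_tf: float) -> Optional[str]:
    """Steps 340 and 350: look up the screen region pointed at by the eye
    focus, and indicate the corresponding option as selected only if the
    focus has lasted at least TF0."""
    region = None
    for h_min, h_max, v_min, v_max, number in FOCUS_DIRECTION_TABLE:
        if h_min <= h_angle < h_max and v_min <= v_angle < v_max:
            region = number
            break
    option = REGION_TO_OPTION.get(region)
    if option is not None and focus_duration_tf >= EYE_FOCUS_DURATION_THRESHOLD_TF0:
        return option  # indication signal: this remote control option is selected
    return None
```

Under these assumed bins, for instance, select_option(0.0, 0.0, 3.7) would report the "play/pause" option as selected, whereas a focus lasting only 2 seconds would select nothing.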
According to the embodiment of the remote-control method in the menu mode, various applications can be developed, as follows.
For example, when the eyeball focuses on an option for "next" or "previous" for at least the eye-focus-duration threshold TF0, an indication signal is generated, indicating that the currently displayed content, such as a video program or a song, should start changing to the next or previous section and that such changing should continue until another remote control option is executed. When the eyeball focuses on a location not corresponding to any remote control option, this operation stops.
When the eyeball focuses on an option for "forward" or "backward", as illustrated by the option 521 or 522, for at least the eye-focus-duration threshold TF0, an indication signal is generated, indicating that the "forward" or "backward" operation should start and should speed up if the eyeball still focuses on the same option after the operation is performed. The selected operation continues until another remote control option is executed, and stops when the eyeball focuses on a location not corresponding to any remote control option.
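As a hedged sketch of this forward/backward behavior, the loop below speeds the operation up each time the gaze is found still on the same option; the gaze_option() callback, the player interface, and the speed steps are assumptions for illustration.

```python
import time

def run_forward_or_backward(player, gaze_option, selected_option,
                            poll_interval_s=0.5):
    """Repeat the selected "forward" or "backward" operation, speeding it up
    while the eyeball still focuses on the same option. Stop when the focus
    leaves every remote control option, or hand over when another option is
    focused. gaze_option() is assumed to return the option currently focused
    on, or None when no option is focused."""
    speed = 1
    while True:
        player.step(selected_option, speed)  # perform one forward/backward step
        time.sleep(poll_interval_s)
        current = gaze_option()
        if current == selected_option:
            speed += 1        # still focused on the same option: speed up
        elif current is None:
            return None       # focus left all options: this operation stops
        else:
            return current    # another remote control option is executed next
```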
When the eyeball focuses on a sound volume option, as illustrated by a plus sign “+” in
According to the invention, eye tracking can also be used to control the menu. For example, if there are 10 remote control options and the display screen can only display 5 of them at a time, turning to another page of the menu can be performed by the eye tracking technique. In addition, when the multimedia contents are playing, the remote control options can be closed or hidden by the eye tracking technique.
Moreover, for implementation, in the various embodiments of the remote-control method, steps such as steps 330 to 350 or the corresponding steps can be implemented as software, such as a program or program functions, operating with a database or data structures for recording the relationships among conditional parameters, remote control operations, and their associations. In addition, the embodiments of the invention can also be implemented in hardware or firmware, or any combination thereof.
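One possible in-memory form of such a data structure, offered only as a sketch, is a table keyed by operation mode and control operation that holds the associated conditional parameters; the particular assignment of parameters to operations and the numeric values below are assumptions echoing the examples above.

```python
# Hypothetical table recording the relationships among operation modes,
# control operations, and conditional parameters (Na, Td, p %, TF0).
REMOTE_CONTROL_ASSOCIATIONS = {
    "playing_mode": {
        "playing":  {"viewer_number_Na": 2},
        "pausing":  {"remaining_viewer_threshold_p_percent": 50},
        "stopping": {"delay_time_Td_seconds": 600},
    },
    "menu_mode": {
        "option_selection": {"eye_focus_duration_threshold_TF0_seconds": 3.0},
    },
}
```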
In the face-detection-based remote-control system and method and the face-detection-based remote-controllable multimedia system according to the above various embodiments of the invention, the behavior of at least some part of a human body is associated with operations for remotely controlling the multimedia system, and an indication signal is generated based on this association and outputted to the multimedia system, thereby changing the operation or behavior of the multimedia system. As such, the multimedia system can be controlled more easily and directly. In addition, an intelligent remote-control method can be developed according to the invention and applied in a remote-control system to generate an indication signal in response to detection results obtained in a situation, so as to control the operation of the multimedia device without the need for the user to give direct commands.
While the invention has been described by way of example and in terms of a preferred embodiment, it is to be understood that the invention is not limited thereto. On the contrary, it is intended to cover various modifications and similar arrangements and procedures, and the scope of the appended claims therefore should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements and procedures.
Number | Date | Country | Kind
--- | --- | --- | ---
95131985 | Aug 2006 | TW | national