This application claims the benefit of Korean Patent Application No. 10-2009-0133171, filed on 29 Dec. 2009, which is hereby incorporated by reference.
1. Field
This document relates to a display device and a control method thereof and, more particularly, to a display device and a control method thereof that execute a function corresponding to a specific gesture of a user when the gesture exceeds a threshold set for that user, so that the display device operates in the manner most suitable for the gesture range of each user.
2. Related Art
As the functions of terminals such as personal computers, laptop computers, cellular phones and the like are diversified, the terminals are constructed in the form of multimedia players having multiple functions such as capturing pictures or moving images, playing music or moving image files, playing games, and receiving broadcast programs.
A terminal as a multimedia player can be referred to as a display device since it generally has a function of displaying video information.
Terminals can be divided into mobile terminals and stationary terminals. Examples of the mobile terminal include laptop computers, cellular phones, etc., and examples of the stationary terminal include television systems, monitors for desktop computers, etc.
An aspect of this document is to provide a display device and a control method thereof that execute a function corresponding to a specific gesture of a user when the gesture exceeds a threshold set for that user, so that the display device operates in the manner most suitable for the gesture range of each user.
The accompanying drawings, which are included to provide a further understanding of this document and are incorporated in and constitute a part of this specification, illustrate embodiments of this document and, together with the description, serve to explain the principles of this document.
This document will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of this document are shown. This document may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of this document to those skilled in the art.
Hereinafter, a mobile terminal relating to this document will be described in more detail with reference to the accompanying drawings. In the following description, the suffixes "module" and "unit" are given to components of the mobile terminal only to facilitate description and do not have meanings or functions that distinguish them from each other.
The mobile terminal described in this specification can include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system and so on.
As shown, the display device 100 may include a communication unit 110, a user input unit 120, an output unit 150, a memory 160, an interface 170, a controller 180, and a power supply 190. Not all of the components shown are essential, and a display device having more or fewer components may alternatively be implemented.
The communication unit 110 may include at least one module that enables communication between the display device 100 and a communication system or between the display device 100 and another device. For example, the communication unit 110 may include a broadcasting receiving module 111, an Internet module 113, and a near field communication module 114.
The broadcasting receiving module 111 may receive broadcasting signals and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
The broadcasting channel may include a satellite channel and a terrestrial channel, and the broadcasting management server may be a server that generates and transmits broadcasting signals and/or broadcasting related information, or a server that receives previously created broadcasting signals and/or broadcasting related information and transmits them to a terminal. The broadcasting signals may include not only TV broadcasting signals, radio broadcasting signals, and data broadcasting signals but also signals in the form of a combination of a TV broadcasting signal or a radio broadcasting signal and a data broadcasting signal.
The broadcasting related information may be information on a broadcasting channel, a broadcasting program or a broadcasting service provider, and may be provided even through a communication network.
The broadcasting related information may exist in various forms. For example, the broadcasting related information may exist in the form of an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system or in the form of an electronic service guide (ESG) of a digital video broadcast-handheld (DVB-H) system.
The broadcasting receiving module 111 may receive broadcasting signals using various broadcasting systems. The broadcasting signals and/or broadcasting related information received through the broadcasting receiving module 111 may be stored in the memory 160.
The Internet module 113 may correspond to a module for Internet access and may be included in the display device 100 or may be externally attached to the display device 100.
The near field communication module 114 may correspond to a module for near field communication. Further, Bluetooth®, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB) and/or ZigBee® may be used as a near field communication technique.
The user input unit 120 is used to input an audio signal or a video signal and may include a camera 121 and a microphone 122.
The camera 121 may process image frames of still images or moving images obtained by an image sensor in a video telephony mode or a photographing mode. The processed image frames may be displayed on a display 151. The camera 121 may be a 2D or 3D camera. In addition, the camera 121 may be configured in the form of a single 2D or 3D camera or in the form of a combination of the 2D and 3D cameras.
The image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the communication unit 110. The display device 100 may include at least two cameras 121.
The microphone 122 may receive an external audio signal in a call mode, a recording mode or a speech recognition mode and process the received audio signal into electric audio data. The microphone 122 may employ various noise removal algorithms for removing or reducing noise generated when the external audio signal is received.
The output unit 150 may include the display 151 and an audio output module 152.
The display 151 may display information processed by the display device 100. The display 151 may display a user interface (UI) or a graphic user interface (GUI) relating to the display device 100. In addition, the display 151 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display and a three-dimensional display. Some of these displays may be of a transparent type or a light transmissive type. That is, the display 151 may include a transparent display. The transparent display may include a transparent liquid crystal display. The rear structure of the display 151 may also be of a light transmissive type. Accordingly, a user may see an object located behind the terminal body through the transparent area of the terminal body occupied by the display 151.
The display device 100 may include at least two displays 151. For example, the display device 100 may include a plurality of displays 151 that are arranged on a single face at a predetermined distance from one another or integrated with one another. The plurality of displays 151 may also be arranged on different sides.
Further, when the display 151 and a sensor sensing touch (hereafter referred to as a touch sensor) form a layered structure that is referred to as a touch screen, the display 151 may be used as an input device in addition to an output device. The touch sensor may be in the form of a touch film, a touch sheet, or a touch pad, for example.
The touch sensor may convert a variation in pressure applied to a specific portion of the display 151 or a variation in capacitance generated at a specific portion of the display 151 into an electric input signal. The touch sensor may sense pressure of touch as well as position and area of the touch.
When the user applies a touch input to the touch sensor, a signal corresponding to the touch input may be transmitted to a touch controller. The touch controller may then process the signal and transmit data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect a touched portion of the display 151.
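For illustration only, the following minimal sketch models this signal path: raw sensor readings are converted into touch data by a touch controller object and forwarded to the main controller. All class names, method names, and the capacitance-to-pressure scaling are hypothetical and are not taken from this specification.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: int            # touched position in pixels
    y: int
    pressure: float   # normalized pressure, 0.0 to 1.0

class TouchController:
    """Hypothetical touch controller: converts a raw sensor signal
    into touch data and transmits it to the main controller."""
    def __init__(self, main_controller):
        self.main_controller = main_controller

    def on_raw_signal(self, x, y, capacitance):
        # Convert the capacitance variation into a normalized pressure
        # value (the divisor 100.0 is an assumed full-scale reading).
        pressure = min(capacitance / 100.0, 1.0)
        self.main_controller.on_touch(TouchEvent(x, y, pressure))

class MainController:
    """Stands in for the controller 180: receives processed touch data."""
    def on_touch(self, event):
        print(f"touched ({event.x}, {event.y}), pressure={event.pressure:.2f}")

touch = TouchController(MainController())
touch.on_raw_signal(120, 300, capacitance=85.0)  # touched (120, 300), pressure=0.85
```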
The audio output module 152 may output audio data received from the communication unit 110 or stored in the memory 160. The audio output module 152 may output audio signals related to functions, such as a call signal incoming tone and a message incoming tone, performed in the display device 100.
The memory 160 may store a program for operation of the controller 180 and temporarily store input/output data such as a phone book, messages, still images, and/or moving images. The memory 160 may also store data about vibrations and sounds in various patterns that are output when a touch input is applied to the touch screen.
The memory 160 may include at least one of a flash memory, a hard disk type memory, a multimedia card micro type memory, a card type memory such as SD or XD memory, a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk, and an optical disk. The display device 100 may also operate in relation to web storage that performs the storing function of the memory 160 on the Internet.
The interface 170 may serve as a path to all external devices connected to the display device 100. The interface 170 may receive data or power from the external devices and transmit the data or power to internal components of the display device 100, or transmit data of the display device 100 to the external devices. For example, the interface 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port, and/or an earphone port.
The controller 180 may control overall operations of the display device 100. For example, the controller 180 may perform control and processing for voice communication. The controller 180 may also include an image processor 182 for processing images, which will be explained later.
The power supply 190 receives external power and internal power and provides power required for each of the components of the display device 100 to operate under the control of the controller 180.
Various embodiments described in this document can be implemented in software, hardware or a computer readable recording medium. According to hardware implementation, embodiments of this document may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and/or electrical units for executing functions. The embodiments may be implemented by the controller 180 in some cases.
According to software implementation, embodiments such as procedures or functions may be implemented with a separate software module executing at least one function or operation. Software codes may be implemented according to a software application written in an appropriate software language. The software codes may be stored in the memory 160 and executed by the controller 180.
As shown, the display device 100 may acquire information on the body shape of a user U in step S10. The body shape information may be acquired based on an image obtained from the camera 121 included in the display device 100. That is, when the camera 121 captures an image of the user U, the obtained image is analyzed to acquire the body shape information of the user U. According to other embodiments of this document, the body shape information can be obtained without using the camera 121, which will be explained in detail later with reference to those embodiments.
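As a rough picture of step S10, the sketch below derives simple body measurements from pose keypoints found in a camera frame. The keypoint detector here is a hypothetical stub returning fixed pixel coordinates, and the centimeters-per-pixel constant stands in for camera calibration; neither is taken from this specification.

```python
from dataclasses import dataclass

@dataclass
class BodyShape:
    height_cm: float
    arm_length_cm: float

def detect_keypoints(frame):
    """Hypothetical stand-in for a pose estimator: returns pixel
    coordinates of the user's head, feet, shoulder, and hand."""
    return {"head": (320, 40), "feet": (320, 440),
            "shoulder": (280, 120), "hand": (150, 120)}

def acquire_body_shape(frame, cm_per_pixel=0.45):
    """Estimate the user's body shape information from one frame;
    cm_per_pixel would normally come from camera calibration."""
    kp = detect_keypoints(frame)
    height_px = kp["feet"][1] - kp["head"][1]
    arm_px = abs(kp["shoulder"][0] - kp["hand"][0])
    return BodyShape(height_cm=height_px * cm_per_pixel,
                     arm_length_cm=arm_px * cm_per_pixel)

print(acquire_body_shape(frame=None))  # BodyShape(height_cm=180.0, arm_length_cm=58.5)
```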
Upon the acquisition of the body shape information, the image processor 182 included in the controller 180 may process the acquired information.
Body shape information may be set based on the actual body shape of each user. The actual body shape of the user U can be acquired in an initial stage in which the display device 100 is operated through the camera 121, acquired through the camera 121 while the user U uses the display device 100, or acquired in such a manner that the user U personally inputs his/her body shape information to the display device 100. Though the body shape information is obtained prior to other operations in this embodiment, the time at which it is acquired is not limited thereto.
A user's gesture may be extracted from the image captured by the camera 121 in step S20.
The image of the user U, captured by the camera 121 set in or connected to the display device 100, may include a background image. If the image is photographed indoors, for example, the image can have furniture as the background of the user U. The user's gesture can be obtained by excluding the background image from the image.
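One minimal way to exclude the background, assuming a stored reference frame of the scene without the user, is simple frame differencing; a real system would use a more robust segmentation method, so treat this purely as a sketch.

```python
import numpy as np

def extract_user_mask(frame, background, threshold=30):
    """Mark pixels that differ from the stored background image
    by more than `threshold` gray levels as user pixels."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Toy 8x8 grayscale frames: the "user" occupies a 4x4 block.
background = np.full((8, 8), 50, dtype=np.uint8)   # e.g. furniture, walls
frame = background.copy()
frame[2:6, 2:6] = 200                              # user pixels
mask = extract_user_mask(frame, background)
print(int(mask.sum()), "user pixels found")        # 16 user pixels found
```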
The extracted user's gesture may be compared with the acquired body shape information to determine the user's gesture in step S30.
The user's gesture and the body shape information may be acquired through the above operations, and thus the user's gesture and the body shape information can be compared to each other.
When it is determined from the comparison result that the user's gesture exceeds a threshold, a function mapped to the gesture may be executed in step S40.
The threshold can be set based on the body shape information of the user U. The controller 180 may determine whether the user's gesture exceeds the threshold set for that user.
The threshold may be an appropriate or an inappropriate value depending on the standard used to set it. For example, if the threshold is set based on a tall adult, a gesture of a small child can be recognized as a gesture that does not reach the threshold. Accordingly, the channel of the display device 100 may not be changed even when the small child raises the left arm with the intention of changing the channel of the display device 100. On the contrary, when the threshold is set based on the small child, the channel of the display device 100 may be changed even when a tall adult slightly raises the left arm unconsciously. Accordingly, an appropriate threshold is required to prevent the display device 100 from operating incorrectly.
In the current embodiment of this document, the threshold may be set based on the body shape information of the user U of the display device 100, which has been acquired in the above operation S10. The controller 180 can therefore set a threshold suited to each individual user.
A mapped function is a specific function corresponding to a specific gesture. For example, a user's gesture of raising the left arm to the left can be mapped to the function of changing the channel of the display device 100, as described above. Since a specific gesture of the user U is mapped to a specific function, an additional device for controlling the display device 100, such as a remote controller, may not be needed. This can improve the convenience of use.
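Putting steps S30 and S40 together, one plausible shape for the logic is a per-user threshold derived from the body shape information plus a table mapping recognized gestures to functions. The 60% ratio, the gesture names, and the functions below are illustrative assumptions, not values from this specification.

```python
# A raise gesture counts only if the hand travels farther than an
# assumed fraction of that user's own arm length.
THRESHOLD_RATIO = 0.6

def exceeds_threshold(hand_travel_cm, arm_length_cm, ratio=THRESHOLD_RATIO):
    return hand_travel_cm > arm_length_cm * ratio

def change_channel():
    print("channel changed")

def change_volume():
    print("volume changed")

# Specific gestures mapped to specific functions (step S40).
GESTURE_MAP = {"raise_left_arm": change_channel,
               "raise_right_arm": change_volume}

def handle_gesture(name, hand_travel_cm, arm_length_cm):
    # Step S30: compare the gesture against the user's own threshold.
    if exceeds_threshold(hand_travel_cm, arm_length_cm):
        GESTURE_MAP[name]()   # step S40: execute the mapped function
    else:
        print("gesture below threshold; ignored")

# The same 40 cm hand movement is a deliberate gesture for a child with
# 50 cm arms but an incidental movement for an adult with 75 cm arms.
handle_gesture("raise_left_arm", hand_travel_cm=40, arm_length_cm=50)  # fires
handle_gesture("raise_left_arm", hand_travel_cm=40, arm_length_cm=75)  # ignored
```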
As shown, the operation S10 of acquiring the body shape information of the user U may be performed through the following sub-operations.
Preliminary data for determining the body shape of the user U can be acquired using the camera 121 in the present embodiment. That is, the body shape information of the user U can be extracted from the image captured by the camera 121.
As shown, the body shape information of the user can be directly acquired from the user image UI in the current embodiment of this document.
As shown, the operation S20 of extracting the user's gesture from the image captured by the camera 121 and the operation S30 of comparing the extracted user's gesture to the body shape information may include an operation S22 of capturing the user's gesture through the camera 121 and an operation S24 of extracting the user's gesture from the captured image.
The operations S22 and S24 of taking images of the user U and extracting the user image UI from the taken images TI1 and TI2 are identical to the aforementioned operations. However, the operations S22 and S24 will now be described for gestures that are made by different users but recognized to be identical.
The first and second taken images TI1 and TI2 are different from each other, and thus it can be considered that the two users make their gestures with different intentions. In other words, while it is quite possible that the user corresponding to the first taken image TI1 makes the gesture with the intention of executing a specific function, there is a high possibility that the user corresponding to the second taken image TI2 makes the gesture accidentally. However, the user images respectively extracted from the first and second taken images TI1 and TI2 may be similar to each other.
Because the user images are extracted from the taken images TI1 and TI2 in a rough manner, as described above, the user image extracted from the first taken image TI1, of a child whose arms are shorter than those of an adult and who fully raises the left arm, and the user image extracted from the second taken image TI2, of an adult who half raises the left arm, may represent similar arm lengths and arm shapes even though the first and second taken images TI1 and TI2 are different from each other.
The user's gesture is then recognized based on the body shape information in operation S34, and the recognized user's gesture is specified in operation S36.
While the required information is acquired from the user image UI, the body shape information about the user U can be loaded from the memory 160.
When the height of the user in the user image UI is 180 cm, the arm length of the user is estimated to be 70 cm through a table that maps user heights to arm lengths.
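That height-to-arm-length mapping can be pictured as a small lookup table with linear interpolation between entries. Only the 180 cm to 70 cm pair comes from the text above; the other table values are made up for illustration.

```python
# Height (cm) -> estimated arm length (cm). Only the (180, 70) entry
# appears in the text; the rest are illustrative.
HEIGHT_TO_ARM = [(120, 45), (150, 58), (180, 70), (200, 78)]

def estimate_arm_length(height_cm):
    """Linearly interpolate an arm length from the height table."""
    if height_cm <= HEIGHT_TO_ARM[0][0]:
        return HEIGHT_TO_ARM[0][1]
    for (h0, a0), (h1, a1) in zip(HEIGHT_TO_ARM, HEIGHT_TO_ARM[1:]):
        if height_cm <= h1:
            t = (height_cm - h0) / (h1 - h0)
            return a0 + t * (a1 - a0)
    return HEIGHT_TO_ARM[-1][1]

print(estimate_arm_length(180))  # 70.0, as in the example above
print(estimate_arm_length(165))  # 64.0, interpolated between table entries
```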
When the user's gesture is recognized based on the body shape information of the user, the function mapped to the gesture may be executed in step S40.
In the case of moving the object OB without using the body shape information, the display device 100 can display the object OB such that the object OB is moved by a first distance D1, which is a relatively short distance, for the gesture of the first user U1, who has a relatively small frame.
In the case of moving the object OB using the body shape information, the display device 100 can determine that the first user U1 makes a large gesture based on the body shape information of the first user U1. Accordingly, the display device 100 can display the object OB such that the object OB is moved by a second distance D2, which is a relatively long distance.
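In other words, the on-screen displacement can be scaled by how large the gesture is relative to the user's own reach rather than by its absolute size. A minimal sketch of that scaling, with an assumed full-range screen constant:

```python
SCREEN_RANGE_PX = 800  # assumed maximum travel of the object on screen

def object_displacement(hand_travel_cm, arm_length_cm,
                        screen_range_px=SCREEN_RANGE_PX):
    """Scale the hand travel by the user's reach so that a full-reach
    gesture moves the object equally far for any user."""
    relative = min(hand_travel_cm / arm_length_cm, 1.0)
    return relative * screen_range_px

# A small-framed user stretching fully moves the object as far as a
# large-framed user would, while the same absolute hand movement by
# the large-framed user moves it less.
print(object_displacement(50, 50))  # 800.0 (full reach, short arms)
print(object_displacement(50, 75))  # ~533.3 (partial reach, long arms)
```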
The above-described method of controlling the mobile terminal may be written as computer programs and may be implemented in digital microprocessors that execute the programs using a computer readable recording medium. The method of controlling the mobile terminal may be executed through software. The software may include code segments that perform required tasks. Programs or code segments may also be stored in a processor readable medium or may be transmitted according to a computer data signal combined with a carrier through a transmission medium or communication network.
The computer readable recording medium may be any data storage device that can store data that can be thereafter read by a computer system. Examples of the computer readable recording medium may include read-only memory (ROM), random-access memory (RAM), CD-ROMs, DVD±ROM, DVD-RAM, magnetic tapes, floppy disks, and optical data storage devices. The computer readable recording medium may also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
A mobile terminal may include a first touch screen configured to display a first object, a second touch screen configured to display a second object, and a controller configured to receive a first touch input applied to the first object and to link the first object to a function corresponding to the second object when receiving a second touch input applied to the second object while the first touch input is maintained.
A method of controlling a mobile terminal may also be provided that includes displaying a first object on the first touch screen, displaying a second object on the second touch screen, receiving a first touch input applied to the first object, and linking the first object to a function corresponding to the second object when a second touch input applied to the second object is received while the first touch input is maintained.
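A rough sketch of that linking behavior is given below; the object names, function names, and event methods are hypothetical stand-ins for the touch events described above.

```python
class DualScreenController:
    """Hypothetical controller: while a first touch on the first screen
    is maintained, a second touch on the second screen links the first
    object to the second object's function."""
    def __init__(self):
        self.held_object = None   # object under a maintained first touch
        self.links = {}           # first object -> linked function

    def on_first_touch(self, obj):
        self.held_object = obj

    def on_first_release(self):
        self.held_object = None

    def on_second_touch(self, function_name):
        if self.held_object is not None:
            # The first touch is still maintained, so create the link.
            self.links[self.held_object] = function_name
            print(f"linked {self.held_object!r} to {function_name!r}")
        else:
            print("no maintained first touch; nothing linked")

ctrl = DualScreenController()
ctrl.on_first_touch("photo.jpg")
ctrl.on_second_touch("send_by_email")  # linked while the touch is held
ctrl.on_first_release()
ctrl.on_second_touch("print")          # ignored: first touch released
```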
Any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of this document. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to effect such feature, structure, or characteristic in connection with other ones of the embodiments.
Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.
Number | Date | Country | Kind |
---|---|---|---
10-2009-0133171 | Dec 2009 | KR | national |