This application is the U.S. national phase of International Application No. PCT/KR2018/007935 filed Jul. 13, 2018 which designated the U.S. and claims priority to Korean Patent Application No. 10-2017-0150605 filed Nov. 13, 2017, the entire contents of each of which are hereby incorporated by reference.
The disclosure relates to a display apparatus and a control method thereof, and more particularly, to a display apparatus capable of performing an operation based on a user input and a control method thereof.
Recently, display apparatuses have performed various operations in response to a user input. Such a user input may include a touch-based input, and the display apparatus may perform various operations in response to the touch.
Although the display apparatus performs various operations, it is inconvenient that many touches are required to carry out those operations. Further, when a user input is made unintentionally, the display apparatus has to be prevented from performing an operation. In addition, there is a need to check whether the display apparatus has performed an operation desired by a user.
The related art provides no way to solve the foregoing problems. Accordingly, there is a need for a method of performing an operation desired by a user based on the user's minimal input, and a method of preventing an operation not desired by the user from being performed.
Accordingly, an aspect of the disclosure is to provide a display apparatus, a control method thereof, and a computer program product, in which a user touch input is minimized when the display apparatus performs an operation based on the user touch input, an operation is not carried out when a user touch input is made unintentionally, and a user can intuitively perceive whether an operation performed by the display apparatus is based on the user touch input.
According to an embodiment of the disclosure, there is provided a display apparatus comprising: a display configured to display a screen; a touch receiver configured to receive a user touch input; a sensor configured to detect a state of the display apparatus or surrounding states of the display apparatus; and a processor configured to: identify, based on the user touch input, the state of the display apparatus based on information detected by the sensor; control a first operation corresponding to a preset first state to be performed based on the display apparatus being in the first state; and control a second operation corresponding to a preset second state to be performed based on the display apparatus being in the second state.
The sensor is configured to detect a user position, and the processor is configured to perform the second operation based on a second user position different from a first user position corresponding to the first operation.
The processor is configured to: display a user interface (UI) comprising a plurality of menus on a screen; move a highlight in the plurality of menus in a first direction, based on the first user position; and move the highlight in the plurality of menus in an opposite direction to the first direction, based on the second user position.
The sensor is configured to detect change in a posture of the display apparatus, and the processor is configured to perform the second operation based on a second posture different from a first posture of the display apparatus corresponding to the first operation.
The processor is configured to prevent the first operation from being performed based on the touch input, based on the detected states.
The sensor is configured to detect movement of the display apparatus, and the processor is configured to prevent the first operation from being performed based on the display apparatus being moved more than a predetermined quantity.
The processor is configured to: display a UI on a screen; change the UI in a direction based on a first position, based on a first touch input corresponding to the first position on the screen; and change the UI in a direction based on a second position, based on the first touch input corresponding to the second position different from the first position on the screen.
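As a purely illustrative sketch of this position-dependent UI change (the ScrollableUI class, its scroll method, and the half-screen threshold are assumptions, not elements of the disclosure), the touched position could select the direction in which the UI moves:

```python
class ScrollableUI:
    """Minimal stand-in for a UI that can be shifted left or right."""
    def __init__(self) -> None:
        self.offset = 0

    def scroll(self, step: int) -> None:
        self.offset += step
        print(f"UI offset is now {self.offset}")

def change_ui(ui: ScrollableUI, touch_x: float, screen_width: float) -> None:
    """Change the UI in a direction determined by where the screen was touched:
    a touch on the left half moves it one way, a touch on the right half the other."""
    if touch_x < screen_width / 2:   # first position -> first direction
        ui.scroll(-1)
    else:                            # second position -> opposite direction
        ui.scroll(+1)
```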
According to another embodiment of the disclosure, there is provided a method of controlling a display apparatus, the method comprising: identifying a state of the display apparatus based on information detected by a sensor, based on a user touch input; performing a first operation corresponding to a preset first state, based on the display apparatus being in the first state; and performing a second operation corresponding to a preset second state, based on the display apparatus being in the second state.
The method may further comprise: detecting a user position; and performing the second operation based on a second user position different from a first user position corresponding to the first operation.
The method may further comprise: displaying a user interface (UI) comprising a plurality of menus on a display; moving a highlight in the plurality of menus in a first direction, based on the first user position; and moving the highlight in the plurality of menus in an opposite direction to the first direction, based on the second user position.
The method may further comprise: detecting change in a posture of the display apparatus; and performing the second operation based on a second posture different from a first posture of the display apparatus corresponding to the first operation.
The method may further comprise: preventing the first operation from being performed based on the user touch input, based on the detected states.
The method may further comprise: detecting movement of the display apparatus; and preventing the first operation from being performed based on the display apparatus being moved more than a predetermined quantity.
The method may further comprise: displaying a UI on a screen; changing the UI in a direction based on a first position, based on a first touch input corresponding to the first position on the screen; and changing the UI in a direction based on a second position, based on the first touch input corresponding to the second position different from the first position on the screen.
According to another embodiment of the disclosure, there is provided a computer program product comprising: a memory configured to store an instruction; and a processor configured to execute the instruction, wherein the instruction controls a first operation to be performed based on a first touch input of a user, and controls a second operation, different from the first operation corresponding to the first touch input, to be performed based on a state detected by a sensor during reception of the first touch input.
As described above, according to the disclosure, it is possible to minimize a user touch input to carry out an operation of a display apparatus.
Further, according to the disclosure, an operation is prevented from being performed when a user touch input is made unintentionally.
Further, according to the disclosure, a user can intuitively perceive whether an operation performed by the display apparatus is based on the user touch input.
Below, embodiments of the disclosure will be described in detail with reference to the accompanying drawings. In the drawings, like numerals or symbols refer to like elements having substantially the same function, and the size of each element may be exaggerated for clarity and convenience of description. However, the technical concept of the disclosure and its key configurations and functions are not limited to those described in the following embodiments. In the following descriptions, details about publicly known technologies or configurations may be omitted if they unnecessarily obscure the gist of the disclosure.
In the following embodiments, terms ‘first’, ‘second’, etc. are used only to distinguish one element from another, and singular forms are intended to include plural forms unless otherwise mentioned contextually. In the following embodiments, it will be understood that terms ‘comprise’, ‘include’, ‘have’, etc. do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof. In addition, a ‘module’ or a ‘portion’ may perform at least one function or operation, may be achieved by hardware, software, or a combination of hardware and software, and may be integrated into at least one module for at least one processor.
The display apparatus 100 is not limited to the type shown in the accompanying drawings.
However, the configuration of the display apparatus 100 shown in the accompanying drawings is merely an example, and the disclosure is not limited thereto.
The touch receiver 220 receives a user touch input. The touch receiver 220 may be provided in at least a portion of a casing 310 of the display apparatus 100.
The display apparatus 100 includes a sensor 240 configured to detect the display apparatus's own state or its surrounding states. The sensor 240 may be embodied by various sensors of the display apparatus 100. The sensor 240 may include a visible-light optical sensor, an infrared optical sensor, an illumination sensor, an acoustic sensor, an acceleration sensor, a shock sensor, a position sensor, etc., but is not limited to these examples. The position of each sensor is not fixed, and each sensor may be provided at any position suitable for receiving its corresponding input.
The sensor 240 may detect the posture of the display apparatus 100; the strength of a shock applied to the display apparatus 100; how much the display apparatus 100 is tilted; how far the display apparatus 100 has been moved; noise, brightness, etc. around the display apparatus 100; and so on. The controller 210 may control the operations of the display apparatus 100 based on the state detected by the sensor 240.
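As a minimal illustration only (the disclosure does not prescribe any particular data structure, and the sensor-reading helper names below are hypothetical), the values detected by the sensor 240 could be gathered into a single snapshot that the controller 210 consults when a touch input arrives:

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    """Snapshot of values detectable by the sensor 240 (illustrative only)."""
    tilt_deg: float       # how much the apparatus is tilted
    shock_g: float        # strength of the last detected shock
    moved_cm: float       # how far the apparatus has been moved
    ambient_lux: float    # surrounding brightness
    ambient_db: float     # surrounding noise level
    user_in_front: bool   # whether the user is detected in front of the screen

def read_device_state(sensor) -> DeviceState:
    # 'sensor' is a hypothetical driver object; the read_* method names are assumptions.
    return DeviceState(
        tilt_deg=sensor.read_tilt(),
        shock_g=sensor.read_shock(),
        moved_cm=sensor.read_movement(),
        ambient_lux=sensor.read_illumination(),
        ambient_db=sensor.read_noise(),
        user_in_front=(sensor.read_user_position() == "front"),
    )
```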
Under control of the controller 210, the communicator 250 may perform communication with the input sources 101. A communication method may include both wired communication and wireless communication, and there are no limits to the communication method.
The storage 260 may be configured to store various pieces of information. The various pieces of information may include information related to the operations of the display apparatus 100 to be performed with regard to a user touch. Therefore, the operations of the display apparatus 100 may be set differently for each user with regard to the same user touch.
The controller 210 performs control for operating general elements of the display apparatus 100. The controller 210 may include a control program (or instruction) for carrying out the foregoing control operation, a nonvolatile memory in which the control program is installed, a volatile memory into which at least a part of the installed control program is loaded, and at least one processor or central processing unit (CPU) for executing the loaded control program. Further, such a control program may be stored in other electronic apparatuses as well as the display apparatus 100. The control program may include a program(s) achieved in the form of at least one of a basic input/output system (BIOS), a device driver, an operating system, firmware, a platform, and an application program. According to an embodiment, the application program may be installed or stored in the display apparatus 100 in advance when the display apparatus 100 is manufactured, or may be installed in the display apparatus 100 based on application program data received from the outside when needed in the future. The application program data may, for example, be downloaded to the display apparatus 100 from an external server such as an application market. Such an external server is an example of a computer program product of the disclosure, but the disclosure is not limited to this example.
The display apparatus 100 receives a touch input through the casing 310 having the touch receiver 220. Four kinds of touch inputs are, for example, as follows: an action of hitting the casing 310 once (hereinafter referred to as ‘tap’, 400), an action of tapping and continuing the touch (hereinafter referred to as ‘tap & hold’, 410), an action of pushing like a wipe (hereinafter referred to as ‘swipe’, 420), and an action of moving the contact portion after tap & hold (hereinafter referred to as ‘hold & slide’, 430).
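As a sketch of how such touch types might be distinguished (the thresholds and the classification rule below are illustrative assumptions, not values taken from the disclosure), a single contact could be classified by its duration and travel distance:

```python
def classify_touch(duration_s: float, travel_mm: float, still_down: bool) -> str:
    """Classify one contact as tap (400), tap & hold (410), swipe (420)
    or hold & slide (430). Thresholds are illustrative, not from the disclosure."""
    HOLD_TIME = 0.5   # seconds before a contact counts as being held
    MOVE_DIST = 10.0  # millimetres of travel before a contact counts as moving

    moved = travel_mm >= MOVE_DIST
    held = still_down or duration_s >= HOLD_TIME

    if not moved and not held:
        return "tap"            # hit the casing once
    if not moved and held:
        return "tap_and_hold"   # tap and keep the contact
    if moved and not held:
        return "swipe"          # push like a wipe
    return "hold_and_slide"     # move the contact portion after tap & hold
```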
Below, the controller 210 according to an embodiment of the disclosure will be described in detail. The controller 210 according to an embodiment of the disclosure controls an operation intended by a user to be performed in consideration of the current situation when a user touch input is received.
When the state of the display apparatus 100 corresponds to a specific condition (e.g. a first state in S501, hereinafter referred to as a ‘first state’), the controller 210 may control a first operation to be performed corresponding to the touch input (S503). On the other hand, when the state of the display apparatus 100 does not correspond to the specific condition (e.g. a second state in S501, hereinafter referred to as a ‘second state’), the controller 210 may control a second operation, different from the first operation corresponding to the touch input, to be performed corresponding to the current state (S502). The first state and the second state may be mutually exclusive states detectable by the sensor 240, such as a tilted state of the display apparatus 100, a user's position, ambient brightness, etc., but are not limited thereto. Further, there may be states other than the first state and the second state. Thus, the controller 210 may control different operations to be performed according to the states of the display apparatus 100, with respect to the same touch input.
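A minimal sketch of this state-dependent dispatch, reusing the hypothetical DeviceState snapshot above and placeholder operation callbacks, might look as follows; the condition used to distinguish the first state from the second state is purely illustrative:

```python
def handle_touch(touch: str, state: "DeviceState", first_op, second_op) -> None:
    """Perform a different operation for the same touch input depending on
    whether the apparatus is in the first state or the second state.
    'first_op' and 'second_op' are placeholder callables."""
    # Illustrative condition only: treat "roughly upright with the user in front"
    # as the first state; anything else as the second state.
    in_first_state = state.tilt_deg < 10.0 and state.user_in_front

    if in_first_state:
        first_op(touch)   # e.g. S503: operation mapped to the first state
    else:
        second_op(touch)  # e.g. S502: operation mapped to the second state
```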
In this flowchart, the operation S501 is performed following the operation S500. However, these two operations may be performed in reverse order. Therefore, the second operation may be performed when the first touch input is received after the sensor 240 identifies the state of the display apparatus 100 or its surrounding states. Further, the second operation may be performed even when the sensor 240 identifies the state of the display apparatus 100 while the first operation is being performed based on the first touch input.
Although only two states, i.e. the first state and the second state, are described above, the state of the display apparatus 100 may include a combination of the states mentioned above, or may include a state other than the first state and the second state. Thus, a user does not have to make a touch input many times to carry out various operations.
According to an alternative embodiment, in a case where the first operation is set to be performed based on the first touch input when the state of the display apparatus 100 corresponds to the preset first state, the controller 210 may control the second operation different from the first operation to be performed based on the first touch input when the state of the display apparatus 100 or its surrounding states correspond to the second state different from the first state. In this regard, details will be described below.
The sensor 240 employs the various sensors described above to identify a user's position. A user may be positioned in front of the display apparatus 100 (e.g. the first state, see ‘600’), or may be positioned behind the display apparatus 100 (e.g. the second state, see ‘610’). First, in the case of the first state, the display apparatus may identify a touch input made leftward in the user's sight (i.e. the first touch input) as an input for an operation of reproducing previous content (i.e. the first operation), and may identify a touch input made rightward (i.e. the second touch input) as an input for an operation of reproducing next content (i.e. the second operation).
A user does not always control the display apparatus 100 in the first state, but may control it in the second state. If the display apparatus 100 performed an operation regardless of its state, then, when a touch input is made leftward in the user's sight (i.e. when the user intends to make the first touch input), the controller 210 would regard the leftward touch input as the second touch input and control the second operation to be performed. In the second state, where a user performs control from behind the display apparatus 100 as indicated by the reference numeral ‘610’, it becomes easier for the user if control is possible in the same way as in the first state. To this end, the display apparatus 100 may identify the user's position through the sensor 240, and perform different operations according to the user's position with respect to the same touch input. For example, when the touch input is made leftward by a user who is in the second state (i.e. when the user intends to make the first touch input, which the display apparatus would otherwise regard as the second touch input), the controller 210 may perform control to carry out the same first operation as when the touch input is made leftward by a user who is in the first state. As a result, the controller 210 may control the first operation to be equally performed even though the first touch input in the first state is physically different from the touch input in the second state. The foregoing detection of a user positioned in front of or behind the display apparatus 100 is merely an example, and the disclosure is not limited to this example. The controller 210 may detect another position of a user through the sensor 240, and control a different operation to be performed based on the detected position with respect to the user's same touch input. Thus, it is convenient for a user to make a touch input because there is no need to take the user's position into account when making the touch input.
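As an illustrative sketch of the position-dependent interpretation described above (the function and operation names are hypothetical), a swipe sensed on the casing could be mirrored when the user is detected behind the apparatus, so that the same intended gesture triggers the same operation:

```python
def play_previous_content() -> None:
    print("reproducing previous content")   # placeholder for the first operation

def play_next_content() -> None:
    print("reproducing next content")       # placeholder for the second operation

def resolve_swipe_direction(raw_direction: str, user_in_front: bool) -> str:
    """Map the physically sensed swipe direction to the direction the user meant.
    When the user is behind the apparatus, left and right are mirrored."""
    if user_in_front:
        return raw_direction
    return {"left": "right", "right": "left"}.get(raw_direction, raw_direction)

def on_swipe(raw_direction: str, user_in_front: bool) -> None:
    intended = resolve_swipe_direction(raw_direction, user_in_front)
    if intended == "left":
        play_previous_content()
    elif intended == "right":
        play_next_content()
```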
The bottom of the casing 310 may have two surfaces, rather than being flat, so that the display apparatus 100 can stand upright (e.g. the first posture) or tilted (e.g. the second posture). The bottom having two surfaces is merely an example, and another method may be used to make the display apparatus 100 stand upright or tilted. The reference numeral ‘700’ indicates the first state, in which the display apparatus 100 stands upright, and the reference numeral ‘710’ indicates the second state, in which the display apparatus 100 stands tilted. The controller 210 identifies the posture of the display apparatus 100 through the sensor 240. When the display apparatus 100 is in the first state, the controller 210 controls the first operation to be performed based on a user's first touch input received in the touch receiver 220. When the display apparatus 100 is in the second state, it may be convenient for the user that the display apparatus performs an operation corresponding to the second state, because the second state reflects a particular intention of the user. Therefore, the sensor 240 detects the posture of the display apparatus 100, and the controller 210 identifies the detected posture and controls the second operation to be performed, without performing the first operation, even though the first touch input is received in the touch receiver 220. For example, the controller 210 may control an operation such as play or stop in response to the first touch input in the first posture, but may control an operation such as playing other predetermined content in response to the first touch input in the second posture. The foregoing postures of the display apparatus 100 are merely an example, and an embodiment of the disclosure is not limited to this example. The sensor 240 may detect any possible posture of the display apparatus 100, in which the display apparatus 100 may be tilted, laid down, put upside down, standing, etc., unlike the first posture or the second posture, and the controller 210 may control an operation different from the first operation to be performed in each posture even though the same first touch input is received. Thus, it is easy for a user to control the display apparatus 100 under a situation corresponding to a specific condition.
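A corresponding sketch for the posture-dependent behaviour (the posture labels and operations are illustrative placeholders, not terms from the disclosure) could map the same first touch input to different handlers:

```python
def toggle_play_stop() -> None:
    print("play/stop toggled")               # placeholder operation for the first posture

def play_predetermined_content() -> None:
    print("playing predetermined content")   # placeholder operation for the second posture

def on_first_touch(posture: str) -> None:
    """Perform a different operation for the same first touch input depending on
    the detected posture. Posture names here are illustrative labels."""
    handlers = {
        "upright": toggle_play_stop,           # first posture, indicated by '700'
        "tilted": play_predetermined_content,  # second posture, indicated by '710'
    }
    handlers.get(posture, lambda: None)()      # other postures: no operation in this sketch
```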
When the state of the display apparatus 100 or its surrounding states detected by the sensor 240 correspond to a condition under which a user can normally use the apparatus (e.g. a normal state in S801, hereinafter referred to as a ‘normal state’), the controller 210 controls the first operation to be performed based on the first touch input (S803). On the other hand, when the state of the display apparatus 100 or its surrounding states correspond to a condition under which a user is unlikely to intend a touch input (e.g. an exceptional state, hereinafter referred to as an ‘exceptional state’), the controller 210 may control the first operation not to be performed (S802). The normal state and the exceptional state may be classified according to the states detectable by the sensor 240, and the classification results may be stored in the storage 260 and may be changed by a user input.
In this flowchart, the operation S801 is performed following the operation S500. However, these two operations may be performed in reverse order. Therefore, the first operation may be controlled not to be performed even when the first touch input is received after the sensor 240 identifies the state of the display apparatus. Further, the first operation may be controlled not to be performed even when the sensor 240 detects the state of the display apparatus or its surrounding states while the first operation is being performed based on the first touch input. Thus, a user does not have to make an additional touch input to deal with an unnecessary operation caused by an unintended touch input.
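A minimal sketch of this guard, again reusing the hypothetical DeviceState snapshot and with arbitrary thresholds standing in for the predetermined quantity, could look as follows:

```python
def should_suppress(state: "DeviceState", move_threshold_cm: float = 5.0) -> bool:
    """Return True when the detected state suggests the touch was not intended,
    for example because the apparatus has just been moved more than a
    predetermined quantity. Threshold values are arbitrary illustrations."""
    return state.moved_cm > move_threshold_cm or state.shock_g > 2.0

def handle_touch_guarded(touch: str, state: "DeviceState", first_op) -> None:
    if should_suppress(state):
        return           # exceptional state: do not perform the first operation (S802)
    first_op(touch)      # normal state: perform the first operation (S803)
```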