1. Field
Exemplary embodiments relate to a display apparatus which displays a user interface (UI) and a method of providing the UI. In particular, exemplary embodiments relate to a display apparatus and a method of providing a UI which is arranged to facilitate user manipulation of the UI provided via the display apparatus.
2. Description of the Related Art
Personal computers (PCs), mobile phones, smartphones, personal digital assistants (PDAs), etc., may be configured to perform various functions. Examples of the various functions may include a function for data and voice communication, a function for capturing an image or filming a video using a camera, a function for storing a voice, a function for reproducing a music file via a speaker system, a function for displaying an image or a video, etc.
In order to support or extend the various functions of the aforementioned devices, various attempts have been made to improve both the software and the hardware of such terminals.
In order to perform the various functions of the devices, various user interfaces (UIs) are provided. A UI may include a UI object configured of a widget, an application-executed screen, a menu button, a function key, and an application execution icon. The UI may provide UI objects having various sizes to a user.
However, a small UI object may display only a few pieces of information, while only a small number of large UI objects may be displayed in a limited display area. Thus, in order to enlarge small UI objects, the user must inconveniently resize each of the UI objects individually.
Exemplary embodiments may include a display apparatus capable of easily manipulating user interface (UI) objects included in a UI, and a method of providing the UI.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.
According to an aspect of the exemplary embodiments, a display apparatus that provides a UI includes a touch screen which is configured to display the UI comprising a plurality of UI objects, and receive a user input; and a controller which is configured to determine a plurality of preset sizes corresponding to the UI objects, respectively, and a plurality of pieces of information to be displayed on the UI objects, respectively, in response to the user input, and control the touch screen to display the UI objects with the preset sizes and the plurality of pieces of information.
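The controller's role described above can be illustrated with a minimal sketch. This is not the claimed implementation; the preset table, the `UIObject` class, and `apply_user_input` are all hypothetical names chosen for illustration. The idea is that a single user input moves every UI object to a preset size with a matching amount of displayed information.

```python
# Illustrative sketch, not the claimed implementation: in response to a
# user input, each UI object is assigned a preset size and the pieces of
# information appropriate to that size. All names here are hypothetical.
from dataclasses import dataclass

# Hypothetical preset table: detail level -> ((columns, rows), info shown)
PRESET_SIZES = {
    0: ((1, 1), ["icon"]),
    1: ((4, 1), ["icon", "title"]),
    2: ((4, 2), ["icon", "title", "details"]),
}
MAX_LEVEL = max(PRESET_SIZES)

@dataclass
class UIObject:
    name: str
    level: int = 0  # current detail level

    @property
    def size(self):
        return PRESET_SIZES[self.level][0]

    @property
    def info(self):
        return PRESET_SIZES[self.level][1]

def apply_user_input(objects, direction):
    """Enlarge every object on an 'up' input, reduce on a 'down' input."""
    step = 1 if direction == "up" else -1
    for obj in objects:
        obj.level = max(0, min(MAX_LEVEL, obj.level + step))
    return objects
```

Under this sketch, one gesture changes all objects at once, which matches the stated goal of sparing the user from resizing each object individually.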
The user input may correspond to a user touch input which drags in a preset direction.
The controller may be configured to locate the UI objects on the UI, according to the preset sizes.
The controller may be configured to locate the UI objects on the UI, according to at least one category of the UI objects.
Each of the UI objects may include at least one of a widget, an application-executed screen, a menu button, a function key, and an application execution icon.
According to another aspect of the exemplary embodiments, a method in which a display apparatus provides a user interface (UI) includes displaying a UI including a plurality of UI objects on a touch screen of the display apparatus, receiving a user input via the touch screen of the display apparatus, determining a plurality of preset sizes corresponding to the UI objects and a plurality of pieces of information to be displayed on the UI objects in response to the user input, and displaying the UI objects with the preset sizes and the plurality of pieces of information.
According to another aspect of the exemplary embodiments, a non-transitory computer readable medium stores a program which, when executed by a computer, performs displaying a UI including a plurality of UI objects on a touch screen of a display apparatus, receiving a user input via the touch screen of the display apparatus, determining a plurality of preset sizes corresponding to the UI objects and a plurality of pieces of information to be displayed on the UI objects in response to the user input, and displaying the UI objects with the preset sizes and the plurality of pieces of information.
These and/or other aspects will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings in which:
Hereinafter, embodiments will be described in detail with reference to the attached drawings. The embodiments may, however, be embodied in many different forms, and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the embodiments to those skilled in the art. In the following description, well-known functions or constructions are not described in detail since they would obscure the embodiments with unnecessary detail. Like reference numerals in the drawings denote like or similar elements throughout the specification.
While various terms are used to describe various components, the components are not limited by these terms. The terms are used only to distinguish one component from another.
Throughout the specification, it will be understood that when an element is referred to as being “connected to” or “coupled with” another element, it can be directly connected to or coupled with the other element, or it can be electrically connected to or coupled with the other element by having an intervening element interposed therebetween. Also, it will be understood that when an element is referred to as being “connected to” or “coupled with” another element, it can communicate with the other element by exchanging signals therebetween.
Also, when a part “includes” or “comprises” an element, unless there is a particular description contrary thereto, the part can further include other elements, not excluding the other elements.
Further, all examples and conditional language recited herein are to be construed as being without limitation to such specifically recited examples and conditions. All terms including descriptive or technical terms which are used herein should be construed as having meanings that are obvious to one of ordinary skill in the art. However, the terms may have different meanings according to an intention of one of ordinary skill in the art, precedent cases, or the appearance of new technologies. Also, some terms may be arbitrarily selected by the applicant. In this case, the meaning of the selected terms will be described in detail in the detailed description of the exemplary embodiments. Thus, the terms used herein have to be defined based on the meaning of the terms together with the description throughout the specification.
Throughout the specification, a singular form may include plural forms, unless there is a particular description contrary thereto. Also, terms such as “comprise” or “comprising” are used to specify existence of a recited form, a number, a process, an operation, a component, and/or groups thereof, not excluding the existence of one or more other recited forms, one or more other numbers, one or more other processes, one or more other operations, one or more other components and/or groups thereof.
Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
The display apparatus 100 may be connected to an external apparatus (not shown) using a mobile communication module 120, a sub-communication module 130, and a connector 165. The external apparatus may include at least one of another apparatus (not shown), a mobile phone (not shown), a smartphone (not shown), a tablet personal computer (PC) (not shown), and a server (not shown).
The control unit 110 may include a central processing unit (CPU) 111, read-only memory (ROM) 112 that stores a control program for controlling the display apparatus 100, and random-access memory (RAM) 113 that stores a signal or data input from an external source of the display apparatus 100, or that is used as a memory area for operations performed by the display apparatus 100. The CPU 111 may include a single core processor, a dual core processor, a triple core processor, or a quad core processor. The CPU 111, the ROM 112, and the RAM 113 may be connected to each other via an internal bus.
The control unit 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, the storage unit 175, the power supply unit 180, a touch screen 190, and the touch screen controller 195.
The mobile communication module 120 may allow the display apparatus 100 to be connected to the external apparatus via mobile communication by using one or more antennas (not shown), in response to a control by the control unit 110. The mobile communication module 120 may transmit or receive a wireless signal for making a voice call or a video call, or for transmitting a short message service (SMS) message or a multimedia messaging service (MMS) message, to a mobile phone (not shown), a smartphone (not shown), a tablet PC (not shown), or another apparatus (not shown) which has a phone number input to the display apparatus 100.
The sub-communication module 130 may include at least one of the wireless LAN module 131 and the short-distance communication module 132. For example, the sub-communication module 130 may include the wireless LAN module 131 or the short-distance communication module 132, or may include both the wireless LAN module 131 and the short-distance communication module 132.
The wireless LAN module 131 may access the Internet in response to a control by the control unit 110, via a wireless access point (wireless AP) (not shown). The wireless LAN module 131 may support the IEEE 802.11x wireless LAN standard of the Institute of Electrical and Electronics Engineers (IEEE). The short-distance communication module 132 may wirelessly perform short-distance communication between the display apparatus 100 and an image forming apparatus (not shown), in response to a control by the control unit 110. The short-distance communication may include Bluetooth, infrared data association (IrDA), ZigBee, etc.
Depending on its performance, the display apparatus 100 may include at least one of the mobile communication module 120, the wireless LAN module 131, and the short-distance communication module 132.
The multimedia module 140 may include the broadcasting communication module 141, the audio reproduction module 142, or the video reproduction module 143. The broadcasting communication module 141 may receive, in response to a control by the control unit 110, a broadcasting signal (e.g., a television (TV) broadcasting signal, a radio broadcasting signal, or a data broadcasting signal) that is transmitted from a broadcasting station, and broadcasting additional information (e.g., an electronic program guide (EPG) or an electronic service guide (ESG)) via a broadcasting antenna (not shown). The audio reproduction module 142 may reproduce a digital audio file that is stored or received in response to a control by the control unit 110. The video reproduction module 143 may reproduce a digital video file that is stored or received in response to a control by the control unit 110.
The multimedia module 140 may not include the broadcasting communication module 141 and may only include the audio reproduction module 142 and the video reproduction module 143. The audio reproduction module 142 or the video reproduction module 143 of the multimedia module 140 may be included in the control unit 110.
The camera module 150 may include at least one of the first camera 151 and the second camera 152 that captures a still image or films a video in response to a control by the control unit 110. The first camera 151 or the second camera 152 may include an auxiliary light source (not shown) for providing an amount of light which is required for the capturing or filming operation. The first camera 151 may be disposed at a front surface of the display apparatus 100, and the second camera 152 may be disposed at a rear surface of the display apparatus 100. Alternatively, the first camera 151 and the second camera 152 may be disposed adjacent to each other (e.g., a gap between the first camera 151 and the second camera 152 may be greater than 1 cm and less than 8 cm). Thus, the first camera 151 and the second camera 152 may capture a three-dimensional (3D) still image or may film a 3D video.
The GPS module 155 may receive radio waves from a plurality of GPS satellites (not shown) in Earth orbit, and may calculate the location of the display apparatus 100 using the arrival times of the waves from the GPS satellites to the display apparatus 100.
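The arrival-time idea behind this calculation can be sketched briefly: each travel time, multiplied by the speed of light, yields a distance (pseudorange) to one satellite, and several such distances constrain the receiver's location. The function name below is illustrative only, and receiver clock error is deliberately ignored.

```python
# Hedged sketch of arrival-time ranging: travel time times the speed of
# light gives the distance to one satellite. Clock error is ignored here.
C = 299_792_458.0  # speed of light, m/s

def pseudorange(t_sent, t_received):
    """Distance implied by a signal's travel time from satellite to receiver."""
    return (t_received - t_sent) * C
```

In practice a receiver combines pseudoranges from several satellites (and solves for its clock offset) to fix its position; this sketch shows only the per-satellite ranging step.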
The I/O module 160 may include at least one of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166.
The button 161 may be formed at a front surface, a side surface, or a rear surface of a housing of the display apparatus 100, and may include at least one of a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button.
The microphone 162 may receive a voice or a sound and may generate a corresponding electrical signal, in response to a control by the control unit 110.
The speaker 163 may output, in response to a control by the control unit 110, sounds that correspond to various signals from the mobile communication module 120, the sub-communication module 130, the multimedia module 140, or the camera module 150 to an external source of the display apparatus 100. The speaker 163 may output a sound that corresponds to a function performed by the display apparatus 100. One or more speakers 163 may be formed at an appropriate location or appropriate locations of the housing of the display apparatus 100.
The vibration motor 164 may convert an electrical signal into mechanical vibration, in response to a control by the control unit 110. For example, when the display apparatus 100 in a vibration mode receives a voice call from another apparatus (not shown), the vibration motor 164 may operate. The vibration motor 164 may also operate in response to a touch by a user who contacts the touch screen 190, and in response to sequential movements of a touch input on the touch screen 190.
The connector 165 may be used as an interface for connecting the display apparatus 100 and another apparatus (not shown) or a power source (not shown). In response to a control by the control unit 110, the display apparatus 100 may transmit data stored in the storage unit 175 of the display apparatus 100 to another apparatus (not shown), or may receive data from the other apparatus, via a cable connected to the connector 165. Also, power may be supplied from the power source to the display apparatus 100, or a battery (not shown) may be charged, via the cable connected to the connector 165.
The keypad 166 may receive a key input from the user to control the display apparatus 100. The keypad 166 includes a physical keypad (not shown) formed at the display apparatus 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad formed at the display apparatus 100 may be excluded (hence, the dashed lines in the drawings).
The sensor module 170 includes one or more sensors that detect a status of the display apparatus 100. For example, the sensor module 170 may include a proximity sensor (not shown) for detecting whether a user approaches the display apparatus 100, a light sensor (not shown) for detecting an amount of light around the display apparatus 100, and a motion sensor (not shown) for detecting motions of the display apparatus 100 (e.g., rotation of the display apparatus 100, or acceleration or vibration applied to the display apparatus 100). One or more sensors may be added or excluded depending on the performance of the display apparatus 100.
The storage unit 175 may store, in response to a control by the control unit 110, signals or a plurality of pieces of data that are input or output and correspond to operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, and the touch screen 190. The storage unit 175 may store a control program and applications for controlling the display apparatus 100 or the control unit 110.
The term “storage unit” may include the storage unit 175, the ROM 112 or the RAM 113 in the control unit 110, or a memory card (not shown) installed in the display apparatus 100. The storage unit 175 may include a non-volatile memory, a volatile memory, a hard disk drive (HDD), or a solid state drive (SSD).
The power supply unit 180 may supply, in response to a control by the control unit 110, power to at least one battery (not shown) that is disposed in the housing of the display apparatus 100. Also, the power supply unit 180 may supply the power from the power source to each of the aforementioned units of the display apparatus 100 via the cable connected to the connector 165.
The touch screen 190 may output a UI, which corresponds to various services, to the user. The touch screen 190 may transmit, to the touch screen controller 195, an analog signal that corresponds to at least one touch input to the UI. The touch screen 190 may receive the at least one touch input via a body part (e.g., a finger) of the user or a touchable input unit (e.g., a stylus pen). Also, the touch screen 190 may receive sequential movements of the at least one touch input. The touch screen 190 may transmit, to the touch screen controller 195, an analog signal that corresponds to the sequential movements of the at least one touch input.
Throughout the specification, the term ‘touch input’ is not limited to an input by a contact between the touch screen 190 and the body part of the user or the touchable input unit, and may include a contactless input (e.g., when a gap between the touch screen 190 and the body part is equal to or less than 1 mm). A gap that is detectable by the touch screen 190 may be changed depending on a performance or a structure of the display apparatus 100.
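The broadened meaning of 'touch input' above can be restated as a trivial predicate. The 1 mm gap is the example value given in the text; the function name and the idea of a fixed threshold are illustrative assumptions, since the detectable gap is said to vary with the apparatus.

```python
# Illustrative sketch: a 'touch input' covers both contact (gap 0) and a
# contactless hover within a small detectable gap (e.g. <= 1 mm), per the
# specification's note. The fixed threshold is an assumption; the text
# says the detectable gap may vary by device.
HOVER_THRESHOLD_MM = 1.0

def is_touch_input(gap_mm):
    """Treat contact and near-hover within the threshold as touch input."""
    return gap_mm <= HOVER_THRESHOLD_MM
```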
The touch screen 190 may be formed as a resistive touch screen, a capacitive touch screen, an infrared touch screen, or an ultrasound wave touch screen.
The touch screen controller 195 may convert the analog signal, which is received from the touch screen 190, into a digital signal (e.g., X and Y coordinates) and may transmit the digital signal to the control unit 110. The control unit 110 may control the touch screen 190 using the digital signal transmitted from the touch screen controller 195. For example, the control unit 110, in response to the touch input, may select an application execution icon (not shown) displayed on the touch screen 190 or may execute an application. The touch screen controller 195 may be included in the touch screen 190 or the control unit 110.
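The conversion step performed by the touch screen controller 195 can be sketched as a simple scaling from raw analog readings to X and Y screen coordinates. The 12-bit reading range and the function name are assumptions for illustration, not details from the specification.

```python
# Hedged sketch of the controller's analog-to-digital step: raw analog
# readings are scaled to integer X/Y screen coordinates before being
# passed to the control unit. The 12-bit range is an assumption.
ADC_MAX = 4095  # hypothetical 12-bit analog reading

def to_screen_xy(raw_x, raw_y, width_px, height_px):
    """Convert raw analog readings into integer screen coordinates."""
    x = round(raw_x / ADC_MAX * (width_px - 1))
    y = round(raw_y / ADC_MAX * (height_px - 1))
    return x, y
```

The control unit would then use such digital coordinates to, for example, select the application execution icon under the touch point.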
In the present embodiment, each of the UI objects 211 through 226 may be formed as at least one of a widget, an application-executed screen, a menu button, a function key, and an application execution icon.
A size of a UI object included in a UI may not be changed, as shown in first UI objects 311-1 through 311-3 of
Alternatively, the size of the UI object included in the UI may vary according to a plurality of phases, as shown in second through sixth UI objects 312-1 through 316-3 of
However, one or more embodiments are not limited to the embodiment of
The UI object may be an icon such as an application execution icon 411-1 of
When the display apparatus again receives an input corresponding to a command of enlarging the UI object, the widget 411-2 of
In another example, the display apparatus may sequentially change an icon corresponding to a music reproduction application with a size of 1×1 to a widget with a size of 4×1 which displays information about a reproduced music file and then to a widget with a size of 4×2 which displays the information about the reproduced music file and information about a reproduction list.
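The phased enlargement in this music-application example can be sketched directly. The phase table below is taken from the example in the text (1×1 icon, then a 4×1 now-playing widget, then a 4×2 widget that adds the reproduction list); the variable and function names are illustrative.

```python
# Sketch of the phased enlargement described for a music application:
# 1x1 icon -> 4x1 now-playing widget -> 4x2 widget with reproduction list.
# Phase sizes come from the example in the text; names are illustrative.
MUSIC_PHASES = [
    ((1, 1), ["icon"]),
    ((4, 1), ["icon", "now_playing"]),
    ((4, 2), ["icon", "now_playing", "reproduction_list"]),
]

def enlarge(phase_index):
    """Advance to the next phase, stopping at the largest preset."""
    return min(phase_index + 1, len(MUSIC_PHASES) - 1)
```

Each repeated enlarge command advances one phase, so the object grows in discrete preset steps rather than continuously.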
When the display apparatus 500 receives the touch input in the second direction 532 or the third direction 533, the control unit may scroll the touch screen on which the UI is displayed, or may move a display target page from among pages included in the UI, according to the touch input.
According to the present embodiment, the touch screen may simultaneously display all of the UI objects 510 included in the UI.
According to another embodiment, the touch screen may display a plurality of tabs that correspond to the categories, e.g., the SNS category 541, the media-related category 542, the call-related category 543, and the game-related category 544, respectively. When one of the tabs is selected, the control unit may control the touch screen to display only the UI objects 510 included in the selected tab.
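The per-category tab behavior just described amounts to filtering the UI objects by category. The category names below follow the examples in the text; the dictionary layout and function name are assumptions for illustration.

```python
# Illustrative sketch of per-category tabs: selecting a tab filters the
# displayed UI objects to those in that category. Data layout is assumed.
def objects_for_tab(ui_objects, selected_category):
    """Return only the UI objects belonging to the selected tab."""
    return [o for o in ui_objects if o["category"] == selected_category]
```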
The display apparatus 1100 may include a touch screen 1110 and a control unit 1120.
The touch screen 1110 may display a UI including a plurality of UI objects and may receive a user input.
The control unit 1120, in response to the user input, may determine preset sizes corresponding to the UI objects, respectively, and a plurality of pieces of information to be displayed on the UI objects, respectively. Also, the control unit 1120 may control the touch screen 1110 to display the UI objects with the preset sizes and the plurality of pieces of information.
According to the present embodiment, the control unit 1120 may locate the UI objects on the UI, according to the preset sizes. Also, when sizes of the UI objects are enlarged or reduced, the control unit 1120 may enlarge or reduce a size of the UI.
According to another embodiment, the control unit 1120 may locate the UI objects, based on categories of the UI objects.
First, the display apparatus may display the UI on a touch screen of the display apparatus (S1210). The UI includes a plurality of UI objects.
Then, the display apparatus may receive a user input via the touch screen (S1220). In the present embodiment, the user input may be a touch input by a user dragging in a preset direction.
Afterward, the display apparatus may determine, in response to the user input, preset sizes corresponding to the UI objects, respectively, and a plurality of pieces of information to be displayed on the UI objects, respectively (S1230).
The display apparatus may display the determined information on the UI object with the preset size (S1240). In the present embodiment, the display apparatus may locate the UI objects on the UI, according to the preset sizes, and then may display the UI on which the UI objects are located.
In another embodiment, the display apparatus may locate the UI objects based on one or more categories to which the UI objects belong. Afterward, the display apparatus may display the located UI objects.
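The method of operations S1210 through S1240 can be sketched end to end as follows. The data layout, the presets table, and the size-first placement rule are all illustrative assumptions, not the claimed implementation.

```python
# Minimal end-to-end sketch of S1210-S1240: given a UI's objects and a
# received user input, determine each object's preset size and the
# information to display, then return them ordered for display. The
# presets table and the size-first placement rule are assumptions.
def provide_ui(ui_objects, user_input, presets):
    # S1230: determine the preset size and information per object,
    # according to the received user input (S1220)
    updated = []
    for obj in ui_objects:
        size, info = presets[user_input][obj["kind"]]
        updated.append({**obj, "size": size, "info": info})
    # S1240: locate objects according to their preset sizes (here, one
    # possible rule: larger objects are placed first) and display
    updated.sort(key=lambda o: o["size"], reverse=True)
    return updated
```

A usage example: with a "drag_up" preset table mapping a music object to a 4×1 widget and a call object to a 2×2 widget, the music widget would be placed first under the size-first rule sketched here.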
One or more of the exemplary embodiments may be embodied as a recording medium, e.g., a program module to be executed by computers, which includes computer-readable commands. A computer-readable medium may be any usable medium that may be accessed by computers, and may include volatile and non-volatile media, and detachable and non-detachable media. The computer-readable medium may include both a computer storage medium and a communication medium. The computer storage medium includes volatile and non-volatile media, and detachable and non-detachable media, which are designed to store information including computer-readable commands, data structures, program modules, or other data. The communication medium includes computer-readable commands, data structures, program modules, or other data carried by a transmission mechanism, and includes other information transmission media.
The present invention may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the exemplary embodiments to those of ordinary skill in the art. For example, components described in a singular form may be implemented in a distributed fashion, and components described as distributed may be combined and then implemented.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
Foreign Application Priority Data: 10-2013-0106302, Sep. 2013, KR (national).
This application claims priority from U.S. Provisional Patent Application No. 61/805,632, filed on Mar. 27, 2013, in the U.S. Patent and Trademark Office, and Korean Patent Application No. 10-2013-0106302, filed on Sep. 4, 2013, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein in their entireties by reference.
Publication: US 2014/0298226 A1, Oct. 2014 (US).
Related U.S. Application Data: Provisional Application No. 61/805,632, Mar. 2013 (US).