The present application is related to and claims priority from and the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2013-0160954, filed on Dec. 23, 2013, which is hereby incorporated by reference for all purposes as if fully set forth herein.
The present disclosure relates generally to an object processing method, and more particularly, to a method and an apparatus for processing an object provided through a display.
An electronic device can include an input means, for example, a touch panel installed in a screen. Further, the electronic device detects a touch input by a user through a touch screen (for example, the screen equipped with the touch panel) and recognizes a location on the touch screen corresponding to the touch input. The electronic device processes an object existing at the recognized location and executes, for example, a function corresponding to the object (for example, a function of the electronic device or an application function).
A function executed in an electronic device may not be the function which a user desires. For example, hyperlinked objects can be concentrated and displayed on a webpage. In this case, an unintended object may be selected and a webpage linked to the unintended object may be executed (for example, displayed through a touch screen). In one method of preventing such an execution error, the electronic device enlarges and displays objects of which at least a part is included within a preset radius with a touch position (for example, a coordinate of the touch screen corresponding to a touch input) as the center. The electronic device then executes a function of the electronic device corresponding to the object selected by the user from among the enlarged objects. However, such a solution causes inconvenience in that, even when the object which the user desires was selected, the user must select the same object again.
To address the above-discussed deficiencies, it is a primary object to provide a method and an apparatus for processing an object in which a user can execute a desired function (for example, a function of the electronic device or an application function).
In a first example, a method of processing an object through an electronic device is provided. The method includes displaying a plurality of objects through a display functionally connected to the electronic device. The method also includes obtaining an input corresponding to a first object among the plurality of objects. The method further includes determining a second object related to the input among the plurality of objects. The method includes displaying execution information of a function corresponding to the first object and object information related to the second object through the display.
In a second example, a method of processing an object through an electronic device is provided. The method includes obtaining an input by a user. The method also includes displaying execution information of a function corresponding to the obtained input and input information related to one or more inputs except for the obtained input through a display functionally connected to the electronic device.
In a third example, an electronic device is provided. The electronic device includes a display module. The display module includes a touch screen with a touch panel. The display module is configured to display a plurality of objects. The electronic device also includes a processor. The processor is configured to obtain an input corresponding to a first object among the objects through the touch panel. The processor is also configured to determine a second object related to the input among the objects. The processor is further configured to control the display module to display execution information of a function corresponding to the first object and object information related to the second object.
In a fourth example, an electronic device is provided. The electronic device includes a display module and a processor. The display module includes a touch screen with a touch panel. The processor is configured to obtain an input of a user through the touch panel, control the display module to display execution information of a function corresponding to the obtained input, and input information related to one or more second inputs except for the obtained input.
Various embodiments of the present disclosure may provide a method in which a user can execute a desired function, and an electronic device implementing the same. Various embodiments of the present disclosure may provide a method in which the user can cancel an executed function and execute another function through object information displayed through a display, and an electronic device implementing the same. Various embodiments of the present disclosure may provide a method in which the user can cancel an executed function and execute another function through input information displayed through a display, and an electronic device implementing the same.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation; the term “or,” is inclusive, meaning and/or; the phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term “controller” means any device, system or part thereof that controls at least one operation, such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document, those of ordinary skill in the art should understand that in many, if not most instances, such definitions apply to prior, as well as future uses of such defined words and phrases.
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
An electronic apparatus according to the present disclosure is an apparatus having a communication function. For example, the electronic device is at least one of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic-book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, a mobile medical appliance, an electronic bracelet, an electronic necklace, an electronic accessory, a camera, a wearable device, an electronic clock, a wrist watch, home appliances, such as a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washing machine, an air cleaner, and the like, an artificial intelligence robot, a television, a Digital Video Disk (DVD) player, an audio player, various medical appliances, such as a Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, a Computerized Tomography (CT) device, an ultrasonography device and the like, a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a set-top box, a Television (TV) box, such as HomeSync™ of SAMSUNG Electronics, Co., Apple TV™ of APPLE, Co., and Google TV™ of Google, Co., an electronic dictionary, an infotainment device for a vehicle, electronic equipment for a ship, such as a navigation device, a gyrocompass, etc., an avionic device, a security device, an electronic cloth, an electronic key, a camcorder, a game console, a Head-Mounted Display (HMD) unit, a flat panel display device, an electronic frame, an electronic album, a piece of furniture having a communication function and/or a part of a building/structure, an electronic board, an electronic signature receiving device, and a protector. It is obvious to those skilled in the art that the electronic device according to the present disclosure is not limited to the aforementioned devices.
Referring to
The bus 110 is a circuit for interconnecting elements described above and for allowing a communication, such as by transferring a control message between the elements described above.
The processor 120 receives commands from the above-mentioned other elements, such as the memory 130, the user input module 140, the display module 150, and the communication module 160, through, for example, the bus 110, deciphers the received commands, and performs operations and/or data processing according to the deciphered commands.
The memory 130 stores commands received from the processor 120 and/or other elements, such as the user input module 140, the display module 150, and the communication module 160, and/or commands and/or data generated by the processor 120 and/or other elements. The memory 130 includes programming modules, such as a kernel 131, middleware 132, an Application Programming Interface (API) 133, and an application 134. Each of the programming modules described above can be configured by software, firmware, hardware, and/or combinations of two or more thereof.
The kernel 131 controls and/or manages system resources, such as the bus 110, the processor 120 or the memory 130, used for execution of operations and/or functions implemented in other programming modules, such as the middleware 132, the API 133, and/or the application 134. Further, the kernel 131 provides an interface through which the middleware 132, the API 133, and/or the application 134 can access and then control and/or manage an individual element of the electronic apparatus 100.
The middleware 132 performs a relay function which allows the API 133 and/or the application 134 to communicate with and exchange data with the kernel 131. Further, in relation to operation requests received from the application 134, the middleware 132 performs load balancing on the operation requests by, for example, giving a priority for using a system resource, such as the bus 110, the processor 120, and/or the memory 130, of the electronic apparatus 100 to at least one application among the applications 134.
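As a rough illustration of this load balancing, the following sketch (in Java, with illustrative names such as OperationRequest and MiddlewareScheduler that are not part of the disclosure) orders pending operation requests by an assigned priority before serving them against a shared system resource; the queue-based design and the ordering rule are assumptions, not the disclosed implementation.

```java
import java.util.Comparator;
import java.util.PriorityQueue;

// Illustrative request carrying the priority assigned by the middleware.
final class OperationRequest {
    final String appId;      // requesting application
    final int priority;      // lower value = served first (assumption)
    final Runnable work;     // the operation to run against the shared resource

    OperationRequest(String appId, int priority, Runnable work) {
        this.appId = appId;
        this.priority = priority;
        this.work = work;
    }
}

final class MiddlewareScheduler {
    private final PriorityQueue<OperationRequest> queue =
            new PriorityQueue<>(Comparator.comparingInt((OperationRequest r) -> r.priority));

    void submit(OperationRequest request) {
        queue.add(request);
    }

    // Serves pending requests one at a time, highest priority (lowest value) first.
    void drain() {
        OperationRequest next;
        while ((next = queue.poll()) != null) {
            next.work.run();
        }
    }
}
```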
The API 133 is an interface through which the application 134 controls a function provided by the kernel 131 and/or the middleware 132, and can include, for example, at least one interface or function for file control, window control, image processing, and/or character control.
The user input module 140 receives, for example, a command and/or data from a user, and transfers the received command and/or data to the processor 120 and/or the memory 130 through the bus 110. The display module 150 displays an image, a video, and/or data to a user.
The communication module 160 establishes a communication between the electronic apparatus 100 and other electronic devices 102 and 104 and/or a server 164. The communication module 160 supports short range communication protocols, such as a Wireless Fidelity (WiFi) protocol, a BlueTooth (BT) protocol, and a Near Field Communication (NFC) protocol, communication networks, such as the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a telecommunication network, a cellular network, and a satellite network, or a Plain Old Telephone Service (POTS), or any other similar and/or suitable communication networks, such as network 162, or the like. Each of the electronic devices 102 and 104 can be the same type and/or a different type of electronic apparatus.
The hardware 200 can be, for example, the electronic apparatus 100 illustrated in
The processor 210 includes at least one Application Processor (AP) 211 and/or at least one Communication Processor (CP) 213. The processor 210 can be, for example, similar to the processor 120 as illustrated in
The AP 211 executes an OS or an application program to control a plurality of hardware and/or software elements connected to the AP 211 and performs processing and calculation of various data including the multimedia data. The AP 211 can be implemented by, for example, a System on Chip (SoC). According to an embodiment, the processor 210 can further include a Graphic Processing Unit (GPU).
The CP 213 performs functions of managing a data link and/or converting a communication protocol in communication between an electronic apparatus, such as the electronic apparatus 100, including the hardware 200 and/or another electronic apparatus connected through a network to the electronic apparatus. The CP 213 can be implemented by, for example, an SoC. According to an embodiment, the CP 213 performs at least a part of a multimedia control function. The CP 213 performs identification and authentication of a terminal in a communication network by using, for example, a user identification module, such as the SIM card 214. Further, the CP 213 provides services, such as a voice communication service, a video communication service, a short message service, and a packet data service, to a user.
Further, the CP 213 controls data transmission and/or reception of the communication module 230. Although the elements including the CP 213, the power management module 295, and the memory 220 are illustrated as being separate from the AP 211 in
According to an embodiment, the AP 211 or the CP 213 loads a command and/or data received from at least one of a non-volatile memory and/or other elements connected thereto in a volatile memory and then processes the same. Further, the AP 211 or the CP 213 stores data received from and/or generated by at least one of the other elements in a non-volatile memory.
The SIM card 214 is a card implementing a SIM and is inserted in a slot formed at a particular position of an electronic apparatus. The SIM card 214 can include specific identification information, such as an Integrated Circuit Card IDentifier (ICCID), and/or subscriber information, such as an International Mobile Subscriber Identity (IMSI).
The memory 220 includes an internal memory 222 and/or an external memory 224. The memory 220 can be, for example, similar to the memory 130 as illustrated in
The communication module 230 includes a wireless communication module 231 and/or a Radio Frequency (RF) module 234. The communication module 230 can be, for example, similar to the communication module 160 as illustrated in
The RF module 234 performs data transmission/reception, for example, transmission and/or reception of an RF signal and/or a paged electronic signal. The RF module 234 includes, for example, a transceiver, a Power Amplifier Module (PAM), a frequency filter, a Low Noise Amplifier (LNA), and/or the like. Further, the RF module 234 can further include a component for transmitting and/or receiving an electromagnetic wave in a free space in a wireless and/or wired communication, for example, a conductor, a conductive wire, and/or the like.
The sensor module 240 includes, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a Red, Green, Blue (RGB) sensor 240H, a bio-physical sensor 240I, a temperature/humidity sensor 240J, an illuminance sensor 240K, and an Ultra Violet (UV) sensor 240M. The sensor module 240 measures a physical property and/or detects an operation state of an electronic apparatus and converts the measured and/or detected information to an electric signal. Additionally/alternatively, the sensor module 240 includes, for example, an olfactory sensor, such as an E-nose sensor, an Electro MyoGraphy (EMG) sensor, an Electro EncephaloGram (EEG) sensor, an Electro CardioGram (ECG) sensor, a fingerprint sensor, or the like. The sensor module 240 may further include a control circuit for controlling at least one sensor included in the sensor module 240.
The user input module 250 includes a touch panel 252, a pen sensor 254, which may be a digital pen sensor 254, a key 256, and an ultrasonic input device 258. The user input module 250 can be, for example, the user input module 140, as illustrated in
The pen sensor 254 can be implemented, for example, in the same and/or a similar method as that of receiving a user's touch input and/or by using a separate sheet for recognition. For example, a keypad and/or a touch key can be used as the key 256. The ultrasonic input device 258 is a device that identifies data by detecting a sound wave from a terminal to a microphone, such as the microphone 288, through a pen generating an ultrasonic wave signal, and can achieve wireless recognition. According to an embodiment, the hardware 200 receives a user input from an external device, such as a network, a computer, and/or a server connected with the communication module 230, by using the communication module 230.
The display module 260 can include a panel 262 and/or a hologram 264. The display module 260 can be, for example, similar to the display module 150 as illustrated in
The interface 270 includes, for example, a High-Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) 274, a projector 276, and a D-subminiature (D-sub) 278. Additionally or alternatively, the interface 270 can include, for example, an SD drive, a Multi-Media Card (MMC), and/or an Infrared Data Association (IrDA) interface.
The audio codec 280 converts between a voice and an electrical signal. The audio codec 280 converts voice information input and/or output through, for example, a speaker 282, a receiver 284, an earphone 286, and/or the microphone 288.
The camera module 291 is a device capable of photographing a still image and a moving image, and can include at least one image sensor, such as a front lens and/or a rear lens, an Image Signal Processor (ISP), and/or a flash LED according to an embodiment.
The power management module 295 manages power of the hardware 200. The power management module 295 can include, for example, a Power Management IC (PMIC), a charger IC, and/or a battery gauge.
The PMIC can be mounted in, for example, an IC and/or an SoC semiconductor. Charging methods are classified into a wired charging method and a wireless charging method. The charger IC charges a battery and prevents introduction of over-voltage and/or over-current from a charger. According to an embodiment, the charger IC includes a charger IC for at least one of the wired charging method and the wireless charging method. A magnetic resonance scheme, a magnetic induction scheme, and/or an electromagnetic scheme can be exemplified as the wireless charging method, and an additional circuit for wireless charging, such as a coil loop circuit, a resonance circuit, a rectifier circuit, and the like may be added.
The battery gauge measures, for example, a residual quantity of the battery 296, and a voltage, a current, and/or a temperature during the charging. The battery 296 supplies power by generating electricity, and can be, for example, a rechargeable battery.
The indicator 297 displays a specific state, for example, a booting state, a message state, and/or a charging state of the hardware 200 and/or a part of the hardware, such as the AP 211. The motor 298 converts an electrical signal into a mechanical vibration.
The hardware 200 can include a processing unit, such as a GPU, for supporting a mobile TV. The processing unit for supporting a mobile TV processes media data according to a standard of Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), media flow, or the like. Each of the elements of the hardware can be configured by one or more components, which may have different names according to the type of the electronic apparatus. The hardware can include at least one of the aforementioned elements and/or can further include other additional elements, and/or some of the aforementioned elements can be omitted. Further, some of the elements of the hardware according to the present disclosure can be combined into one entity, which can perform the same functions as those of the elements before the combination.
The term “module” used in the present disclosure refers to, for example, a unit including at least one combination of hardware, software, and firmware. The “module” can be interchangeably used with a term, such as unit, logic, logical block, component, and/or circuit. The “module” can be a minimum unit of an integrally configured article and/or a part thereof. The “module” can be a minimum unit performing at least one function and/or a part thereof. The “module” can be mechanically and/or electronically implemented. For example, the “module” can include at least one of an Application-Specific Integrated Circuit (ASIC) chip, a Field-Programmable Gate Array (FPGA), and a programmable-logic device for performing operations which have been known and/or are to be developed hereinafter.
Referring to
The kernel 310, which can be like the kernel 131, includes a system resource manager 311 and/or a device driver 312. The system resource manager 311 can include, for example, a process manager, a memory manager, and a file system manager. The system resource manager 311 can control, allocate, and/or collect system resources. The device driver 312 can include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a WiFi driver, and an audio driver. Further, according to an embodiment, the device driver 312 can include an Inter-Process Communication (IPC) driver (not illustrated).
The middleware 330 includes a plurality of modules implemented in advance for providing functions commonly used by the applications 370. Further, the middleware 330 provides the functions through the API 360 such that the applications 370 can efficiently use restricted system resources within the electronic apparatus. For example, as shown in
The runtime library 335 can include a library module that a compiler uses in order to add a new function through a programming language while one of the applications 370 is being executed. According to an embodiment, the runtime library 335 performs input/output, memory management, and/or an arithmetic function.
The application manager 341 manages a life cycle of at least one of the applications 370. The window manager 342 manages Graphical User Interface (GUI) resources used by a screen. The multimedia manager 343 detects formats used for reproduction of various media files, and performs encoding and/or decoding of a media file by using a codec suitable for the corresponding format. The resource manager 344 manages resources such as a source code, a memory, and a storage space of at least one of the applications 370.
The power manager 345 manages a battery and/or power, while operating together with a Basic Input/Output System (BIOS), and provides power information used for operation. The database manager 346 manages generation, search, and/or change of a database to be used by at least one of the applications 370. The package manager 347 manages installation and/or updates of an application distributed in the form of a package file.
For example, the connectivity manager 348 manages wireless connectivity such as Wi-Fi or Bluetooth. The notification manager 349 displays and/or notifies of an event, such as an arrival message, a promise, a proximity notification, and the like, in such a way that does not disturb a user. The location manager 350 manages location information of an electronic apparatus. The graphic manager 351 manages a graphic effect which will be provided to a user, and/or a user interface related to the graphic effect. The security manager 352 provides all security functions used for system security and/or user authentication. According to an embodiment, when an electronic apparatus, such as the electronic apparatus 100, has a telephone call function, the middleware 330 further includes a telephony manager for managing a voice and/or video communication function of the electronic apparatus.
The middleware 330 generates and uses a new middleware module through various functional combinations of the aforementioned internal element modules. The middleware 330 provides modules specialized according to types of OSs in order to provide differentiated functions. Further, the middleware 330 dynamically removes some of the existing elements and/or adds new elements. Accordingly, the middleware 330 can exclude some of the elements described herein, further include other elements, and/or substitute some of the elements with other elements having different names but performing similar functions.
The API 360, which may be similar to the API 133, is a set of API programming functions, and can be provided with a different configuration according to the OS. For example, in a case of Android or iOS, one API set is provided for each platform, and in a case of Tizen, two or more API sets are provided.
The applications 370 can include, for example, a preloaded application and/or a third party application.
At least a part of the programming module 300 can be implemented by commands stored in computer-readable storage media. When the commands are executed by at least one processor, such as the processor 210, the at least one processor performs functions corresponding to the commands. The computer-readable storage media can be, for example, the memory 204. At least a part of the programming module 300 can be implemented, such as executed, by, for example, the processor 210. At least a part of the programming module 300 can include, for example, a module, a program, a routine, a set of instructions and/or a process for performing at least one function.
The titles of the aforementioned elements of the programming module, such as the programming module 300, can vary depending on the type of the OS. The programming module according to the present disclosure can include at least one of the aforementioned elements and/or can further include other additional elements, and/or some of the aforementioned elements can be omitted. The operations performed by a programming module and/or other elements according to the present disclosure can be processed through a sequential, parallel, repetitive, and/or heuristic method, and some of the operations can be omitted and/or other operations may be added.
Referring to
The processor (for example, the processor 211) analyzes information on the tap to determine a touch position (for example, a touch coordinate). The processor recognizes an object corresponding to the touch position among objects of the webpage 410. For example, the processor distinguishes the objects of the webpage 410 based on, for example, a distinguisher (for example, a delimiter or a frame), a type (for example, an icon, an image, or text), or a hyperlink. The delimiter can be, for example, an arrow, a figure, or a call symbol, and the frame can be, for example, a line between texts or a box.
Further, the processor can determine an object located, among the other objects, in an area corresponding to the touch coordinate (for example, the area closest to the touch coordinate) as the object corresponding to the touch position. The processor executes a function corresponding to the determined object (for example, a function of the electronic device or an application function). For example, the determined object can be linked to content (for example, a previously downloaded webpage or a new webpage which has not been downloaded yet). According to an embodiment, the processor can determine whether the recognized object is linked to the previous webpage or to a new webpage with reference to information related to the corresponding webpage, for example, address information or a reference field.
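As a hedged illustration of this selection step, the following self-contained Java sketch returns the object whose displayed area contains the touch coordinate or, failing that, the object whose area is closest to it. The Bounds and DisplayObject types, and the use of the center-point distance, are assumptions made for illustration rather than the disclosed implementation.

```java
import java.util.List;

final class ObjectPicker {

    // Illustrative on-screen rectangle of a displayed object.
    static final class Bounds {
        final int left, top, right, bottom;

        Bounds(int left, int top, int right, int bottom) {
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }

        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }

        double distanceTo(int x, int y) {
            double dx = x - (left + right) / 2.0;
            double dy = y - (top + bottom) / 2.0;
            return Math.hypot(dx, dy);
        }
    }

    interface DisplayObject {
        Bounds bounds();   // displayed area of the object
        void execute();    // function linked to the object, e.g. opening a hyperlink
    }

    // Returns the object containing the touch point, or otherwise the nearest one.
    static DisplayObject pick(List<DisplayObject> objects, int touchX, int touchY) {
        DisplayObject best = null;
        double bestDistance = Double.MAX_VALUE;
        for (DisplayObject object : objects) {
            if (object.bounds().contains(touchX, touchY)) {
                return object;                              // direct hit wins immediately
            }
            double distance = object.bounds().distanceTo(touchX, touchY);
            if (distance < bestDistance) {
                bestDistance = distance;
                best = object;
            }
        }
        return best;
    }
}
```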
According to an embodiment, when the recognized object is linked to the previous webpage, the processor accesses a memory (for example, the memory 204) to read the previous webpage. When the recognized object is linked to a new webpage, the processor controls a communication module (for example, the communication module 230) to download the new webpage. According to an embodiment, the processor controls the display module 260 to display information designated for loading guidance (for example, a white image) during a time for which the webpage is loaded (for example, a reading time or a downloading time). According to an embodiment, the loading guidance information may not be displayed. For example, a target to be displayed can be changed from the webpage 410 to another webpage without displaying the loading guidance information.
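The loading decision above can be sketched minimally as follows: if the linked page was downloaded before, it is read from a local store (the previous webpage); otherwise it is fetched over the network (the new webpage). The PageLoader name and the URL-keyed cache are assumptions made only for illustration.

```java
import java.util.HashMap;
import java.util.Map;

final class PageLoader {
    private final Map<String, String> cache = new HashMap<>(); // URL -> stored page content

    String load(String url) {
        String cached = cache.get(url);
        if (cached != null) {
            return cached;                 // previous webpage: read from memory
        }
        String downloaded = download(url); // new webpage: fetch via the communication module
        cache.put(url, downloaded);
        return downloaded;
    }

    private String download(String url) {
        // Placeholder for the actual network request.
        return "<html>...</html>";
    }
}
```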
According to an embodiment, the processor controls the display to display a candidate list for a designated time (for example, the loading time). According to an embodiment, the candidate list can include one or more objects close to the recognized object. For example, the processor determines an area configured based on the touch coordinate as an area for determining the candidate list (hereinafter, referred to as a “touch area” for convenience of the description). Further, the processor can determine an object existing within the touch area (for example, a case where at least a part of the object exists within the touch area or the object is completely included within the touch area) as a candidate to be included in the candidate list.
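A minimal sketch of building such a candidate list is shown below. It reuses the illustrative Bounds and DisplayObject types from the earlier sketch and treats the touch area as a square of a preset radius centered on the touch coordinate; the square shape and the intersection test are assumptions rather than the disclosed geometry.

```java
import java.util.ArrayList;
import java.util.List;

final class CandidateListBuilder {

    // Collects every object whose bounds at least partly overlap the square
    // touch area of the given radius around (touchX, touchY).
    static List<ObjectPicker.DisplayObject> candidates(
            List<ObjectPicker.DisplayObject> objects,
            int touchX, int touchY, int radius) {
        List<ObjectPicker.DisplayObject> result = new ArrayList<>();
        for (ObjectPicker.DisplayObject object : objects) {
            ObjectPicker.Bounds b = object.bounds();
            boolean intersects = b.left < touchX + radius && b.right > touchX - radius
                    && b.top < touchY + radius && b.bottom > touchY - radius;
            if (intersects) {
                result.add(object);
            }
        }
        return result;
    }
}
```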
Referring to
According to an embodiment, the candidate list 430 can be displayed together with the execution information corresponding to the recognized object 431 (for example, the webpage 440 displayed through the display). For example, the candidate list 430 can be displayed together with the execution information from a time point when the execution information is displayed on the display. Alternatively, the candidate list 430 can be displayed regardless of the displaying of the execution information corresponding to the recognized object 431. For example, the candidate list 430 can be displayed in advance before the execution information is displayed. Alternatively, the execution information can be first displayed and the candidate list 430 may be displayed based on a new input (for example, a designated touch input or hovering input).
According to an embodiment, the display can display the recognized object 431 with emphasis so that the recognized object 431 is distinguished from the other objects (for example, with a darker background color, as illustrated, and the corresponding text in bold type). Further, the display can display the objects of the candidate list 430 after enlarging them to make them larger than before. Further, the display can display the objects of the candidate list 430 such that the interval between the objects is larger than before. The user 420 can perform a touch input on at least one (for example, the candidate object 432) of the candidate objects of the candidate list 430. Then, the processor can recognize the candidate object 432 corresponding to the touch input among the candidate objects 432, 433, 434, 435, and 436.
Referring to
Referring to
According to an embodiment, when a user input is not recognized for a designated time (for example, the loading time) in a state where the candidate list 430 is displayed, the processor can terminate the displaying of the candidate list 430 and display only the webpage 450.
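The timeout behavior described above can be sketched as follows, assuming the designated time equals the loading time and using illustrative names: a scheduled dismissal of the candidate list is cancelled if a user input arrives first.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

final class CandidateListTimeout {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private ScheduledFuture<?> pendingDismissal;

    // Called when the candidate list is shown; schedules its dismissal.
    void onCandidateListShown(long designatedTimeMs, Runnable dismissCandidateList) {
        pendingDismissal = scheduler.schedule(
                dismissCandidateList, designatedTimeMs, TimeUnit.MILLISECONDS);
    }

    // Called when the user touches the candidate list before the time elapses.
    void onUserInput() {
        if (pendingDismissal != null) {
            pendingDismissal.cancel(false);  // keep the candidate list on screen
            pendingDismissal = null;
        }
    }
}
```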
According to an embodiment, the termination button 433 can be inserted into the candidate list 430 based on the displaying of the candidate list 430 and be provided to the user together with the candidate list 430. According to another embodiment, the termination button 433 may not be displayed in the candidate list 430 and then can be displayed in the candidate list 430 based on a new user input when the new user input (for example, an input touching the candidate list 430 or a hovering input related to the candidate list 430) is obtained.
Referring to
Referring back to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
According to an embodiment, the object can be a text input box 730 illustrated in
Referring to
Referring to
Referring to
Referring to
In Table 1, the finger down can be a gesture in which an object (for example, a finger) contacts the touch screen, the movement can be a gesture in which an object moves while in contact with the touch screen, and the finger up can be a gesture in which the contact of an object is released from the touch screen. Alternatively, in Table 1, the finger down can be a gesture in which an object comes close to the touch screen within a preset distance, the movement can be a gesture in which an object moves in a state where the object remains close to the touch screen within the preset distance, and the finger up can be a gesture in which an object moves away from the touch screen by more than the preset distance.
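As a hedged example of how the primitive events named above (finger down, movement, finger up) might be combined into a recognized gesture, the following sketch distinguishes a tap from a drag using an assumed movement threshold; Table 1 itself is not reproduced here, and the event shape and threshold value are assumptions.

```java
final class GestureClassifier {

    enum Gesture { TAP, DRAG }

    private static final double MOVE_THRESHOLD_PX = 10.0;  // assumed threshold

    private double downX, downY;
    private boolean moved;

    void onFingerDown(double x, double y) {
        downX = x;
        downY = y;
        moved = false;
    }

    void onMovement(double x, double y) {
        if (Math.hypot(x - downX, y - downY) > MOVE_THRESHOLD_PX) {
            moved = true;
        }
    }

    // Called on finger up; the accumulated movement decides the gesture.
    Gesture onFingerUp() {
        return moved ? Gesture.DRAG : Gesture.TAP;
    }
}
```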
Referring to
Referring to
There are a variety of user gestures which can be recognized by the processor 211. For example, referring to
According to this disclosure, when the electronic device recognizes an object selected by the user from among other objects, the electronic device executes a function of the recognized object and displays a candidate list. The candidate list can include all objects which have not been selected. Alternatively, the electronic device can determine only some of the objects which have not been selected as candidates and display the determined objects.
According to this disclosure, the electronic device recognizes a user gesture, executes a function of the recognized gesture, and displays information (for example, an icon) indicating the candidate gesture. The electronic device can determine all gestures which can be recognized in a target to be displayed (for example, a webpage) as candidates. Alternatively, the electronic device can determine a gesture related to the recognized gesture among all the gestures as a candidate.
The processor (for example, the processor 211) selects a candidate object from the other objects and selects a candidate gesture from the gestures based on at least one of a touch position, history information, sensitivity, and frequency shown in Table 2 below.
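A hedged sketch of such a selection follows: each remaining object (or gesture) is given a score combining the criteria listed above, and the highest-scoring entries become candidates. The weights, the normalization to the 0..1 range, and the field names are assumptions made for illustration, and Table 2 itself is not reproduced here.

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

final class CandidateScorer {

    static final class Candidate {
        final String id;
        final double normalizedDistance;  // 0 = at the touch position, 1 = far away
        final double historyScore;        // how often it was chosen after a correction
        final double sensitivity;         // e.g. how error-prone its screen region is
        final double frequency;           // how often its function is used overall

        Candidate(String id, double normalizedDistance, double historyScore,
                  double sensitivity, double frequency) {
            this.id = id;
            this.normalizedDistance = normalizedDistance;
            this.historyScore = historyScore;
            this.sensitivity = sensitivity;
            this.frequency = frequency;
        }

        double score() {
            // Closer and more frequently chosen objects rank higher; weights are arbitrary.
            return 0.4 * (1.0 - normalizedDistance)
                    + 0.3 * historyScore
                    + 0.2 * sensitivity
                    + 0.1 * frequency;
        }
    }

    // Returns the top-ranked candidates, best first.
    static List<Candidate> topCandidates(List<Candidate> all, int limit) {
        return all.stream()
                .sorted(Comparator.comparingDouble(Candidate::score).reversed())
                .limit(limit)
                .collect(Collectors.toList());
    }
}
```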
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
In an embodiment, a method includes displaying a plurality of objects through a display functionally connected to the electronic device. The method also includes obtaining an input corresponding to a first object among the plurality of objects. The method further includes determining a second object related to the input among the plurality of objects. The method includes displaying execution information of a function corresponding to the first object and object information related to the second object through the display.
The determining of the second object can include determining a touch area related to the input and selecting an object of which at least a part is displayed in the touch area as the second object.
The displaying of the execution information and the object information can include simultaneously displaying the execution information and the object information. Alternatively, the displaying of the execution information and the object information can include displaying the execution information. The method can also include obtaining a designated user input related to the display. The method can further include displaying the object information based on the designated user input. Alternatively, the displaying of the execution information and the object information can include displaying object information related to the first object.
The method can further comprise canceling an execution of the function corresponding to the first object in response to an input corresponding to the object information related to the second object.
The method can further comprise obtaining a second input corresponding to the object information related to the second object and displaying execution information related to a function corresponding to the second input.
The method can further comprise terminating the displaying of the object information when a preset time elapses. The preset time can include a loading time for which data for the execution of the function is loaded. The loading time can include a time for which the data is read from a memory or a time for which the data is downloaded from an external device. While the data is loaded, designated information for loading guidance is displayed together with the object information.
The displaying of the execution information and the object information can include determining one or more objects as a candidate object from the plurality of objects except for the first object, determining one or more second inputs except for the input as a candidate input, and displaying input information related to the candidate input and the candidate object. The determining of the candidate input can include determining one or more inputs related to the input as the candidate input based on sub inputs of the input.
The determining of the second object can include determining a touch position of a touch screen corresponding to the input. The method can also include determining, as the touch area, a preset area having the touch position as its center. The method can further include determining an object of which at least a part exists within the touch area as a candidate object.
In an embodiment, a method can include obtaining an input by a user. The method can also include displaying execution information of a function corresponding to the obtained input and input information related to one or more inputs except for the obtained input through a display functionally connected to the electronic device.
In an embodiment, an electronic device can include a display module displaying a plurality of objects. The electronic device can also include a touch panel installed in a touch screen of the display module. The electronic device can further include a processor. The processor obtains an input corresponding to a first object among the objects through the touch panel, determines a second object related to the input among the objects, and controls the display module to display execution information of a function corresponding to the first object and object information related to the second object.
The processor can determine a touch area related to the input and select an object of which at least a part is displayed in the touch area as the second object.
The processor can cancel an execution of the function corresponding to the first object in response to an input corresponding to the object information related to the second object.
The processor can obtain a second input corresponding to the object information related to the second object and control the display module to display execution information of a function corresponding to the second input.
In an embodiment, an electronic device can include a display module including a touch screen with a touch panel. The electronic device can also include a processor configured to obtain an input of a user through the touch panel and control the display module to display execution information of a function corresponding to the obtained input and input information related to one or more inputs except for the obtained input.
The method according to this disclosure as described above can be implemented as a program command which can be executed through various computers and recorded in a computer-readable recording medium. The recording medium can include a program command, a data file, and a data structure. Further, the program command can be specially designed and configured for the present disclosure or can be one known to and usable by those skilled in the computer software field. The recording medium can include magnetic media such as a hard disk, a floppy disk and a magnetic tape, optical media such as a Compact Disc Read-Only Memory (CD-ROM) and a Digital Versatile Disc (DVD), magneto-optical media such as a floptical disk, and hardware devices such as a Read-Only Memory (ROM), a Random Access Memory (RAM) and a flash memory. In addition, the program instructions can include high-level language code, which can be executed in a computer by using an interpreter, as well as machine code generated by a compiler.
Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.