This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Dec. 13, 2013 in the Korean Intellectual Property Office and assigned Serial number 10-2013-0155240, the entire disclosure of which is hereby incorporated by reference.
The present disclosure relates to a method of displaying a touch indicator and an electronic device thereof.
In general, electronic devices of various types, such as smart phones or tablet Personal Computers (PCs), may perform two-way communication with one or more other electronic devices through several communication networks, such as a mobile communication network, a Wi-Fi communication network, and a Bluetooth (BT) communication network. For example, a first user having a first electronic device and a second user having a second electronic device may share one object in real time through two-way communication. In addition, the first electronic device may be set to a master terminal and the second electronic device may be set to a slave terminal. Alternatively, the first electronic device may be set to a slave terminal and the second electronic device may be set to a master terminal.
The first user and the second user may perform various object co-production operations in real time, such that the first user edits the shared object using the master terminal and the second user edits the shared object using the slave terminal.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method of displaying a touch indicator, and an electronic device thereof, in which an electronic device of any of various types, such as a smart phone or a tablet PC, acquires object touch information of another electronic device and displays an indicator that differentiates a surface touch from a hovering touch while co-producing an object with the other electronic device through two-way communication.
In accordance with an aspect of the present disclosure, an operation method of an electronic device is provided. The operation method includes transmitting an object to be co-produced to another electronic device and sharing the object with the other electronic device, acquiring touch information of the object from a message received from the other electronic device, classifying a surface touch and a hovering touch of the object based on the touch information, and displaying an indicator that differentiates the surface touch from the hovering touch.
In accordance with another aspect of the present disclosure, an electronic device is provided. The electronic device includes a communication module, a touch screen panel configured to detect a surface touch and a hovering touch, and a processor configured to control the communication module and the touch screen panel, wherein the processor transmits an object to be co-produced to another electronic device and shares the object with the other electronic device, acquires touch information of the object from a message received from the other electronic device, classifies a surface touch and a hovering touch of the object based on the touch information, and displays an indicator that differentiates the surface touch from the hovering touch.
In accordance with another aspect of the present disclosure, a computer-readable medium is provided which stores one or more programs including instructions for allowing an electronic device to transmit an object to be co-produced to another electronic device and share the object with the other electronic device, acquire touch information of the object from a message received from the other electronic device, classify a surface touch and a hovering touch of the object based on the touch information, and display an indicator that differentiates the surface touch from the hovering touch.
In accordance with another aspect of the present disclosure, an operation method of editing an object using an electronic device is provided. The method includes displaying at least a part of an object on a touch screen, receiving, from another electronic device, touch information relating to the object, determining, based on the received touch information relating to the object, at least one of a location on the object at which the object is being edited on the other electronic device and a type of input to the object being made on the other electronic device, and displaying an indicator overlaid with the at least the part of the object so as to indicate the at least one of the location on the object at which the object is being edited on the other electronic device and the type of input to the object being made on the other electronic device.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
An electronic device according to various embodiments of the present disclosure may be a device including a communication function. For example, the electronic device may be one or a combination of one or more of various devices, such as a smart phone, a tablet PC, a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a Moving Picture Experts Group (MPEG) layer 3 (MP3) player, a mobile medical device, an electronic bracelet, an electronic necklace, electronic appcessories, a camera, a wearable device, an electronic clock, a wristwatch, smart white appliances (e.g., a refrigerator, an air conditioner, a cleaner, a cybot, a TV, a Digital Versatile Disc (DVD) player, an audio system, an oven, a microwave oven, a washing machine, an air cleaner, an electronic picture frame, and/or the like), various medical devices (e.g., a Magnetic Resonance Angiography (MRA) device, a Magnetic Resonance Imaging (MRI) device, a Computed Tomography (CT) device, an imaging apparatus, an ultrasonic machine, and/or the like), a navigation device, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a set-top box, a TV box (e.g., Samsung HomeSync™, Apple TV™, or Google TV™), an electronic dictionary, a car infotainment device, electronic equipment for a ship (e.g., a navigation device for a ship, a gyrocompass, and/or the like), avionics, a security device, electronic clothes, an electronic key, a camcorder, a game console, a Head Mounted Display (HMD), a flat panel display, an electronic album, a part of furniture or a building/structure including a communication function, an electronic board, an electronic signature receiving device, or a projector. It is obvious to a person skilled in the art that the electronic device according to various embodiments of the present disclosure is not limited to the above-described devices.
Referring to
The bus 110 may be a circuit which may connect the above-described components with each other and transmit communication (e.g., a control message) between the components.
The processor 120 may receive, for example, commands from the above-described other components (e.g., the memory 130, the user input module 140, the display module 150, the communication module 160, and/or the like) through the bus 110, decode the received commands, and perform calculation or data processing according to the decoded commands.
The memory 130 may store commands or data which are received from the processor 120 or the other components (e.g., the user input module 140, the display module 150, the communication module 160, and/or the like) or are generated by the processor 120 or the other components. The memory 130 may include programming modules such as a kernel 131, a middleware 132, an Application Programming Interface (API) 133, or an application 134. Herein, the above-described respective programming modules may be composed of software, firmware, hardware, or combination of at least two or more of them.
The kernel 131 may control or manage system resources (e.g., the bus 110, the processor 120, or the memory 130, and/or the like) used to execute an operation or function implemented in the other programming modules, for example, the middleware 132, the API 133, or the application 134.
The kernel 131 may provide an interface through which the middleware 132, the API 133, or the application 134 may access an individual component of the electronic device 100 and control or manage the individual component.
The middleware 132 may act as an intermediary such that the API 133 or the application 134 communicates with the kernel 131 to transmit and receive data. In addition, the middleware 132 may perform load balancing for work requests received from the plurality of applications 134, for example, by assigning, to at least one of the plurality of applications 134, a priority for using the system resources (the bus 110, the processor 120, or the memory 130, and/or the like) of the electronic device 100 in association with the work requests.
The API 133 is an interface through which the application 134 may control a function provided from the kernel 131 or the middleware 132. For example, the API 133 may include at least one interface or function for file control, window control, image processing, or text control.
The user input module 140 may receive, for example, commands or data from the user and transmit the received commands or data to the processor 120 or the memory 130 through the bus 110.
The display module 150 displays videos, images, or data to the user.
The communication module 160 may perform communication between another electronic device 102 and the electronic device 100. The communication module 160 may support a local-area communication protocol (e.g., Wi-Fi, BT, and Near Field Communication (NFC)), or certain network communication 162 (e.g., the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a telecommunication network, a cellular network, a satellite network, or a Plain Old Telephone Service (POTS), and/or the like). According to various embodiments of the present disclosure, the electronic device 100 may communicate with one or more of the electronic device 102, the electronic device 104, a server 164, and/or the like over the network 162. Each of the other electronic devices 102 and 104 may be the same (e.g., the same type) device as the electronic device 100 or a device (e.g., a different type) which is different from the electronic device 100.
Referring to
The processor 210 (e.g., the processor 120) may include one or more Application Processors (APs) 211 and/or one or more Communication Processors (CPs) 213. The processor 210 may be, for example, the processor 120 shown in
The AP 211 may execute an OS or an application program, control a plurality of hardware or software components connected thereto, and process and calculate various data including multimedia data. The AP 211 may be implemented as, for example, System on Chip (SoC). In accordance with an embodiment of the present disclosure, the processor 210 may further include a Graphic Processing Unit (GPU) (not shown).
The CP 213 may perform a function for managing a data link in communication between an electronic device (e.g., the electronic device 100) including the hardware 200 and other electronic devices connected with the electronic device through a network and changing a communication protocol. The CP 213 may be implemented as, for example, SoC. In accordance with an embodiment of the present disclosure, the CP 213 may perform at least a part of a multimedia control function. The CP 213 may identify and authenticate, for example, a terminal in a communication network using a SIM (e.g., the SIM card 214). In addition, the CP 213 may provide services, such as a voice communication service, a video communication service, a text message service, or a packet data service, to a user of the hardware 200.
In addition, the CP 213 may control data transmission and reception of the communication module 230. Referring to
In accordance with an embodiment of the present disclosure, the AP 211 or the CP 213 may load, into a volatile memory, commands or data received from at least one of a non-volatile memory or another component connected thereto, and process the loaded commands or data. In addition, the AP 211 or the CP 213 may store, in a non-volatile memory, data which are received from or generated by at least one of the other components.
The SIM card 214 may be a card implementing a SIM. The SIM card 214 may be inserted into a slot formed in a specific position of the electronic device. The SIM card 214 may include unique identification information (e.g., an Integrated Circuit Card IDentity (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
The memory 220 may include an internal memory 222 and/or an external memory 224. The memory 220 may be, for example, the memory 130 shown in
In accordance with an embodiment of the present disclosure, the internal memory 222 may take the form of a Solid State Disk (SSD). The external memory 224 may further include, for example, a Compact Flash (CF) card, a Secure Digital (SD) card, a micro-SD card, a mini-SD card, an extreme Digital (xD) card, or a memory stick, and/or the like.
The communication module 230 may include a wireless communication module 231 or a Radio Frequency (RF) module 234. The communication module 230 may be, for example, the communication module 160 shown in
Additionally or alternatively, the wireless communication module 231 may include a network interface (e.g., a LAN card), a modem, and/or the like for connecting the hardware 200 with the network (e.g., the Internet, a LAN, a WAN, a telecommunication network, a cellular network, a satellite network, or a POTS, and/or the like).
The RF module 234 may be in charge of transmitting and receiving data, for example, transmitting and receiving an RF signal or another electronic signal. Although it is not shown in
The sensor module 240 may include, for example, at least one of a gesture sensor 240A, a gyro sensor 240B, an atmospheric pressure sensor 240C, a magnetic sensor 240D, an acceleration sensor 240E, a grip sensor 240F, a proximity sensor 240G, a Red, Green, and Blue (RGB) sensor 240H, a bio-sensor 240I, a temperature/humidity sensor 240J, an illumination sensor 240K, or an Ultra Violet (UV) sensor 240M. The sensor module 240 may measure a physical quantity or detect an operation state of the electronic device, and convert the measured or detected information into an electric signal.
Additionally or alternatively, the sensor module 240 may include, for example, an Electronic nose (E-nose) sensor (not shown), an ElectroMyoGraphy (EMG) sensor (not shown), an ElectroEncephaloGram (EEG) sensor (not shown), an ElectroCardioGram (ECG) sensor (not shown), or a fingerprint sensor (not shown), and/or the like. The sensor module 240 may further include a control circuit for controlling at least one or more sensors included therein.
The user input module 250 may include a touch panel 252, a (digital) pen sensor 254, a key 256, an ultrasonic input device 258, and/or the like. The user input module 250 may be, for example, the user input module 140 shown in
In addition, the touch panel 252 may further include a controller (not shown). In the case of the capacitive type, the touch panel 252 may recognize not only direct touch input but also proximity touch input. The touch panel 252 may further include a tactile layer. If the touch panel 252 includes a tactile layer, the touch panel 252 may provide a tactile response to the user.
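As a purely illustrative aside, on a platform such as Android the direct touch input and the proximity (hovering) touch input mentioned above arrive through separate listener callbacks. The following Java sketch only illustrates that distinction under that assumption; it is not the controller of the touch panel 252, and the class name is hypothetical.

import android.view.MotionEvent;
import android.view.View;

/**
 * Illustrative-only sketch: on Android, direct touch input and proximity
 * (hover) input reach a View through separate listeners. This is a general
 * platform example, not the controller of the touch panel 252.
 */
public class TouchHoverListeners {

    public static void attach(View view) {
        // Direct (surface) touch input.
        view.setOnTouchListener((v, event) -> {
            if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
                // A finger or pen is in contact with the panel surface.
            }
            return false; // do not consume; let the view handle the event normally
        });

        // Proximity (hovering) touch input.
        view.setOnHoverListener((v, event) -> {
            if (event.getActionMasked() == MotionEvent.ACTION_HOVER_MOVE) {
                // A finger or pen is hovering above the panel surface.
            }
            return false;
        });
    }
}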
The (digital) pen sensor 254 may be implemented, for example, using a method that is the same as or similar to the method of receiving touch input of the user, or using a separate sheet for recognition.
The key 256 may be, for example, a keypad or a touch key.
The ultrasonic input device 258 is a device which may detect sound waves using a microphone (e.g., the microphone 288) and verify data in the electronic device through a pen which generates ultrasonic waves. The ultrasonic input device 258 may perform wireless recognition. In accordance with an embodiment of the present disclosure, the hardware 200 may receive input of the user from an external device (e.g., the network 102 of
The display module 260 may include a panel 262 and/or a hologram 264. The display module 260 may be, for example, the display module 150 shown in
The panel 262 and the touch panel 252 may be integrated with each other to constitute one module. The hologram 264 shows stereoscopic images in the air using interference of light. In accordance with an embodiment of the present disclosure, the display module 260 may further include a control circuit for controlling the panel 262 or the hologram 264.
The interface 270 may include, for example, a High Definition Multimedia Interface (HDMI) 272, a Universal Serial Bus (USB) interface 274, a projector 276, or a D-sub (subminiature) interface 278. Additionally or alternatively, the interface 270 may include, for example, a Secure Digital/Multi-Media Card (SD/MMC) interface (not shown) or an Infrared Data Association (IrDA) interface (not shown).
The audio codec 280 may convert between voices and electric signals in both directions. The audio codec 280 may convert, for example, voice information input or output through a speaker 282, a receiver 284, an earphone 286, or the microphone 288.
The camera module 291 may be a device which may capture images and videos. In accordance with an embodiment of the present disclosure, the camera module 291 may include, for example, one or more image sensors (e.g., a front lens or a rear lens) (not shown), an Image Signal Processor (ISP) (not shown), or a flash LED (not shown).
The power management module 295 may manage power of the hardware 200. Although it is not shown in
The PMIC may be mounted in, for example, an IC or an SoC semiconductor. A charging method of the power management module 295 may be classified into a wired charging method or a wireless charging method. The charger IC may charge a battery and prevent inflow of over-voltage or over-current from a charger.
In accordance with an embodiment of the present disclosure, the charger IC may include a charger IC for at least one of the wired charging method or the wireless charging method. The wireless charging method may be, for example, a magnetic resonance method, a magnetic induction method, or an electromagnetic wave method. In the wireless charging method, additional circuits, for example, a coil loop, a resonance circuit, a rectifier, and/or the like for wireless charging may be added.
The battery fuel gauge may measure, for example, the remaining capacity of the battery 296, a voltage during charging, a current, or a temperature. The battery 296 may generate electricity and supply power. For example, the battery 296 may be a rechargeable battery.
The indicator 297 may indicate a specific state, for example, a booting state, a message state, a charging state, and/or the like of the hardware 200 or a part (e.g., the AP 211) of the hardware 200. The motor 298 may convert an electric signal into a mechanical vibration. A Micro Control Unit (MCU) may control the sensor module 240. Although it is not shown in
Names of the above-described components of the hardware according to an embodiment of the present disclosure may differ according to the kind of electronic device. The hardware according to an embodiment of the present disclosure may be configured to include at least one of the above-described components. Some components of the hardware may be omitted or the hardware may further include other additional components. In addition, some of the components of the hardware according to an embodiment of the present disclosure may be combined and configured as one entity, which may equally perform the functions of the corresponding components before being combined.
Referring to
The programming module 300 may include an OS which is implemented in hardware (e.g., the hardware 200) and controls resources related to an electronic device (e.g., the electronic device 100) or a plurality of applications (e.g., an application 370) executed in the OS. For example, the OS may be Android, iOS, Windows, Symbian, Tizen, Bada, or the like. Referring to
The kernel 310 (e.g., the kernel 131) may include a system resource manager 311 or a device driver 312. The system resource manager 311 may include, for example, a process management unit, a memory management unit, a file system management unit, and/or the like. The system resource manager 311 may control, assign, collect, and/or the like system resources. The device driver 312 may include, for example, a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, or an audio driver. In addition, in accordance with an embodiment of the present disclosure, the device driver 312 may include an Inter-Process Communication (IPC) driver (not shown).
The middleware 330 may include a plurality of modules which are previously implemented to provide functions the application 370 needs in common. In addition, the middleware 330 may provide functions through the API 360 such that the application 370 uses limited system resources in the electronic device efficiently.
For example, as shown in
The runtime library 355 may include, for example, a library module used by a compiler to add a new function through a programming language while the application 370 is executed. In accordance with an embodiment of the present disclosure, the runtime library 355 may perform a function for input and output, memory management, or an arithmetic function.
The application manager 341 may manage, for example, a life cycle of at least one of the applications 370.
The window manager 342 may manage Graphic User Interface (GUI) resources used on a screen of the electronic device.
The multimedia manager 343 may determine a format necessary for reproducing various media files and encode or decode a media file using a codec corresponding to the corresponding format.
The resource manager 344 may manage source codes of at least one of the applications 370, and manage resources of a memory or storage, and/or the like.
The power manager 345 may operate together with a Basic Input/Output System (BIOS) to manage a battery or a power source, and may provide power information necessary for an operation.
The database manager 346 may perform a management operation to generate, search, or change a database to be used in at least one of the applications 370.
The package manager 347 may manage installation or update of an application distributed in the form of a package file.
The connectivity manager 348 may manage, for example, wireless connection such as Wi-Fi, BT, and/or the like.
The notification manager 349 may display or notify the user of events, such as an arrival message, an appointment, and a proximity notification, in a manner that does not disturb the user.
The location manager 350 may manage location information of the electronic device.
The graphic manager 351 may manage a graphic effect to be provided to the user or a UI related to the graphic effect.
The security manager 352 may provide all security functions necessary for system security or user authentication, and/or the like.
In accordance with an embodiment of the present disclosure, when the electronic device (e.g., the electronic device 100) has a phone function, the middleware 330 may further include a telephony manager (not shown) for managing a voice or video communication function of the electronic device. The middleware 330 may generate and use a new middleware module through a combination of various functions of the above-described internal component modules. The middleware 330 may provide modules specialized according to the kind of OS in order to provide differentiated functions.
In addition, the middleware 330 may dynamically delete some of the old components or add new components. In addition, some of the components described in various embodiments of the present disclosure may be omitted, other components may be further added, or components having different names but performing similar functions may be substituted.
The API 360 (e.g., the API 133), as a set of API programming functions, may be provided with a different configuration according to the OS. For example, in the case of Android or iOS, one API set may be provided per platform. In the case of Tizen, for example, two or more API sets may be provided.
The application 370 (e.g., the application 134) may include, for example, a preloaded application or a third party application. At least a part of the programming module 300 may be implemented as instructions stored in computer-readable storage media. One or more processors may perform functions corresponding to the instructions when the instructions are executed by the one or more processors (e.g., the processor 210). The application 370 may be or otherwise include one or more of a home application 371, a dialer application 372, a short messaging service (SMS)/multimedia messaging service (MMS) application 373, an instant messaging (IM) application 374, a browser application 375, a camera application 376, an alarm application 377, a contacts application 378, a voice dial application 379, an email application 380, a calendar application 381, a media player application 382, an album application 383, a clock application 384, and/or the like.
The computer-readable storage media (e.g., a non-transitory computer-readable storage medium) may be, for example, the memory 220 shown in
Names of the components of the programming module (e.g., the programming module 300) according to an embodiment of the present disclosure may differ according to the kind of OS. In addition, the programming module according to an embodiment of the present disclosure may include at least one or more of the above-described components. Some of the components may be omitted. The programming module according to an embodiment of the present disclosure may further include other additional components.
Hereinafter, a description will be given for an operation of the present disclosure in detail with reference to the attached drawings. When it is determined that a detailed description of related well-known functions or constitutions may unnecessarily obscure the subject matter of the present disclosure, the detailed description may be omitted. In addition, terms which will be described later are terms defined in consideration of functions in various embodiments of the present disclosure and may be changed according to the intent of a user or an operator, or according to custom, and/or the like. Therefore, the definitions may be given based on content throughout the present specification.
Hereinafter, a description will be given in detail for a method of displaying a touch indicator and an electronic device thereof according to various embodiments of the present disclosure. The electronic device, such as a smart phone or a tablet PC, according to various embodiments of the present disclosure may include a Touch Screen Panel (TSP). The TSP may be a capacitive-type TSP, and/or the like, which has strong durability, a short response time, and excellent transparency.
Referring to
A transceiver which transmits a pulse signal may be formed as a lateral X-pattern in a contact surface of the ITO film 1, which comes in contact with the OCA 1. A receiver which receives the pulse signal may be formed as a longitudinal Y-pattern in a contact surface of the ITO film 2, which comes in contact with the OCA 2. As an example, referring to
Referring to
As shown in
In contrast, when there is a surface touch of the user, because some of the pulse signals transmitted to the transceiver Tx of the X pattern are induced into the touched finger of the user, only the remainder is received in the receiver Rx of the Y pattern. A coupling voltage between the transceiver Tx and the receiver Rx may therefore be detected as a voltage which is lower than the reference voltage (e.g., V_ref=1.0V). A coupling voltage falling from the reference voltage may be referred to by various names, such as a falling coupling voltage (V_fall).
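For illustration, the voltage comparison described above could be summarized as in the following Java sketch. It assumes, consistent with the contrast drawn here (the hovering case is described with reference to the drawing), that a hovering touch yields a coupling voltage above the reference voltage while a surface touch yields one below it; the class name, enum, and threshold values are assumptions for the example only.

/**
 * Minimal illustration of classifying a touch from the coupling voltage.
 * Assumptions (not part of the disclosure): a hovering touch raises the
 * coupling voltage above the reference voltage, a surface touch lowers it,
 * and V_REF and MARGIN are example values only.
 */
public final class TouchClassifier {

    public enum TouchKind { NONE, HOVERING, SURFACE }

    private static final double V_REF = 1.0;    // reference coupling voltage (example)
    private static final double MARGIN = 0.05;  // noise margin (example)

    /** Classify a measured coupling voltage relative to the reference voltage. */
    public static TouchKind classify(double couplingVoltage) {
        if (couplingVoltage > V_REF + MARGIN) {
            return TouchKind.HOVERING;  // rising coupling voltage (V_rise)
        }
        if (couplingVoltage < V_REF - MARGIN) {
            return TouchKind.SURFACE;   // falling coupling voltage (V_fall)
        }
        return TouchKind.NONE;          // no touch detected
    }
}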
Referring to
The electronic device according to an embodiment of the present disclosure may include the components shown in
The processor 210 may set an object co-production mode, control an operation of the communication module 230, perform two-way communication with at least one or more other electronic devices, share an object to be co-produced in real time, according to a request of the user, and/or the like. For example, the object may be content of various types such as drawing, coloring, and writing a document.
Referring to
The terminal 1 transmits an original object to be co-produced to the terminal 2, shares the original object with the terminal 2, and co-produces an object in real time with the terminal 2 through two-way communication. For example, a user 1 using the terminal 1 may perform a surface or hovering touch on or above a specific position of an object displayed on a TSP of the terminal 1 and edit the object.
Similarly, a user 2 using the terminal 2 may perform a surface or hovering touch on or above a specific position of an object displayed on a TSP of the terminal 2 and edit the object. The terminal 1 may merge the object edited by the user 1 with the object edited by the user 2, display the merged object, transmit the merged object, and share the merged object with the terminal 2 in real time. For example, when the user 2 performs a surface or hovering touch on or above the TSP, the terminal 2 transmits touch information corresponding to the surface or hovering touch to the terminal 1. The terminal 1 receives the touch information and displays an indicator which may differentiate a surface touch or a hovering touch on the object.
According to various embodiments of the present disclosure, at least one or more of a shape, a color, and a luminance of the indicator may be differently displayed to differentiate the surface touch from the hovering touch.
Referring to
In contrast, as shown in
Referring to
The processor 210 of the terminal 1 transmits an object to be co-produced to the terminal 2 of the slave, shares the object with the terminal 2 of the slave, and performs an operation of co-producing the object in real time through two-way communication at operation S11.
The processor 210 of the terminal 1 receives a message transmitted from the terminal 2 through the communication module 230. The processor 210 of the terminal 1 acquires touch information of an object displayed on the terminal 2 at operation S13 and classifies a surface touch and a hovering touch of the object based on the touch information.
When the touch information is information indicating the surface touch at operation S14, the processor 210 of the terminal 1 displays an indicator 1 on a position of the object at which the surface touch is generated at operation S15. When the touch information is information indicating the hovering touch at operation S16, the processor 210 of the terminal 1 displays an indicator 2 on a position of the object at which the hovering touch is generated at operation S17. The processor 210 of the terminal 1 determines whether the co-production is ended at operation S18. The indicator 1 indicating the surface touch may be, as shown in
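As a rough sketch of operations S13 through S17, the Java code below shows how a master terminal might classify received touch information and choose between the indicator 1 and the indicator 2. It assumes, as described further below, that a Z coordinate of zero indicates a surface touch; the message fields, class names, and display methods are hypothetical placeholders rather than an implementation of the disclosure.

/**
 * Hypothetical sketch of operations S13-S17: acquire touch information from a
 * received message, classify it, and display indicator 1 or indicator 2.
 * The TouchMessage fields and the display methods are placeholders.
 */
public class CoProductionController {

    /** Placeholder for the touch information carried in a received message. */
    public static class TouchMessage {
        public float x;  // touch position X
        public float y;  // touch position Y
        public float z;  // 0 for a surface touch, > 0 for a hovering touch
    }

    public void onMessageReceived(TouchMessage msg) {
        if (msg.z == 0f) {
            // Surface touch: display indicator 1 at the touched position (S15).
            displaySurfaceIndicator(msg.x, msg.y);
        } else {
            // Hovering touch: display indicator 2 at the hovered position (S17).
            displayHoveringIndicator(msg.x, msg.y);
        }
    }

    private void displaySurfaceIndicator(float x, float y)  { /* draw indicator 1 */ }
    private void displayHoveringIndicator(float x, float y) { /* draw indicator 2 */ }
}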
In addition, one or more of a shape, a color, and a luminance of each of the indicators 1 and 2 may be differently displayed. The user may verify the indicator 1 and the indicator 2 and thereby differentiate the surface touch from the hovering touch. The message transmitted from the terminal 2 may include one or more of identification information of the terminal 2, touch information of the object, and type and size information of the object. This information may be referred to by various names, such as object co-production information.
The object co-production information may include, for example, as shown in
The touch position and the touch type may be expressed as a coordinate (X, Y, Z) value. If the coordinate (Z) value is zero, the touch type may indicate the surface touch (e.g., Z=0). If the coordinate (Z) value is greater than zero, the touch type may indicate the hovering touch (e.g., Z>0). The touch state may indicate one or more of a touch press, a touch move, a touch release, a touch speed, a multi-touch, and a hovering touch depth. For example, as shown in
In addition, as shown in
The touch position and the touch type, as mandatory information, are included in touch information of the object. The touch state, as optional information, may not be included in touch information of the object. The type information of the object indicates whether the corresponding object is an original object or a partial object which is at least one of a plurality of partial objects. The size information of the object indicates a size of an original object and a size of a partial object.
For example, the size of the original object indicates a horizontal length and a vertical length. As another example, the size of the partial object may be indicated as a specific start position (start_position_(X, Y)) divided based on the original object and a horizontal width and a vertical height based on the start position. As another example, the size of the partial object may be indicated simply as a coordinate (X, Y, W, H) value.
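For illustration only, the object co-production information described above might be grouped as in the following Java sketch; the class and field names are assumptions for the example, and the actual message format is not limited to this layout.

/**
 * Illustrative grouping of the object co-production information described
 * above. All names here are assumptions for the sketch, not a mandated format.
 */
public class CoProductionInfo {

    /** Identification information of the transmitting terminal. */
    public String terminalId;

    /** Touch information of the object (touch position and touch type are mandatory). */
    public float touchX;
    public float touchY;
    public float touchZ;  // Z = 0: surface touch, Z > 0: hovering touch

    /** Optional touch state (press, move, release, speed, multi-touch, hover depth). */
    public String touchState;

    /** Type information of the object: original object or partial object. */
    public boolean isPartialObject;

    /** Size information: original size, or a partial region (X, Y, W, H) of it. */
    public int originalWidth;
    public int originalHeight;
    public int partX, partY, partWidth, partHeight;

    /** Convenience check corresponding to the Z-coordinate convention above. */
    public boolean isSurfaceTouch() {
        return touchZ == 0f;
    }
}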
The processor 210 of the terminal 1 transmits an object on which the indicator 1 or the indicator 2 is displayed to the terminal 2 and shares the object co-production process in real time with the terminal 2. In addition, the processor 210 of the terminal 1 may divide an original object to be co-produced into a plurality of partial objects, transmit the partial objects to other electronic devices, and share the partial objects with the other electronic devices.
Referring to
The partial object 1 may be displayed on the terminal 1 and the partial object 2 may be displayed on the terminal 2. For example, referring to
Referring to
The processor 210 of the terminal 1 merges a partial object 1 with a partial object 2 of the terminal 2 and completes the object co-production. For example, the processor 210 of the terminal 1 may merge the partial object 1 with the partial object 2 based on an area or a layer.
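As a minimal sketch of an area-based merge, assuming each partial object is received as a bitmap together with its (X, Y) start position in the original object, the following Java code copies each partial object back into the corresponding region using the Android graphics classes. The PartialObject type and the method names are hypothetical.

import android.graphics.Bitmap;
import android.graphics.Canvas;

/**
 * Hypothetical sketch of an area-based merge: each edited partial object is
 * drawn back into its (X, Y, W, H) region of the original object.
 */
public final class ObjectMerger {

    /** A partial object and the region of the original it covers. */
    public static class PartialObject {
        public Bitmap bitmap;  // edited partial object received from a terminal
        public int x, y;       // start position within the original object
    }

    /** Merge the partial objects into a copy of the original object. */
    public static Bitmap mergeByArea(Bitmap original, PartialObject... parts) {
        Bitmap merged = original.copy(Bitmap.Config.ARGB_8888, /* mutable */ true);
        Canvas canvas = new Canvas(merged);
        for (PartialObject part : parts) {
            canvas.drawBitmap(part.bitmap, part.x, part.y, null);
        }
        return merged;
    }
}

A layer-based merge could instead keep the partial objects as separate layers and composite them in drawing order; either approach is only one possible way of realizing the merge described above.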
Referring to
Referring to
In accordance with various embodiments of the present disclosure, each of electronic devices of various types, such as smart phones or tablet PCs, may predict the work intent of a user who uses another electronic device by displaying an indicator which differentiates a surface touch and a hovering touch of the other electronic device while it co-produces an object with the other electronic device through two-way communication. In accordance with various embodiments of the present disclosure, each of the electronic devices may improve the efficiency of an object co-production by predicting the work intent of a user who uses another electronic device and generating an alarm, a warning, and/or the like when the work intent is improper.
Methods according to the claims of the present disclosure or the various embodiments described in the specification of the present disclosure may be implemented in hardware, in software, or in a combination of hardware and software.
When the method is implemented by the software, a non-transitory computer-readable storage medium for storing one or more programs (software modules) may be provided. The one or more programs stored in the non-transitory computer-readable storage medium are configured for being executed by one or more processors in an electronic device.
The one or more programs include instructions for allowing an electronic device to execute the methods according to the claims of the present disclosure and/or the various embodiments described in the specification of the present disclosure. These programs (software modules, software) may be stored in a Random Access Memory (RAM), a non-volatile memory including a flash memory, a Read Only Memory (ROM), an Electrically Erasable Programmable ROM (EEPROM), a magnetic disc storage device, a Compact Disc-ROM (CD-ROM), a Digital Versatile Disc (DVD) or an optical storage device of a different type, and a magnetic cassette.
The programs may be stored in a memory configured by a combination of some or all such storage devices. In addition, the configured memory may include a plurality of memories. In addition, the programs may be stored in an attachable storage device which may access an electronic device through a communication network such as the Internet, an intranet, a Local Area Network (LAN), a Wide LAN (WLAN), or a Storage Area Network (SAN), or through a communication network configured by a combination of them. This storage device may connect to the electronic device through an external port. In addition, a separate storage device on a communication network may connect to a portable electronic device.
In the above-described detailed embodiments of the present disclosure, elements included in the present disclosure were expressed as a single element or a plurality of elements according to the detailed embodiments of the present disclosure. However, the single or plural expression is selected to be suitable for conditions given for convenience of description. The present disclosure is not limited to the single element or the plurality of elements. Although there are elements expressed as a plurality of elements, they may be composed of a single element. Alternatively, although there is an element expressed as a single element, the element may be composed of a plurality of elements.
While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.