1. Field of the Invention
The present invention relates generally to an electronic device and a method of controlling the same.
2. Description of the Related Art
As various kinds of electronic devices, such as smartphones, tablet personal computers (PCs), notebook computers, wearable devices, etc., have come into practical use, various kinds of content for these devices have become available. For example, the electronic devices may reproduce various kinds of content, such as photographs, videos, e-books, e-mails, etc. As specifications of the electronic devices are enhanced and storage space increases, the number, size, length, etc. of content available to users is increasing. For example, a user may view hundreds to thousands of photographs, tens of videos, a number of e-books, etc. by using a smartphone. However, as the number, length, etc. of content increases, it becomes difficult for a user to search for desired content or a desired portion of content.
The present invention has been made to address at least the problems and disadvantages described above, and to provide at least the advantages described below.
Accordingly, an aspect of the present invention is to enable a user to easily change displayed content objects when a plurality of content objects are being displayed.
Another aspect of the present invention is to decrease the number of manipulations performed by a user when the user changes the content objects to be displayed.
In accordance with an aspect of the present invention, an electronic device is provided. The electronic device includes a photographing unit configured to photograph a hand including fingers, a display unit configured to display a plurality of content objects, and a control unit configured to recognize a finger gesture of the photographed hand and a distance from the electronic device to the fingers and control the display unit to change and display a range of a displayed content object according to the recognized finger gesture and the distance.
In accordance with another aspect of the present invention, an electronic device control method is provided. The method includes displaying a plurality of content objects, photographing a hand including fingers, recognizing a finger gesture of the photographed hand and a distance from the electronic device to the fingers, and changing a range of a displayed content object according to the recognized finger gesture and the distance.
The above and other aspects, features, and advantages of the present invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the embodiments of the present invention may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the embodiments of the present invention are merely described below, by referring to the figures, to explain the various aspects of the present invention. Therefore, the embodiments of the present invention described herein are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those of ordinary skill in the art.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
Terms used herein have been selected as general terms which are widely used at present, in consideration of the functions of the present invention. Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by those of skill in the art to which the present invention pertains. Such terms as those defined in a generally used dictionary are to be interpreted to have the same meanings as the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings, unless clearly defined herein.
When it is described that an element comprises (or includes or has) some other elements, it should be understood that the element may comprise (or include or have) only those other elements, or may comprise (or include or have) additional elements as well as those other elements if there is no specific limitation.
The term “module”, as used herein, means, but is not limited to, a software or hardware component, such as a field programmable gate array (FPGA) or an application specific integrated circuit (ASIC), which performs certain tasks. A module may advantageously be configured to reside in the addressable storage medium and configured to execute on one or more processors. Thus, a module may include, by way of example, components, such as software components, object-oriented software components, class components, task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided for in the components and modules may be combined into fewer components and modules or further separated into additional components and modules.
Various embodiments of the present invention will now be described in detail with reference to the accompanying drawings. In addition, descriptions of well-known functions and constructions are omitted for clarity.
The electronic device 100 may be implemented as, for example, various kinds of devices, such as a smartphone, a tablet personal computer (PC), a television (TV), a wearable device, a notebook computer, an e-book terminal, a portable phone, etc.
The content object is an object representing certain content. The content object may be an object where corresponding content is reproduced when the object is selected. For example, the content object may include a thumbnail image corresponding to a still image or a moving image, an application execution icon, an object representing an e-mail, a music file icon, a contact number, etc. Alternatively, the content object may be a unit of reproduction with respect to certain content. For example, the content object may include a video frame, a table of contents or pages of e-books, a date or a schedule of a calendar function, a notice of a social network service (SNS), etc.
Changing a range of a displayed content object refers to sequentially changing the range of content objects displayed on a screen. For example, content objects displayed on a screen may be changed in the form of a scroll or the like.
The photographing unit 210 photographs a subject. The photographing unit 210 may include a lens, an aperture, a shutter, and an imaging device. Additionally, the electronic device may include a plurality of photographing units.
The lens may include a plurality of lens groups and a plurality of lenses. A position of the lens may be adjusted by a lens driver of the photographing unit 210, which adjusts the position of the lens to adjust a focal distance or to compensate for hand shake.
An opening/closing degree of the aperture is adjusted by an aperture driver of the photographing unit 210 to control the amount of light incident on the imaging device. The aperture driver adjusts the aperture to adjust the depth of field of a captured image.
An optical signal passing through the lens and the aperture is transferred to a light receiving surface of the imaging device to generate an image of a subject. The imaging device may be a charge coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor that converts an optical signal into an electrical signal. The sensitivity and the like of the imaging device are adjusted by an imaging device controller of the photographing unit 210. The imaging device controller controls the imaging device according to a control signal, which may be automatically generated according to an image signal input in real time or manually input through manipulation by a user.
An exposure time of the imaging device is adjusted by using the shutter. The shutter may be categorized into a mechanical shutter, which moves a shade to adjust the amount of incident light, and an electronic shutter, which supplies an electrical signal to the imaging device to control exposure.
The photographing units 210a and 210b may be disposed on the same surface as the display unit 230 or on a different surface. In this case, the user may select which of the photographing units 210a and 210b is to be used for photographing a hand including the fingers.
An operation of the photographing unit 210 will now be described.
The photographing unit 210 may photograph a user's hand including the fingers. The photographing unit 210 may photograph various parts of the user's hand. The photographing unit 210 may perform photographing according to a current mode or a user input.
When an input for requesting photographing of a hand is received from a user while a certain function (for example, a photograph album, video reproduction, etc.) of displaying a plurality of content objects is being executed, the photographing unit 210 continuously photographs a hand including one or more fingers. The photographing unit 210 may continuously photograph the fingers at a certain frame rate. For example, the photographing unit 210 may photograph the fingers at a frame rate of 30 frames/sec, 60 frames/sec, or the like.
Alternatively, when an input for requesting photographing of a hand is received from a user while a certain function of displaying a plurality of content objects is being executed, the photographing unit 210 may photograph a hand including one or more fingers at least once. When a finger gesture of the hand is photographed, the control unit 220 activates a sensor (for example, an infrared (IR) sensor, a proximity sensor, a depth camera, etc.) for measuring a distance from the photographing unit 210 to the one or more fingers, and measures the distance to the one or more recognized fingers by using the sensor.
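A minimal sketch of this capture flow, assuming hypothetical `camera`, `gesture_recognizer`, `distance_sensor`, and `controller` components standing in for the photographing unit 210, the control unit 220, and an IR/proximity/depth sensor (none of these names come from the source):

```python
import time

FRAME_RATE = 30  # frames/sec; 60 frames/sec is equally plausible per the text

def capture_loop(camera, gesture_recognizer, distance_sensor, controller):
    """Photograph the hand frame by frame; once a finger gesture is
    recognized, activate the distance sensor and feed (gesture, distance)
    pairs to the controller."""
    sensor_active = False
    while controller.photographing_requested():
        frame = camera.capture()                    # one frame of the hand
        gesture = gesture_recognizer.recognize(frame)
        if gesture is not None and not sensor_active:
            distance_sensor.activate()              # powered on only on demand
            sensor_active = True
        if sensor_active:
            distance = distance_sensor.read()       # distance to the fingers
            controller.update(gesture, distance)
        time.sleep(1.0 / FRAME_RATE)                # approximate the frame rate
```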
A command to photograph a finger may be received by the photographing unit 210 through a key input. In this case, when a key input is received while a certain function of displaying a plurality of content objects is being executed, the photographing unit 210 begins to photograph the finger. For example, when a certain key of the electronic device 100 is pressed, photographing of the finger begins, and when another key input is applied to the electronic device 100, photographing of the finger ends. As another example, photographing of a finger may be performed while a certain key of the electronic device 100 is pressed, and when the certain key is released, photographing of the finger ends.
A command to end photographing of a finger may also be received by the photographing unit 210 according to a user input. The user input may be, for example, a touch input, a key input, etc. applied through a user interface of the electronic device 100, or a certain finger gesture detected from a captured image. For example, when a finger gesture corresponding to a fist shape is detected from a captured image, the photographing unit 210 may end photographing.
The photographing unit 210 may further include a depth camera for measuring a distance to a subject. In this case, the photographing unit 210 includes the depth camera and an imaging camera.
The control unit 220 recognizes, from an image captured by the photographing unit 210, a finger gesture and a distance from the electronic device 100 to a finger and controls the display unit 230 to change and display a range of a displayed content object, based on the finger gesture and the distance.
Additionally, information related to each of the finger gestures is stored in the electronic device 100. When a user defines a finger gesture, the user may input information related to the finger gesture to the electronic device 100. For example, the user may make a finger gesture which is to be newly defined, photograph the finger gesture with the electronic device 100, and input information related to the finger gesture to the electronic device 100.
A distance from the electronic device 100 to a finger may be measured by various kinds of sensors. The electronic device 100 may include an IR sensor, a proximity sensor, etc. In this case, the control unit 220 measures the distance from the electronic device 100 to the finger by using a sensing value of a sensor.
Alternatively or additionally, the electronic device 100 may include a depth camera. In this case, the control unit 220 measures the distance from the electronic device 100 to the finger by using the depth camera.
Alternatively or additionally, the control unit 220 may measure the distance from the electronic device 100 to the finger by using auto-focusing (AF) information of the photographing unit 210. In this case, the control unit 220 measures the distance from the electronic device 100 to the finger by using information including a focus evaluation value, a focus distance, etc.
Alternatively or additionally, the control unit 220 may measure the distance from the electronic device 100 to the finger, based on a change in a size of a finger gesture in a captured image.
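The last approach, estimating distance from the change in apparent size, can be sketched with the pinhole-camera relation; the function below is illustrative, and `real_width_cm` and `focal_length_px` would have to be calibrated in practice:

```python
def distance_from_apparent_size(pixel_width, real_width_cm, focal_length_px):
    """Pinhole-camera estimate: distance = real width x focal length / apparent width.
    `pixel_width` is the width of the finger or hand in the captured image."""
    if pixel_width <= 0:
        raise ValueError("apparent width must be positive")
    return real_width_cm * focal_length_px / pixel_width

# Example: a hand 8 cm wide that spans 160 px under a 1000 px focal length
# is estimated at 8 * 1000 / 160 = 50 cm from the camera.
```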
The display unit 230 displays a plurality of content objects. The display unit 230 may be implemented as, for example, a touch screen. Also, the display unit 230 may be implemented as, for example, a liquid crystal display (LCD), an organic light-emitting display, an electrophoretic display, or the like.
The electronic device 100 changes the displayed content objects based on a finger gesture.
The electronic device 100 switches the unit for changing the displayed content objects according to the recognized finger gesture and the change in the distance to the finger. For example, while a plurality of content objects, such as thumbnail images corresponding to image data, are being displayed, when the distance to the finger is changed while a first finger gesture is maintained, the electronic device 100 may change the displayed thumbnail images by a first unit, such as a year. When the distance to the finger is changed while a second finger gesture is maintained, the electronic device 100 may change the displayed thumbnail images by a second unit, such as a month. When the distance to the finger is changed while a third finger gesture is maintained, the electronic device 100 may change the displayed thumbnail images by a third unit, such as a day.
A ‘unit for changing a content object’ refers to the increment by which a displayed content object is advanced or moved back whenever the electronic device 100 detects that the distance to the finger has changed by a predefined unit length. For example, the displayed content object may be switched by one unit whenever the distance to the finger changes by 3 cm.
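The gesture-to-unit mapping and the unit length can be combined into one rule: every full unit length of distance change moves the displayed range by one unit. The sketch below uses illustrative gesture names, units, and a 3 cm unit length; the sign convention (closer means earlier items) follows the description of scroll direction later in this document:

```python
GESTURE_UNITS = {"first": "year", "second": "month", "third": "day"}
UNIT_LENGTH_CM = 3.0  # illustrative; the text also mentions 5 cm and 1 cm

def scroll_steps(gesture, start_distance_cm, end_distance_cm):
    """Return (unit, steps): the unit of change selected by the gesture and
    the number of unit-length increments covered by the distance change.
    Negative steps mean the finger moved closer (scroll to earlier items)."""
    unit = GESTURE_UNITS.get(gesture)
    if unit is None:
        return None, 0  # an unrecognized gesture changes nothing
    delta = end_distance_cm - start_distance_cm
    return unit, int(delta / UNIT_LENGTH_CM)
```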
In step S802, the electronic device 100 displays a plurality of content objects while executing a function or a mode of displaying a plurality of content objects. For example, the electronic device 100 displays a plurality of thumbnail images while performing a photograph album function.
In step S804, the electronic device 100 photographs a user's hand including fingers. For example, a finger may be automatically photographed depending on a state of the electronic device 100, or may be photographed according to a user input. The electronic device 100 may continuously photograph a finger at a certain frame rate. Alternatively, the electronic device 100 photographs a finger a predetermined number of times according to a user input.
In step S806, the electronic device 100 recognizes a finger gesture from a captured image and measures a distance from the electronic device 100 to the finger. As described above, the distance to the finger may be measured with an IR sensor, a proximity sensor, a depth camera, or AF information.
In step S808, the electronic device 100 changes a range of each of the displayed content objects, based on the recognized finger gesture and distance.
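Steps S802 to S808 can be summarized in a short control-flow sketch; `device` is a hypothetical facade over the display unit, photographing unit, and control unit, and all method names are assumptions:

```python
def control_method(device):
    """One pass through steps S802-S808 of the control method."""
    device.display_content_objects()                  # S802: display objects
    image = device.photograph_hand()                  # S804: photograph the hand
    gesture = device.recognize_gesture(image)         # S806: recognize gesture
    distance = device.measure_finger_distance(image)  # S806: measure distance
    if gesture is not None:                           # S808: change the range
        device.change_displayed_range(gesture, distance)
```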
At least one of, or a combination of, the number of content objects displayed on one screen and the layout in which the content objects are represented may be changed according to a recognized finger gesture. For example, when the first finger gesture is recognized, a plurality of thumbnail images may be displayed in a layout corresponding to the first finger gesture.
A unit length is a reference distance for changing a plurality of displayed content objects, and is changed according to a recognized finger gesture. For example, the unit length may be 5 cm for the first finger gesture, 3 cm for the second finger gesture, and 1 cm for the third finger gesture. Also, as the interval by which the range of the displayed content object is changed for a given finger gesture increases, the unit length may increase, and as that interval decreases, the unit length may decrease.
The photographing unit 210 may continuously capture a hand image including a finger at a certain frame rate, and whenever a captured image is generated, the control unit 220 determines whether the recognized finger gesture is maintained. The control unit 220 changes the range of the displayed content object when the distance to the finger is changed while the recognized finger gesture is maintained. When the finger gesture is changed, the control unit 220 recognizes the changed finger gesture and changes the range of the displayed content object as the distance to the finger changes, by the unit for changing a content object corresponding to the changed finger gesture. When the recognized finger gesture is not a predefined finger gesture, the control unit 220 does not change the displayed content object even if the distance to the finger is changed.
The control unit 220 increases or decreases the order of a displayed content object according to the direction in which the distance to the finger is changed. For example, when a plurality of thumbnail images are arranged with respect to photographed dates, a user may make a certain finger gesture and change the distance to the finger. In this case, when the distance to the finger is reduced, thumbnail images of images captured before the currently displayed thumbnail images are displayed, and when the distance to the finger increases, thumbnail images of images captured after the currently displayed thumbnail images are displayed.
Each of the e-mail objects 1210 is an object where the text of an e-mail is displayed when the corresponding object is selected. Each of the e-mail objects 1210 may be displayed in the form of a title of the e-mail, in the form of an icon corresponding to the e-mail, etc.
Each of the e-mail objects 1210 may include attributes such as a title, a received date, a sender, a mail text, a size, etc. When the e-mail objects 1210 are displayed by the display unit 230, the e-mail objects 1210 may be arranged with respect to one of the attributes. For example, arranging and displaying the e-mail objects 1210 with respect to the received date may be the default. As another example, the e-mail objects 1210 may be arranged based on attributes such as the title, the sender, the size, and/or the like, according to a selection by the user.
The control unit 220 determines the unit of change for the displayed e-mail objects 1210, based on the recognized finger gesture and the attribute by which the e-mail objects 1210 are currently arranged. For example, when the e-mail objects 1210 are arranged with respect to the received date, the control unit 220 determines the unit of change as a year, a month, a day, etc. When the e-mail objects 1210 are arranged with respect to the sender, the control unit 220 determines the unit of change as a consonant unit, a person unit, an individual mail unit, etc. The control unit 220 then changes the displayed e-mail objects 1210 by the determined unit as the distance to the finger changes.
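One way to read this is as a lookup from the current sort attribute to an ordered list of units, with the recognized gesture picking how coarse the unit is. A sketch under that assumption (attribute and unit names are illustrative):

```python
# Units of change per sort attribute, coarsest first.
CHANGE_UNITS_BY_SORT_KEY = {
    "received_date": ["year", "month", "day"],
    "sender": ["consonant", "person", "mail"],
}

def unit_of_change(sort_key, gesture_index):
    """Select a unit of change: gesture_index 0 picks the coarsest unit
    (e.g. year), higher indices pick progressively finer units."""
    units = CHANGE_UNITS_BY_SORT_KEY[sort_key]
    return units[min(gesture_index, len(units) - 1)]
```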
The control unit 220 displays, on the display unit 230, a cover 1220 representing the range of the currently displayed content objects, for guiding the user as the range of the displayed content objects is changed. The cover 1220 may also include information about the unit by which the range of the displayed content objects is changed in correspondence with the recognized finger gesture. When the recognized finger gesture is changed, the control unit 220 changes the cover 1220 according to the changed finger gesture, and as the distance to the finger is changed, the control unit 220 changes the cover 1220 to correspond to the range of the displayed content objects.
The e-book content object includes a book cover object 1510, a table-of-contents object 1610, and an e-book page object 1710.
The control unit 220 changes the displayed book cover objects 1510 according to the distance to the finger, based on the recognized finger gesture and the attribute by which the book cover objects 1510 are currently arranged. For example, when the book cover objects 1510 are arranged with respect to purchase dates, the control unit 220 changes the displayed book cover objects 1510 in the order of purchase dates according to the distance to the finger, and when the book cover objects 1510 are arranged with respect to book titles, the control unit 220 changes the displayed book cover objects 1510 in the order of book titles according to the distance to the finger.
The video content object may include a video file folder object 1810, a video file object 1910, and a video frame object 2010.
The video file folder object 1810 is a storage space for storing a video file. The video file folder object 1810 including a plurality of video files may be selected based on a user input.
The video file folder object 1810 stores video files classified based on attributes of the video files. For example, when a video file is a part of a series, the video file may have attributes related to the series, such as genre, season, etc. The video files may be classified by series and stored in the video file folder object 1810. In this case, the video file folder object 1810 may have attributes such as genre, season, etc. and may include video files having corresponding attributes.
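Classifying video files into folder objects by their series-related attributes could look like the following sketch, where video files are plain dictionaries and the attribute names are assumptions:

```python
from collections import defaultdict

def group_into_folders(video_files):
    """Group video files into folder objects keyed by (genre, season)."""
    folders = defaultdict(list)
    for video in video_files:
        key = (video.get("genre"), video.get("season"))
        folders[key].append(video)
    return dict(folders)

# e.g. group_into_folders([{"genre": "drama", "season": 1, "title": "ep1"}])
# yields {("drama", 1): [{...}]}: one folder object per series and season.
```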
The content object may be an object of a calendar function, and a displayed calendar object may be changed by a year unit, a month unit, or a day unit according to the finger gesture and the distance to the finger.
The content object may be an object of an SNS, and a displayed SNS notice may be changed by a year unit, a month unit, or a day unit according to the finger gesture and the distance to the finger.
The content object may be an object of a map, and an area of a displayed map may be changed by a mile unit, a yard unit, a foot unit, etc. according to the finger gesture and the distance to the finger.
The content object may be a music content object, and a displayed or selected music content object may be changed by an album unit, a musician unit, a track number unit, etc. according to the finger gesture and the distance to the finger.
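These per-content-type units can be consolidated into a single table keyed by content type, with the gesture again selecting the granularity; all entries below are illustrative:

```python
UNITS_BY_CONTENT_TYPE = {
    "calendar": ["year", "month", "day"],
    "sns_notice": ["year", "month", "day"],
    "map": ["mile", "yard", "foot"],
    "music": ["album", "musician", "track"],
}

def unit_for(content_type, gesture_index):
    """Coarse-to-fine selection: gesture_index 0 picks the coarsest unit."""
    units = UNITS_BY_CONTENT_TYPE[content_type]
    return units[min(gesture_index, len(units) - 1)]
```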
When a user changes the distance to a finger while maintaining a second finger gesture 2110, the electronic device 100 changes the range of the displayed content object according to the change in the distance.
Changing of the range of the displayed content object may then be terminated, and when the electronic device 100 subsequently recognizes a third finger gesture 2130 in a captured image, the range of the displayed content object may again be changed according to the distance to the finger, as shown in section 3.
When changing of the range of the displayed content object is terminated, the electronic device 100 stops photographing the hand including the finger with the photographing unit 210. Subsequently, when a user input requesting photographing of the hand is received, the electronic device 100 may start to photograph the hand including the finger again, recognize a finger gesture in the captured image, as shown in section 3, and change the range of the displayed content object according to the distance to the finger.
The user may make a fourth finger gesture to terminate changing of the range of the displayed content object, and may then change the range of the displayed or selected content object by applying a touch input, a key input, etc. to the electronic device 100.
When a fifth finger gesture 2210 is recognized, the electronic device 100 continuously changes the range of the displayed content object, even if the distance to the finger is not changed, until a signal requesting termination of the change is received. For example, when the fifth finger gesture 2210 is recognized, the electronic device 100 may continuously scroll a plurality of displayed thumbnail images even while the distance to the finger remains unchanged.
Alternatively, once the fifth finger gesture 2210 is recognized, the electronic device 100 continuously changes the range of the displayed content object until a signal for terminating the change is received, even if the fifth finger gesture 2210 is no longer recognized. The signal for terminating the change may be input in the form of a touch input, a key input, an image input including a finger gesture, or the like.
Alternatively, the electronic device 100 may continuously change the range of the displayed content object only while the fifth finger gesture 2210 is being recognized, and terminate the change when the fifth finger gesture 2210 is no longer recognized.
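The continuous-scroll behavior around the fifth finger gesture can be sketched as a polling loop with two exit conditions, covering both variants described above (stop on an explicit termination signal, or stop as soon as the gesture is no longer recognized); all helper names are hypothetical:

```python
import time

def auto_scroll(device, direction, unit, poll_interval_sec=0.1):
    """Keep shifting the displayed range one unit per tick, reusing the
    most recently used unit of change and scroll direction."""
    while True:
        if device.termination_signal_received():   # touch, key, or gesture input
            break
        if not device.fifth_gesture_recognized():  # variant: stop when gesture lost
            break
        device.shift_displayed_range(direction, unit)
        time.sleep(poll_interval_sec)
```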
The unit of change and the scroll direction used when the fifth finger gesture 2210 is recognized may be determined from the unit of change and the scroll direction by which the range of the displayed content object was most recently changed.
If changing of the range of the displayed content object is terminated, the electronic device 100 recognizes a predefined finger gesture, as shown in section 3, and changes the range of the displayed content object according to the finger gesture and the distance to the finger.
A finger gesture may be photographed first, and then a unit for changing a displayed content object corresponding to the finger gesture may be selected.
In the finger gesture definition function, the user selects the kind of content or the function of the electronic device 100 to which the finger gesture is to be applied. For example, the user may select whether to apply a finger gesture to a photograph album function or an e-book function.
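The gesture definition function could be sketched as the registration flow below; the `device` methods and the `gesture_store` mapping are assumptions, and the gesture signature is assumed to be a hashable descriptor extracted from the captured image:

```python
def define_gesture(device, gesture_store):
    """Photograph a new finger gesture, then bind it to a unit of change
    and a target function of the electronic device."""
    image = device.photograph_hand()              # user makes the new gesture
    signature = device.extract_gesture(image)     # hashable shape descriptor
    unit = device.ask_user("unit of change")      # e.g. "month"
    target = device.ask_user("target function")   # e.g. "photo album", "e-book"
    gesture_store[signature] = {"unit": unit, "function": target}
```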
When the distance to the finger is in the first range, the displayed thumbnail images are scrolled in a first direction, that is, toward the previously captured images.
When the distance is in the second range, the first direction and the second direction are determined by the direction in which the distance to the finger is changed. For example, when the distance to the finger is reduced in the second range, the electronic device 100 may scroll the displayed thumbnail images toward the previously captured images, and when the distance to the finger increases, the electronic device 100 may scroll the displayed thumbnail images toward the recently captured images.
When the distance is in the third range, the displayed thumbnail images are scrolled in a second direction, that is, toward the recently captured images.
The first, second, and third ranges may be defined in various manners, such as by using an absolute distance from the electronic device 100 to a finger, a size of a recognized finger in a captured image, etc.
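A sketch of the range-based direction rule, using illustrative absolute-distance thresholds for the three ranges (the ranges could equally be defined from the recognized finger size, as noted above):

```python
def scroll_direction(distance_cm, prev_distance_cm,
                     near_limit_cm=10.0, far_limit_cm=30.0):
    """First (near) range: always scroll toward older items; third (far)
    range: always toward newer items; second (middle) range: follow the
    sign of the distance change."""
    if distance_cm < near_limit_cm:
        return "older"   # first range: previously captured images
    if distance_cm > far_limit_cm:
        return "newer"   # third range: recently captured images
    return "older" if distance_cm < prev_distance_cm else "newer"
```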
The electronic device 100a includes a display unit 110, a memory 120, a GPS chip 125, a communication unit 130, a video processor 135, an audio processor 140, a user input unit 145, a microphone unit 150, a photographing unit 155, a speaker unit 160, a motion detection unit 165, and a control unit 170.
The display unit 110 includes a display panel 111 and a controller that controls the display panel 111. The display panel 111 may be implemented as various types of displays such as an LCD, an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AM-OLED), a plasma display panel (PDP), etc. The display panel 111 may be implemented to be flexible, transparent, or wearable. The display unit 110 may be combined with a touch panel 147 included in the user input unit 145 and, thus, may be provided as a touch screen. For example, the touch screen may include an integrated module where the display panel 111 and the touch panel 147 are combined with each other in a stacked structure.
The memory 120 includes at least one of an internal memory and an external memory.
The internal memory may include at least one of a volatile memory (for example, a dynamic random access memory (DRAM), a static random access memory (SRAM), a synchronous dynamic random access memory (SDRAM), etc.), a nonvolatile memory (for example, a one time programmable read-only memory (OTPROM), a programmable read-only memory (PROM), an erasable and programmable read-only memory (EPROM), an electrically erasable and programmable read-only memory (EEPROM), a mask read-only memory (MROM), a flash read-only memory (FROM), etc.), a hard disk drive (HDD), and a solid state drive (SSD). The control unit 170 loads a command or data received from the nonvolatile memory or another element into the volatile memory and processes the command or data. Also, the control unit 170 stores data received from or generated by another element in the nonvolatile memory.
The external memory includes at least one of compact flash (CF), secure digital (SD), micro secure digital (micro-SD), mini secure digital (mini-SD), extreme digital (xD), a memory stick, etc.
The memory 120 stores various programs and data used to operate the electronic device 100a. For example, at least a portion of content to be displayed on a lock screen may be temporarily or semi-permanently stored in the memory 120.
The control unit 170 controls the display unit 110 to display a portion of the content stored in the memory 120. Additionally, when a user gesture is applied through one region of the display unit 110, the control unit 170 may perform a control operation corresponding to the user gesture.
The control unit 170 includes at least one of a RAM 171, a ROM 172, a central processing unit (CPU) 173, a graphics processing unit (GPU) 174, and a bus 175. The RAM 171, the ROM 172, the CPU 173, and the GPU 174 are connected to one another through the bus 175.
The CPU 173 accesses the memory 120 to perform booting by using an operating system (OS) stored in the memory 120. Furthermore, the CPU 173 may perform various operations by using various programs, content, data, and/or the like stored in the memory 120.
A command set and/or the like for system booting may be stored in the ROM 172. For example, when a turn-on command is input and power is supplied to the electronic device 100a, the CPU 173 copies the OS, stored in the memory 120, to the RAM 171 and executes the OS to boot the system according to a command stored in the ROM 172. When the booting is completed, the CPU 173 copies various programs, stored in the memory 120, to the RAM 171 and executes the copied programs to perform various operations. When booting of the electronic device 100a is completed, the GPU 174 displays a user interface (UI) screen on a region of the display unit 110. In detail, the GPU 174 generates a screen that displays an electronic document including various objects, such as content, an icon, a menu, etc. The GPU 174 calculates attribute values, such as a form, a size, a color, and the coordinates at which each object is to be displayed, based on the layout of the screen, and generates screens of various layouts including the objects, based on the calculated attribute values. A screen generated by the GPU 174 is provided to the display unit 110 and is displayed on a region of the display unit 110.
The GPS chip 125 may receive a GPS signal from a GPS satellite to calculate a current position of the electronic device 100a. When a navigation program is used or a current position of a user is necessary, the control unit 170 may calculate the user's position by using the GPS chip 125.
The communication unit 130 communicates with various types of external devices according to various types of communication schemes. The communication unit 130 includes at least one of a Wi-Fi chip 131, a Bluetooth chip 132, a wireless communication chip 133, and a near field communication (NFC) chip 134. The control unit 170 communicates with various external devices by using the communication unit 130.
The Wi-Fi chip 131 and the Bluetooth chip 132 perform communication in a Wi-Fi scheme and a Bluetooth scheme, respectively. In the case of using the Wi-Fi chip 131 or the Bluetooth chip 132, various pieces of connection information, such as a service set identifier (SSID), a session key, etc., are first transmitted or received, a communication connection is established by using the connection information, and various pieces of information are then transmitted or received.
The wireless communication chip 133 refers to a chip that performs communication according to various communication standards, such as IEEE, ZigBee, 3rd generation (3G), 3rd generation partnership project (3GPP), long term evolution (LTE), etc.
The NFC chip 134 refers to a chip that operates in an NFC scheme using a band of 13.56 MHz among various radio frequency-identification (RF-ID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860 to 960 MHz, 2.45 GHz, etc.
The video processor 135 processes video data included in content received through the communication unit 130 or in content stored in the memory 120. The video processor 135 performs various image processing functions, such as decoding, scaling, noise filtering, frame rate conversion, resolution conversion, and/or the like, for video data.
The audio processor 140 processes audio data included in the content received through the communication unit 130 or in the content stored in the memory 120. The audio processor 140 performs various processing such as decoding, amplification, noise filtering, and/or the like for the audio data.
When a reproduction program for multimedia content is executed, the control unit 170 drives the video processor 135 and the audio processor 140 to reproduce corresponding content.
The speaker unit 160 may output the audio data generated by the audio processor 140.
The user input unit 145 receives various commands from a user. The user input unit 145 includes at least one of a key 146, a touch panel 147, and a pen recognition panel 148.
The key 146 includes various types of keys such as a mechanical button, a wheel, etc. disposed in various regions such as a front part, a side part, a rear part, etc. of a body of the electronic device 100a.
The touch panel 147 senses a touch input of the user and outputs a touch event value corresponding to the sensed touch input. When the touch panel 147 is combined with the display panel 111 to configure a touch screen, the touch screen may be implemented with various types of touch sensors, such as a capacitive touch sensor, a pressure sensitive touch sensor, a piezoelectric touch sensor, etc. A capacitive touch sensor uses a dielectric coated on the surface of the touch screen to sense the minute electricity induced in the user's body when a part of the body touches the surface, and calculates touch coordinates from the sensed electricity. A pressure sensitive touch sensor uses two electrode plates (an upper plate and a lower plate) built into the touch screen to sense a current generated by contact between the plates at a touched position, and calculates touch coordinates from the sensed current. A touch event on a touch screen is generally generated by a person's finger, but may also be generated by an object of a conductive material that changes a capacitance.
The pen recognition panel 148 senses a pen proximity input or a pen touch input applied by a touch pen (for example, a stylus pen), a digitizer pen, etc., and outputs a sensed pen proximity event or pen touch event. The pen recognition panel 148 may be implemented in, for example, an electromagnetic resonance (EMR) type, and senses a touch or proximity input based on a change in the intensity of an electromagnetic field caused by the proximity or touch of a pen. In detail, the pen recognition panel 148 includes an electromagnetic induction coil sensor having a grid structure and an electronic signal processing unit that sequentially supplies an alternating current (AC) signal having a certain frequency to loop coils of the electromagnetic induction coil sensor. When a pen with a built-in resonance circuit is located near a loop coil of the pen recognition panel 148, the magnetic field transmitted from the loop coil generates a current in the resonance circuit of the pen, based on mutual electromagnetic induction. An induced magnetic field is generated from a coil of the resonance circuit of the pen, based on the current, and the pen recognition panel 148 detects the induced magnetic field in the loop coil which is in a signal-receiving state, thereby sensing the proximity position or touch position of the pen. The pen recognition panel 148 may be provided to have a certain area (for example, an area covering a display area of the display panel 111) at a lower portion of the display panel 111.
The microphone unit 150 receives a user's voice or other sounds and converts the received voice or sound into audio data. The control unit 170 may use the user's voice, input through the microphone unit 150, in a call operation, or may convert the user's voice into audio data and store the audio data in the memory 120.
The photographing unit 155 captures a still image or a moving image according to control by the user. The photographing unit 155 may be provided in plurality, such as a front camera and a rear camera.
When the photographing unit 155 and the microphone unit 150 are provided, the control unit 170 may perform a control operation according to a user's voice input through the microphone unit 150 or a user motion recognized by the photographing unit 155. For example, the electronic device 100a may operate in a motion control mode or a voice control mode. When the electronic device 100a operates in the motion control mode, the control unit 170 activates the photographing unit 155 to photograph the user and traces a change in the user's motion to perform a corresponding control operation. When the electronic device 100a operates in the voice control mode, the control unit 170 analyzes the user's voice input through the microphone unit 150 and performs a control operation according to the analyzed voice.
The motion detection unit 165 senses a movement of the electronic device 100a. The electronic device 100a may be rotated or inclined in various directions, and the motion detection unit 165 senses movement characteristics, such as a rotation direction, a rotated angle, a slope, etc., by using at least one of various sensors, such as a geomagnetic sensor, a gyro sensor, an acceleration sensor, and/or the like.
In addition, the electronic device 100a may further include a universal serial bus (USB) port connectable to a USB connector, various external input ports connectable to various external devices such as a headset, a mouse, a local area network (LAN), etc., a digital multimedia broadcasting (DMB) chip that receives and processes a DMB signal, and/or various sensors.
Names of the above-described elements of the electronic device 100a may vary. Also, the electronic device 100a may be configured with at least one of the above-described elements; some elements may be omitted, or other elements may further be included.
The methods of the present invention may be implemented as computer-readable codes in non-transitory computer-readable recording media. The non-transitory computer-readable recording media include all kinds of recording devices that store data readable by a computer system.
The computer-readable codes may be implemented to perform operations of the electronic device control method according to an embodiment of the present invention when the codes are read from the non-transitory computer-readable recording medium and executed by a processor. The computer-readable codes may be implemented using various programming languages. Functional programs, codes, and code segments for implementing the embodiments may be easily programmed by one of ordinary skill in the art.
Examples of the non-transitory computer-readable recording medium include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, etc. The computer-readable recording medium may also be distributed over network coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
According to the embodiments of the present invention, when a plurality of content objects are being displayed, a user may easily change the displayed content objects. Moreover, when a user changes the content objects to be displayed, the number of manipulations the user must perform is reduced.
It should be understood that the embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments.
While one or more embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims and their equivalents.
This application claims priority under 35 U.S.C. §119(a) to Korean Patent Application No. 10-2014-0152856 filed on Nov. 5, 2014, in the Korean Intellectual Property Office, the entire disclosure of which is incorporated herein by reference.