The present disclosure relates to an image processing apparatus and method.
With the release of smartphones and tablet Personal Computers (PCs) having ultra-high-resolution display modules equivalent to High-Definition Televisions (HDTVs), mobile displays have evolved into Wide Video Graphics Array (WVGA)-class or full-HD-class displays.
In line with this, as a display driving circuit needs to process an increasing amount of data, the amount of electrical current used in the driving circuit also increases. For example, increases in the frame rate and resolution of a flat panel display device increase the amount of data to be processed. As the size of an image increases, the amount of data to be transmitted in image transmission also increases.
The increase in the amount of image data to be transmitted causes the excessive use of memory resources and increases the amount of power consumption.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an image processing apparatus and a method for controlling the image processing apparatus, in which a predetermined image is encoded using a combination of a pentile encoding scheme and a Contrast Dependent Encoding (CDE) scheme, which depends on a contrast of the image, or using the CDE scheme alone, and data regarding the image is transmitted, thereby reducing the amount of data transmission and preventing the unnecessary use of memory resources.
In accordance with an aspect of the present disclosure, an image processing method is provided. The image processing method includes encoding an image according to a first encoding type, encoding the image, which has been encoded according to the first encoding type, according to a second encoding type, and decoding the image, which has been encoded according to the second encoding type, in which the second encoding type is CDE which is dependent on a contrast of the image.
In accordance with another aspect of the present disclosure, an image processing apparatus is provided. The image processing apparatus includes a processor configured to encode an image according to a first encoding type, to encode the image, which has been encoded according to the first encoding type, according to a second encoding type, and to decode the image, which has been encoded according to the second encoding type and a display module configured to display the decoded image, in which the second encoding type is CDE which is dependent on a contrast of the image.
In accordance with another aspect of the present disclosure, an image processing method is provided. The image processing method includes encoding an image according to a predetermined encoding scheme, storing the encoded image in a frame buffer, and decoding the encoded image, in which the predetermined encoding scheme is CDE which depends on a contrast of the image.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein may be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
The term “include” or “may include” used in the various embodiments of the present disclosure indicates the presence of disclosed corresponding functions, operations, elements, and the like, and does not preclude one or more additional functions, operations, elements, and the like. In addition, it should be understood that the term “include” or “has” used in the various embodiments of the present disclosure indicates the presence of features, numbers, operations, elements, parts, or a combination thereof described in the specification, and does not preclude the presence or addition of one or more other features, numbers, operations, elements, parts, or a combination thereof.
The term “or” or “at least one of A or/and B” used in the various embodiments of the present disclosure includes any and all combinations of the associated listed items. For example, the term “A or B” or “at least one of A or/and B” may include A, B, or both A and B.
Although terms such as “first” and “second” used in the various embodiments of the present disclosure may modify various elements of the various embodiments, these terms do not limit the corresponding elements. For example, these terms do not limit the order and/or importance of the corresponding elements. These terms may be used to distinguish one element from another. For example, a first user device and a second user device both indicate user devices and may indicate different user devices. For example, a first element may be named a second element without departing from the scope of the various embodiments of the present disclosure, and similarly, a second element may be named a first element.
It will be understood that when an element is “connected” or “coupled” to another element, the element may be directly connected or coupled to the other element, or an intervening element may be present between them. In contrast, it will be understood that when an element is “directly connected” or “directly coupled” to another element, there is no intervening element between them.
The terms used in the various embodiments of the present disclosure are for the purpose of describing particular embodiments only and are not intended to be limiting.
All of the terms used herein, including technical or scientific terms, have the same meanings as those generally understood by a person of ordinary skill in the related art unless defined otherwise. Terms defined in a commonly used dictionary should be interpreted as having the same meanings as the contextual meanings of the relevant technology and should not be interpreted as having idealized or exaggerated meanings unless clearly defined as such in the various embodiments of the present disclosure.
An electronic device according to various embodiments of the present disclosure may be a device including a fingerprint function or a communication function. For example, the electronic device may be a combination of one or more of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book (e-book) reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a Motion Picture Experts Group (MPEG-1 or MPEG-2) Audio Layer III (MP3) player, mobile medical equipment, an electronic bracelet, an electronic necklace, an electronic appcessory, a camera, a wearable device (e.g., a Head-Mounted Device (HMD) such as electronic glasses), electronic clothing, an electronic tattoo, and a smart watch.
According to various embodiments of the present disclosure, the electronic device may be a smart home appliance having a communication function. The electronic device may include, for example, a Television (TV), a Digital Versatile Disc (DVD) player, audio equipment, a refrigerator, an air conditioner, a vacuum cleaner, an oven, a microwave oven, a laundry machine, an air cleaner, a set-top box, a TV box (e.g., HomeSync™ of Samsung, TV™ of Apple, or TV™ of Google), a game console, an electronic dictionary, an electronic key, a camcorder, and an electronic frame.
According to various embodiments of the present disclosure, the electronic device may include at least one of various medical equipment (e.g., Magnetic Resonance Angiography (MRA), Magnetic Resonance Imaging (MRI), Computed Tomography (CT), an imaging device, or an ultrasonic device), a navigation system, a Global Positioning System (GPS) receiver, an Event Data Recorder (EDR), a Flight Data Recorder (FDR), a vehicle infotainment device, electronic equipment for ships (e.g., a ship navigation system and a gyrocompass), avionics, a security device, a vehicle head unit, an industrial or home robot, an Automated Teller Machine (ATM), and a Point of Sale (POS) terminal.
According to various embodiments of the present disclosure, the electronic device may include a part of furniture or a building/structure having a communication function, an electronic board, an electronic signature receiving device, a projector, and various measuring instruments (e.g., a water, electricity, gas, or electric wave measuring device). The electronic device according to various embodiments of the present disclosure may be one of the above-listed devices or a combination thereof. The electronic device according to various embodiments of the present disclosure may be a flexible device. It will be obvious to those of ordinary skill in the art that the electronic device according to various embodiments of the present disclosure is not limited to the above-listed devices.
Hereinafter, an electronic device according to various embodiments of the present disclosure will be described with reference to the accompanying drawings. Herein, the term “user” used in various embodiments of the present disclosure may refer to a person who uses the electronic device or a device using the electronic device (e.g., an artificial intelligence electronic device).
Referring to
The bus 130 may include a circuit for interconnecting the foregoing components and delivering communication (e.g., a control message) among the components.
The processor 110 may receive a command from the foregoing other components (e.g., the memory 120, the I/O interface 140, the display 150, the communication interface 160, or the image control module 170) through the bus 130, interpret the received command, and execute an operation or data processing according to the interpreted command.
The memory 120 may store commands or data received from the processor 110 or other components (e.g., the I/O interface 140, the display 150, the communication interface 160, and/or the image control module 170) or generated by the processor 110 or other components. The memory 120 may include programming modules, for example, a kernel 121, middleware 122, an Application Programming Interface (API) 123, or an application 124. Each of the foregoing programming modules may be configured with software, firmware, or hardware, or a combination of at least two of them.
The kernel 121 controls or manages system resources (e.g., the bus 130, the processor 110, and/or the memory 120) used to execute an operation or a function implemented in other programs (e.g., the middleware 122, the API 123, or the application 124). The kernel 121 provides an interface through which the middleware 122, the API 123, or the application 124 accesses separate components of the electronic device 101 to control or manage the system resources.
The middleware 122 may work as an intermediary for allowing, for example, the API 123 or the application 124 to exchange data in communication with the kernel 121. In regard to task requests received from the application 124, the middleware 122 performs control (e.g., scheduling or load balancing) with respect to the task requests, for example, by giving at least one of the applications 124 priority for using a system resource (e.g., the bus 130, the processor 110, and/or the memory 120) of the electronic device 101.
The API 123 is an interface used for the application 124 to control a function provided by the kernel 121 or the middleware 122, and may include, for example, at least one interface or function (e.g., a command) for file control, window control, image processing or character control.
According to various embodiments of the present disclosure, the application 124 may include a Short Message Service (SMS)/Multimedia Messaging Service (MMS) application, an e-mail application, a calendar application, an alarm application, a healthcare application (e.g., an application for measuring an exercise volume or a blood sugar level), or an environment information application (e.g., an application for providing air pressure, humidity, or temperature information). Additionally or alternatively, the application 124 may be an application associated with information exchange between the electronic device 101 and an external electronic device 104. The application associated with information exchange may include a notification relay application for relaying particular information to the external electronic device or a device management application for managing the external electronic device.
For example, the notification relay application may include a function of relaying notification information generated in another application (e.g., the SMS/MMS application, the e-mail application, the healthcare management application, or the environment information application) of the electronic device 101 to the external electronic device 104. Additionally or alternatively, the notification relay application may, for example, receive notification information from the external electronic device 104 and provide the notification information to a user. The device management application may manage (e.g., install, delete, and/or update) a function of at least a part of the external electronic device 104 communicating with the electronic device 101 (e.g., turn-on/turn-off of the external electronic device (or a part thereof) or brightness (or resolution) adjustment of the display), an application operating on the external electronic device 104, and/or a service (e.g., a call service or a message service) provided on the external electronic device 104.
According to various embodiments of the present disclosure, the application 124 may include an application designated according to an attribute (e.g., a type) of the external electronic device 104. For example, if the external electronic device 104 is an MP3 player, the application 124 may include an application associated with music playback. Similarly, if the external electronic device 104 is a mobile medical device, the application 124 may include an application associated with healthcare. According to various embodiments of the present disclosure, the application 124 may include at least one of an application designated in the electronic device 101 and an application received from another electronic device (e.g., a server 106 or the external electronic device 104).
The I/O interface 140 delivers a command or data input from a user through an input/output device (e.g., a sensor, a keyboard, or a touch screen) to the processor 110, the memory 120, the communication interface 160, or the image control module 170 through, for example, the bus 130. For example, the I/O interface 140 may provide data corresponding to a user's touch input through the touch screen to the processor 110. The I/O interface 140 may output a command or data, which is received from the processor 110, the memory 120, the communication interface 160, or the image control module 170 through the bus 130, through an I/O device (e.g., a speaker or a display). For example, the I/O interface 140 may output audio data processed through the processor 110 to the user through the speaker.
The display 150 may display various information (e.g., multimedia data, text data, and the like) to users.
The communication interface 160 sets up communication, for example, between the electronic device 101 and an external device (e.g., a first external electronic device 104 or the server 106). For example, the communication interface 160 is connected to a network 162 through wireless or wired communication to communicate with the external device 104.
The wireless communication may use at least one of Wi-Fi, Bluetooth (BT), Near Field Communication (NFC), a GPS, or cellular communication (e.g., Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), a Universal Mobile Telecommunication System (UMTS), Wireless Broadband (WiBro), or Global System for Mobile Communications (GSM)). The wired communication may include, for example, at least one of a universal serial bus (USB), a High Definition Multimedia Interface (HDMI), a Recommended Standard (RS)-232, and a Plain Old Telephone Service (POTS).
According to an embodiment of the present disclosure, the network 162 may be a telecommunications network. The communication network may include at least one of a computer network, the Internet, the Internet of things, and a telephone network. According to an embodiment of the present disclosure, a protocol (e.g., a transport layer protocol, a data link layer protocol, or a physical layer protocol) for communication between the electronic device 101 and an external electronic device may be supported in at least one of the application 124, the API 123, the middleware 122, the kernel 121, and the communication interface 160.
The image control module 170 may re-encode an image, which has been encoded using an encoding scheme for display on a display module including a pentile structure (herein, this encoding scheme will be referred to as “pentile encoding” or a “first encoding type” when necessary), using Contrast Dependent Encoding (CDE) (which will be referred to as a “second encoding type” when necessary). With such encoding operations, the size of the image may be reduced to about 33.3% of its original size. At least one function or at least one operation performed by the image control module 170 may be set to be performed by, for example, the processor 110 according to various embodiments of the present disclosure. A detailed description of the image control module 170 will be provided below.
Referring to
The first encoding module 202 may be configured to encode an image according to the first encoding type. When the first encoding type (pentile encoding) is used, the image may be encoded to have a size of about 66.6% of its original size before encoding. For reference, the pentile structure may mean a structure in which a plurality of sub pixels are arranged in a Red/Green/Blue/Green (RGBG) and/or Red/Green/Blue/White (RGBW) structure. The meaning of the pentile structure may be easily understood by those of ordinary skill in the art, and thus will not be described in detail.
The second encoding module 204 may be set to encode the image according to the second encoding type. When the image is encoded according to the second encoding type (CDE), the image may be encoded to have a size of about 50% of its size before CDE is applied. Thus, the image processing apparatus according to various embodiments of the present disclosure may encode an image to about 33.3% of its original size by using a combination of the first encoding module 202 set to perform pentile encoding and the second encoding module 204 set to perform CDE. As a result, in image transmission, it is possible to reduce the amount of data transmission, to prevent unnecessary use of memory resources, and to reduce power consumption.
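The combined reduction follows from simple arithmetic, sketched below using the ratios stated above (pentile encoding keeps about two thirds of the data, and CDE then keeps about half of that):

```python
# Sketch of the combined size reduction from cascading the two
# encoders: pentile encoding keeps ~66.6% of the original data, and
# CDE then keeps ~50% of the pentile-encoded data.
def combined_ratio(pentile_ratio=2 / 3, cde_ratio=1 / 2):
    """Return the fraction of the original image size that remains."""
    return pentile_ratio * cde_ratio

print(round(combined_ratio() * 100, 1))  # prints 33.3
```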
The frame buffer 206 temporarily stores the encoded image. According to various embodiments of the present disclosure, the frame buffer 206 may be configured separately from a main memory device (e.g., the memory 120). The frame buffer 206 may be easily understood by those of ordinary skill in the art, and thus will not be described in detail. In various embodiments of the present disclosure described below, the frame buffer 206 may be omitted.
The decoding module 208 decodes the image encoded by the second encoding module 204. In this case, a display module (not illustrated) for displaying the decoded image may be a display module supporting a pentile scheme. According to various embodiments of the present disclosure, the decoding module 208 sequentially decodes the image encoded by the first encoding module 202 and the second encoding module 204. In this case, the display module (not illustrated) for displaying the decoded image is not limited to the display module supporting the pentile scheme and various display modules may be applied.
The communication module 210 transmits the image encoded by the second encoding module 204 to other electronic devices (e.g., a smartphone, a wearable device, and so forth). The image processing method according to various embodiments of the present disclosure may be applied between electronic devices as well as within a single electronic device.
According to an embodiment of the present disclosure, performing pentile encoding by the first encoding module 202 includes receiving Red/Green/Blue (RGB) data for a pixel of the image for gamma correction and performing gamma correction for the RGB data, converting the gamma-corrected RGB data into a Sub Pixel Rendering (SPR) domain, and outputting the RGB data converted into the SPR domain.
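The pentile encoding steps above can be sketched as follows. The gamma value (2.2) and the SPR filter (sharing the blue sub pixel across a pixel pair) are illustrative assumptions, not choices fixed by the disclosure:

```python
# Sketch of the pentile encoding steps: gamma-correct the RGB data,
# then convert it into the SPR (Sub Pixel Rendering) domain.
def gamma_correct(value, gamma=2.2, max_level=255):
    """Apply display gamma to one 8-bit channel value (gamma 2.2 assumed)."""
    return round(max_level * (value / max_level) ** gamma)

def to_spr_domain(left_rgb, right_rgb):
    """Render two gamma-corrected RGB pixels as one RGBG pentile pair.

    The shared blue sub pixel is averaged across the pair; red and
    green are taken per pixel (a hypothetical filter for illustration).
    """
    shared_blue = (left_rgb[2] + right_rgb[2]) // 2
    return (left_rgb[0], left_rgb[1], shared_blue, right_rgb[1])

left = tuple(gamma_correct(c) for c in (200, 128, 64))
right = tuple(gamma_correct(c) for c in (64, 128, 200))
print(to_spr_domain(left, right))
```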
According to an embodiment of the present disclosure, performing CDE by the second encoding module 204 includes encoding the image based on at least one type of box filtering for a pixel of the image and truncation for the pixel.
According to an embodiment of the present disclosure, encoding the image based on at least one type of box filtering and truncation includes encoding the image according to a mode in which a luminance error for sub pixels of the pixel, calculated based on at least one type of the box filtering and the truncation is minimum.
According to an embodiment of the present disclosure, pixels of the encoded image include an indicator indicating one of the box filtering type and the truncation type.
According to an embodiment of the present disclosure, decoding the image encoded according to the second encoding type includes decoding the image by repairing and replicating a sub pixel corresponding to a sub pixel in which the indicator is located, from among sub pixels of the pixel, if the indicator indicates the box filtering type.
According to an embodiment of the present disclosure, decoding the image encoded according to the second encoding type includes decoding the image by determining an upper nibble of the pixel and replicating the determined upper nibble to a lower nibble, if the indicator indicates the truncation type.
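The two decoding paths can be sketched as follows; the bit layout (8-bit sub pixels, with the indicator carried out of band) and the placement of the repaired sub pixel are hypothetical simplifications:

```python
# Sketch of the two CDE decoding paths described above.
BOX_FILTERING, TRUNCATION = 0, 1

def decode_pixel(indicator, sub_pixels):
    """Decode one CDE-encoded pixel (a tuple of 8-bit sub pixel values)."""
    if indicator == BOX_FILTERING:
        # Repair the sub pixel holding the indicator by replicating a
        # corresponding sub pixel (here the first replaces the last,
        # an assumed placement for illustration).
        repaired = list(sub_pixels)
        repaired[-1] = repaired[0]
        return tuple(repaired)
    # Truncation: take each sub pixel's upper nibble and replicate it
    # into the lower nibble to restore the full 8-bit depth.
    return tuple((v & 0xF0) | (v >> 4) for v in sub_pixels)

print(decode_pixel(TRUNCATION, (0xA0, 0x50, 0xF0)))  # prints (170, 85, 255)
```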
Referring to
The AP 300 may include one or more APs or one or more Communication Processors (CPs, not illustrated).
For example, the AP and the CP may be included in a single Integrated Circuit (IC) package or in different IC packages. The AP 300 may be implemented using, for example, the processor 110.
The AP 300 may control multiple hardware or software components connected to the AP 300 by driving an Operating System (OS) or an application program, or may process various data including multimedia data and perform operations. The AP 300 may be implemented using, for example, a System on Chip (SoC). According to an embodiment of the present disclosure, the AP 300 may further include a Graphic Processing Unit (GPU, not illustrated).
For example, if the electronic device 101 has a communication function, the AP 300 may be set to perform at least one function or at least one operation such as short-range communication, recognition of location information of an electronic device, broadcasting reception, wireless Internet access, user input recognition, and the like. The AP 300 according to an embodiment of the present disclosure may be implemented using, for example, the above-described processor 110.
The AP 300 according to an embodiment of the present disclosure may include an image processing module 302, a pentile processing module 304, and a CDE processing module 306.
The image processing module 302 may be configured to perform signal processing on the image by using various digital data (e.g., encoded data). The signal processing may include functions or operations such as color interpolation, color correction, auto white balance, gamma correction, color saturation correction, formatting, bad pixel correction, hue correction, and the like.
The pentile processing module 304 may be configured to perform the above-described pentile encoding. The pentile encoding may mean encoding, for example, a pixel having RGB/RGB (48-bit) sub pixels into a pixel having RGBG (32-bit) or RGBW (32-bit) sub pixels.
Referring to
After performing the gamma correction, the pentile processing module 304 converts the corrected RGB data into an SPR domain in operation S520 and outputs the RGB data converted into the SPR domain in operation S530.
The CDE processing module 306 may be a module set to perform CDE. The CDE is an encoding method based on the fact that if an error in a gradation value of a color is too small for a user to notice, the user recognizes the color as the expected color. Through the CDE, if a display module includes a defective pixel, the defective pixel may be corrected in software rather than physically, allowing the display module to be used.
For encoding with the CDE, box filtering that averages gradation values of pixels to be encoded, or truncation that quantizes the bit depths of the pixels to, for example, half, may be used.
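The two candidate operations can be sketched as follows (the pairing of values and the 8-bit depth are illustrative assumptions):

```python
# Sketch of the two CDE candidate encodings: box filtering averages
# the gradation values of neighbouring sub pixels, and truncation
# halves the bit depth (8 bits -> 4 bits) by keeping the upper nibble.
def box_filter(a, b):
    """Average two 8-bit gradation values into one shared value."""
    return (a + b) // 2

def truncate(value):
    """Quantize an 8-bit gradation value to its upper nibble (4 bits)."""
    return value >> 4

print(box_filter(100, 104))  # prints 102
print(truncate(0xA7))        # prints 10 (i.e., 0xA)
```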
The CDE processing module 306 may calculate a luminance error for the CDE. The luminance error for the CDE with respect to pentile-encoded pixels is as follows:

Lerr = 2*Rerr + 5*Gerr + Berr    Equation 1
As may be seen in Equation 1, a luminance error Lerr may be expressed as a value obtained by applying predetermined weights to a luminance error Rerr for Red (R), a luminance error Gerr for Green (G), and a luminance error Berr for Blue (B), and Equation 1 may be specified as follows:
Lerr = 2*|Rorg − Renc| + 5*|Gorg − Genc| + |Borg − Benc|    Equation 2
A luminance error for each sub pixel (RGB) may be an absolute value of a difference between a gradation value (e.g., Rorg) of a sub pixel before box filtering or truncation (i.e., encoding based on CDE) is applied and a gradation value (e.g., Renc) of the sub pixel after box filtering or truncation is applied. When CDE is applied, the largest weight is applied to the luminance error for green (G), considering that human retinal cells are most sensitive to green. The CDE processing module 306 may be set to perform encoding based on CDE in such a way that the luminance error calculated for each pixel using Equation 1 or 2 is smallest. At least one function or at least one operation of CDE performed by the CDE processing module 306 is illustrated in
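Mode selection per Equation 2 can be sketched as follows; the candidate encodings below are hypothetical values for illustration only:

```python
# Sketch of CDE mode selection: compute the weighted luminance error
# (Equation 2) of each candidate encoding of a pixel and keep the
# mode with the smallest error.
def luminance_error(org, enc):
    """Equation 2: Lerr = 2*|Rorg-Renc| + 5*|Gorg-Genc| + |Borg-Benc|."""
    r_org, g_org, b_org = org
    r_enc, g_enc, b_enc = enc
    return 2 * abs(r_org - r_enc) + 5 * abs(g_org - g_enc) + abs(b_org - b_enc)

def pick_mode(org, candidates):
    """Return the (mode_name, encoded_pixel) pair with the minimum Lerr."""
    return min(candidates.items(), key=lambda kv: luminance_error(org, kv[1]))

org = (120, 64, 200)
candidates = {
    "box_filtering": (118, 64, 196),  # hypothetical reconstructed pixel
    "truncation": (112, 64, 192),     # hypothetical reconstructed pixel
}
mode, enc = pick_mode(org, candidates)
print(mode)  # prints box_filtering
```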
Referring to
Referring to
A transmission module Tx 308 is set to transmit data of an encoded image to the DDIC 310.
The DDIC 310 may include a reception module Rx 312, a frame buffer 314, and a decoding module 316.
The reception module 312 is set to receive the data transmitted by the transmission module Tx 308.
The frame buffer 314 is set to temporarily store the encoded image. The frame buffer 314 according to an embodiment of the present disclosure may be, for example, the frame buffer 206 described with reference to
The decoding module 316 is set to decode the image including the encoded pixels. A decoding function or operation (illustrated in
Referring to
Referring to
The display module 320 is set to display the image including the decoded pixels. The display module 320 may be implemented by, for example, the display 150.
The DDIC 310 has been described as being included in one image processing apparatus together with the AP 300, but embodiments of the present disclosure are not limited by this description. That is, the DDIC 310 may be included in a separate electronic device (not illustrated) that is different from an image processing apparatus including the AP 300 according to various embodiments of the present disclosure.
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Lerr = (2/8)*Rerr + (5/8)*Gerr + (1/8)*Berr    Equation 3
The same description regarding Equation 2 may be equally applied to a luminance error (e.g., Rerr) of each sub pixel in Equation 3, and thus a detailed description thereof will not be provided.
A CDE processing module (e.g., 902) according to various embodiments of the present disclosure may be set to calculate a luminance error based on Equation 3 and to perform CDE based on a mode in which the luminance error is minimum.
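The normalized weights of Equation 3 are the Equation 1 weights divided by their sum (2 + 5 + 1 = 8), so they sum to one; a minimal sketch:

```python
# Sketch of the normalized luminance error of Equation 3, whose
# 2/8, 5/8, 1/8 weights are the Equation 1 weights scaled by 1/8.
def luminance_error_normalized(r_err, g_err, b_err):
    """Equation 3: Lerr = (2/8)*Rerr + (5/8)*Gerr + (1/8)*Berr."""
    return (2 * r_err + 5 * g_err + b_err) / 8

# With equal per-channel errors the result equals that error, since
# the weights sum to 1.
print(luminance_error_normalized(8, 8, 8))  # prints 8.0
```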
Referring to
Referring to
Referring to
The AP 1710 may control multiple hardware or software components connected to the AP 1710 by driving an OS or an application program, and may process various data including multimedia data and perform operations. The AP 1710 may be implemented by, for example, an SoC. According to an embodiment of the present disclosure, the AP 1710 may further include a GPU (not illustrated).
The communication module 1720 (e.g., the communication interface 160) may perform data transmission/reception in communication between the electronic device 1701 (e.g., the electronic device 101) and other electronic devices (e.g., the electronic device 104 or the server 106) connected through the network. According to an embodiment of the present disclosure, the communication module 1720 may include at least one of a cellular module 1721, a Wi-Fi module 1723, a BT module 1725, a GPS module 1727, an NFC module 1728, and a Radio Frequency (RF) module 1729.
The cellular module 1721 provides at least one of voice communication, video communication, a messaging service, and an Internet service through a communication network (e.g., LTE, LTE-A, CDMA, WCDMA, UMTS, WiBro, GSM, and the like). The cellular module 1721 may identify and authenticate an electronic device in a communication network by using a Subscriber Identity Module (SIM) (e.g., the SIM card 1724). According to an embodiment of the present disclosure, the cellular module 1721 performs at least some of the functions provided by the AP 1710. For example, the cellular module 1721 may perform at least a part of a multimedia control function.
According to an embodiment of the present disclosure, the cellular module 1721 may include a Communication Processor (CP). The cellular module 1721 may be implemented with, for example, a SoC. Although components such as the cellular module 1721 (e.g., the CP), the memory 1730, or the power management module 1795 are illustrated as being separated from the AP 1710, the AP 1710 may be implemented to include at least some (e.g., the cellular module 1721) of the foregoing components.
According to an embodiment of the present disclosure, the AP 1710 or the cellular module 1721 (e.g., the CP) may load a command or data received from at least one of a nonvolatile memory connected thereto and other components to a volatile memory and process the received command or data. The AP 1710 or the cellular module 1721 may store data received from at least one of other components or data generated by at least one of other components in the nonvolatile memory.
Each of the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, and the NFC module 1728 may include a processor for processing data transmitted and received through the corresponding module. Although the cellular module 1721, the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, and the NFC module 1728 are illustrated as separate blocks in
The RF module 1729 may transmit and receive data, for example, an RF signal. The RF module 1729 may include, although not shown, at least one of a transceiver, a Power Amplification Module (PAM), a frequency filter, and a Low Noise Amplifier (LNA). The RF module 1729 may further include at least one of parts for transmitting and receiving electromagnetic waves on a free space, for example, a conductor and a conductive wire, in wireless communication. Although the cellular module 1721, the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, and the NFC module 1728 are illustrated as sharing one RF module 1729 in
The SIM card 1724 may be a card including a SIM, and may be inserted into a slot formed in a particular position of the electronic device. The SIM card 1724 may include unique identification information (e.g., an Integrated Circuit Card Identifier (ICCID)) or subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)).
The memory 1730 (e.g., the memory 130) may include an internal memory 1732 or an external memory 1734. The internal memory 1732 may include at least one selected from among a volatile memory (e.g., a Dynamic Random Access Memory (DRAM), a Static RAM (SRAM), a Synchronous Dynamic RAM (SDRAM), and the like) and a nonvolatile memory (e.g., a One Time Programmable Read Only Memory (OTPROM), a Programmable ROM (PROM), an Erasable and Programmable ROM (EPROM), an Electrically Erasable and Programmable ROM (EEPROM), a mask ROM, a flash ROM, a NAND flash memory, a NOR flash memory, and the like).
According to an embodiment of the present disclosure, the internal memory 1732 may be a Solid State Drive (SSD). The external memory 1734 may further include a flash drive, for example, at least one of a compact flash, Secure Digital (SD), micro-SD, mini-SD, extreme digital (xD), and a memory stick. The external memory 1734 may be functionally connected with the electronic device 1701 through various interfaces. According to an embodiment of the present disclosure, the electronic device 1701 may further include a storage device (or storage medium) such as a hard drive.
The sensor module 1740 may measure a physical quantity or sense an operation state of the electronic device 1701, and convert the measured or sensed information into an electric signal. The sensor module 1740 may include at least one selected from among a gesture sensor 1740A, a gyro sensor 1740B, a pressure sensor 1740C, a magnetic sensor 1740D, an acceleration sensor 1740E, a grip sensor 1740F, a proximity sensor 1740G, a color sensor 1740H (e.g., an RGB sensor), a bio sensor 1740I, a temperature/humidity sensor 1740J, an illumination sensor 1740K, an Ultra Violet (UV) sensor 1740M, and the like. Additionally or alternatively, the sensor module 1740 may include at least one selected from among an E-nose sensor (not illustrated), an Electromyography (EMG) sensor (not illustrated), an Electroencephalogram (EEG) sensor (not illustrated), an Electrocardiogram (ECG) sensor (not illustrated), an Infrared (IR) sensor (not illustrated), an iris sensor (not illustrated), a fingerprint sensor (not illustrated), and the like. The sensor module 1740 may further include a control circuit for controlling at least one sensor included therein.
The input device 1750 may include a touch panel 1752, a (digital) pen sensor 1754, a key 1756, or an ultrasonic input device 1758. The touch panel 1752 may recognize a touch input by using at least one of a capacitive, resistive, infrared, or ultrasonic scheme. The touch panel 1752 may further include a control circuit. When implemented as a capacitive type, the touch panel 1752 may recognize a physical contact or proximity. The touch panel 1752 may further include a tactile layer. In this case, the touch panel 1752 may provide a tactile reaction to a user.
The (digital) pen sensor 1754 may be implemented using a method that is the same as or similar to receiving a user's touch input, or by using a separate recognition sheet. The key 1756 may include a physical button, an optical key, or a keypad. The ultrasonic input device 1758 allows the electronic device 1701 to sense, through a microphone (e.g., a microphone 1788), ultrasonic waves generated by an input means, and to identify the corresponding data. The ultrasonic input device 1758 is capable of performing wireless recognition. According to an embodiment of the present disclosure, the electronic device 1701 may receive a user input from an external electronic device (e.g., a computer or a server) connected thereto by using the communication module 1720.
The display 1760 (e.g., the display 150) may include a panel 1762, a hologram device 1764, or a projector 1766. The panel 1762 may be, for example, a Liquid Crystal Display (LCD), an Active-Matrix Organic Light-Emitting Diode (AM-OLED) display, and the like. The panel 1762 may be implemented as being flexible, transparent, or wearable. The panel 1762 and the touch panel 1752 may be implemented as one module. The hologram device 1764 may show a stereoscopic image in the air by using interference of light. The projector 1766 may project light onto a screen to display an image. The screen may be positioned inside or outside the electronic device 1701. According to an embodiment of the present disclosure, the display 1760 may further include a control circuit for controlling the panel 1762, the hologram device 1764, or the projector 1766.
The interface 1770 may include an HDMI 1772, a USB 1774, an optical interface 1776, or a D-subminiature 1778. The interface 1770 may be included in the communication interface 160 illustrated in
The audio module 1780 bi-directionally converts sound and an electric signal. At least some components of the audio module 1780 may be included in the I/O interface 140 illustrated in
The camera module 1791 is a device capable of capturing still and moving images, and according to an embodiment of the present disclosure, the camera module 1791 may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens (not illustrated), an Image Signal Processor (ISP, not illustrated), or a flash (not illustrated, for example, an LED or a xenon lamp).
The power management module 1795 may manage power of the electronic device 1701. Although not shown, a Power Management Integrated Circuit (PMIC), a charger IC, or a battery or fuel gauge may be included in the power management module 1795.
The PMIC may be mounted in, for example, an IC or an SoC semiconductor. The charging method may be classified into a wired type and a wireless type. The charger IC may charge a battery, and may prevent introduction of an over-voltage or over-current from a charger. According to an embodiment of the present disclosure, the charger IC may include a charger IC for at least one of a wired charging method and a wireless charging method. The wireless charging method may include, for example, a magnetic resonance method, a magnetic induction method, or an electromagnetic wave method, and an additional circuit for wireless charging, for example, a coil loop, a resonance circuit, or a rectifier may be added for the wireless charging method.
The battery gauge may measure the remaining capacity of the battery 1796, and a voltage, a current, or a temperature of the battery 1796 during charging. The battery 1796 may store or produce electricity and supply power to the electronic device 1701 by using the stored or produced electricity. The battery 1796 may include a rechargeable battery or a solar battery.
The indicator 1797 may display a particular state, for example, at least one of a booting state, a message state, and a charging state, of the electronic device 1701 or a part thereof (e.g., the AP 1710). The motor 1798 may convert an electric signal into mechanical vibration. Although not shown, a processing unit for supporting mobile TVs (e.g., a GPU) may be included in the electronic device 1701. The processing unit for supporting mobile TVs may process media data complying with, for example, Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or a media flow.
The foregoing components of the electronic device according to various embodiments of the present disclosure may include one or more components, and a name of a component may vary according to a type of the electronic device. The electronic device according to various embodiments of the present disclosure may include at least one of the foregoing components, and some of them may be omitted from the electronic device or other components may be further included in the electronic device. Also, some of the components of the electronic device according to various embodiments of the present disclosure may be combined into one entity to perform the same function as those of the components that have not been combined.
Referring to
According to an embodiment of the present disclosure, the device discovery protocol 1851 may be a protocol according to which electronic devices (e.g., an electronic device 1810 or an electronic device 1830) sense an external electronic device capable of communicating with them, or connect to the sensed external electronic device. For example, the electronic device 1810 (e.g., the electronic device 101) may sense the electronic device 1830 (e.g., the electronic device 104) as a device capable of communicating with the electronic device 1810, through a communication method (e.g., Wi-Fi, BT, or USB) available in the electronic device 1810, by using the device discovery protocol 1851. The electronic device 1810 may obtain and store identification information regarding the sensed electronic device 1830 by using the device discovery protocol 1851, for communication connection with the electronic device 1830. For example, the electronic device 1810 may establish a communication connection with the electronic device 1830 based on at least the identification information.
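The sense-store-connect sequence described above can be sketched as a minimal state machine. The class and field names here are illustrative assumptions, not interfaces defined by the disclosure.

```python
class DeviceDiscovery:
    """Hypothetical sketch of the device discovery flow: sense a peer,
    store its identification information, then connect using that stored
    information."""

    def __init__(self):
        self.known = {}          # identification info keyed by device id
        self.connections = set()

    def sense(self, device_id, transport, identification):
        """Record a peer sensed over a transport (e.g., Wi-Fi, BT, or USB)."""
        self.known[device_id] = {"transport": transport, "id_info": identification}

    def connect(self, device_id):
        """Establish a connection based on stored identification information."""
        if device_id not in self.known:
            raise LookupError("device not discovered yet")
        self.connections.add(device_id)
        return self.known[device_id]["transport"]

disc = DeviceDiscovery()
disc.sense("device-1830", "Wi-Fi", {"mac": "00:11:22:33:44:55"})
transport = disc.connect("device-1830")
```

The point illustrated is the ordering the paragraph describes: identification information is obtained and stored during discovery, and the later connection is established from that stored information rather than from a fresh scan.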
According to an embodiment of the present disclosure, the device discovery protocol 1851 may be a protocol for mutual authentication among a plurality of electronic devices. For example, the electronic device 1810 may perform authentication between the electronic device 1810 and the electronic device 1830 based on communication information (e.g., a Media Access Control (MAC) address, a Universally Unique Identifier (UUID), a Subsystem Identification (SSID), and an Internet Protocol (IP) address).
According to an embodiment of the present disclosure, the capability exchange protocol 1853 may be a protocol for exchanging information associated with a capability of a service that may be supported by at least one of the electronic device 1810 and the electronic device 1830. For example, the electronic device 1810 and the electronic device 1830 may exchange information associated with a capability of a service currently provided by each of them through the capability exchange protocol 1853. The exchangeable information may include identification information indicating a particular service among the plurality of services that may be supported by the electronic device 1810 and the electronic device 1830. For example, the electronic device 1810 may receive identification information of a particular service provided by the electronic device 1830 from the electronic device 1830 through the capability exchange protocol 1853. In this case, the electronic device 1810 may determine, based on the received identification information, whether the electronic device 1810 can support the particular service.
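The support decision at the end of the preceding paragraph reduces to checking received service identifiers against the device's own supported set. The sketch below makes that concrete; the service names are purely illustrative assumptions.

```python
def supportable_services(local_supported, received_ids):
    """Return the advertised services this device can also support,
    as determined from the identification information received through
    a capability exchange."""
    return sorted(set(local_supported) & set(received_ids))

# Hypothetical service identifiers for two devices.
local = {"media-share", "screen-mirror", "file-transfer"}
remote_advertised = ["screen-mirror", "remote-print"]
common = supportable_services(local, remote_advertised)
```

Here `common` contains only the services both sides can handle, which is the basis on which the receiving device decides whether it may support a particular advertised service.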
According to an embodiment of the present disclosure, the network protocol 1855 may be a protocol for controlling a flow of data transmitted and received to provide a service through interworking between electronic devices (e.g., the electronic device 1810 and the electronic device 1830) connected for communication therebetween. For example, at least one selected from among the electronic device 1810 and the electronic device 1830 may perform error control or data quality control by using the network protocol 1855. Additionally or alternatively, the network protocol 1855 may determine a transmission format of data transmitted and received between the electronic device 1810 and the electronic device 1830. At least one selected from the electronic device 1810 and the electronic device 1830 may manage (e.g., connect or terminate) at least a session for data exchange by using the network protocol 1855.
According to an embodiment of the present disclosure, the application protocol 1857 may be a protocol for providing a procedure or information for exchanging data associated with a service provided to the external electronic device. For example, the electronic device 1810 (e.g., the electronic device 101) may provide a service to the electronic device 1830 (e.g., the electronic device 104 or the server 106) through the application protocol 1857.
According to an embodiment of the present disclosure, the communication protocol 900 may include a standard communication protocol, a communication protocol designated by an individual or an organization (e.g., a communication protocol designated by a communication device manufacturer or a network supplier), or a combination thereof.
The term “module” used in various embodiments of the present disclosure may refer to, for example, a “unit” including one of hardware, software, and firmware, or a combination of two or more thereof. The term “module” may be interchangeable with other terms such as unit, logic, logical block, component, or circuit. A “module” may be a minimum unit of integrally configured components or a part thereof. A “module” may be a minimum unit for performing one or more functions or a part thereof. A “module” may be mechanically or electronically implemented. For example, a “module” according to various embodiments of the present disclosure may include at least one of an Application-Specific IC (ASIC) chip, a Field-Programmable Gate Array (FPGA), and a programmable-logic device for performing operations, which have been known or will be developed in the future.
At least a part of a device (e.g., modules or functions thereof) or a method (e.g., operations) according to various embodiments of the present disclosure may be implemented by instructions stored in the form of program modules in computer-readable storage media. When the instructions are executed by one or more processors (e.g., the processor 110), the one or more processors may perform a function corresponding to the instructions. The computer-readable storage medium may be, for example, the memory 120. At least a part of the programming module may be implemented (e.g., executed) by, for example, the processor 110. At least a part of the programming module may include, for example, a module, a program, a routine, sets of instructions, or a process for performing one or more functions.
The computer-readable recording media may include a hardware device specially configured to store and perform a program command (e.g., a programming module), including magnetic media such as a hard disc, a floppy disc, and a magnetic tape; optical recording media such as a Compact Disc ROM (CD-ROM) and a DVD; magneto-optical media such as a floptical disk; and a hardware device, such as a ROM, a RAM, and a flash memory, specifically configured to store and execute program instructions. In addition, the program instructions may include high-level language code, which may be executed in a computer by using an interpreter, as well as machine code made by a compiler. The aforementioned hardware device may be configured to operate as one or more software modules in order to perform the operations of the present disclosure, and vice versa.
The module or program module according to various embodiments of the present disclosure may include at least one of the above-described elements, exclude some of them, or further include other elements. The operations performed by the module, the program module, or other elements according to various embodiments of the present disclosure may be executed in a sequential, parallel, repeated, or heuristic manner. Also, some operations may be executed in a different order, may be omitted, or may additionally include another operation.
According to various embodiments of the present disclosure, in a storage medium having commands stored therein, the commands are set to cause at least one processor to perform at least one operation when executed by the at least one processor, in which the at least one operation includes encoding an image according to a first encoding type, encoding the image, which has been encoded according to the first encoding type, according to a second encoding type, and decoding the image encoded according to the second encoding type, and the second encoding type is CDE, which depends on a contrast of the image.
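The claimed operations form a two-stage encode followed by a decode. The sketch below shows only the shape of that pipeline under loud assumptions: the "first encoding type" is stood in for by a trivial delta step and the CDE stage by run-length encoding, because the disclosure's actual pentile and CDE transforms are not reproduced here.

```python
def first_encode(pixels):
    """Placeholder for the first encoding type: delta-encode adjacent values."""
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def second_encode(deltas):
    """Placeholder stand-in for CDE: run-length encode the delta stream,
    producing [value, count] pairs."""
    out = []
    for v in deltas:
        if out and out[-1][0] == v:
            out[-1][1] += 1
        else:
            out.append([v, 1])
    return out

def decode(runs):
    """Invert both stages to recover the original pixel values."""
    deltas = [v for v, n in runs for _ in range(n)]
    pixels = [deltas[0]]
    for d in deltas[1:]:
        pixels.append(pixels[-1] + d)
    return pixels

img = [10, 10, 10, 12, 12, 12]
encoded = second_encode(first_encode(img))
assert decode(encoded) == img  # the pipeline round-trips losslessly
```

The structural point matches the claim: the second stage consumes the output of the first stage, and a single decode step recovers the image from the doubly encoded data.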
With the image processing apparatus and method according to various embodiments of the present disclosure, by encoding a predetermined image using CDE, which depends on a contrast of the image, and processing and transmitting data, unnecessary use of memory resources may be prevented and power consumption may be reduced in image data processing and/or transmission.
The effects of the present disclosure are not limited to the above-described effects, and it would be obvious to those of ordinary skill in the art that various effects are included in the present disclosure.
While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
10-2014-0083962 | Jul 2014 | KR | national |
This application claims the benefit under 35 U.S.C. §119(e) of a U.S. Provisional application filed on Jun. 11, 2014 in the U.S. Patent and Trademark Office and assigned Ser. No. 62/010,707, and under 35 U.S.C. §119(a) of a Korean patent application filed on Jul. 4, 2014 in the Korean Intellectual Property Office and assigned Serial number 10-2014-0083962, the entire disclosure of each of which is hereby incorporated by reference.
Number | Date | Country | |
---|---|---|---|
62010707 | Jun 2014 | US |