The disclosure relates to a wearable electronic device and an operation method thereof.
Augmented reality (AR) is a field of virtual reality (VR) that refers to a computer graphics technique in which virtual objects or information are synthesized into an existing real environment so that they appear as if they exist in the original environment. Augmented reality is a display technology that overlays virtual objects onto the real world viewed by a user, and can be applied to products such as wearable electronic devices, providing diverse user experiences. For example, a wearable electronic device that supports augmented reality may be a head-mounted display (HMD) device or AR glasses.
A wearable electronic device that supports augmented reality may include a display panel serving as a light source that outputs images, a projection lens that inputs the image output from the display panel to a light waveguide, and the light waveguide, which propagates the input image so that it reaches the user's eyes. A wearable electronic device that supports augmented reality may provide a see-through display, e.g., augmented reality functionality, as a light waveguide is disposed on at least a portion of at least one lens.
The information described above may be provided as related art to aid in understanding of the present disclosure. No assertion or determination is made as to whether any of the above might be applicable as prior art with regard to the present disclosure.
When a wearable electronic device performs an augmented reality function, the visibility of an image displayed through at least one lens (e.g., a see-through display) may be affected by external illuminance. When the external environment of the wearable electronic device is a bright outdoor environment, the visibility of an image may decrease. When the external illuminance is high, the wearable electronic device may increase the luminance of the image output from a light source unit to improve visibility. However, such operation of the wearable electronic device may increase power consumption and generate heat.
Embodiments of the disclosure may provide a wearable electronic device and an operation method thereof capable of reducing power consumption and heat generation in a high-illuminance ambient light environment and improving the visibility of an image by dynamically adjusting, depending on the magnitude of the illuminance of the ambient light, the per-area luminance of an image displayed through a display panel serving as a light source unit, the form of the displayed image, and/or the saturation of the displayed image.
A wearable electronic device, according to an example embodiment, may include: at least one lens, a battery, a display, a waveguide configured to receive an image from the display and output the received image through the at least one lens, an illuminance sensor configured to detect external illuminance of the wearable electronic device, and at least one processor comprising processing circuitry, wherein at least one processor, individually and/or collectively, may be configured to: in response to a specified event, activate a visibility enhancement mode; detect, in response to the activation of the visibility enhancement mode, ambient illuminance of the wearable electronic device using the illuminance sensor; and dynamically adjust, based on the detected illuminance, luminance of the image output through the display and a display form of at least one object included in the image.
In a method of a wearable electronic device, according to an example embodiment, the wearable electronic device may include at least one lens, a battery, a display, a waveguide configured to receive an image from the display and output the received image through the at least one lens, and an illuminance sensor configured to detect external illuminance of the wearable electronic device, wherein the method may include: activating a visibility enhancement mode in response to a predetermined event, detecting ambient illuminance of the wearable electronic device using the illuminance sensor in response to the activation of the visibility enhancement mode, and dynamically adjusting, based on the detected illuminance, luminance of an image output through the display and a display form of at least one object included in the image.
A wearable electronic device and an operation method thereof, according to various example embodiments of the present disclosure, can reduce power consumption and heat generation in a high-illuminance ambient light environment and improve the visibility of an image by dynamically adjusting, depending on the magnitude of the illuminance of the ambient light, the per-area luminance of an image displayed through a display panel serving as a light source unit, the form of the displayed image, and/or the saturation of the displayed image.
In addition, various effects that can be directly or indirectly identified through the present disclosure may be provided.
The effects obtained by the present disclosure are not limited to the aforementioned effects, and other effects, which are not mentioned above, will be clearly understood by those skilled in the art from the following description.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following detailed description, taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, similar reference numerals are understood to refer to similar parts, elements, and structures.
The following description with reference to the accompanying drawings is provided to aid in understanding of the various example embodiments of the present disclosure. Various specific details are included to assist in that understanding, but these should be regarded as illustrative only. Therefore, one skilled in the art will recognize that various alterations and modifications may be made to the various embodiments disclosed without departing from the scope and spirit of the present disclosure. In addition, for clarity and conciseness, descriptions of well-known features and configurations may be omitted.
The terms and words used in the following descriptions and claims are not limited to their bibliographic meanings, but are used to enable a clear and consistent understanding of the present disclosure. Therefore, it should be apparent to those skilled in the art that the following descriptions of various embodiments of the present disclosure, are not intended to limit the present disclosure, but are provided for the purpose of illustration.
The expressions in the singular form should be understood to include the plural referents unless the context clearly dictates otherwise. Therefore, for example, a reference to a “surface of an element” may include a reference to one or more of those surfaces.
Referring to
The processor 120 may include various processing circuitry and/or multiple processors. For example, as used herein, including the claims, the term “processor” may include various processing circuitry, including at least one processor, wherein one or more of at least one processor, individually and/or collectively in a distributed manner, may be configured to perform various functions described herein. As used herein, when “a processor”, “at least one processor”, and “one or more processors” are described as being configured to perform numerous functions, these terms cover situations, for example and without limitation, in which one processor performs some of recited functions and another processor(s) performs other of recited functions, and also situations in which a single processor may perform all recited functions. Additionally, the at least one processor may include a combination of processors performing various of the recited/disclosed functions, e.g., in a distributed manner. At least one processor may execute program instructions to achieve or perform various functions. The processor 120 may execute, for example, software (e.g., a program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 coupled with the processor 120, and may perform various data processing or computation. According to an embodiment, as at least part of the data processing or computation, the processor 120 may store a command or data received from another component (e.g., the sensor module 176 or the communication module 190) in volatile memory 132, process the command or the data stored in the volatile memory 132, and store resulting data in non-volatile memory 134. According to an embodiment, the processor 120 may include a main processor 121 (e.g., a central processing unit (CPU) or an application processor (AP)), or an auxiliary processor 123 (e.g., a graphics processing unit (GPU), a neural processing unit (NPU), an image signal processor (ISP), a sensor hub processor, or a communication processor (CP)) that is operable independently from, or in conjunction with, the main processor 121. For example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be adapted to consume less power than the main processor 121, or to be specific to a specified function. The auxiliary processor 123 may be implemented as separate from, or as part of the main processor 121.
The auxiliary processor 123 may control at least some of the functions or states related to at least one component (e.g., the display module 160, the sensor module 176, or the communication module 190) among the components of the electronic device 101, instead of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active state (e.g., executing an application). According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 180 or the communication module 190) functionally related to the auxiliary processor 123. According to an embodiment, the auxiliary processor 123 (e.g., the neural processing unit) may include a hardware structure specified for artificial intelligence model processing. An artificial intelligence model may be generated by machine learning. Such learning may be performed, e.g., by the electronic device 101 where the artificial intelligence is performed or via a separate server (e.g., the server 108). Learning algorithms may include, but are not limited to, e.g., supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning. The artificial intelligence model may include a plurality of artificial neural network layers. The artificial neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more thereof, but is not limited thereto. The artificial intelligence model may, additionally or alternatively, include a software structure other than the hardware structure.
The memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. The various data may include, for example, software (e.g., the program 140) and input data or output data for a command related thereto. The memory 130 may include the volatile memory 132 or the non-volatile memory 134.
The program 140 may be stored in the memory 130 as software, and may include, for example, an operating system (OS) 142, middleware 144, or an application 146.
The input module 150 may receive a command or data to be used by another component (e.g., the processor 120) of the electronic device 101, from the outside (e.g., a user) of the electronic device 101. The input module 150 may include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
The sound output module 155 may output sound signals to the outside of the electronic device 101. The sound output module 155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes, such as playing multimedia or playing a recording. The receiver may be used for receiving incoming calls. According to an embodiment, the receiver may be implemented as separate from, or as part of, the speaker.
The display module 160 may visually provide information to the outside (e.g., a user) of the electronic device 101. The display module 160 may include, for example, a display, a hologram device, or a projector and control circuitry to control a corresponding one of the display, hologram device, and projector. According to an embodiment, the display module 160 may include a touch sensor adapted to detect a touch, or a pressure sensor adapted to measure the intensity of force incurred by the touch.
The audio module 170 may convert a sound into an electrical signal and vice versa. According to an embodiment, the audio module 170 may obtain the sound via the input module 150, or output the sound via the sound output module 155 or a headphone of an external electronic device (e.g., an electronic device 102) directly (e.g., wiredly) or wirelessly coupled with the electronic device 101.
The sensor module 176 may detect an operational state (e.g., power or temperature) of the electronic device 101 or an environmental state (e.g., a state of a user) external to the electronic device 101, and then generate an electrical signal or data value corresponding to the detected state. According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
The interface 177 may support one or more specified protocols to be used for the electronic device 101 to be coupled with the external electronic device (e.g., the electronic device 102) directly (e.g., wiredly) or wirelessly. According to an embodiment, the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, a secure digital (SD) card interface, or an audio interface.
A connecting terminal 178 may include a connector via which the electronic device 101 may be physically connected with the external electronic device (e.g., the electronic device 102). According to an embodiment, the connecting terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 179 may convert an electrical signal into a mechanical stimulus (e.g., a vibration or a movement) or electrical stimulus which may be recognized by a user via their tactile sensation or kinesthetic sensation. According to an embodiment, the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electric stimulator.
The camera module 180 may capture a still image or moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
The power management module 188 may manage power supplied to the electronic device 101. According to an embodiment, the power management module 188 may be implemented as at least part of, for example, a power management integrated circuit (PMIC).
The battery 189 may supply power to at least one component of the electronic device 101. According to an embodiment, the battery 189 may include, for example, a primary cell which is not rechargeable, a secondary cell which is rechargeable, or a fuel cell.
The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and the external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108) and performing communication via the established communication channel. The communication module 190 may include one or more communication processors that are operable independently from the processor 120 (e.g., the application processor (AP)) and support a direct (e.g., wired) communication or a wireless communication. According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication (PLC) module). A corresponding one of these communication modules may communicate with the external electronic device via the first network 198 (e.g., a short-range communication network, such as Bluetooth™, wireless-fidelity (Wi-Fi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network, such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or wide area network (WAN))). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other. The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
The wireless communication module 192 may support a 5G network, after a 4G network, and next-generation communication technology, e.g., new radio (NR) access technology. The NR access technology may support enhanced mobile broadband (eMBB), massive machine type communications (mMTC), or ultra-reliable and low-latency communications (URLLC). The wireless communication module 192 may support a high-frequency band (e.g., the mmWave band) to achieve, e.g., a high data transmission rate. The wireless communication module 192 may support various technologies for securing performance on a high-frequency band, such as, e.g., beamforming, massive multiple-input and multiple-output (massive MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna. The wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199). According to an embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for implementing eMBB, loss coverage (e.g., 164 dB or less) for implementing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less) for implementing URLLC.
The antenna module 197 may transmit or receive a signal or power to or from the outside (e.g., the external electronic device) of the electronic device 101. According to an embodiment, the antenna module 197 may include an antenna including a radiating element including a conductive material or a conductive pattern formed in or on a substrate (e.g., a printed circuit board (PCB)). According to an embodiment, the antenna module 197 may include a plurality of antennas (e.g., array antennas). In such a case, at least one antenna appropriate for a communication scheme used in the communication network, such as the first network 198 or the second network 199, may be selected, for example, by the communication module 190 (e.g., the wireless communication module 192) from the plurality of antennas. The signal or the power may then be transmitted or received between the communication module 190 and the external electronic device via the selected at least one antenna. According to an embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiating element may be additionally formed as part of the antenna module 197.
According to various embodiments, the antenna module 197 may form a mmWave antenna module. According to an embodiment, the mmWave antenna module may include a printed circuit board, an RFIC disposed on a first surface (e.g., the bottom surface) of the printed circuit board, or adjacent to the first surface, and capable of supporting a designated high-frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., array antennas) disposed on a second surface (e.g., the top or a side surface) of the printed circuit board, or adjacent to the second surface, and capable of transmitting or receiving signals of the designated high-frequency band.
At least some of the above-described components may be coupled mutually and communicate signals (e.g., commands or data) therebetween via an inter-peripheral communication scheme (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)).
According to an embodiment, commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 via the server 108 coupled with the second network 199. Each of the electronic devices 102 or 104 may be a device of a same type as, or a different type, from the electronic device 101. According to an embodiment, all or some of operations to be executed at the electronic device 101 may be executed at one or more of the external electronic devices 102, 104, or 108. For example, if the electronic device 101 should perform a function or a service automatically, or in response to a request from a user or another device, the electronic device 101, instead of, or in addition to, executing the function or the service, may request the one or more external electronic devices to perform at least part of the function or the service. The one or more external electronic devices receiving the request may perform the at least part of the function or the service requested, or an additional function or an additional service related to the request, and transfer an outcome of the performing to the electronic device 101. The electronic device 101 may provide the outcome, with or without further processing of the outcome, as at least part of a reply to the request. To that end, a cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example. The electronic device 101 may provide ultra low-latency services using, e.g., distributed computing or mobile edge computing. In an embodiment, the external electronic device 104 may include an internet-of-things (IoT) device. The server 108 may be an intelligent server using machine learning and/or a neural network. According to an embodiment, the external electronic device 104 or the server 108 may be included in the second network 199. The electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, a home appliance, or the like. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
It should be appreciated that various embodiments of the present disclosure and the terms used therein are not intended to limit the technological features set forth herein to particular embodiments and include various changes, equivalents, or replacements for a corresponding embodiment. With regard to the description of the drawings, similar reference numerals may be used to refer to similar or related elements. It is to be understood that a singular form of a noun corresponding to an item may include one or more of the things, unless the relevant context clearly indicates otherwise. As used herein, each of such phrases as “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include any one of, or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, such terms as “1st” and “2nd,” or “first” and “second” may be used simply to distinguish a corresponding component from another, and do not limit the components in other aspects (e.g., importance or order). It is to be understood that if an element (e.g., a first element) is referred to, with or without the term “operatively” or “communicatively”, as “coupled with,” “coupled to,” “connected with,” or “connected to” another element (e.g., a second element), the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
As used in connection with various embodiments of the disclosure, the term “module” may include a unit implemented in hardware, software, or firmware, or any combination thereof, and may interchangeably be used with other terms, for example, “logic,” “logic block,” “part,” or “circuitry”. A module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions. For example, according to an embodiment, the module may be implemented in a form of an application-specific integrated circuit (ASIC).
Various embodiments as set forth herein may be implemented as software (e.g., the program 140) including one or more instructions that are stored in a storage medium (e.g., internal memory 136 or external memory 138) that is readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor. This allows the machine to be operated to perform at least one function according to the at least one instruction invoked. The one or more instructions may include a code generated by a compiler or a code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term “non-transitory” simply means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between a case where data is semi-permanently stored in the storage medium and a case where the data is temporarily stored in the storage medium.
According to an embodiment, a method according to various embodiments of the disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStore™), or between two user devices (e.g., smart phones) directly. If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately disposed in different components. According to various embodiments, one or more of the above-described components may be omitted, or one or more other components may be added. Alternatively or additionally, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, according to various embodiments, the integrated component may still perform one or more functions of each of the plurality of components in the same or similar manner as they are performed by a corresponding one of the plurality of components before the integration. According to various embodiments, operations performed by the module, the program, or another component may be carried out sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
The wearable electronic device illustrated in
With reference to
According to an embodiment, the wearable electronic device 200 may include a see-through display 204 corresponding to a near-to-eye display (e.g., first see-through display 204-1, second see-through display 204-2). At least a portion of the lens of the wearable electronic device 200 may include the see-through display 204. For example, the wearable electronic device 200 may include a left-eye lens or a right-eye lens, at least a portion of which may include a light waveguide (e.g., waveguide 430 in
According to an embodiment, the see-through display 204 may be positioned close to the user's eye, and the user 202 may wear the wearable electronic device 200, including the see-through display 204, like glasses.
According to an embodiment, the wearable electronic device 200 may display augmented reality images through the see-through display 204. The see-through display 204 may transmit light from the real environment (or real-world objects). The user 202 may perceive the light from the real environment transmitted through the see-through display 204 and thereby see the real environment. The see-through display 204 may refer to a transparent display that can transmit light from real-world objects while simultaneously displaying images of virtual objects. For example, the wearable electronic device 200 may display images of virtual objects through the see-through display 204. The user 202 may perceive real-world objects through the see-through display 204 of the wearable electronic device 200, and may perceive virtual objects overlaid thereon.
Various embodiments of the present disclosure describe a glasses-type wearable electronic device 200, but are not limited thereto. Various embodiments of the present disclosure may be applied to various electronic devices including a near-to-eye display. For example, various embodiments of the present disclosure may also be applied to a head-mounted display (HMD) device or a goggle-type wearable electronic device.
The wearable electronic device 200 illustrated in
With reference to
According to an embodiment, the module included in the wearable electronic device 200 may be understood as a hardware module (e.g., circuitry) included in the wearable electronic device 200. The elements included in the wearable electronic device 200 may not be limited to the elements illustrated in the block diagram of
According to an embodiment, the elements of the wearable electronic device 200 illustrated in
According to an embodiment, the processor 300 may include various processing circuitry and execute instructions stored in memory to control the operation of the elements of the wearable electronic device 200 (e.g., display module 310, sensor module 320, battery 340, camera 350, and communication interface 360). The processor 300 may be electrically and/or operatively connected to the display module 310, sensor module 320, battery 340, camera 350, and communication interface 360.
The processor 300 may execute software to control at least one of the other elements connected to the processor 300 (e.g., display module 310, sensor module 320, battery 340, camera 350, and communication interface 360). The processor 300 may obtain commands from elements included in the wearable electronic device 200, interpret the obtained commands, and process and/or calculate various data based on the interpreted commands. The processor 300 may include various processing circuitry and/or multiple processors. For example, as used herein, including the claims, the term “processor” may include various processing circuitry, including at least one processor, wherein one or more of at least one processor, individually and/or collectively in a distributed manner, may be configured to perform various functions described herein. As used herein, when “a processor”, “at least one processor”, and “one or more processors” are described as being configured to perform numerous functions, these terms cover situations, for example and without limitation, in which one processor performs some of recited functions and another processor(s) performs other of recited functions, and also situations in which a single processor may perform all recited functions. Additionally, the at least one processor may include a combination of processors performing various of the recited/disclosed functions, e.g., in a distributed manner. At least one processor may execute program instructions to achieve or perform various functions.
According to an embodiment, the wearable electronic device 200 may receive data processed through the processor 120 embedded in an external device (e.g., electronic device 102 or 104 in
According to an embodiment, the display module 310 may include a display panel (e.g., display panel 410 in
According to an embodiment, the display panel 410 may emit display light for displaying augmented reality images on the basis of the control of the processor 300. The display panel 410 may be understood as a self-emissive display that emits light from the display itself or as a display that reflects and emits light emitted from a separate light source. For example, the wearable electronic device 200 (e.g., processor 300) may emit display light through the display panel 410 to display an augmented reality image in a display area of the see-through display 204. According to an embodiment, the wearable electronic device 200 (e.g., processor 300) may control the display panel 410 to display augmented reality images in the display area of the see-through display 204 in response to input from the user 202. According to an embodiment, the type of input from the user 202 may include button input, touch input, voice input, and/or gesture input, but is not limited thereto and may include various input methods capable of controlling the operation of the display panel 410.
According to an embodiment, the wearable electronic device 200 may further include a light source unit (not illustrated) that emits additional light different from the display light emitted by the display panel 410 to enhance brightness around the user's eye. The light source unit may include a white LED or an infrared LED.
According to an embodiment, the glasses 330 (it will be understood that the terms “glass” and “glasses” may be used interchangeably throughout) may include a waveguide (e.g., waveguide 430 in
According to an embodiment, the display waveguide may form a light path by guiding the display light emitted from the display panel 410 so that the display light is emitted into the display area of the see-through display 204. The see-through display 204 may correspond to at least one area of the display waveguide. For example, the area of the see-through display 204 may correspond to an area of the display waveguide where light propagating inside the display waveguide is emitted, while external light is transmitted simultaneously. For example, the see-through display 204 may be disposed at one end of the display waveguide included in the glass 330.
According to an embodiment, the display waveguide may include at least one of at least one diffraction element or a reflective element (e.g., a reflective mirror). The display waveguide may guide the display light emitted from the display panel 410 to the user's eye using at least one diffraction element or reflective element included in the display waveguide. For example, the diffraction element may include an input/output (IN/OUT) grating, and the reflective element may include a member for total internal reflection (TIR).
According to an embodiment, an optical material (e.g., glass) may be processed into a wafer form for use as a display waveguide, and the refractive index of the display waveguide may vary from approximately 1.5 to approximately 1.9.
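As an illustrative aside (not limiting), the refractive index of the display waveguide determines the critical angle $\theta_c$ above which incident light is totally internally reflected at a waveguide-air interface rather than escaping:

$$\theta_c = \arcsin\!\left(\frac{1}{n}\right), \qquad \theta_c \approx 41.8^\circ \ \text{for } n = 1.5, \qquad \theta_c \approx 31.8^\circ \ \text{for } n = 1.9$$

A higher refractive index thus yields a smaller critical angle, so a wider range of ray angles can be guided within the waveguide by total internal reflection.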
According to an embodiment, the display waveguide may include a display area through which light traveling inside the waveguide 430 via total internal reflection is emitted to the outside. The display area may be disposed on a portion of the display waveguide. At least one area of the display waveguide may include a see-through display (e.g., see-through display 204 in
According to an embodiment, the display waveguide may include a material (e.g., glass or plastic) capable of completely or substantially completely internally reflecting display light in order to guide the display light to the user's eye. The material is not limited to the aforementioned examples.
According to an embodiment, the display waveguide may disperse the display light emitted from the display panel 410 by wavelength (e.g., blue, green, or red), allowing each wavelength to travel along a separate path within the display waveguide.
According to an embodiment, the display waveguide may be disposed in the glass 330. For example, with respect to an imaginary axis that aligns a center point of the glass 330 with a center point of the user's eye, and an imaginary line perpendicular to the imaginary axis at the center point of the glass 330, an upper end and a lower end of the glass 330 may be distinguished, and the display waveguide may be disposed at the upper end of the glass 330. As another example, the display waveguide may be disposed across an area extending from the imaginary line to a one-third point toward the lower end, between the upper end and the lower end of the glass 330. The area in which the display waveguide is disposed is not limited to the aforementioned area of the glass 330, and may include any area of the glass 330 in which the amount of light reflected to the user's eye is equal to or greater than a reference value.
According to an embodiment, the sensor module 320 may include at least one sensor (e.g., eye-tracking sensor and/or illuminance sensor). The at least one sensor is not limited to the aforementioned examples. For example, the at least one sensor may further include a proximity sensor or a contact sensor capable of detecting whether the user 202 is wearing the wearable electronic device 200. The wearable electronic device 200 may detect whether the user 202 is wearing the wearable electronic device 200 through the proximity sensor or contact sensor. When it is detected that the user 202 is wearing the wearable electronic device 200, the wearable electronic device 200 may pair passively and/or automatically with another electronic device (e.g., smartphone).
According to an embodiment, the eye-tracking sensor (e.g., gaze tracking module 1064 in
According to an embodiment, the illuminance sensor (e.g., illuminance sensor 1010 in
According to an embodiment, the wearable electronic device 200 may detect the ambient illuminance (or brightness) of the user 202 through the illuminance sensor 1010. The wearable electronic device 200 may adjust the amount of light (or brightness) of the display (e.g., display panel 410) on the basis of the detected illuminance (or brightness).
According to an embodiment, the glass 330 may include at least one of a display waveguide or an eye-tracking waveguide.
According to an embodiment, the eye-tracking waveguide may form a light path by guiding the reflected light from the user's eye so that the reflected light is emitted to the sensor module 320. The eye-tracking waveguide may be used to deliver the reflected light to the eye-tracking sensor.
According to an embodiment, the eye-tracking waveguide may be formed of the same elements as or different elements from the display waveguide.
According to an embodiment, the eye-tracking waveguide may be disposed in the glass 330. For example, with respect to an imaginary axis that aligns a center point of the glass 330 with a center point of the user's eye, and an imaginary line perpendicular to the imaginary axis at the center point of the glass 330, an upper end and a lower end of the glass 330 may be distinguished, and the eye-tracking waveguide may be disposed at the lower end of the glass 330. As another example, the eye-tracking waveguide may be disposed below the display waveguide. The eye-tracking waveguide and the display waveguide may be disposed in the glass 330 without overlapping each other. As another example, the eye-tracking waveguide may be disposed across an area excluding the area extending from the imaginary line to a one-third point toward the lower end of the glass 330. The area in which the eye-tracking waveguide is disposed is not limited to the aforementioned area of the glass 330, and may include any area of the glass 330 in which the amount of reflected light concentrated through the eye-tracking waveguide and detected by the eye-tracking sensor is equal to or greater than a set value.
According to an embodiment, the display waveguide and the eye-tracking waveguide of the wearable electronic device 200 may be disposed in the glass 330. For example, the glass 330 (e.g., first see-through display 204-1 and/or second see-through display 204-2 in
According to an embodiment, the battery 340 may supply power to at least one element of the wearable electronic device 200. The battery 340 may be charged by being connected to an external power source either wired or wirelessly.
According to an embodiment, the camera 350 may capture images of the surroundings of the wearable electronic device 200. For example, the camera 350 may capture an image of the user's eye or capture an image of a real-world object outside the wearable electronic device 200.
According to an embodiment, the communication interface 360 may include various communication circuitry including, for example, a wired interface or a wireless interface. The communication interface 360 may support the performance of direct communication (e.g., wired communication) or indirect communication (e.g., wireless communication) between the wearable electronic device 200 and an external device (e.g., smartphone or tablet PC).
With reference to
The waveguide 430 may be designed to form a grating with diffraction functionality, such as diffractive optical elements (DOE) or holographic optical elements (HOE), on some area of the plate, with variations in the period, depth, or refractive index of the grating. Accordingly, when the light signal input into the waveguide 430 propagates within the waveguide 430, part of the light signal may be delivered inside the waveguide 430, while another part of the light signal may be output to the outside of the waveguide 430, thereby distributing the light signal.
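As an illustrative aside (not limiting), the angular behavior of such a grating can be described by the standard grating equation, where $d$ is the grating period, $\theta_i$ and $\theta_m$ are the input and $m$-th order output angles, and $\lambda$ is the wavelength:

$$d\,(\sin\theta_m - \sin\theta_i) = m\lambda$$

The period $d$ thus sets the direction into which light is diffracted, while varying the depth or refractive-index modulation of the grating changes the diffraction efficiency, i.e., the fraction of the propagating light signal coupled out at each interaction, which is how the light signal can be distributed as described above.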
In
According to an embodiment, the display panel 410 may be configured to use individual LEDs as red pixels (not illustrated), green pixels (not illustrated), and blue pixels (not illustrated). The arrangement form of micro LEDs forming the red pixels, green pixels, and blue pixels may be variously modified and designed.
At least some of the operations illustrated in
The operations illustrated in
At operation 510, the wearable electronic device 200 according to an embodiment may activate a visibility enhancement mode in response to a predetermined event. In the present disclosure, the term “visibility enhancement mode” is merely an example and may be variously modified. For example, the term “visibility enhancement mode” may be replaced with terms such as “power-saving mode.”
According to an embodiment, the processor 120 may identify a battery level indicating the remaining charge of the battery. The processor 120 may activate the visibility enhancement mode in response to the battery level being below a designated threshold. In this case, the predetermined event may include a state in which the battery level is below the designated threshold.
According to an embodiment, the processor 120 may activate the visibility enhancement mode on the basis of input from the user 202 through an external device. For example, the user 202 may control the visibility enhancement mode of the wearable electronic device 200 using an external device (e.g., a smartphone) paired with the wearable electronic device 200 through short-range communication (e.g., Bluetooth™). The external device may output a control signal to activate the visibility enhancement mode or a control signal to deactivate the visibility enhancement mode to the wearable electronic device 200 through short-range communication on the basis of the input from the user 202. The wearable electronic device 200 may receive a control signal corresponding to the user's input from an external device through short-range communication, and may activate or deactivate the visibility enhancement mode on the basis of the received control signal. In this case, the predetermined event may include input from the user 202 through an external device.
According to an embodiment, operation 510 may be bypassed. For example, the processor 120 may always operate in the visibility enhancement mode without performing operation 510.
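The event-driven activation of operation 510 may be summarized by the following sketch (illustrative only, not the claimed implementation; names such as BATTERY_THRESHOLD and VisibilityController, and the threshold value, are hypothetical):

```python
# Hypothetical sketch of operation 510: two predetermined events that
# activate (or deactivate) the visibility enhancement mode.
BATTERY_THRESHOLD = 0.20  # assumed "battery low" threshold (20%)

class VisibilityController:
    def __init__(self) -> None:
        self.enhancement_mode = False

    def on_battery_level(self, level: float) -> None:
        # Predetermined event: the battery level falls below the threshold.
        if level < BATTERY_THRESHOLD:
            self.enhancement_mode = True

    def on_external_control(self, activate: bool) -> None:
        # Predetermined event: a control signal received over short-range
        # communication from a paired external device (e.g., a smartphone).
        self.enhancement_mode = activate
```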
At operation 520, the wearable electronic device 200 according to an embodiment may detect the ambient illuminance of the wearable electronic device 200 using an illuminance sensor (e.g., illuminance sensor 1010 in
According to an embodiment, the processor 120 may determine whether the detected illuminance is within a designated first range. The first range may be an illuminance range representing an outdoor environment on a clear day. For example, the wearable electronic device 200 may set the first range to be approximately 10,000 lux or more, but is not limited to this range.
According to an embodiment, the processor 120 may determine whether the detected illuminance is within a designated second range that is smaller than the first range. The second range may be an illuminance range representing an outdoor environment on a cloudy day or a shaded environment. For example, the wearable electronic device 200 may set the second range to be approximately 1,000 lux or more to less than approximately 10,000 lux, but is not limited to this range.
According to an embodiment, the processor 120 may determine whether the detected illuminance is within a designated third range that is smaller than the second range. The third range may be an illuminance range representing an indoor environment. For example, the wearable electronic device 200 may set the third range to be less than approximately 1,000 lux, but is not limited to this range.
In this example, the processor 120 divides the illuminance corresponding to the external environment into three different ranges. However, this is merely an example, and the disclosure is not limited thereto. For example, the processor 120 may be configured to divide the illuminance corresponding to the external environment into two different ranges, or into more than three ranges.
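The three-range classification described above may be sketched as follows (illustrative only; the lux boundaries follow the example values given above and may be varied):

```python
# Hypothetical sketch of operation 520: map a detected illuminance (lux)
# to one of the designated ranges.
def classify_illuminance(lux: float) -> str:
    if lux >= 10_000:    # first range: outdoor environment on a clear day
        return "first"
    if lux >= 1_000:     # second range: cloudy outdoors or shade
        return "second"
    return "third"       # third range: indoor environment
```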
At operation 530, the wearable electronic device 200 according to an embodiment may dynamically adjust the luminance of the image output through the display and/or the display form of at least one object included in the image, on the basis of the detected illuminance.
According to an embodiment, the processor 120 may set the overall luminance of the image to a first luminance level when the detected illuminance is within the designated first range. The processor 120 may identify the outline of at least one object included in the image. The processor 120 may generate a first converted image including only the identified outline and control the display to display the first converted image on the basis of the first luminance level. The operation of such a wearable electronic device 200 will be described in greater detail below with reference to
According to an embodiment, the processor 120 may set the overall luminance of the image to a second luminance level that is lower than the first luminance level when the detected illuminance is within the designated second range that is smaller than the first range. The processor 120 may identify the outline of at least one object included in the image. The processor 120 may divide the image into an outline area corresponding to the outline and a non-outline area excluding the outline area, on the basis of the identified outline. The processor 120 may generate a second converted image by setting the luminance of the outline area higher than the luminance of the non-outline area, and control the display to display the second converted image on the basis of the second luminance level. The processor 120 may set the color of the outline included in the second converted image to white or green. The operation of such a wearable electronic device 200 will be described in greater detail below with reference to
According to an embodiment, the processor 120 may set the overall luminance of the image to a third luminance level that is lower than the second luminance level when the detected illuminance is within a designated third range that is smaller than the second range. The processor 120 may control the display to display the image on the basis of the third luminance level. The operation of such a wearable electronic device 200 will be described in greater detail below with reference to
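Operations 520 and 530 may be summarized together in the following sketch (illustrative only; the luminance constants are hypothetical, with the first level highest, and the image-conversion helpers are sketched later in this description):

```python
# Hypothetical sketch of operation 530: choose a luminance level and a
# display form of the image based on the classified illuminance range.
LUMINANCE_FIRST, LUMINANCE_SECOND, LUMINANCE_THIRD = 1.0, 0.6, 0.3

def adjust_output(image, lux: float):
    rng = classify_illuminance(lux)  # from the sketch above
    if rng == "first":
        # Brightest surroundings: outline-only image at the highest level.
        return first_converted_image(image, lux), LUMINANCE_FIRST
    if rng == "second":
        # Cloudy/shaded surroundings: outline-emphasized image.
        return second_converted_image(image), LUMINANCE_SECOND
    # Indoors: the image is displayed without conversion.
    return image, LUMINANCE_THIRD
```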
In
In
According to an embodiment, the wearable electronic device 200 may control the display to display only the outline image when the external illuminance is within the designated first range. In
In an outdoor environment on a clear day, where the external illuminance is in the first range, such as approximately 80,000 lux, the visibility of the image viewed by the user 202 through the see-through display 204 of the wearable electronic device 200 may be low. Driving the wearable electronic device 200 with high visibility in such a bright outdoor environment, illustrated as approximately 80,000 lux, may require very high display driving power. According to an embodiment, to enhance visibility in an outdoor environment on a clear day where the external illuminance is in the first range, such as approximately 80,000 lux, the wearable electronic device 200 may devote the maximum power consumption available for the display to displaying an outline portion 613a of the display screen.
According to the example of
According to an embodiment, the wearable electronic device 200 may set the color of the outline portion 613a to white, or to green, for which the response in terms of human visual sensitivity characteristics is high.
According to an embodiment, the processor 120 may set the overall luminance of the image to a first luminance level when the detected illuminance is within the designated first range. The processor 120 may identify the outline of at least one object 613 included in the image. The processor 120 may generate a first converted image including only the identified outline and control the display (e.g., display panel 410 in
According to an embodiment, the processor 120 may set the color of the outline included in the first converted image not only to white or green but also on the basis of the color of the image displayed around the outline. For example, the processor 120 may determine the complementary color of the color of the image displayed around the outline and set the color of the outline to the determined complementary color. The wearable electronic device 200 may enhance the visibility of the outline by setting the color of the outline to the complementary color of the color of the image displayed around the outline.
According to an embodiment, the processor 120 may increase the width (e.g., length, thickness, or breadth) of the outline included in the first converted image in proportion to the magnitude of the detected illuminance. For example, the processor 120 may adjust the width of the outline on the basis of the magnitude of external illuminance even when the external illuminance is within the first range.
According to an embodiment, the processor 120 may divide a plurality of pixels of the display 410 into an on-pixel group corresponding to the outline portion 613a and an off-pixel group corresponding to the non-outline portion 613b when the visibility enhancement mode is activated. The processor 120 may apply designated power and offset power to the on-pixel group to enhance the luminance of the on-pixel group while displaying the first converted image through the display 410. Here, the offset power may be the power that would otherwise be used to turn on the off-pixel group.
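One possible way to generate the first converted image is sketched below (illustrative only; edge detection via OpenCV is an assumption, as is the rule scaling the outline width with illuminance). Pixels left dark correspond to the off-pixel group, and the outline pixels correspond to the on-pixel group to which the designated power and offset power may be applied:

```python
# Hypothetical sketch: keep only the outline of objects in the image,
# drawn in green (or white), with a width that grows with illuminance.
import cv2
import numpy as np

def first_converted_image(image_bgr: np.ndarray, lux: float) -> np.ndarray:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)          # identify object outlines
    width = 1 + int(lux // 40_000)             # assumed width-vs-lux rule
    edges = cv2.dilate(edges, np.ones((width, width), np.uint8))
    out = np.zeros_like(image_bgr)             # off-pixel group stays dark
    out[edges > 0] = (0, 255, 0)               # on-pixel group: green outline
    return out
```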
In the present disclosure, the first converted image may refer to an image displaying only the outline portion 613a of at least one object 613 included in the image, as described above.
In the present disclosure, the outline portion 613a may refer to an outline area including only the outline of at least one object 613 included in the image.
In the present disclosure, the non-outline portion 613b may refer to a non-outline area positioned inside the outline of at least one object 613 included in the image.
In
In
According to an embodiment, the wearable electronic device 200 may control the display (e.g., display panel 410 in
In an outdoor environment with cloudy weather or in a shaded environment where the external illuminance is within the second range, such as approximately 2,500 lux, the visibility of the image viewed by the user 202 through the see-through display (e.g., see-through display 204 in
According to an embodiment, the processor 120 may set the overall luminance of the image to a second luminance level that is lower than the first luminance level when the detected illuminance is within the designated second range that is smaller than the first range. The processor 120 may identify the outline of at least one object 623 included in the image and, on the basis of the identified outline, divide the image into an outline area corresponding to the outline and a non-outline area excluding the outline area. The processor 120 may generate a second converted image by setting the luminance of the outline area higher than the luminance of the non-outline area, and control the display 410 to display the second converted image on the basis of the second luminance level.
According to an embodiment, the saturation of the second converted image may be set lower than the saturation of the pre-converted image. For example, in an outdoor environment with cloudy weather or in a shaded environment where the external illuminance is within the second range, such as approximately 2,500 lux, the wearable electronic device 200 may be set to display the non-outline portion 623b of the object 623 included in the image, and may reduce power consumption by lowering the saturation of the non-outline portion 623b.
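One possible way to generate the second converted image is sketched below (illustrative only; the gain and scale factors are hypothetical). The outline area is brightened relative to the non-outline area, and the saturation of the non-outline area is lowered to reduce power consumption:

```python
# Hypothetical sketch: brighten the outline area and desaturate/dim the
# non-outline area of the image.
import cv2
import numpy as np

def second_converted_image(image_bgr: np.ndarray,
                           outline_gain: float = 1.5,
                           interior_scale: float = 0.6) -> np.ndarray:
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    interior = edges == 0
    hsv[interior, 1] *= interior_scale    # lower saturation off the outline
    hsv[interior, 2] *= interior_scale    # lower luminance off the outline
    hsv[~interior, 2] = np.minimum(hsv[~interior, 2] * outline_gain, 255)
    hsv = hsv.clip(0, 255).astype(np.uint8)
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
```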
In the present disclosure, the second converted image may refer to an image in which the luminance of the outline portion 623a of at least one object 623 included in the image is set higher than the luminance of the non-outline portion 623b, as described above.
In
In
According to an embodiment, the wearable electronic device 200 may control the display 410 to display the image without separate luminance control when the external illuminance is within the designated third range. In
In a dark indoor environment with external illuminance of approximately 700 lux, the visibility of the image viewed by the user 202 through the see-through display (e.g., see-through display 204 in
According to an embodiment, the processor 120 may set the overall luminance of the image to a third luminance level that is lower than the second luminance level when the detected illuminance is within a designated third range that is smaller than the second range. The processor 120 may control the display 410 to display the image on the basis of a third luminance level without separate luminance control.
According to an embodiment, the processor 120 may generate a brightness map (e.g., the brightness map 1162) corresponding to the front environment of the wearable electronic device 200 using the illuminance sensor, the brightness map including brightness information mapped for each area of the front environment.
According to an embodiment, the processor 120 may control the display 410 to display the first converted image through the sunny area 821 and the second converted image through the shaded area 822. For example, the processor 120 of the wearable electronic device 200 may display a portion of the image corresponding to the sunny area 821 in the form of the first converted image, which displays only the outline portion (e.g., the outline portion 613a), and may display a portion of the image corresponding to the shaded area 822 in the form of the second converted image, in which the luminance of the outline portion is set higher than the luminance of the non-outline portion.
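A minimal sketch of the per-area compositing, assuming the brightness map has already been aligned to the display grid and both converted images have been generated; the sunny/shaded threshold is an illustrative assumption.

```python
import numpy as np

def composite_by_brightness(first_converted: np.ndarray,
                            second_converted: np.ndarray,
                            brightness_map: np.ndarray,
                            sunny_threshold: float = 0.7) -> np.ndarray:
    """Per-area selection between the two converted images.

    brightness_map: HxW values in [0, 1] aligned to the display;
    sunny_threshold is an illustrative cutoff, not a value from the disclosure.
    """
    sunny = brightness_map >= sunny_threshold   # sunny area mask
    out = second_converted.copy()               # shaded area shows the second converted image
    out[sunny] = first_converted[sunny]         # sunny area shows the outline-only image
    return out
```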
According to an embodiment, the wearable electronic device 200 may have a glasses form including frame members 961 and 962 and temple members 951 and 952. The frame members 961 and 962 may include a first frame member 961 corresponding to the user's right eye and surrounding the first see-through display 204, a second frame member 962 corresponding to the user's left eye and surrounding the second see-through display 204, and a bridge member 971 positioned between the first frame member 961 and the second frame member 962. The temple members 951 and 952 may include a first temple member 951 connected to one end of the first frame member 961, and a second temple member 952 connected to one end of the second frame member 962.
According to an embodiment, the illuminance sensor 910 of the wearable electronic device 200 may be disposed on at least a portion of the bridge member 971, but is not limited thereto. According to an embodiment, the wearable electronic device 200 may detect the ambient illuminance of the wearable electronic device 200 using the illuminance sensor 910.
According to an embodiment, the wearable electronic device 200 may include a first camera 921 and a second camera 922, as at least one front camera configured to capture the front of the wearable electronic device 200. The first camera 921 may be disposed on at least a portion of the first frame member 961. The second camera 922 may be disposed on at least a portion of the second frame member 962. The first camera 921 and the second camera 922 may be positioned symmetrically with respect to the bridge member 971. The first camera 921 and the second camera 922 may be a pair of cameras that monitor the front situation of the wearable electronic device 200, and may be configured to detect the movement of the wearable electronic device 200, the rotation of the user's head, and the like. The wearable electronic device 200 may generate an environmental map of the surroundings of the wearable electronic device 200 using the data or signals detected by the first camera 921 and the second camera 922, and may mix the surrounding environment with the image generated by the wearable electronic device 200.
According to an embodiment, the wearable electronic device 200 may include a third camera 931 and a fourth camera 932, as at least one eyeball-tracking camera configured to track the user's eyeball. The third camera 931 may be disposed on at least a portion of the first frame member 961. The fourth camera 932 may be disposed on at least a portion of the second frame member 962. The third camera 931 and the fourth camera 932 may be positioned symmetrically with respect to the bridge member 971. The wearable electronic device 200 may determine the user's gaze direction by tracking the user's eyeball using the third camera 931 and the fourth camera 932, and may perform functions related to user 202 interaction on the basis of the determined gaze direction. The functions related to user 202 interaction may include a function of displaying information corresponding to the user's gaze direction through the see-through display 204, a function of dynamically varying the form in which an object is displayed depending on the brightness for each area in the field of view corresponding to the user's gaze direction, and the like.
According to an embodiment, the illuminance detection module 1061 may determine the overall illuminance level of the surrounding environment of the wearable electronic device 200 on the basis of the signal obtained through the illuminance sensor 1010.
According to an embodiment, the brightness map module 1062 may generate a brightness map (e.g., the brightness map 1162) including brightness information mapped for each area of the front environment of the wearable electronic device 200.
According to an embodiment, the head tracking module 1063 may determine the movement of the wearable electronic device 200 and the direction of the user's head using the image information obtained through the first camera 1021 and the second camera 1022.
According to an embodiment, the gaze tracking module 1064 may track the user's gaze through the third camera 1031 and the fourth camera 1032. The data related to the user's gaze may be combined with the data on the movement of the wearable electronic device 200 and the direction of the user's head, as determined by the head tracking module 1063, to generate multidimensional composite information on the direction in which the user 202 is looking within the surrounding environment of the wearable electronic device 200.
According to an embodiment, the edge detection module 1065 may perform an operation of detecting an edge area (e.g., outline area) of at least one object from the image to be displayed by the display panel 410.
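The disclosure does not fix a particular edge detection algorithm for the edge detection module; a minimal gradient-threshold stand-in (Sobel or Canny would serve equally well) might look as follows, with the threshold and normalization as assumptions.

```python
import numpy as np

def detect_outline(gray: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Boolean outline mask from a grayscale frame via finite differences.

    gray: HxW float image in [0, 1]; threshold is an illustrative value.
    """
    gy, gx = np.gradient(gray)        # per-axis intensity gradients
    magnitude = np.hypot(gx, gy)      # edge strength at each pixel
    return magnitude > threshold      # True on the outline (edge) area
```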
According to an embodiment, the image conversion module 1066 may generate a first converted image including only the outline portion based on the detected edge area, or generate a second converted image in which the luminance of the outline portion differs from that of the non-outline portion. According to an embodiment, the image conversion module 1066 may generate the first converted image or the second converted image by matching and combining the brightness map 1162 with the user's gaze direction. The first converted image or the second converted image generated by the image conversion module 1066 may be delivered from the image processing processor 1060 to a display driving IC (DDI) 1080 that drives the display panel 410, so as to be output through the display panel 410.
According to an embodiment, the wearable electronic device 200 may include a depth camera 1120 configured to extract depth information. The depth camera 1120 may be a camera that combines the first camera 1021 and the second camera 1022 described above.
According to an embodiment, the resolution changing module 1101 of the image signal processor 1100 may adjust the size of the image obtained using the depth camera 1120. The resolution changing module 1101 may perform operations such as binning a portion of the input image, thereby generating an image with a lower resolution than the original image. The image signal processor 1100 may thereby increase the speed of extracting the brightness map 1162 by lowering the resolution of the input image. According to an embodiment, the image from the depth camera 1120 may be input directly to the accumulation module 1103 without passing through the resolution changing module 1101.
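A minimal sketch of such binning (averaging fixed-size pixel blocks), under the simplifying assumption that the image dimensions are divisible by the binning factor:

```python
import numpy as np

def bin_down(image: np.ndarray, factor: int = 2) -> np.ndarray:
    """Reduce resolution by averaging factor x factor pixel blocks.

    image: HxW array with H and W divisible by factor (illustrative constraint).
    """
    h, w = image.shape
    return image.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))
```

For example, a 640x480 frame binned with factor 4 yields a 160x120 image, cutting the data to be processed by a factor of 16.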
According to an embodiment, the timing control module 1102 may control the timing of receiving image information from the depth camera 1120. The timing control module 1102 may generate a control signal to control the exposure time of the imaging element inside the depth camera 1120. According to an embodiment, the timing control module 1102 may control the timing at which the accumulation module 1103 performs calculations. The timing control module 1102 may vary the interval for outputting the control signal depending on the shooting situation. Accordingly, the timing control module 1102 may vary the frames per second (FPS) of the image output from the image signal processor 1100, and may adjust a duty ratio of the timing control signal under the same FPS conditions.
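The relation between FPS, exposure time, and duty ratio follows directly from their definitions; the sketch below states it explicitly and is not a quoted specification.

```python
def timing_signal(fps: float, exposure_s: float) -> dict:
    """Frame period and duty ratio of the timing control signal.

    At a fixed FPS, shortening the exposure lowers the duty ratio.
    """
    frame_period_s = 1.0 / fps
    duty_ratio = exposure_s / frame_period_s  # fraction of the frame spent exposing
    return {"frame_period_s": frame_period_s, "duty_ratio": duty_ratio}
```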
According to an embodiment, the accumulation module 1103 may be an adder that accumulates an image signal output from the resolution changing module 1101. The calculation result of the accumulation module 1103 is Gi(x, y), which is image information in a form where the previous information Gi−1(x, y) is added to Fi(x, y). For example, when the initial image output information on the resolution changing module 1101 is F1(x, y), an initial image G0(x, y) (e.g., a zero-valued image) may be input to the accumulation module 1103 along with F1(x, y). Therefore, initial output information on the accumulation module 1103 will be G1(x, y) = G0(x, y) + F1(x, y), and second output information on the accumulation module 1103 will be G2(x, y) = G1(x, y) + F2(x, y). As described above, the accumulation module 1103 may perform an operation of accumulating output information from the resolution changing module 1101.
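A minimal sketch of this recurrence as a stateful accumulator, assuming a zero-valued initial image G0:

```python
import numpy as np

class Accumulator:
    """G_i(x, y) = G_{i-1}(x, y) + F_i(x, y), starting from a zero image."""

    def __init__(self, shape):
        self.g = np.zeros(shape)    # G_0: zero-initialized image

    def add(self, frame: np.ndarray) -> np.ndarray:
        self.g = self.g + frame     # G_i = G_{i-1} + F_i
        return self.g
```

Calling add() i times yields Gi = F1 + F2 + ... + Fi, the accumulated output described above.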
According to an embodiment, the processing module 1104 may include a calculation unit for extracting the absolute amount of light at the image coordinates (x, y) from the output information Gi(x, y) of the accumulation module 1103. The processing module 1104 may measure the absolute amount of light of the light rays reaching the imaging element using information such as the lens F-number, ISO sensitivity, exposure, and shutter speed, which are parameters for adjusting the brightness of the image in the camera. Generally, when imaging a very bright light source, the shutter speed needs to be fast and the ISO needs to be set as low as possible to prevent and/or reduce saturation of the pixel output of the imaging element. The processing module 1104 may control the timing control module 1102 to shorten the shutter speed and thereby extract an area with high illuminance in the surrounding environment of the wearable electronic device 200.
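One common photometric relation for recovering absolute scene luminance from exposure parameters is the standard reflected-light exposure equation L = K · N² / (t · S). The sketch below applies it per pixel; treating the sensor response as linear and using the conventional calibration constant K ≈ 12.5 are simplifying assumptions, not values from the disclosure.

```python
def estimate_luminance(pixel_value: float, f_number: float,
                       exposure_s: float, iso: float,
                       k: float = 12.5) -> float:
    """Estimate absolute scene luminance (cd/m^2) behind one pixel.

    pixel_value: normalized sensor output in [0, 1] (linear response assumed).
    L = K * N^2 / (t * S), scaled by the pixel's normalized output.
    """
    return pixel_value * k * f_number ** 2 / (exposure_s * iso)
```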
According to an embodiment, the processing module 1104 may control the timing control module 1102 to adjust the number of extracted images. As the number of extracted images increases, the brightness of the image may be extracted more precisely. For example, as the index i of the accumulated output Gi(x, y) increases, more image frames are reflected in the accumulation, and the brightness at each coordinate (x, y) may be estimated more precisely.
According to an embodiment, the processing module 1104 may use position information on the wearable electronic device 200 extracted through the position tracking module 1105, and accordingly, the processing module 1104 may generate the brightness map 1162 corresponding to the entire surrounding environment of the wearable electronic device 200.
Operation 1261 may represent an operation of obtaining the illuminance level of the surrounding environment of the wearable electronic device 200 using the illuminance sensor 1010, according to an embodiment described above.
Operation 1262 may represent an operation of obtaining the brightness map 1162, according to an embodiment described above.
Operation 1263 may represent an operation of determining the position of the wearable electronic device 200 based on the movement of the wearable electronic device 200, the direction of the user's head, and the user's gaze direction, using the head tracking module 1063 and the gaze tracking module 1064, according to an embodiment described above.
Operation 1271 may represent an operation of dividing the screen of the image output from the display of the wearable electronic device 200 (e.g., the display panel 410) into a plurality of areas and determining a driving scheme for each of the divided areas.
Operation 1265 may represent an operation of detecting an edge area (e.g., outline area) of at least one object from the image to be displayed by the display panel 410, using the edge detection module 1065 according to an embodiment described in FIG. 10. Edge information corresponding to the edge area generated at operation 1265 may be input to operation block 1266.
At operation 1266, the processor 120 may receive the edge information and the original image, and generate a final output image to be displayed by the wearable electronic device 200 according to the driving scheme determined for each area at operation 1271 (e.g., a form that displays only the outline portion, or a form in which the luminance ratio between the outline portion 623a and the non-outline portion 623b is adjusted). The final output image generated at operation 1266 may be delivered to the display driving IC (DDI) that drives the display panel 410 (e.g., the DDI 1080).
Operation 1371 may represent the processing result of operation 1266, according to an embodiment described above.
At operation 1372, the image signal processor 1060 may determine a reduction ratio for the resolution of the final output image generated at operation 1371. The image signal processor 1060 may determine the reduction ratio for the resolution on the basis of designated conditions and output a control signal related to the determined reduction ratio.
At operation 1373, the image signal processor 1060 may reduce the resolution of the final output image on the basis of the control signal related to the reduction ratio. The image signal processor 1060 may generate a low-resolution image by reducing the resolution of the final output image. The image signal processor 1060 may divide a plurality of pixels of the display (e.g., the display panel 410) into an on-pixel group and an off-pixel group in relation to the low-resolution image.
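A minimal sketch of mapping the low-resolution image back onto the display pixel grid to form the two groups; the block structure and the on/off cutoff are illustrative assumptions.

```python
import numpy as np

def group_pixels_from_lowres(lowres: np.ndarray, factor: int,
                             on_threshold: float = 0.05) -> np.ndarray:
    """Map each low-resolution pixel to a factor x factor block of display
    pixels: blocks above on_threshold form the on-pixel group (True),
    the rest the off-pixel group. on_threshold is an illustrative cutoff.
    """
    on_blocks = (lowres > on_threshold).astype(np.uint8)
    # Tile each block decision over its factor x factor display pixels.
    return np.kron(on_blocks, np.ones((factor, factor), dtype=np.uint8)).astype(bool)
```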
At operation 1380, the image signal processor 1060 may deliver the final output image with reduced resolution, as determined at operation 1373, to the display driving IC (DDI) (e.g., the DDI 1080).
A wearable electronic device (e.g., the wearable electronic device 200) according to an example embodiment may be configured as follows.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to: set the overall luminance of the image to a first luminance level based on the detected illuminance being within a designated first range, identify an outline of at least one object included in the image, generate a first converted image including only the identified outline, and control the display to display the first converted image based on the first luminance level.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to: based on the detected illuminance being within a designated second range, less than the first range, set overall luminance of the image to a second luminance level, less than the first luminance level, identify the outline of at least one object included in the image, divide, based on the identified outline, the image into an outline area corresponding to the outline and a non-outline area excluding the outline area, generate a second converted image by setting luminance of the outline area higher than luminance of the non-outline area, and control the display to display the second converted image on the basis of the second luminance level.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to set a color of the outline included in the first converted image to white or green.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to: based on the detected illuminance being within a designated third range, less than the second range, set overall luminance of the image to a third luminance level, less than the second luminance level, and control the display to display the image based on the third luminance level.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to set saturation of the second converted image to be lower than saturation of the image.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to increase a width of the outline included in the first converted image in proportion to the detected magnitude of illuminance.
According to an example embodiment, the wearable electronic device may further include: at least one front camera configured to capture a front of the wearable electronic device, and at least one eyeball-tracking camera configured to track a user's eyeball, wherein at least one processor, individually and/or collectively, may be configured to: generate a brightness map corresponding to a front environment of the wearable electronic device using the illuminance sensor, the brightness map including brightness information mapped for each area of the front environment of the wearable electronic device, determine the user's gaze direction within the front environment of the wearable electronic device by tracking the user's eyeball, determine the brightness for each area of a field of view corresponding to the user's gaze direction based on the brightness map, divide the field of view into a sunny area and a shaded area based on the brightness for each area of the field of view, display the first converted image through the sunny area, and display the second converted image through the shaded area.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to: based on the visibility enhancement mode being activated, generate a low-resolution image by reducing resolution of the image, divide a plurality of pixels of the display into an on-pixel group and an off-pixel group in relation to the low-resolution image, and apply designated power and offset power to the on-pixel group so that luminance of the on-pixel group is enhanced while displaying the low-resolution image through the display, in which the offset power may be power used to turn on the off-pixel group.
According to an example embodiment, at least one processor, individually and/or collectively, may be configured to identify a battery level indicating the remaining capacity of the battery, and the specified event that activates the visibility enhancement mode may include a state in which the battery level is less than a designated threshold.
According to an example embodiment, the specified event that activates the visibility enhancement mode may include an input through an external device.
In a method of the wearable electronic device according to an example embodiment, the wearable electronic device may include at least one lens, a battery, a display, a waveguide configured to receive an image from the display and output the image through the at least one lens, and an illuminance sensor configured to detect the external illuminance of the wearable electronic device, wherein the method may include: activating a visibility enhancement mode in response to a specified event, detecting the ambient illuminance of the wearable electronic device using the illuminance sensor in response to the activation of the visibility enhancement mode, and dynamically adjusting the luminance of the image output through the display and a display form of at least one object included in the image based on the detected illuminance.
According to an example embodiment, the method may include: setting, based on the detected illuminance being within a designated first range, overall luminance of the image to a first luminance level, identifying an outline of at least one object included in the image, generating a first converted image including only the identified outline, and controlling the display to display the first converted image based on the first luminance level.
According to an example embodiment, the method may include: setting, based on the detected illuminance being within a designated second range, less than the first range, overall luminance of the image to a second luminance level, less than the first luminance level, identifying the outline of at least one object included in the image, dividing, based on the identified outline, the image into an outline area corresponding to the outline and a non-outline area excluding the outline area, generating a second converted image by setting luminance of the outline area higher than luminance of the non-outline area, and controlling the display to display the second converted image based on the second luminance level.
According to an example embodiment, the method may include setting a color of the outline included in the first converted image to white or green.
According to an example embodiment, the method may include setting overall luminance of the image to a third luminance level, less than the second luminance level, based on the detected illuminance being within a designated third range less than the second range, and controlling the display to display the image based on the third luminance level.
According to an example embodiment, the method may include setting saturation of the second converted image to be lower than saturation of the image.
According to an example embodiment, the method may include increasing a width of the outline included in the first converted image in proportion to magnitude of the detected illuminance.
According to an example embodiment, the wearable electronic device may further include: at least one front camera configured to capture a front of the wearable electronic device, and at least one eyeball-tracking camera configured to track a user's eyeball, in which the method may include: generating a brightness map corresponding to a front environment of the wearable electronic device using the illuminance sensor, the brightness map including brightness information mapped for each area of the front environment of the wearable electronic device, determining the user's gaze direction within the front environment of the wearable electronic device by tracking the user's eyeball, determining the brightness for each area of a field of view corresponding to the user's gaze direction based on the brightness map, dividing the field of view into a sunny area and a shaded area based on the brightness for each area of the field of view, displaying the first converted image through the sunny area, and displaying the second converted image through the shaded area.
According to an example embodiment, based on the visibility enhancement mode being activated, the method may include: generating a low-resolution image by reducing the resolution of the image, dividing a plurality of pixels of the display into an on-pixel group and an off-pixel group in relation to the low-resolution image, and applying designated power and offset power to the on-pixel group so that the luminance of the on-pixel group is enhanced while displaying the low-resolution image through the display, in which the offset power may be the power used to turn on the off-pixel group.
While the disclosure has been illustrated and described with reference to various example embodiments thereof, it will be understood by those skilled in the art that various modifications in form and detail may be made without departing from the true spirit and full scope of the disclosure including the appended claims and their equivalents. It will also be understood that any of the embodiment(s) described herein may be used in conjunction with any other embodiment(s) described herein.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0092746 | Jul 2022 | KR | national |
10-2022-0114327 | Sep 2022 | KR | national |
This application is a continuation of International Application No. PCT/KR2023/007682 designating the United States, filed on Jun. 5, 2023, in the Korean Intellectual Property Receiving Office and claiming priority to Korean Patent Application Nos. 10-2022-0092746, filed on Jul. 26, 2022, and 10-2022-0114327, filed on Sep. 8, 2022, in the Korean Intellectual Property Office, the disclosures of each of which are incorporated by reference herein in their entireties.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/KR2023/007682 | Jun 2023 | WO |
Child | 19032911 | US |